Today, Sony released a new technical presentation from PlayStation architect Mark Cerny, in which he took a deep dive into the PS5 Pro – in particular, the system’s new AI upscaling tech, PlayStation Spectral Super Resolution. During the talk, Cerny also announced that Sony and AMD are entering into a strategic partnership, called Project Amethyst, with the aim of advancing game graphics through machine learning – that is, the tech behind Sony’s PSSR and AMD’s FSR.
Last month, I had the opportunity to visit PlayStation headquarters and preview today's presentation, as well as sit down with Cerny, getting answers to a few of our lingering questions about PS5 Pro, as well as more details about Project Amethyst – and how it could be relevant not just for the future of game graphics on PlayStation, but also Xbox and PC.
The following conversation has been lightly edited for length and clarity.
IGN: You talked about there being a difference between RDNA 2 and RDNA 3 and how you wanted to stick to RDNA 2 because you didn't want to create a lot of work for developers upgrading to RDNA 3.
Mark Cerny: So, it's 2.X, right? So we asked what are the features you can take from RDNA 3 that won't cause a lot of work for developers. And so we took those.
My question is, why is that an issue on console, whereas on PC you can just plop in a new-generation GPU and it just works?
Drivers in the console world tend to be very, very thin. That's viewed as one of the benefits of consoles – that you can take full advantage of all of the hardware features. But that does mean that when you move from generation to generation, there's more work. But you would see issues in the PC world, too. If you had a new card, you would need to either be compiling all of your shaders live, which of course can lead to hitches, or you'd have to have some strategy for supplying all of these shaders compiled for that new GPU.
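The trade-off Cerny describes – compile shaders live and risk hitches, or maintain precompiled binaries per GPU – can be sketched as a cache keyed by shader and target architecture. This is a hypothetical toy model for illustration, not PlayStation's or any driver's actual pipeline:

```python
import hashlib

class ShaderCache:
    """Toy model of a per-GPU compiled-shader cache.

    A console ships shaders precompiled for its one fixed GPU; a PC must
    either compile at load/draw time (risking hitches) or keep a cache of
    binaries for each GPU architecture it encounters.
    """

    def __init__(self):
        # (source hash, GPU architecture) -> compiled blob
        self._cache = {}

    def _key(self, source: str, gpu_arch: str):
        return (hashlib.sha256(source.encode()).hexdigest(), gpu_arch)

    def _compile(self, source: str, gpu_arch: str) -> bytes:
        # Stand-in for an expensive driver compile step.
        return f"[{gpu_arch}] {source}".encode()

    def get(self, source: str, gpu_arch: str):
        """Return (binary, was_cache_hit). A miss is where a hitch occurs."""
        key = self._key(source, gpu_arch)
        hit = key in self._cache
        if not hit:
            self._cache[key] = self._compile(source, gpu_arch)
        return self._cache[key], hit

cache = ShaderCache()
_, hit1 = cache.get("vs_main", "rdna2")  # first use: must compile
_, hit2 = cache.get("vs_main", "rdna2")  # same GPU: cache hit, no hitch
_, hit3 = cache.get("vs_main", "rdna3")  # new GPU: every shader recompiles
```

The key point is the second miss: swapping in a new GPU architecture invalidates every cached binary at once, which is exactly the migration cost a fixed console hardware target avoids.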
I love the term flopflation. Did you come up with that term? Or is that something from the marketing department?
Yeah, I came up with that. I'm not sure how much credit I want to take, but yes, I came up with that.
So on that topic though, how do you feel about the general teraflops arms race? I know that there are a lot of different factors that go into gaming performance, but on the other hand, the general public likes having a single hard number they can latch on to. What's your take on it?
Well actually, I don't see that happening in the PC world. In the PC world, I don't think consumers talk much about teraflops. It seems to be just for consoles and just in recent generations. As I've said many times, teraflops are not a good indication of GPU performance.
Tell me more about the Amethyst partnership. What kind of information is being shared between the companies? And how does this partnership differ from the existing relationship between SIE and AMD – given that you've built PS5 and previous consoles on their hardware, what's new?
So, first, I should give the nature of the collaboration. There are two targets we are working on with AMD. One is better hardware architectures for machine learning. And that is not about creating proprietary technology for PlayStation – the goal is to create something that can be used broadly across PC and console and cloud. The other is these lightweight CNNs for game graphics. So you know, the sorts of things that are used in PSSR, and perhaps the sorts of things that would be used in future FSR.
So that's the broad nature of the collaboration. Then, what kind of information is being shared between the companies to achieve those goals?
We are directly collaborating on both of those goals.
Does that mean that we can expect the findings from that collaboration to be reflected in future AMD hardware that isn't necessarily PlayStation hardware?
Absolutely. This is not about creating proprietary technology or hardware for PlayStation.
To extend a step even further, does that mean it could be used for Xbox hardware, potentially?
It can be used by anyone who wants to use it.
Who initiated the partnership? Was it Sony wanting to get more in bed with AMD? Or were they coming to you and saying how can we learn from your experience on the gaming side?
Well, they are long-term partners, and it was very clear that we were going after similar goals.
Let's go back to the PS5 Pro. I want to talk about some of the internals. You didn't mention the CPU at all in the presentation today. Is there any difference in the PS5 Pro's CPU from the base PS5?
There are some smaller improvements throughout, and the CPU clock frequency is one of those improvements. If you want the CPU to run at a 10% higher clock frequency, you can have it run at a 10% higher clock frequency.
And that is?
It would be 3.85 GHz.
Architecture-wise, it's still on…
Zen 2. It's the same Zen 2 CPU.
As for the AI upscaler that you're using for PSSR – is that a discrete piece of hardware or is it built into the GPU itself?
We needed hardware that had this very high performance for machine learning. And so we went in and modified the shader core to make that happen. Specifically, as far as what you touch on the software side, there are 44 new machine learning instructions that take a freer approach to register RAM access – effectively, you're using the register RAM as RAM – and that also implement the math needed for the CNNs.
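"The math needed for the CNNs" is, at its core, large numbers of multiply-accumulate operations per pixel. A minimal convolution in plain Python shows the pattern that such instructions accelerate – this is a generic illustration of CNN math, not the PS5 Pro's actual instruction set or PSSR's network:

```python
def conv2d_valid(image, kernel):
    """Minimal 2D convolution (cross-correlation, 'valid' padding).

    The inner multiply-accumulate loops are the workload a CNN-based
    upscaler runs for every output pixel and channel; dedicated ML
    instructions exist to make exactly this pattern fast.
    """
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            row.append(acc)
        out.append(row)
    return out

# 3x3 box-blur kernel over a 4x4 intensity ramp
img = [[float(x + 4 * y) for x in range(4)] for y in range(4)]
box = [[1 / 9] * 3 for _ in range(3)]
result = conv2d_valid(img, box)  # 2x2 output of local averages
```

A real upscaling network stacks many such convolutions with learned kernels and nonlinearities between them, which is why raw multiply-accumulate throughput dominates its cost.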
To put that differently, we enhanced the GPU. But we didn't add a tensor unit or something to it.
You mentioned frame generation, but it sounded like that was not the focus of the machine learning aspects of the PS5 Pro. Is there any sort of frame generation tech in the PS5 Pro?
On PS5 Pro, at this time, all we have is PSSR, which is super resolution. We are fantastically interested in all of the things that can be done with machine learning to make game graphics better. And I think it's pretty obvious to everyone that frame generation, frame extrapolation, and ray tracing denoising are also very interesting targets.
Ray tracing is notably more performance-demanding than traditional rasterization and traditional reflection and lighting techniques. Do you feel that the gaming community's interest in ray tracing, as you mentioned in the talk, is worth continuing development in that direction?
Ray tracing is not one thing, it's many things. You can use ray tracing for audio queries – that's pretty inexpensive – or you can use ray tracing to improve your lighting, which is a bit more expensive but not tremendously so. You can do reflections, and because the reflections can be done at lower resolution, that can be done without breaking the budget as well. And then if you keep going, you end up at path tracing, where your technology is fundamentally based on ray tracing, and if you don't have incredibly performant ray tracing hardware, you won't get too far with it. So what we're doing is we're providing tools to the developers and allowing them to work out where on that spectrum they'd like to be.
In the development of PS5 Pro, which started roughly four years ago… Knowing what you know now about the advancements that you found along the way, and everything that has happened in the four years since, what are the main things that you would have liked to get into the PS5 Pro, technology-wise, that you were not able to, that are now "next on your hit list"?
I can tell you, it was an incredible education to build the machine learning hardware for PlayStation 5 Pro. And rather than say "oh, we'd have done it differently," I look at it as we now have a good understanding of how this works, and as a result anything we do in the future has significantly more potential.
You did also mention the prospect of building it yourself versus buying or outsourcing the technology. Could you elaborate on the thought process there?
One very simple way to look at this is: are we taking the next roadmap AMD technology, or are we, in fact, going in and trying to design the circuitry ourselves – and we chose the latter. And we chose the latter because we really wanted to start working in the space ourselves. It was clear that the future was very ML-driven. And by that, you know, the world's talking about LLMs and generative AI, but I'm really mostly just looking at game graphics and the boost for game graphics we can get. So, based on that, we wanted to be working in that space.
More broadly, what is your opinion of the world's recent fascination with AI as a buzzword?
Well, we are living in very interesting times.
That is certain.
I will say, these are very different issues. So you might see something like a smartphone that comes out with AI capabilities or a laptop that comes out with AI capabilities, and it might not be clear immediately what those capabilities are going to get you as a consumer. But the console space is delightfully simple for this, because there's several things that have already been proven to have high benefit – if you just have sufficiently powerful hardware.
To jump to the future then, you said console development is a roughly four-year process. Does this mean we can infer that work has begun on the PS6?
We are not discussing PS6 at this time.
Can we expect it in roughly four years?
Same answer.