Playstation 5 [PS5] [Release November 12 2020]


Naughty Dog seem to be very happy with it. Can’t wait for their new PS5 game, available in stores in time for the holidays 2026.
Either he is bullshitting or he hasn't had a very long career, because no matter how big an improvement the SSD is, there have been way more impressive, way more revolutionary things in even the last 10 years. The drive probably barely makes up for the fact that consoles only got a 2x increase in memory this gen, while the previous increase was 16x. If this gen's consoles had come out with 128GB of RAM, it would have been equivalent to the jump from last gen, and I'm willing to bet devs would much rather have 128GB of RAM than whatever this SSD tech can do.
 
Either he is bullshitting or he hasn't had a very long career, because no matter how big an improvement the SSD is, there have been way more impressive, way more revolutionary things in even the last 10 years. The drive probably barely makes up for the fact that consoles only got a 2x increase in memory this gen, while the previous increase was 16x. If this gen's consoles had come out with 128GB of RAM, it would have been equivalent to the jump from last gen, and I'm willing to bet devs would much rather have 128GB of RAM than whatever this SSD tech can do.
128GB is expensive and overkill
 
128GB is expensive and overkill
Expensive? Yes. Overkill? No. 128GB of RAM will outperform any SSD streaming solution. 128GB would be a normal console generational leap. The SSD is a cheap solution to work around the problem of not being able to put in 128GB of RAM. To claim the SSD is the best thing ever would be to ignore the increases of past generations completely.
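For scale, the RAM-vs-streaming trade can be put in rough numbers. A Python back-of-envelope (the 5.5 GB/s raw throughput is the figure from Cerny's talk; everything else is assumed for illustration):

```python
# Back-of-envelope comparison: holding assets in RAM vs streaming them
# from a fast SSD. The 5.5 GB/s raw figure is from Cerny's talk; the
# frame rate and RAM sizes are assumptions, not confirmed specs.

def streamable_per_frame(ssd_gbps, fps):
    """GB of fresh data the SSD can deliver per rendered frame."""
    return ssd_gbps / fps

ssd_gbps = 5.5               # PS5 raw SSD throughput (GB/s)
fps = 60

per_frame_gb = streamable_per_frame(ssd_gbps, fps)
seconds_to_refill_16gb = 16 / ssd_gbps   # time to cycle a 16 GB pool
print(f"{per_frame_gb:.3f} GB/frame, 16 GB refilled in {seconds_to_refill_16gb:.1f} s")
```

The point of contention in the thread is whether cycling a small pool that fast is worth more or less than simply having a much bigger pool.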
 
That text isn't big at all ;) What strikes me the most is that the dev didn't know the PS5 SSD spec before March 19th?



Typo?

Probably knew it, probably had input on the design, at least the more senior guys.

But couldn't publicly say anything until after Cerny's presentation?
 
Expensive? Yes. Overkill? No. 128GB of RAM will outperform any SSD streaming solution. 128GB would be a normal console generational leap. The SSD is a cheap solution to work around the problem of not being able to put in 128GB of RAM. To claim the SSD is the best thing ever would be to ignore the increases of past generations completely.
Clamshell 512-bit memory interface, NBD.
 
More tidbits from Jason on PS5's SSD
This is going to lead to weeks and weeks of talk about how Xbox is the most powerful console, and so on. Meanwhile I'm getting texts even today from developers being like this is such a shame -- the PS5 is superior in all these other ways that they're not able to message right now or can't talk about right now. I heard from at least three different people in the past couple of hours since the Cerny thing being like, wow, the PS5 is actually the more superior piece of hardware in a lot of different ways, despite what we were seeing in these spec sheets. So, again, yes, plenty of room to talk about this, for all these companies to keep messaging and showing games, but I do think Sony has really dropped the ball from what we've seen so far.
We really need to see some good examples of this SSD in action that fully utilize its potential, and how much improvement in on-screen loading and visuals we could see. Cerny could even have used a graph to illustrate the gains at the different speeds.
 
More tidbits from Jason on PS5's SSD

We really need to see some good examples of this SSD in action that fully utilize its potential, and how much improvement in on-screen loading and visuals we could see. Cerny could even have used a graph to illustrate the gains at the different speeds.

Yeah, his presentation was not deep enough. It was weird; it felt like a neutered version of a GDC deep dive. Maybe Sony is looking for the most appropriate way to properly convey the PS5's "secret sauce" to the public, something that isn't just for tech heads.
 
ultragpu said:
PS5 is actually the more superior piece of hardware in a lot of different ways

Well, they do say less is more...

Jokes aside, I'm curious if this is actually meaningful at all, hyperbolic as it is, or if it's just literal nonsense.

It would be fascinating to see any additional architectural cleverness. Things like the PS4's improvements to async compute.

In an age where everything's pretty much off the shelf PC hardware, any console specific customisations are very welcome.
 
More tidbits from Jason on PS5's SSD

We really need to see some good examples of this SSD in action that fully utilize its potential, and how much improvement in on-screen loading and visuals we could see. Cerny could even have used a graph to illustrate the gains at the different speeds.
I guess there's too much noise right now; Sony had to repeat three times that it has hardware raytracing before journalists stopped saying random crap about it maybe being software. The explanation of the SSD's complete data path will possibly suffer the same fate.

Specs are for devs, they need to show games.

They are not in a position to cut through that noise yet, but they still have 8 months before launch, and games can't be anywhere near ready at this point. Maybe in June?
 
Yet the talk made no mention of any separate modes. If they actually do have separate modes that will run all PS4/Pro games as they are, Cerny did a horrible job of conveying it to the audience, instead giving the impression that they're not even going to have full backwards compatibility at launch.
There were a number of times where Cerny used the words "expect to see" or "can expect", which I suppose is part of his role as a spokesperson for Sony. At this early stage, it doesn't seem like they have the depth of data to make a material claim about product features like that, in case there's a deviation from expectations given to the public and investors.

Agreed.

Also, I seem to remember that GG were showing off their Decima Engine when they made Horizon, and they had a system where data was streamed in from the HDD based on the camera angles, so it would (try to) load more data as you turned the camera.

Wanna bet Cerny spent a lot more time with them than other devs - for obvious reasons, that engine is Sony's shining moment this gen - and asked them what they believe would make more of an impact in the next gen?
I recall various developers trying virtual texturing or similar methods. I thought the Trials series eventually adopted one variant, and the megatextures from Rage were another.
Partially resident textures as a concept came out in the time period of the original GCN, though it seems like even now the vendors are trying to make it stick.
HBCC attempts to migrate data from lower tiers of memory for similar purposes.
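For anyone unfamiliar with the virtual texturing idea mentioned above (megatextures, partially resident textures), a minimal Python sketch of feedback-driven tile residency. The LRU cache and tile IDs are invented for illustration; real systems also track mip levels, GPU page tables, and transcode formats:

```python
# Minimal sketch of feedback-driven virtual texturing residency.
# The renderer reports which texture tiles were visible; the cache keeps
# the hot set resident and evicts the least recently used tile.

from collections import OrderedDict

class TileCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.resident = OrderedDict()   # tile_id -> texel data (stubbed)

    def request(self, tile_id):
        """Called with each tile the renderer reported visible this frame."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)   # mark recently used
            return self.resident[tile_id]
        data = f"texels:{tile_id}"               # stand-in for a storage read
        self.resident[tile_id] = data
        if len(self.resident) > self.capacity:
            self.resident.popitem(last=False)    # evict least recently used
        return data

cache = TileCache(capacity=3)
for tile in [1, 2, 3, 1, 4]:    # tile 1 is reused; tile 4 evicts tile 2
    cache.request(tile)
print(list(cache.resident))      # [3, 1, 4]
```

The faster the backing storage, the smaller this resident set can be for the same image quality, which is the crux of the RAM-vs-SSD argument earlier in the thread.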

I guess Oberon will dip below 10TF during regular play with variable frequency introduced, no? Because I'm pretty sure they would rather have had a locked 10TF across all consoles than have it variable.

Even a 2-3% drop will put the chip below 10TF, but this looks better on paper, and people will not exactly notice a difference in performance if it drops a few GFLOPs lower.
This may depend on how representative the scenario Cerny gave is for how the prior generation set its cooling, clock, and power supply levels. If the idea is that the platform as a whole had to make a best guess at the worst-case power draw, and could leave performance on the table if too conservative or have loud cooling and reliability issues if too aggressive, then something like the PS5's method can produce a much better best guess. It could come down to how almost all games on the PS4 Pro stay under 160W, but one or two have been confirmed to go over. If this were the PS5, the possibility of those one or two games wouldn't cause designers to second-guess their clock targets so much.
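The 2-3% point above works out like this. A quick sketch using the commonly derived figures (36 CUs, 64 lanes per CU, 2 FLOPs per lane per cycle at 2.23 GHz gives the 10.28 TF headline number; the drop percentages are arbitrary):

```python
# How a small clock drop moves the headline TFLOPS number.
# CU count, lane count, and max clock are the widely reported PS5 figures;
# the drop percentages are just the 2-3% dips discussed above.

def tflops(cus, clock_ghz, lanes=64, flops_per_lane=2):
    """Peak single-precision TFLOPS for a GPU at the given clock."""
    return cus * lanes * flops_per_lane * clock_ghz / 1000

peak = tflops(36, 2.23)          # ~10.28 TF at the max boost clock
for drop in (0.02, 0.03):
    print(f"{drop:.0%} drop -> {tflops(36, 2.23 * (1 - drop)):.2f} TF")
```

A 3% dip lands just under 10 TF, which is exactly why the round number matters so much for marketing and so little in practice.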

Wasn't there the notion that temperature has no impact on clock boosts? Yes, it sounds strange, but isn't that what was said?
I'd expect it's true within the bounds of whatever ambient temperature range and airflow requirements Sony decides are part of normal operation. A sufficiently capable cooler with ambient temperature at or below the design boundary should be able to pull heat out of on-die hotspots quickly enough that there's no need to modify clocks (for some potentially non-zero but arguably not detectable periods of time).

Anything beyond those limits and the system may consider itself to be in a thermal recovery mode, which is not a new concept. In that state, the console is no longer in the range of what it promised to developers, and will probably start logging warnings or warning the user, if it doesn't hit a thermal trip condition and immediately shut down.
It's possible that under the old system, games that were exceeding expected power were doing so in a way that threatened shutdowns or were potentially compromising the longevity of the system.
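The "fixed power budget, variable clock" idea being described can be sketched as a tiny governor loop. The power model below (power scaling with activity times clock cubed) is a rough textbook approximation, not Sony's actual model, and every number is invented:

```python
# Sketch of a deterministic power-budget governor: pick the highest clock
# whose modelled power fits the budget, instead of guessing a worst-case
# workload up front. power ~ activity * clock^3 is a crude approximation.

def pick_clock(activity, budget_w, max_ghz=2.23, step=0.01):
    """Highest clock (GHz) whose modelled power stays within budget."""
    ghz = max_ghz
    while ghz > 0 and activity * ghz ** 3 > budget_w:
        ghz = round(ghz - step, 2)   # step the clock down until power fits
    return ghz

print(pick_clock(activity=15.0, budget_w=180))  # light load: stays at max
print(pick_clock(activity=18.0, budget_w=180))  # heavy load: clocks down
```

Because the input is modelled workload activity rather than measured temperature, every console of the same model would land on the same clocks for the same scene, which matches the "temperature has no impact" claim discussed below.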

So if PS5 is running at a constant power delivery....then is the fan going to be "max" the entire time?
I thought it was funny when Cerny mentioned that one of the most demanding cases he saw was the Horizon map screen, with its simpler geometry. It would have been nice if he had said they were going to give an easy way for developers to put a lid on the demands of their map and menu screens, rather than just boost the console to run the exit prompt at an uncapped frame rate.

Some games may not use AVX because the instructions use too much power? That is one of the dumbest things I've heard. SIMD is essential to getting maximum performance out of a CPU.
Depends if you mean in general, or on the PS5 with a Zen 2 processor. In general, there's a lot of variability in terms of which CPUs support what set of extensions, and for some of the broader AVX types there are many CPUs with very abrupt downclocks for relatively long periods of time just from encountering a single instruction of that type. It's gotten better with more recent cores, particularly with Intel. AMD's processors have generally just lagged in adopting the wider widths until they were mostly handled by reducing boost levels. The PS5's CPU seems to be capped below where such variability would become noticeable in terms of single and all-core boost.
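Whether wide SIMD still pays off under a frequency penalty is a simple trade: vector speedup versus clock reduction. A sketch with invented numbers, not measurements of any specific CPU:

```python
# The AVX downclock trade-off in one inequality: vectorised code wins only
# if its throughput gain outweighs the frequency penalty it triggers.
# Both inputs below are illustrative, not measured figures.

def simd_worth_it(vector_speedup, clock_penalty):
    """True if vectorised throughput beats scalar at the reduced clock."""
    return vector_speedup * (1 - clock_penalty) > 1.0

print(simd_worth_it(4.0, 0.10))   # 4x wider at a 10% downclock: worth it
print(simd_worth_it(1.2, 0.25))   # marginal vector win at a steep downclock
```

The subtlety on older desktop CPUs was that the penalty applied chip-wide and lingered after the vector code finished, so even small amounts of wide-SIMD code could drag down surrounding scalar code.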

Cerny himself said that the PS5 will run at its boost clock all the time, except when a game hammers the chips too hard.

So I'm confused, are all of these boost on the PS5?

1. PS5 default clock (boost except when the power budget limit is hit) running PS5 games.

2. PS5 default clock running PS4 games (even more boost than the PS4 Pro running PS4 games). This has only been tested on the top 100 games.

3. PS4 games running at PS4 Pro boost clocks for backwards compatibility on PS5. Does this mean the PS5 will run at a lower clock than the normal boost? Or will they cut the CU count to maintain compatibility? Both?
Cerny seems to have only addressed what AMD calls boost.
Running at these high speeds seems to be part of where he says games might be experiencing issues, but he didn't say that was all there was to the situation.
I'm pretty sure he's aware of the Pro's unrelated boost mode, but it's possible he's not committing to it yet.
 
Expensive? Yes. Overkill? No. 128GB of RAM will outperform any SSD streaming solution. 128GB would be a normal console generational leap. The SSD is a cheap solution to work around the problem of not being able to put in 128GB of RAM. To claim the SSD is the best thing ever would be to ignore the increases of past generations completely.
You still need something that will ensure 128GB is fed fully, by something like... an SSD? Otherwise you will have unused RAM.
Also, you would never EVER have got 128GB of RAM in a console this generation, on cost and size alone.
So it was never a solution. Why talk about an unrealistic scenario when the SSD is indeed a realistic, practical, and cost-effective solution?
Ideally we would want everything to be better, but we live in the real world with lots of limitations.
So yes, I'd say the SSD at this point is one of the greatest things ever, because without it you'd be stuck with just 16GB of RAM.
 
There were a number of times where Cerny used the words "expect to see" or "can expect", which I suppose is part of his role as a spokesperson for Sony. At this early stage, it doesn't seem like they have the depth of data to make a material claim about product features like that, in case there's a deviation from expectations given to the public and investors.


I recall various developers trying virtual texturing or similar methods. I thought the Trials series eventually adopted one variant, and the megatextures from Rage were another.
Partially resident textures as a concept came out in the time period of the original GCN, though it seems like even now the vendors are trying to make it stick.
HBCC attempts to migrate data from lower tiers of memory for similar purposes.


This may depend on how representative the scenario Cerny gave is for how the prior generation set its cooling, clock, and power supply levels. If the idea is that the platform as a whole had to make a best guess at the worst-case power draw, and could leave performance on the table if too conservative or have loud cooling and reliability issues if too aggressive, then something like the PS5's method can produce a much better best guess. It could come down to how almost all games on the PS4 Pro stay under 160W, but one or two have been confirmed to go over. If this were the PS5, the possibility of those one or two games wouldn't cause designers to second-guess their clock targets so much.


I'd expect it's true within the bounds of whatever ambient temperature range and airflow requirements Sony decides are part of normal operation. A sufficiently capable cooler with ambient temperature at or below the design boundary should be able to pull heat out of on-die hotspots quickly enough that there's no need to modify clocks (for some potentially non-zero but arguably not detectable periods of time).

Anything beyond those limits and the system may consider itself to be in a thermal recovery mode, which is not a new concept. In that state, the console is no longer in the range of what it promised to developers, and will probably start logging warnings or warning the user, if it doesn't hit a thermal trip condition and immediately shut down.
It's possible that under the old system, games that were exceeding expected power were doing so in a way that threatened shutdowns or were potentially compromising the longevity of the system.


I thought it was funny when Cerny mentioned that one of the most demanding cases he saw was the Horizon map screen, with its simpler geometry. It would have been nice if he had said they were going to give an easy way for developers to put a lid on the demands of their map and menu screens, rather than just boost the console to run the exit prompt at an uncapped frame rate.


Depends if you mean in general, or on the PS5 with a Zen 2 processor. In general, there's a lot of variability in terms of which CPUs support what set of extensions, and for some of the broader AVX types there are many CPUs with very abrupt downclocks for relatively long periods of time just from encountering a single instruction of that type. It's gotten better with more recent cores, particularly with Intel. AMD's processors have generally just lagged in adopting the wider widths until they were mostly handled by reducing boost levels. The PS5's CPU seems to be capped below where such variability would become noticeable in terms of single and all-core boost.


Cerny seems to have only addressed what AMD calls boost.
Running at these high speeds seems to be part of where he says games might be experiencing issues, but he didn't say that was all there was to the situation.
I'm pretty sure he's aware of the Pro's unrelated boost mode, but it's possible he's not committing to it yet.

This is interesting, because after seeing Hellblade 2 we will probably have much more geometry in gameplay, which means that for next-gen games with the SSD and probably tons of geometry, the case where the GPU runs at more than 10Tflops may well come up during gameplay after all. If this continues to be true, culling will be important, but if a small triangle is visible you can't cull it.

[Image: slide from "Optimizing the Graphics Pipeline with Compute" (GDC 2016)]


It will probably be mitigated by compute shading, but this is interesting.

As for boost mode in backwards compatibility, I suppose many more games will not support it compared to the PS4 Pro.
 
You still need something that will ensure 128GB is fed fully by something like....an SSD? Otherwise you will have unused ram.

Cerny also explains that without fast storage to keep the RAM fed, developers have to waste RAM by loading tons of data up front instead of streaming it from storage as needed.
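A crude model of that "wasted RAM" point: without fast storage you preload everything the player could reach within a long safety window; with a fast SSD the window shrinks to a couple of seconds. The speeds and data densities below are invented for illustration:

```python
# Rough model of preload cost: RAM must hold every asset the player could
# reach before storage can deliver it. Slow storage forces a long safety
# window; fast storage shrinks it. All numbers here are invented.

def preload_gb(player_speed_ms, horizon_s, gb_per_km):
    """RAM needed to preload all assets within the travel horizon."""
    return player_speed_ms * horizon_s / 1000 * gb_per_km

slow_hdd = preload_gb(20, 45, 8)    # 45 s safety margin for a slow HDD
fast_ssd = preload_gb(20, 2, 8)     # ~2 s margin when the SSD keeps up
print(f"HDD preload: {slow_hdd:.2f} GB, SSD preload: {fast_ssd:.2f} GB")
```

The absolute numbers are made up, but the ratio is the argument: the safety window, not the scene itself, dominates how much RAM sits idle.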
 
This is interesting, because after seeing Hellblade 2 we will probably have much more geometry in gameplay, which means that for next-gen games with the SSD and probably tons of geometry, the case where the GPU runs at more than 10Tflops may well come up during gameplay after all. If this continues to be true, culling will be important, but if a small triangle is visible you can't cull it.



It will probably be mitigated by compute shading, but this is interesting.

As for boost mode in backwards compatibility, I suppose many more games will not support it compared to the PS4 Pro.

Raster efficiency maps back to the SIMD hardware as well, since a small triangle leaves many wavefront lanes inactive. For GCN, there would have been more cycles lost to the 4-cycle cadence with very tiny triangles. Navi improves on this a bit with smaller wavefronts, and so may have more active SIMD cycles as a result.
There seem to be some specific cases involving instancing where GCN could pack more than one triangle into a wavefront. If the next gen can increase the number of times that happens, maybe the SIMDs could be used more efficiently.
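The small-triangle point can be illustrated with a lane-occupancy calculation (quad granularity and helper lanes are ignored to keep it simple):

```python
# Why tiny triangles waste SIMD lanes: pixels are shaded in fixed-size
# wavefront groups, so a triangle covering few pixels leaves most lanes
# idle. Smaller wavefronts waste fewer lanes on the same triangle.

import math

def lane_utilisation(pixels_covered, wave_size):
    """Fraction of SIMD lanes doing useful work for one small triangle."""
    waves = math.ceil(pixels_covered / wave_size)
    return pixels_covered / (waves * wave_size)

for wave in (64, 32):            # GCN wave64 vs Navi's optional wave32
    print(f"wave{wave}: {lane_utilisation(10, wave):.0%} of lanes active")
```

For a 10-pixel triangle, halving the wavefront size doubles lane utilisation, which is the Navi improvement being described.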

As far as boost mode on the PS4 Pro goes, I don't think games support the mode so much as Sony gives the user an option in a system menu and hopes their games won't have problems. Not the soundest foundation on which to base the PS5's reputation in a new generation.
 
cerny also explains that without fast storage to keep the RAM fed, developers will need to waste RAM space by loading tons of data to RAM instead of streaming it from storage as needed.
Yes, exactly. That's a big problem when you've got low amounts of RAM, because it's filled with information the player can't experience, leaving even less for what the player can view and interact with. Ideally, if you had 128GB of RAM, size wouldn't be as much of a limiting factor: you would still "waste" space, but there would be more left to fill with information that directly affects the player. But as you pointed out, that's probably too many gigabytes of information sitting idle in expensive RAM, when the SSD is cheaper and can feed all available RAM with just the useful information. The latter certainly sounds like the efficient and cost-effective solution for RAM usage.
 
Either he is bullshitting or he hasn't had a very long career, because no matter how big an improvement the SSD is, there have been way more impressive, way more revolutionary things in even the last 10 years. The drive probably barely makes up for the fact that consoles only got a 2x increase in memory this gen, while the previous increase was 16x. If this gen's consoles had come out with 128GB of RAM, it would have been equivalent to the jump from last gen, and I'm willing to bet devs would much rather have 128GB of RAM than whatever this SSD tech can do.
Either he is bullshitting, or maybe - just maybe - 128GB of RAM wouldn't really be a bloody realistic target, now would it!
 