PlayStation 5 [PS5] [Release November 12, 2020]

Wrong, I had the SSD compressed speed at 8 GB/s. I never gave the real speed because my friend asked me not to, so I just said more than 7 GB/s. After hearing from him that the PS5 was less powerful than the Xbox Series X on paper, I was thinking it would be worse.

I was pointing to the likes of klee, not you. You didn't make a show of it, and neither did I; I waited until just before. Yes, I got to hear it was between 9 and 10 TF, and the clock; I doubted him at first, but he could prove it.
Could have been worse, yes, in the 8 TF range, but RDNA2 is clock-friendly, kinda.
 
36 CU seems to be very tied to backwards compat.
I'm still unsure why backwards compatibility would have capped the CU count. However, it seems reasonable that it's one possible reason why they didn't have fewer.

Honestly, adding this complex boost mode / throttling for some "small" clock adjustment wouldn't make any sense...
The PS5 is on the edge of 10 TF, which a more conservative approach would have dropped below. A downclock of more than 2.5% on the GPU drops it to 9.x.
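For the arithmetic behind that (a quick sketch; the 36 CU x 64 lane x 2 ops/cycle FP32 rate is the standard RDNA figure, and 2.23 GHz is the announced peak clock):

```python
# FP32 rate for an RDNA-style GPU: CUs x lanes x 2 ops/cycle x clock.
CUS, LANES, OPS_PER_CYCLE = 36, 64, 2

def tflops(clock_ghz):
    return CUS * LANES * OPS_PER_CYCLE * clock_ghz / 1000

print(f"{tflops(2.23):.2f}")          # 10.28 TF at the announced peak
print(f"{tflops(2.23 * 0.975):.2f}")  # 10.02 TF after a 2.5% downclock
print(f"{tflops(2.23 * 0.97):.2f}")   # 9.97 TF -- a 3% drop lands in 9.x
```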

Primitive shaders are mesh shaders.
Vega's whitepaper proposed a number of future directions that primitive shaders could take, although that generation ultimately saw them discarded.
Navi has something like the automatically generated culling shaders, although the numbers given fall short of what Vega claimed.
Some of what Cerny discussed might be along the lines of those future directions for Vega. Mesh and primitive shaders do exist in a somewhat similar part of the overall pipeline, but without more details the amount of overlap is still not determined.

By doing last minute chip revisions and clock increases.
Last-minute in this instance would be last-month or last-quarter for chip revisions. That process has long lead times. Just reaching for 10 TF might have been on their mind for longer.

What is disappointing is that only 100 PS4 games will be playable at launch (with no guarantee that other games will be available, only promises).
Seems like they haven't had the resources or the time to test broadly enough. This strikes me as a process that might have been waiting on final silicon or more recent steppings, plus any other components needed for full testing, like the SSD. These would be the kinds of tests where the real hardware is needed to accumulate appreciable testing hours, and that takes time and sufficient testers and hardware.
As far as a sample goes, going by most hours played isn't really a statistically random sample, so it may not be fully representative. A random sample of 30 games and their overall rate of issues might be a decent indicator of how many games could be expected to work out of the box. Maybe some games like Stardew Valley and Anthem need to be profiled (two games known to hard-lock PS4s and PS4 Pros).
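To put a number on that sampling idea (a hypothetical illustration; the pass count below is invented, not Sony's data):

```python
# Normal-approximation 95% interval for a binomial proportion: if 27 of
# a random sample of 30 PS4 games ran cleanly, what library-wide
# compatibility rate would that suggest? (Counts are made up.)
import math

n, passed = 30, 27
p = passed / n
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, min(1.0, p + 1.96 * se)
print(f"point estimate {p:.0%}, 95% CI roughly {lo:.0%} to {hi:.0%}")
```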

It's not; Cerny just said that to drop power by 10% you only need to drop clocks by a couple (or was it a few) percent.
That sounds like it's at least a bit past the comfort zone for the hardware.
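That claim is plausible if the downclock also buys a voltage reduction, since dynamic power goes roughly as C·V²·f. A toy sketch (the voltage/frequency pairs are invented, not PS5 figures):

```python
# Dynamic power ~ C * V^2 * f; constant and operating points invented.
def dynamic_power(voltage, freq_ghz, c=1.0):
    return c * voltage**2 * freq_ghz

base = dynamic_power(1.00, 2.23)
# If a 2% clock drop lets the chip run at ~4% lower voltage:
reduced = dynamic_power(0.96, 2.23 * 0.98)
print(f"power saved: {1 - reduced / base:.0%}")  # ~10%
```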

What will likely happen is that devs will set clock speed targets, stick to them, and not rely on boost. This is ideal, as you want to deliver a consistent experience across the range and don't want support issues driven by variability.

While Cerny claims they're using power as the limiter, it's not that simple. There's no hiding from heat. As the load builds up and heat builds up, you end up drawing more power and needing more voltage to maintain the same frequency. This never scales in your favor, since dynamic power rises roughly with voltage squared times frequency.
The boost algorithm probably drops the skin-temp or thermal-capacity calculations from AMD's turbo algorithm, which are a likely source of much of the variability. The current-based monitoring and activity counters AMD uses would be the most likely origin of Sony's idealized SoC model, which would be a somewhat conservative model based on the physical characterization of produced chips.
There would be localized hot-spot modeling, but at that time scale overall temperature is likely of second-order importance to the algorithm versus the current and activity that cause a few units to experience accelerated heating in the space of a few ms or µs.
I think Sony would need to specify a cooler with a thermal resistance and dissipation capability that is sufficiently over-engineered to let the boost algorithm neglect temps outside of thermal trip scenarios.
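A sketch of what such a deterministic, model-based estimate might look like (the event types and per-event energy coefficients are entirely made up; the point is that identical workloads produce identical power numbers on every console, unlike temperature-based turbo):

```python
# Idealized-SoC-style power estimate: weight activity counters by
# energy coefficients from silicon characterization. Numbers invented.
ENERGY_PJ = {
    "fp32_op": 10.0,      # picojoules per event (fake)
    "mem_access": 100.0,  # (fake)
}

def estimated_watts(events_per_second):
    pj_per_s = sum(ENERGY_PJ[e] * n for e, n in events_per_second.items())
    return pj_per_s * 1e-12  # pJ/s -> W

activity = {"fp32_op": 1e13, "mem_access": 4e11}
print(f"{estimated_watts(activity):.0f} W")  # 140 W, deterministically
```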

Well it saves some money on storage too...not sure how much 175GB of NAND costs...
There would be 50% more NAND chips, though they would be of lower capacity each. Downside for capacity is that the next increment might not be reachable without bumping the capacity of all those chips together.
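For a rough sense of the increments involved (assuming a 12-channel controller with one 64 GiB die per channel, which is speculation, not a confirmed configuration):

```python
# NAND capacity in marketed (decimal) GB for a given channel/die setup.
GIB = 2**30

def capacity_gb(channels, die_gib):
    # Marketed capacities use decimal GB; NAND dies come in binary GiB.
    return channels * die_gib * GIB / 1e9

print(f"{capacity_gb(12, 64):.0f} GB")   # ~825 GB, matching the spec
print(f"{capacity_gb(12, 128):.0f} GB")  # ~1649 GB: the next increment
# doubles every die at once, overshooting a 1 TB target
```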

I do wonder if this will be annoying for a developer to juggle.
You're already having to fit in as much as you can into your game, leveraging your hardware to the maximum. Then when you finally do get the most out of the available cycles of CPU or GPU, you suddenly take a hit and you're playing a weird balancing game.
The algorithm should be more stable than the twitchy mobile or Radeon algorithms, but I'm curious whether there are complexities in benchmarking performance based on instruction mix, like whether certain events or assets might trigger a burst of AVX-heavy code, rather than performance depending on total unit counts alone.
 
What is disappointing is that only 100 PS4 games will be playable at launch (with no guarantee that other games will be available, only promises).

So they made a Frankenstein GPU (that will probably consume as many watts as, or even more than, the Xbox with its 20% more powerful GPU) in order to play 4% of PS4 games? What a waste.

This reminds me a lot of Wii U. But at least Wii U could play all Wii games, not 4% of them.

I read what he said as: only 100 games have been tested with the boost mode, and all PS4/Pro games would be compatible from the start because it has profiles to exactly mimic them.
 
Ironic that people finally got what they wanted, a super fast SSD and BC, but now they're showing so many concerns about CUs, clocks, teraflops, etc. It only goes to show raw TF power and bandwidth are of the utmost importance after all; let's not lie to ourselves any more.
PS5 is clearly a gimped design; it got its priorities wrong and it tried hard to catch up to the competition. Sony is lucky it didn't dip to a single digit marketing-wise, but the hardware hype is just not there; in fact it's on the depressing side. Maybe Mark Cerny should step down and let someone else take the reins of PS5 Pro? Yep, the Pro will be the only redemption if Sony go wild at it.
Unless of course their exclusives look so good they eclipse the Series X games, then I'll have to readjust my judgement. But right now it's not ultra terrible, but there ain't no hype either.
 
I'm curious about the coherence engine and cache scrubbers in the PS5, and how they slot into the pipeline versus partially resident or virtual texturing.
Does this mean Sony tried to have a more active path to fulfilling virtualized or partially resident texture fetches rather than purely falling back to lower-resolution assets?
This seems like it's trying to resolve a similar problem to the volatile flag in the PS4, where a flag on L2 cache lines holding compute data allowed an internal loop to invalidate only those cache lines in a shorter interval than a full cache flush. Apparently these scrubbers can invalidate data in many cache layers?
In terms of latency, this could align with the clock speed and smaller number of CUs. Higher clocks can help resolve synchronization events faster, fewer CUs need less time to ramp after barriers release, and active cache scrubbing might reduce the impact of certain pipeline events that require cache invalidations or global stalls.
Some of these latency elements could benefit primitive shaders, which seem like one type of shader that RDNA's workgroup processor organization is meant to benefit. There's an additional serial element at the point where those shaders have traditionally been inserted, and higher clocks would shorten the delay it adds before pixel shading can ramp.
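A toy model of the difference (purely illustrative, not Sony's mechanism): a full flush drops everything, while a scrub invalidates only lines backed by an overwritten address range:

```python
# Toy cache: lines keyed by the backing address they cache.
class ToyCache:
    def __init__(self):
        self.lines = {}  # address -> cached data

    def fill(self, addr, data):
        self.lines[addr] = data

    def full_flush(self):
        self.lines.clear()  # everything refetches, hot data included

    def scrub(self, lo, hi):
        # Drop only lines whose backing address was overwritten.
        self.lines = {a: d for a, d in self.lines.items()
                      if not (lo <= a < hi)}

cache = ToyCache()
for addr in range(0, 64, 8):
    cache.fill(addr, f"line@{addr}")
cache.scrub(16, 32)
print(sorted(cache.lines))  # [0, 8, 32, 40, 48, 56]
```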
These could help in other scenarios where additional CUs wouldn't, though I am not sure how much that counters a broader GPU with more bandwidth. I'd be interested in seeing that kind of analysis, but I wonder if that kind of information sharing would be discouraged or subject to NDA.

The Tempest audio solution is similar in some aspects to proposals to modify CUs for audio that have shown up in AMD patents, but sounds different in the removal of caches. I'd be interested in seeing more detail on how that works: whether it's a total removal, and whether a single CU can be as flexibly programmed as 8 independent cores. A standard RDNA CU still has a decently large batch size compared to a CPU, for example.

I'm somewhat more in favor of Sony leaving a user-expandable storage option with a bay that users can put an SSD into, versus how Microsoft has a slot in the back. While I'd expect dislodging to be rare, I feel like some kind of screw or tab that requires more purposeful access to remove an SSD while it is active would be nice, given how widely SSDs vary in how they handle abrupt power loss.
 
I imagine MS’s SSD route allows you to treat it like a memory card. The M.2 connector itself isn’t designed for hot swapping.
 
Someone please correct me if I’m wrong. Isn’t the X-series considered the pro or premium model within the Xbox line of products? If so, then Lockhart (if available at launch) would be considered the next-gen entry level Xbox – correct? If this is all true, wouldn’t the PS5 be simply the next-gen entry level system, not the Pro?

The only reason I’m asking this is the re-visioning (more like goalpost shifting) that some (including some game journalists) are doing across boards, claiming the PS5 is the pro or premium model competing with the X-series in terms of performance. It seems quite ignorant to believe the PS5 is actually a Pro model, unless I missed something in Cerny’s tech-dive.

Anyhow, I personally believe Sony blew it. Way too many excuses for why the GPU has fewer CUs than their competitors (if this is the Pro model), and the PR doublespeak at times was very off-putting. As a PC gamer and part-time console gamer, Sony really failed in my book. There is no denying their SSD technology sounds awesome; however, I still have to wonder if their GPU tech (CU count) was really crippled by wanting BC, or did they simply cheap out in that area of the design. Personally, I would have dropped any BC designs if the process required me to stick mostly to the prior generation's GPU design or layout.

One thing is for sure, Microsoft deserves all the credit and praise for pushing the GPU tech boundaries within the console space.
 
Anyhow, I personally believe Sony blew it. Way too many excuses for why the GPU has fewer CUs than their competitors, and the PR doublespeak at times was very off-putting. As a PC gamer and part-time console gamer, Sony really failed in my book. There is no denying their SSD technology sounds awesome; however, I still have to wonder if their GPU tech (CU count) was really crippled by wanting BC, or did they simply cheap out in that area of the design. Personally, I would have dropped any BC designs if the process required me to stick mostly to the prior generation's GPU design or layout.

Or MS simply aimed a lot higher for CU count than Sony could ever have expected. It is too early to assume Sony has "failed" without knowing the launch price of those consoles, as well as marketing and exclusive titles. If the console war were won by specs alone, the Xbox One X should have turned around MS's fortunes, but it didn't.
 
Or MS simply aimed a lot higher for CU count than Sony could ever have expected. It is too early to assume Sony has "failed" without knowing the launch price of those consoles, as well as marketing and exclusive titles. If the console war were won by specs alone, the Xbox One X should have turned around MS's fortunes, but it didn't.

Mind you, I'm not talking about "sales failure," but a hardware comparison to Microsoft's offerings. Yes the SSD tech sounds great, but everything else is meh.
 
Ironic that people finally got what they wanted, a super fast SSD and BC, but now they're showing so many concerns about CUs, clocks, teraflops, etc. It only goes to show raw TF power and bandwidth are of the utmost importance after all; let's not lie to ourselves any more.
PS5 is clearly a gimped design; it got its priorities wrong and it tried hard to catch up to the competition. Sony is lucky it didn't dip to a single digit marketing-wise, but the hardware hype is just not there; in fact it's on the depressing side. Maybe Mark Cerny should step down and let someone else take the reins of PS5 Pro? Yep, the Pro will be the only redemption if Sony go wild at it.
Unless of course their exclusives look so good they eclipse the Series X games, then I'll have to readjust my judgement. But right now it's not ultra terrible, but there ain't no hype either.

On other forums? Because for the people here, the same people that were most interested in audio, storage and BC are still most interested in audio, storage, and BC.

Speaking for myself...
  • MS has the obvious advantage WRT BC. They've gone far beyond my expectations.
  • Sony is ahead by default on Audio as MS hasn't revealed anything other than 3D hardware audio. And Sony have gone beyond what I expected as well.
    • Considering that Ninja Theory is one of MS's internal studios now, one would hope they haven't skimped on the 3D hardware audio, but it's highly unlikely that, even if it was a priority for them, they've been trying to come up with a solution that could handle potentially thousands of sound sources.
  • Sony is ahead WRT storage speeds. This one is the closest between the two.
    • Both of them are prioritizing immediate on demand low latency access to the storage pool.
    • Sony's is obviously the faster solution.
    • MS claims a guaranteed performance level at all times for their SSD solution. It's uncertain at this moment whether Sony's number is a peak, and whether it is affected by thermal throttling of the NAND chips similar to high-speed PC NVMe drives.
      • However, even if the Sony solution thermally throttles, it's still likely going to be comfortably faster than the MS solution.
As far as I'm concerned the GPU specs are close enough that it's not terribly interesting to discuss that. MS have a slight advantage here in that developers can utilize both the CPU and GPU to maximum effect at all times while developers will have to juggle CPU versus GPU loads for PS5. That said, I don't think this will be anything that results in any major differences in presentation. Even at 9.2 TF, it wouldn't have been an interesting discussion for me.

Regards,
SB
 
Mind you, I'm not talking about "sales failure," but a hardware comparison to Microsoft's offerings. Yes the SSD tech sounds great, but everything else is meh.

Well, yeah, MS managed to put in a GPU with a higher CU count, but other than that, their architecture designs and capabilities are very similar. XBSX is no doubt a very impressive showing, but the PS5 isn't really a disappointment. I expect the difference between those two consoles to be smaller than XB1X vs PS4 Pro.
 
I imagine MS’s SSD route allows you to treat it like a memory card. The M.2 connector itself isn’t designed for hot swapping.
The Digital Foundry article described the SSD setup as having PCIe links to the internal SSD and a PCIe link to the expansion connector. The expansion card is described as being physically heavier, and one possible reason is to prevent performance degradation due to overheating of the controller logic. If the controller is in the card, then absent hot-swapping logic or some of the design features found in cards rated to handle power loss safely, SSDs have a high risk of total data loss or potentially bricking.

Anyhow, I personally believe Sony blew it. Way too many excuses for why the GPU has fewer CUs than their competitors (if this is the Pro model), and the PR doublespeak at times was very off-putting. As a PC gamer and part-time console gamer, Sony really failed in my book. There is no denying their SSD technology sounds awesome; however, I still have to wonder if their GPU tech (CU count) was really crippled by wanting BC, or did they simply cheap out in that area of the design. Personally, I would have dropped any BC designs if the process required me to stick mostly to the prior generation's GPU design or layout.
It hasn't been explained why BC would limit the maximum number of CUs.
One way of looking at things is that there have been two Sony APUs that had more than 36 CUs: the PS4 Pro and PS5. Each physically has 40 CUs, and the hardware is inherently capable of using all of them, but is instructed by fuses or firmware to ignore four.
Up until certain limits that AMD has discussed for GCN, I only see BC providing a floor value where there would be problems if the PS5 had fewer.

However, the PS5 has a 256-bit bus, which is a design decision that would have been set in the same phase as CU count, and that points to size or cost considerations for a lower overall target. Why their target was where it was could have various reasons, given their projections in years past.
 
Or MS simply aimed a lot higher for CU count than Sony could ever have expected. It is too early to assume Sony has "failed" without knowing the launch price of those consoles

Also we don't know the price of the two consoles.
I find it perplexing that they didn't have a tech demo to demonstrate what is possible with the storage speed.
 
I would argue BC is the part they got the least right. Everything else except for GDDR6 bandwidth is a great baseline for next gen.

The least right? Are you jumping on the outlandish train that thinks they already have confirmation the PS5's BC will extend to only 2% of the PS4's library?

They've gotten the communication over it the least right, definitely, but BC itself? We don't have enough details to make that call yet. If Sony come along and clarify that the only PS4 games that work on the PS5 are Fortnite, GTAV, CoD, and (fingers crossed) Life of the Black Tiger, then I will agree with you. But it's too early to tell if that's the case.

Slightly OT, but hopefully this will factor into encouraging Rockstar to invest some of their $1 billion annual GTAV revenue into higher resolution versions for the Pro and X1X. Piss takers.

The PS5's bandwidth is pretty shit though, I completely agree there. A less powerful GPU is fine, and it would've made for some interesting DF comparisons: fewer CUs but considerably higher clocked. As it is, those fewer CUs will be getting less bandwidth too. Makes for boring tech discussions that just quickly descend into fanboys blathering on about why their more/less powerful toaster is actually, technically better, because they have special bread and they like their toast that way.
 
Someone please correct me if I’m wrong. Isn’t the X-series considered the pro or premium model within the Xbox line of products? If so, then Lockhart (if available at launch) would be considered the next-gen entry level Xbox – correct? If this is all true, wouldn’t the PS5 be simply the next-gen entry level system, not the Pro?

Only according to rumours. We've yet to see any real evidence that Lockhart actually exists. From the credible information we have, the PS5 and the XSX are the base consoles.

Microsoft are in a pretty good position right now: more powerful, and probably not much more expensive, if at all. The only way they could mess it up, IMO, would be to actually release Lockhart. If they announce it, I will literally piss myself laughing that they managed to squander their standing.
 
The Tempest audio solution is similar in some aspects to proposals to modify CUs for audio that have shown up in AMD patents, but sounds different in the removal of caches. I'd be interested in seeing more detail on how that works: whether it's a total removal, and whether a single CU can be as flexibly programmed as 8 independent cores. A standard RDNA CU still has a decently large batch size compared to a CPU, for example.

If it's like the SPU, as Cerny claims, then it won't have automated caches but instead a programmer-controlled local store, which should be good for transistor budget but a bit of a headache, as they realised on the PS3. Maybe they'll abstract it away this time. I can see it working better as an audio processor vs the generalised Cell because audio is inherently stream-based, no?
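A minimal sketch of why stream-oriented audio maps well onto a local-store design (illustrative only; the block size and the constant-tone sources are invented): each source is consumed in fixed-size blocks with purely sequential access, which is exactly what a DMA-fed local store is good at:

```python
BLOCK = 256  # samples per block (made-up size)

def tone(value):
    """Stand-in source: an endless stream of constant-valued blocks."""
    while True:
        yield [value] * BLOCK

def mix_block(sources):
    # Sum one block from every source -- no random access anywhere.
    out = [0.0] * BLOCK
    for src in sources:
        for i, sample in enumerate(next(src)):
            out[i] += sample
    return out

mixed = mix_block([tone(0.1), tone(0.2), tone(0.3)])
print(round(mixed[0], 2))  # 0.6
```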
 
What is disappointing is that only 100 PS4 games will be playable at launch (with no guarantee that other games will be available, only promises).

So they made a Frankenstein GPU (that will probably consume as many watts as, or even more than, the Xbox with its 20% more powerful GPU) in order to play 4% of PS4 games? What a waste.

...

Looks good overall. The BC capability is regrettable.
But they'll get the 100 most-played games, so I guess that is enough to ensure that specific populations (the ones that matter) can continue forward.

As posters above mentioned, Cerny didn't say only 100 will be available; he provided an example in which the top 100 PS4 games were tested and nearly all of them worked.

Presumably firmware may need to be updated and/or individual games patched to work fully without bugs. Considering the low-level APIs used for PS4 development, this is perfectly reasonable. Hopefully Sony stay on top of it.
 
Question: in the presentation, Cerny mentioned that they made sure the SSD maintains consistent speed and performance after everything is processed, whereas normally, as the information passes through the various steps, the final result in the game is much lower.
Is this something unique to the PS5?
MS reports the I/O throughput, but is that the actual figure after all things are done?
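One way to frame the question (a toy bottleneck model; the stage names and rates below are invented, not either console's real pipeline): end-to-end throughput is capped by the slowest stage, so a raw SSD number can overstate what the game actually receives:

```python
# Effective I/O rate = min over all pipeline stages (numbers invented).
stages_gb_s = {
    "nand_read": 5.5,
    "decompression": 9.0,
    "check_in_and_mapping": 4.0,  # hypothetical software-overhead stage
}
print(min(stages_gb_s.values()), "GB/s effective")  # 4.0 GB/s
```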
 