PlayStation 5 [PS5] [Release Holiday 2020]

Discussion in 'Console Technology' started by BRiT, Mar 17, 2020.

  1. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    2,505
    Likes Received:
    776
    I was pointing to the likes of klee, not you. You didn't make a show of it, and neither did I; I waited until just before. Yes, I got to hear it was between 9 and 10 TF, and the clock; I doubted him at first, but he could prove it.
    Could have been worse, yes, in the 8 TF range, but RDNA2 is clock-friendly, kinda.
     
  2. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    I'm still unsure why backwards compatibility would have capped the CU count. However, it seems reasonable that BC is one possible reason they didn't go with fewer.

    The PS5 is on the edge of 10 TF, which a more conservative approach would have dropped below. A downclock of more than 2.5% on the GPU drops it to 9.x.
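    As a back-of-the-envelope check (a minimal sketch in Python; the 36 CUs and 2.23 GHz peak are the announced figures, and 64 lanes x 2 FLOPs per clock per CU is standard for RDNA):

        # FP32 throughput in TF = CUs * 64 lanes * 2 FLOPs/clock * clock (GHz) / 1000
        def tflops(cus, clock_ghz):
            return cus * 64 * 2 * clock_ghz / 1000.0

        print(tflops(36, 2.23))          # ~10.28 TF at the announced peak clock
        print(tflops(36, 2.23 * 0.973))  # ~9.998 TF: a ~2.7% downclock already reads 9.x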

    Vega's whitepaper proposed a number of future directions that primitive shaders could take, although in that generation they ended up being discarded.
    Navi has something like the automatically generated culling shaders, although the numbers given fall short of what Vega claimed.
    Some of what Cerny discussed might be along the lines of those future directions for Vega. Mesh and primitive shaders do exist in a somewhat similar part of the overall pipeline, but without more details the amount of overlap can't yet be determined.

    Last-minute in this instance would be last-month or last-quarter for chip revisions. That process has long lead times. Just reaching for 10 TF might have been on their mind for longer.

    Seems like they haven't had the resources or the time to test broadly enough. This strikes me as a process that might have been waiting on final silicon or more recent steppings, plus any other components needed for full testing, like the SSD. These are the kinds of tests where real hardware is needed to accumulate appreciable testing hours, and that takes time and sufficient testers and units.
    As far as a sample goes, going by most hours played isn't really a statistically random sample, so it may not be fully representative. A random sample of 30 games and their overall rate of issues might be a decent indicator of how many games could be expected to work out of the box. Maybe some games like Stardew Valley and Anthem need to be profiled (two games known to hard-lock PS4s and PS4 Pros).
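    To put rough numbers on that (a minimal sketch, assuming a simple random sample; the normal approximation is crude when the rate is near 100%, but it shows why ~30 random titles says more about the whole library than the 100 most-played):

        import math

        # Rough 95% confidence interval for the library-wide compatibility
        # rate, given how many games in a random sample ran cleanly.
        def compat_interval(passed, sampled, z=1.96):
            p = passed / sampled
            margin = z * math.sqrt(p * (1 - p) / sampled)
            return max(0.0, p - margin), min(1.0, p + margin)

        # e.g. if 28 of 30 randomly chosen PS4 titles worked out of the box:
        print(compat_interval(28, 30))   # roughly (0.84, 1.0)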

    That sounds like it's past the comfort zone at least a bit for the hardware.

    The boost algorithm probably drops the skin-temp or thermal-capacity calculations from AMD's turbo algorithm, which are a likely source of much of the variability. The current-based monitoring and activity counters AMD uses would be the most likely origin of Sony's idealized SoC, which would be a somewhat conservative model based on the physical characterization of produced chips.
    There would be localized hot-spot modeling, but at that time scale overall temperature is likely of second-order importance to the algorithm versus the current and activity that cause a few units to experience accelerated heating in the space of a few ms or us.
    I think Sony would need to specify a cooler with a thermal resistance and dissipation capability that is sufficiently over-engineered to let the boost algorithm neglect temps outside of thermal trip scenarios.
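    In rough pseudocode, the deterministic part might look something like this (a sketch only: the budget, coefficients, and names are invented, and Sony's actual model isn't public):

        POWER_BUDGET_W = 180.0              # hypothetical SoC power budget
        BASE_W, PER_EVENT_W = 60.0, 0.5     # made-up model coefficients

        def modeled_power_w(activity_counts, clock_ghz):
            # Power estimated from activity counters against a fixed,
            # characterized "idealized" chip, not from measured temperature,
            # so every console computes the same answer for the same workload.
            dynamic = sum(activity_counts) * PER_EVENT_W
            return (BASE_W + dynamic) * (clock_ghz / 2.23) ** 2

        def pick_clock(activity_counts, f_max=2.23, step=0.01):
            f = f_max
            while f > 0 and modeled_power_w(activity_counts, f) > POWER_BUDGET_W:
                f -= step   # shed frequency until the modeled power fits
            return f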

    There would be 50% more NAND chips, though they would be of lower capacity each. Downside for capacity is that the next increment might not be reachable without bumping the capacity of all those chips together.
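    For illustration (assuming one 512 Gbit die per channel and a 12-channel layout versus 8 channels; the die size is an assumption, though it happens to match the 825 GB figure):

        GIB = 2**30   # NAND dice come in power-of-two sizes

        for chips in (8, 12):
            print(chips, "x 64 GiB =", round(chips * 64 * GIB / 1e9, 1), "GB")
        # 8 x 64 GiB = 549.8 GB; 12 x 64 GiB = 824.6 GB (~the 825 GB quoted)

        # The next increment means doubling every die at once:
        print(round(12 * 128 * GIB / 1e9, 1), "GB")   # ~1649.3 GB, nothing in between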

    The algorithm should be more stable than the twitchy mobile or Radeon algorithms, but I'm curious whether there are complexities in benchmarking performance based on instruction mix, like whether certain events or assets might trigger a burst of AVX-heavy code, rather than on total unit counts.
     
  3. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,691
    Likes Received:
    165
    Location:
    In the land of the drop bears
    I read what he said as: only 100 games have been tested with the boost mode, and all PS4/Pro games would be compatible from the start because it has profiles to exactly mimic them.
     
    disco_ likes this.
  4. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,208
    Likes Received:
    2,254
    Location:
    Australia
    Ironic that people finally got what they wanted, a super fast SSD and BC, but now they're showing so much concern over CUs, clocks, teraflops, etc. It only goes to show raw TF power and bandwidth are of the utmost importance after all; let's not lie to ourselves any more.
    PS5 is clearly a gimped design; it got its priorities wrong and tried hard to catch up to the competition. Sony is lucky it didn't dip to a single digit, marketing-wise, but the hardware hype is just not there, in fact it's on the depressing side. Maybe Mark Cerny should step down and let someone else take the reins of PS5 Pro? Yep, the Pro will be the only redemption if Sony go wild at it.
    Unless of course their exclusives look so good they eclipse the Series X games, then I'll have to readjust my judgement. But right now it's not ultra terrible, but there ain't no hype either.
     
  5. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    I'm curious about the coherence engine and cache scrubbers in the PS5, and how they slot into the pipeline versus partially resident or virtual texturing.
    Does this mean Sony tried to have a more active path to fulfilling virtualized or partially resident texture fetches, rather than purely falling back to lower-resolution assets?
    This seems like it's trying to solve a similar problem to the volatile flag in the PS4, where a flag on L2 cache lines holding compute data allowed an internal loop to invalidate only those lines in a shorter interval than a full cache flush. Apparently these scrubbers can invalidate data in many cache layers?
    In terms of latency, this could align with the clock speed and smaller numbers of CUs. Higher clocks can help resolve synchronization events faster, fewer CUs need less time to ramp after barriers release, and active cache scrubbing might reduce the impact of certain pipeline events that require cache invalidations or global stalls.
    Some of these latency elements could benefit primitive shaders, which seem to be one type of shader that RDNA's workgroup processor organization is meant to help. There's a serial element to where those shaders have traditionally been inserted in the pipeline, and higher clocks would shorten the delay before pixel shading ramps up.
    These could help in other scenarios where additional CUs wouldn't, though I am not sure how much that counters a broader GPU with more bandwidth. I'd be interested in seeing that kind of analysis, but I wonder if that kind of information sharing would be discouraged or subject to NDA.
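    As a toy model of what scrubbing buys over flushing (entirely hypothetical structure; none of the actual cache internals are public):

        # Toy cache: tag -> data. A full flush discards everything and stalls
        # everyone; a scrubber invalidates only the lines backed by the address
        # range that just changed (say, a freshly streamed-over texture page),
        # so unrelated wavefronts keep their cached data.
        class ToyCache:
            def __init__(self):
                self.lines = {}

            def flush_all(self):
                self.lines.clear()

            def scrub(self, lo, hi):
                self.lines = {tag: data for tag, data in self.lines.items()
                              if not (lo <= tag < hi)}

    The win is the same shape as the volatile flag's: invalidation cost scales with the stale region rather than with the whole cache.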

    The Tempest audio solution is similar in some respects to proposals in AMD patents to modify CUs for audio data, but sounds different in the removal of caches. I'd be interested in more detail on how that works: whether it's a total removal, and whether a single CU can be as flexibly programmed as 8 independent cores. A standard RDNA CU still has a decently large batch size compared to a CPU, for example.

    I'm somewhat more in favor of Sony leaving a user-expandable storage option with a bay that users can put an SSD into, versus Microsoft's slot in the back. While I'd expect dislodging to be rare, some kind of screw or tab that requires more purposeful access to remove an SSD while it is active would be nice, given how widely SSDs vary in how they handle abrupt power loss.
     
  6. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,375
    Likes Received:
    1,406
    I imagine MS's SSD route allows you to treat it like a memory card. The M.2 connector itself isn't designed for hot swapping.
     
  7. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    4,093
    Likes Received:
    2,316
    Someone please correct me if I'm wrong. Isn't the X-series considered the pro or premium model within the Xbox line of products? If so, then Lockhart (if available at launch) would be considered the next-gen entry-level Xbox, correct? If this is all true, wouldn't the PS5 simply be the next-gen entry-level system, not the Pro?

    The only reason I'm asking this is the revisioning (more like goalpost shifting) some (including some game journalists) are doing across boards, claiming the PS5 is the pro or premium model competing with the X-series in terms of performance. It seems quite ignorant to believe the PS5 is actually a Pro model, unless I missed something in Cerny's tech-dive.

    Anyhow, I personally believe Sony blew it. Way too many excuses for why the GPU has fewer CUs than their competitor's (if this is the Pro model), and the PR doublespeak at times was very off-putting. As a PC gamer and part-time console gamer, Sony really failed in my book. There is no denying their SSD technology sounds awesome; however, I still have to wonder whether their GPU tech (CU count) was really crippled by wanting BC, or whether they simply cheaped out in that area of the design. Personally, I would have dropped any BC plans if the process required me to stick mostly to the prior generation's GPU design or layout.

    One thing is for sure, Microsoft deserves all the credit and praises for pushing the GPU tech boundaries within the console space.
     
    #447 Shortbread, Mar 19, 2020
    Last edited: Mar 19, 2020
  8. Proelite

    Veteran Regular Subscriber

    Joined:
    Jul 3, 2006
    Messages:
    1,453
    Likes Received:
    800
    Location:
    Redmond
    I would argue BC is the part they got the least right. Everything else, except for GDDR6 bandwidth, is a great baseline for next gen.
     
    milk and goonergaz like this.
  9. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    391
    Likes Received:
    32
    Or MS simply aimed a lot higher for CU count than Sony could ever have expected. It is too early to assume Sony has "failed" without knowing the launch prices of these consoles, as well as marketing and exclusive titles. If the console war were won on specs alone, the Xbox One X should have turned around MS's fortunes, but it didn't.
     
    egoless and milk like this.
  10. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    4,093
    Likes Received:
    2,316
    Mind you, I'm not talking about "sales failure," but a hardware comparison to Microsoft's offerings. Yes the SSD tech sounds great, but everything else is meh.
     
    PSman1700 and disco_ like this.
  11. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,044
    Likes Received:
    6,338
    On other forums? Because for the people here, the same people that were most interested in audio, storage and BC are still most interested in audio, storage, and BC.

    Speaking for myself...
    • MS has the obvious advantage WRT BC. They've gone far beyond my expectations.
    • Sony is ahead by default on Audio as MS hasn't revealed anything other than 3D hardware audio. And Sony have gone beyond what I expected as well.
      • Considering that Ninja Theory is one of MS's internal studios now, one would hope they haven't skimped on the 3D hardware audio, but even if it was a priority for them, it's highly unlikely they've been trying to come up with a solution that can handle potentially thousands of sound sources.
    • Sony is ahead WRT storage speeds. This one is the closest between the two.
      • Both of them are prioritizing immediate on demand low latency access to the storage pool.
      • Sony's is obviously the faster solution.
      • MS claims a guaranteed performance level at all times for their SSD solution. It's uncertain at this moment whether Sony's number is a peak, and whether it is affected by thermal throttling of the NAND chips similar to high-speed PC NVMe drives.
        • However, even if the Sony solution thermally throttles, it's still likely going to be comfortably faster than the MS solution.
    As far as I'm concerned, the GPU specs are close enough that they're not terribly interesting to discuss. MS have a slight advantage here in that developers can utilize both the CPU and GPU to maximum effect at all times, while on PS5 they will have to juggle CPU versus GPU loads. That said, I don't think this will result in any major differences in presentation. Even at 9.2 TF, it wouldn't have been an interesting discussion for me.

    Regards,
    SB
     
    goonergaz and London-boy like this.
  12. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    391
    Likes Received:
    32
    Well, yeah, MS managed to put in a GPU with a higher CU count, but other than that, their architectural designs and capabilities are very similar. XBSX is no doubt a very impressive showing, but PS5 isn't really a disappointment. I expect the difference between these two consoles to be smaller than XB1X vs PS4 Pro.
     
    egoless and London-boy like this.
  13. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,363
    Likes Received:
    3,944
    Location:
    Well within 3d
    The Digital Foundry article described the SSD setup as having PCIe links to the internal SSD and a PCIe link to the expansion connector. The expansion card is described as being physically heavier, and one possible reason is to prevent performance degradation due to overheating of the controller logic. If the controller's in the card, then absent hot-swapping logic or the design features found in drives rated to handle power loss safely, an SSD pulled while active runs a high risk of total data loss or even bricking.

    It hasn't been explained why BC would limit the maximum number of CUs.
    One way of looking at things is that there have been two Sony APUs that had more than 36 CUs: the PS4 Pro and PS5. Each physically has 40 CUs, and the hardware is inherently capable of using all of them, but is instructed by fuses or firmware to ignore four.
    Up until certain limits that AMD has discussed for GCN, I only see BC providing a floor value where there would be problems if the PS5 had fewer.

    However, the PS5 has a 256-bit bus, which is a design decision that would have been set in the same phase as CU count, and that points to size or cost considerations for a lower overall target. Why their target was where it was could have various reasons, given their projections in years past.
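    For scale (a quick sketch; 14 Gbps is the announced GDDR6 speed, with Series X's 320-bit bus for comparison):

        # Peak GDDR6 bandwidth in GB/s = bus width in bytes * per-pin rate in Gbps
        def bandwidth_gbs(bus_bits, gbps_per_pin):
            return bus_bits / 8 * gbps_per_pin

        print(bandwidth_gbs(256, 14))   # 448 GB/s: PS5 as announced
        print(bandwidth_gbs(320, 14))   # 560 GB/s: Series X's wider bus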
     
  14. Xbat

    Veteran Newcomer

    Joined:
    Jan 31, 2013
    Messages:
    1,389
    Likes Received:
    967
    Location:
    A farm in the middle of nowhere
    Also we don't know the price of the two consoles.
    I find it perplexing that they didn't have a tech demo to demonstrate what is possible with the storage speed.
     
  15. bgroovy

    Regular Newcomer

    Joined:
    Oct 15, 2014
    Messages:
    799
    Likes Received:
    626
    No one is really going to need a second SSD day one. It's not really worth fretting about the holiday 2020 price of such drives. In 2023 it'll probably be a pretty cost effective upgrade.
     
    egoless and goonergaz like this.
  16. Tkumpathenurpahl

    Tkumpathenurpahl Oil my grapes.
    Veteran Newcomer

    Joined:
    Apr 3, 2016
    Messages:
    1,546
    Likes Received:
    1,384
    The least right? Are you jumping on the outlandish train that thinks they already have confirmation the PS5's BC will extend to only 2% of the PS4's library?

    They've gotten the communication over it the least right, definitely, but BC itself? We don't have enough details to make that call yet. If Sony come along and clarify that the only PS4 games that work on the PS5 are Fortnite, GTAV, CoD, and (fingers crossed) Life of the Black Tiger, then I will agree with you. But it's too early to tell if that's the case.

    Slightly OT, but hopefully this will factor into encouraging Rockstar to invest some of their $1 billion annual GTAV revenue into higher resolution versions for the Pro and X1X. Piss takers.

    The PS5's bandwidth is pretty shit though, I completely agree there. A less powerful GPU is fine, and it would've made for some interesting DF comparisons: fewer CUs but considerably higher clocked. As it is, those fewer CUs will be getting less bandwidth too. Makes for boring tech discussions that quickly descend into fanboys blathering on about why their more/less powerful toaster is actually, technically better, because they have special bread and they like their toast that way.
     
    London-boy likes this.
  17. Tkumpathenurpahl

    Tkumpathenurpahl Oil my grapes.
    Veteran Newcomer

    Joined:
    Apr 3, 2016
    Messages:
    1,546
    Likes Received:
    1,384
    Only according to rumours. We've yet to see any real evidence that Lockhart actually exists. From the credible information we have, the PS5 and the XSX are the base consoles.

    Microsoft are in a pretty good position right now: more powerful, and probably not much more expensive, if at all. The only way they could mess it up, IMO, would be to actually release Lockhart. If they announce it, I will literally piss myself laughing that they managed to squander their standing.
     
    egoless and upnorthsox like this.
  18. Betanumerical

    Veteran

    Joined:
    Aug 20, 2007
    Messages:
    1,691
    Likes Received:
    165
    Location:
    In the land of the drop bears
    If it's like the SPU, as Cerny claims, then it won't have automated caches but instead a programmer-controlled local store, which should be good for transistor count but a bit of a headache, as they realised on the PS3. Maybe they'll abstract it away this time. I can see it working better as an audio processor vs the generalised Cell, because audio is inherently stream-based, no? (See the sketch below.)
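    A sketch of why streams suit a local store (hypothetical block size and helper names, not Sony's actual Tempest interface): with explicitly managed memory you can double-buffer, fetching block n+1 while the kernel works on block n, and audio's in-order, touch-once access pattern hides essentially all of the latency.

        BLOCK = 256   # samples resident in the hypothetical local store

        def dma_fetch(buf, off, n):
            # stand-in for a DMA engine copying main memory -> local store
            return buf[off:off + n]

        def process_stream(samples, kernel):
            out = []
            current = dma_fetch(samples, 0, BLOCK)
            for off in range(BLOCK, len(samples) + BLOCK, BLOCK):
                nxt = dma_fetch(samples, off, BLOCK)  # prefetch the next block
                out.extend(kernel(current))           # work on the resident one
                current = nxt
            return out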
     
  19. Mitchings

    Newcomer

    Joined:
    Mar 13, 2013
    Messages:
    113
    Likes Received:
    172
    As posters above mentioned, Cerny didn't say only 100 games will be available; he gave an example in which the top 100 most-played PS4 games were tested and nearly all of them worked.

    Presumably firmware may need to be updated and/or individual games patched to work fully without bugs. Considering the low-level APIs used for PS4 development, this is perfectly reasonable. Hopefully Sony stay on top of it.
     
    #459 Mitchings, Mar 19, 2020
    Last edited: Mar 19, 2020
    egoless and Rootax like this.
  20. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,411
    Likes Received:
    2,715
    Question: in the presentation, Cerny mentioned that they made sure the SSD maintains consistent speed and performance after everything is processed, whereas normally, as data passes through the various steps, the throughput the game finally sees is much lower.
    Is this something unique to the PS5?
    MS reports an I/O throughput figure, but is that the actual rate after all of those steps?
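    For reference, the figures both vendors have quoted publicly (a sketch; the "effective" numbers are typical-compression claims, not guarantees):

        # Raw SSD link speed vs. quoted post-decompression throughput
        for name, raw_gbs, eff_gbs in (("PS5 (Kraken)", 5.5, "8-9"),
                                       ("XSX (BCPack)", 2.4, "4.8")):
            print(name, raw_gbs, "GB/s raw ->", eff_gbs, "GB/s typical effective")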
     
    PSman1700 likes this.