Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

I posted details a few pages back.

Here's my post where I started discussing this:
https://forum.beyond3d.com/posts/2178977/

Poster @Digidi summarised the driver leaks here:
https://forum.beyond3d.com/posts/2176653/

Poster @tinokun made a nice table, below:
Code:
                Property Navi10 Navi14 Navi12 Navi21Lite Navi21 Navi22 Navi23 Navi31
                  num_se      2      1      2          2      4      2      2      4
           num_cu_per_sh     10     12     10         14     10     10      8     10
           num_sh_per_se      2      2      2          2      2      2      2      2
           num_rb_per_se      8      8      8          4      4      4      4      4
                num_tccs     16      8     16         20     16     12      8     16
                num_gprs   1024   1024   1024       1024   1024   1024   1024   1024
         num_max_gs_thds     32     32     32         32     32     32     32     32
          gs_table_depth     32     32     32         32     32     32     32     32
       gsprim_buff_depth   1792   1792   1792       1792   1792   1792   1792   1792
   parameter_cache_depth   1024    512   1024       1024   1024   1024   1024   1024
double_offchip_lds_buffer     1      1      1          1      1      1      1      1
               wave_size     32     32     32         32     32     32     32     32
      max_waves_per_simd     20     20     20         20     16     16     16     16
max_scratch_slots_per_cu     32     32     32         32     32     32     32     32
                lds_size     64     64     64         64     64     64     64     64
           num_sc_per_sh      1      1      1          1      1      1      1      1
       num_packer_per_sc      2      2      2          2      4      4      4      4
                num_gl2a    N/A    N/A    N/A          4      4      2      2      4
                unknown0    N/A    N/A    N/A        N/A     10     10      8     10
                unknown1    N/A    N/A    N/A        N/A     16     12      8     16
                unknown2    N/A    N/A    N/A        N/A     80     40     32     80
      num_cus (computed)     40     24     40         56     80     40     32     80
                Property Navi10 Navi14 Navi12 Navi21Lite Navi21 Navi22 Navi23 Navi31

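As a sanity check on the table, the "num_cus (computed)" row follows from the other fields. A minimal sketch of that arithmetic, assuming total CUs = num_se × num_sh_per_se × num_cu_per_sh:

Code:
# Toy check of the "num_cus (computed)" row from the leaked driver fields.
# Assumption: total CUs = num_se * num_sh_per_se * num_cu_per_sh.
gpus = {
    "Navi10":     {"num_se": 2, "num_sh_per_se": 2, "num_cu_per_sh": 10},
    "Navi14":     {"num_se": 1, "num_sh_per_se": 2, "num_cu_per_sh": 12},
    "Navi21Lite": {"num_se": 2, "num_sh_per_se": 2, "num_cu_per_sh": 14},
    "Navi21":     {"num_se": 4, "num_sh_per_se": 2, "num_cu_per_sh": 10},
}

for name, p in gpus.items():
    cus = p["num_se"] * p["num_sh_per_se"] * p["num_cu_per_sh"]
    print(f"{name}: {cus} CUs")   # -> 40, 24, 56, 80, matching the num_cus row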
There was a Tweet by a famous leaker, Yuko Yoshida (@KityYYuko):

Code:
XSX
Front-End: RDNA 1
Render-Back-Ends: RDNA 2
Compute Units: RDNA1
RT: RDNA2

Navi21 Lite is considered to be the XSX, and the driver lets us compare it to RDNA1 and RDNA2 GPUs. The front end for the XSX matches RDNA1 in Scan Converters and Packers per Scan Converter (rasterisation), and the SIMD wave count (CUs) is RDNA1 for the XSX but changes for the RDNA2 GPUs (Navi2x). Navi21 Lite (XSX) has the same Render Backends per Shader Engine as RDNA2.
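To make that comparison concrete, here is a rough cross-check against the table above (assuming the leaked values are accurate), with Navi10 standing in for RDNA1 and Navi21 for RDNA2:

Code:
# Rough sketch: classify Navi21Lite fields as RDNA1-like or RDNA2-like
# using the leaked table values above.
navi10     = {"num_rb_per_se": 8, "max_waves_per_simd": 20, "num_packer_per_sc": 2}  # RDNA1 reference
navi21     = {"num_rb_per_se": 4, "max_waves_per_simd": 16, "num_packer_per_sc": 4}  # RDNA2 reference
navi21lite = {"num_rb_per_se": 4, "max_waves_per_simd": 20, "num_packer_per_sc": 2}  # XSX candidate

for field, value in navi21lite.items():
    side = []
    if value == navi10[field]:
        side.append("RDNA1 (Navi10)")
    if value == navi21[field]:
        side.append("RDNA2 (Navi21)")
    print(f"{field} = {value}: matches {' and '.join(side)}")
# num_rb_per_se      -> matches RDNA2 (render backends)
# max_waves_per_simd -> matches RDNA1 (CU/SIMD side)
# num_packer_per_sc  -> matches RDNA1 (rasteriser front end)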
I'm not seeing the connection to XSX. Or, better worded, I mean the literal connection.

So the fact that it's labelled Navi 21 Lite and found in OSX drivers tells me that the product exists in AMD's lineup. While that could very well be what the XSX is based upon, the driver entries alone don't imply that it is.
With regard to the driver, or even that product, AMD may have positioned it to be specifically compute-heavy, cutting back on the front end to cater to that market's needs.

I don't see this as a sure-fire "Navi 21 Lite is the XSX, therefore all these other claims now apply."

If that makes sense. Aside from one claim that seems disputed by RGT, I can't offer much more commentary. I know of no method to declare what makes a CU RDNA2 versus RDNA1, and it's unlikely you could pull in just the RT unit without the whole CU coming with it. I get that we do armchair engineering here, but this is an extremely far stretch. MS wasn't even willing to shrink their processors further and instead upgraded to Zen 2 because it would be cheaper. The consoles are semi-custom, not full custom. They are allowed to mix and match hardware blocks as they require, but it's clear there are limitations. If you know the exact specifications you can share them, but I don't.

Relatedly, claims like the front end being RDNA 1 are weird given that mesh shaders are part of that front end; the GCP needs to be outfitted with a way to support mesh shaders. The XSX also supports the NGG geometry pipeline as per the leaked documentation (which as of June was not ready), so once again, I'm not sure what would make it RDNA1 versus RDNA2.
 
A better source: a dev with 24 AAA games released.



https://medium.com/@mattphillips/te...-of-comparing-videogame-consoles-4207d3216523

He probably knew about the surprising performance of the PS5 compared to the XSX and wrote this tweet and Medium article. Maybe in two years the XSX will end up a bit above the PS5, but in the end the gap will be small.

Since console GPUs started being based on PC GPUs, we have never had a generation where both platform holders decided to make consoles that are easy for developers to use and centered on games, while also targeting the same MSRP.

No CPU that is difficult to program like the Cell paired with a last-minute GPU change, no hardware designed around Kinect, and nothing like the PS4 Pro and XB1X with a one-year gap in release dates and a 100 dollar difference in MSRP.

And the APUs are made by the same supplier, so nothing about the PS5 and XSX situation is that surprising.

So it appears that he's saying that the single most important thing that will differentiate titles' performance this generation is the tools (programming environment, profilers, etc.) and not the hardware. The hardware just has to be modern and good enough. Interesting.

I can't say that I necessarily disagree, and going by what DF have heard, the PS5 thus far has the easier to use and more robust suite of tools available to developers. This should make it significantly easier to extract performance out of the PS5 versus the XBS-X/S.

I'm sure someone will fire back with...but the Dirt 5 developer said... This doesn't run contrary to what was just said. The XBS-X/S tools may be good in isolation, perhaps even better than the outgoing XDK, but if all the other developers are to be believed, they're still not nearly as good as the tools available for the PS5.

Whether the GDK can catch up remains to be seen. I'm somewhat doubtful it will ever make extracting performance as easy as the PS5 tools do, however, due to the need to support easy cross-platform development between Xbox, PC, and any other potential platforms. And of course, Sony isn't going to stop improving their dev tools.

The only way I see Microsoft's GDK advancing in larger leaps than the PS5 tools is if the GDK is missing large chunks of functionality that aren't missing in the PS5 dev environment.

The end result is that the XBS-X needs higher performance in order to make up for the deficit in dev tools, so cross-platform games may end up being relatively similar throughout the generation, with the PS5 possibly performing slightly better due to better tools.

Regards,
SB
 
Seeing the PS5 vs XSX real-world performance, is there any possibility MS can slightly upclock the XSX GPU? On today's hardware it may be impossible, but in a future revision I think yes... as was done for the One S vs the original One.
 
I'm sure it's possible in future hardware revisions, but it's important to remember that the theoretical and real world performance of PS5 and XSX are fairly close. Much closer than PS4 and XBO ever were. Honestly, I'm doubtful most people could tell them apart in blind tests at this point, outside maybe Dirt's poor showing and lower settings on Xbox. I don't know if it's really worth MS's time to give a mild performance boost to their hardware in this case.
 
Well, the thermal dissipation of the XSX is so good that I think, in that respect, this could be done even on hardware already sold, with maybe a few units failing here and there (units that could easily be replaced)... I think MS was very sure it would be superior in performance to the PS5. I don't know if an upclock is actually technically feasible on today's HW via firmware...
 
It's certainly possible; even Cerny in his presentation said they could have gone to a higher frequency for their GPU.
But it would still reduce the reliability of the consoles in the end.
They are good for now; these are just rushed launch titles.
Great things are coming from both.
 
Seeing the trouble quite a few PS5 users are having, I think Sony pushed its silicon frequency quite a lot. On the other side, the XSX seems totally reliable... and silent.

Imho MS has been quite conservative on this; seeing the situation, I think MS could (and should) give the frequencies a boost... maybe 10%.
 
Even the ability to upclock the GPU by 5% could have a huge impact on XSX GPU performance.
But does anyone know of a precedent for this?

e.g. a manufacturer increasing clocks on a product after launch?
It seems like the realm of fantasy to me!
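For a rough sense of scale, here is the FLOPs arithmetic for a hypothetical 5% or 10% bump, assuming the public XSX figures (52 CUs, 1825 MHz); this says nothing about whether such a change would be feasible via firmware:

Code:
# Hypothetical arithmetic only: what a 5% / 10% GPU upclock would mean in TFLOPs.
# Assumes the public XSX figures: 52 CUs at 1825 MHz, 128 FP32 FLOPs per CU per clock.
CUS = 52
FLOPS_PER_CU_PER_CLOCK = 64 * 2   # 64 FP32 lanes x 2 ops (FMA)
BASE_CLOCK_GHZ = 1.825

for bump in (0.00, 0.05, 0.10):
    clock_ghz = BASE_CLOCK_GHZ * (1 + bump)
    tflops = CUS * FLOPS_PER_CU_PER_CLOCK * clock_ghz / 1000
    print(f"+{bump:.0%}: {clock_ghz * 1000:.0f} MHz -> {tflops:.2f} TF")
# +0%: 1825 MHz -> 12.15 TF, +5%: 1916 MHz -> 12.75 TF, +10%: 2008 MHz -> 13.36 TF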
 
There are software bugs on PS5 but no overheating issue.
I remember having a lot of blue screen errors on PS4 at launch with BF4 :runaway:
 
Well, the One S is an upclocked version of the original One (the silicon is different, being 16 nm vs 28 nm).
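For reference, using the widely reported GPU clocks for the two machines (853 MHz and 914 MHz, assumed here), that bump works out to roughly 7%:

Code:
# The One S precedent in numbers (widely reported clocks; treat as approximate).
ONE_GPU_MHZ   = 853   # original Xbox One GPU clock
ONE_S_GPU_MHZ = 914   # Xbox One S GPU clock
print(f"One S upclock: {ONE_S_GPU_MHZ / ONE_GPU_MHZ - 1:.1%}")   # ~7.2%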
 
Honestly, I'm doubtful most people could tell them apart in blind tests at this point, outside maybe Dirt's poor showing and lower settings on Xbox. I don't know if it's really worth MS's time to give a mild performance boost to their hardware in this case.
So a game supporting four Xbox hardware configurations and offering 120fps on two of them, which has a few missing plants and a minor LOD issue, is a "poor showing"? Come on...
 
I think the differences in architectures are being looked at from an incorrect perspective. I don't believe it is correct to say Series X has more compute units per shader array; it should be:

Series X has 33% fewer shader arrays per compute unit compared to PS5.

To make it more complete: Series X has 33% fewer shader arrays per compute unit compared to PS5, and those shader arrays are operating at an 18% lower frequency compared to PS5.

That might sound weird at first, but it is in line with what everybody outside of MS and its fans has been saying: that the 12TF number is not a real measurement of actual game performance. There are around 45% more compute units on Series X, though, which is why it is able to keep up with PS5 games as well as it does, only showing lower actual resolution and performance in some scenes.

To me this makes a lot more sense than "MS has bad tools, developers don't know how to utilise 12TF yet" and so on, as has been heard on many forums by now.

Just my 2 cents. Or rather, my 49900 cents :D
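For what it's worth, those percentages can be roughly reproduced from the public configurations (52 CUs / 4 shader arrays / 1825 MHz for Series X and 36 CUs / 4 shader arrays / 2233 MHz for PS5, assumed here); the exact values come out closer to 31% and 44% than 33% and 45%:

Code:
# Reproducing the post's percentages from the public configurations (assumed here).
xsx = {"cus": 52, "shader_arrays": 4, "clock_mhz": 1825}   # public Series X figures
ps5 = {"cus": 36, "shader_arrays": 4, "clock_mhz": 2233}   # public PS5 figures (max clock)

arrays_per_cu_xsx = xsx["shader_arrays"] / xsx["cus"]
arrays_per_cu_ps5 = ps5["shader_arrays"] / ps5["cus"]

print(f"shader arrays per CU: {1 - arrays_per_cu_xsx / arrays_per_cu_ps5:.0%} fewer on XSX")   # ~31%
print(f"GPU clock: {1 - xsx['clock_mhz'] / ps5['clock_mhz']:.0%} lower on XSX")                # ~18%
print(f"CU count: {xsx['cus'] / ps5['cus'] - 1:.0%} more on XSX")                              # ~44%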
 
Or their potential performance is just similar, and teraflops don't tell the whole story.

Is it really that hard to wait until we have the full picture? To me it makes far more sense that it's the drivers, the middleware/toolset, or poorly ported games, rather than assuming it's the hardware's fault.

If a title runs with an unexpectedly low power footprint, either the XSX requires far less power than we've ever assumed it would, or the freaking console is idling. Personally, I consider it extremely unlikely that the console is doing that because it's "stalled" somewhere.
 
Seeing the PS5 vs XSX real-world performance, is there any possibility MS can slightly upclock the XSX GPU? On today's hardware it may be impossible, but in a future revision I think yes... as was done for the One S vs the original One.
I think the XSX's biggest issue is the memory set-up, not the TFlops.
Two developers complained about the "interleaved" memory publicly:
[attached screenshot]

And also the Crytek developer; both have since deleted their statements.
 

Was that in regards to the Series X?

Not really the same thing, but remember the GTX 970 having 3.5GB + 0.5GB? That bottleneck was mitigated with drivers.
 
So developers are lamenting that the console designed to run at 1/4 of the resolution will need to run at 1/4 of the resolution to keep up?
 
He was talking about Series S, but it applies to Series X too

I don't think it applies to the Series X. On the Series S it is true that if the slower 2GB isn't enough to hold the information the CPU needs, CPU traffic can start eating away at the bandwidth the GPU needs. I guess you either have the option of lowering the resolution and applying reconstruction techniques to increase pixel count, or lowering the quality and size of assets. 2GB of system/slow memory seems very small; maybe a 6/4 split would have been better.
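A crude way to picture the contention being described, using the public Series S bandwidth figures; the simple subtraction is a toy illustration, not how the real memory arbitration works:

Code:
# Toy illustration of the bandwidth-contention worry (public figures assumed;
# real GDDR6 arbitration is far more complicated than straight subtraction).
SERIES_S_FAST_GBPS = 224   # 8 GB "GPU optimal" pool
SERIES_S_SLOW_GBPS = 56    # 2 GB slower pool

def gpu_bandwidth_left(cpu_spill_gbps):
    """Rough GPU headroom in the fast pool if CPU traffic spills into it."""
    return max(SERIES_S_FAST_GBPS - cpu_spill_gbps, 0)

for spill in (0, 20, 40):
    print(f"CPU using {spill} GB/s of the fast pool -> ~{gpu_bandwidth_left(spill)} GB/s left for the GPU")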
 