Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Discussion in 'Console Technology' started by Proelite, Mar 16, 2020.

Thread Status:
Not open for further replies.
  1. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,019
    Likes Received:
    15,763
    Location:
    The North
    I'm not seeing the connection to XSX. Or, better worded, I mean the literal connection.

    So the fact that it's labelled as Navi 21 Lite and found in OSX drivers tells me that the product exists in the AMD line. While that could very well be what the XSX is based upon, the driver entry alone doesn't prove that it is.
    Wrt the driver, or even that product: AMD may have positioned it to be specifically compute-heavy, cutting back on the front end to cater to that market's needs.

    I don't see this as a surefire "Navi 21 Lite is XSX, therefore all these other claims now apply."

    If that makes sense. Aside from one claim that seems disputed by RGT, I can't offer much more commentary. I know of no method to declare what makes a CU RDNA 2 versus RDNA 1. The likelihood that you can pull out just the RT unit without the whole CU coming with it is low. I get that we do armchair engineering here, but this is an extremely far stretch. MS wasn't even willing to shrink its processors further and instead upgraded to Zen 2 because it would be cheaper. The consoles are semi-custom, not full custom. They are allowed to mix and match hardware blocks as they require, but it's clear there are limitations. If you know the exact specifications you can share them, but I don't.

    Typically, a claim like "the front end is RDNA 1" is weird given that mesh shaders are part of that front end. The GCP needs to be outfitted with a way to support mesh shaders. The XSX also supports the NGG geometry pipeline as per the leaked documentation (which as of June was not ready), so once again, I'm not sure what would constitute RDNA 1 versus RDNA 2 here.
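    For reference on what "supporting mesh shaders" looks like from the API side: on PC, D3D12 exposes it as an explicit capability bit. Below is a minimal sketch of that desktop-side query (standard D3D12 only; the console GDK equivalents are NDA'd, so treating them as similar is an assumption):

    ```cpp
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a default D3D12 device at feature level 12_0.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device)))) {
            std::printf("No D3D12 device available.\n");
            return 1;
        }

        // Mesh shader support is reported through the OPTIONS7 feature struct;
        // tier 0 means unsupported, tier 1 means supported.
        D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                                  &opts7, sizeof(opts7)))) {
            std::printf("Mesh shader tier: %d\n",
                        static_cast<int>(opts7.MeshShaderTier));
        }
        return 0;
    }
    ```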
     
    milk, tinokun, function and 7 others like this.
  2. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    17,702
    Likes Received:
    7,705
    So it appears he's saying that the single most important thing differentiating titles' performance this generation is the tools (programming environment, profilers, etc.) and not the hardware. The hardware just has to be modern and good enough. Interesting.

    I can't say I necessarily disagree, and going by what DF have heard, the PS5 thus far has the easier-to-use and more robust suite of tools available to developers. This should make it significantly easier to extract performance from the PS5 versus the XBS-X/S.

    I'm sure someone will fire back with "...but the Dirt 5 developer said..." That doesn't run contrary to what was just said. The XBS-X/S tools may be good in isolation, perhaps even better than the outgoing XDK, but if all the other developers are to be believed, they're still not nearly as good as the tools available for the PS5.

    Whether the GDK can catch up remains to be seen. I'm somewhat doubtful it will ever make extracting performance as easy as the PS5 tools do, however, due to the need to support easy cross-platform development between Xbox, PC, and any other potential platforms. And of course, Sony isn't going to stop improving its dev tools.

    The only way I see Microsoft's GDK advancing in larger leaps than the PS5 tools is if the GDK is missing large chunks of functionality that aren't missing in the PS5 dev environment.

    The end result is that the XBS-X needs higher performance in order to make up the deficit in dev tools, so cross-platform games may end up being relatively similar throughout the gen, with the PS5 possibly performing slightly better due to better tools.

    Regards,
    SB
     
    pjbliverpool likes this.
  3. HBRU

    Regular Newcomer

    Joined:
    Apr 6, 2017
    Messages:
    837
    Likes Received:
    180
    Seeing PS5 vs XSX real-world performance, is there any possibility MS can slightly upclock the XSX GPU? On today's HW it may be impossible, but in a future revision I think yes... as was done for the One S vs the original One.
     
  4. see colon

    see colon All Ham & No Potatos
    Veteran

    Joined:
    Oct 22, 2003
    Messages:
    1,995
    Likes Received:
    1,072
    I'm sure it's possible in future hardware revisions, but it's important to remember that the theoretical and real-world performance of the PS5 and XSX are fairly close. Much closer than the PS4 and XBO ever were. Honestly, I'm doubtful most people could tell them apart in blind tests at this point, outside maybe Dirt's poor showing and lower settings on Xbox. I don't know if it's really worth MS's time to give its hardware a mild performance boost in this case.
     
    HBRU likes this.
  5. HBRU

    Regular Newcomer

    Joined:
    Apr 6, 2017
    Messages:
    837
    Likes Received:
    180
    Well, the thermal dissipation of the XSX is so good that I think, in that respect, this could be done even on today's HW already sold, with maybe a few units failing here and there (units that could easily be replaced)... I think MS was quite sure it would be superior in performance vs the PS5. I don't know if an upclock is actually technically feasible on today's HW via firmware...
     
  6. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    3,735
    Likes Received:
    3,665
    Location:
    France
    It's certainly possible. Even Cerny said in his presentation that they could have gone to a higher frequency for their GPU.
    But it would still reduce the reliability of the consoles in the end.
    They are good for now; these are just rushed launch titles.
    Great things are coming from both.
     
  7. HBRU

    Regular Newcomer

    Joined:
    Apr 6, 2017
    Messages:
    837
    Likes Received:
    180
    Seeing the trouble quite a few PS5 users are having, I think Sony pushed its silicon frequency quite a lot. The XSX, on the other hand, seems totally reliable... and silent.

    Imho MS has been quite conservative here; seeing the situation, I think MS could (and should) give the frequencies a boost... maybe 10%.
     
  8. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    117
    Likes Received:
    72
    Location:
    Melbourne Aus.
    Even the ability to upclock the GPU by 5% could have a huge impact on XSX GPU perf.
    BUT, does anyone know of a precedent for this?

    E.g. a manufacturer increasing clocks on a product AFTER launch?
    Seems like the realm of fantasy to me!
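    For scale, a back-of-the-envelope calculation of what a 5% uplift would mean, using the public Series X figures (52 CUs at a fixed 1825 MHz). This is just peak-FLOPS arithmetic, not a claim about real game performance:

    ```cpp
    #include <cstdio>

    int main() {
        // Public Series X GPU figures: 52 CUs, 64 FP32 lanes per CU,
        // 2 ops per lane per clock (FMA), fixed 1825 MHz clock.
        const double cus = 52, lanes = 64, ops = 2, mhz = 1825;
        const double tflops = cus * lanes * ops * mhz * 1e6 / 1e12;
        std::printf("Stock:     %.2f TFLOPS\n", tflops);         // ~12.15
        std::printf("+5%% clock: %.2f TFLOPS\n", tflops * 1.05); // ~12.76
        return 0;
    }
    ```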
     
  9. Karamazov

    Veteran Regular

    Joined:
    Sep 20, 2005
    Messages:
    3,735
    Likes Received:
    3,665
    Location:
    France
    There are software bugs on the PS5, but no overheating issues.
    I remember having a lot of blue screen errors on the PS4 at launch with BF4 :runaway:
     
    rekator likes this.
  10. ToTTenTranz

    Legend Veteran

    Joined:
    Jul 7, 2008
    Messages:
    12,065
    Likes Received:
    7,029
    Or their potential performance is just similar, and teraflops don't tell the whole story.
     
    mr magoo and London Geezer like this.
  11. HBRU

    Regular Newcomer

    Joined:
    Apr 6, 2017
    Messages:
    837
    Likes Received:
    180
    Well, the One S is an upclocked version of the original One (the silicon is different, being 16 nm vs 28 nm).
     
  12. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,904
    Likes Received:
    11,015
    Location:
    London, UK
    So a game supporting four Xbox hardware configurations and offering 120 fps on two of them, which has a few missing plants and a minor LOD issue, is a "poor showing"? Come on...
     
    BRiT and JPT like this.
  13. BillSpencer

    Regular Newcomer

    Joined:
    Nov 18, 2020
    Messages:
    304
    Likes Received:
    117
    I think the differences in architectures are being looked at from an incorrect perspective. I don't believe it is correct to say the Series X has more compute units per shader array; it should be:

    The Series X has 33% fewer shader arrays per compute unit compared to the PS5.

    To make it more complete: the Series X has 33% fewer shader arrays per compute unit compared to the PS5, and those shader arrays are operating at an 18% lower frequency compared to the PS5.

    That might sound weird at first, but it is in line with what everybody outside of MS and its fans has been saying: that the 12 TF number is not a real measurement of actual game performance. There are around 45% more compute units on the Series X, though, which is why it is able to keep up with PS5 games as well as it does, only showing lower actual resolution and performance in some scenes.

    To me this makes a lot more sense than "MS has bad tools, developers don't know how to utilise 12 TF yet" and so on, as has been heard on many forums by now.

    Just my 2 cents. Or rather, my 49900 cents :D
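    A quick sanity check of those percentages against the commonly cited configurations (PS5: 36 active CUs in 4 shader arrays at up to 2233 MHz; Series X: 52 active CUs in 4 shader arrays at a fixed 1825 MHz). The array counts come from public die analyses, so treat them as assumptions; the arithmetic lands a touch below the post's rounding:

    ```cpp
    #include <cstdio>

    int main() {
        // Commonly cited figures: active CUs, shader arrays, GPU clock (MHz).
        const double ps5_cus = 36, ps5_arrays = 4, ps5_mhz = 2233; // PS5 peak
        const double xsx_cus = 52, xsx_arrays = 4, xsx_mhz = 1825; // XSX fixed

        // Shader arrays per CU: (4/52) vs (4/36) -> ~31% fewer on the XSX.
        std::printf("Arrays per CU: %.0f%% fewer on XSX\n",
                    (1.0 - (xsx_arrays / xsx_cus) / (ps5_arrays / ps5_cus)) * 100);
        // Clock: 1825 vs 2233 MHz -> ~18% lower on the XSX.
        std::printf("Clock: %.0f%% lower on XSX\n",
                    (1.0 - xsx_mhz / ps5_mhz) * 100);
        // CU count: 52 vs 36 -> ~44% more on the XSX.
        std::printf("CUs: %.0f%% more on XSX\n",
                    (xsx_cus / ps5_cus - 1.0) * 100);
        return 0;
    }
    ```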
     
  14. Nisaaru

    Veteran Regular

    Joined:
    Jan 19, 2013
    Messages:
    1,083
    Likes Received:
    363
    Is it really that hard to wait until we have the full picture? To me it makes far more sense that it's the drivers, the middleware/toolset, or poorly ported games rather than the hardware's fault.

    If a title runs with an unexpectedly low power footprint, then either the XSX requires far less power than we've ever assumed, or the freaking console is idling. Personally, I consider the former extremely unlikely; more likely the console is idling because it's "stalled" somewhere.
     
    Johnny Awesome likes this.
  15. Insight

    Newcomer

    Joined:
    Sep 30, 2020
    Messages:
    94
    Likes Received:
    273
    I think the XSX's biggest issue is the memory set-up, not the TFLOPS.
    Two developers complained about the "interleaved" memory publicly:
    [image]
    And so did the Crytek developer; both have since deleted their statements.
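    For context, the "interleaved" split on the Series X falls straight out of the published chip layout: ten GDDR6 chips at 56 GB/s each, six of them 2 GB and four of them 1 GB. A quick sketch of how the two bandwidth figures arise (per-chip numbers are from public spec sheets; the striping description is the commonly reported one, not official documentation):

    ```cpp
    #include <cstdio>

    int main() {
        // Series X (public figures): ten GDDR6 chips on a 320-bit bus,
        // each chip a 32-bit channel at 14 Gbps -> 56 GB/s per chip.
        const double per_chip_bw = 14.0 * 32.0 / 8.0; // = 56 GB/s

        // The first 10 GB are striped across all ten chips...
        std::printf("GPU-optimal 10 GB: %.0f GB/s\n", 10 * per_chip_bw); // 560
        // ...the remaining 6 GB only across the six 2 GB chips.
        std::printf("Standard 6 GB:     %.0f GB/s\n", 6 * per_chip_bw);  // 336
        return 0;
    }
    ```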
     
    egoless likes this.
  16. BillSpencer

    Regular Newcomer

    Joined:
    Nov 18, 2020
    Messages:
    304
    Likes Received:
    117
    Was that in regards to the Series X?

    Not really the same, but remember the GTX 970 having 3.5 GB + 0.5 GB? Drivers mitigated that bottleneck.
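    A toy illustration of the kind of mitigation meant here: an allocator that steers hot, bandwidth-sensitive resources into the fast segment and only spills cold data into the slow one. Everything below is hypothetical (sizes loosely modeled on the 970's split; this is not NVIDIA's actual driver logic):

    ```cpp
    #include <cstdio>
    #include <cstdint>

    // Hypothetical two-segment video memory pool, loosely modeled on the
    // GTX 970's 3.5 GB fast + 0.5 GB slow split. Not real driver code.
    struct SegmentedPool {
        uint64_t fast_free = 3584ull << 20; // 3.5 GB fast segment
        uint64_t slow_free = 512ull << 20;  // 0.5 GB slow segment

        // Hot resources (render targets, frequently sampled textures) prefer
        // the fast segment; cold ones (staging, rarely touched data) prefer
        // the slow one, keeping fast memory free for what matters.
        const char* allocate(uint64_t bytes, bool hot) {
            if (hot && fast_free >= bytes) { fast_free -= bytes; return "fast"; }
            if (!hot && slow_free >= bytes) { slow_free -= bytes; return "slow"; }
            if (fast_free >= bytes) { fast_free -= bytes; return "fast"; }
            if (slow_free >= bytes) { slow_free -= bytes; return "slow"; }
            return "evict-or-fail";
        }
    };

    int main() {
        SegmentedPool pool;
        std::printf("render target  -> %s\n", pool.allocate(64ull << 20, true));
        std::printf("staging buffer -> %s\n", pool.allocate(256ull << 20, false));
        return 0;
    }
    ```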
     
  17. dskneo

    Regular

    Joined:
    Jul 25, 2005
    Messages:
    649
    Likes Received:
    193
    That was just BF4 being BF4. It took more than a year and a studio change to make it good.
     
    Karamazov likes this.
  18. Insight

    Newcomer

    Joined:
    Sep 30, 2020
    Messages:
    94
    Likes Received:
    273
    He was talking about Series S, but it applies to Series X too
     
  19. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,961
    Likes Received:
    930
    Location:
    Somewhere over the ocean
    So developers are lamenting that the console designed to run at 1/4 of the resolution will need to run at 1/4 of the resolution to keep up?
     
  20. jayco

    Veteran Regular

    Joined:
    Nov 18, 2006
    Messages:
    1,718
    Likes Received:
    1,255
    I don't think it applies to the Series X. On the Series S it is true that if the slower 2 GB isn't enough to hold the information the CPU needs, the CPU can start eating into bandwidth the GPU needs. I guess you then have the option of either lowering the resolution and applying reconstruction techniques to increase pixel count, or lowering the quality and size of assets. 2 GB of system/slow memory seems very small; maybe a 6/4 split would have been better.
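    To put rough numbers on that: the Series S publicly lists 8 GB at 224 GB/s and 2 GB at 56 GB/s. The contention model below is a deliberate simplification (real GDDR6 arbitration and efficiency losses are messier), just to show the scale of the effect when CPU traffic spills into the GPU-optimal pool:

    ```cpp
    #include <cstdio>

    int main() {
        // Public Series S figures: 8 GB GPU-optimal pool at 224 GB/s,
        // 2 GB slower pool at 56 GB/s, all on one shared bus.
        const double fast_bw = 224.0, slow_bw = 56.0; // GB/s

        // Naive linear model: CPU traffic hitting the fast pool subtracts
        // one-for-one from the bandwidth left over for the GPU.
        const double cpu_loads[] = {0.0, 10.0, 20.0}; // GB/s of CPU traffic
        for (double cpu : cpu_loads) {
            double gpu_left = fast_bw - cpu;
            std::printf("CPU using %4.0f GB/s of the fast pool -> "
                        "~%5.1f GB/s left for the GPU (%.0f%%)\n",
                        cpu, gpu_left, gpu_left / fast_bw * 100);
        }
        std::printf("Slow pool on its own: %.0f GB/s\n", slow_bw);
        return 0;
    }
    ```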
     