Digital Foundry Article Technical Discussion [2021]

Discussion in 'Console Technology' started by BRiT, Jan 1, 2021.

  1. Seanspeed

    Newcomer

    Joined:
    Apr 23, 2021
    Messages:
    85
    Likes Received:
    122
    That seems to be the case for Elden Ring at least. Every Xbox version seems to be underperforming somewhat, which is quite in line with FROM's previous multiplatform efforts, where PlayStation has basically always been the best-performing platform. I think Dark Souls Remastered might be the lone exception. Dark Souls 3 didn't even get an XB1X patch or anything (granted, the console launched shortly after the DLC was wrapped up, but still, it was a recent title and could very much have used one). And with FROM obviously having built exclusive games in partnership with PlayStation, it seems pretty clear where they put their priorities. Not that I'm moaning at all (I've got no dog in the fight); I'd say the sales differences between the platforms probably justify it well enough on their end.
     
    egoless likes this.
  2. davis.anthony

    Regular Newcomer

    Joined:
    Aug 22, 2021
    Messages:
    302
    Likes Received:
    98
    But does it? When you have platforms running the same CPU and GPU architecture, with a good API, should it really make that much of a difference?

    This isn't the PS2 vs OG Xbox era, where the PS2 was so backwards.
     
  3. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    771
    Likes Received:
    767
    Yes, it can still make a huge difference.
    For example, the API can handle "commands" differently. DX11 calls tend to do more than is strictly necessary: they handle memory management for you, automatically dispose of unused memory, log all sorts of things that might never be needed, and so on.
    This creates huge overhead if you designed your engine around "one call to get it all" but the requested design then shifts to many, many small calls instead. The small overhead attached to each call (which made things much easier in the end) adds up to the point where it is genuinely huge. And it's not only memory; CPU and GPU usage are affected as well.
    This was basically the shift from DX9 to DX10 to DX11 and finally to DX12 (which is, more or less, "do it yourself"). Still, many game engines are DX11-based and don't translate well to DX12 on PC. Yes, the Xbox API is a little different, but it has the same issues, as it lets developers decide which path they want to use. The "old" way is much easier for getting things done, but not the most efficient. Using "almost" the same API for PC and Xbox is very attractive to developers: easy to port, and quick to get running. Ideally you would then optimize the calls for the specific system, but since everything more or less works "fine enough", further optimization isn't needed to ship a product. On PS4/PS5 you don't have these "issues": the developer must translate the commands to Sony's API, and there is "nothing" compatible, so in that process the code already gets optimized for the system (whether the developer wants to or not). Also, not every command can be translated 1:1, which likewise leads to more optimal code, because the developer has to do something.
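    To put the call-overhead point in code: here's a toy model in plain C++ (not real D3D code; the overhead and workload numbers are invented). With a fixed cost per call, the same total work gets several times slower once it's split into many small calls.

    ```cpp
    // Toy model of per-call API overhead. Each "draw call" pays a fixed
    // bookkeeping cost (validation, logging, memory management) plus a cost
    // proportional to the work it submits. Same total workload both times.
    #include <chrono>
    #include <cstdio>

    volatile unsigned long long sink = 0; // defeat the optimizer

    void burn(unsigned long long iters) {          // stand-in for CPU work
        for (unsigned long long i = 0; i < iters; ++i) sink += i;
    }

    constexpr unsigned long long kPerCallOverhead = 50'000;      // fixed cost
    constexpr unsigned long long kTotalWork = 100'000'000;       // "triangles"

    void drawCall(unsigned long long work) {
        burn(kPerCallOverhead); // what DX11-style validation/logging costs you
        burn(work);             // the actual rendering work
    }

    double submitMs(int calls) {
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < calls; ++i) drawCall(kTotalWork / calls);
        return std::chrono::duration<double, std::milli>(
                   std::chrono::steady_clock::now() - t0).count();
    }

    int main() {
        std::printf("   100 big calls:   %.1f ms\n", submitMs(100));
        std::printf("10,000 small calls: %.1f ms\n", submitMs(10'000));
    }
    ```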

    Btw, we can often still see this in games where the PS4 comes out ahead of the Xbox One in CPU-limited scenes, even though the Xbox has the (slightly) stronger CPU and so should always lead when the CPU is the limit. The API makes the difference here.
     
    #3803 Allandor, Nov 25, 2021
    Last edited: Nov 25, 2021
    milk, DSoup, Pete and 3 others like this.
  4. snc

    snc
    Veteran Newcomer

    Joined:
    Mar 6, 2013
    Messages:
    1,572
    Likes Received:
    1,182
    I'm not sure that's the case; the PS4 Pro has a performance advantage over the X1X, but it also uses CBR rather than native res, so it's hard to compare.
     
    BRiT likes this.
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    11,823
    Likes Received:
    2,790
    Location:
    New York
    ROPs are the HW blending pipeline, right? They go away when blending of independent samples goes away. And that goes away when full path tracing is a thing.

    Well, by definition Nanite isn't a complete solution. It's rendering opaque, non-deformable geometry into a gbuffer via compute. But it still has to use pixel shaders and ROPs to tie everything together (including transparent and animated geometry) for the final render.
     
  6. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,200
    Likes Received:
    17,662
    Location:
    The North
    I’m not really sure that is a thing. As long the GPU can hit its max theoretical TFLOPs or near max on a synthetic GEMM benchmark, additional bandwidth beyond that requirement won’t increase performance further. You need more bandwidth to support the ROPS after shaders, but we are then looking at cumulative bandwidth, not bandwidth per CU.
     
    #3806 iroboto, Nov 25, 2021
    Last edited: Nov 25, 2021
  7. davis.anthony

    Regular Newcomer

    Joined:
    Aug 22, 2021
    Messages:
    302
    Likes Received:
    98
    I know, dude. It was said purely as another example of where the XSX doesn't seem to fit in with the other RDNA2-based GPUs.
     
  8. Lurkmass

    Regular Newcomer

    Joined:
    Mar 3, 2020
    Messages:
    490
    Likes Received:
    586
    How bold of you to assume that path tracing will replace ordered blending. Even ray tracing can't elegantly handle the case of coplanar geometry, as commonly seen in UIs or decals ...

    You have to apply more complex workarounds to the content itself, like adding small offsets to your geometry so that it no longer sits in the same plane. With ordered blending you can simply declare your UI or decal geometry in a separate draw call and submit it after issuing the draw calls for the main scene itself ...
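    The reason submission order is such a convenient contract is that "over" blending isn't commutative: swap the order and you get a different pixel. A minimal sketch in plain C++ (straight alpha, made-up colors):

    ```cpp
    // Demonstrates that alpha "over" compositing is order-dependent, which is
    // why submitting scene -> decals -> UI in order gives a defined result.
    #include <cstdio>

    struct RGBA { float r, g, b, a; };

    // Standard "src over dst" with straight (non-premultiplied) alpha.
    RGBA over(RGBA src, RGBA dst) {
        float a = src.a + dst.a * (1.0f - src.a);
        auto mix = [&](float s, float d) {
            return (s * src.a + d * dst.a * (1.0f - src.a)) / (a > 0 ? a : 1.0f);
        };
        return { mix(src.r, dst.r), mix(src.g, dst.g), mix(src.b, dst.b), a };
    }

    int main() {
        RGBA red   = {1, 0, 0, 0.5f}; // half-transparent decal
        RGBA green = {0, 1, 0, 0.5f}; // half-transparent UI element
        RGBA black = {0, 0, 0, 1.0f}; // opaque background

        RGBA a = over(green, over(red, black)); // red first, green on top
        RGBA b = over(red, over(green, black)); // green first, red on top

        std::printf("green last: %.2f %.2f %.2f\n", a.r, a.g, a.b); // 0.25 0.50 0.00
        std::printf("red last:   %.2f %.2f %.2f\n", b.r, b.g, b.b); // 0.50 0.25 0.00
    }
    ```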

    Yes, but it serves as an example of developers having to take responsibility for things they've taken for granted in the old graphics pipeline, because they can't presume that all future content will behave as predictably as before ...

    Hardware vendors have had to implement many of these questionable design points in the graphics pipeline over the years for the sake of developer productivity. Extending these dated notions to new graphics pipelines is not in their interest, because it would severely restrict how they design new hardware. Graphics programmers should throw away the old conceptions they've become accustomed to, or start providing a solution themselves, rather than forcing hardware vendors to make their graphics pipeline a reality ...
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    18,619
    Likes Received:
    9,150
    You still need to take clock speed into account, however. For any given CU, the faster it processes data, the faster you need to feed it data, or it'll just sit there idle and you've lost whatever clock speed advantage you might have had.

    Thus bandwidth per CU is fairly irrelevant when clock speeds differ, since it ignores the effect of the clock entirely. Bandwidth per FLOP would be more relevant, as that takes clock speed into account. Not perfect, but certainly better than bandwidth per CU. Not perfect because, for example, workloads that can make effective use of on-die cache and avoid main memory will tend to favor cache size and speed instead.
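    To put rough numbers on that (a quick sketch using the commonly quoted peak specs; the PS5 TFLOPs figure is a ceiling, since its clock is variable):

    ```cpp
    // Bandwidth per FLOP for the two consoles, from the quoted peak specs.
    #include <cstdio>

    int main() {
        struct Gpu { const char* name; double tflops, gbps; };
        const Gpu gpus[] = {
            {"Series X", 12.15, 560.0}, // fast 10 GB pool
            {"PS5",      10.28, 448.0}, // peak, variable clock
        };
        for (const Gpu& g : gpus) {
            double bytes_per_flop = (g.gbps * 1e9) / (g.tflops * 1e12);
            std::printf("%-8s: %.4f bytes/FLOP\n", g.name, bytes_per_flop);
        }
        // Series X ~0.0461, PS5 ~0.0436: much closer than raw CU counts imply.
    }
    ```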

    Regards,
    SB
     
  10. Seanspeed

    Newcomer

    Joined:
    Apr 23, 2021
    Messages:
    85
    Likes Received:
    122
    Given the compute and bandwidth advantage of the XB1X over the PS4 Pro, which is quite significant, this doesn't look like a simple case of trading resolution for performance.

    And given FROM's previous trends, I think it's a pretty reasonable conclusion when *every* Xbox version seems to be underperforming.
     
  11. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,440
    Likes Received:
    3,321
    Location:
    France
    There is actually no underperforming on the previous-gen Xboxes.

    Sekiro and Elden Ring:
    - XB1 900p vs PS4 1080p is the usual difference. The XB1 fares similar to or better than in other recent comparisons like Cyberpunk (another recent open-world game), where it's even worse for the XB1 relative to the PS4 (and that developer is Western, with Xbox marketing behind the game).

    - X1X native vs Pro CBR. Even considering that CBR is quite costly, the performance difference is about what you'd expect, and the X1X is usually slightly sharper thanks to the native resolution. Don't forget the Pro has FP16 support that can slightly improve performance if used properly. Sure, they could have used the same CBR they used on Dark Souls Remastered, where performance was better on the X1X, but at the cost of image-quality parity with the Pro.
     
  12. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    11,823
    Likes Received:
    2,790
    Location:
    New York
    Decals are an interesting use case. Isn't it already best practice to use a slight offset for geometry-based decals? That leaves UI, which is probably cheap enough to blend in software. Nanite is doing 64-bit atomic blends to its visibility buffer with tons of overdraw.
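    The packing trick behind that, roughly (a plain C++ sketch; a CAS loop stands in for the GPU's 64-bit atomic min, and the names and values are mine, not Epic's):

    ```cpp
    // Visibility-buffer write as used by compute rasterizers: pack depth in
    // the high 32 bits and a triangle ID in the low 32 bits, so one 64-bit
    // atomic min performs depth test + write together under heavy overdraw.
    #include <atomic>
    #include <cstdint>
    #include <cstdio>

    std::atomic<uint64_t> visPixel{UINT64_MAX}; // cleared to "far", no triangle

    void rasterizeFragment(double depth, uint32_t triangleId) {
        // Depth in the high bits => smaller packed value means nearer fragment.
        uint64_t candidate =
            (uint64_t)(depth * 4294967295.0) << 32 | triangleId;

        // fetch_min via compare-exchange (std::atomic has no fetch_min yet).
        uint64_t cur = visPixel.load(std::memory_order_relaxed);
        while (candidate < cur &&
               !visPixel.compare_exchange_weak(cur, candidate,
                                               std::memory_order_relaxed)) {}
    }

    int main() {
        // Three fragments land on one pixel; only the nearest survives.
        rasterizeFragment(0.90, 7);
        rasterizeFragment(0.25, 42);
        rasterizeFragment(0.60, 13);
        std::printf("winning triangle: %u\n",
                    (uint32_t)(visPixel.load() & 0xFFFFFFFFu)); // prints 42
    }
    ```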

    Those design decisions were also driven by practical limits of the tech available at the time. Lazy devs aren’t the problem. Will Nanite run well on last generation hardware? Probably not.
     
    iroboto likes this.
  13. snc

    snc
    Veteran Newcomer

    Joined:
    Mar 6, 2013
    Messages:
    1,572
    Likes Received:
    1,182
  14. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,200
    Likes Received:
    17,662
    Location:
    The North
    hmm Oliver is actually quite a decent shot.
     
  15. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    2,309
    Likes Received:
    1,744
    Location:
    France
    With the spread issue it's quite random... I believe they've kind of fixed it already?

    I really like his style of review/analysis, btw; I enjoy his videos a lot.
     
    iroboto likes this.
  16. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    14,200
    Likes Received:
    17,662
    Location:
    The North
    lol, I didn't know there were that many bloom issues. Oliver seemed to have good control of aiming bloom and recoil, and good timing with launchers. As far as I could see, he's probably upper-middle of the pack in a typical match.
     
  17. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    19,762
    Likes Received:
    22,942
    https://www.eurogamer.net/articles/...ames-and-fps-boost-are-a-match-made-in-heaven

    Xbox 360 games and FPS Boost are a match made in heaven
    Sonic, Fallout, Gears 3, Mirror's Edge and Assassin's Creed tested.

    As part of its 20th anniversary celebrations, Microsoft didn't just add to its backwards compatibility library, it also added FPS Boost to Xbox 360 games for the first time. Not only that, it also doubled the frame-rate on select Xbox 360 titles that had already received enhanced 4K support for Xbox One X. Spurred on by the addition of FPS Boost to one of my favourite Sonic games, I decided to take a look at some of these improved experiences, gaining further appreciation for some classic titles.

    In doing so, it reminded me of something I hadn't thought deeply about for some time - the fact that the PlayStation 3/Xbox 360 era actually delivered what must surely be the biggest gen-on-gen downgrade in overall game performance... if you ignore Nintendo 64, that is. In looking at these newly enhanced FPS Boost releases, I also decided to go back and revisit their showings on Xbox 360 too - because Series consoles aren't just delivering a doubling of performance, but often much, much more. Standard back-compat on Xbox Series consoles effectively solves their original performance issues - they hit their (mostly) 30fps frame-rate caps. However, FPS Boost goes one step further, reminding us that 60fps used to be the norm, not the exception.

    ...

     
    PSman1700, Silent_Buddha, Jay and 3 others like this.
  18. Riddlewire

    Regular

    Joined:
    May 2, 2003
    Messages:
    466
    Likes Received:
    315
    In games with half-rate animation effects (fire, explosions, etc.), do those animations double with FPS Boost?
    If so, do 30Hz animations in a 60Hz game look worse than 15Hz animations in a 30Hz game?
     
  19. dobwal

    Legend Veteran

    Joined:
    Oct 26, 2005
    Messages:
    5,785
    Likes Received:
    2,083
    No. At 15Hz you are basically at the lower threshold frame rate where lay people perceive a series of still images instead of motion.
     
  20. Riddlewire

    Regular

    Joined:
    May 2, 2003
    Messages:
    466
    Likes Received:
    315
    Yeah, but when the game is at 30Hz, both the overall visuals and the animations are well below the threshold of fluid motion, so they both look unnatural. Whereas when a game is running at 60Hz with 30Hz animations, you end up with a perceptual disconnect: the game world appears fluid, with close-to-real-life movement, while the animations stand out more, still choppy at 30Hz.
    That's my speculation, anyway.
    Also, I still don't know if those cut-rate animations are locked to frame rate in these games (I assume so, but that's why I asked the question in the first place).
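    For what it's worth, the two common ticking schemes behave differently under FPS Boost; this toy sketch (my own naming and numbers) shows a frame-locked effect doubling with the frame rate while a wall-clock-locked one stays put:

    ```cpp
    // Simulate one second of rendering at 30 and 60 fps and count how often a
    // "half-rate" effect ticks under two schemes: locked to every 2nd frame,
    // or locked to a fixed 15 ticks-per-second wall-clock rate.
    #include <cstdio>

    int main() {
        const int divisor = 2;  // frame-locked: tick every 2nd frame
        const int tickHz  = 15; // time-locked: authored wall-clock rate
        const int rates[] = {30, 60};

        for (int fps : rates) {
            int frameLocked = 0, timeLocked = 0;
            for (int frame = 1; frame <= fps; ++frame) { // one second of frames
                if (frame % divisor == 0) ++frameLocked; // scales with fps
                // tick whenever a 1/tickHz boundary is crossed (integer math)
                if (frame * tickHz / fps > (frame - 1) * tickHz / fps)
                    ++timeLocked;
            }
            std::printf("%d fps: frame-locked %d Hz, time-locked %d Hz\n",
                        fps, frameLocked, timeLocked);
        }
    }
    ```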
     