Current Generation Hardware Speculation with a Technical Spin [post launch 2021] [XBSX, PS5]

Discussion in 'Console Technology' started by pjbliverpool, Feb 9, 2021.

  1. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,018
    Likes Received:
    15,763
    Location:
    The North
    Indeed. The main challenge for me has been library support, usually just pulling up Anaconda and importing GPU-based libraries. I've not spent a lot of time coding CUDA, as the solutions we require haven't required me to get that low-level yet, though I dabble there from time to time if something isn't working. If you guys have more library support coming, that would be great. It would be great to see competition in the market for smaller DS groups.
     
  2. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,596
    Likes Received:
    840
    Location:
    Finland
    Well, just like old temporal AA, which jittered the rendered scene without any blur, it would cause flickering, and an increase in framerate reduces the time in which the change is visible.
    With temporal antialiasing the flickering could be eliminated, although at that point it might be more feasible just to use knowledge of the shading resolution and sample points, together with screen jittering, to try to get more information from the image, just as with normal TAA.

    It certainly will be interesting to see in which direction VRS or resolution independent shading techniques will be going in future.
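    The screen jittering mentioned above is usually driven by a low-discrepancy sequence. A minimal Python sketch (the Halton bases 2 and 3 and the 8-frame cycle are common TAA conventions, not anything specified in this post):

    ```python
    def halton(index, base):
        """Radical inverse of `index` in the given base, in [0, 1)."""
        result, f = 0.0, 1.0
        while index > 0:
            f /= base
            result += f * (index % base)
            index //= base
        return result

    def taa_jitter(frame, cycle=8):
        """Per-frame sub-pixel offset in [-0.5, 0.5), cycling every `cycle` frames."""
        i = (frame % cycle) + 1  # skip index 0, which is always 0.0
        return halton(i, 2) - 0.5, halton(i, 3) - 0.5
    ```

    Each frame the projection matrix is offset by this sub-pixel amount so the history buffer accumulates different sample positions over time.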
     
  3. Jay

    Jay
    Veteran Regular

    Joined:
    Aug 3, 2013
    Messages:
    3,470
    Likes Received:
    2,826
    Just for the record, when I'm talking about software VRS, I'm not talking about a possible PS5 GE implementation.

    I wouldn't label GE VRS as software; that gets confusing.
    'GE VRS' is easier to follow.
     
  4. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,018
    Likes Received:
    15,763
    Location:
    The North
    yea, oh yea, I know ;) There's no shading done in the geometry engine. You're good ;)

    I only talked about the geometry engine because I often see the Matt H tweet brought up as though what is happening in the GE is equivalent to VRS. But from our understanding they aren't technically the same functions.
    Meshing and culling triangles before moving forward to shading is not VRS. It's an important aspect of maximizing the pipeline, but they are not inherently the same. And I'm fairly positive that Mesh Shaders will provide this particular benefit.

    As noted by others here, the Geometry Engine shares the original fixed-function front end. Mesh Shaders are on a completely separate programming path. Mesh Shaders don't have the same ability the geometry engine has to convert VS, GS, HS, etc. into primitive shaders.
     
    RagnarokFF likes this.
  5. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    591
    Likes Received:
    528
    Halo Infinite: this was an early demo. There was more wrong than just the VRS (if the problem was a VRS problem at all and not a typical LOD problem).
    Dirt 5: I really don't know what went wrong with this title. Yes, sometimes it looks like they did some blur processing over the whole image, but that has nothing to do with a correct VRS implementation. You can always do something wrong (e.g. the checkerboard implementation in RDR2 was also totally broken); that has nothing to do with the feature, just with the developer's implementation/usage.

    E.g. if you just looked at RDR2 and its CBR implementation and judged CBR by that game, CBR would seem a catastrophe for image quality. But we know there are good examples and bad examples of any feature's implementation. So we can say CBR can be a great feature if it is implemented well, and can save a lot of rendering time ;)

    VRS, Mesh Shaders, etc. aren't automatically bad because only one console (and the PC space) supports them (or, if we look at other forums/pages, "because they are MS exclusive"). Time will tell if those features can really be used well. But for VRS in particular we also have a good implementation (Gears 5), and in that case it was just added later, not implemented from the beginning.
     
    mr magoo, PSman1700, tinokun and 6 others like this.
  6. Pete

    Pete Moderate Nuisance
    Moderator Legend Veteran

    Joined:
    Feb 7, 2002
    Messages:
    5,482
    Likes Received:
    1,290
    This stood out to me:

    “Microsoft stated that because of this difference and now the CPU was the thermal hot spot, the acoustics now center around that point. As a result of Microsoft’s testing, the company is stating that the CPU is disproportionately responsible for the acoustics of the design: every additional Watt the CPU uses is worth five times more to the acoustic budget than the GPU.”

    It would be interesting to know how this differs with PS5’s chip, as they’re starting slightly lower on the CPU but way higher on the GPU.
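    The quoted 5:1 weighting can be read as a simple linear budget model. A toy Python sketch (the weights come from the quote; the wattage numbers below are made-up illustrations, not measured figures):

    ```python
    # Toy linear model of the acoustic budget described in the quote:
    # each CPU watt counts five times as much as a GPU watt.
    CPU_WEIGHT = 5.0
    GPU_WEIGHT = 1.0

    def acoustic_load(cpu_watts, gpu_watts):
        """Weighted contribution to the acoustic budget (arbitrary units)."""
        return CPU_WEIGHT * cpu_watts + GPU_WEIGHT * gpu_watts

    # Hypothetical numbers: shaving 5 W off the CPU buys as much
    # acoustic headroom as shaving 25 W off the GPU.
    ```

    Under that model, a console starting lower on CPU power but much higher on GPU power would trade acoustic headroom quite differently, which is why the comparison with PS5 is interesting.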

    If we tire of acronyms, can we refer to DLSS as chessboarding?
     
  7. liams

    Regular Newcomer

    Joined:
    Jul 1, 2020
    Messages:
    325
    Likes Received:
    267
    I vote we refer to DLSS as go boarding



    Go (game)
     
    HLJ and milk like this.
  8. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,596
    Likes Received:
    840
    Location:
    Finland
    What went wrong is that they apparently applied it purely by variance or contrast within the image.

    There is no motion blur or DoF to hide the image-space-aligned low-resolution shading.
    They could have marked the player car or some other things to always shade at full resolution, and tweaked it to be less aggressive overall.
    Yes, handling all the problem cases in temporal sampling methods can involve quite a bit of work.
    The Dark Souls remastered presentation and the one for Frostbite show how much has to be changed to get it to work decently. (And that there are plenty of options when creating it.)
    Agreed, they both are great features to have.
    Mesh Shaders look to be an amazing feature to have. (Interestingly, if you use MS to create polygonally dense worlds, VRS becomes less usable as it is MSAA-based.)
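    The contrast-driven rate selection described above can be sketched per tile. A hypothetical Python illustration (the variance thresholds, the coarse rates, and the player-car mask flag are all assumptions for the sake of example, not Dirt 5's actual logic):

    ```python
    def pick_shading_rate(tile_luma, force_full=False, lo=0.02, hi=0.10):
        """Pick a VRS rate for one screen tile from its luma contrast.

        tile_luma:  flat list of luma values in [0, 1] for the tile.
        force_full: mask bit, e.g. for tiles covering the player car.
        Returns coarse (x, y) shading factors: (1,1), (2,1) or (2,2).
        Thresholds lo/hi are hypothetical tuning values.
        """
        if force_full:
            return (1, 1)
        mean = sum(tile_luma) / len(tile_luma)
        variance = sum((v - mean) ** 2 for v in tile_luma) / len(tile_luma)
        if variance > hi:
            return (1, 1)   # high contrast: shade at full rate
        if variance > lo:
            return (2, 1)   # medium contrast: half rate horizontally
        return (2, 2)       # flat tile: quarter rate
    ```

    Marking important objects via the mask, as suggested above, overrides the purely statistical decision that otherwise coarsens visually busy but important regions.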
     
    Pete and function like this.
  9. Lurkmass

    Regular Newcomer

    Joined:
    Mar 3, 2020
    Messages:
    305
    Likes Received:
    344
    IMO Xbox ATG should expose primitive shaders if they can, because what ultimately happens behind that interface and the driver is the hardware running those programs as primitive shaders. As we can see from the comments inside the code, they are probably generating indirect commands which invoke a compute shader to emulate task (amplification) shaders, and they run mesh shaders on NGG GS (geometry shaders, or primitive shaders in AMD's marketing material) ...

    Exposing primitive shaders has the upside of being compatible with the traditional geometry pipeline, which has some implications for productivity. Instead of forcing engine developers to rewrite a lot of shaders for a new and incompatible geometry pipeline, they could see some benefits by porting existing shaders to a compatible geometry pipeline and then gradually decide to optimize specific shaders on a case-by-case basis to get all of the same benefits. Offering this path would arguably have been less work for developers and made for a smoother transition. Primitive shaders, at least on consoles, could have seen more immediate adoption than mesh shaders, which currently zero games use ...
     
    Pete, iroboto, BRiT and 3 others like this.
  10. scently

    Veteran Regular

    Joined:
    Jun 12, 2008
    Messages:
    1,083
    Likes Received:
    420
    As of the June GDK, NGG was not yet working on XSX/XSS, but they note that they plan to get it working in the future. I personally suspect that a lot of the issues with the launch software on Xbox Series, and maybe up till now, come down to their software stack not being in the best state. @iroboto speculated this has to do with primitive shaders/NGG not being exposed, or not yet being functional, on the XS.
     
    VitaminB6 and function like this.
  11. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,018
    Likes Received:
    15,763
    Location:
    The North
    Yea. I originally made that postulation with respect to the first set of launch titles (6 months), but after that the argument had a hard expiry date. MS should have enabled the NGG pipeline by December, and most games are unlikely to switch GDK versions with 3-4 months to launch. So realistically I give them until the end of April for games to have transitioned.
    After April I have no answer as to why PS5 and the 6XXX cards move together as a pack while XSX sort of falls in with Nvidia as a pack. Frankly, it makes no sense to me.
     
    RagnarokFF and scently like this.
  12. cwjs

    Newcomer

    Joined:
    Nov 17, 2020
    Messages:
    164
    Likes Received:
    342
    Correct me if I'm wrong, but I haven't seen any reason to suspect that just exposing primitive shaders would give a significant performance boost. We know that just switching over to mesh shaders (which use a primitive shader path, right?) without adding any culling code doesn't give any meaningful performance boost. What matters is taking advantage of the new paths to add culling and other new logic to drastically cut down on what's rendered.

    If i understand correctly from all of the whispers, Sony's GE provides some significant amount of that (either 'automatically' on the compiler side or exposed to developers, it's not clear).

    Mesh shaders provide the same opportunities to win speed back and are just an obviously better way to program than the clunky, giant fixed-function pipeline with its ~5+ geometry steps and no access to neighboring faces.

    Also, I'm not 1000% sure why we call Mesh Shaders a Microsoft thing -- obviously they're supported in DX12U, and via partnerships with AMD and Nvidia they're probably already super optimized on the back end, but Vulkan and OpenGL* already have mesh shader support too. Presumably Sony's tools also do, or will. Task and Mesh shaders seem like a very good pattern for programming that people will want to adopt for reasons other than just performance. Is there something I'm missing here?

    *(OpenGL support may only be via vendor extension right now)
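    The per-meshlet culling win discussed above typically comes from cheap whole-cluster tests run before the mesh stage, such as backface cone culling. A minimal Python sketch (the cone encoding follows the common meshoptimizer-style convention; the parameter names are illustrative, not any real API):

    ```python
    import math

    def cull_meshlet(cone_axis, cone_cutoff, cone_apex, view_pos):
        """Backface cone test: True means the whole meshlet can be skipped.

        cone_axis:   unit vector averaging the meshlet's face normals.
        cone_cutoff: cosine threshold encoding the cone's spread
                     (meshoptimizer-style convention, assumed here).
        cone_apex, view_pos: points in world space.
        """
        dx, dy, dz = (cone_apex[i] - view_pos[i] for i in range(3))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
        # The meshlet faces entirely away from the viewer when the view
        # direction falls inside the back-facing cone.
        dot = (dx * cone_axis[0] + dy * cone_axis[1] + dz * cone_axis[2]) / dist
        return dot >= cone_cutoff
    ```

    Running a test like this per cluster in the task/amplification stage (or its primitive-shader equivalent) is where the "drastically cut down on what's rendered" gains come from, rather than from the API switch itself.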
     
    #172 cwjs, Feb 17, 2021
    Last edited: Feb 17, 2021
    PSman1700 likes this.
  13. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    13,018
    Likes Received:
    15,763
    Location:
    The North
    Lurkmass and 3dcgi provides additional details over on this thread:
    https://forum.beyond3d.com/posts/2192799/

    Primitive Shaders and Mesh Shaders do not use the same shader path. Mesh Shaders run an entirely separate path, and when they are done they go straight to pixel shaders, IIRC.
    Culling up front, prior to assembly, is a big advantage of primitive shaders, yes. But there is more to it than that.

    The issue isn't primitive vs mesh shaders, it's enabled vs not enabled. Mesh shaders haven't been programmed for, so AMD cards that leverage primitive shaders, either explicitly or via a recompile, are gaining performance on the front end. I don't know what else Sony has done to customize the GE further; that part hasn't been clear with respect to what AMD has done with it for RDNA2.
     
    RagnarokFF, Pete, PSman1700 and 4 others like this.
  14. cwjs

    Newcomer

    Joined:
    Nov 17, 2020
    Messages:
    164
    Likes Received:
    342

    Thanks -- doesn't that post confirm that Mesh Shaders map to Geometry Shaders? Are Geometry Shaders and Primitive Shaders different things? (Forgive my confusion if so -- they both run on primitives, and the graphics industry has a long history of having an Nvidia name, a Microsoft name, a Khronos Group name, and an AMD name for each thing.)


    Edit: ah, I see in the next post in the thread, PS is a new stage that replaces GS and combines vertex operations.
     
    PSman1700 likes this.
  15. chris1515

    Legend Regular

    Joined:
    Jul 24, 2005
    Messages:
    6,109
    Likes Received:
    6,389
    Location:
    Barcelona Spain
  16. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    14,899
    Likes Received:
    11,009
    Location:
    London, UK
    Do we trust somebody who thinks GPUs are magic? It's like taking climate change advice from Republicans. :runaway:

    "Time will tell" really sums it up. If you rewind to 2013, when Mark Cerny was explaining why PS4 had an excess (or "non-round") configuration of CUs compared to Xbox One, it was because Sony anticipated far greater use of compute. Although increased use of compute did happen (and, because of the low-powered Jaguar cores, had to happen), in terms of multiplatform game development compute didn't explode as much, because Xbox One didn't have as much excess compute capacity once its CUs had finished servicing graphical needs.

    Sometimes doing something a bit weird (Cell, 360 EDRAM) is a gamble. Cell was a problem, but the 360's EDRAM was a boon for 720p MSAA: good for Xbox, mostly just absent on PS3 games, and not something that materially impacted the design of games.

    What am I saying? I have no idea. Maybe time will tell? :runaway:

    edit: typos/grammar.
     
    #176 DSoup, Feb 20, 2021
    Last edited: Feb 21, 2021
    mr magoo, goonergaz, Arwin and 4 others like this.
  17. snc

    snc
    Regular Newcomer

    Joined:
    Mar 6, 2013
    Messages:
    815
    Likes Received:
    568
    About that point:
    But what does that mean? According to TechPowerUp the 6800 on average runs at 2225 MHz, so it's a 17 TF GPU, and the 5700 XT averages 1871 MHz, making it 9.6 TF. At 4K the 6800 has a 1.64x advantage over the 5700 XT even though it has a 1.77x advantage in TFLOPS. The 6800 XT averages 2297 MHz, making it 21 TF, and at 4K has a 1.89x performance advantage over the 5700 XT with a 2.19x TFLOPS advantage. So maybe it's a good thing it's performance-wise closer to RDNA 1 :D Though we will have a better view when the smaller RDNA 2 Radeon cards arrive.
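    For reference, those TFLOPS figures follow directly from CU count and clock. A quick Python check (CU counts from the public specs: 40 for the 5700 XT, 60 for the 6800, 72 for the 6800 XT; each RDNA CU has 64 FP32 lanes doing 2 FLOPs per FMA per clock):

    ```python
    def tflops(cus, clock_mhz):
        """FP32 TFLOPS: CUs * 64 lanes * 2 FLOPs/FMA * clock."""
        return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

    rx_5700xt = tflops(40, 1871)   # ~9.6 TF at its average game clock
    rx_6800   = tflops(60, 2225)   # ~17.1 TF
    rx_6800xt = tflops(72, 2297)   # ~21.2 TF

    ratio = rx_6800 / rx_5700xt    # ~1.78x raw compute advantage
    ```

    The gap between that raw compute ratio and the measured 4K performance ratio is the scaling shortfall the post is describing.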
     
    #177 snc, Feb 20, 2021
    Last edited: Feb 20, 2021
  18. chris1515

    Legend Regular

    Joined:
    Jul 24, 2005
    Messages:
    6,109
    Likes Received:
    6,389
    Location:
    Barcelona Spain




    They compared photos of the GPU block with Locuza and found that the front ends in Xbox Series and PS5 look like RDNA 1, but they aren't 100% sure because they don't have a good photo of an RDNA 2 PC GPU. We will need to wait to be sure, but at least the configuration looks like RDNA 1; the blocks themselves may still differ, but we need a good photo of a 6800 or 6900.
     
    snc and Globalisateur like this.
  19. t0mb3rt

    Newcomer

    Joined:
    Jun 8, 2020
    Messages:
    42
    Likes Received:
    94
    If the XSX supports mesh shaders, doesn't that mean that it can't have the RDNA 1 front end since RDNA 1 does not support mesh shaders?
     
  20. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,354
    Likes Received:
    2,001
    Location:
    Maastricht, The Netherlands
    What authority does this Nemez guy have that makes it interesting to read his quotes in this thread? Because I am not sure what he adds to the discussion ...
     
    RagnarokFF likes this.