Nvidia DLSS antialiasing discussion *spawn*

Discussion in 'Architecture and Products' started by DavidGraham, Sep 19, 2018.

  1. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    2,231
    Likes Received:
    1,652
    Location:
    msk.ru/spb.ru
    Again, the fact that you can access additional functions on DXR on Xbox doesn't mean that the DXR pipelines on Xbox and PC are different.
    As an example D3D12 FL12_0 and FL12_1 are the same pipeline, the latter has additional functions in its stages.
    And unless there are more shader types / pipeline stages in Xbox DXR the pipelines are likely the same as well.
     
    PSman1700 likes this.
  2. Lurkmass

    Regular Newcomer

    Joined:
    Mar 3, 2020
    Messages:
    317
    Likes Received:
    355
D3D12 FL 12_0 and FL 12_1 technically do have different pipelines. The rasterization stage in the latter is modified to support overestimating conservative rasterization. The pixel shader stage in the latter also has access to a new resource type (rasterizer-ordered views) not found in the former.

    Also, I'm not even sure the ray tracing shader stages as defined by PC DXR have any real meaning on Xbox DXR. To be more accurate, the PC DXR pipeline is currently a subset of the Xbox DXR pipeline. Console developers are doing things not possible with PC DXR such as custom traversal, ray caching, access to BLAS leaf triangles, etc ...
     
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,812
    Likes Received:
    3,971
    Location:
    Finland
    Just so everyone is on the same page, this is what I was referring to:
    https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs
     
    w0lfram, Rootax and BRiT like this.
  4. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    I didn't say anything about output resolution since both have an output resolution of 2160p, with a mixture of scaling techniques depending on the platform.

    In Alex's video he shows a 2060 using DLSS Performance Mode at 2160p (internal resolution is 50% of 2160p) and DLSS Quality at 1440p (internal resolution is 60% of 1440p). Internal resolution scales depending on which mode is selected.

    I hope you're able to agree that the Performance Mode of DLSS is inferior to the Quality Mode of DLSS and definitely inferior to native rendering?

    In the video, Alex stated that DLSS Performance Mode was undesirable since it was getting framerates in the low 40s. I saw a low of 41fps in the video; presumably there are instances where it drops lower. He recommends using DLSS Quality Mode at 1440p (internal 60% of native) for a consistent 60fps. As I'm sure you know, GPUs need an average framerate much higher than 60 in order to maintain that framerate/frametime. You included a screenshot in one of your replies that showed an average framerate of ~60fps (can't remember the exact number). I don't think you need me to tell you that a 60fps average is not a 60fps game, as you're getting dips much lower with no vsync/cap applied. So a non-vsynced/uncapped average of 60fps is not equivalent to a vsynced/capped 60fps.

    I hope you can agree that the PS5's average framerate is likely significantly higher than 60, but unfortunately we're unable to measure higher than 60 due to the correctly applied vsync and framerate cap.

    In the VG Tech video you posted, it was stated that in the 4K PS5 mode (i.e., the closest match to the highest settings on PC) there were three resolutions on the PS5: 2160p, 1872p, and 1440p, with the "common" resolution being 3328x1872. The guy who created that video posts here (@VGA), so maybe he can clarify whether the resolution drops are tied to the framerate drops on PS5. As we know, some games drop resolution by area, not by frametime. I'd pixel-count it myself if I knew how.

    Anyway, I'll go with the assumption that 1440p is genuinely the lowest native resolution (DF counted a much higher low resolution in their video), and I'll also go with the idea that the lowest resolution is tied to framerate. I notice a low of 54fps in the DF video; they assume it's a streaming issue rather than a rendering problem. VG Tech get a low of 57, which also looks like streaming, as no action happens and the character is just moving through the level. So arguably the PS5 does not drop frames due to rendering load. I'm sure you'll disagree, but that's what the videos show and the creators seem to suggest.

    So we have (at least) three internal resolutions of the PS5, as follows:

    3840x2160 which calculates as 8,294,400 pixels/frame
    3328x1872 which calculates as 6,230,016 pixels/frame
    2560x1440 which calculates as 3,686,400 pixels/frame

    The 2060 has two resolutions, as follows:

    50% of 3840x2160 which calculates as 4,147,200
    60% of 2560x1440 which calculates as 2,211,840

    Since the 2060 doesn't have dynamic resolution scaling, we have to compare the pixels over time. So when the 2060 is running the higher resolution and drops to 41fps, over the course of 1 second the hardware renders 170,035,200 pixels.

    By comparison, the PS5 renders 2560x1440 at 60fps (at its lowest), which is 221,184,000 pixels over a second. I know you'll suggest that the lowest framerate of the PS5 is actually 54fps (contrary to the videos), which calculates as 199,065,600.

    This is the best possible scenario for the 2060 and it's still behind by ~15%, and it's very likely that the PS5 is hitting a streaming problem with its own low. It'd be interesting to see the PS5 render that same scenario, to check the framerate and resolution and calculate the total pixels over time, but it's likely that the PS5 would still be framecapped to 60fps, so it has capacity to render more frames, probably at a higher resolution.

    The worst possible scenario for the 2060 is the PS5 rendering at full resolution at 60fps (although the framerate is likely a lot higher than this, and is vsynced/capped): 497,664,000 pixels/second.

    Is the image quality as good as the PS5's when using DLSS Performance at 2160p (internal 50% of 4K), with lows of 41fps? The answer is no. The image quality of Performance Mode isn't a match for native resolution, and the framerate falls ~20fps short.

    Is the image quality as good as the PS5's when using DLSS Quality Mode at 1440p (internal 60% of 1440p) with a matched framerate of 60? No, as the internal resolution is significantly lower than the PS5's, and even the scaled DLSS output resolution is much lower.
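    The pixels-over-time comparison above can be sketched quickly. Note this uses the figures as stated in this post (the DLSS internal-resolution numbers in particular are the post's own and are disputed later in the thread):

```python
# Pixels-per-second comparison, using the post's own figures.
def pixels_per_second(width, height, fps):
    """Total pixels rendered in one second at a fixed resolution and framerate."""
    return width * height * fps

ps5_low_60 = pixels_per_second(2560, 1440, 60)  # PS5 lowest res, capped 60fps
ps5_low_54 = pixels_per_second(2560, 1440, 54)  # same, at the disputed 54fps low
ps5_full   = pixels_per_second(3840, 2160, 60)  # PS5 full 4K at 60fps
gpu_perf   = 4_147_200 * 41                     # 2060 DLSS(P) figure from the post, at its 41fps low

print(f"{ps5_low_60:,}")  # 221,184,000
print(f"{ps5_low_54:,}")  # 199,065,600
print(f"{ps5_full:,}")    # 497,664,000
print(f"{gpu_perf:,}")    # 170,035,200
print(f"deficit vs the 54fps case: {1 - gpu_perf / ps5_low_54:.1%}")  # ~14.6%
```

    The ~14.6% figure matches the "behind by ~15%" claim when the PS5 is assumed to be at its 54fps low.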
     
    #1764 ThePissartist, Mar 8, 2021
    Last edited: Mar 8, 2021
    w0lfram likes this.
  5. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    Except that's not true, because DLSS takes up to 3ms to run, which eats into the frametime in which rasterisation can take place. In other words, if the card didn't have to do the DLSS pass after the rasterisation pass, it would be much faster. Instead of doing 41-60fps, it would do 50-75fps with the same settings.

    And then there's the fact that the 2060S was mentioned, which is 15% faster than the non-S version shown in the video. So 57-86fps in the same situation.
     
    PSman1700 likes this.
  6. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    You think an extra 3ms will equate to an extra 10-15fps? 41fps has a frametime of ~24.39ms.

    You're wrong on your second point too: 41 × 0.15 = 6.15, so the framerate in the same situation would be ~47fps.
     
  7. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    Yes, depending on the frametime it makes a huge difference. 100fps is a 10ms frametime, and 10 - 3 = 7ms frametime equates to ~143fps, just as an (extreme in this case, but not actually that extreme) example.
    In regards to 47 vs 50, I may have made a typo or bad rounding somewhere, not that it makes a difference. Comparing the lowest seen framerate on the PC is disingenuous at best anyway.
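    The conversion being argued over here is straightforward: subtract the per-frame cost from the observed frametime, then convert back to fps. A minimal sketch, taking the 3ms DLSS cost as the poster's assumption rather than a measured value:

```python
def fps_without_pass(observed_fps, pass_ms=3.0):
    """Framerate implied by removing a fixed per-frame cost (e.g. a DLSS pass)."""
    frametime_ms = 1000.0 / observed_fps
    return 1000.0 / (frametime_ms - pass_ms)

print(round(fps_without_pass(100)))  # 10ms -> 7ms:       143
print(round(fps_without_pass(60)))   # 16.67ms -> 13.67ms: 73
print(round(fps_without_pass(41)))   # 24.39ms -> 21.39ms: 47
```

    The absolute gain shrinks as the base frametime grows, which is why 3ms matters far more at 100fps than at 41fps.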
     
    PSman1700 likes this.
  8. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    8,573
    Likes Received:
    2,928
    Location:
    Guess...
    No, I don't agree with this. This is specifically what dynamic resolution scaling is designed to eliminate. When the framerate is significantly higher than 60fps the resolution should be maxed out (so there may be instances at 4K where this is the case); however, at every resolution lower than 4K, the game is running below 4K because it can't achieve 60fps at that resolution, and thus if the dynamic resolution scaling is working properly, it should be dropping resolution just enough to keep the framerate just above 60fps.

    So in those instances where the PS5 has to run at 1872p, or even 1440p, we should assume it's not running at much more than 60fps, or else it would also be running at a higher resolution.

    I also disagree with this. The PS5's 4K mode has significantly inferior shadow quality to the PC's max settings, so it's not a direct point of comparison. Nor is the PS5's standard mode, which has the same shadows but higher LOD than the PC. So a direct comparison is somewhere between those two modes.

    We know this not to be the case from the DF video which shows realtime resolution drops within a scene from a simple camera movement.

    This is the biggest issue I have with your analysis. Why are you making a comparison of the PS5 to a 2060 with DLSS, but then using the internal rendering resolution rather than the DLSS output resolution as your measuring stick for how much work is being done? You're either comparing to a 2060 with DLSS, in which case the comparison resolution is the DLSS output resolution, in this case 4K; or you're comparing to a 2060 without DLSS, in which case you can compare to the native/internal resolution (which, for the record, would be lower than what you state above). But no-one's arguing that the PS5 won't be faster in that latter case, so it's a moot point.

    Except the pixels drawn on screen by a 2060 with DLSS in 4K Performance mode aren't 4,147,200, they're 8,294,400. So using your own math, that's 340,070,400 pixels per second at the 41fps low. Well in excess of the PS5 numbers you calculated above.
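    The 4K-output arithmetic here is easy to check, since a 4K DLSS output frame is 3840x2160 pixels regardless of internal resolution:

```python
output_pixels = 3840 * 2160      # pixels in one 4K output frame
per_second = output_pixels * 41  # total output pixels per second at the 41fps low
print(f"{output_pixels:,}")      # 8,294,400
print(f"{per_second:,}")         # 340,070,400
```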

    To be clear, I don't think this is a good way of directly comparing performance so I'm not drawing any specific conclusions from it other than to say the evidence doesn't strongly suggest the PS5 is outperforming a 2060 in DLSS 4K performance mode here, let alone a 2060S, and certainly not by a clear and substantial margin.

    Why is that the worst scenario for the 2060? There are clearly scenes where the 2060 at 4K DLSS(P) is running at over 60fps. It's likely these are the same scenes where the PS5 is also able to run at its highest resolution of 4K.

    This is a different argument. Your previous argument was that the PS5 is outperforming the 2060(S) even with DLSS engaged; essentially saying that the PS5 can match whatever output resolution DLSS is offering and still deliver a higher framerate. The question of whether said DLSS mode offers superior image quality to native is an altogether different one.
     
    #1768 pjbliverpool, Mar 8, 2021
    Last edited: Mar 8, 2021
    T2098, PSman1700 and pharma like this.
  9. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    No, it isn't.

    Wrong again.

    Something like a typo, sure.

    This is how we've always measured performance.
     
  10. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    With respect, I don't think we can discuss this any more. I literally don't have the time to continually go around in circles explaining every point.
     
  11. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    Not sure if you're kidding here or... (don't make me say it)

    100 frames per second times 10 milliseconds per frame = 1000 milliseconds = 1 second.
    142 frames per second times 7 milliseconds per frame = 994 milliseconds ≈ 1 second.

    No, it's the average framerate that mostly drives the gaming experience, and to a lesser degree the 99th percentile, neither of which are the numbers you provided. Besides, taking the lowest fps on PC when comparing hardware is disingenuous, because it is basically a byproduct of worse optimization on PC rather than hardware capabilities*.

    *Especially in a game that uses dynamic resolution on console as a means to effectively maintain fps.
     
  12. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,773
    Likes Received:
    6,919
    1s / 100 frames = 0.010s per frame
    1s / 0.007s ≈ 143 frames

    What’s the issue here?
     
  13. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    100fps is 10ms, but it's not contextually correct. You can't say that 10 - 3 = 7 and 7ms is ~143fps, therefore the GPU can produce way more than 41fps.

    I'm seriously at a loss here.

    Also, just relating to other posts: if DLSS Performance outputting 4K is the same as native 4K, then why not only ever use 4K Ultra Performance, since that also outputs at 4K?

    In fact, why do Nvidia even offer options if the output is the same? /s
     
    #1773 ThePissartist, Mar 8, 2021
    Last edited: Mar 8, 2021
  14. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,773
    Likes Received:
    6,919
    @ThePissartist Why is 10ms contextually incorrect? Legit curious.
     
    PSman1700 likes this.
  15. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    Because 41fps has a frametime of ~24.39ms. You can remove 3ms from every one of the 41 frames and get a net gain of 123ms, but you're only gaining 5 frames.
     
  16. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    It is 100% correct. If a 2060 is taking 10ms (100fps) to render a frame with DLSS on in a particular game/scene, it's doing the rasterisation part in 7ms, which would translate to ~143fps. Not only that, with DLSS on it's more than likely doing at least part of the post effects at full resolution (4K), so the actual framerate would be even higher if everything were rendered at the lower resolution.

    It simply isn't.
     
    PSman1700 likes this.
  17. ThePissartist

    Veteran Regular

    Joined:
    Jul 15, 2013
    Messages:
    2,030
    Likes Received:
    996
    @Scott_Arm does this make sense to you?
     
  18. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    It's 6, not 5 frames, by your own numbers. That's for 41fps.
    For 60fps it's 13 frames.
    For 100fps it's 43 frames.
     
    PSman1700 likes this.
  19. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    14,773
    Likes Received:
    6,919
    This is obviously oversimplified, but it's what he's getting at.

    [Attached image: upload_2021-3-8_13-8-39.png]

    720p + 30% would be well under 1440p. I only used Performance mode as an example because the math is simple: 1/4 resolution.
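    For context, the per-axis render scales commonly documented for DLSS 2.x are roughly: Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333. These are approximations and not figures taken from this thread; a sketch of the implied internal resolutions:

```python
# Approximate DLSS 2.x per-axis render scales (not figures from this thread).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, scale):
    """Internal render resolution for a given output resolution and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for name, s in MODES.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{name}: {w}x{h} (~{s * s:.0%} of output pixels)")
```

    Because the scale applies to each axis, Performance mode renders 1/4 of the output pixels and Ultra Performance roughly 1/9, which is why the two modes differ so much despite both "outputting 4K".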
     
  20. Benetanegia

    Regular Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    424
    It may confuse him more tho. What we know here is the performance with DLSS on, so 10ms would be with DLSS included: 10ms - 3ms = 7ms, instead of the other way around (10ms + 3ms).

    edit: You did add another data point in the middle tho :) for 77fps with DLSS on, it's 100fps without (+23fps). Haha
     
    PSman1700 and BRiT like this.