NVidia Ada Speculation, Rumours and Discussion

I don't know
But I do not remember such artifacts with DLSS 2, even though I always turn it on.
It introduces slight ghosting and certain artefacts with overlapping geometry. Performance mode uses information from the 4 previous frames. It can't always hide the temporal aspect.
 
It introduces slight ghosting and certain artefacts with overlapping geometry. Performance mode uses information from the 4 previous frames. It can't always hide the temporal aspect.
I think it has to do with optical flow frame generation
Artifacts appear on exactly one frame, the frame which was generated by DLSS 3.
You can see for yourself: on YouTube you can step the video one frame forward or backward (the , and . keys on your keyboard). Timecode 1:33.
 
I just watched the old RTX3000 reveal and even back then they stated a 2x RT performance improvement over the RTX2000 series.

I don't think we ever saw a 2x jump in the real world between the RTX2000/3000 series in gaming with RT enabled, did we?
 
I think it has to do with optical flow frame generation
Artifacts appear on exactly one frame, the frame which was generated by DLSS 3.
You can see for yourself: on YouTube you can step the video one frame forward or backward (the , and . keys on your keyboard). Timecode 1:33.

Cool, thx for the info about the one-frame jump. I used it on this short benchmark part of the video:
3090 Ti: ~1 new frame every 3 video frames
4090: ~2 new frames every 3 video frames

So in 4K with Psycho RT settings it looks like the 4090 is twice as fast.
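As a rough sanity check of that arithmetic (a sketch only; the 60 fps playback rate is my assumption, and any fixed rate gives the same ratio):

```
# Frame-step counts quoted above: new game frames per 3 video frames.
# Assumption (mine, not from the video): the capture plays back at a fixed 60 fps,
# so unique game frames per second = video fps * (new frames per video frame).
VIDEO_FPS = 60

new_frames_per_3 = {"3090 Ti": 1, "4090": 2}

effective_fps = {gpu: VIDEO_FPS * n / 3 for gpu, n in new_frames_per_3.items()}
print(effective_fps)                                      # {'3090 Ti': 20.0, '4090': 40.0}
print(effective_fps["4090"] / effective_fps["3090 Ti"])   # 2.0 -> "twice as fast"
```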
 
We did in Quake and Minecraft, which I think they were referring to when they said up to twice the speed in RT.

I think there will be outliers like Q2RTX, but even CP2077 at 1440p shows the 3080 Ti is 62% faster than the 2080 Ti, which is a huge performance uplift in its own right but still not 2x.

So I would assume that in the average game that isn't fully path traced we might see a similar real-world performance difference, around a 60-70% increase.

Which again is a very healthy increase but not quite the 2x.
 
I don't think we ever saw a 2x jump in the real world between the RTX2000/3000 series in gaming with RT enabled, did we?
The 2x jump was really in benchmarks and games where raster rendering is not as heavy, for example Q2RTX, Minecraft RTX and Serious Sam/Doom with RT.
 
I just watched the old RTX3000 reveal and even back then they stated a 2x RT performance improvement over the RTX2000 series.

I don't think we ever saw a 2x jump in the real world between the RTX2000/3000 series in gaming with RT enabled, did we?
I'm pretty sure the claim was that the 3080 would be 2x as fast as the 2080 (non-Ti). The Ampere RT core was claimed to be 1.7x faster than Turing's.

But please do link with timestamp if they claimed 2x perf over 2080 Ti.
 
I think it has to do with optical flow frame generation
Artifacts appear on exactly one frame, the frame which was generated by DLSS 3.
You can see for yourself: on YouTube you can step the video one frame forward or backward (the , and . keys on your keyboard). Timecode 1:33.

That's a very interesting find and yes, it's very obvious when looking at the individual frames. The issue occurs in other scenes in the video too, not just that one.
 
Given that DLSS 3 inserts 1 frame in between every pair of rendered frames, thus in principle exactly doubling the frame rate, the fact that we're only seeing a 50% overall performance uplift over DLSS 2 in that DF preview is interesting.

Either the work involved in creating that extra frame is quite heavy - on a 4090 - or they are doing more work on the upscale side of things (but as that should also be a lot faster on the 4090 I doubt it's that).
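As a very rough sketch of that trade-off (my own simplified serial model, not anything stated in the preview: each rendered frame takes time T, each generated frame takes time g, and displayed fps = 2 / (T + g)):

```
# Solve uplift = 2T / (T + g) for g, i.e. how much a generated frame must cost
# to explain a given uplift over DLSS 2. Toy model, assumed serial pipeline.
T = 1.0  # normalised DLSS 2 frame time

for uplift in (2.0, 1.5):  # "free" generation vs the ~50% seen in the preview
    g = 2 * T / uplift - T
    print(f"{uplift:.1f}x uplift -> generated frame costs {g / T:.2f}x a rendered frame")

# 2.0x uplift -> generated frame costs 0.00x a rendered frame (i.e. free)
# 1.5x uplift -> generated frame costs 0.33x a rendered frame
```

On that toy model the generated frame would cost roughly a third of a rendered frame's time, which is consistent with the "quite heavy" reading above.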

So if this is so heavy on a 4090, how much benefit are the 4080s going to get? And worse, is this even feasible further down the stack?

EDIT: it's worth noting that the benefits for CPU-limited games could be immense. In some ways I'd say DLSS 3 is much more interesting for what it can do on the CPU side than on the GPU side, especially given the capability of the current-gen console CPUs.
 
I just watched the old RTX3000 reveal and even back then they stated a 2x RT performance improvement over the RTX2000 series.

I don't think we ever saw a 2x jump in the real world between the RTX2000/3000 series in gaming with RT enabled, did we?

I'm pretty sure the claim was that the 3080 would be 2x as fast as the 2080 (non-Ti). The Ampere RT core was claimed to be 1.7x faster than Turing's.

But please do link with timestamp if they claimed 2x perf over 2080 Ti.

The Ampere launch video was never specific about which actual products they were comparing, or even whether they were comparing products at all as opposed to some more abstract comparison (e.g. SM to SM).

Instead, the product page at launch used this more specific product comparison:

geforce-rtx-30-series-delivers-up-to-2x-performance.png


 
I just watched the old RTX3000 reveal and even back then they stated a 2x RT performance improvement over the RTX2000 series.

I don't think we ever saw a 2x jump in the real world between the RTX2000/3000 series in gaming with RT enabled, did we?
It was very specifically the jump from the 2080 to the 3080 that was claimed to be up to 2x, but that was only in fully path-traced games. Otherwise it was 60% faster or so.
 
I'm pretty sure the claim was that the 3080 would be 2x as fast as the 2080 (non-Ti). The Ampere RT core was claimed to be 1.7x faster than Turing's.

But please do link with timestamp if they claimed 2x perf over 2080 Ti.

Where did I say they stated it was a 2x jump over the 2080 Ti?

I simply used the 2080 Ti vs the 3080 Ti as they're both high-end '80 Ti models.

Do you have a time stamp for 2080 vs 3080?
 
Another interesting point from the DF preview is that DLSS 3 frame generation appears to be an additional option that can be activated in addition to, and not instead of, DLSS 2 on the 4090.
 