Nvidia GeForce RTX 4090 Reviews

Hardware Unboxed DLSS3 capture with the comment... "just don't feed it big changes between frames"

[attached image: Hardware Unboxed DLSS 3 capture]
 

I suspect that part of 'optimizing' for DLSS3 will be instructing it not to engage in certain cases like these when you have camera cuts. If this occurred on the regular then cutscenes involving two character faces with significantly different backgrounds would have these macro-blocking frames with every cut.
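
For what it's worth, here's a minimal sketch of what that could look like on the game side, assuming some hypothetical skip/reset hook (this is not the actual Streamline/DLSS 3 API, every name below is made up, it's just the idea):

```python
# Hypothetical illustration only: real DLSS 3 integration goes through
# NVIDIA's Streamline SDK, and every name below is made up.

class FrameGenController:
    """Game-side bookkeeping for when to skip the generated frame."""

    def __init__(self):
        self.camera_cut_pending = False

    def on_camera_cut(self):
        # The engine calls this on a cutscene cut, teleport, or scene load,
        # i.e. whenever the next frame is unrelated to the previous one.
        self.camera_cut_pending = True

    def should_interpolate(self) -> bool:
        # Skip the in-between frame for this one pair so the interpolator
        # never has to blend two unrelated images.
        if self.camera_cut_pending:
            self.camera_cut_pending = False
            return False
        return True
```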

Kind of a trolling comment, but HUB does go on to say it's much farther along than DLSS 1.0 was and that it's generally good. Their 4090 review was very positive.

Der8auer notes that for about a 5% performance loss you can also drop power draw by up to 150 watts.
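
Rough perf-per-watt math on that, assuming the reference card's stock 450 W power limit (ballpark only):

```python
# Back-of-the-envelope perf/watt for a power-limited card.
stock_power_w = 450        # assumption: reference RTX 4090 power limit
limited_power_w = stock_power_w - 150
perf_retained = 0.95       # ~5% performance loss per the review

perf_per_watt_gain = perf_retained / (limited_power_w / stock_power_w)
print(f"{stock_power_w} W -> {limited_power_w} W at {perf_retained:.0%} of stock performance")
print(f"Perf/W improvement: ~{perf_per_watt_gain:.2f}x")   # ~1.43x
```

So roughly 40%+ better performance per watt for giving up a few percent of frame rate, if those review numbers hold.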
 
I suspect that part of 'optimizing' for DLSS3 will be instructing it not to engage in certain cases like these when you have camera cuts. If this occurred on the regular then cutscenes involving two character faces with significantly different backgrounds would have these macro-blocking frames with every cut.
Exactly. I think this is more of an implementation issue than anything else. The games will have to be "smart" about these kinds of things.
 
Nah not really. They can interpret it any way they want.

If I change a camera angle rapidly and see a bunch of blurring that doesn't occur otherwise, I'm well within reason to assume it's because of that.
It's all about buzzwords and the actual cause of the blurring. I have my doubts whether the actual cause is "big changes between frames", though I have to admit it does have some clickbait appeal.
 
They could use all the buzzwords in the world or none at all; it doesn't change the fact that the interleaving process fails to generate a close-to-native image with insignificant to no artifacts present.
It's not as if the game being run by some other reviewer would suddenly make all the technical limitations that cause such artifacts disappear.
 
I'm just guessing that a lot of those issues are currently "implementation" issues in the games themselves, and not necessarily issues with Nvidia's algorithms.
 
They could use all the buzzwords in the world or none at all; it doesn't change the fact that the interleaving process fails to generate a close-to-native image with insignificant to no artifacts present.
It's not as if the game being run by some other reviewer would suddenly make all the technical limitations that cause such artifacts disappear.

Then we'll see this occur in every game whenever there's a camera cut. If we don't, then it's just a failure of implementation with F1.
 
Can some people here chill and stop being so defensive of a technology (DLSS3) that was just introduced and is being analysed? What do you want? For people, especially in a technical forum, to just accept anything and everything from one company without scrutiny? Enough with shooting down anyone who isn't yet on board with "DLSS3 is great!". The tech looks very complex and naturally raises a lot of questions. AMD's FSR1 and FSR2 were scrutinized to oblivion, and DLSS3 should not be any different.
 
It's all about buzzwords and the actual cause of the blurring. I have my doubts whether the actual cause is "big changes between frames", though I have to admit it does have some clickbait appeal.
Wouldn't it be much more worrying if big distortions like that happened between two very similar frames rather than at camera cuts?
 
It's all about buzzwords and the actual cause of the blurring. I have my doubts whether the actual cause is "big changes between frames", though I have to admit it does have some clickbait appeal.
I think it's important to note that the game in question is running at 157 fps in 4K with RT in their own benchmark without DLSS.

Which gives us two kinda important points here:
  1. Good luck finding a solid 4K display with >144Hz refresh.
  2. Good luck seeing anything in a frame which remains on your >144Hz refresh display for less than 4 ms (rough math below).
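
Rough frame-time math behind both points, assuming frame generation roughly doubles the output frame rate:

```python
# Frame persistence at the quoted frame rates (milliseconds).
base_fps = 157            # native 4K + RT, no DLSS, per the benchmark above
fg_fps = base_fps * 2     # assumption: frame generation roughly doubles output fps

print(f"Rendered frame:   {1000 / base_fps:.1f} ms")   # ~6.4 ms
print(f"Generated frame:  {1000 / fg_fps:.1f} ms")     # ~3.2 ms
print(f"144 Hz refresh:   {1000 / 144:.1f} ms")        # ~6.9 ms
```

In other words, either the display can't show every generated frame at all, or each one is on screen for only a few milliseconds.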
 