Nvidia GeForce RTX 5090 reviews

The point of benchmarks isn't "real-world usage"; they're benchmarks, and they almost never reflect actual gameplay realities. The point is to normalize visual quality across a set of GPUs to see how they perform rendering the exact same set of frames.
This is an absolutely asinine comment. The objective of a benchmark is to provide a representative, yet repeatable, facsimile of a real-world scenario. Yes, the ability to perform contrastive evaluations on competing platforms is important, but if the workload being tested is irrelevant to real-world use, then the benchmark has failed at its primary reason to exist. It's like trying to compute Pi on a gaming GPU: it's repeatable and portable, but utterly useless. (I should note that there are other kinds of tools -- microbenchmarks, power viruses, etc. -- that are used in ablation studies and stress tests to analyze/debug specific elements of a design, but that's not what we are talking about here.)

Intellectual wars are fought on a regular basis to determine if benchmark suites like SPEC are representative of real-world CPU use cases. They never really get there, but it is the goal. To say that the point of benchmarks isn't real-world usage is utterly ridiculous -- that literally *is* the point, even though different benchmarks achieve varying degrees of success.

In this specific context, yes, using FSR to evaluate an NVIDIA GPU is completely absurd because *nobody* will be using the GPU that way. It's like comparing an Airbus A350 and a Boeing 787 but forcing both planes to be flown by Boeing-trained pilots to "normalize" the pilot variable.
 
Which one?
Ghost of Tsushima, Plague Tale Requiem and Forza Horizon 5 are some examples of games that run maxed out at 4K 120 Hz using just DLSS-SR, so there's no need for FG.

Which one? UE5 games with Software Lumen are limited to around 100 FPS on a 4090 at 1080p. So how can I make use of those 500 Hz on my new 500 Hz 1080p display with the fastest GPU available?
Lower the settings? Turn on MFG? I already mentioned that MFG is good for high refresh rate monitors and so did Tim in his review. I'm not sure what point you are trying to make here.
 
Ghost of Tsushima, Plague Tale Requiem and Forza Horizon 5 are some examples of games that run maxed out at 4K 120 Hz using just DLSS-SR, so there's no need for FG.
These games can't even hit 200 FPS at 1080p. How can I use my new cheap 280 Hz 1080p display to its full potential without MFG? Especially on slower GPUs?

Lower the settings? Turn on MFG? I already mentioned that MFG is good for high refresh rate monitors and so did Tim in his review. I'm not sure what point you are trying to make here.
Lowering settings means a worse image and more artefacts. How is that a better way to improve FPS? Do you really think a 5090 owner wants to lower settings at 1080p?!
 
These games can't even hit 200 FPS at 1080p. How can I use my new cheap 280 Hz 1080p display to its full potential without MFG? Especially on slower GPUs?


Lowering settings means a worse image and more artefacts. How is that a better way to improve FPS? Do you really think a 5090 owner wants to lower settings at 1080p?!
Now you're just being silly, my friend... Take a breath.

30 -> 120 MFG probably won't feel satisfyingly responsive to most people.
100 -> 400 MFG will likely provide a good gameplay experience to most people.

Do we really disagree anywhere? Am I being dishonest if I point out the case where MFG has issues?
 
30 -> 120 FPS will feel (input) exactly like 30 -> 60 FPS. MFG does not really introduce more latency; that hit comes from using FG in the first place. 30 FPS is enough of a baseline to make gaming a much better experience with the mouse, even when the latency is up to one frame.

We need to stop talking for other people. If you don't like 30 FPS, then you would not play at 30 FPS at all, which makes the discussion about the baseline so pointless.
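
To put rough numbers on the simple model I'm describing (this is just back-of-the-envelope arithmetic, not measured data, and it assumes frame generation holds back roughly one rendered frame):

```python
# Back-of-the-envelope sketch of the simple latency model described above.
# Assumption (not a measurement): frame generation holds back roughly one
# rendered frame, so the added input latency is about one base frame time,
# no matter how many generated frames are inserted in between.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

for base_fps, output_fps in [(30, 60), (30, 120), (60, 120)]:
    base_ms = frame_time_ms(base_fps)
    print(f"{base_fps} -> {output_fps} FPS: base frame time {base_ms:.1f} ms, "
          f"idealized FG latency penalty ~{base_ms:.1f} ms")
```

On that idealized model, 30 -> 60 and 30 -> 120 both pay the same ~33 ms penalty, which is why I say they should feel the same on the input side.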
 
30 -> 120 FPS will feel (input) exactly like 30 -> 60 FPS.
According to latency measurements by Digital Foundry, the latency is not identical between 30 -> 60 using 2x FG and 30 -> 120 using 4x FG. 4x FG has some added latency to it, so it will likely not feel exactly the same.

We need to stop talking for other people. If you don't like 30 FPS, then you would not play at 30 FPS at all, which makes the discussion about the baseline so pointless.
I disagree; discussion about baselines is increasingly important with MFG, since the difference between baseline FPS and total FPS is vastly increased, and because Nvidia made a claim about the 5070 performing like a 4090. A reviewer is only doing their job when they investigate such claims.

120 or 144 Hz displays are quite common. One can't just buy a 5070 and get the same experience with 30 -> 120 as they would on a 4090 doing 60 -> 120. Sorry.
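
Even on the simple one-held-frame model used earlier in the thread (illustrative arithmetic only; the frame rates are the ones from that marketing comparison), the two setups are not interchangeable:

```python
# Illustrative comparison of the two setups on a 120 Hz display, under the
# same simplifying assumption as before: responsiveness tracks the base
# frame rate, and frame generation holds back roughly one base frame.
configs = {
    "5070, 30 FPS base + MFG 4x": 30,
    "4090, 60 FPS base + FG 2x": 60,
}
for name, base_fps in configs.items():
    base_ms = 1000.0 / base_fps
    print(f"{name}: new rendered frame every {base_ms:.1f} ms, "
          f"idealized FG latency penalty ~{base_ms:.1f} ms, 120 FPS on screen")
```

Both show 120 FPS, but one responds to input on a ~33 ms cadence and the other on a ~17 ms cadence, which is exactly why the baseline matters.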
 
The debate around framegen is far too subjective. Every reviewer is basically playing games and telling us how it “feels”. There really needs to be some sort of objective measure like the blurbusters UFO test but for games.

Thing is, I have no idea what people mean when they refer to “motion”. I’ve been playing around with 240 Hz for a few days, and while walking around is pretty smooth, moving the camera rapidly even at 240 Hz is still very unpleasant and you can easily make out individual frames. It’s nothing like moving your head or eyes rapidly in reality.

So when people talk about framegen smoothness, what motion exactly are they referring to? Is my experience with rapid camera movement at 240 Hz an issue on my end?
 
Camera movement is one of the benefits of MFG. It looks much better.

I think the problem is that we have been playing on LCDs (and OLEDs) for over 20 years, and gamers have forgotten that motion smoothness and motion clarity don't have to go hand in hand. Playing a 60 FPS game on my 60 Hz HD CRT isn't smooth like 120 FPS, 240 FPS, etc., but motion clarity is much better than on an LCD. Strobing allows for >4x better motion clarity than sample-and-hold. MFG is basically "emulating" this: 60 FPS, but clarity like 240 FPS. It would be even better on displays with dynamic strobing like Nvidia's G-Sync Pulsar: MFG can bring the frame rate up to 360 FPS, and strobing at 360 Hz allows for motion clarity equivalent to 1200 Hz...
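
A rough way to put numbers on that (the pan speed and the 1 ms strobe pulse below are example values I'm assuming, not the specs of any particular display): on a sample-and-hold panel, perceived smear is roughly the pan speed multiplied by how long each frame stays on screen, while a strobed display only shows the image for the pulse width.

```python
# Rough persistence arithmetic for sample-and-hold vs strobed displays.
# Assumed example values: a 2000 px/s camera pan and a 1 ms strobe pulse.
# Approximation: perceived smear ~= pan speed * time the image is held.

PAN_SPEED_PX_PER_S = 2000.0  # example: a fast camera pan

def smear_px(hold_time_s: float) -> float:
    """Approximate motion blur in pixels for a given image hold time."""
    return PAN_SPEED_PX_PER_S * hold_time_s

scenarios = {
    "60 FPS, sample-and-hold": 1 / 60,
    "240 FPS, sample-and-hold (e.g. 60 FPS base + MFG)": 1 / 240,
    "60 FPS, strobed with a 1 ms pulse": 0.001,
}

for name, hold_s in scenarios.items():
    print(f"{name}: ~{smear_px(hold_s):.1f} px of smear")
```

That is the sense in which pushing the output frame rate up with MFG "emulates" what strobing does directly: both shorten how long each frame is held, which is what determines the smear you see when tracking motion.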
 
Camera movement is one of the benefits of MFG. It looks much better.

Yeah, I’m sure it does, but how much better? All we’re getting are anecdotal reports. There’s no objective measurement of this stuff. Like I said, my experience with 240 Hz camera panning is “better” but still not good.
 
These games can't even hit 200 FPS at 1080p. How can I use my new cheap 280 Hz 1080p display to its full potential without MFG? Especially on slower GPUs?


Lowering settings means a worse image and more artefacts. How is that a better way to improve FPS? Do you really think a 5090 owner wants to lower settings at 1080p?!
I bet a minority of 5090 owners will run at 1080p.
Same with the 4090.

I keep hearing it used as an argument, but I doubt its validity in the real world.
Any RTX x090 owners running at 1080p, please raise your hand?
 
This is an absolutely asinine comment. The objective of a benchmark is to provide a representative, yet repeatable, facsimile of a real-world scenario. Yes, the ability to perform contrastive evaluations on competing platforms is important, but if the workload being tested is irrelevant to real-world use, then the benchmark has failed at its primary reason to exist. It's like trying to compute Pi on a gaming GPU: it's repeatable and portable, but utterly useless. (I should note that there are other kinds of tools -- microbenchmarks, power viruses, etc. -- that are used in ablation studies and stress tests to analyze/debug specific elements of a design, but that's not what we are talking about here.)

Intellectual wars are fought on a regular basis to determine if benchmark suites like SPEC are representative of real-world CPU use cases. They never really get there, but it is the goal. To say that the point of benchmarks isn't real-world usage is utterly ridiculous -- that literally *is* the point, even though different benchmarks achieve varying degrees of success.

In this specific context, yes, using FSR to evaluate an NVIDIA GPU is completely absurd because *nobody* will be using the GPU that way. It's like comparing an Airbus A350 and a Boeing 787 but forcing both planes to be flown by Boeing-trained pilots to "normalize" the pilot variable.
In all my days I have never seen a canned benchmark be even close to representative of actual gameplay performance, and yet it’s basically all anybody uses.
 