Value of Hardware Unboxed benchmarking *spawn

Once more, Hardware Unboxed finds itself at the center of controversy by stating they will test RT games with FSR (and only FSR) on both NVIDIA and AMD GPUs, as they think this is "fair" and that DLSS doesn't provide extra fps on NVIDIA hardware compared to FSR. This sparked an uproar of criticism over their seemingly haphazard and ill-conceived standards.

 
@DavidGraham They can definitely be dumb about things. Should be pretty easy to prove that to be the case. I think I've seen that testing done in the past, maybe even from them, and I think there were differences.
 
What’s interesting is that Steve acknowledges DLSS is often higher quality than FSR 2.0. The only real benefit of ignoring DLSS is to simplify their testing process and the number of graphs they need to publish. It wouldn’t matter as much if there was a clear and obvious statement accompanying each review saying that DLSS is the better option for Nvidia users but somehow I doubt this will happen.

Once again they’re offering poor justification for a questionable decision.
 
What’s interesting is that Steve acknowledges DLSS is often higher quality than FSR 2.0. The only real benefit of ignoring DLSS is to simplify their testing process and the number of graphs they need to publish. It wouldn’t matter as much if there was a clear and obvious statement accompanying each review saying that DLSS is the better option for Nvidia users but somehow I doubt this will happen.

Once again they’re offering poor justification for a questionable decision.
They state DLSS is better in every review. I’m not aware of any plans to do away with DLSS testing. This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt with FSR 2 being used to equalize the workload.
 
I doubt they would do away with DLSS or XeSS in testing. It would provide a reasonable justification for Nvidia and Intel to stop supplying cards for all tests.
 
This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt with FSR 2 being used to equalize the workload.
No, they (also) did this for an NVIDIA-only 4070ti vs. 3080 test, which is very odd. Benchmarks are meant to be representative of real-world use, without which they are just irrelevant data points.

TBH it's a non-trivial problem to address for mixed-vendor benchmarks. *Today* it's not that hard of a problem to solve because DLSS on NV, FSR on AMD and XeSS on Intel are optimal in both IQ and perf on each vendor's respective platforms, but you can imagine a hypothetical scenario in future if, for example, there's a tradeoff between speed-vs-quality with FSR-vs-DLSS on NV cards. Which would you pick? Some would argue for native testing only, but then people like me would argue that that's not representative of real world use either because most DLSS implementations are good enough to always leave on, especially in Q mode. No matter what they do, someone will complain.

However, for single-vendor benchmarks it's very odd to use a suboptimal reconstruction technique from another vendor, especially one that's enabled in far fewer games than the vendor's own option.
 
TBH it's a non-trivial problem to address for mixed-vendor benchmarks.
It is rather trivial to turn rendering resolution down on all GPUs.
There really isn't a lot of need to do FSR2/DLSS/XeSS benchmarks as a rule. It is quite enough to do them occasionally to see if there are any changes to how those perform relative to a simple resolution drop.
Using FSR2 only for such benchmarks is unfair because it is an AMD technology despite being capable of running on all GPUs. Steve doesn't use Hairworks for example IIRC despite it also being capable of running on all GPUs. It shouldn't be different for FSR2.
 
No, they (also) did this for an NVIDIA-only 4070ti vs. 3080 test, which is very odd. Benchmarks are meant to be representative of real-world use, without which they are just irrelevant data points.

TBH it's a non-trivial problem to address for mixed-vendor benchmarks. *Today* it's not that hard of a problem to solve because DLSS on NV, FSR on AMD and XeSS on Intel are optimal in both IQ and perf on each vendor's respective platforms, but you can imagine a hypothetical scenario in future if, for example, there's a tradeoff between speed-vs-quality with FSR-vs-DLSS on NV cards. Which would you pick? Some would argue for native testing only, but then people like me would argue that that's not representative of real world use either because most DLSS implementations are good enough to always leave on, especially in Q mode. No matter what they do, someone will complain.

However, for single-vendor benchmarks it's very odd to use a suboptimal reconstruction technique from another vendor, especially one that's enabled in far fewer games than the vendor's own option.
Because it allowed for the 4070ti data to be reused.

It is rather trivial to turn rendering resolution down on all GPUs.
There really isn't a lot of need to do FSR2/DLSS/XeSS benchmarks as a rule. It is quite enough to do them occasionally to see if there are any changes to how those perform relative to a simple resolution drop.
Using FSR2 only for such benchmarks is unfair because it is an AMD technology despite being capable of running on all GPUs. Steve doesn't use Hairworks for example IIRC despite it also being capable of running on all GPUs. It shouldn't be different for FSR2.
In The Witcher 3 he did start enabling HairWorks by default once GPUs became more performant.
 
It is rather trivial to turn rendering resolution down on all GPUs.
True, but that isn't a perfect proxy for reconstruction either because each solution has a potentially different (and non-trivial) cost, and I believe some post-processing effects are applied in the final reconstructed space, so their costs are proportional to final upscaled resolution.

There's been some good recent work on neural image quality assessors. I'm hoping to see some advances and mainstream adoption so that we can quantitatively and objectively navigate the quality-vs-performance tradeoff space. That should resolve some of these tensions, and would nudge reviewers towards more journalistic integrity (i.e., they will have fewer excuses to find ways to play to the biases of their follower communities).
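
Until such assessors are mainstream, even classical full-reference metrics can already put a rough number on reconstruction quality if you capture the same frame both natively and through the upscaler at the same output resolution. A minimal sketch with scikit-image (file names are hypothetical, 8-bit captures assumed, and this only illustrates the general idea rather than the neural approach mentioned above):

Code:
# Compare an upscaled capture against a native-resolution reference capture
# of the same frame using SSIM and PSNR (both full-reference metrics).
from skimage.io import imread
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

native = imread("native_4k.png")          # reference: native 4K capture
upscaled = imread("fsr2_quality_4k.png")  # test: reconstructed 4K capture

ssim = structural_similarity(native, upscaled, channel_axis=-1)  # per-channel SSIM, averaged
psnr = peak_signal_noise_ratio(native, upscaled)
print(f"SSIM: {ssim:.4f}  PSNR: {psnr:.2f} dB")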
 
They state DLSS is better in every review. I’m not aware of any plans to do away with DLSS testing. This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt with FSR 2 being used to equalize the workload.

"Equalizing the workload" by using FSR which runs on general compute cores is equivalent to hobbling the Nvidia GPU's given the NV GPU's dedicate silicone to accelerating DLSS. If FSR were higher quality then that would be fine. But since its almost always lower quality, and often slower than DLSS on NV GPU's, it's a blatant way of providing a boost to AMD GPU's for benchmarking that has no bearing on reality.
 
"Equalizing the workload" by using FSR which runs on general compute cores is equivalent to hobbling the Nvidia GPU's given the NV GPU's dedicate silicone to accelerating DLSS. If FSR were higher quality then that would be fine. But since its almost always lower quality, and often slower than DLSS on NV GPU's, it's a blatant way of providing a boost to AMD GPU's for benchmarking that has no bearing on reality.
DLSS and FSR performance are typically within 1-2% of each other on an NV GPU. The benefit of DLSS is in IQ, not performance.
 
DLSS and FSR performance are typically within 1-2% of each other on an NV GPU
That's not true, in Deathloop for example, the difference can be up to 10% more fps in favor of DLSS, it's typically around 6%. Even a consistent 1% or 2% is not something to scoff at. And it's statistically wrong to ignore it.

"Equalizing the workload" by using FSR which runs on general compute cores is equivalent to hobbling the Nvidia GPU's given the NV GPU's dedicate silicone to accelerating DLSS.
That's exactly why they are doing it. Previously they insisted they would not test with upscaling at all, only at native resolution; they even said DLSS2 was unreliable for testing. But they noticed that in RT games native resolution gives NV GPUs too large an advantage, and FSR2 closes the gap a little between the latest AMD GPUs and NVIDIA GPUs, so they have now shifted to testing with FSR2 in RT games (and ONLY RT games). I guess they found that FSR2 is somehow more reliable than DLSS!
 
That's not true, in Deathloop for example, the difference can be up to 10% more fps in favor of DLSS, it's typically around 6%. Even a consistent 1% or 2% is not something to scoff at. And it's statistically wrong to ignore it.


That's exactly why they are doing it. Previously they insisted they would not test with upscaling at all, only at native resolution; they even said DLSS2 was unreliable for testing. But they noticed that in RT games native resolution gives NV GPUs too large an advantage, and FSR2 closes the gap a little between the latest AMD GPUs and NVIDIA GPUs, so they have now shifted to testing with FSR2 in RT games (and ONLY RT games). I guess they found that FSR2 is somehow more reliable than DLSS!
They were testing DLSS before AMD even had RT GPUs available. Since DLSS 2 they have strongly supported the technology. DLSS 1 was awful, so of course they decided against using it.
 
They were testing DLSS before AMD even had RT GPUs available. Since DLSS 2 they have strongly supported the technology.
They only tested DLSS2 in isolated cases, or sporadically, and never in a systematic, suite-wide manner. Now they plan to use FSR2 systematically.

DLSS 1 was awful, so of course they decided against using it.
FSR1 was just as awful as DLSS1, yet they still found a few good things to say about it and many justifications to use it, and even called it an equivalent to DLSS2. Just like now, when they falsely claim FSR2 offers the same fps as DLSS2, they have a habit of twisting facts to suit their AMD bias. And now they call DLSS3 frames "fake" frames; he says down in the comments:

DLSS3 only smooths frames, my personal experience in Witcher 3 RT with DLSS 3 is still horrible, the input lag is shocking. Basically DLSS 3 works really nice when the game is already running quite well, there is a very small window where I'd say it's great. In any case fake frames in a benchmark graph are extremely misleading as it's not true fps performance given the input remains the same (at half the displayed frame rate).

He is basing his entire decision on his definitely flawed and subjective experience in one game. Let's see if they change their minds once FSR3 comes out.

 
They only tested DLSS2 in isolated cases, or sporadically, and never in a systematic, suite-wide manner. Now they plan to use FSR2 systematically.

FSR1 was just as awful as DLSS1, yet they still found a few good things to say about it and many justifications to use it, and even called it an equivalent to DLSS2. Just like now, when they falsely claim FSR2 offers the same fps as DLSS2, they have a habit of twisting facts to suit their AMD bias. And now they call DLSS3 frames "fake" frames; he says down in the comments:

He is basing his entire decision on his definitely flawed and subjective experience in one game. Let's see if they change their minds once FSR3 comes out.

Provide a link to them ever saying FSR 1 or even 2 is equivalent to DLSS.
 
Doesn't DLSS performance mode provide the same image quality as FSR quality mode at 4k?

That in itself is a huge performance advantage to Nvidia with no loss in IQ.
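
For rough context on the pixel counts involved (assuming the standard scale factors of 1/2 per axis for Performance and 1/1.5 for Quality): at a 4K output, DLSS Performance renders internally at 1920x1080, about 2.1M pixels, while FSR 2 Quality renders at 2560x1440, about 3.7M pixels, so the Performance-mode input is only ~56% as many pixels. If the image quality really is comparable, that's a substantial amount of shading work saved.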

On another note, I do wish Nvidia would add an 'Ultra Quality' mode to DLSS as I have a few games that are in the 50-60fps range where I want to hit 60fps while still feeding DLSS with as many pixels as possible.

So it would go like this (quick check of the math after the list):

  • Ultra Quality at 4k would be 1800p
  • Ultra Quality at 1440p would be 1200p
  • Ultra Quality at 1080p would be 900p
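
A minimal sketch of that arithmetic (the Quality and Performance factors are the standard DLSS ones, 1/1.5 and 1/2 per axis; the 'Ultra Quality' factor of 1/1.2 is hypothetical and simply reproduces the resolutions listed above):

Code:
# Internal render resolutions implied by each per-axis scale factor.
OUTPUTS = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
MODES = {"Ultra Quality (hypothetical)": 1 / 1.2, "Quality": 1 / 1.5, "Performance": 1 / 2}

for out_name, (w, h) in OUTPUTS.items():
    for mode, scale in MODES.items():
        print(f"{out_name} {mode}: {round(w * scale)}x{round(h * scale)}")
# e.g. 4K Ultra Quality -> 3200x1800, 1440p -> 2133x1200, 1080p -> 1600x900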
 