Start an architecture thread somewhere else, maybe they won't notice
Just don't report anything
> They'll stop testing RT altogether next as it performs better on Nvidia.

They kinda already did. They tend to only test RT in titles where AMD is doing fine due to various reasons (CPU limitations mostly).
> What's interesting is that Steve acknowledges DLSS is often higher quality than FSR 2.0. The only real benefit of ignoring DLSS is to simplify their testing process and the number of graphs they need to publish. It wouldn't matter as much if there was a clear and obvious statement accompanying each review saying that DLSS is the better option for Nvidia users, but somehow I doubt this will happen. Once again they're offering poor justification for a questionable decision.

They state DLSS is better in every review. I'm not aware of any plans to do away with DLSS testing. This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt, with FSR 2 being used to equalize the workload.
> This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt, with FSR 2 being used to equalize the workload.

No, they (also) did this for an NVIDIA-only 4070ti vs. 3080 test, which is very odd. Benchmarks are meant to be representative of real-world use, without which they are just irrelevant data points.
> TBH it's a non-trivial problem to address for mixed-vendor benchmarks.

It is rather trivial to turn rendering resolution down on all GPUs.
> No, they (also) did this for an NVIDIA-only 4070ti vs. 3080 test, which is very odd. Benchmarks are meant to be representative of real-world use, without which they are just irrelevant data points.

Because it allowed for the 4070ti data to be reused.
TBH it's a non-trivial problem to address for mixed-vendor benchmarks. *Today* it's not that hard a problem to solve, because DLSS on NV, FSR on AMD, and XeSS on Intel are optimal in both IQ and perf on each vendor's respective platform, but you can imagine a hypothetical future scenario where, for example, there's a speed-vs-quality tradeoff between FSR and DLSS on NV cards. Which would you pick? Some would argue for native testing only, but then people like me would argue that that's not representative of real-world use either, because most DLSS implementations are good enough to always leave on, especially in Q mode. No matter what they do, someone will complain.
However, for single-vendor benchmarks it's very odd to use a suboptimal reconstruction technique from another vendor, especially one that's enabled in far fewer games than the vendor's own option.
> It is rather trivial to turn rendering resolution down on all GPUs.

In The Witcher 3 he did start enabling HairWorks as a default once GPUs became more performant.
There really isn't a lot of need to do FSR2/DLSS/XeSS benchmarks as a rule. It is quite enough to do them occasionally to see if there are any changes to how those perform relative to a simple resolution drop.
Using only FSR2 for such benchmarks is unfair because it is an AMD technology, despite being capable of running on all GPUs. Steve doesn't use HairWorks, for example, IIRC, despite it also being capable of running on all GPUs. It shouldn't be different for FSR2.
> It is rather trivial to turn rendering resolution down on all GPUs.

True, but that isn't a perfect proxy for reconstruction either, because each solution has a potentially different (and non-trivial) cost, and I believe some post-processing effects are applied in the final reconstructed space, so their cost is proportional to the final upscaled resolution.
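To make that cost argument concrete, here is a toy frame-time model. Every number in it (shading cost per megapixel, post-processing cost, upscaler pass time) is invented purely for illustration, not measured; the point is only that post-processing billed at output resolution, plus the reconstruction pass itself, makes a plain render-resolution drop cheaper than actually running FSR2/DLSS:

```python
# Toy frame-time model. All numbers are invented for illustration,
# not measurements. Shading cost scales with RENDERED pixels, while
# post-processing applied after reconstruction scales with OUTPUT
# pixels, so a plain resolution drop overstates the savings you'd
# see with FSR2/DLSS actually enabled.

SHADE_MS_PER_MPIX = 4.0   # hypothetical shading cost per megapixel
POST_MS_PER_MPIX = 0.8    # hypothetical post-processing cost per megapixel
UPSCALER_MS = 1.2         # hypothetical fixed cost of the FSR2/DLSS pass

def frame_ms(render_mpix, output_mpix, reconstruction):
    shade = SHADE_MS_PER_MPIX * render_mpix
    post = POST_MS_PER_MPIX * output_mpix  # runs in upscaled space
    return shade + post + (UPSCALER_MS if reconstruction else 0.0)

out_4k = 3840 * 2160 / 1e6       # ~8.3 MPix output
render_q = out_4k / 1.5 ** 2     # "Quality" mode: 1/1.5 scale per axis

print(f"native 4K:        {frame_ms(out_4k, out_4k, False):.1f} ms")
print(f"FSR2/DLSS Q mode: {frame_ms(render_q, out_4k, True):.1f} ms")
print(f"plain res drop:   {frame_ms(render_q, render_q, False):.1f} ms")
```

With these made-up costs the plain resolution drop comes out a few milliseconds faster than real reconstruction, which is exactly why it's only an approximate proxy.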
> They state DLSS is better in every review. I'm not aware of any plans to do away with DLSS testing. This was a specific 50 game benchmark video comparing a 4070ti to a 7900xt, with FSR 2 being used to equalize the workload.

"Equalizing the workload" by using FSR, which runs on general compute cores, is equivalent to hobbling the Nvidia GPUs, given that NV GPUs dedicate silicon to accelerating DLSS. If FSR were higher quality then that would be fine. But since it's almost always lower quality, and often slower than DLSS on NV GPUs, it's a blatant way of providing a boost to AMD GPUs in benchmarking that has no bearing on reality.

> "Equalizing the workload" by using FSR, which runs on general compute cores, is equivalent to hobbling the Nvidia GPUs, given that NV GPUs dedicate silicon to accelerating DLSS.

DLSS and FSR performance are typically within 1-2% of each other on an NV GPU. The benefit of DLSS is in IQ, not performance.
> DLSS and FSR performance are typically within 1-2% of each other on an NV GPU.

That's not true; in Deathloop, for example, the difference can be up to 10% more fps in favor of DLSS, and it's typically around 6%. Even a consistent 1% or 2% is not something to scoff at, and it's statistically wrong to ignore it.
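To illustrate why a consistent small gap shouldn't be dropped from a large suite, here's a quick sketch with made-up numbers: if DLSS ran a consistent 1-3% faster than FSR 2 in every game, the suite-wide geometric mean (the headline number in these 50-game videos) would shift by about the same amount:

```python
# Hypothetical 50-game suite. If DLSS is consistently 1-3% faster than
# FSR 2 on an NV card, the geometric mean reviewers report shifts by
# roughly that same amount, so the choice of upscaler moves the
# headline number too. All fps values below are randomly invented.
import random
from math import prod

random.seed(1)
fsr_fps = [random.uniform(60, 144) for _ in range(50)]            # made-up FSR 2 results
dlss_fps = [fps * random.uniform(1.01, 1.03) for fps in fsr_fps]  # DLSS 1-3% faster

def geomean(xs):
    return prod(xs) ** (1 / len(xs))

print(f"FSR 2 geomean:  {geomean(fsr_fps):.1f} fps")
print(f"DLSS geomean:   {geomean(dlss_fps):.1f} fps")
print(f"headline delta: {geomean(dlss_fps) / geomean(fsr_fps) - 1:.1%}")
```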
> "Equalizing the workload" by using FSR, which runs on general compute cores, is equivalent to hobbling the Nvidia GPUs, given that NV GPUs dedicate silicon to accelerating DLSS.

That's exactly why they are doing it. Previously they insisted they would not test with upscaling at all, only at native resolution; they even said DLSS 2 was unreliable for testing. But they noticed that in RT games native resolution presents too large an advantage for NV GPUs, and FSR 2 closes the gap a little between the latest AMD and NVIDIA GPUs, so they have now shifted to testing with FSR 2 in RT games (and ONLY RT games). I guess they found FSR 2 more reliable than DLSS for some reason!
> That's not true; in Deathloop, for example, the difference can be up to 10% more fps in favor of DLSS, and it's typically around 6%. Even a consistent 1% or 2% is not something to scoff at, and it's statistically wrong to ignore it.

They have been testing DLSS since before AMD even had RT GPUs available. Since DLSS 2 they have strongly supported the technology. DLSS 1 was awful, so of course they decided against using it.
Link: AMD FSR 2.0 vs. DLSS Performance in Deathloop (www.techspot.com): "FSR 2.0 is AMD's second upscaling technology attempt to compete with Nvidia's DLSS. In this article we use Deathloop to benchmark DLSS, FSR 1.0 and FSR 2.0..."
> They have been testing DLSS since before AMD even had RT GPUs available. Since DLSS 2 they have strongly supported the technology.

They only tested DLSS 2 in isolated cases, or sporadically, and never in a systematic, suite-wide manner. Now they plan to use FSR 2 systematically.
> DLSS 1 was awful, so of course they decided against using it.

FSR 1 was just as awful as DLSS 1, yet they still found a few good things to say about it and many justifications to use it, and even called it an equivalent to DLSS 2. Just like now, when they falsely claim FSR 2 offers the same fps as DLSS 2: they have a habit of twisting facts to suit their AMD bias. And now they call DLSS 3 frames "fake". He says down in the comments:
> DLSS 3 only smooths frames; my personal experience in Witcher 3 RT with DLSS 3 is still horrible, the input lag is shocking. Basically DLSS 3 works really nicely when the game is already running quite well; there is a very small window where I'd say it's great. In any case, fake frames in a benchmark graph are extremely misleading, as it's not true fps performance, given the input rate remains the same (at half the displayed frame rate).
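A back-of-the-envelope sketch of that last point, under the simplifying assumption that frame generation inserts exactly one interpolated frame per rendered frame and that input is sampled only on rendered frames (this ignores Reflex, render queueing, and interpolation overhead):

```python
# Simplified model of DLSS 3 frame generation: one interpolated frame
# is inserted between each pair of rendered frames, so displayed fps
# roughly doubles while input is still sampled only on rendered frames.
# Ignores Reflex, render queueing, and the interpolation pass overhead.

def with_frame_generation(rendered_fps):
    displayed_fps = rendered_fps * 2   # one generated frame per real frame
    input_hz = rendered_fps            # input sampled per RENDERED frame
    return displayed_fps, input_hz, 1000 / input_hz

for fps in (30, 60, 120):
    shown, sampled, ms = with_frame_generation(fps)
    print(f"rendered {fps:>3} fps -> displayed {shown:>3} fps, "
          f"input still {sampled:>3} Hz (~{ms:.1f} ms between samples)")
```

Under this model a "120 fps" frame-generated result still responds to input like a 60 fps game, which is why putting generated frames on the same graph as rendered frames is misleading.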
> They only tested DLSS 2 in isolated cases, or sporadically, and never in a systematic, suite-wide manner. Now they plan to use FSR 2 systematically.

Provide a link to them ever saying FSR 1 or even FSR 2 is equivalent to DLSS.
He is basing his entire decision on his clearly flawed and subjective experience in one game. Let's see if they change their minds once FSR 3 comes out.
> And now they call DLSS 3 frames "fake". He says down in the comments:

He is just a biased reviewer. Ignore them; NVIDIA should consider blocking them again.