No DX12 Software is Suitable for Benchmarking *spawn*

Read my post again. If you want RT then the performance comparison is not relevant since there's only one card that does RT. He mentions this at the start of the video. At the end of the video Steve also spends quite a bit of time on the extra features of the 2060, including DLSS.
RT isn't something you can "want" or not universally; it's a rendering approach which can be used in an unlimited number of ways. In some games you may want to use it and in some not.
The point is that this should be shown on the graphs, which are the quantifiable result of benchmarking. Then, instead of just talking about what these features do, you'd have data showing what the 2060S can do in comparison to the 5700XT.
 
RT isn't something you can "want" or not universally; it's a rendering approach which can be used in an unlimited number of ways. In some games you may want to use it and in some not.
The point is that this should be shown on the graphs, which are the quantifiable result of benchmarking. Then, instead of just talking about what these features do, you'd have data showing what the 2060S can do in comparison to the 5700XT.

You mean like showing RT benches with an N/A score for the 5700xt? I wouldn’t find that very useful since I already know it doesn’t support RT.
 
You mean like showing RT benches with an N/A score for the 5700xt? I wouldn’t find that very useful since I already know it doesn’t support RT.
Yeah, show the 2060S without RT, with RT / DLSS / RT+DLSS. This way the data for the comparison against the 5700XT would be fully available for you to decide whether these features - and their performance impact - are of any importance to you.
Without that you only have Steve's word on what he thinks about these features, and that's just not how things should be benchmarked between these two cards.
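As a rough illustration of what that could look like on a chart: a minimal sketch of a per-game, per-mode results table with N/A wherever a card doesn't support a mode. Every FPS number below is a made-up placeholder to show the layout, not a measurement from the video or anywhere else:

```python
# Sketch of a per-mode comparison table for one game. Every FPS value here is
# a made-up placeholder purely to show the layout, not real benchmark data.
MODES = ["Native", "RT", "DLSS", "RT+DLSS"]

results = {
    # card -> {mode -> average FPS, or None where the card doesn't support the mode}
    "RTX 2060 Super": {"Native": 60.0, "RT": 42.0, "DLSS": 78.0, "RT+DLSS": 55.0},
    "RX 5700 XT":     {"Native": 65.0, "RT": None, "DLSS": None, "RT+DLSS": None},
}

def cell(fps):
    """Format one table cell: a number if the mode ran, 'N/A' if unsupported."""
    return f"{fps:.1f}" if fps is not None else "N/A"

print(f"{'Card':<16}" + "".join(f"{m:>10}" for m in MODES))
for card, per_mode in results.items():
    print(f"{card:<16}" + "".join(f"{cell(per_mode[m]):>10}" for m in MODES))
```

One chart like this per game would let readers weigh the cost and benefit of each feature themselves instead of relying on the reviewer's summary.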
 
You mean like showing RT benches with an N/A score for the 5700xt? I wouldn’t find that very useful since I already know it doesn’t support RT.

I think buyers of the 5700XT don't care either. On the other hand, buyers of an RTX 2060S will likely use DLSS...
 
Wonder why he didn't compare the 5700XT Anniversary Edition vs the 2070 or 2070S. Introductory prices were similar.
 
Yeah, show the 2060S without RT, with RT / DLSS / RT+DLSS. This way the data for the comparison against the 5700XT would be fully available for you to decide whether these features - and their performance impact - are of any importance to you.
Without that you only have Steve's word on what he thinks about these features, and that's just not how things should be benchmarked between these two cards.

You aren't doing much of a benchmark between them when you aren't using the same settings.
 
You aren't doing much of a benchmark between them when you aren't using the same settings for them.
Depends on what you want to do - show how these cards perform in the real world, where DLSS and RT exist and people are already playing MEEE - or make a synthetic statement on how these two GPUs compare when running the same code.
The latter is mostly irrelevant to an end user though, and is practically impossible to do anyway, since there is no way of making sure that the games you test do in fact run the same code on these GPUs.
Thus you're down to synthetic benchmarks (which are proven to run the same code on all h/w) and benchmarks you write yourself.
And this test doesn't look like the latter.
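On the "benchmarks you write yourself" point, a minimal sketch of the idea, assuming pyopencl and a working OpenCL driver for whichever GPU you pick: the same kernel source is compiled and timed on the selected device, so both vendors provably execute identical code. The kernel, sizes, and names are arbitrary illustrations, nothing from the video:

```python
# Sketch of a "benchmark you write yourself": one OpenCL kernel, compiled from
# the same source string on whatever GPU is selected, so both vendors provably
# run identical code. Requires pyopencl and a working OpenCL driver.
import numpy as np
import pyopencl as cl

KERNEL_SRC = """
__kernel void saxpy(__global float *y, __global const float *x, const float a) {
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
"""

n = 1 << 24
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

ctx = cl.create_some_context()  # pick the GPU under test here
queue = cl.CommandQueue(
    ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)
mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

prog = cl.Program(ctx, KERNEL_SRC).build()
evt = prog.saxpy(queue, (n,), None, y_buf, x_buf, np.float32(2.0))
evt.wait()
ms = (evt.profile.end - evt.profile.start) * 1e-6  # ns -> ms
print(f"saxpy over {n} elements: {ms:.3f} ms on {ctx.devices[0].name}")
```

It only shows the principle - a game-like workload would need far more than one kernel - but it's the kind of apples-to-apples code path you can't guarantee with shipped games.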
 
Depends on what you want to do - show how these cards perform in the real world, where DLSS and RT exist and people are already playing MEEE - or make a synthetic statement on how these two GPUs compare when running the same code.

That is a false dichotomy, and the games tested are just as much real world.
 
That is a false dichotomy, and the games tested are just as much real world.
The absolute majority of 2060S owners would play Death Stranding with DLSS in the real world. The same goes for a number of other titles on that list. And I'm not even mentioning the fact that in many of these titles the 2060S would in fact perform better under DX11 while not losing anything at all in IQ.
 
The absolute majority of 2060S owners would play Death Stranding with DLSS in the real world. The same goes for a number of other titles on that list. And I'm not even mentioning the fact that in many of these titles the 2060S would in fact perform better under DX11 while not losing anything at all in IQ.

And that's fine. By the way, do you have any data to back that absolute majority up?
 
He says at the start of the video that if you want RT then the 2060S is the only option and the comparison to the 5700XT isn't relevant. He also acknowledges that his opinion that RT isn't viable on the 2060S is just that - one man's opinion. So you have to give him credit for being up front about it, at least.
He uses the video to reinforce the notion that picking up the 5700XT was the absolute best option back then and remains the best option even right now, despite being proven wrong on multiple fronts: the accelerated adoption of DLSS in dozens of games, the ray tracing capabilities of the Turing GPUs, which enable them to reach levels of image quality not available to RDNA1 GPUs, and the DX12U features included with Turing, which make it more future-proof than the RDNA1 GPUs that lack them.

For all they cry about how much more future-proof a 16GB VRAM GPU is than an 8GB one, they suspiciously discard all these factors when they discuss RDNA1 vs Turing, as if suddenly future-proofing is no longer a concern. So, in summary, he reinforces his past opinion even though time has proven him wrong, which is incredibly deceptive of him.
 
How the performance difference between them changed in two years.
But he has the picture inverted by not using DLSS2 in the games that support it: Outriders, Call of Duty Modern Warfare, DOOM Eternal, Death Stranding, Watch Dogs: Legion, Cyberpunk 2077, Fortnite and Rainbow Six Siege. That makes all of his comparisons invalid IMO; the 2060 Super would come out solidly ahead with DLSS, yet he uses this flawed methodology to state the opposite!

Two years ago he used to complain that not enough DLSS titles existed, but now that half his tested games include DLSS support, he casually dismisses DLSS2 and didn't even put it on the charts next to native rendering, as it SHOULD be.
 
He uses the video to reinforce the notion that picking up the 5700XT was the absolute best option back then and remains the best option even right now, despite being proven wrong on multiple fronts: the accelerated adoption of DLSS in dozens of games, the ray tracing capabilities of the Turing GPUs, which enable them to reach levels of image quality not available to RDNA1 GPUs, and the DX12U features included with Turing, which make it more future-proof than the RDNA1 GPUs that lack them.

For all they cry about how much more future-proof a 16GB VRAM GPU is than an 8GB one, they suspiciously discard all these factors when they discuss RDNA1 vs Turing, as if suddenly future-proofing is no longer a concern. So, in summary, he reinforces his past opinion even though time has proven him wrong, which is incredibly deceptive of him.

He acknowledges all of what you said in his summary at the end of the video. I'm not sure what else he can do. It's not like you can predict or quantify the benefits of sampler feedback or the other currently unused features. This is one of the better videos of theirs I've seen, where they actually take the time to talk about value-added features. It's an improvement on their other stuff, where they pretend those features don't even exist or have no value.
 