No DX12 Software is Suitable for Benchmarking *spawn*

Yes, what is this even an answer to?
To your claim that the numbers aren't "personal views" but the settings are. As I've said, benchmarking and reviews are not "personal views".
Once you start doing them going off your "personal views" on what should and shouldn't be benchmarked, the results lose any value very fast.
You can provide a limited number of results because you didn't have time to do more, but then you don't have enough data to make blanket claims about things which you *didn't benchmark*.

They did have special videos focusing only on raytracing too, but do consider that their viewers aren't robots either. If they were convinced by what they saw of raytracing, whether from HU or other sites, they are free to voice their own opinion that raytracing should have a bigger part in future tests. Worst case for HU, they could even lose viewers.
They have a couple of "special" videos instead of providing the results in all videos and giving viewers an option of thinking for themselves.
Note that Metro Exodus, which was present in their 2020 results, is absent from the 2021 suite. Gee, I wonder why.

It would be interesting, however, if e.g. Sony or Microsoft could show us numbers on whether most of their userbase decides to play with raytracing on or turns it off for better performance.
You can have RT on a 2060S, or better performance with it off. You can't have RT on a 5700XT.
You can have better performance through DLSS on a 2060S, or better IQ with it off. You can't have that on a 5700XT.
The point of a review / benchmark isn't to show how people prefer to play games. It is to provide information which will allow people to decide for themselves how they would like to play games. And this particular review doesn't provide such information.
HUB has been trapped in their confirmation bias for some time now. They've made claims which they don't want to admit were wrong, and this is the sole reason for such benchmarking.
 
To your claim that the numbers aren't "personal views" but the settings are. As I've said, benchmarking and reviews are not "personal views".
Once you start doing them going off your "personal views" on what should and shouldn't be benchmarked, the results lose any value very fast.
You can provide a limited number of results because you didn't have time to do more, but then you don't have enough data to make blanket claims about things which you *didn't benchmark*.

They don't lose any value as long as it's clear how the tests were done, with information provided on what was tested and why something was ruled out.

It sounds like you mean that any review not done with all settings at the highest possible should be ignored.

They have a couple of "special" videos instead of providing the results in all videos and giving viewers an option of thinking for themselves.
Note that Metro Exodus, which was present in their 2020 results, is absent from the 2021 suite. Gee, I wonder why.

Videos focusing on raytracing do give it more attention than a review providing all options at once, and even if you do think that they tell their viewers what is right and what is wrong, the viewers are still free to form and voice their own opinion, or even skip watching HU if they think they aren't relevant anymore.

Removing Metro Exodus Enhanced is understandable as long as they still test GPUs that don't support DXR. If their next special video for raytracing doesn't include Metro, I will agree that they are biased.

You can have RT on a 2060S, or better performance with it off. You can't have RT on a 5700XT.
You can have better performance through DLSS on a 2060S, or better IQ with it off. You can't have that on a 5700XT.
The point of a review / benchmark isn't to show how people prefer to play games. It is to provide information which will allow people to decide for themselves how they would like to play games. And this particular review doesn't provide such information.

Their benchmarks do provide information on how the game will play under the specific tested circumstances, which they decided to run the way they do because that is what their viewers asked for. A review should be more fleshed out than pure benchmarks, however, and it can still be subjective which areas a review decides to focus on.

But I don't see what the fuss is about. If we are talking about the same video, it's a rematch of two two-year-old GPUs and how they do today. It goes back to their initial shootout, in which HU believed the 5700XT to be better.
He does say outright that the 5700XT isn't even an option if you want raytracing. But the takeaway is that the 5700XT typically does have a performance advantage today if you don't use raytracing, which should be pleasant news for those who did buy their 5700XT based on HU's videos.

If we return to this topic three years from now and the 5700XT evidently became useless in 2022, I can see why people would gloat at Hardware Unboxed, because they kept saying they believed it to be better than the 2060 Super.
AFAIK, the only game that does require DXR today is Metro Enhanced, and I do think it's given a pass because many are already satisfied with having played the original. I do expect a lot more complaints if a new game like Battlefield 2042 requires DXR.
 
They don't lose any value as long as it's clear how the tests were done, with information provided on what was tested and why something was ruled out.
If we're looking at numbers only, without any conclusions given alongside them, then sure, that's fair. And I've said as much already.

It sounds like you mean that any review not done with all settings at the highest possible should be ignored.
Wide blanket conclusions based on "trust me" should be ignored when no data to back them up is presented.

Removing Metro Exodus Enhanced is understandable as long as they still test GPUs that don't support DXR. If their next special video for raytracing doesn't include Metro, I will agree that they are biased.
Should we remove all games which can't run on the 6GB 2060 too? This is very relevant data for this particular comparison.

Their benchmarks do provide information on how the game will play under the specific tested circumstances, which they decided to run the way they do because that is what their viewers asked for.
I can't be bothered remembering this, but did their viewers actually ask for benchmarking games only in DX12/VK and without DLSS? RT is out, sure, why would anyone want a bad feature which does nothing but kill performance - which is what Steve has been saying for almost three years now.

But I don't see what the fuss is about. If we are talking about the same video, it's a rematch of two two-year-old GPUs and how they do today. It goes back to their initial shootout, in which HU believed the 5700XT to be better.
It's a rematch which fails to take into account any of the 2060S's strengths - which was arguably fine back in 2019-2020, but we're in 2021, where DLSS doesn't suck and RT is becoming required for games to launch. This is what all the fuss is about.

If we return to this topic three years from now and the 5700XT evidently became useless in 2022, I can see why people would gloat at Hardware Unboxed, because they kept saying they believed it to be better than the 2060 Super.
There will be games which will run fine on the 5700XT for many, many years to come. This is the problem here - it will not "become useless" overnight. But its usefulness will diminish more and more if you account for features which it doesn't support. And if HUB won't account for them even three years from now, they'll get the same result again - with the 5700XT beating the 2060S in those games which will be able to run on it, disregarding everything else that is happening in the market. This approach is wrong already, and in three years it will become nothing short of hilarious.
 
If we're looking at numbers only, without any conclusions given alongside them, then sure, that's fair. And I've said as much already.

Wide blanket conclusions based on "trust me" should be ignored when no data to back them up is presented.

I still don't get what your argument here actually is. HU does say how their tests are done and why they do so, they do what they intended, and you are always free to look up other benchmarks focusing on what you consider important.

Should we remove all games which can't run on the 6GB 2060 too? This is very relevant data for this particular comparison.

If games actually can't run on the 6GB 2060, why should those benchmarks be included? Just explain it in the text or video and everyone will know why it wasn't even represented as N/A.

It'd be wrong if the reviewers skipped these games but didn't mention why, in order to make the cards look better than they are, but I don't see any trouble with it if they actually do explain why they didn't include them.

I can't be bothered remembering this, but did their viewers actually ask for benchmarking games only in DX12/VK and without DLSS? RT is out, sure, why would anyone want a bad feature which does nothing but kill performance - which is what Steve has been saying for almost three years now.

I don't actually remember that, but I haven't seen an influx of users demanding it either, despite raytracing being bigger now than three years ago.

It's a rematch which fails to take into account any of the 2060S's strengths - which was arguably fine back in 2019-2020, but we're in 2021, where DLSS doesn't suck and RT is becoming required for games to launch. This is what all the fuss is about.

He did outright say that the 5700XT isn't even an option if you want raytracing; I just can't see how you're able to ignore that and insist his video didn't take any of the 2060's strengths into account!

Right now, there is only one single game that the three-year-old 5700XT can't run, which is far from a widespread industry requirement and, right now, a minor issue for someone who bought the 5700XT over the 2060S three years ago.

I do admit I did miss that DLSS was out of the equation. But even then, DLSS still isn't a universal feature available for all games, nor supported at release for all games either, and if DLSS is the deciding factor for someone but the game doesn't actually support it, or the support comes after they've already finished it, tough luck.

There will be games which will run fine on the 5700XT for many, many years to come. This is the problem here - it will not "become useless" overnight. But its usefulness will diminish more and more if you account for features which it doesn't support. And if HUB won't account for them even three years from now, they'll get the same result again - with the 5700XT beating the 2060S in those games which will be able to run on it, disregarding everything else that is happening in the market. This approach is wrong already, and in three years it will become nothing short of hilarious.

There's nothing wrong with users enlightening each other regarding hardware, but can they at least try to be humble regarding its future importance? DXR might make the 5700XT irrelevant soon, we don't know. 5700 owners could keep ignoring DXR and enjoy better performance than 2060 owners, we don't know.

We just don't know if GPUs not supporting DXR will be unable to play new games in two years or not until five years from now, and we don't even know if the differences until then will be larger than today.

We've been through this for years in different forms: DX11.2 supported in full only on AMD, Maxwell being better than Polaris due to feature level 12_1 over 12_0, GPU X lacking support for a specific tier in the DX12 feature set.
 
The nature of the PC market makes it incredibly unlikely the 5700XT becomes unusable. It's likely to run nearly every title released over the next several years.
 
The point of a review / benchmark isn't to show how people prefer to play games. It is to provide information which will allow people to decide for themselves how they would like to play games.

I think this is the problem with you: not acknowledging purposes of benchmarks other than this one.

Should we remove all games which can't run on the 6GB 2060 too?

In such a comparison, yes, because it wouldn't be informative.
 
The devil is in the details...
Yeah, there may be some small/indie title that partners with Nvidia and requires RTX. Even Metro is a flawed example, because one can just play the standard version, which is the very same game minus RT.
 
He did outright say that the 5700XT isn't even an option if you want raytracing; I just can't see how you're able to ignore that and insist his video didn't take any of the 2060's strengths into account!

I cannot speak for DegustatoR of course, but to me it seems like the distribution of information is a bit off. Yes, Steve mentions the RT thing in the beginning, but in all the tables he's showing, especially in the Averages (labelled 1080p max quality, when it straight-up isn't) and in the Overview, there's no sign of it.

I don't know how the majority of YouTube viewers use this content, but judging from the chapters into which HUB cuts its videos, people might be like me, jumping to the individual benchmarks they're interested in. Of course, you can blame them (and me) for not viewing the whole video, but to me, the topic of RT seems a bit like it's hidden somewhere in the fine print.
 
Removing Metro Exodus Enhanced is understandable as long as they still test GPUs that don't support DXR. If their next special video for raytracing doesn't include Metro, I will agree that they are biased.

The 5700XT is not the baseline. Removing games or ignoring them shows a biased picture. Metro Exodus is a DX12U game. And a two-year-old GPU cannot run it.
 
Yes, Steve mentions the RT thing in the beginning, but in all the tables he's showing, especially in the Averages (labelled 1080p max quality, when it straight-up isn't) and in the Overview, there's no sign of it.

But how could you show it in averages when the other card does not run it?
 
But how could you show it in averages when the other card does not run it?
For starters, you could label the chart something like "1080p highest quality w/o Raytracing" or similar. Better yet, do away with "geomeans" or indices completely. I realize they're good for easy consumption, but they are inadequate to visualize a multi-faceted topic.
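
For reference, a geomean is just the exponential of the average of the logarithms, so relative differences between games are weighted equally. A minimal sketch of what an explicitly labelled aggregate could look like (the games and fps values here are made up for illustration, not HUB's actual data):

```python
import math

# Hypothetical per-game average fps at the tested settings
# (illustrative numbers only, not real benchmark results).
fps_per_game = {
    "Assassin's Creed Valhalla": 72.0,
    "Watch Dogs Legion": 55.0,
    "Resident Evil Village": 98.0,
}

def geomean(values):
    # Geometric mean = exp of the arithmetic mean of the logs.
    # One outlier title can't dominate the index the way it
    # would in a plain arithmetic average.
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Putting the caveat directly into the label makes the
# "w/o raytracing" condition impossible to miss:
label = "1080p highest quality w/o Raytracing"
print(f"{label}: {geomean(fps_per_game.values()):.1f} fps "
      f"(geomean of {len(fps_per_game)} games)")
```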

Because Raytracing is not another game (yet) but instead an option for image quality, i.e. more work (as in: performance is work per time), it would be preferable to stick some hard-to-agree-upon measure of image quality next to all those fps values.

Or else go with Kyle's approach: maximum playable settings, where it's subjective again which game requires 45 fps instead of 30, 120 instead of 60, or 144 instead of 120, thus normalizing performance and letting the viewer decide if the IQ gain is worth it.
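
A rough sketch of how that normalization could work; the presets, targets and fps numbers are all hypothetical:

```python
# Ordered from lowest to highest quality.
PRESETS = ["low", "medium", "high", "ultra"]

# Made-up fps measured per preset for one card in one game.
measured_fps = {"low": 143, "medium": 118, "high": 96, "ultra": 71}

def max_playable(fps_by_preset, target_fps):
    # Return the highest preset that still clears the fps target,
    # or None if even the lowest preset falls short.
    best = None
    for preset in PRESETS:
        if fps_by_preset[preset] >= target_fps:
            best = preset
    return best

print(max_playable(measured_fps, 60))   # -> "ultra"
print(max_playable(measured_fps, 120))  # -> "low"
```

The subjective part is exactly the target: demand 120 fps for a shooter and this card lands on "low"; accept 60 for a slower title and it reports "ultra".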

VERY debatable topic.
 
But how could you show it in averages when the other card does not run it?
It's similar to showing performance numbers between the next-gen consoles when only one console has a particular feature.
 
I cannot speak for DegustatoR of course, but to me it seems like the distribution of information is a bit off. Yes, Steve mentions the RT thing in the beginning, but in all the tables he's showing, especially in the Averages (labelled 1080p max quality, when it straight-up isn't) and in the Overview, there's no sign of it.

I don't know how the majority of YouTube viewers use this content, but judging from the chapters into which HUB cuts its videos, people might be like me, jumping to the individual benchmarks they're interested in. Of course, you can blame them (and me) for not viewing the whole video, but to me, the topic of RT seems a bit like it's hidden somewhere in the fine print.

I do admit that it is possible to miss it, and the presentation could very well have been better, and I've always preferred reading the text instead of watching the video because it's easier to read segment by segment and pinpoint, but I do also expect users that are very critical towards a tester to watch the content carefully before posting.


I rewatched their review of the 3080 Ti from the beginning of June, and their raytracing tests there are for RE: Village, Metro Enhanced and Watch Dogs Legion, but they only have the highest-tier cards featured in the test.

It's subjective, but I have always preferred a separate section with e.g. older GPUs tested at adequate settings, rather than having them take up space as empty bars or N/A entries in the graphs.

I would, for sure, actually appreciate it if HU did include raytracing benchmarks, so one could see how the 2060 using raytracing compares to both the 2060 and the 5700XT without it, but then again that wasn't the purpose of the video.

The 5700XT is not the baseline. Removing games or ignoring them shows a biased picture. Metro Exodus is a DX12U game. And a two-year-old GPU cannot run it.

It is the baseline in the context of that video; their point back three years ago was that if you didn't care about raytracing, and don't care today either, their belief of the 5700XT being better typically holds true. The video was never supposed to be "Is DXR worth it?" or the like; it was directly intended to compare the performance without raytracing. And everyone who does care about raytracing is told the 5700XT is a no-go.

Metro Exodus Enhanced does indeed require full support for DX12U; it's however just an updated version of the 2019 game. It might be, and probably is, a sign that 4A's upcoming games will go that path, but it's hardly an issue for 5700XT users today.
Just like the 5700XT can't run the DXR path in other games that have had it added after release or include it out of the box, it's limited to running the original 2019 release rather than the updated and separate version fully intended only for DXR-capable GPUs.
 
It is the baseline in the context of that video; their point back three years ago was that if you didn't care about raytracing, and don't care today either, their belief of the 5700XT being better typically holds true. The video was never supposed to be "Is DXR worth it?" or the like; it was directly intended to compare the performance without raytracing. And everyone who does care about raytracing is told the 5700XT is a no-go.

The video compares two different GPUs. HUB uses the latest games. And most of them support Raytracing and/or DLSS. Otherwise, why are they using a game like AC: Valhalla, which runs a lot worse on nVidia than on AMD?

Metro Exodus Enhanced does indeed require full support for DX12U; it's however just an updated version of the 2019 game. It might be, and probably is, a sign that 4A's upcoming games will go that path, but it's hardly an issue for 5700XT users today.
Just like the 5700XT can't run the DXR path in other games that have had it added after release or include it out of the box, it's limited to running the original 2019 release rather than the updated and separate version fully intended only for DXR-capable GPUs.

Metro Exodus EE is a remaster of a two-year-old game. It is a free update for owners of the original, and it's the same game next-gen console users are getting. A 5700XT cannot play this game. There are no excuses for outdated hardware. In a fair and neutral comparison, the reviewer has to show this.
 
Excuses, excuses, excuses...

The community has indeed become a lot more nitpicky today than ten years ago.
Crysis 2 getting a DX11 patch months after release, and the DX9/10 owners being unable to run the separate DX11 exe, didn't make users say that DX11 GPUs were now required for gaming. Neither did any of the other games that got separate DX11 exes, or DX10 prior to that, prompt such statements from users.

The video compares two different GPUs. HUB uses the latest games. And most of them support Raytracing and/or DLSS. Otherwise, why are they using a game like AC: Valhalla, which runs a lot worse on nVidia than on AMD?

As said, it compares two different GPUs under the specific circumstance of how they do without using raytracing. It wasn't meant to investigate whether DXR is worth it today.
As for why they include AC: Valhalla, sure it does seem to favor AMD, but why not just accept that it's a sample of a modern game?

Metro Exodus EE is a remaster of a two-year-old game. It is a free update for owners of the original, and it's the same game next-gen console users are getting. A 5700XT cannot play this game. There are no excuses for outdated hardware. In a fair and neutral comparison, the reviewer has to show this.

It's the same game but updated and intended only for DXR users, just like how older games were patched and sometimes required you to run a separate DX11 exe if you wanted the DX11 features. I don't remember people back then saying the games were now unplayable on older GPUs, so why is it different now?
 
And most of them support Raytracing and/or DLSS.
Exactly, the hypocrisy of their logic is at an all-time high right about now; they claim RT and DLSS still don't have enough support, yet half the games they tested support either Ray Tracing or DLSS, even after they suspiciously omitted some of the heavy games, like Control and Metro Exodus, from their testing.
Metro Exodus EE is a remaster of a two years old game. It is a free update for owners of the orginal and its the same game next gen console users are getting. A 5700XT can not play this game.
The 5700XT also can't access other titles: Minecraft RTX, Quake 2 RTX and Stay In The Light (an indie game that requires an RT-capable GPU). The new Avatar game will also rely on hardware RT to run, with a software fallback mode that looks and performs worse for non-RT GPUs.
 
As said, it compares two different GPUs under the specific circumstance of how they do without using raytracing. It wasn't meant to investigate whether DXR is worth it today.
As for why they include AC: Valhalla, sure it does seem to favor AMD, but why not just accept that it's a sample of a modern game?

I didn't say anything about DXR. I said that HUB uses the latest games with image quality settings not available on a 5700XT. Does it matter that this is raytracing?

It's the same game but updated and intended only for DXR users, just like how older games were patched and sometimes required you to run a separate DX11 exe if you wanted the DX11 features. I don't remember people back then saying the games were now unplayable on older GPUs, so why is it different now?

I remember that AMD's R300 was much better for DX9 than nVidia's CineFX. But I can't remember that it was forbidden to play the DX9 version of games...
 