No DX12 Software is Suitable for Benchmarking *spawn*

I didn't say anything about DXR. I said that HUB uses the latest games with image quality settings not available on a 5700XT. Does it matter that this is raytracing?

Every reviewer is free to use whatever settings he wants to; the important part, in order to be taken seriously, is explaining how the tests were conducted and why something was disabled or enabled.

I remember that AMD's R300 was much better for DX9 than Nvidia's CineFX. But I can't remember that it was forbidden to play the DX9 version of games...

Rofl, no one has forbidden anyone to play the DX9 version or the DXR version. I don't even remember when DX8 hardware actually went extinct; in contrast, my strongest memory from the early 2000s is when id Software begged people not to buy the GeForce 4 MX for Doom 3, and the game did end up supporting it anyway.
 
Every reviewer is free to use whatever settings he wants to
And every reader is free to criticize the reviewer if their choice of settings is irrelevant to the market.

the important part, in order to be taken seriously, is explaining how the tests were conducted and why something was disabled or enabled.
If I use only RT+DLSS, completely removing all non-Nvidia cards from my benchmarks, it won't be taken seriously no matter how I try to explain it. Same thing for someone who ignores RT in 2021.
 
Every reviewer is free to use whatever settings he wants to; the important part, in order to be taken seriously, is explaining how the tests were conducted and why something was disabled or enabled.

HUB doesn't need to compare GPUs from different IHVs. But when I make a cross comparison, I can't ignore all the shortcomings of one vendor. Metro Exodus EE is a DX12U game. So when a 5700XT can't run it, that has to be shown.
 
The community has indeed become a lot more nitpicky today than ten years ago.
Crysis 2 getting a DX11 patch months after release and the DX9/10 owners being unable to run the separate DX11 exe didn't make users say that DX11 GPUs were now required for gaming. Neither did any of the other games that got separate DX11 exes, or DX10 before that, prompt users to make such statements.

Selective memory is a bad argument.
AF was a major contention point back in the day.
So was AA.
So was SM2.0 vs SM3.0 (e.g. when Far Cry got the SM3.0 HDR patch).
And you must have been asleep when DX10 was introduced too.
 
And every reader is free to criticize the reviewer if their choice of settings is irrelevant to the market.


If I use only RT+DLSS, completely removing all non-Nvidia cards from my benchmarks, it won't be taken seriously no matter how I try to explain it. Same thing for someone who ignores RT in 2021.

You are indeed free to criticize the reviewer. You aren't, however, The Authority who decides what settings are irrelevant to the market, so can you drop that attitude?

And there wouldn't actually be anything wrong with the numbers themselves if you did so; you simply say you focus on RT + DLSS, do your part, and readers are free to make use of your results. AMD fanboys can say they don't like your benchmarks, but they can't say your numbers are wrong for Nvidia users.
You would probably be the favourite reviewer amongst the Nvidia fanboys, but the obvious drawback is that your viewership/readership would probably be tiny compared to other reviewers'.
 
You are indeed free to criticize the reviewer. You aren't, however, The Authority who decides what settings are irrelevant to the market, so can you drop that attitude?

And there wouldn't actually be anything wrong with the numbers themselves if you did so; you simply say you focus on RT + DLSS, do your part, and readers are free to make use of your results. AMD fanboys can say they don't like your benchmarks, but they can't say your numbers are wrong for Nvidia users.
You would probably be the favourite reviewer amongst the Nvidia fanboys, but the obvious drawback is that your viewership/readership would probably be tiny compared to other reviewers'.

HUB is 100% irrelevant for me.
Like it or not.
 
I do admit that it is possible to miss it, and the presentation could very well have been better. I've always preferred reading the text instead of watching the video because it's easier to read segment by segment and pinpoint things, but I do also expect users who are very critical of a tester to watch the content carefully before posting.
And rightfully so! There are, however, a whole lot of people who are just casual consumers, whom you might not hold to those high standards. And they might or might not miss the hint at the beginning, or may have stopped listening by the time Steve goes over the topic of RT in his conclusion. Let alone all the screenshots of his graphs that are posted online completely without context.
 
Selective memory is a bad argument.
AF was a major contention point back in the day.
So was AA.
So was SM2.0 vs SM3.0 (e.g. when Far Cry got the SM3.0 HDR patch).
And you must have been asleep when DX10 was introduced too.


Right, so do you actually remember everyone back then going nuts and saying that, e.g., the SM3.0 HDR patch for Far Cry now meant that older GPUs couldn't run games anymore?

I wasn't asleep during the DX10 era. I do remember multiple games having a separate exe for DX9 and DX10, and later DX11, which is why I take great care not to use a single updated version as proof that older GPUs now can't run games anymore. And if you really were around during that time, I'd hope you'd know better than that too.

HUB is 100% irrelevant for me.
Like it or not.

I don't care, and no one can put you on trial for it either. But at least have a respectful attitude instead of badmouthing them simply because they have a different opinion than you.
 
They did have special videos focusing only on raytracing too, but do consider that their viewers aren't robots either; if they were convinced by what they saw of raytracing, either from HU or other sites, they are free to voice their own opinion that raytracing should have a bigger part in future tests. Worst case for HU, they could even lose viewers.

It would be interesting, however, if e.g. Sony or Microsoft could show us numbers on whether most of their user base plays with raytracing on or turns it off for better performance.
Oh come on, after three years I think we've had enough of this anti-progress and anti-technology sentiment. Just because some less educated fans/reviewers think RT is irrelevant, because their GPU either lacks it outright or is incapable of running it at sufficient speed, we get to endure this constant parade of doubting the impact of RT.

RT and DX12 Ultimate are nothing more than an upgrade to the base DX12; call it DX12 Ultimate, DX13, DXR, or whatever. Buying GPUs that don't support the latest DX version has been universally accepted as the dumbest choice you can make, and has been proven time and time again to be incredibly harmful for the longevity of such hardware, as the user gradually gets shafted as new games come out, especially if the current generation of consoles already supports that DX standard.

But because one reviewer thinks his opinion is better than the standards set by history and technology, we get to aimlessly debate this in 2021 among tech professionals!
 
Oh come on, after three years I think we've had enough of this anti-progress and anti-technology sentiment. Just because some less educated fans/reviewers think RT is irrelevant, because their GPU either lacks it outright or is incapable of running it at sufficient speed, we get to endure this constant parade of doubting the impact of RT.

RT and DX12 Ultimate are nothing more than an upgrade to the base DX12; call it DX12 Ultimate, DX13, DXR, or whatever. Buying GPUs that don't support the latest DX version has been universally accepted as the dumbest choice you can make, and has been proven time and time again to be incredibly harmful for the longevity of such hardware, as the user gradually gets shafted as new games come out, especially if the current generation of consoles already supports that DX standard.

But because one reviewer thinks his opinion is better than the standards set by history and technology, we get to aimlessly debate this in 2021 among tech professionals!

If you think Hardware Unboxed is bad, you can simply skip their videos completely and stick to the sites that focus on what you consider important.

I do think the 5700XT will age worse in the long run, simply because both new consoles are fully DX12U capable. However, the importance of DX versions can still be debated. Has Maxwell supporting feature level 12_1 made it age better than Polaris?
 
Attacking other members and being overly aggressive is not welcome on B3D. Remain civil or accounts will be banned.
 
However, the importance of DX versions can still be debated
No it can NOT, and it never has been. Period.

Has Maxwell supporting feature level 12_1 made it age better than Polaris?
Sub-features are not at the same level as a MAJOR DX version upgrade. DX10.1 was not crucial, but DX10 was, as were DX9 and DX11, etc.

Though even sub-features are sometimes important, as was the case with DX9c.
 
However, the importance of DX versions can still be debated.

Is DXR relevant? Are missing graphical options relevant? Is starting a game relevant?
A 5700XT doesn't support DXR 1.0 (unlike Pascal). With Pascal you can enable raytracing effects in DXR 1.0 games. At what point is a (sub) DX feature so relevant that not supporting it is a negative?
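As a side note, which DXR tier a card exposes is simply what the driver reports through the D3D12 API, so games can turn the DXR options on or off based on that. A minimal sketch of the check, assuming the standard d3d12.h headers and linking against d3d12.lib; on Pascal with the DXR-enabled drivers this reports Tier 1.0, while a 5700XT reports no tier at all:

```cpp
// Minimal sketch: ask the driver of the default adapter which DXR tier it reports.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main()
{
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1; // no D3D12-capable adapter

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
        std::printf("DXR 1.1 reported\n");
    else if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::printf("DXR 1.0 reported\n");
    else
        std::printf("No DXR support reported\n"); // e.g. 5700XT today
    return 0;
}
```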
 
No it can NOT, and it never has been. Period.


Sub-features are not at the same level as a MAJOR DX version upgrade. DX10.1 was not crucial, but DX10 was, as were DX9 and DX11, etc.

Though even sub-features are sometimes important, as was the case with DX9c.

The main argument amongst users who have stuck with their older GPUs, or chose a significantly cheaper older GPU over a modern one, is that by the time games are made with the new DX version in mind, the cards will be too slow anyway. If one wants to draw a hard line, I'd say they have been wrong, but that requirement has very rarely hit them until five years after everyone started offering support.

I do agree that it's dumb to buy an older non-DX12U GPU today unless it's much cheaper, and all new GPUs support DX12U fully; you can't even choose a non-DXR Ampere or RDNA2 AFAIK. But if the now two-year-old 5700XT does continue having support for all new games and only loses support when it's seven years old, I don't see people freaking out about it. Many sites even stop benchmarking GPUs that are that old.

I have told my friends who asked me for advice during this shortage to hold out and try to get an RDNA2 or Ampere GPU if they can find one. I have also told them of Nvidia's advantage in DXR and with DLSS, but also that I think a high-end GPU like the 3070 will be hurt by only having 8GB of VRAM long term.
 
Is DXR relevant? Are missing graphical options relevant? Is starting a game relevant?
A 5700XT doesn't support DXR 1.0 (unlike Pascal). With Pascal you can enable raytracing effects in DXR 1.0 games. At what point is a (sub) DX feature so relevant that not supporting it is a negative?

What I consider relevant is when it actually becomes a requirement. If I were a 5700XT owner and it happened tomorrow, I'd be pissed. If it happens five years from now, I wouldn't waste energy venting my frustration. Right now, I don't consider the 5700XT at any particular disadvantage compared to the 2060, and in the context of Hardware Unboxed recommending it over the 2060 two years ago, the users who trusted them should have known full well that it lacked DXR.

I definitely wouldn't complain if AMD offered software DXR for the 5700XT, but most Pascal users I've seen talking about it complain that the performance is too low anyway.

Btw, isn't there one somewhat modern game that has an exclusive graphics option for Intel IGPs? It's always funny to find the extreme examples of when one game has an advantage on other hardware.
 
You are indeed free to criticize the reviewer. You aren't, however, The Authority who decides what settings are irrelevant to the market, so can you drop that attitude?

And there wouldn't actually be anything wrong with the numbers themselves if you did so; you simply say you focus on RT + DLSS, do your part, and readers are free to make use of your results. AMD fanboys can say they don't like your benchmarks, but they can't say your numbers are wrong for Nvidia users.
You would probably be the favourite reviewer amongst the Nvidia fanboys, but the obvious drawback is that your viewership/readership would probably be tiny compared to other reviewers'.
Again, the problem is that this particular comparison doesn't work no matter how you try to explain it. 2060S users bought the card because of its features, and this comparison ignores them completely while trying to compare the cards. All the data relevant to a proper assessment of what the 2060S can do is missing from the comparison. It's just flawed.

And saying "trust me guys, RT and DLSS suck so we're not even benchmarking them" doesn't improve this in the slightest. Steve is the last guy on the planet I would trust on anything related to RT right now. And they should make an effort to earn that trust back instead of producing more sensational videos made on a false premise.
 
Oh come on, after three years I think we've had enough of this anti-progress and anti-technology sentiment. Just because some less educated fans/reviewers think RT is irrelevant, because their GPU either lacks it outright or is incapable of running it at sufficient speed, we get to endure this constant parade of doubting the impact of RT.

RT and DX12 Ultimate are nothing more than an upgrade to the base DX12; call it DX12 Ultimate, DX13, DXR, or whatever. Buying GPUs that don't support the latest DX version has been universally accepted as the dumbest choice you can make, and has been proven time and time again to be incredibly harmful for the longevity of such hardware, as the user gradually gets shafted as new games come out, especially if the current generation of consoles already supports that DX standard.

But because one reviewer thinks his opinion is better than the standards set by history and technology, we get to aimlessly debate this in 2021 among tech professionals!

Many game developers are JUST NOW fully learning DX12U, because many are taking a crash course in developing for the new RDNA2 consoles. As mentioned, older GTX/Polaris cards are going to start choking under these newer DX12 games.

These new "modern" games will have DXR on/off and make use of DirectML and the full gamut of open-source/industry-standard features. There can only be one standard; it's the Highlander rule. RTX technically is not DXR and DLSS technically is not DirectML, and game developers know that Nvidia, Intel, and AMD will fully support DX12U.
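For reference, "full DX12U support" in API terms comes down to four capability checks the driver has to pass: DXR 1.1, mesh shaders, sampler feedback, and VRS tier 2. A rough sketch of those checks, assuming a Windows SDK new enough to declare the OPTIONS6/OPTIONS7 feature structs (the helper name here is just illustrative):

```cpp
// Rough sketch: the four feature checks that together make up "DX12 Ultimate".
// Assumes a valid ID3D12Device and a recent Windows SDK's d3d12.h.
bool SupportsDX12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {}; // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {}; // variable rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {}; // mesh shaders, sampler feedback
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    return o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1 &&
           o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 &&
           o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1 &&
           o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
}
```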



David...
I can tell you that I know tons and tons of people who have an $800+ RTX card, nearly everyone in my ARMA clan, who do not use ray tracing while gaming in other titles. Like them, I turned RTX on in BF:V with my 2080 just to see what all the noise was about. It was a novelty, nothing more. And if it were an option in games I play, I would "try it" to see what it looks like, but again the novelty wore off within hours. I was forcing myself at that point; the performance hit was just too great for any meaningful use.

I understand that my $900 RTX 2080 is not going to get any faster at ray tracing, now or later, and that RT will always remain a novelty on the RTX 20 series, no matter the game. Game developers know this too.

They know that it takes a $1k+ dGPU to render ray tracing with any actual benefit, so game developers are slow-walking their use of ray tracing until the foundation is set as a STANDARD. I am an early adopter and a highly educated fan of ray tracing, and I DO think ray tracing is irrelevant in current games, because its use is for showcasing paid-for content and is not universal in games or in the industry. All ray-traced games do right now is sap performance and add novelty to the idea.
 
Again, the problem is that this particular comparison doesn't work no matter how you try to explain it. 2060S users bought the card because of its features, and this comparison ignores them completely while trying to compare the cards. All the data relevant to a proper assessment of what the 2060S can do is missing from the comparison. It's just flawed.

And saying "trust me guys, RT and DLSS suck so we're not even benchmarking them" doesn't improve this in the slightest. Steve is the last guy on the planet I would trust on anything related to RT right now. And they should make an effort to earn that trust back instead of producing more sensational videos made on a false premise.

Once again, this comparison was only intended to show the performance without raytracing. Perhaps he should have named it "Is the 5700XT still better if you don't use raytracing?" to be clear, but if one watches the video carefully, he acknowledges that the 2060 was the only choice if you do care about raytracing. This all goes back to their initial recommendation of the 5700XT over RTX 2060 if you don't care about raytracing, and based on the games they tested, that recommendation seems to still hold true.

This comparison is useful for users who don't care about raytracing. It's not useful for people who do care about raytracing. That's all there is to it, and I don't see why it needs constant debating.
I for one do appreciate sites focusing on different aspects; that gives users a lot of data to base decisions on, and whether I think HU is sometimes faulty or not, I always check other sites too to get a broader picture. For years my go-to site for benchmarks was DSOGaming, because their old test system had a Sandy Bridge-E which they also tested core scaling on, and that site was the one that gave the best estimation of how my PC with a Sandy Bridge CPU would actually perform.


For all the newfound strengths of the 2060 though, I'm one of those guys who have a bittersweet experience with it. I did buy one when I needed to replace my old Titan, and at that time I skipped raytracing and DLSS because DLSS sucked and I thought the performance with RT wasn't worth it. And now that I have checked the games out, specifically for DLSS, I just have no interest in replaying them.
 
Once again, this comparison was only intended to show the performance without raytracing.
A. Nowhere in the title or video thumbnail does it say this. Thus this wasn't the intent of the creators.
B. Read my post again. Comparing these two cards without RT makes as much sense as comparing them only in DX7 titles in the modern day. RT is a feature of the 2060S for which people were buying the card in the first place.

This all goes back to their initial recommendation of the 5700XT over RTX 2060 if you don't care about raytracing, and based on the games they tested, that recommendation seems to still hold true.
Does it? About a third of the titles they've compared support DLSS 2. This is also completely ignored for no reason whatsoever. Back in 2019 their line was "the 2060S could be better if DLSS gets traction and support". It got both, but they simply ignore it now.
C'mon, it's total confirmation bias over there. He is even using Doom Eternal already to "confirm" that 8GB isn't enough, "just like he said", totally ignoring the fact that DE's texture quality is the same across the top three streaming buffer settings. He's just incapable of saying "yeah, we were wrong on that one".
 
if the now two-year-old 5700XT does continue having support for all new games and only loses support when it's seven years old
It lost support for 35+ titles just two years after it launched (it can't access several titles because of that restriction), and that's only a few months after the start of the new console cycle. This card isn't even going to get four years, let alone seven.
 