No DX12 Software is Suitable for Benchmarking *spawn*

A. Nowhere in the title or video thumbnail does it say this. Thus this wasn't the intent of the creators.
B. Read my post again. The comparison of these two cards without RT makes as much sense as comparing them only in DX7 titles in the modern day. RT is a feature of the 2060S for which people were buying the card in the first place.


Does it? About a third of the titles they've compared support DLSS 2. This is also completely ignored for no reason whatsoever. Back in 2019 their line was "the 2060S could be better if DLSS gets traction and support". It got both, yet they simply ignore it now.
C'mon, it's total confirmation bias over there. He is even using Doom Eternal to "confirm" that 8GB isn't enough, "just like he said" - totally ignoring the fact that DE's texture quality is identical across the top three texture-streaming settings. He's just incapable of saying "yeah, we were wrong on that one".

I have already agreed that the presentation could have been done better, but if you actually watch the video carefully, I do think he addresses those points.

This video wasn't targeted at 2060S owners who bought the card for DXR and DLSS. As for why they skip DLSS, perhaps they're simply still reluctant to use it, since it's still provided on a case-by-case basis for selected titles instead of being universally supported, and sometimes only arrives after launch?

As for Doom Eternal, I don't know how well he's read up on that - they typically don't do in-depth tests like Digital Foundry - but why can't it just be human error?
I do however think that 8GB of VRAM on a 3070 is a warning sign; even the bloody 3060 has 12GB. The nice thing about having overkill VRAM is that you can typically set textures to Ultra without noticing a performance decrease, even on weak GPUs. Case in point: the GTX 1050 2GB has hit the VRAM limit in modern games, whereas the 4GB model is still OK.


Why can't you just leave this video as is? It's obvious that you don't agree with Hardwareunboxed at all, but for the users who don't care about raytracing, that video is a fully relevant comparison. And for the ones who do find raytracing super important, they can simply ignore Hardwareunboxed altogether and check other benchmarks.
 
It lost support for 35+ titles just 2 years after it launched - it can't access several titles because of that restriction - and that is only a few months after the start of the new console cycle. There is not going to be even 4 years of relevance for this card, let alone 7.

Which 35 titles are they locked out from playing? If the majority of PC gamers cared about not being able to play the 23-year-old Quake 2 with DXR, you'd probably see more topics about it.
 
For starters, you could label the chart something like "1080p highest quality w/o Raytracing" or similar. Better yet, do away with geomeans and indices completely. I realize they're good for easy consumption, but they are inadequate for visualizing a multi-faceted topic.

Because Raytracing is not another game (yet) but rather an image-quality option - i.e. more work, since performance is work per time - it would be preferable to put some hard-to-agree-upon measure of image quality next to all those fps values.

Or else go with Kyle's approach: maximum playable settings. There it's subjective again which game requires 45 fps instead of 30, 120 instead of 60, or 144 instead of 120 - thus normalizing performance and letting the viewer decide if the IQ gain is worth it.

VERY debatable topic.

Very debatable indeed :)
Do not downplay averages as mere easy consumption; they can efficiently show overall trends and be enough for people who play many games.
Who would be the authority on quantifying image quality? I am afraid this would be even more passionately contested, no matter how it's implemented.
I am glad there are more approaches, we get more information from reviewers going their own way.

In the same vein, the same HUB is showing cards which can't run a game due to a lack of VRAM. I believe they've put "need more minerals" on them in place of fps numbers? Seems fitting for a 5700XT in RT workloads.

If it was the same kind of video, that would be a highly dubious inconsistency and I would agree. I don't know which video that was in.

Exactly, the hypocrisy of their logic is at an all-time high right about now. They claim RT and DLSS still don't have enough support, yet half the games they tested support either Ray Tracing or DLSS - even after they suspiciously omitted some of the heavier games from their testing, like Control and Metro Exodus.

Now you are just lying, they did point out the high adoption of DLSS.

But because one reviewer thinks his opinion is better than the standards set by history and technology, we get to aimlessly debate this in 2021 among tech professionals!

Very strange how you cannot get over it.

RT is a feature of 2060S for which people were buying the card in the first place.

I have doubts. Do you have any data to back that up?
 
Very debatable indeed :)
Do not downplay averages as mere easy consumption; they can efficiently show overall trends and be enough for people who play many games.

Who would be the authority on quantifying image quality? I am afraid this would be even more passionately contested, no matter how it's implemented.
I am glad there are more approaches, we get more information from reviewers going their own way.
I am not! :) I explicitly said geomeans - because that's what can basically take all the meaning out of an average, compared to a median or normalized averages. I'm sure I don't need to spell out here in detail why that is.
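To make that concrete, here's a quick numeric sketch with made-up fps values (not real benchmark data), showing how a single outlier title moves the arithmetic mean far more than it moves the geomean or the median:

```python
from statistics import geometric_mean, mean, median

# Hypothetical per-game fps results for one GPU - illustrative values only.
# One lightweight outlier title skews the different summary numbers differently.
fps = [30, 60, 120, 600]

print(mean(fps))                      # arithmetic mean: 202.5 - dominated by the outlier
print(round(geometric_mean(fps), 1))  # geomean: 106.7 - damped, but still shifted
print(median(fps))                    # median: 90 - ignores the outlier's magnitude
```

Whichever summary a reviewer picks, the single index hides that the per-game spread is 30 to 600 fps - which is the point about multi-faceted topics.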

WRT IQ - yeah, that's the hard part. If people would settle for, say, a PSNR analysis like is done in actual film work, maybe that could help. But as it is... pretty much no chance.
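The PSNR number itself is trivial to compute - the contentious part is agreeing that it measures anything meaningful for game rendering. A minimal sketch with toy made-up pixel values (not a real capture pipeline):

```python
import math

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio between a reference frame and a test frame,
    both given as flat lists of pixel values. Illustrative only - real IQ
    comparisons of game renders would need per-frame captures and are far
    more contentious than this toy example suggests."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, img)) / len(ref)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(peak ** 2 / mse)

# Two tiny made-up 'frames' (flattened pixel values):
reference = [100, 150, 200, 250]
rendered  = [101, 148, 203, 247]
print(round(psnr(reference, rendered), 1))  # prints 40.5
```

Even with the math settled, someone still has to pick the "reference" rendering - which is exactly where the fighting would start.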

I bought an RTX card because I wanted RT...
I will be buying either a 3070 or a 6800, because I want RT as an option too. Just depends on which one I can get at or very near MSRP first.
 
What I consider relevant is when it actually becomes a requirement. If I were a 5700XT owner and it happened tomorrow, I'd be pissed. If it happens five years from now, I wouldn't waste energy venting my frustration. Right now, I don't consider the 5700XT at a particular disadvantage compared to the 2060, and in the context of Hardwareunboxed recommending it over the 2060 two years ago, the users who trusted them should have known full well that it lacked DXR.
I definitely wouldn't complain if AMD offered software DXR for the 5700XT, but most Pascal users I've seen talking about it complain that the performance is too low anyway.

Not having options is not a disadvantage for you?
DXR is a pure compute solution. AMD could provide a DXR 1.0 driver so that reviewers can compare a 5700XT and a 2060 Super with all graphical options. Isn't this the situation viewers/readers should demand?

Btw, isn't there one somewhat modern game that has an exclusive graphics option for Intel iGPUs? It's always funny to find the extreme examples where one game has an advantage on other hardware.

But these extreme examples are relevant. Using Valhalla is okay, but not using Metro Exodus EE is okay too? That doesn't make any sense. Either we care about extreme examples or we don't.
 
Why do you keep bringing up Metro Enhanced? 5700XT owners would just play the regular version. Why are you acting like it's a game 5700XT owners can't play?
 
Why do you keep bringing up Metro Enhanced? 5700XT owners would just play the regular version. Why are you acting like it's a game 5700XT owners can't play?
5700XT owners can play the enhanced version?

Already there is a divide, relegating the 5700XT to "Basic".
Enhanced is a bridge too far for that SKU.

And if I.Q. doesn't matter, console is the way to go.
 
Metro Exodus EE is a remaster. It's a new game. You can play the old one, but for benchmarking reviewers should always use the latest game version, and that is the EE.
 
Metro Exodus EE is a remaster. It's a new game. You can play the old one, but for benchmarking reviewers should always use the latest game version, and that is the EE.
Remaster? What was changed besides adding more RT? There were no gameplay changes/improvements/fixes. It's no different than lowering visual settings in any game. It's intellectually dishonest to use it as an example of a game the 5700XT can't play.
 
Metro Exodus EE is a remaster. It's a new game. You can play the old one, but for benchmarking reviewers should always use the latest game version, and that is the EE.
Yeah. It's a fantastic experience that looks leagues ahead of the original, and it can be enjoyed to a great extent on every RTX or RDNA2 GPU.

It's an experience RDNA1 owners can't have.

Remaster? What was changed besides adding more RT? There were no gameplay changes/improvements/fixes. It's no different than lowering visual settings in any game. It's intellectually dishonest to use it as an example of a game the 5700XT can't play.

Actually:

[Image: Metro Exodus Enhanced feature comparison chart]


It definitely is a remaster. And RDNA1 can't play it, simple as that.
 
MEEE is worth mentioning simply because it runs better than the original with RT and has a good DLSS 2 implementation. It is thus the version of the game which 2060S owners would likely prefer to play over the original, and it also deals with the argument that "RT kills performance and DLSS sucks lol".
 
So, it's the same game (missions, gameplay, story) but with a lot of new graphical effects and polishing.
I am not sure that qualifies as a new game in my books. Is it really that different from not being able to select the highest texture detail on a 4-GiB card in some game? You play the same game; it just does not look as good. I think, if missions, gameplay and story stayed the same, I am more on the "quality settings" side of the argument.


MEEE is worth mentioning simply because it runs better than the original with RT and has a good DLSS 2 implementation. It is thus the version of the game which 2060S owners would likely prefer to play over the original, and it also deals with the argument that "RT kills performance and DLSS sucks lol".
That would be an interesting comparison: ME on RX 5700 XT and MEE on RTX 2060S.
 
Remaster? What was changed besides adding more RT? There were no gameplay changes/improvements/fixes. It's no different than lowering visual settings in any game. It's intellectually dishonest to use it as an example of a game the 5700XT can't play.

Definition of "remaster":
to make a new master (= a recording from which all copies are made) of an earlier recording, usually in order to produce copies with better sound quality:
https://dictionary.cambridge.org/de/worterbuch/englisch/remaster
 
That would be an interesting comparison: ME on RX 5700 XT and MEE on RTX 2060S.
Faster than the original with RT on. Since an RDNA1 GPU can't run RT anyway, it would of course be faster.
It would be interesting to compare both versions, though - maybe at High settings on both, with DLSS on the 2060S: the RDNA1 GPU running the original game without RT, while the 2060S runs the Enhanced Edition with DLSS, then compare performance and visual quality. Should get quite interesting indeed.
 
But with Xbox Series X|S and PlayStation 5 consoles offering Ray Tracing support, we took the decision to radically overhaul our proprietary 4A Engine and realise our ambitions for a fully Ray Traced experience on next gen consoles and high-end PC.
https://www.4a-games.com.mt/4a-dna/...-for-playstation-5-and-xbox-series-xs-upgrade

It is not the same game. It is a new game using the latest DX12 features. The content (lighting aside) is the same because it is not a remake.

I mean, your logic dictates that we would never have needed DX12, because there doesn't exist any content difference between the two APIs.
 
https://www.4a-games.com.mt/4a-dna/...-for-playstation-5-and-xbox-series-xs-upgrade

It is not the same game. It is a new game using the latest DX12 features. The content (lighting aside) is the same because it is not a remake.

I mean, your logic dictates that we would never have needed DX12, because there doesn't exist any content difference between the two APIs.
How does my claim that it's not accurate to consider it a game the 5700XT can't play dictate that we never needed any API beyond DX11? What is going on in these threads with Nvidia fans?
 