No DX12 Software is Suitable for Benchmarking *spawn*

Resident Evil 2 Remake update version:
And really though, that is about it - if you want to check out the RT features, I'd recommend this patch. However, in all other scenarios, the 'upgrade' is poor and not worth your time or trouble - and that starts with performance. Turn off RT and compare like-for-like with the older version of the game and, remarkably, there's a huge drop in frame-rate for the exact same image quality. This appears to be because of a sub-optimal deployment of DirectX 12, whereas the older versions simply ran a lot faster on the old DX11 API. Tested on a GTX 1060, Resident Evil 2 Remake seems to be 25 to 30 percent slower in terms of average frame-rate - an astonishing state of affairs. At 1440p, that's the difference between running at a relatively consistent 60fps and running in the 40s on the new version. This is a damning indictment of Capcom's DX12 implementation and it affects all GPUs, not just the GTX 1060 - even heavyweights like the RX 6800 XT and RTX 3080 are affected.
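As a quick back-of-the-envelope check of those numbers, here's a minimal sketch (the ~60fps DX11 baseline and the 25-30% drop are taken from the quote above; nothing else is measured):

```python
# Rough sanity check of the quoted DX11 -> DX12 regression on a GTX 1060 at 1440p.
# Assumption: a roughly 60 fps average on the older DX11 build, as stated above.
dx11_fps = 60.0

for slowdown in (0.25, 0.30):  # the quoted 25 to 30 percent drop in average frame-rate
    dx12_fps = dx11_fps * (1.0 - slowdown)
    print(f"{slowdown:.0%} slower: {dx12_fps:.0f} fps")

# Prints 45 fps and 42 fps, i.e. the "running in the 40s" the article describes.
```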

DX12 is now 8 years old. Some things never change...
 
Something just struck me as amusing. We have this thread about how Dx12 is bad and shouldn't be used for benchmarking.

Then we have another thread about benching RT in games, but RT in Windows basically requires Dx12. So, according to this thread, no RT software is suitable for benchmarking? :D

Conversely that means that people posting RT benchmarks in the RT benchmarking thread are saying that Dx12 is fine for benchmarking? :D

Regards,
SB
 
Yes, although DX12 can still present issues in the hands of inexperienced devs, it's good that it's now (or soon will be) essentially established as the primary current-gen API, thanks to both RT's and UE5's reliance on it.
 
Conversely that means that people posting RT benchmarks in the RT benchmarking thread are saying that Dx12 is fine for benchmarking? :D
This thread was created to track the horrendous DX12 titles that suffer from bad fps, bad stutters and frame-pacing issues that are categorically worse than the DX11 path of said titles. What is even more offensive is that DX12 offers no image quality upgrades over DX11.

After DXR was established, with its reliance on DX12, we've seen some of the DXR games (especially UE4 games) have bad DX12 implementations, with reduced fps compared to DX11 even before DXR is activated. However, many others have a solid DX12 path that is equal in fps to the DX11 path (before DXR is activated); examples include Metro Exodus, Control, etc. Some DXR games even offer better fps in DX12, like Shadow of the Tomb Raider.

The problem, however, is that most DXR titles offer no DX11 path; you are stuck with DX12 and DXR only, so the comparison is dead before we even start. There is nothing to talk about here.
 

5700XT still comparing well in 2022.
That's like the most self-indulgent video I've seen from HUB so far, to the point of being borderline disgusting.

Raytracing? Doesn't matter! (There are a number of games which will run fine with RT on a 2060S, especially with DLSS, but hey, we have an agenda here, so we skip them all!)

DLSS? Who cares when we have FSR2 now! (We have FSR2 in what? Four titles, officially? The rest are hacks which exist only because of the work done to integrate DLSS in them in the first place, but hey, I'll just say that FSR2 is just as good as DLSS now! Who cares that 2060S owners could've used DLSS for three years already while 5700XT owners had to wait till 2022 for FSR2!)

How does UE5 run on 5700XT in comparison to 2060S?
There are like 2-3 UE4 games in his whole benchmark - does that actually reflect the percentage of games using UE4 right now?
Why are the top 6 games performing best on the 5700XT there all AMD "sponsored"? And what does that mean for 2060S features being used in these titles?
I mean, Steve is a weird one with his GPU preferences (he is like the only reviewer who manages to still say in 2022 that raytracing doesn't matter on a regular basis), but that's a new low for him.
 
He modified his testing methodology this time: he used High settings and avoided Ultra or Max settings, which contributed to the margins. This video is deceiving; the guy is so self-indulgent and so vulnerable about his flawed recommendation of the 5700XT over modern Turing cards that he is willing to cheat his audience for it.

For example: last time he did that same comparison, he tested Cyberpunk at High settings, Valhalla at Very High and Watch Dogs at Ultra settings; he also tested dozens of games at their max settings.

This time he switched and lowered the quality settings a notch or two: Cyberpunk at Medium, Valhalla at High and Watch Dogs at Very High; the other dozens of games were tested at significantly lower than max settings.

Last time the 5700XT had a 15% lead over the 2060 Super; now it has a 13% lead despite the lowered quality settings. So the margin is decreasing, a "finewine" aging for the 2060 Super in Steve's book, necessitating the deceptive move to try and uphold his "narrative" of the 5700XT being the superior choice by lessening the reduction of the margin.
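For anyone curious how a headline margin like that is typically derived from per-game results, here's a minimal sketch with made-up fps numbers (these are not HUB's data, and whether an arithmetic or geometric mean of the ratios is used, the idea is the same):

```python
# Hypothetical per-game average fps, purely to illustrate how an overall
# "X% lead" figure can be computed. None of these numbers are real results.
from statistics import geometric_mean

fps_5700xt = [92, 71, 105, 64]
fps_2060s  = [80, 66,  90, 57]

# Per-game ratio of 5700 XT to 2060 Super, then the mean of those ratios.
ratios = [xt / s for xt, s in zip(fps_5700xt, fps_2060s)]
lead = geometric_mean(ratios) - 1.0
print(f"5700 XT average lead: {lead:.1%}")
```

A drop from ~15% to ~13% between the two videos means those per-game ratios shifted slightly toward the 2060 Super, even at the lowered settings.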
 
He modified his testing methodology this time: he used High settings and avoided Ultra or Max settings, which contributed to the margins. This video is deceiving; the guy is so self-indulgent and so vulnerable about his flawed recommendation of the 5700XT over modern Turing cards that he is willing to cheat his audience for it.
I mean, the 5700XT *was* a great value option against the 2060S. But that value was there precisely because of the features which the 5700XT lacks, and if you understood that and were fine without said features, the 5700XT was a great product which did in fact punch above its price - something which most SKUs in the RDNA2 lineup are sorely missing.

But his approach to this comparison is to basically shit on Turing features, in 2022, which is... ehm...
 
He modified his testing methodology this time: he used High settings and avoided Ultra or Max settings, which contributed to the margins. This video is deceiving; the guy is so self-indulgent and so vulnerable about his flawed recommendation of the 5700XT over modern Turing cards that he is willing to cheat his audience for it.

For example: last time he did that same comparison, he tested Cyberpunk at High settings, Valhalla at Very High and Watch Dogs at Ultra settings; he also tested dozens of games at their max settings.

This time he switched and lowered the quality settings a notch or two: Cyberpunk at Medium, Valhalla at High and Watch Dogs at Very High; the other dozens of games were tested at significantly lower than max settings.

Last time the 5700XT had a 15% lead over the 2060 Super; now it has a 13% lead despite the lowered quality settings. So the margin is decreasing, a "finewine" aging for the 2060 Super in Steve's book, necessitating the deceptive move to try and uphold his "narrative" of the 5700XT being the superior choice by lessening the reduction of the margin.
I think it was quite clear this was a look back at which was the better buy when new, thus assuming you now have one of these cards. It would seem perfectly reasonable to lower settings on newer games to provide frame rates close to 60. No one cares if the 2060S is equal to or even faster than the 5700XT on Ultra at 1440p if the frame rate is in the 30s.
 
The one comparison HUB tends to leave out in these Nvidia vs AMD benchmarks is UE4/5. That's the engine that matters the most and where AMD remains woefully behind Nvidia, often grotesquely so. So while an RX 6600 XT can match an RTX 3060 Ti in Assassin's Creed Valhalla, the latter beats the AMD card by 60% in Unreal Engine games. Go figure.
 
The one comparison HUB tends to leave out in these Nvidia vs AMD benchmarks is UE4/5. That's the engine that matters the most and where AMD remains woefully behind Nvidia, often grotesquely so. So while an RX 6600 XT can match an RTX 3060 Ti in Assassin's Creed Valhalla, the latter beats the AMD card by 60% in Unreal Engine games. Go figure.
The review contains UE4 titles, 5 to be exact. That's 15% of the 33-game selection. Jesus Christ. Why does UE4 matter the most?
 
It would seem perfectly reasonable to lower settings on newer games to provide frame rates close to 60. No one cares if the 2060S is equal to or even faster than the 5700XT on Ultra at 1440p if the frame rate is in the 30s.
Firstly, you don't lower settings in games you've previously tested at higher settings (Cyberpunk, Valhalla, Watch Dogs). Secondly, you don't lower settings in games that are achieving 200+ fps (such as Siege); he is trying hard to optimize the shrinking margin to uphold his narrative. He also introduced ReBAR this time for the 5700XT, which wasn't available last time and which very few users will manage to actually get working on the 5700XT.


Thats 15% of the 33 game selection. Jesus christ. Why does UE4 matter the most?
Because almost half of the combined (AA + AAA) titles release with it?
 
Why does UE4 matter the most?
a. Because it's the most used engine out there, and in the majority of non-AAA games using it, AMD is actually losing to Nv counterparts more often than not (the most recent example being Stray).
b. Because it's the most h/w-agnostic engine out there. His top 5 best-performing games include only one UE4 game, and it's an AMD "sponsored" title.

Also, if you want to "look back at which was the better buy when new", you really shouldn't even mention FSR.
And if you want to look at how these cards do now (which was the purpose of this video, AFAIU), then brushing off DLSS and RT is a very weird thing to do.
 