No DX12 Software is Suitable for Benchmarking *spawn*

Why do you use the term "simpler" and what do you mean by "simpler"?

I guess primitive shaders might be 'simpler' in the sense that they and mesh shaders share a common subset of functionality, but beyond that, primitive shaders have the exclusive capability of mapping both the traditional and the next-generation geometry pipelines onto them, so there are arguably fewer hardware implementation redundancies on AMD HW. On NV HW, you straight up have two different geometry pipelines implemented in hardware ...
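For reference, the API side of this is vendor-agnostic: a D3D12 application only ever queries a mesh shader tier, and whether the driver maps that onto primitive shaders (AMD) or a dedicated hardware pipeline (NV) is invisible to it. A minimal sketch of the query, assuming an existing ID3D12Device and eliding error handling:

```cpp
#include <d3d12.h>

// Ask the runtime whether mesh shaders are exposed at all.
// OPTIONS7 is the feature block that carries the mesh shader tier.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS7, &options7, sizeof(options7))))
    {
        // Older runtime/driver that predates the OPTIONS7 block.
        return false;
    }
    return options7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;
}
```

AFAIK Turing/Ampere and RDNA2 all report D3D12_MESH_SHADER_TIER_1 here, while RDNA1 reports no support despite having primitive shader hardware - which is exactly why the mapping question is a driver/hardware detail rather than something the API distinguishes.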
 
From Fermi to Ampere, 10 years of Nvidia GeForce GPUs: 44 models put to the test | Hardware Upgrade (hwupgrade.it)
April 30, 2021
In this article - others will follow - let's see how the performance of Nvidia GPUs has changed over the last 10 years or so, from 2010 to 2020: from the Fermi architecture of the GeForce GTX 400 series (a hot product of its time, in every sense) to the current Ampere range. The tests were carried out on a platform with a Ryzen 9 5900X CPU and 16 GB of RAM, using a total of six tests - four games and two synthetics (Unigine Valley and 3DMark Fire Strike).
....
During the tests we saw how Full HD - now the "basic" resolution for gaming PCs, managed without great difficulty by the entire current Nvidia lineup - was handled poorly until 2014, but from the GeForce GTX 900 cards onward it has been a crescendo of performance, culminating in the current high-end RTX 3000 range handling 4K with ease.
 
Why isn't it DX12 though?

Is it played by many gamers with older systems?
 
There seems to be some validity to the idea that the performance gap has widened over time, but as ever, HUB appear to be over-representing AMD's advantage.

TPU, for example, shows the gap widening from around 4% to 12% at 4K since the launch of the 5700XT, which isn't entirely unexpected given that modern games will be starting to use RDNA-specific optimisations thanks to the consoles, and the 2060's frame buffer will be hurting it more as the memory footprint of games grows.

But the analysis completely ignores RT and DLSS despite them being supported in several of the compared titles. Also, why no Metro Enhanced comparison?

One also has to wonder how things might change in the future once features like Mesh shaders and VRS start to get used more liberally, or indeed when more games start to require RT as a baseline.
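To make the VRS half of that concrete: support is exposed to games as a tier, so "used more liberally" just means more titles branching on this query. A minimal sketch, again assuming an existing ID3D12Device:

```cpp
#include <d3d12.h>

// Tier 1 allows a per-draw shading rate; Tier 2 adds per-primitive
// rates and screen-space shading-rate images.
D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS6, &options6, sizeof(options6))))
    {
        return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
    }
    return options6.VariableShadingRateTier;
}
```

AFAIK Turing (so the 2060S) reports Tier 2 here while RDNA1 reports no support at all, so broader VRS adoption would hit the 5700XT as a missing feature rather than as a few lost percent in a bar chart.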
 
The 5700XT was significantly faster than the 2060S at launch in some games. Did it pull ahead even further, or did HWUB just forget?
The question of performance changes between two completely different benchmark suites is something which can be explored if there's a desire to do so. Between 2020 and 2021 there were changes in platform (Intel swapped for AMD), Windows, patches, and the games used. Blanket statements like "X pulled ahead of Y" mean about zilch in such conditions.

But the point is that 2060S had features which were relatively unused earlier but not anymore - and this is being completely ignored because, uh, I dunno, Steve doesn't like them?
How many fps does the 5700XT show in MEEE (Metro Exodus Enhanced Edition), for example? The game is absent from the 2021 suite.
About a third of the games in the 2021 suite have a good DLSS implementation too. But we'll just ignore that, I guess.

and the 2060's frame buffer will be hurting it more as the memory footprint of games grows
The 2060S has the same 8 GB as the 5700XT though.
The 2060 was (ill-)positioned against the 5700, but was eventually dropped in price to compete against the 5600XT.
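For what it's worth, the capacity both cards ship with is trivial to confirm programmatically - DXGI reports each adapter's dedicated VRAM. A minimal sketch (link against dxgi.lib; COM error handling elided):

```cpp
#include <dxgi1_4.h>
#include <cstdio>

// Enumerate all adapters and print their dedicated VRAM in MB.
void PrintDedicatedVram()
{
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        std::printf("%ls: %zu MB dedicated VRAM\n", desc.Description,
                    desc.DedicatedVideoMemory / (1024u * 1024u));
        adapter->Release();
    }
    factory->Release();
}
```

Both the 2060S and the 5700XT report the same 8 GB here, so memory footprint growth alone shouldn't separate those two specific cards.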
 
But the point is that 2060S had features which were relatively unused earlier but not anymore - and this is being completely ignored because, uh, I dunno, Steve doesn't like them?

He says at the start of the video that if you want RT then the 2060S is the only option and the comparison to the 5700XT isn't relevant. He also acknowledges that his opinion that RT isn't viable on the 2060S is just that - one man's opinion. So you have to give him credit for being up front about it at least.
 
He says at the start of the video that if you want RT then the 2060S is the only option and the comparison to the 5700XT isn't relevant. He also acknowledges that his opinion that RT isn't viable on the 2060S is just that - one man's opinion. So you have to give him credit for being up front about it at least.
How is the RT comparison not relevant? MEEE is just the start here; how many games won't even launch on the 5700XT in a couple of years from now? If you can't play some games on the 5700XT at all, then that is very much relevant to this comparison IMO, and it should be shown in the graphs which claim some % of advantage as the conclusion of the whole benchmark. After all, he didn't have any issues showing the 2060 running out of VRAM (as in 0 fps) in his recent DE RT comparison, AFAIR.
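The "won't even launch" scenario isn't hypothetical gatekeeping either - a DXR-only title like MEEE can simply query the raytracing tier at startup and refuse to run when none is reported, which is the situation on the 5700XT since RDNA1 exposes no DXR support. A minimal sketch of such a gate, assuming an existing ID3D12Device:

```cpp
#include <d3d12.h>

// A DXR-required title can bail out at startup when the adapter
// reports no raytracing tier (the RDNA1/5700XT case).
bool MeetsDxrBaseline(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &options5, sizeof(options5))))
    {
        return false;
    }
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```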
 
How is the RT comparison not relevant? MEEE is just the start here; how many games won't even launch on the 5700XT in a couple of years from now? If you can't play some games on the 5700XT at all, then that is very much relevant to this comparison IMO, and it should be shown in the graphs which claim some % of advantage as the conclusion of the whole benchmark. After all, he didn't have any issues showing the 2060 running out of VRAM (as in 0 fps) in his recent DE RT comparison, AFAIR.

Read my post again. If you want RT then the performance comparison is not relevant, since there's only one card that does RT. He mentions this at the start of the video. At the end of the video Steve also spends quite a bit of time on the extra features of the 2060, including DLSS.
 