Value of Hardware Unboxed benchmarking

Phantom88

That COD performance remains insane on Radeons.

 
It's crazy that it can be 40% faster or slower depending on the game.

Although including Modern Warfare in there twice at different settings blatantly skews the average in a nonsensical way.

EDIT: also, why would you include Cyberpunk without RT and then class it as a win for the XTX? That's not how most people with this class of GPUs would play the game, especially at 1440p.
 
EDIT: also, why would you include Cyberpunk without RT and then class it as a win for the XTX?
Cyberpunk's non-RT settings above "high" or even "medium" are awful bang for the buck visually.
 
There are more games with multiple settings, so I don't think they are trying to skew the average.

Yes, but as far as I can see they are all RT on vs. off (with Cyberpunk's RT noticeably absent) or API differences, i.e. things that could genuinely change the state of play between the two architectures, not a simple non-RT settings change in a title that is already a performance outlier and was never going to change the state of play. They're effectively just counting the XTX's biggest win twice.
 
Not really. It does better in broken DX12 games like Modern Warfare 2. The performance difference isn't hardware related anymore with these GPUs, outside of ray tracing.
Are Battlefield, Plague Tale and Wolfenstein also broken DX12 games?
 
Battlefield 5? Yes, if he had used DX12... The other games are just games, like Warhammer: Darktide.

+40% in Modern Warfare is just nonsense and shows how broken DX12 and Vulkan can be on modern NVIDIA GPUs.
 
Not really. It does better in broken DX12 games like Modern Warfare 2. The performance difference isn't hardware related anymore with these GPUs, outside of ray tracing.
It's funny: when NVIDIA isn't performing as well as expected, it's a broken game; when AMD isn't performing as well as expected, it's AMD's fault.
Other than AMD cards running it better than average, how exactly is CoD DX12 "broken"?
 
That's Hardware Unboxed for you: no testing standards whatsoever; they change them regularly on a whim.
What a ridiculous and hyperbolic statement.
This claim has been made multiple times by the NVIDIA crowd, both here and on Twitter, each time with no proof or substance. HUB literally goes through their test setup and procedures in each of their videos and articles.

HUB's tests are also cross-checked with LTT and GN, and the three channels regularly communicate and coordinate test procedures with one another. Are you also claiming that GN and LTT have "no testing standards whatsoever" too? :rolleyes:
 
Maybe because the 320W 4080 is using less than 200W at 1440p with the balanced setting.

That makes this 4080 2.45x more efficient than a 3080 12GB...
Possibly it is in this workload. RDNA 3 in its current iteration is a disaster.
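
For what it's worth, a 2.45x figure like the one above falls straight out of a perf-per-watt ratio. Here's a back-of-the-envelope sketch; the fps and wattage values are made-up placeholders chosen only to illustrate the arithmetic, not measurements from this thread:

```python
# Rough perf-per-watt comparison behind a "2.45x more efficient" claim.
# All numbers below are hypothetical placeholders, not measured values.
fps_4080, watts_4080 = 140.0, 200.0   # assumed: 4080 capped near 200 W at 1440p
fps_3080, watts_3080 = 100.0, 350.0   # assumed: 3080 12GB at stock power

eff_4080 = fps_4080 / watts_4080      # 0.70 fps per watt
eff_3080 = fps_3080 / watts_3080      # ~0.29 fps per watt

print(f"efficiency ratio: {eff_4080 / eff_3080:.2f}x")   # -> 2.45x
```

So the headline multiplier depends entirely on which power state and resolution each card is measured at, which is why a capped 4080 can look so far ahead.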
 
Are you also claiming that GN and LTT have "no testing standards whatsoever" too? :rolleyes:
LTT? Plz... that clown doesn't know how to test shit if his life depended on it; that ship sailed a long time ago. Their benchmarks are full of mistakes and contradictions that are constantly corrected by the community.

GN is above all of this, of course, but they test so few games to begin with, unfortunately. Their methodology is to sample enough games to establish a trend; after that they stop testing.

each time with no proof or substance. HUB literally goes through their test setup and procedures in each of their videos and articles.
The proof is here: they tested COD twice (once at low settings and once at Ultra) and added both results to the averages, despite COD not offering ray tracing or a second API. How is that valid testing?
 
The proof is here: they tested COD twice (once at low settings and once at Ultra) and added both results to the averages, despite COD not offering ray tracing or a second API. How is that valid testing?
You do realize it's hardly affecting the net result of the summary, right?
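
To put a rough number on the double-counting dispute, here's a quick sketch with hypothetical ratios: eleven games at parity plus one CoD-like outlier where the XTX leads by 40%. It also assumes the summary uses a geometric mean, which isn't confirmed anywhere in this thread:

```python
from math import prod

def geomean(xs):
    """Geometric mean, the usual way multi-game benchmark summaries are averaged."""
    return prod(xs) ** (1 / len(xs))

# Hypothetical XTX-vs-4080 fps ratios: 11 games at parity plus one
# CoD-like outlier at +40%. Illustrative numbers, not HUB data.
ratios = [1.00] * 11 + [1.40]

once  = geomean(ratios)            # outlier counted once  -> ~1.028
twice = geomean(ratios + [1.40])   # outlier counted twice -> ~1.053

print(f"counted once:  {once:.3f}")
print(f"counted twice: {twice:.3f}")
```

In this toy sample the duplicate does pull the average toward the XTX, but only by about 2.5 points, so both "it skews the average" and "it hardly affects the net result" have some truth to them.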
 
No, it must be the API. I can get up to 600W in Control with DX12 w/o RT, but in AC: Valhalla the 4090 stops at 330W at 4K...
Something is limiting processing, and NVIDIA has optimized Lovelace for better efficiency instead of pure brute force like Ampere.
 
You do realize it's hardly affecting the net result of the summary, right?
It's not about that; it's about establishing a standard and sticking with it, not winging it according to his whims. They tested COD twice, but tested Cyberpunk only once, without RT, despite the game offering RT. They did the same with Battlefield V, Control, Resident Evil Village, Ghostwire, Watch Dogs Legion, Dying Light 2, Hitman 3, etc.: all tested once, with no RT. Fortnite and The Witcher 3 were tested three times, once for each API, while The Division, Battlefield V and Borderlands 3 got tested with only one API despite each having two! Where is the logic?
 