Value of Hardware Unboxed benchmarking

I haven’t watched the video but isn’t this kind of the point Tim is making?

My issue with the 20 series was that it launched before mainstream RT titles came out, so there wasn't really anything 'period appropriate' for it to run, and the lower-end cards in particular aren't all that useful for modern RT.

Like yes, I expect a 6 year old budget card to perform rather poorly these days, but almost all the relevant RT titles came out way after this card was considered ‘modern’, so that’s my gripe.

I think the point the video was trying to make is that the RT capabilities of the 2060 were essentially meaningless/offered nothing over a comparable GPU with no RT capability because they simply weren't usable.

And that's just not true. They were entirely usable - if you accepted various compromises that one should probably expect to experience with a GPU of that tier trying to run the most advanced graphical features available. i.e. you might have to live with sub-60fps and less than stellar image quality along with other lowered graphical settings. But you can still engage some RT features with the GPU and have a perfectly viable gaming experience. Whether you choose to becomes a matter of preference rather than capability.

If you look at the final summary from the video, they show that all the games tested bar 2 are playable with at least 30fps at 1080p DLSS Quality on the GPU when lowering other graphical settings. And of the two that weren't, one was Cyberpunk, where they only tested at RT Ultra settings as opposed to, say, the far more moderate console-level RT settings that the game offers on PC.
 
A slower RTX GPU will still end up offering a better image.
This means nothing. You have to look at prices to compare and last time I've checked any GPU which would produce even just comparable image would still cost you more (in sum with other PC parts) than a console. Which has always been a reason why someone would buy it over a PC.
 
This means nothing. You have to look at prices to compare and last time I've checked any GPU which would produce even just comparable image would still cost you more (in sum with other PC parts) than a console. Which has always been a reason why someone would buy it over a PC.
I should have stated I was giving my opinion on the discussion regarding the number of PCs comparable to console performance. As long as VRAM isn't an issue, the performance advantage the consoles have over slower RTX GPUs is negated.
 

Nice coverage of various resolutions and quality presets. Funny how dropping to high or medium all of a sudden makes a bunch of cards more viable.

I really don’t get why Steve thinks $400 should guarantee you “Epic” settings though. $400 is not an epic amount of money in 2024.
 
If you look at the final summary from the video they show that all the games tested bar 2 are playable with at least 30fps at 1080p DLSS Quality on the GPU when lowering other graphical settings.
Which is garbage for people paying $300+ for a GPU.

1080p/30fps and lower settings is not how anybody plays PC games unless they're on some super budget setup. Ray tracing is not some 'upgrade' in this case, it's actually degrading most of the rest of the experience.
 
$400 is not an epic amount of money in 2024.
$400 is roughly the same amount of money that it was for most people in 2018.

It's undeniable how much less we get for our money these days when it comes to GPU's. Plus Turing sucked. Technologically forward, but ultimately terrible for consumers.
 
$400 is roughly the same amount of money that it was for most people in 2018.

Even if that’s true I’m not sure how it’s relevant. Game developers and hardware designers don’t base their roadmaps on wage inflation.

It's undeniable how much less we get for our money these days when it comes to GPU's. Plus Turing sucked. Technologically forward, but ultimately terrible for consumers.

If by less you mean ability to crank up settings in the latest games at the highest resolutions that’s probably true. It’s been a very long time since you could do that for $400.

For anyone who just wants to enjoy playing games you can certainly do that on a $400 graphics card today. The problem with that statement from Steve is that he’s promoting this idea that people are entitled to “maxed out” PC games for $400 when $700 game consoles still come with massive compromises. It’s unrealistic and misleading and just leads to more pointless complaining.
 
For anyone who just wants to enjoy playing games you can certainly do that on a $400 graphics card today. The problem with that statement from Steve is that he’s promoting this idea that people are entitled to “maxed out” PC games for $400 when $700 game consoles still come with massive compromises. It’s unrealistic and misleading and just leads to more pointless complaining.
Maxed out PC games would presumably include RT, whereas what Steve is talking about is being able to play with max texture detail at 1080p. Which you can do on a 6700 XT.

Edit: Or even in a bunch of cases the 12GB 3060.
 
Which is garbage for people paying $300+ for a GPU.

1080p/30fps and lower settings is not how anybody plays PC games unless they're on some super budget setup. Ray tracing is not some 'upgrade' in this case, it's actually degrading most of the rest of the experience.
According to the Steam Hardware Survey, the GPUs suited to 1080p/30fps/lower settings are the most common ones: the top 5 are all xx60 GPUs and make up 21.11% of the survey.
 
Which is garbage for people paying $300+ for a GPU.

1080p/30fps and lower settings is not how anybody plays PC games unless they're on some super budget setup. Ray tracing is not some 'upgrade' in this case, it's actually degrading most of the rest of the experience.

This is just your opinion. It does not mean that RT is unviable on this GPU for those that hold a different opinion.

The RT modes on the consoles are testament to that: they are almost always 30fps and show a bigger delta between the RT and non-RT modes than the 2060 would, but people still use those RT modes.
 
I think it's worth keeping in mind that PC games span a broad spectrum of hardware demands. Hardware enthusiasts (or PCMR types) seem stuck in the idea that you buy your hardware for the most demanding game you want to play, as if every game must meet some minimum threshold or they will upgrade or not play it.

The broader PC gamer likely just buys something and then adapts the games to what they have; they are fine if some games need to run at lower settings/fps than what they typically play at. Say someone primarily plays lighter esports titles and then also wants to try something like Cyberpunk: they don't buy an entire new GPU just for Cyberpunk that would then be overkill for their typical game set.
 
Maxed out PC games would presumably include RT, whereas what Steve is talking about is being able to play with max texture detail at 1080p. Which you can do on a 6700 XT.

Edit: Or even in a bunch of cases the 12GB 3060.

Except one of the points of contention has been that, in terms of how coverage is approached, RT has been segmented into its own separate category. Coverage of "max/ultra/etc." settings means exclusive of RT.

This is an issue I personally have. Either we test academically at true "max" settings, which would include everything (including RT), or, if we're saying some settings aren't optimal, we test at truly optimal settings, which would mean scrutinizing not just RT but also non-RT settings. From a user experience standpoint we would even need different "optimized" settings depending on the strengths and weaknesses of various hardware.

Really, from an actual user experience review standpoint, as opposed to academic benchmarking, coverage would need to shift towards what the user experience difference actually is between GPUs. Take the RTX 4080 vs 4090: if the performance difference between them amounts to 4K DLSS Quality vs. 4K native, then that is the user experience difference.
 
According to the Steam Hardware Survey, the GPUs suited to 1080p/30fps/lower settings are the most common ones: the top 5 are all xx60 GPUs and make up 21.11% of the survey.
I got curious and just looked at the number of RTX GPUs on Steam:

Code:
NVIDIA GeForce RTX 2050                 0.31%
NVIDIA GeForce RTX 2060                 3.35%
NVIDIA GeForce RTX 2060 SUPER           1.18%
NVIDIA GeForce RTX 2070                 0.81%
NVIDIA GeForce RTX 2070 SUPER           1.07%
NVIDIA GeForce RTX 2080                 0.36%
NVIDIA GeForce RTX 2080 SUPER           0.44%
NVIDIA GeForce RTX 2080 Ti              0.33%
2000 series total                       7.85%


NVIDIA GeForce RTX 3050                 2.71%
NVIDIA GeForce RTX 3050 6GB Laptop GPU  0.21%
NVIDIA GeForce RTX 3050 Laptop GPU      0.57%
NVIDIA GeForce RTX 3050 Ti              0.20%
NVIDIA GeForce RTX 3050 Ti Laptop GPU   0.86%
NVIDIA GeForce RTX 3060                 5.76%
NVIDIA GeForce RTX 3060 Laptop GPU      2.80%
NVIDIA GeForce RTX 3060 Ti              3.28%
NVIDIA GeForce RTX 3070                 3.26%
NVIDIA GeForce RTX 3070 Laptop GPU      0.64%
NVIDIA GeForce RTX 3070 Ti              1.38%
NVIDIA GeForce RTX 3070 Ti Laptop GPU   0.31%
NVIDIA GeForce RTX 3080                 1.98%
NVIDIA GeForce RTX 3080 Laptop GPU      0.15%
NVIDIA GeForce RTX 3080 Ti              0.69%
NVIDIA GeForce RTX 3090                 0.48%
3000 series total                       24.28%

NVIDIA GeForce RTX 4050 Laptop GPU      1.08%
NVIDIA GeForce RTX 4060                 4.06%
NVIDIA GeForce RTX 4060 Laptop GPU      4.32%
NVIDIA GeForce RTX 4060 Ti              3.32%
NVIDIA GeForce RTX 4070                 2.82%
NVIDIA GeForce RTX 4070 Laptop GPU      0.85%
NVIDIA GeForce RTX 4070 SUPER           1.45%
NVIDIA GeForce RTX 4070 Ti              1.15%
NVIDIA GeForce RTX 4070 Ti SUPER        0.52%
NVIDIA GeForce RTX 4080                 0.74%
NVIDIA GeForce RTX 4080 Laptop GPU      0.19%
NVIDIA GeForce RTX 4080 SUPER           0.59%
NVIDIA GeForce RTX 4090                 0.91%
4000 series total:                      22%

RTX GPUs in total                       54.13%

It seems we have now crossed the threshold of more raytracing-capable GPUs than not.
The 4000 series has done better than I presumed from reading forums, at 22% currently.
It will be fun to observe the numbers when the 5000 series is close to replacing it.

I did not calculate the AMD numbers; they seem both low and not as well segmented as NVIDIA's, so the actual number of raytracing-capable GPUs is higher than these figures suggest.

(I am not claiming that all these GPUs will be able to run 4K max settings though; even a 4090 struggles with path-traced games, and as usual on PC, configuring in-game settings applies.)
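If anyone wants to re-tally these themselves, here's a minimal Python sketch that sums per-series shares from survey-style lines. It uses just the 2000-series rows copied from the table above (the grouping key is simply the first digit of the model number):

```python
import re
from collections import defaultdict

# 2000-series rows copied verbatim from the Steam survey table above
rows = """\
NVIDIA GeForce RTX 2050                 0.31%
NVIDIA GeForce RTX 2060                 3.35%
NVIDIA GeForce RTX 2060 SUPER           1.18%
NVIDIA GeForce RTX 2070                 0.81%
NVIDIA GeForce RTX 2070 SUPER           1.07%
NVIDIA GeForce RTX 2080                 0.36%
NVIDIA GeForce RTX 2080 SUPER           0.44%
NVIDIA GeForce RTX 2080 Ti              0.33%
"""

totals = defaultdict(float)
for line in rows.strip().splitlines():
    # first digit of the model number ("2060" -> "2000 series"), then the share
    m = re.match(r"NVIDIA GeForce RTX (\d)\d{3}.*?([\d.]+)%", line)
    if m:
        totals[f"{m.group(1)}000 series"] += float(m.group(2))

print({k: round(v, 2) for k, v in totals.items()})  # {'2000 series': 7.85}
```

Dropping the 3000/4000-series rows into the same string would tally those generations too.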
 
Tim is a sensible dude.

I assume we'll be getting more $350+ 8GB and $600+ 12GB GPUs in the next few months. What a shame.

This leaves an opening for AMD to legitimately compete until the bigger GDDR7 modules arrive. If they just put a reasonable amount of memory on their RDNA4 cards. Regarding the 5060, you won't be having much fun on a 4070 level card with only 8GB. It's hard to even make good use of framegen with <12GB.
 
Yep let’s hope they take advantage. They completely failed to do so this generation.
If they do that and can put up a viable alternative to DLSS I'll take interest again. I didn't appreciate how poorly FSR compared until I got a 4070 and tried DLSS. If anything the youtubers underemphasize how superior DLSS is. I don't think they should test cards with upscaling using FSR for AMD and DLSS for NVIDIA. Apples to apples it is not. More like apples to rotten anchovies.
 
If they do that and can put up a viable alternative to DLSS I'll take interest again. I didn't appreciate how poorly FSR compared until I got a 4070 and tried DLSS. If anything the youtubers underemphasize how superior DLSS is. I don't think they should test cards with upscaling using FSR for AMD and DLSS for NVIDIA. Apples to apples it is not. More like apples to rotten anchovies.

Agreed that AMD have a big opportunity this gen. If they can get RT performance on their mid-range parts up to acceptable levels (say around Ampere level), offer an AI-based FSR4 (which I expect to share a common lineage with PSSR), and offer both a VRAM and price advantage over NV at similar raster levels, then they have a good chance of winning back some significant market share.

The big question mark though is what new 'magic' AI features if any NV will launch with the 5 series.
 
Sigh. Hardware Unboxed continued their poor stance on ray tracing in another recent podcast rant (timestamped). They claim ray tracing is bad because it impacts high refresh rate gaming, because apparently high refresh rate is all that matters when gaming, game graphics be damned.

I don't really know why an experienced reviewer would think that way. If gamers cared only about high refresh rates, we wouldn't have 4K resolution, advanced graphics, global illumination, Lumen, complex geometry, Nanite, lifelike characters, hair simulation, physics simulation, etc. We would be stuck doing 1080p @ 200fps with PS4 graphics! I wouldn't be surprised if Hardware Unboxed started attacking Lumen and Nanite because they are anti high refresh rate gaming!

They claim their audience only truly cares about high refresh rate gaming, according to their biased social media polls (on YouTube and Twitter). They admit NVIDIA argued with them hard that this data isn't representative of the actual gamer base at large, yet Steve remains unconvinced, even while admitting that their polls are not in line with reality, where RTX GPUs continue to dominate market share.

Worse yet, they are surprised that new games with ray tracing are getting more demanding every year! That has been the case since forever; new games got more demanding each year well before the introduction of ray tracing. Steve says the 4070 will not be able to play new ray-traced games after 6 years! What's so strange about that? Six-year-old GPUs have never been suitable for the latest titles; none of this is exclusive to ray tracing. Yet Hardware Unboxed twists these regular occurrences into "ray tracing is bad because ray-traced games will become more demanding"!

I feel like these guys are setting themselves up to be the anti-progress messengers of the tech world; they flip-flop around rationalizing their hate of progress. Even 6 years after the introduction of ray tracing, they come up with more ridiculous justifications each time to hate on the tech. They will be stuck in this position forever, and they are certainly laying the groundwork to justify it forever.


At the end of the podcast, Steve says he doesn't use DLSS in multiplayer games because it increases latency! Even though DLSS actually reduces latency and increases fps by a large amount, serving exactly his high refresh rate gaming needs. So they are that ignorant about the tech they review! Steve wants high refresh rate gaming, but turns off the tech that would ensure it! Typical Hardware Unboxed logic right there.

 