Value of Hardware Unboxed benchmarking

I mostly like seeing their benchmark results in the form of Techspot articles. And Tim's content on screens is usually pretty interesting. The videos are often a slog for me, personally.

And Steve's opinions.. ehh. He must be from some parallel universe where the Australians aren't all likeable and easy going. Though presumably still more often than not called Steve.
 
At the end of the podcast, Steve says he doesn't use DLSS in multiplayer games because it increases latency!
lol, I said a few pages back that he said "native or go home" ages ago and got called out on it; I couldn't be bothered to go back and find the Q&A video to defend myself. Ahem, if I had that Intel clip from Gamers Nexus I could use it here, but in my best Intel rep voice: "Thanks Steve".

And Steve's opinions.. ehh. He must be from some parallel universe where the Australians aren't all likeable and easy going.
I'm Australian and met him briefly in passing years back; he's a nice enough guy in person, but we only talked about power tools for a bit.
 
Steve says the 4070 will not be able to play the new ray traced games after 6 years!! I mean duh! What's so strange about this? This has been the norm since forever: 6-year-old GPUs are not suitable for playing the latest titles, and none of this is exclusive to ray tracing. Yet Hardware Unboxed twists this regular occurrence into "ray tracing is bad because ray traced games will become more demanding"!

There’s no way he really said that if he wants people to take him seriously. Is he also upset at how heavily taxing new games have gotten even without RT?

If you’re a high refresh gamer who doesn’t care about raytracing you can just disable it in most games. It seems he’s upset that Nvidia is selling too many RTX cards but raytracing isn’t the sole reason for that. It’s not clear what alternate outcome he would’ve preferred. Should raytracing have never been brought to market? How would that have benefited gamers at large?
 
Fun observation: I usually play a mix of old and new titles, and it is generally much harder to run a title from about 10 years ago with good enough IQ and high fps on Ultra settings, despite them lacking RT or anything advanced by modern standards (they generally top out at voxelized volumetric lighting, the most visible new rendering "feature" of the PS4 era).

This is partly because they use either MSAA or some bad post-AA, sometimes forcing you to run them with supersampling, partly because they lack any sort of modern upscaling, so you have to render at native resolution, and partly because their "Ultra" settings are so weird that even today it is hard to get playable framerates in these games.

So from my casual observation I would say that RT did not have the impact on framerates Steve says it has. Even decoupled from DLSS/FSR it seems like a much better way to spend performance than what games were spending it on 10 years ago. With DLSS though everything is just faster; you get great IQ without the need to resort to some SGSSAA for that.

In other words - I sincerely don't know what they are even talking about.
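To put rough numbers on the pixel-count side of this (a minimal sketch with assumed figures: 4K output, DLSS Quality's ~0.667x per-axis scale, and a simple 2x2 supersample; no specific game measured):

```python
# Rough comparison of how many pixels get shaded per frame with modern
# upscaling vs. the old brute-force supersampling route.
# All figures are illustrative assumptions, not measurements.

output_w, output_h = 3840, 2160                               # 4K output

native = output_w * output_h                                  # native rendering
dlss_quality = int(output_w * 0.667) * int(output_h * 0.667)  # ~0.667x per axis
ssaa_2x2 = (output_w * 2) * (output_h * 2)                    # 2x2 supersampling

print(f"Native:       {native / 1e6:.1f} MPix shaded per frame")
print(f"DLSS Quality: {dlss_quality / 1e6:.1f} MPix shaded per frame")
print(f"2x2 SSAA:     {ssaa_2x2 / 1e6:.1f} MPix shaded per frame")
# Quality-mode upscaling shades roughly 45% of native, while 2x2 SSAA
# shades 4x native - which is why brute-forcing AA in those older
# MSAA/post-AA titles gets expensive so quickly.
```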
 
I never understand why people care about these channels. They don't have any technical understanding. Battlefield 5 is six years old, and I still have to put up with SSRs in UE5 games; that's how far behind the curve UE5 is. And yet this channel thinks that cubemaps and SSRs are the best ways to render reflections - in 2024.
 
lol, I said a few pages back that he said "native or go home" ages ago and got called out on it; I couldn't be bothered to go back and find the Q&A video to defend myself. Ahem, if I had that Intel clip from Gamers Nexus I could use it here, but in my best Intel rep voice: "Thanks Steve".
He seems stuck in a world run by rules from 10 years ago and refuses to budge from it and accept reality!
@DavidGraham Was he talking about dlss frame generation or just dlss in general?
The context in the video was DLSS upscaling, but maybe he was referring to frame generation? That would be weird though; at the very least he would have said he uses upscaling only, but apparently he uses none of it at all.

Is he also upset at how heavily taxing new games have gotten even without RT?
No, apparently not!

It’s not clear what alternate outcome he would’ve preferred. Should raytracing have never been brought to market? How would that have benefited gamers at large?
He never provided any alternative outcome, he just kept referring to ray tracing as an upsell tactic.

Even decoupled from DLSS/FSR it seems like a much better way to spend performance
They never talked about that aspect: that with DLSS you recoup most of the performance lost to ray tracing and end up in roughly the same performance profile as before you activated ray tracing.

In their RTX 2060 video, most titles they tested run at sub-60fps on the 2060 even before ray tracing is enabled, but they never cared to mention that fact; instead they called out ray tracing because it can't deliver 60fps gaming!
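A back-of-the-envelope frame-time sketch of that "recoup" argument, using made-up numbers (the real RT cost and upscaling savings vary a lot per game and per GPU):

```python
# Toy frame-time model: RT adds cost, upscaling removes shading cost.
# Every number below is an assumption for illustration only.

def fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

base_ms = 16.7          # assumed native frame time without RT (~60 fps)
rt_overhead = 0.40      # assume RT adds ~40% to the frame
upscale_saving = 0.30   # assume DLSS-style upscaling trims the frame ~30%

rt_native_ms = base_ms * (1 + rt_overhead)
rt_upscaled_ms = rt_native_ms * (1 - upscale_saving)

print(f"No RT, native:  {fps(base_ms):5.1f} fps")
print(f"RT, native:     {fps(rt_native_ms):5.1f} fps")
print(f"RT + upscaling: {fps(rt_upscaled_ms):5.1f} fps")
# With these assumptions, RT plus upscaling lands back near the original
# no-RT frame rate - the "roughly the same performance profile" point.
```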
 
They never talked about that aspect: that with DLSS you recoup most of the performance lost to ray tracing and end up in roughly the same performance profile as before you activated ray tracing.
I am saying that the performance hit you get from RT generally provides a much better visual gain than whatever games used to spend performance on before RT became a thing. Some older versions of UE4, for example, have this weird "Ultra" shadows setting which literally halves the framerate for pretty much zero visual change compared to "Very High". Other games from those years often have similar non-RT "improvements". Trading that for even some minimal RT like RTAO seems like a great thing to me.
 
I haven't watched the video (and quite frankly, I'm not going to), but we're already in the "mandatory ray tracing" phase of game development. A game that mandates it just sold a million copies (after another game that mandates it also sold a million copies, and after one that sold something like 10M). Unless software Lumen doesn't count, for whatever reason.
 
I am saying that the performance hit you get from RT generally provides a much better visual gain than whatever games used to spend performance on before RT became a thing
Indeed it is. Steve should be more pissed off about expensive Ultra settings that don't deliver much visual gain, or about the CPU limited titles that severely impact his precious high refresh rate gaming for no visual gain.

Take Dragon's Dogma 2 for example: Hardware Unboxed criticized the game for its not-so-great ray tracing implementation, while completely omitting the game's bad CPU limited performance, which hurts it whether ray tracing is active or not.

Some older versions of UE4, for example, have this weird "Ultra" shadows setting which literally halves the framerate for pretty much zero visual change compared to "Very High"
Yeah, over the years I have documented many similar cases.

In Gears 4, the Insane screen space reflection setting had a huge performance impact that was equal to adding true RT reflections, despite having little effect on visual quality in comparison.

Similarly, in Gears 5 the software screen space global illumination setting also has a massive performance impact despite adding very little to the final image; true hardware RT GI or reflections would have yielded much better image quality with a similar or better performance profile.

In Assassin's Creed Odyssey, Watch Dogs 2 and Borderlands 3, setting volumetric clouds/fog to max tanked performance for little image quality improvement.

In Watch Dogs 2, Arma 3 and Crysis Remastered, maxing out draw distance settings destroyed performance, because draw distance is CPU heavy and current CPUs are not fast enough single-threaded. So we end up with horrific performance at max settings. The same applies to the Flight Simulator games, whether 2010 or 2020.

In Quantum Break, running the game at native resolution destroys performance; the game's advanced lighting was designed to be performant only when upscaled from lower resolutions.

Advanced non-hardware-RT methods for AO always end up costing massive performance; that held true for VXAO (in Final Fantasy 15 and Rise of the Tomb Raider) and Signed Distance Field AO (in Ark Survival Evolved). Adding special sun-shadowing techniques such as HFTS (The Division, Watch Dogs 2, Battlefront 2) or PCSS (Assassin's Creed Syndicate) also costs massive performance.

All of these (and many others) are examples of effects that cost a huge amount of performance and could easily be replaced with real RT effects for a much bigger image quality gain and/or better performance.
 
I find that RT indirect lighting is almost always worth the performance cost. RT reflections and shadows are cool but not something I can't live without. Cube and shadow maps don't bother me too much, and I turn SSR off if it's distracting (it usually is). But RTGI (not sure if that's the correct term for the indirect light bouncing and occlusion) looks so much better that a game would have to perform very poorly for me to turn it off. Realistically I'd probably wait on that game until I have hardware that can run it with RT on.
 
Fun observation: I usually play a mix of old and new titles, and it is generally much harder to run a title from about 10 years ago with good enough IQ and high fps on Ultra settings, despite them lacking RT or anything advanced by modern standards (they generally top out at voxelized volumetric lighting, the most visible new rendering "feature" of the PS4 era).

This is partly because they use either MSAA or some bad post-AA, sometimes forcing you to run them with supersampling, partly because they lack any sort of modern upscaling, so you have to render at native resolution, and partly because their "Ultra" settings are so weird that even today it is hard to get playable framerates in these games.

So from my casual observation I would say that RT did not have the impact on framerates Steve says it has. Even decoupled from DLSS/FSR it seems like a much better way to spend performance than what games were spending it on 10 years ago. With DLSS though everything is just faster; you get great IQ without the need to resort to some SGSSAA for that.

In other words - I sincerely don't know what they are even talking about.
Maybe he means this: if the resources devoted to RT were instead used to enhance traditional performance, the games would run faster.
 
Maybe he means this: if the resources devoted to RT were instead used to enhance traditional performance, the games would run faster.
At the same lighting quality and power efficiency? That is a dubious claim. Efficiency is performance when you're power limited. I'm not saying that statement is definitely wrong, but I've seen no evidence that something like software lumen can match hardware RT at the same quality, performance and efficiency level.
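A one-liner version of the "efficiency is performance when you're power limited" point, with assumed numbers (frames per joule times joules per second is frames per second):

```python
# At a fixed power limit, frame rate is just energy efficiency times the
# power budget. Both architectures below are hypothetical.

power_budget_w = 300.0      # sustained board power limit, watts

frames_per_joule_a = 0.20   # hypothetical architecture A
frames_per_joule_b = 0.25   # hypothetical architecture B, 25% more efficient

print(f"A: {frames_per_joule_a * power_budget_w:.0f} fps at {power_budget_w:.0f} W")
print(f"B: {frames_per_joule_b * power_budget_w:.0f} fps at {power_budget_w:.0f} W")
# Same power limit, more efficient design -> simply the faster card.
```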
 
If that was true someone would’ve done it already.

Hopefully this doesn't devolve into an IHV argument.

However, I'm guessing their interpretation, and that of a lot of others, comes from looking at the correlation between AMD offering more "raster" performance per dollar (at least at market prices) across most of their consumer stack and the state of their RT (and I guess ML) performance. Also that Turing pushed "raster" performance per dollar forward very little compared to previous generations.

We can argue that's a rather superficial and limited understanding of what is actually happening, but there's a lot of "bro"-level understanding and analysis of this stuff.
 
Hopefully this doesn't devolve into an IHV argument.

However, I'm guessing their interpretation, and that of a lot of others, comes from looking at the correlation between AMD offering more "raster" performance per dollar (at least at market prices) across most of their consumer stack and the state of their RT (and I guess ML) performance. Also that Turing pushed "raster" performance per dollar forward very little compared to previous generations.

We can argue that's a rather superficial and limited understanding of what is actually happening, but there's a lot of "bro"-level understanding and analysis of this stuff.

When people talk about “raster” performance they’re usually talking about generic compute & memory performance and maybe more VRAM for pretty textures. Raster is relevant in specific situations with mesh shaders etc but that’s typically not what they’re referring to.

The question is how much more compute would we get if we tossed all the RT transistors and what graphical innovations would that enable. Lumen and Nanite are actually great examples of that.

The thing is that no matter what hardware you use to do it you must cast rays at some point. Raster is fundamentally limited in the type of graphics it can produce. Now you can cast those rays with dedicated transistors or with compute but that just becomes a question of performance and efficiency. Having a religious objection to RT just doesn’t make sense if you care about advancing graphics.
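To make the "you must cast rays at some point" part concrete: the actual ray query is just geometry math, and it is the same math whether an RT core or a compute shader evaluates it. A minimal ray/sphere intersection sketch (plain Python standing in for shader code, purely illustrative):

```python
import math

# Minimal ray/sphere intersection - the kind of query an RT core answers
# in fixed-function hardware and a compute shader answers in ALU code.

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None         # only count hits in front of the origin

# Example: a shadow-style query from the origin toward a unit sphere at z = 5.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))   # -> 4.0
# Dedicated RT hardware accelerates this test (plus BVH traversal); doing it
# in compute is the same work per ray, just slower and less power efficient.
```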
 
Hopefully this doesn't devolve into an IHV argument.

However, I'm guessing their interpretation, and that of a lot of others, comes from looking at the correlation between AMD offering more "raster" performance per dollar (at least at market prices) across most of their consumer stack and the state of their RT (and I guess ML) performance. Also that Turing pushed "raster" performance per dollar forward very little compared to previous generations.

We can argue that's a rather superficial and limited understanding of what is actually happening, but there's a lot of "bro"-level understanding and analysis of this stuff.
A DX7 GPU with AD102 complexity would likely have higher DX7 performance than AD102. Does that mean that we should drop all advancements in rendering tech between DX7 and DX12U?
 
A bit of history about APIs: during Half-Life 2's development, Valve had to build rendering systems spanning 3 DirectX versions: DX7, DX8 and DX9. The differences between these 3 APIs were vast, each requiring different data storage, programming languages and rendering approaches. In the end Valve ended up with 9 rendering systems and spent a very long time making sure all of them looked consistent.

Contrast this to today, where Hardware Unboxed seems to think that ray tracing is bad because it forces developers to do two rendering paths in their games (one for raster and one for RT). Someone needs to brush up on their history.

 
Contrast this to today, where Hardware Unboxed seems to think that ray tracing is bad because it forces developers to do two rendering paths in their games (one for raster and one for RT). Someone needs to brush up on their history.
I wonder if they think baked lighting just materialises into existence without needing an RT renderer to do the baking...
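For what it's worth, a toy illustration of that point: even a single baked ambient occlusion value is produced by casting rays, just offline instead of per frame. A minimal sketch against an assumed scene of one infinite floor plane (illustration only, not any engine's actual baker):

```python
import math
import random

# Toy lightmap-style bake: estimate ambient occlusion for one surface point
# by casting random hemisphere rays against the scene (here just a floor
# plane). "Baked" lighting is still ray tracing - it just runs ahead of time.

def hits_floor(origin, direction, floor_y=0.0):
    """Does this ray hit the floor plane at y = floor_y, in front of the origin?"""
    if abs(direction[1]) < 1e-6:
        return False
    t = (floor_y - origin[1]) / direction[1]
    return t > 1e-3

def bake_ao(point, normal, samples=10_000):
    """Fraction of hemisphere rays that escape the scene (1.0 = fully open)."""
    open_rays = 0
    for _ in range(samples):
        d = [random.gauss(0.0, 1.0) for _ in range(3)]       # random direction
        norm = math.sqrt(sum(c * c for c in d))
        d = [c / norm for c in d]
        if sum(d[i] * normal[i] for i in range(3)) < 0.0:     # flip into hemisphere
            d = [-c for c in d]
        if not hits_floor(point, d):
            open_rays += 1
    return open_rays / samples

# A wall texel sitting 1 unit above the floor and facing sideways: roughly
# half its hemisphere is blocked by the floor, so the baked value is ~0.5.
print(bake_ao(point=(0.0, 1.0, 0.0), normal=(1.0, 0.0, 0.0)))
```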
 