Value of Hardware Unboxed benchmarking

literal "ray tracing" follows (to the best of our ability) the physical model of how photons move through our universe

This is the challenge with coming up with alternative approaches. One of the most fundamental questions in rendering is a visibility query between two points in three-dimensional space. I don't know how you answer that question without sampling a 3D representation of the environment. So it all boils down to querying a world-space acceleration structure.

Whether you do that with rays or cones, and whether it's a triangle BVH or a voxel grid, it's still the same paradigm. Epic made a valiant effort with tracing SDFs but quickly pivoted to triangle BVHs when the limitations became impossible to ignore. Would love to see other engines trying something novel, but I'm not holding my breath.
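
To make that concrete, here's a minimal sketch of such a point-to-point visibility query. It's illustrative only: the function names are mine, and a flat list of axis-aligned boxes stands in for a real BVH or voxel grid.

```python
# Hypothetical sketch: a point-to-point visibility query, the primitive
# operation discussed above. A real renderer traverses a BVH or voxel grid;
# a flat list of axis-aligned boxes stands in for that structure here.

def segment_hits_aabb(p0, p1, box_min, box_max):
    """Slab test: does the segment p0 -> p1 intersect the axis-aligned box?"""
    t_enter, t_exit = 0.0, 1.0
    for axis in range(3):
        d = p1[axis] - p0[axis]
        if abs(d) < 1e-12:
            # Segment runs parallel to this slab; it's a miss if it starts outside.
            if not (box_min[axis] <= p0[axis] <= box_max[axis]):
                return False
        else:
            t0 = (box_min[axis] - p0[axis]) / d
            t1 = (box_max[axis] - p0[axis]) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
            if t_enter > t_exit:
                return False
    return True

def visible(p0, p1, occluders):
    """True if nothing in the 'acceleration structure' blocks p0 -> p1."""
    return not any(segment_hits_aabb(p0, p1, lo, hi) for lo, hi in occluders)

# A box sitting between the two query points blocks visibility.
boxes = [((-1.0, -1.0, 4.0), (1.0, 1.0, 6.0))]
print(visible((0.0, 0.0, 0.0), (0.0, 0.0, 10.0), boxes))  # False: occluded
print(visible((5.0, 0.0, 0.0), (5.0, 0.0, 10.0), boxes))  # True: clear path
```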

The other part of the argument against RT is essentially "we don't need accurate light transport because the hacks look good enough and nobody notices the IQ improvement anyway". I think we can agree there's no merit in that stance as technology always drives forward.
 
Realistically, where do Gaussian Splatting, Radiance Field Rendering, et al., take us? This is the first 'next-gen' result from a completely different paradigm. Is there anything there even hinting at an alternative for realtime graphics?
 
until affordable GPUs can run full RT, it’s of no benefit to the average user
We spent the entire last page detailing why this is never going to be true: high-end visual features will always require high-end hardware. That has been the history of PC gaming since forever.

If you buy a low-end GPU, don't expect it to run max path tracing at max native res at 120fps. You may be able to do medium ray tracing at a solid 60fps or so, or selective ray tracing at high fps, but expecting full RT to run on a 4060 at 60fps is not realistic at all.

In the past, anti-aliasing remained a very high-end feature for a long time; the same was true for shader model 3 effects, dynamic shadow maps, soft shadows, high levels of tessellation, physics, etc. Ray tracing is no exception.

Ray tracing also increases in complexity each year, like anything else in graphics. Higher tiers of graphics are always being introduced; expecting average hardware to handle the newest stuff is never going to happen.

For example, a 4060 can handle Cyberpunk with ray tracing just fine, but with path tracing? Of course not! That is a high-end feature introduced to run on high-end hardware; you can't demand that a 4060 be able to do path tracing at 60fps. However, you are guaranteed to be able to do it on a 6060 in the future.

It's like asking a mid-range GPU to do FEAR or Oblivion or Crysis at max settings (in their time); it's just not possible. Hell, the 4060 can't do 60fps in many raster games even at native 1080p (Flight Sim 2024, Final Fantasy XVI, Frostpunk 2, Alan Wake 2, STALKER 2, Dragon Age Veilguard, Black Myth Wukong, Hellblade 2, Silent Hill 2, Horizon 2, Dragon's Dogma 2, etc.), so why are you not complaining that these games don't hit 60fps for average users? Hardware Unboxed criticized the 2060 for failing to do good maxed-out ray tracing at 1080p in most contemporary ray tracing titles, but the 2060 can't even do 60fps in most contemporary raster titles either. So where is the issue? Why is RT being singled out like this when it's behaving like every other visual feature that existed before it?
 
We spent the entire last page detailing why this is never going to be true: high-end visual features will always require high-end hardware. That has been the history of PC gaming since forever.
But this assumes RT will always be just a high-end feature; surely you can't assume that and, at the same time, expect RT to become more relevant to people.
 
His point is that there's always something the most capable contemporary GPUs of the day can do that older or lesser models can't. It's an inherent part of the marketing of any new architecture and product generation, showing off what they can do and why they're worth the money, which includes convincing developers to lean on what's new and break ground for the future. Performance is also a feature after all.

It's high-performance ray tracing and cutting edge upscaling and frame generation today, it'll be something else eventually.
 
But this assumes RT will always be just a high-end feature; surely you can't assume that and, at the same time, expect RT to become more relevant to people.
Yes, RT will always have high-end features. RT is a new rendering paradigm that will keep scaling up indefinitely, for the near future at least.

Developers first did selective ray tracing: reflections, shadows, or indirect illumination. Then they started combining them (shadows + reflections, or reflections + illumination), then piling them on top of each other, achieving a semi-full ray tracing pipeline (shadows + reflections + indirect illumination). Then path tracing was introduced, adding ray-traced direct illumination on top, plus higher resolution for shadows, reflections and indirect illumination. Next we are going to get caustics, transmission and subsurface scattering (skin), and volumetrics inside the path tracing, and then higher ray counts per pixel will be unlocked to allow for a more offline-rendering look. With more powerful hardware, more ray tracing features get unlocked, because why not?
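
Just to illustrate how the ray budget balloons as those features pile up, here's a purely hypothetical sketch; the tier names and numbers are invented, not taken from any real title.

```python
# Hypothetical tiers and numbers, for illustration only; not from any real game.
RT_TIERS = {
    "selective":   {"rays_per_pixel": 0.5, "bounces": 1},  # e.g. reflections only
    "combined":    {"rays_per_pixel": 1.0, "bounces": 1},  # reflections + shadows
    "full_rt":     {"rays_per_pixel": 1.0, "bounces": 2},  # + indirect illumination
    "path_traced": {"rays_per_pixel": 2.0, "bounces": 2},  # unified light transport
    "future":      {"rays_per_pixel": 8.0, "bounces": 4},  # + caustics, SSS, volumetrics
}

def rays_per_frame(tier, width=1920, height=1080):
    """Rough ray budget per frame (ignores denoising, reuse tricks like ReSTIR, etc.)."""
    cfg = RT_TIERS[tier]
    return width * height * cfg["rays_per_pixel"] * cfg["bounces"]

for name in RT_TIERS:
    print(f"{name:>12}: ~{rays_per_frame(name) / 1e6:.0f} M rays per frame at 1080p")
```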

Scalability is the main feature of PC games. Visual features are not only introduced to run on current hardware, but on more capable future hardware as well; that has been the hallmark of PC gaming since forever, and it's why we got Far Cry, Crysis, FEAR, Metro, etc. Asking developers to stop making high-end graphics serves whom, exactly? What is the benefit? High-end features are optional; you don't have to run them at all, just ignore them if you don't think they are worth it.

Meanwhile, everyone stands to benefit when they play old games with advanced graphics on future hardware; those old games will still look fantastic because they endured the test of time thanks to their advanced graphics. Crysis remained the best-looking game for years after its introduction, and Cyberpunk will continue that legacy too. It's why we have great-looking old games to this day, it's why developers remaster old games in the first place, and it's why gamers mod old games with high-res textures, ReShade ray tracing and actual path tracing. Developers and gamers actually care about this.

And now, thanks to ray tracing and path tracing, we will have many Crysis moments every year, instead of every few years.
 
Steve forms strong opinions early and triples down on them. He personally only wants to play shooters at max frame rates, so he is quick to dismiss graphics fidelity tech.

He doesn’t care for RT, HDR or even OLED.

Their active community is very pro-AMD after years of Steve repeatedly going to bat for Zen, even when it was notably inferior for gaming, by making power-draw comparisons his main talking point. So polls and comments on HUB content will be very pro-AMD regardless of reality.
 
And yet he ignores Reflex, which makes playing "shooters with max frame rates" even more awesome. Obviously this guy likes everything that makes AMD look equal or better than NVIDIA.
 
These simple concepts seem to elude Hardware Unboxed and the other copycat YouTubers; they seem fundamentally incapable of understanding the PC platform itself. I would have respected them if they went on a crusade against all taxing visual features; at least then they would be consistent! But no, they went on a crusade against ray tracing alone, confirming their bias and arbitrary standards.

I don't know why we are even still debating ray tracing when the architect of the PS5 (Mark Cerny) said that raster is no longer good enough, and that ray tracing + machine learning are the future through which we are going to achieve big jumps in quality and performance.
 
I don't know why we are even still debating ray tracing when the architect of the PS5 (Mark Cerny) said that raster is no longer good enough, and that ray tracing + machine learning are the future through which we are going to achieve big jumps in quality and performance.
All praise the Great Cerny! 🙇🏽‍♂️
 
Blenders (also known as ROPs) had to be larger to accelerate AA.

RT hardware is very small, even smaller than tensor cores, so it's very cheap for the huge acceleration it adds. NVIDIA did a comparison six years ago showing how large a GPU would need to be to reach the same ray tracing performance without RT cores; it would have to be much, much larger.


The same is true for most ray tracing implementations; no one is forcing you to do anything.


AA was a high-end-only feature; it literally cut fps in half. Most GPUs couldn't handle it.

You bring the controversy on yourself by thinking and acting like RT is forced on you; it isn't, even though it's part of DirectX and Vulkan.
ROPs were often "enlarged" to make AA faster, true. How does that compare to the size of RT units, along with all the other changes needed?

That is not true; we only have the options developers expose.

AA wasn't a high-end-only feature; I was using it since the GeForce 2 MX. It does not literally cut fps in half; it all depends on implementations and settings, settings put into users' hands. That is why it also doesn't make sense to say most GPUs couldn't handle it.

Oh, and that framerate comparison with the past doesn't make sense either. Expected 3D framerates have clearly been rising since the mid-nineties. It doesn't have to be entitlement: I could be happy with 25 fps on a CRT, but on an OLED screen I need around double that to feel the same fluidity. Saying that the same framerate today is fine just because people were fine with it decades ago is plain dumb.
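
For what it's worth, the usual back-of-the-envelope argument for that goes like this; the persistence figures below are assumed for illustration, not measurements.

```python
# Back-of-the-envelope sketch of the sample-and-hold argument above; the
# persistence figures are assumed for illustration, not measured values.
def eye_tracking_blur_px(speed_px_per_s, fps, persistence_fraction):
    """Approximate smear (in pixels) while the eye tracks a moving object.
    persistence_fraction: share of the frame time the image stays lit
    (~1.0 for sample-and-hold OLED/LCD, far lower for a flashing CRT)."""
    return speed_px_per_s * persistence_fraction / fps

speed = 1000  # panning at 1000 px/s
print(eye_tracking_blur_px(speed, 25, 1.0))   # OLED at 25 fps -> ~40 px of smear
print(eye_tracking_blur_px(speed, 50, 1.0))   # OLED at 50 fps -> ~20 px
print(eye_tracking_blur_px(speed, 25, 0.1))   # CRT at 25 fps  -> ~4 px
```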

After this transition period, many games will have only the RT path. It will be forced on us, and I don't mind that. I just cannot pretend, like you do, that there aren't drawbacks today. This transitional period is painful for everyone.

Pretty sure RT works exactly like this now (except forced, unless we're going to use screen-space ReShade or RTX Remix), but yeah, screw anyone who wants RT, they shouldn't have options I don't like/want.
Whoosh.
 
He doesn’t care for RT, HDR or even OLED.

There’s a surprising lack of HDR coverage in general. LG is rolling out some interesting ultra wide OLED panels in 2025 and I’m considering making the jump to HDR on PC. Seems no one really talks about the quality or benefit of HDR implementations in game reviews though.
 
AA was a high-end feature until the GeForce 3 and the introduction of MSAA. SSAA is the most inefficient way to do AA. The mainstream only got access to MSAA with the GeForce 3 Ti 200 at the end of 2001.
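
A rough, illustrative count of the shading work shows why MSAA changed the economics; these are not measurements from any specific GPU.

```python
# Illustrative count of shaded fragments, not a measurement from any GPU:
# SSAA shades every sub-sample, while MSAA shades once per pixel and only
# multiplies depth/coverage samples (bandwidth still rises with sample count).
width, height, samples = 1024, 768, 4

ssaa_shaded = width * height * samples   # 4x supersampling: 4x the shading work
msaa_shaded = width * height             # 4x MSAA: shading cost stays ~1x
print(f"4x SSAA fragments shaded: {ssaa_shaded:,}")  # 3,145,728
print(f"4x MSAA fragments shaded: {msaa_shaded:,}")  #   786,432
```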
 
Oh, and that framerate comparison with the past doesn't make sense either. Expected 3D framerates have clearly been rising since the mid-nineties. It doesn't have to be entitlement: I could be happy with 25 fps on a CRT, but on an OLED screen I need around double that to feel the same fluidity. Saying that the same framerate today is fine just because people were fine with it decades ago is plain dumb.
That's categorically false; frames don't lose their essence just because we switched display technologies. Otherwise, what are all of those console games being played at 30fps on OLEDs? 15fps games?

It does not literally cut fps in half
Refresh your memory. Doom 3 from 75fps to 42fps.

Far Cry from 58fps to 32fps.

FEAR from 53fps to 26fps.

You get the picture ...
 
There’s a surprising lack of HDR coverage in general. LG is rolling out some interesting ultra wide OLED panels in 2025 and I’m considering making the jump to HDR on PC. Seems no one really talks about the quality or benefit of HDR implementations in game reviews though.

Because native HDR implementations in games often sit somewhere between trash and mediocre. Also, the same people used to doing game image-quality reviews would need to learn what to look for in HDR, as it's a different skill set.

Best course of action at the moment is to use RTX HDR for a consistent experience across games.

If we want better HDR implementations, we need reviewers and IQ analysts to learn about HDR and call it out.

The joke is that it's one of the biggest gains in image quality with no performance hit, so it's even more of a disservice that it doesn't get attention.

I’m optimistic that as time passes and more enthusiast displays perform well with hdr, we’ll see more emphasis on it.
 
AA (ignoring the edge AA of cards like the Verite) only became a usable feature with the Voodoo 5 (with its T-buffer and its RGSS method of anti-aliasing) and the GeForce 2 GTS with its brute-force ordered-grid supersampling.
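
For a rough picture of why the rotated grid mattered, here's an illustrative comparison of 4x sample offsets; these are not the exact hardware positions.

```python
# Illustrative 4x sample offsets within a pixel; not the exact hardware
# positions. The point: an ordered grid reuses only 2 distinct x and y
# offsets, while a rotated grid spreads 4 distinct offsets on each axis,
# so near-horizontal/vertical edges get more intensity steps for the
# same number of samples.
ordered_grid = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
rotated_grid = [(-0.375, -0.125), (0.125, -0.375), (0.375, 0.125), (-0.125, 0.375)]

for name, pattern in (("OGSS", ordered_grid), ("RGSS", rotated_grid)):
    xs = sorted({x for x, _ in pattern})
    ys = sorted({y for _, y in pattern})
    print(f"{name}: distinct x offsets {xs}, distinct y offsets {ys}")
```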
 
That's categorically false; frames don't lose their essence just because we switched display technologies. Otherwise, what are all of those console games being played at 30fps on OLEDs? 15fps games?

30fps games are 30fps games. Try a relevant question: what is the essence of a frame?

You get the picture ...

Yep, you need to learn so much about AA and I don't have the time to help you.
 
Refresh your memory. Doom 3 from 75fps to 42fps. Far Cry from 58fps to 32fps. FEAR from 53fps to 26fps. You get the picture ...

These graphs can be misleading. 1600x1200 was something like what 4K is nowadays; it wasn't common. The most common resolutions were 800x600, 1024x768, 1280x1024 or 1366x768. And that's apart from the wonder that CRT monitors were, where lowering the resolution did not have a very noticeable impact on image quality. Using AA was more of a requirement on LCD monitors.
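
A quick pixel-count comparison, just to put numbers on it; the ratios aren't identical, but it shows how 1600x1200 sat above the common resolutions of its day roughly the way 4K sits above 1080p now.

```python
# Quick pixel-count comparison to back up the claim above; the resolutions
# are just the common examples already mentioned.
def megapixels(w, h):
    return w * h / 1e6

print(f"1600x1200 = {megapixels(1600, 1200):.2f} MP "
      f"({megapixels(1600, 1200) / megapixels(1024, 768):.1f}x a common 1024x768)")
print(f"3840x2160 = {megapixels(3840, 2160):.2f} MP "
      f"({megapixels(3840, 2160) / megapixels(1920, 1080):.1f}x a common 1920x1080)")
```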

 