Value of Hardware Unboxed benchmarking

It's the same arguments we keep circling back to: RT is not noticeable, RT costs too much performance, RT cards are expensive, RT is irrelevant, and the loop goes on forever.

But we forget. We forget the times when 4X anti-aliasing alone used to cut fps in half; we forget AAA games with advanced graphics running at 30 to 50 fps on high-end hardware.

Here is Doom 3 at max settings on the highest GPUs, 42fps!

3426.png


Far Cry 1: max settings, 50 fps.

2849.png


Crysis Warhead, max settings: 22fps!

Enthusiast_03-p.webp



F.E.A.R. running at max settings: 28fps!

fear2.jpg


Oblivion at max settings: 38fps!

oblivion-highend-bloom.png


Far Cry 3 max settings: 31fps!

2560_01-p.webp

Metro 2033 max settings: 38fps!

45130.png


And I could go on and on and on. But we forget, and so-called YouTube "reviewers" want max graphics + 120 fps + max native 4K resolution. It doesn't work like that, it never did work like that in PC gaming, and history remains the evidence for that.
 
There was a brief period of time in which the 1080 Ti could crush most games and deliver a solid 1080p/120 fps or 4K/60 fps at or near max settings, and even a budget card like the 1060 could easily beat PS4 performance. People who started PC gaming at that time might have naively assumed that they could just keep cranking resolution and frame rate higher and higher with each generation, and were blind to the fact that performance requirements to hit a certain resolution/frame rate target constantly increase, especially when a new console generation arrives.
 
That's a really good point. Back in the day I don't remember people whining so much when max settings crushed the top GPUs. Maybe the extreme entitlement of the current generation is leaking into GPU performance expectations too.

Oh, they definitely did. People were not excited to replace one- or two-year-old GPUs. It really limited the growth of PC gaming. Probably something we never want to go back to.
 
Indeed, as shown above, old games from the PC golden era at ultra settings and ultra resolution were pretty much never 60 fps. If anyone wants me to test, I have nearly every higher-end GPU since around 2000. But now our ultra settings are far more visually impactful than they ever were before, thanks to RT I would argue. As I see it, ray tracing is the best "ultra" setting that has possibly ever existed since MSAA or shadow maps. It is a meaningful visual difference that also arguably scales infinitely into the future, if the dev is smart enough to allow more rays per pixel.

I really hope devs start exposing the RPP in some way in games, so we can have RTX 1090s and run a game's reflections at, like, 10 RPP with little denoising needed.
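To make that concrete, here is a rough C++-style sketch of what an exposed rays-per-pixel knob could look like in a reflection pass. A minimal sketch only: the names (ReflectionSettings, trace_reflection_ray) are hypothetical, not any real engine or RTX API.

```cpp
// Purely illustrative sketch of an exposed rays-per-pixel (RPP) setting for RT reflections.
// ReflectionSettings and trace_reflection_ray are hypothetical names, not a real engine API.
#include <cstdint>

struct Color { float r = 0.f, g = 0.f, b = 0.f; };

struct ReflectionSettings {
    uint32_t rays_per_pixel = 1;    // today's typical budget; a future GPU could afford 8-10
    bool     denoise        = true; // fewer rays -> more aggressive denoising needed
};

// Stand-in for the real ray dispatch: returns one stochastic reflection sample for this pixel.
Color trace_reflection_ray(uint32_t x, uint32_t y, uint32_t sample_index)
{
    (void)x; (void)y; (void)sample_index;
    return { 0.5f, 0.5f, 0.5f };    // placeholder result
}

// Averaging more samples per pixel reduces variance, so less denoising is required.
// Assumes rays_per_pixel >= 1.
Color shade_reflection(uint32_t x, uint32_t y, const ReflectionSettings& s)
{
    Color sum;
    for (uint32_t i = 0; i < s.rays_per_pixel; ++i) {
        const Color c = trace_reflection_ray(x, y, i);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    const float inv = 1.0f / static_cast<float>(s.rays_per_pixel);
    return { sum.r * inv, sum.g * inv, sum.b * inv };
}
```

The appeal of a knob like this is that, unlike most raster "ultra" options, the exact same code path keeps scaling: a future card with more RT throughput just gets a bigger rays_per_pixel budget and a cleaner image before the denoiser even runs.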
 
Oh, they definitely did. People were not excited to replace one- or two-year-old GPUs. It really limited the growth of PC gaming. Probably something we never want to go back to.
The dark age only began after there were only nVidia and AMD (ATi) left on the market. From 1993 till 2006 or so was the growth phase of PC gaming, and this was the time when every six months there was a new GPU and another new must-have feature.
 
The dark age only began after there were only nVidia and AMD (ATi) left on the market. From 1993 till 2006 or so was the growth phase of PC gaming, and this was the time when every six months there was a new GPU and another new must-have feature.

I don't know. I went through the whole thing. Being a teen in the '90s, it was hard to get money to constantly buy new graphics cards. I remember borrowing a 3dfx card from a friend at one point because he had upgraded and I just couldn't get anything at the time. I went through a lot of cards, and then tapped out when the Xbox 360 released, because it had good multiplayer and COD4 finally figured out good FPS controls on a gamepad.
 
I am suggesting a DF Retro episode about this.
I do feel a DF article on the impact of ray tracing, comparing relative performance gains across time, would be entertainingly informative and a positive contribution to the industry discussion. A recalibration is definitely in order. I'd like to see inflation-adjusted pricing and what the more expensive cards got you from the best games of the period, to see what the value-add really was.
 
Please speak for yourself. I notice SSR all the time during gameplay without looking for it, and it’s absolutely immersion breaking.

Yeah there’s some noise here and there. It doesn’t bug me as much as the artifacts from the metric ton of hackery that RT does away with. I don’t want to go back.
That's fair, and it's understandable why you prefer it. However, keep in mind that I started this discussion way back by talking about the average user. The average user does not know what SSR is, and a majority don't care. The average user can tell you if a game looks good, if it is smooth or stuttering, if they enjoy playing the game, etc. They can tell you things they don't like about the gameplay, etc.

You're not the average user, as you're on Beyond3D. While I appreciate having options in the PC space, the foundational premise of my argument is that on resource-constrained devices, developers should cater to the needs of the average user first, as they make up the majority of the market. If you like RT reflections, that option should be available to you on PC to turn on as you see fit. However, it should not be the primary concern when designing a game at this point in time.

While this may seem controversial to some, I'm strongly of the opinion that RT in its current iteration is not about improving graphics. It's about helping devs save money and time. The hardware the average user has access to is only able to perform ray tracing with significant compromises. Frankly, whether a studio saves money or time is not my concern. However, if they deliver a substandard product in the quest to save time and money, then it becomes a huge problem for me.

As it stands, games that tack RT on often have poor implementations of raster effects such as SSR, cube maps, etc. They often have very poor performance on the average user's machine. And for most, the difference isn't immediately noticeable unless you know where to look. For users who don't have access to sufficient hardware, the experience is now worse than it would have been a few short years ago.

Finally, as a side note: despite all my gripes with Nvidia, they have done a good job releasing the first 1080p/60 fps “full RT” card. As someone who owns a 4080, I consider it inadequate for ray tracing. However, with a 4090, you can get 1080p/60 fps native in most full RT games. How long will that performance take to filter down to the x60 series? 3-5 generations? Until then, devs should not forget their primary audience in order to cater to the 1%.
 
I do feel a DF article on the impact of ray tracing, comparing relative performance gains across time, would be entertainingly informative and a positive contribution to the industry discussion. A recalibration is definitely in order. I'd like to see inflation-adjusted pricing and what the more expensive cards got you from the best games of the period, to see what the value-add really was.
Didn't HUB already cover this topic comprehensively? Not sure what DF could add to the discussion.
 
That's fair, and it's understandable why you prefer it. However, keep in mind that I started this discussion way back by talking about the average user. The average user does not know what SSR is, and a majority don't care. The average user can tell you if a game looks good, if it is smooth or stuttering, if they enjoy playing the game, etc. They can tell you things they don't like about the gameplay, etc.
That's exactly the point. What looks good or "correct" is determined by our visual systems, based on our past experience in the world, and not based on our abstract knowledge of how games are designed. We don't need to know anything about SSR or ray tracing to know that a shiny surface shouldn't suddenly stop reflecting everything just because we have approached it. The look of surfaces is defined precisely by how light interacts with them, and the average user should have a perfectly functioning visual system that can tell when a surface looks wrong, just like they can tell when an animation seems unnatural, or a face looks "off".

The point of ray tracing is to remove the artifacts that prevent a scene from seeming real, without the user necessarily being able to articulate everything that has changed.
 
100% agreed with @Subtlesnake's summary above. The "average gamer" couldn't possibly care less how the tech works, just so long as the game "looks right" as they go whizzing past.

Speaking only for myself, one of the absolute best moments in Cyberpunk 2077 was seeing water puddles that weren't an SSR randomly-reflective shitshow. I actually took some pictures and posted them somewhere in this forum a few years back, explicitly of a section of road where I was comparing raytraced reflections versus "not" raytraced reflections (SSR), and holy hell is it obvious. For a game with so many water puddles all over the world, even when zooming over / past them at 100 mph in-game inside a car, my hind-brain immediately noticed something was right, but it took a few seconds for me to consciously figure out why it looked better.

Now that I've witnessed raytraced reflections, SSR can just die in a fire kthx ;)

Edit: Found my old screenshots https://forum.beyond3d.com/threads/cyberpunk-2077-xo-xbsx-s-pc-ps4-ps5.60786/page-43#post-2242499
Example in my pictures above: the reason that water puddle is basically white in the SSR / RTX-OFF screenshot is that my viewport doesn't include the scenery which should actually be reflected in that water. The jarring part: if I were to look "up" slightly, suddenly the reflection would generate as a visible pop-in. The RTX-ON screen capture correctly renders the reflection in the water puddle the entire time, no matter where my viewport is looking.
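For anyone curious why the puddle ends up white rather than just "less accurate", here is a minimal C++ sketch of what an SSR trace boils down to. Purely illustrative, with hypothetical stand-in helpers (project_to_screen_uv, scene_depth_at), and obviously not CDPR's actual code.

```cpp
// Illustrative sketch of the SSR limitation described above (stand-in helpers, not real engine code).
// SSR marches the reflected ray through the screen-space depth buffer, so it can only reflect
// what the current viewport already shows; if the ray exits the screen first there is no data
// to sample, and the renderer typically falls back to a cube map or flat colour (the white puddle).
#include <optional>

struct Vec2 { float x = 0.f, y = 0.f; };
struct Vec3 { float x = 0.f, y = 0.f, z = 0.f; };

// Hypothetical stand-ins for the real camera projection and depth-buffer lookup.
Vec2  project_to_screen_uv(const Vec3& p) { return { p.x, p.y }; }
float scene_depth_at(const Vec2& uv)      { (void)uv; return 1.0f; }

// Returns the screen UV to reflect from, or nothing if the reflected geometry is off screen.
std::optional<Vec2> ssr_trace(Vec3 pos, const Vec3& reflect_dir)
{
    constexpr int   kSteps    = 64;
    constexpr float kStepSize = 0.05f;

    for (int i = 0; i < kSteps; ++i) {
        // Advance the reflected ray one step through the scene.
        pos = { pos.x + reflect_dir.x * kStepSize,
                pos.y + reflect_dir.y * kStepSize,
                pos.z + reflect_dir.z * kStepSize };
        const Vec2 uv = project_to_screen_uv(pos);

        // The scenery that should be mirrored (e.g. the buildings above the puddle while the
        // camera looks down) is outside the viewport: nothing on screen to sample.
        if (uv.x < 0.f || uv.x > 1.f || uv.y < 0.f || uv.y > 1.f)
            return std::nullopt;   // caller falls back -> washed-out puddle, pop-in when you look up

        // Depth-buffer hit test: the marched ray has passed behind the surface stored at this
        // pixel, so treat it as a hit and reflect whatever colour is already on screen there.
        if (pos.z >= scene_depth_at(uv))
            return uv;
    }
    return std::nullopt;
}
```

A ray traced reflection queries the actual scene geometry instead of the depth buffer, which is why it keeps working no matter where the camera is pointing.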

While actually playing the game, RTX lighting, shadowing and GI make a remarkable difference in the visual quality of the world and my personal immersion in it.
 
As I've said already, I feel that a substantial part of the anti-RT crowd has gotten too used to what are in fact non-RT rendering artifacts, which is why we now have people arguing that RT looks bad when in practice it looks better by every measure, even to those who have never played a game in their life.

That, and the cost of entry, of course. When you see that the majority GPU among gamers on Steam was the 1060, then the 1660, and even now it's just the 3060/4060, it's not hard to see where the general mindset against RT may come from.
 
It's the same arguments we keep circling back to: RT is not noticeable, RT costs too much performance, RT cards are expensive, RT is irrelevant, and the loop goes on forever.

But we forget. We forget the times when 4X anti-aliasing alone used to cut fps in half; we forget AAA games with advanced graphics running at 30 to 50 fps on high-end hardware.

Here is Doom 3 at max settings on the highest GPUs, 42fps!

3426.png


Far Cry 1: max settings, 50 fps.

2849.png


Crysis Warhead, max settings: 22fps!

Enthusiast_03-p.webp



F.E.A.R. running at max settings: 28fps!

fear2.jpg


Oblivion at max settings: 38fps!

oblivion-highend-bloom.png


Far Cry 3 max settings: 31fps!

2560_01-p.webp

Metro 2033 max settings: 38fps!

45130.png


And I could go on and on and on. But we forget, and so-called YouTube "reviewers" want max graphics + 120 fps + max native 4K resolution. It doesn't work like that, it never did work like that in PC gaming, and history remains the evidence for that.
I cannot for the life of me remember a single one of those games (from that "golden age of gaming" everyone keeps talking about) that ran at or above 60 fps at the highest settings on the flagship hardware of the time...
I would crank everything to 11, marvel at the pretty graphics at slideshow levels of performance
(graphics that were actually a lot better at the highest setting, and not marginally better like they are now from "High" to "Ultra", excluding ray tracing, which IMO actually is transformative),
adjust settings to enjoy the game at playable frame rates, and then dream about what the next generation of hardware would be able to do.
The only time I think many games ran absolutely amazingly through brute force on PC (if my memory isn't playing tricks on me, which it might very well be) was the latter half of the 360/PS3 era...

those who have never played a game in their life
I have a good friend (he's a graphic designer) to whom I have occasionally, throughout the years, shown new (and shiny) games.
The last game he played was Marathon (2, maybe?) on a Mac.
He wouldn't have a clue as to why (technically) things looked wrong, but he would always spot everything instantly, from low polygon counts to lighting errors, wrong and misaligned reflections, and occlusion artifacts.
Path-traced Cyberpunk was the first game that actually impressed him graphically.

Either way, real-time and even offline-rendered graphics have always been about compromises.
You can't have it all.
 
It's the same arguments we keep circling back to: RT is not noticeable, RT costs too much performance, RT cards are expensive, RT is irrelevant, and the loop goes on forever.

But we forget. We forget the times when 4X anti-aliasing alone used to cut fps in half; we forget AAA games with advanced graphics running at 30 to 50 fps on high-end hardware.
If the second line is supposed to be a counter-argument to the first one, it fails.

The AA implementation was much easier and cheaper to add to the hardware than RT. It can be easily increased, decreased, forced, and turned off without drastically changing the rendering, making it a non-controversial feature.
Also if you forgot, speak for yourself.
 
It can be easily increased, decreased, forced, and turned off without drastically changing the rendering, making it a non-controversial feature.
Also if you forgot, speak for yourself.
Pretty sure RT works exactly like this now (except forced, unless we count screen-space ReShade shaders or RTX Remix), but yeah, screw anyone who wants RT; they shouldn't have options I don't like/want.
 
That's exactly the point. What looks good or "correct" is determined by our visual systems, based on our past experience in the world, and not based on our abstract knowledge of how games are designed. We don't need to know anything about SSR or ray tracing to know that a shiny surface shouldn't suddenly stop reflecting everything just because we have approached it. The look of surfaces is defined precisely by how light interacts with them, and the average user should have a perfectly functioning visual system that can tell when a surface looks wrong, just like they can tell when an animation seems unnatural, or a face looks "off".

The point of ray tracing is to remove the artifacts that prevent a scene from seeming real, without the user necessarily being able to articulate everything that has changed.
Yes, the purpose of ray tracing is well known. I'm not sure how what you've said contradicts anything I've said, nor do I understand what point you're trying to make with your response.

If your claim is that a majority of average users were complaining that something looked off with the visuals prior to ray tracing, that argument holds no water at all, and there's not an ounce of evidence to support it. On the contrary, a majority complain that they cannot see the difference with RT implementations in their current iteration; however, they can see a huge performance loss, >50% in many cases.

You'll need to clarify the point you're trying to make. As for RT in general, until affordable GPUs can run full RT without significant compromises, it's of no benefit to the average user.
 