Value of Hardware Unboxed benchmarking

What I’m saying is that if Nvidia offered non-RT hardware at a discount, people would buy it, meaning they value RT at less than a few-hundred-buck discount.

Most people would also take less non-RT performance for a few hundred bucks less, which is why the best-selling cards sit lower in the price range.

This is the entire problem with the RT value argument from the perspective that most people don't care. Most people don't feel strongly about non-RT graphics fidelity either, nor do they care about avg fps (especially not another 10 percent). Even the price framing doesn't hold up: in the range most people buy at, <$100 on something you'd use for years isn't something people care about either.

If we're going to approach it from that angle, then apply it consistently. All coverage should be done with that perspective in mind, and not just GPU coverage but other hardware as well.
 
Yes, you said they omitted ray tracing, when they straight up implemented the biggest forms of ray tracing. It's a stepping stone for what's to come. They clearly value it and want it.


The majority of enthusiasts do care about the latest graphics technologies. Because they are enthusiasts, they like to run their games at max settings, which is why they care about ray tracing: it's the ultimate form of max settings.

Enthusiasts care about ray tracing, as evidenced by the huge number of mods enabling ray and path tracing in new and old games, whether through RTX Remix or through post-process shaders.

Enthusiasts care about ray tracing, which is why developers are catering to them by releasing remastered games with integrated ray tracing like The Witcher 3, World of Warcraft, Crysis 1, Crysis 2, Crysis 3, The Dark Pictures Anthology: Man of Medan, etc.

Everywhere I go in enthusiast circles I see people demanding more powerful ray tracing in this game or that game. I even see people demanding path tracing. Heck, people still bitch and moan to this day about the delay of ray tracing for the PC version of GTA V.


RT is popular because of the reasons I listed above.


Guess what? This has been the case for most recent graphics transitions: DX9 to DX10, DX10 to DX11, DX11 to DX12. Nothing is particularly new about the DX12-to-DXR transition; it takes time until every implementation is stellar.

These last few years saw the whole gaming industry grind to a halt progress-wise, with most releases based on last-gen hardware, dozens of remasters and superficial upgrades, and very slow progress toward true next-gen graphics, not to mention the devastating effect of COVID. But we are getting there: most new AAA releases are using ray tracing and even path tracing, and most trailers for upcoming 2025/2026 games and beyond feature good use of ray tracing.
If Treyarch and IW ‘valued raytracing’ they would put it in the actual game instead of in the animated main menu screen lol.

Where does the "few hundred bucks" come from? The 4060 and 4060 Ti are a hundred bucks apart - what makes you think a 4060 Ti without RT would cost less than a 4060?
And I'm not even going into the whole "die size has zero relation to the market price of a product" thing here.


The choice of h/w for prebuilts is made by assessing market demand for that h/w, so if you're saying that Nv's market share comes from people who buy all these prebuilts, then the Nv h/w in those prebuilts reflects the same demand as in the non-prebuilt market.
People like Nvidia GPUs for the same reasons they’ve liked them prior to RTX: they are reliable and consumers know this.

System integrators aren’t going to take a chance on buggy Radeon drivers, and those that did got bitten in the ass by the 5700 XT's black-screen issue.

I mean come on now, people buying 4060 and 3060 builds aren’t building for RT. They’re going Nvidia because of superior reliability and the fact they don’t really have a choice unless they DIY.
 
Most people would also take less non-RT performance for a few hundred bucks less, which is why the best-selling cards sit lower in the price range.

This is the entire problem with the RT value argument from the perspective that most people don't care. Most people don't feel strongly about non-RT graphics fidelity either, nor do they care about avg fps (especially not another 10 percent). Even the price framing doesn't hold up: in the range most people buy at, <$100 on something you'd use for years isn't something people care about either.

If we're going to approach it from that angle, then apply it consistently. All coverage should be done with that perspective in mind, and not just GPU coverage but other hardware as well.
Average people 100% care about fps, which is why the majority of even console players enable the 60 fps mode.

I’m saying that at a given raster performance tier, people would likely purchase an Nvidia product without RT if it yielded a decent discount. For most people I talk to, RT isn't even close to a top priority. They buy RTX hardware because it's usually also the best choice for raster.
 
Enthusiasts aren't more than a drop in the ocean. Some enthusiasts liking something doesn't make it popular.
It's one data point. Hardware sales are another data point, mods are another, remastered games with RT are another, developers' heavy use of the feature is another, consoles doubling down on RT is another.

If you plot out all of these data points, you would come to the correct conclusion.

People like Nvidia GPUs for the same reasons they’ve liked them prior to RTX: they are reliable and consumers know this.
See above.

Also, NVIDIA's market share is at its highest ever; they have practically 88% of the market, which has never happened before. Clearly it's not just the NVIDIA brand at play here.

If Treyarch and IW ‘valued raytracing’ they would put it in the actual game instead of in the animated main menu screen lol.
But they are not completely omitting it like you claimed either; they still see value in it. You also don't know whether they will bring it to Black Ops 6 post-launch or not. People didn't expect the integration in Modern Warfare 3 either, yet it still happened.

Just admit the mistake you've made on this point and move on; there is no need to argue it any further.
 
Average people 100% care about fps, which is why the majority of even console players enable the 60 fps mode.

I’m saying that at a given raster performance tier, people would likely purchase an Nvidia product without RT if it yielded a decent discount. For most people I talk to, RT isn't even close to a top priority. They buy RTX hardware because it's usually also the best choice for raster.

The console performance modes you're discussing go from 30 to 60 fps, a 100% uplift. You will not hypothetically get anywhere near 2x non-RT performance at any tier by swapping the RT hardware out. I mentioned the 10% number for a reason: that's the reality, roughly 33 fps instead of 30, a performance delta most people won't care about.

Also, you can get performance-mode settings on PC, yet coverage is rarely done with them. Is that what people want or isn't it?
 
Enthusiasts rarely care about such ratios; they run their games at max settings even without ray tracing.
What kind of an enthusiast settles for max settings and doesn't edit the config file to at least triple the samples-per-pixel count of the ray-traced effects?? An enthusiast in name only... *scoffs*

Seriously though, games often have some graphics settings that make no sense to turn on with the hardware available at the time of the game's release. Like, who actually plays Cyberpunk with PT and DLAA on a 4K+ monitor, now that the game has DLAA with ray reconstruction as an option in the menu? It runs unplayably even on a 4090.

I like that my games have settings in the menu that are meant for future hardware/fucking around. Test them, use them for photo mode, but if you're actually gonna play a game, don't play at 20 fps, jeez...
 
This is the entire problem with the RT value argument from the perspective that most people don't care. Most people don't feel strongly about non-RT graphics fidelity either, nor do they care about avg fps (especially not another 10 percent). Even the price framing doesn't hold up: in the range most people buy at, <$100 on something you'd use for years isn't something people care about either.

If we're going to approach it from that angle, then apply it consistently. All coverage should be done with that perspective in mind, and not just GPU coverage but other hardware as well.

Yep, this is such an important point. People go on and on about the poor ROI of some RT implementations. Yes, there are some poor attempts at RT usage, but there are many more examples of poor ROI from “ultra” and “max” settings.

If people genuinely cared about ROI, they would be equally concerned with the poor return when increasing most settings from medium to high/ultra. In many cases the visual improvement is negligible.

In terms of the future there is no option for better rasterized lighting; the physics just doesn't work that way. So I don’t really know what people mean when they say they want “more raster”.
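
To spell out the physics I mean: it's the textbook rendering equation, where outgoing light at every visible point is an integral of incoming light over the hemisphere:

\[
L_o(x,\omega_o) = L_e(x,\omega_o) + \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\,(n \cdot \omega_i)\,\mathrm{d}\omega_i
\]

The L_i term is radiance arriving from arbitrary directions, i.e. visibility queries into the scene from arbitrary points. A rasterizer only ever projects triangles to one viewpoint at a time, so it can only approximate that term via precomputation (baked lightmaps, probes) or screen-space guesses. That's the sense in which "more raster" can't buy better light transport.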
 
It's one data point, hardware sales is another data point, mods is another, remastered games with RT is another, developers heavy use of the feature is another, consoles doubling down on RT is another.

If you plot out all of these data points you would come up with the correct conclusion.


See above.

Also the status of NVIDIA's market share is the highest ever, they practically have 88% market share which never happened before, clearly it's not just the NVIDIA brand at play here.


But they are not completely omitting it like you claimed either, they still have value for it. You also don't know whether they will bring it to Black Ops 6 post launch or not. People didn't expect the integration in Modern Warfare 3 either, yet it still happened.

Just admit the mistake you've made in this point and move on, there is no need to argue this point any further.
‘Admit the mistake’? Do you even play Call of Duty? The game effectively has no ray tracing; you spend like 1% of your time in the menu and the shooting range.

The fact is 3Arch went from implementing RT as a major feature in CW to including it as a menu-only option in Blops 6. That isn’t advancement lol.
 
And if devs want to make UE5 slop because it’s easier, then I won’t be buying them lol, and considering the response to Outlaws I think a lot of people agree.

I like RT when it’s impactful but in very few titles is it really impactful.
And then you have Silent Hill 2 becoming the fastest-selling game in the series.

Outlaws doesn't even use UE5, and it's one of the best usages of RT of the kind you desire. Its failure has little to do with its rendering features.
 
So I don’t really know what people mean when they say they want “more raster”.
I'm not picking on what you've written at all, just quoting it to frame a point I think is always far too easily lost, especially in a thread like this.

Ray tracing is just one (very valuable) tool in an incredibly varied, versatile real-time rendering toolbox, and we're a long way away from it kicking out every other rendering technique or use of GPU hardware, especially in real-time games. There's plenty of life left in non-ray-traced real-time rendering techniques, despite what it might feel like sometimes, so it's a matter of how everything is combined together to the benefit of the whole. That goes for the evolution of both hardware and software.

Anyone sat there thinking the hardware or software for non-ray-traced real-time graphics processing is a solved problem and needs no further work to become "more" (even if just limiting themselves to thinking about the balance of perf in HW, or some other blinkered view) isn't thinking about it properly. The same applies to ray tracing of course, both hardware and software.

So "more raster" makes sense to me, as the catch all bucket for things in graphics that aren't ray tracing. I want more of both myself, and I think everyone should.

However, that binary bucketing of real-time graphics into RT and non-RT parts is such an overly simplified way of thinking about it, to the point where it clearly actively harms reasoning (and here on Beyond3D, discourse) about the whole. We could do with just moving past that to think and talk about the much more complicated interplay of everything that constitutes real-time graphics together.

It's become far too much like two-party politics in how it's discussed and debated. Both sides have good ideas about the future and plenty of room to become "more", and also help each other be better together.
 
So "more raster" makes sense to me, as the catch all bucket for things in graphics that aren't ray tracing. I want more of both myself, and I think everyone should.

I know most people refer to everything non-RT as raster, but I think it muddies the conversation. Better material shaders, for example, aren't a rasterization improvement. What I think people really mean when they say they want better raster is that they want better textures, better shaders, better animation, etc., all things that are orthogonal to the raster-vs-RT debate.

However, when it comes to the problem of light transport that RT specifically addresses, I don’t know how raster helps there, unless we go back to baking everything, except that’s not rasterization either.

If we deem every other aspect of rendering to be “rasterization”, I don’t think we can have a real conversation. I think the folks here at B3D who are excited about RT and skeptical about further advances in rasterization are aware that texturing, shading, etc. are also integral to RT’s evolution.

So I’m still in the same boat, not understanding what people mean by “more raster”. Is there some raster technique out there that has the potential to give us better shadows, reflections, and GI? And by raster I mean throwing triangles at a rasterizer vs. casting rays at triangles.
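
To make that last distinction concrete, here's a minimal toy sketch (plain C++, not any engine's code) of the two loop orders, assuming an orthographic camera and a one-triangle "scene":

// Toy illustration of the loop-order difference:
// raster   = "for each triangle, which pixels does it cover?"
// ray cast = "for each pixel, which triangle does its ray hit?"
#include <cstdio>

struct Vec2 { float x, y; };
struct Tri  { Vec2 a, b, c; };

// Signed-area edge function: >= 0 when p lies left of edge v0 -> v1 (CCW winding).
static float edge(Vec2 v0, Vec2 v1, Vec2 p) {
    return (v1.x - v0.x) * (p.y - v0.y) - (v1.y - v0.y) * (p.x - v0.x);
}

static bool inside(const Tri& t, Vec2 p) {
    return edge(t.a, t.b, p) >= 0 && edge(t.b, t.c, p) >= 0 && edge(t.c, t.a, p) >= 0;
}

int main() {
    const int W = 16, H = 8;
    const Tri tris[] = { {{1, 1}, {14, 2}, {5, 7}} };  // one CCW triangle
    char raster[H][W], rays[H][W];

    // Rasterization order: outer loop over triangles, inner loop over pixels.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            raster[y][x] = '.';
    for (const Tri& t : tris)
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x)
                if (inside(t, {x + 0.5f, y + 0.5f})) raster[y][x] = '#';

    // Ray-casting order: outer loop over pixels, each pixel tests every triangle.
    // With an orthographic "camera" the hit test degenerates to the same coverage
    // math -- the distinction really is which loop sits on the outside.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            rays[y][x] = '.';
            for (const Tri& t : tris)
                if (inside(t, {x + 0.5f, y + 0.5f})) rays[y][x] = '#';
        }

    // Print both framebuffers side by side: identical output.
    for (int y = 0; y < H; ++y) {
        fwrite(raster[y], 1, W, stdout);
        printf("  ");
        fwrite(rays[y], 1, W, stdout);
        printf("\n");
    }
    return 0;
}

Same image either way in this degenerate case; what differs is which loop is outermost. The ray-per-pixel ordering is what generalizes to arbitrary queries like "what does this reflected ray see", which the triangle-first ordering can't answer.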
 
I know most people refer to everything non-RT as raster but I think it muddies the conversation. Better material shaders for example isn't a rasterization improvement. What I think people really mean when they say they want better raster is that they want better textures, better shaders, better animation etc. all things that are orthogonal to the raster vs RT debate.

However when it comes to the problem of light transport that RT specifically addresses I don’t know how raster helps there unless we go back to baked everything except that’s not rasterization either.

If we deem every other aspect of rendering to be “rasterization” I don’t think we can have a real conversation. I think the folks here at B3D who are excited about RT and skeptical about further advances in rasterization are aware that texturing, shading etc are also integral to RTs evolution.

So I’m still in the same boat not understanding what people mean by “more raster”. Is there some raster technique out there that has the potential to give us better shadows, reflections and GI? And by raster I mean throwing triangles at a rasterizer vs casting rays at triangles.
I think it mostly stems from them comparing a higher-performing title with no RT that they find better looking than a variety of lower-performing titles with RT. Instead, people should be desiring better RT implementations.
 
HUB because they don't like Nvidia like they want them to.
For me it's not that I think they don't like Nvidia; it's that they tell people to buy AMD GPUs, then use Nvidia in their own personal systems, then make excuses down the track when the hardware they recommended didn't pan out. Steve even put Nvidia hardware in his daughter's PC, predominantly for Fortnite, and when asked why in a Q&A said something like "I just had the GPU sitting there" (it was like a 3080 Ti, I think). So he didn't have an RDNA GPU sitting around?

"RT sucks, upscaling sucks, less than 16 GB of VRAM sucks, but I'm gonna use Nvidia while you should totally use AMD." That's how it comes across; money where your mouth is and all that.
 
Yes, you said they omitted ray tracing, when they straight up implemented the biggest forms of ray tracing, it's a stepping stone of what's to come. They clearly value it and want it.
There are no doubts about the future, but what do you want to do now? Benchmark a game's menu?
 
For me it's not that I think they don't like Nvidia; it's that they tell people to buy AMD GPUs, then use Nvidia in their own personal systems, then make excuses down the track when the hardware they recommended didn't pan out. Steve even put Nvidia hardware in his daughter's PC, predominantly for Fortnite, and when asked why in a Q&A said something like "I just had the GPU sitting there" (it was like a 3080 Ti, I think). So he didn't have an RDNA GPU sitting around?

"RT sucks, upscaling sucks, less than 16 GB of VRAM sucks, but I'm gonna use Nvidia while you should totally use AMD." That's how it comes across; money where your mouth is and all that.
HUB recommend Nvidia GPUs all the time and have never stopped praising DLSS for having superior image quality to FSR and XeSS.
 
For me it's not that I think they don't like Nvidia; it's that they tell people to buy AMD GPUs, then use Nvidia in their own personal systems, then make excuses down the track when the hardware they recommended didn't pan out. Steve even put Nvidia hardware in his daughter's PC, predominantly for Fortnite, and when asked why in a Q&A said something like "I just had the GPU sitting there" (it was like a 3080 Ti, I think). So he didn't have an RDNA GPU sitting around?

"RT sucks, upscaling sucks, less than 16 GB of VRAM sucks, but I'm gonna use Nvidia while you should totally use AMD." That's how it comes across; money where your mouth is and all that.
They have rarely recommended AMD GPUs at all for many years, actually. You are completely out of touch with reality.
 
I think it mostly stems from them comparing a higher-performing title with no RT that they find better looking than a variety of lower-performing titles with RT. Instead, people should be desiring better RT implementations.

I think there are lots of people who see SSR and baked GI (or no GI) as good enough and want GPU cycles to be spent elsewhere. So it’s less about wanting to solve those problems with raster and more that they’re not that bothered about those problems in the first place.
 
What kind of an enthusiast settles for max settings and doesn't edit the config file to at least triple the samples-per-pixel count of the ray-traced effects?? An enthusiast in name only... *scoffs*
I did exactly that. In my current playthrough of The Witcher 3, I downloaded multiple mods that increase the quality and sample count of ray-traced reflections, shadows, and global illumination from local lights. Most of these mods have been downloaded about 22k times (from Nexus Mods); the least-downloaded one is at 6.5k.

When I played Hogwarts Legacy I downloaded a similar mod that has been downloaded 36k times. There is clearly a big enthusiast audience for such mods. There is simply no shortage of people hungry for more graphics, especially on PC.

We could do with just moving past that to think and talk about the much more complicated interplay of everything that constitutes real-time graphics together
I fully agree with this. Ray tracing is just the latest version of "Ultra" settings, a step above ultra if you will. Six years after its introduction it shouldn't be treated as a separate entity, but rather as part of the usual max-settings options. Heavy max settings existed long before ray tracing and continue to exist to this day; I don't see any reason to separate max settings into two groups (RT and non-RT), as all games are a hybrid of the two anyway.

It's as you said: games should be evaluated as a whole package. Game X delivered a certain visual quality while performing at a certain fps vs. game Y, which did the same while performing at a lower fps, etc.
 
I believe a lot of the debate around RT will simmer down when it’s cheaper to get running well. Right now only one vendor really gives you enough RT performance to matter, and they charge quite a bit. The day people can buy cheap Nvidia/AMD/Intel (maybe) GPUs that can run things like path tracing without crazy upscaling sacrifices will be the day RT comes into its own, coupled with a console cycle that focuses on it. Right now it’s basically a feature for high-end 40/30-series owners, which isn’t all that interesting imo, and I say that as someone who did enjoy using RT in some titles and who currently uses a 3080 Ti.
 
HUB recommend Nvidia GPUs all the time and have never stopped praising DLSS for having superior image quality to FSR and XeSS.
Yeah, once FSR and XeSS came out. Before then it was "screw RT and upscaling, you shouldn't need it, 4K native or go home". Unfortunately for them, eventually you do have to come to terms with reality.
They have rarely recommended AMD GPUs at all for many years, actually. You are completely out of touch with reality.
To be fair, you were possibly right, because I did stop watching them after the Ampere/RDNA 2 releases, so I went and looked at their recommended-GPU video from two months ago to see if you're actually right or not. In the $1,000 segment they say the 4080 Super is the superior pick, and that if you don't care about Nvidia's features the 7900 XTX is an OK buy because it's like 50 USD less. We can call it a draw here, although if you're dropping $1k on a GPU, is $50 really a deal breaker?

In the next segment he mentions the 7900 XT being faster at native 4K as a win, then mentions that overall the 4070 Ti is 15% more expensive but 16% faster across all games, then says he leans towards the 7900 XT here because it provides more value... I didn't watch any further, because I think that confirmed I'm not out of touch with reality. Don't worry, I don't expect an apology.
 