Value of Hardware Unboxed benchmarking

Would you rather have an exterior product :D
What I would definitely be interested in is a keyboard with this thing called AI that would be capable of actually understanding the context when making suggestions while I type.
It would also help some of those with real intelligence I suppose.
 
Please try and stick to the forum guidelines about addressing the post rather than the person.

I think we can agree that HUB brought up an interesting topic about how well BMG works on systems without resizeable BAR, and one worth putting in front of consumers, even if it shouldn’t have been a huge surprise. Hopefully the topic can (positively and curiously) move on to how they could have improved the value of that analysis.
 
I don't believe so. People were arguing ray tracing would be adopted slowly, but in practice that was never the case: it was rapidly adopted by most engines, most AAA titles, and dozens of console titles, to the point that we've now moved on to path tracing, and the list of path-traced titles is growing rapidly. Even Mark Cerny is surprised by this accelerated pace of adoption.
Again, I'll reference the 19K games released on Steam in 2024. Not even 1% have DLSS, let alone RT. Nothing about the adoption is rapid at all.
Again, that's not practical. Average GPUs can't even run rasterized games at max settings at 1080p60 (native). You need upscaling and frame generation to do that, especially with UE5 games. Why do you expect ray tracing to act differently?
But I wasn't talking about max settings, so I'm not sure why you're equating full RT with max settings. Full RT would not be max settings, as the ray counts would dictate quality.
A 4070 can do path tracing at 1080p60 with upscaling and frame generation, if that's your definition of an average GPU. Medium ray tracing works very well on the 4070, enough for 1440p and high fps.
The 4070 is not average at all. The x60 class, or the $249-$399 price point, seems to be the most popular price point. As for the 4070 being a path tracing GPU, I strongly disagree, having used one for a while.
 
I think we can agree that HUB brought up an interesting topic about how well BMG works on systems without resizeable BAR, and one worth putting in front of consumers, even if it shouldn’t have been a huge surprise. Hopefully the topic can (positively and curiously) move on to how they could have improved the value of that analysis.
It doesn't appear to be a ReBAR problem. The B580 is also slow on a Ryzen 3600 and even a 5600 (compared to AMD/NV), not nearly as bad as on the 2600, but both of those CPUs definitely support ReBAR. They also showed that the 2600 loses even more when ReBAR is disabled, so I think ReBAR is working on the 2600.
 
Again, I'll reference the 19K games released on Steam in 2024. Not even 1% have DLSS, let alone RT. Nothing about the adoption is rapid at all.
Again, 99% of these games don't need more than a DX9 GPU. They don't even need a 1030 to work. When we talk about adoption, we are talking about AAA or AA games with advanced graphics. You are mixing different things together.

But I wasn't talking about max settings, so I'm not sure why you're equating full RT with max settings. Full RT would not be max settings, as the ray counts would dictate quality.
Full RT is by definition the max setting of them all. It's the most taxing and most visually impactful setting.

The 4070 is not average at all. The x60 class, or the $249-$399 price point, seems to be the most popular price point
You said you are not referencing the 4060 as the average GPU, yet here you are talking about the 4060 price point.

for the 4070 being a path tracing GPU, I strongly disagree, having used one for a while.
I played Portal RTX (the heaviest path-traced title) on the 4070 at 1080p DLSS Quality; it was almost locked at 60 fps, and with Frame Generation it went past that with ease, so I know what I am talking about.

Cyberpunk path tracing works very well on a 3080 at 1080p DLSS Quality. The 3080 is a 4070 class GPU.
 
Again, 99% of these games don't need more than a DX9 GPU. They don't even need a 1030 to work. When we talk about adoption, we are talking about AAA or AA games with advanced graphics. You are mixing different things together.
Ok, that's not the point. You said that the features were rapidly adopted and that's not true. You should qualify your statements as I cannot be expected to know the context in which you're speaking.
Full RT is by definition the max setting of them all. It's the most taxing and most visually impactful setting.
It's the most taxing setting now because the GPUs of today cannot provide the granularity required to push RT further. There will come a time when even the most affordable GPUs can do the most basic form of full RT and the differentiation will be things like ray counts, etc.
You said you are not referencing the 4060 as the average GPU, yet here you are talking about the 4060 price point.
No I did not, I said:
I never specified that the 4060 in general needs to be the one that allows average users to run full RT
I earlier specified that the average GPU needed to do full RT to be relevant to most users. That is a forward-looking statement, meaning that I expect that in about 2-3 generations, that tier of GPU will be able to do full RT.
I played Portal RTX (the heaviest path-traced title) on the 4070 at 1080p DLSS Quality; it was almost locked at 60 fps, and with Frame Generation it went past that with ease, so I know what I am talking about.
So not 1080p and not 60fps?
Cyberpunk path tracing works very well on a 3080 at 1080p DLSS Quality. The 3080 is a 4070 class GPU.
No it's not. The OG 4070 is weaker than the 3080 10GB. This is something I tested extensively. Also, the 3080 doesn't work well at DLSS Quality; the image quality is poor and the frame rate is unstable.
 
Ok, that's not the point. You said that the features were rapidly adopted and that's not true.

If you've been around this scene for a while you know exactly what he meant. It takes forever for new graphics tech to be adopted by high-profile games and engines, let alone by the mass-produced fodder you find on Steam. How many games are using mesh shaders, sampler feedback, or VRS? DX12 as a whole has had a very slow ramp.

The adoption of RT (and upscaling) has been incredibly fast given the relatively short time they’ve been on the market.
 
It's the most taxing setting now because the GPUs of today cannot provide the granularity required to push RT further. There will come a time when even the most affordable GPUs can do the most basic form of full RT and the differentiation will be things like ray counts, etc.
That's an interesting point. How well do entry-level GPUs cope with RT? Are we really getting anywhere close? I don't think so and I'm not convinced entry-level GPUs will do 'full RT' any time soon, but maybe the data plays that out differently. I keep hearing RT brings the top-end GPUs to their knees, so you'd need current top-line performance in an entry-level card, and I don't think the silicon will scale down that low.
 
That's an interesting point. How well do entry-level GPUs cope with RT? Are we really getting anywhere close? I don't think so and I'm not convinced entry-level GPUs will do 'full RT' any time soon, but maybe the data plays that out differently.
And then we devolve into the pedantry of "what is full RT" and start circling the inevitable bowl of "even a 4090 can't do real RT because everything is a hack!"

I might suggest a different route: the entire reason games are offering ANY AMOUNT of raytracing today is because even the low-tier cards two generations ago could support and accelerate the basic BVH implementation, even if it's only for one specific effect and not literally calculating every virtual photon in the rastered scene. In my opinion, this is not dissimilar to how pixel shading started... The earliest pixel shaders were pretty simple, maybe a few instructions tops, and were only available on a scant selection of cards. Multiple generations of games came along without any pixel shading abilities, because the implementation could be quite slow even on the newest cards, and the hardware had only been available for a generation or two.

Only after almost a decade of pixel shading hardware being available did most games finally start really using pixel shaders, and the craziest of those shaders are still being developed to this day, two and a half decades after DX8 released the PS1 profile.
 
You said that the features were rapidly adopted and that's not true
In 6 years, we have 600 games with DLSS. We have more than 220 games with ray tracing/path tracing. Not counting mods and RTX Remixes. This is literally the fastest adoption of a DirectX feature in recent history.

There will come a time when even the most affordable GPUs can do the most basic form of full RT
Yeah, I agree. I already said that a 6060 will handle Cyberpunk and Alan Wake 2 path tracing, even at 1440p60, but it won't handle max full RT for Cyberpunk 2 and Alan Wake 3. There will always be higher path tracing to chase (more rays per pixel, more path-traced parts of the pipeline, more complex materials to path trace against, etc.).
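
To make that concrete, here's a rough back-of-envelope sketch in Python (my own illustrative numbers, not from any particular engine or title) of how the ray budget scales with resolution, rays per pixel, and bounce count, which is why there's always a heavier "full RT" to chase:

```python
# Back-of-envelope only: total rays per second = pixels * rays per pixel * bounces * fps.
# Every extra ray per pixel or extra bounce scales the budget linearly, so "full RT"
# keeps getting heavier even as GPUs get faster.

def rays_per_second(width, height, rays_per_pixel, bounces, fps):
    return width * height * rays_per_pixel * bounces * fps

# 1080p, 1 ray/pixel, 2 bounces, 60 fps -> ~0.25 gigarays/s
print(rays_per_second(1920, 1080, 1, 2, 60) / 1e9)

# 4K, 4 rays/pixel, 4 bounces, 60 fps -> ~8 gigarays/s
print(rays_per_second(3840, 2160, 4, 4, 60) / 1e9)
```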

No it's not. The OG 4070 is weaker than the 3080 10GB.
Slightly weaker in raster, but much faster in path tracing.
 
Let the meltdown begin. I can feel it coming. Going to be a lot of talk about performance numbers including frame generation.
You know, I don't care. Fake graphics are still fake. Rasterizing has nothing to do with "real"ity. Getting two or three additional frames for free (same latency) makes it a no-brainer. The only question is quality. But when one of these frames is only displayed for 5 ms or less, I guess quality is not really a problem.
 
You know, I don't care. Fake graphics are still fake. Rasterizing has nothing to do with "real"ity. Getting two or three additional frames for free (same latency) makes it a no-brainer. The only question is quality. But when one of these frames is only displayed for 5 ms or less, I guess quality is not really a problem.

I do and I don't. I have no issue with frame generation. I'll use it to hit 240 Hz, for sure. I do think it's a bit weird to compare performance this way, but discounting the feature is stupid too. Being able to generate 3 frames instead of 1 is a big improvement, assuming the quality is at least as good. It will make 480 Hz displays a lot more viable for that smooth, smooth gameplay.
 
It's the discussion from two years ago. Reflex reduces latency without increasing frames. Does that make the non-Reflex mode less real? All these YouTubers and complainers never explained why they're still using just FPS as a metric and not FPS/latency. 60 FPS with 30 ms input lag processes twice the input information of 60 FPS with 60 ms input lag, so the 60 ms case would have only half as many "real" frames...
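
Just to illustrate that last bit of arithmetic, here's a tiny Python sketch using a made-up combined metric (my own invention, not anything reviewers actually publish): two setups with identical FPS but different input lag come out very differently once latency is in the denominator.

```python
# Illustrative only: an arbitrary "frames per second of input delay" metric,
# showing how equal FPS can hide a 2x difference in responsiveness.

def responsiveness(fps, input_lag_ms):
    return fps / (input_lag_ms / 1000.0)

print(responsiveness(60, 30))  # 2000.0 -> 60 FPS at 30 ms lag
print(responsiveness(60, 60))  # 1000.0 -> same FPS, half the responsiveness
```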
 
I do and I don't. I have no issue with frame generation. I'll use it to hit 240 Hz, for sure. I do think it's a bit weird to compare performance this way, but discounting the feature is stupid too. Being able to generate 3 frames instead of 1 is a big improvement, assuming the quality is at least as good. It will make 480 Hz displays a lot more viable for that smooth, smooth gameplay.

The broader, overarching issue we've been having is whether we should even be focused on performance in terms of benchmark numbers or on user experience.

Testing with performance numbers at the onset, and the evolution of that testing, was meant to convey and compare user experience, but it later took on a life of its own, with people comparing benchmark numbers for benchmark numbers' sake.

What we're facing now is a growing disconnect between actual user experience and the now-traditional benchmarking numbers. What I'm seeing is that a lot of the "old guard" (for lack of a better term) community, whether supposed reviewers or consumers of that content, are having trouble reconciling with that.

At one point in the past we moved on from synthetic benchmarks to game testing, and then to frame-time data rather than just average fps. We might now need to move on from just looking at how many raw frames are spit out.
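
As a minimal sketch of what "more frame data vs just avg fps" buys you (invented frame times, not real measurements), here's how two runs with the same average can have wildly different 1% lows:

```python
# Illustrative only: two invented frame-time traces with near-identical averages
# but very different worst-case behavior, which is what 1% lows capture.

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # the worst 1% of frames
    return 1000.0 * n / sum(worst[:n])

smooth  = [16.7] * 1000                            # steady ~60 fps
stutter = [15.0] * 990 + [180.0] * 10              # similar average, big hitches

print(avg_fps(smooth),  one_percent_low_fps(smooth))   # ~59.9 / ~59.9
print(avg_fps(stutter), one_percent_low_fps(stutter))  # ~60.1 / ~5.6
```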
 
There's already a huge disconnect between how people use their cards and how reviewers test them. Actual usage patterns among consumers (upscaling, frame gen) are probably a year or two ahead of the benchmark scene that's sticking to raw fps.

I think raw fps is still a very important metric but GPU reviews will lose relevance over time if they don’t also cover how people actually use their GPUs.
 