Value of Hardware Unboxed benchmarking *spawn

If you watch all their videos all the time, they are.
If you've just read a review of theirs, then not really.

User error. Every person has preferences. If you're taking reviews at face value in spite of your own preferences, that's on you. If you want an all-encompassing review that treats all things equally, then do the research and find a reviewer that reviews that way. If you have your own preferences and you want a review that biases the same way, then do the research and find a reviewer that shares your preferences. Not all reviews have to cater to all people. I actually think they're much better when they don't.
 
User error. Every person has preferences. If you're taking reviews at face value in spite of your own preferences, that's on you. If you want an all-encompassing review that treats all things equally, then do the research and find a reviewer that reviews that way. If you have your own preferences and you want a review that biases the same way, then do the research and find a reviewer that shares your preferences. Not all reviews have to cater to all people. I actually think they're much better when they don't.
I do my research, which is also why I've stopped watching bad reviews from Steve. But not everyone has the time to do the same.
 
Advertising products based on their own preferences is "deceitful".

I think Linkin Park is bottom-of-the-barrel trash music with little to no redeeming value. I wish people listened to things that I think are much much better.

Is that a deceitful opinion? If you think yes, you've lost the plot. People like and value different things and that's great. It makes reviews better, not worse. You can find someone that's generally like-minded and you'll be able to extract more value from their reviews. If Hardware Unboxed doesn't give you what you want, why are you watching them?

The only issue is when people misrepresent things. If HU said a gpu was lacking a feature when it actually had it, that would be a problem. If they said a gpu performed a particular way and those results were not reproducible within some margin of error, that would be a problem. If they tested gpus with different settings to make one look better than another, that would be a problem. If they were accepting payment in exchange for positive impressions, that would be a problem. If they don't include a lot of ray tracing in their reviews because they don't like it, that is NOT a problem.
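To illustrate the reproducibility point, here's a minimal sketch (my own illustrative Python; the ±3% tolerance is an arbitrary assumption, not a threshold HU or any reviewer actually uses) of checking whether a re-run lands within some margin of error of a published figure:

```python
# Minimal sketch of a reproducibility check between a published benchmark result
# and an independent re-run (the tolerance is an arbitrary assumption for illustration).
def reproducible(published_avg_fps, rerun_fps_samples, tolerance=0.03):
    rerun_avg = sum(rerun_fps_samples) / len(rerun_fps_samples)
    return abs(rerun_avg - published_avg_fps) / published_avg_fps <= tolerance

# Example: a reviewer publishes a 120 fps average; our re-runs land close enough.
print(reproducible(120.0, [118.2, 121.5, 119.8]))   # True  (within 3%)
print(reproducible(120.0, [104.0, 106.5, 105.2]))   # False (outside 3%)
```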
 
They’re open about their preferences.

Yep Tim and Steve clearly have different preferences and priorities when it comes to GPUs and gaming and they were very clear about their respective views in the video. The interaction between them was really interesting.
 
I think Linkin Park is bottom-of-the-barrel trash music with little to no redeeming value. I wish people listened to things that I think are much much better.

Is that a deceitful opinion? If you think yes, you've lost the plot. People like and value different things and that's great. It makes reviews better, not worse. You can find someone that's generally like-minded and you'll be able to extract more value from their reviews. If Hardware Unboxed doesn't give you what you want, why are you watching them?

The only issue is when people misrepresent things. If HU said a gpu was lacking a feature when it actually had it, that would be a problem. If they said a gpu performed a particular way and those results were not reproducible within some margin of error, that would be a problem. If they tested gpus with different settings to make one look better than another, that would be a problem. If they were accepting payment in exchange for positive impressions, that would be a problem. If they don't include a lot of ray tracing in their reviews because they don't like it, that is NOT a problem.
His settings are biased. So a GPU which lacks proper upscaling, latency-reduction software and better raytracing support will look better than an objectively better product.

There is a difference between liking music (the art) and liking the instruments. And claiming that lower latency and higher framerates are a priority while recommending products which lack the software that enables lower latency and higher framerates makes it very clear that this priority doesn't really exist.
 
His settings are biased. So a GPU which lacks proper upscaling, latency-reduction software and better raytracing support will look better than an objectively better product.

There is a difference between liking music (the art) and liking the instruments. And claiming that lower latency and higher framerates are a priority while recommending products which lack the software that enables lower latency and higher framerates makes it very clear that this priority doesn't really exist.

What do you mean "proper" upscaling? Is DLSS the only "proper" upscaling? Plus, they've highlighted that DLSS is better than other options repeatedly, from what I've seen.

I think there are good arguments to criticize their discussion of latency back when Nvidia had Reflex and AMD had not yet released Anti-Lag+.

better raytracing support will look better than an objectively better product.

I don't know exactly what this means, so I'm just going to talk about raytracing support. Raytracing is a visibility test with pros and cons. You can have complex visibility tests, but normally at the expense of resolution and performance, and with added noise. It can provide elements of realism in light "transport" that other methods cannot handle as accurately. Realism is not objectively better. In fact some people have very good arguments about why realism is bad for graphics.
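To make the "visibility test" framing concrete, here's a minimal sketch (my own illustrative Python, not code from any real renderer): a shadow ray asks whether the straight line between a surface point and a light is blocked, which is exactly the kind of query raytracing hardware accelerates.

```python
# A minimal sketch of raytracing as a visibility test (illustrative only):
# cast a shadow ray from a surface point toward a light and check whether
# any sphere in the scene occludes it.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c            # direction is assumed normalized, so a = 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None    # small epsilon avoids self-intersection

def visible(point, light, spheres):
    """Visibility test: is the straight line from point to light unobstructed?"""
    to_light = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    for center, radius in spheres:
        t = ray_sphere_hit(point, direction, center, radius)
        if t is not None and t < dist:
            return False              # something blocks the light: point is in shadow
    return True

# Example: one sphere sits between the first shaded point and the light.
print(visible((0, 0, 0), (0, 10, 0), [((0, 5, 0), 1.0)]))   # False (occluded)
print(visible((5, 0, 0), (5, 10, 0), [((0, 5, 0), 1.0)]))   # True  (visible)
```

A path tracer fires enormous numbers of these queries per pixel, which is where the resolution/performance/noise trade-offs above come from.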

See this thread for a discussion of image formation vs. physically based rendering, etc. (read back through more of the thread than just the thumbnail)
 
Realism is not objectively better. In fact some people have very good arguments about why realism is bad for graphics.
This sentence was poorly framed and I'm not sure you meant what you wrote. The pursuit of realism has always been the main driving force behind computer graphics. The fact that some games choose stylized art forms is an orthogonal issue -- even for stylized games you want physically correct light transport modeling (see Fortnite). Even if you forget about light transport and material simulation and simply think about resolution and framerate, those are also about getting as close to realism as possible by removing sampling artifacts. And yes, hardware constraints can and do force us to use approximations, and balance the various aspects of realism. But overall realism isn't just objectively better, it is THE objective.
 
This sentence was poorly framed and I'm not sure you meant what you wrote. The pursuit of realism has always been the main driving force behind computer graphics. The fact that some games choose stylized art forms is an orthogonal issue -- even for stylized games you want physically correct light transport modeling (see Fortnite). Even if you forget about light transport and material simulation and simply think about resolution and framerate, those are also about getting as close to realism as possible by removing sampling artifacts. And yes, hardware constraints can and do force us to use approximations, and balance the various aspects of realism. But overall realism isn't just objectively better, it is THE objective.

What I'm saying is whether someone actually cares about raytracing right now when they buy a gpu is subjective. You can argue that the feature in hardware is objectively better ... maybe ... but raytracing as it's implemented comes with pros and cons, which everyone is going to evaluate subjectively.

Also realism being objectively better and THE objective is just not true. The capability to do realism is an objective, and path tracing is the best known way to go there. There is another objective which is a fully programmable gpu that does not require any kind of a fixed pipeline, so all of the compute power can be used in any manner that developers want.

Is Spider-Man: Into the Spider-Verse realism? It is when it wants to be, but mostly not. Is it "objectively" worse than some other digital animation that favours realism? If you go through the twitter thread I posted, one of the comments mentions filmmakers being aware that realism is not the primary goal of image formation. Film sets add lights and intentionally bounce, block and change the colour of light. Yes, they're playing with light in the real world, because it's the physical world and they have to. But a lot of filmmakers/cinematographers will alter lighting in post to achieve a look that is not real.

Would a realistic game look better than a game that looked like some kind of impressionist painting? I would definitely be into trying impressionist games. Would that require physically correct light transport? Nope.

So to bring this back to gpu reviews, if you're arguing that reviews must cater to raytracing because it's objectively better, especially in how it's implemented, that's just plain wrong. And even if you could somehow demonstrate that it was objectively better and THE objective, it doesn't matter because no one has to like it. Some people prefer 2.5D pixel-art games. They could do a review that says some $100 gpu is the best buy because it plays all of the current 2.5D games really well and everything else is a waste. That would be a completely valid and useful review for people who play those games.
 
What I'm saying is whether someone actually cares about raytracing right now when they buy a gpu is subjective. You can argue that the feature in hardware is objectively better ... maybe ... but raytracing as it's implemented comes with pros and cons, which everyone is going to evaluate subjectively.

Also realism being objectively better and THE objective is just not true. The capability to do realism is an objective, and path tracing is the best known way to go there. There is another objective which is a fully programmable gpu that does not require any kind of a fixed pipeline, so all of the compute power can be used in any manner that developers want.

Is Spider-Man: Into the Spider-Verse realism? It is when it wants to be, but mostly not. Is it "objectively" worse than some other digital animation that favours realism? If you go through the twitter thread I posted, one of the comments mentions filmmakers being aware that realism is not the primary goal of image formation. Film sets add lights and intentionally bounce, block and change the colour of light. Yes, they're playing with light in the real world, because it's the physical world and they have to. But a lot of filmmakers/cinematographers will alter lighting in post to achieve a look that is not real.

Would a realistic game look better than a game that looked like some kind of impressionist painting? I would definitely be into trying impressionist games. Would that require physically correct light transport? Nope.

So to bring this back to gpu reviews, if you're arguing that reviews must cater to raytracing because it's objectively better, especially in how it's implemented, that's just plain wrong. And even if you could somehow demonstrate that it was objectively better and THE objective, it doesn't matter because no one has to like it. Some people prefer 2.5D pixel-art games. They could do a review that says some $100 gpu is the best buy because it plays all of the current 2.5D games really well and everything else is a waste. That would be a completely valid and useful review for people who play those games.
To be clear -- I'm not taking any position on the GPU review discussion.

My objection was focused solely on your comment on realism vs. computer graphics.

I'm also not debating the role of artistic manipulation of light in service of an artistic vision. Computer graphics seeks to model real life, and yes the same artistic manipulation of light is needed within a virtual world as well to serve an artistic vision. Having a different set of rules for light behavior in the virtual world does not help that artistic vision, especially when those rules may change from engine-to-engine depending on what hacks that specific engine chose to implement.

Edit: FWIW I also think that your assertion that path tracing is the best-known way to get realism is an over-simplification. Path tracing is the best-known way to model light transport. But as I said in my prev. post, light transport is not the only attribute of realism -- you need geometry, material modeling, and adequate spatial (resolution) and temporal (fps) sampling. That said, IMO light transport was in pretty bad shape before the recent RT/PT push and I do support the direction. Still, the other attributes are important, and you can't max out everything due to hardware constraints so you have to balance. But everything is in service of realistic physical simulation and the closest facsimile of realism that we can achieve within today's constraints.
 
Reaching in what way? It’s not necessary to try to read Steve’s mind. His words and the opinions that he shares in his reviews can speak for themselves.

Considering his words and tone in context is not 'mind-reading' though. The implication you're making with "holy shit Steve actually admitted to thinking ray tracing sucks! Look at his words saying exactly that!" is that he fundamentally disregards RT as having any worth whatsoever and thus all of HUB's reviews are bunk, when with that wording he was quite clearly presenting a hyperbolic characterization of his views based on the games he plays.

I mean I think RT has definite value, I think Nvidia is doing a lot to advance the industry, but in terms of my actual gameplay experience with my 3060, it 'may not exist' for me either. I mean there's Metro Exodus, and maybe Doom Eternal with RT - and that's my exposure to RT in my gaming life really. If I had, say, at least a 4080, that would be different, but when they're starting at $1600 CAD, it's still ultimately a 'future' technology for me atm.

As I've said before, I wish these kinds of sites would review/explain the technology more often than they do, but I also get that, for better and worse, they're fundamentally just a different type of vehicle than, say, Digital Foundry. Their focus is always on the deliverable end product and value per $, so it's not completely out of pocket that they're not going to devote that much coverage to technologies where they can only really start to strut their stuff with $1000+ products.

If your point is that his bias doesn't influence his work, well, I wouldn't put money on that.

No, I think it quite clearly does, and I think to some extent, Steve knows that as well, hence the constant deferrals to Tim. I think without a doubt Steve's bias played far more into his earlier reviews when Tim was just starting out on the channel, and HUB's work - while still spotty and click-baity - has increased in quality with Tim's increasing visibility. The early reviews where not even a single game was benchmarked with RT, even in isolation, were indeed ridiculous, as was effectively ignoring DLSS well past the point where its prevalence was impossible to deny.

I just don't think a video where he's very 'openly' discussing his bias and explaining how he's aware of that and knows that he needs to consult with his other reviewer to ensure that bias is not unfairly tainting his conclusion is any kind of smoking gun. You can believe this is PR just fine, and with my considerable preference for Tim, I obviously don't fully trust Steve either (or just don't think he's nearly as competent), but this one video clip is anything but damning.
 
Considering his words and tone in context is not 'mind-reading' though. The implication you're making with "holy shit Steve actually admitted to thinking ray tracing sucks! Look at his words saying exactly that!" is that he fundamentally disregards RT as having any worth whatsoever and thus all of HUB's reviews are bunk, when with that wording he was quite clearly presenting a hyperbolic characterization of his views based on the games he plays.

The implication is that his conclusions and recommendations have been influenced by his strong personal views on the "right" way to game. Those personal views weren't always made obvious in the past hence I thought the video provided good context.

I mean I think RT has definite value, I think Nvidia is doing a lot to advance the industry, but in terms of my actual gameplay experience with my 3060, it 'may not exist' for me either. I mean there's Metro Exodus, and maybe Doom Eternal with RT - and that's my exposure to RT in my gaming life really. If I had, say, at least a 4080, that would be different, but when they're starting at $1600 CAD, it's still ultimately a 'future' technology for me atm.

As I've said before, I wish these kinds of sites would review/explain the technology more often than they do, but I also get that, for better and worse, they're fundamentally just a different type of vehicle than, say, Digital Foundry. Their focus is always on the deliverable end product and value per $, so it's not completely out of pocket that they're not going to devote that much coverage to technologies where they can only really start to strut their stuff with $1000+ products.

That's true for most graphics card reviews. Most people don't buy $1000 graphics cards and therefore can't run at the settings that those guys are benchmarking. I think the main question is how much credibility you give to the subjective element in any review. If you watch the video closely you can even see Tim cringe when Steve blurts out a subjective opinion in a matter-of-fact way like "nobody is using RT on a 4060". It's just a dumb thing to say because obviously there are people using RT on 4060s out there.

No, I think it quite clearly does, and I think to some extent, Steve knows that as well, hence the constant deferrals to Tim. I think without a doubt Steve's bias played far more into his earlier reviews when Tim was just starting out on the channel, and HUB's work - while still spotty and click-baity - has increased in quality with Tim's increasing visibility. The early reviews where not even a single game was benchmarked with RT, even in isolation, were indeed ridiculous, as well as effectively ignoring DLSS well past until it's prevalence was impossible to deny.

I just don't think a video where he's very 'openly' discussing his bias and explaining how he's aware of that and knows that he needs to consult with his other reviewer to ensure that bias is not unfairly tainting his conclusion is any kind of smoking gun. You can believe this is PR just fine, and with my considerable preference for Tim, I obviously don't fully trust Steve either (or just don't think he's nearly as competent), but this one video clip is anything but damning.

Yep, the more recent review conclusions and benchmark selections have been more balanced. Likely due to Tim's influence, as he certainly seems to care a whole lot more about the fancy new stuff than Steve does. That's exactly the point, we're hearing straight from the horses' mouths exactly why those recommendations were what they were. I thought it was a good video which left nothing open to interpretation.
 
Alright, I am going to summarize the criticisms of HUB in a single post...

1-They don't test RT because they say it's useless, doesn't add much visual enhancement and brings fps down (fps is king), yet they test Ultra settings, which are also useless and bring fps massively down (according to their own logic). Either you test with Ultra, which includes RT at max settings, or you don't test RT or Ultra settings and instead test with the most visually optimized settings.

2-Even though they started testing with RT recently, they test it in the most superficial way possible, picking many games with useless implementations that don't add much visual enhancement (in contradiction with their own logic), while excluding titles with more potent implementations and actual visual quality upgrades.

3-They scolded GPUs with 8GB of VRAM based on testing that used Ultra settings, which are not suitable for such GPUs and not in line with HUB's high-fps mantra.

4-They didn't recognize the value of DLSS2 early on, and even after they did, they didn't do it enough justice, as they ignored the latency reduction associated with DLSS3.

if you're arguing that reviews must cater to raytracing because it's objectively better
Reviewers should cater to DXR though, it's the latest DX version, and it's widely used now. Ignoring that aspect is ignoring the future-proofing of your product.
 
Alright, I am going to summarize the criticisms of HUB in a single post...

1-They don't test RT because they say it's useless, doesn't add much visual enhancement and brings fps down (fps is king), yet they test Ultra settings, which are also useless and bring fps massively down (according to their own logic). Either you test with Ultra, which includes RT at max settings, or you don't test RT or Ultra settings and instead test with the most visually optimized settings.

2-Even though they started testing with RT recently, they test it in the most superficial way possible, picking many games with useless implementations that don't add much visual enhancement (in contradiction with their own logic), while excluding titles with more potent implementations and actual visual quality upgrades.

3-They scolded GPUs with 8GB of VRAM based on testing that used Ultra settings, which are not suitable for such GPUs and not in line with HUB's high-fps mantra.
You don't have to agree with their settings, you can ignore their reviews. Every reviewer has the right to choose whatever settings they want in whatever games they want with whatever reasoning they want. If they have an audience, there are clearly people who agree with their choice of settings; if there's no audience, they'll die away. Don't try to force your preferences on everyone.

4-They didn't recognize the value of DLSS2 early on, and even after they did, they didn't do it enough justice, as they ignored the latency reduction associated with DLSS3.
This must be the most self-righteous mockery of reasoning I've ever read. Review(er) is trash because they don't praise something as much as you'd like!?

Reviewers should cater to DXR though, it's the latest DX version, and it's widely used now. Ignoring that aspect is ignoring the future-proofing of your product.
No it's not; it's one feature among countless others in the latest DX version, and it wasn't the only new feature introduced there either.
 
Alright, I am going to summarize the criticisms of HUB in a single post...

1-They don't test RT because they say it's useless, doesn't add much visual enhancement and brings fps down (fps is king), yet they test Ultra settings, which are also useless and bring fps massively down (according to their own logic). Either you test with Ultra, which includes RT at max settings, or you don't test RT or Ultra settings and instead test with the most visually optimized settings.
I kind of agree with this. It's a general problem I have with reviews. Ultra settings have historically been almost low-optimization settings. They're the settings where you compromise performance for virtually undetectable improvements in image quality. They're not particularly useful as they won't necessarily utilize hardware in a positive way. It would be hard to discover the "best" settings to use. I'd almost prefer reviewers defaulted to 'high' settings, but you do occasionally get ultra/epic settings that make a huge difference. Like in Fortnite, I think you get volumetric clouds if you set Effects(?) to Epic.

2-Even though they started testing with RT recently, they test it in the most superficial way possible, picking many games with useless implementations that don't add much visual enhancement (in contradiction with their own logic), while excluding titles with more potent implementations and actual visual quality upgrades.

I agree that using games like F1 to benchmark ray tracing seems kind of useless, but there's probably utility in having a mix of heavy and light use of ray tracing. Someone might want to know how Nvidia, AMD and Intel compare if you only turn on one or two ray tracing settings, like AO or reflections. There are a lot of use cases. But if they're testing RT they'd probably want to cover the worst-case games, like something path traced.

3-They scolded GPUs with 8GB of VRAM based on testing that used Ultra settings, which are not suitable for such GPUs and not in line with HUB's high-fps mantra.

This is tough if the competitors have products in the same price range that don't have to make compromises to texture quality, for example.

4-They didn't recognize the value of DLSS2 early on, and even after they did, they didn't do it enough justice, as they ignored the latency reduction associated with DLSS3.

I'd have to go back. I feel like they've been saying DLSS2 is the best option for quite a while. I feel all reviewers have a bias towards non-upscaled rendering in general. I think the best use for even the highest-end GPUs is to always use some upscaling when it's available to get more frames and take advantage of 144, 240, 360 and 540Hz monitors. Well, it's a tough call on a 1080p display, but anything 1440p and up I'd always have it on, but that's just me.
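To put rough numbers on that (just back-of-the-envelope arithmetic on my part, not benchmark data): the frame-time budget at a given refresh rate is 1000 / Hz milliseconds, which shows how tight those budgets get at the top end.

```python
# Frame-time budgets for common high-refresh monitors (illustrative arithmetic only;
# actual headroom depends on the game, settings and GPU).
for hz in (60, 144, 240, 360, 540):
    budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {budget_ms:.2f} ms per frame")
# 540 Hz leaves under 2 ms per frame, so rendering at a lower internal resolution
# and upscaling is often the only practical way to get close to those rates.
```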

Reviewers should cater to DXR though, it's the latest DX version, and it's widely used now. Ignoring that aspect is ignoring the future proofness of your product.

Disagree. For example, a person that's tailoring their reviews to esports would have no reason to waste the time, because I don't think DXR will be relevant in esports for quite a while. To be honest, I have a 3080 and I've only played through one game with ray-tracing, which was Control. Every other game I've turned it off because I have a 240Hz display and I'll always adjust my settings to get 120 fps or more. And generally with my cpu it seemed like I'd have too many cases where fps would tank and my gpu utilization would drop way down. Just picked up a 5800X3D, so I might get better results, but the reality is I'm playing multiplayer games and performance matters more. Alan Wake 2 will be the exception and I'm going to cry from how bad it'll perform on my 3080.

If I could find a reviewer that was more heavily catering to my interests, that would actually be a great addition.
 
You don't have to agree with their settings, you can ignore their reviews. Every reviewer has the right to choose whatever settings they want in whatever games they want with whatever reasoning they want. If they have an audience, there are clearly people who agree with their choice of settings; if there's no audience, they'll die away. Don't try to force your preferences on everyone.
A review that only tells half the story is hardly a review, and less so if it is tailored to a reviewer's personal bias.
 
A review that only tells half the story is hardly a review, and less so if it is tailored to a reviewer's personal bias.
Would you please point to a review which covers all the possible angles, cases and whatnot to tell the whole story? Just agreeing with what you want to see isn't the same as telling the whole story. Every reviewer has personal bias regardless of what settings they use. Again, don't just paint those who don't agree with your point of view as some sort of outliers. You speak for yourself, not everyone.
 
A review that only tells half the story is hardly a review, and less so if it is tailored to a reviewer's personal bias.
I get the feeling that the disappointed reactions to HUB's reviews here are largely based on forum members' personal GPU vendor bias.

And for the record I'd say I'm an Nvidia fan myself. I just don't need everyone else to be that too.
 
I get the feeling that the disappointed reactions to HUB's reviews here are largely based on forum members' personal GPU vendor bias.

And for the record I'd say I'm an Nvidia fan myself. I just don't need everyone else to be that too.

That’s true. Primarily because only one GPU vendor is pushing the envelope. So if you’re a “graphics enthusiast” there’s really only one option.
 
Is Spider-Man: Into the Spider-Verse realism? It is when it wants to be, but mostly not. Is it "objectively" worse than some other digital animation that favours realism? If you go through the twitter thread I posted, one of the comments mentions filmmakers being aware that realism is not the primary goal of image formation. Film sets add lights and intentionally bounce, block and change the colour of light. Yes, they're playing with light in the real world, because it's the physical world and they have to. But a lot of filmmakers/cinematographers will alter lighting in post to achieve a look that is not real.
That is not very relevant...
The stylization aspect is a choice, as opposed to being the only choice, the same way pixel art is a choice right now instead of a necessity as it was a long time ago.
The goal is being able to go from realism to anything in between, as far stylized or abstract as you like, as a choice and not out of necessity.
I don't think photorealism is the end-all-be-all, but having it opens up so much choice to pick from.

Anyway,
The only thing I find puzzling with those guys is their definition of excessive or useless settings (in terms of image quality).
I would find their reviews a lot more interesting if they were just benchmarking at high settings and below.
Take everything that is excessive or "useless" out of the equation, and go from there.
If they were doing it that way, I'd take them more seriously.

I don't mind subjectivity, but I do like to hear a somewhat coherent and consistent reasoning behind it.
 