> Advertising products on their own preferences is "deceitful".

They’re open about their preferences. There’s no deceit. It’s totally fine. The only problem is if they're being deceitful or actually running tests poorly.
If you watch all their videos all the time, then yes, they are. If you've just read one of their reviews, then not really.
> I do my research, which is also why I've stopped watching bad reviews from Steve. But not everyone has the time to do the same.

User error. Every person has preferences. If you're taking reviews at face value in spite of your own preferences, that's on you. If you want an all-encompassing review that treats all things equally, then do the research and find a reviewer that reviews that way. If you have your own preferences and you want a review that biases the same way, then do the research and find a reviewer that shares your preferences. Not all reviews have to cater to all people. I actually think they're much better when they don't.
> His settings are biased. So a GPU which lacks proper upscaling, latency-reduction software, and better ray-tracing support will look better than an objectively better product.

I think Linkin Park is bottom-of-the-barrel trash music with little to no redeeming value. I wish people listened to things that I think are much much better.
Is that a deceitful opinion? If you think yes, you've lost the plot. People like and value different things and that's great. It makes reviews better, not worse. You can find someone that's generally like-minded and you'll be able to extract more value from their reviews. If Hardware Unboxed doesn't give you what you want, why are you watching them?
The only issue is when people misrepresent things. If HU said a GPU was lacking a feature when it actually had it, that would be a problem. If they said a GPU performed a particular way and those results were not reproducible within some margin of error, that would be a problem. If they tested GPUs with different settings to make one look better than another, that would be a problem. If they were accepting payment in exchange for positive impressions, that would be a problem. If they don't include a lot of ray tracing in their reviews because they don't like it, that is NOT a problem.
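The "reproducible within some margin of error" standard can be made concrete with a small sketch. Everything below is hypothetical for illustration: the function name, the tolerance, and the fps numbers are made up, not taken from any real review.

```python
# Hypothetical sketch of the "reproducible within some margin of error"
# standard: two benchmark runs of the same GPU and settings should agree
# within a stated tolerance. All numbers are invented for illustration.

def within_margin(run_a, run_b, tolerance=0.03):
    """Return True if the mean fps of two runs agrees within `tolerance`
    (a fraction, e.g. 0.03 = 3%)."""
    mean_a = sum(run_a) / len(run_a)
    mean_b = sum(run_b) / len(run_b)
    return abs(mean_a - mean_b) / max(mean_a, mean_b) <= tolerance

# Two hypothetical runs of the same benchmark:
review_run = [142.1, 139.8, 141.5]   # fps samples from the review
reader_run = [140.2, 141.0, 138.9]   # fps samples from a re-test

print(within_margin(review_run, reader_run))  # close means -> True
```

Real reviewers use more careful statistics (multiple runs, 1% lows, variance), but the underlying pass/fail idea is the same.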
> His settings are biased. So a GPU which lacks proper upscaling, latency-reduction software, and better ray-tracing support will look better than an objectively better product.

There is a difference between liking music (art) and liking the instruments. And claiming that lower latency and higher framerates are a priority, while recommending products that lack the software enabling lower latency and higher framerates, makes it very clear that this priority doesn't really exist.
> better raytracing support will look better than an objectively better product.

This sentence was poorly framed and I'm not sure you meant what you wrote. The pursuit of realism has always been the main driving force behind computer graphics. The fact that some games choose stylized art forms is an orthogonal issue -- even for stylized games you want physically correct light transport modeling (see Fortnite). Even if you set aside light transport and material simulation and think only about resolution and framerate, those are also about getting as close to realism as possible by removing sampling artifacts. And yes, hardware constraints can and do force us to use approximations and to balance the various aspects of realism. But overall, realism isn't just objectively better; it is THE objective.

Realism is not objectively better. In fact, some people have very good arguments for why realism is bad for graphics.
To be clear -- I'm not taking any position on the GPU review discussion. What I'm saying is that whether someone actually cares about ray tracing right now, when they buy a GPU, is subjective. You can argue that the feature in hardware is objectively better ... maybe ... but ray tracing as it's implemented comes with pros and cons, which everyone is going to evaluate subjectively.
Also, realism being objectively better and THE objective is just not true. The capability to do realism is an objective, and path tracing is the best known way to get there. There is another objective: a fully programmable GPU that does not require any kind of fixed pipeline, so all of the compute power can be used in any manner that developers want.
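The point that path tracing is fundamentally a sampling technique can be illustrated with a toy Monte Carlo estimator. This is a hedged sketch, not code from any real renderer; it just shows why more samples per pixel mean less noise.

```python
import random

# Toy Monte Carlo estimator: path tracers estimate pixel color by
# averaging many random light-path samples, and the error shrinks
# roughly as 1/sqrt(N). Here we "estimate" a known quantity, the
# average of a uniform [0, 1) signal, which is 0.5.

def noisy_estimate(num_samples, rng):
    samples = [rng.random() for _ in range(num_samples)]
    return sum(samples) / num_samples

rng = random.Random(42)  # fixed seed so the sketch is repeatable
for n in (4, 64, 1024):
    estimate = noisy_estimate(n, rng)
    print(n, round(abs(estimate - 0.5), 4))  # error tends to shrink as n grows
```

This residual noise is exactly why real-time ray tracing leans so heavily on denoisers and upscalers: the hardware can only afford a handful of samples per pixel per frame.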
Is Spider-Man Into the Spider-Verse realism? It is when it wants to be, but mostly not. Is it "objectively" worse than some other digital animation that favours realism? If you go through the twitter thread I posted, one of the comments mentions filmmakers being aware that realism is not the primary goal of image formation. Film sets add lights and intentionally bounce, block, and change the colour of light. Yes, they're playing with light in the real world, because it's the physical world and they have to. But a lot of filmmakers/cinematographers will alter lighting in post to achieve a look that is not real.
Would a realistic game look better than a game that looked like some kind of impressionist painting? I would definitely be into trying impressionist games. Would that require physically correct light transport? Nope.
So to bring this back to gpu reviews, if you're arguing that reviews must cater to raytracing because it's objectively better, especially in how it's implemented, that's just plain wrong. And even if you could somehow demonstrate that it was objectively better and THE objective, it doesn't matter because no one has to like it. Some people prefer 2.5D pixel-art games. They could do a review that says some $100 gpu is the best buy because it plays all of the current 2.5D games really well and everything else is a waste. That would be a completely valid and useful review for people who play those games.
Reaching in what way? It’s not necessary to try to read Steve’s mind. His words and the opinions that he shares in his reviews can speak for themselves.
If your point is that his bias doesn't influence his work, well, I wouldn't put money on that.
Considering his words and tone in context is not 'mind-reading', though. The implication you're making with "holy shit, Steve actually admitted to thinking ray tracing sucks! Look at his words saying exactly that!" is that he fundamentally disregards RT as having any worth whatsoever, and thus all of HUB's reviews are bunk, when with that wording he was quite clearly presenting a hyperbolic characterization of his views based on the games he plays.
I mean, I think RT has definite value, and I think Nvidia is doing a lot to advance the industry, but in terms of my actual gameplay experience with my 3060, it 'may not exist' for me either. There's Metro Exodus, and maybe Doom Eternal with RT, and that's really my entire exposure to RT in my gaming life. If I had, say, at least a 4080, that would be different, but when those start at $1600 CAD, it's still ultimately a 'future' technology for me atm.
As I've said before, I wish these kinds of sites would review/explain the technology more often than they do, but I also get that, for better and worse, they're just a fundamentally different type of vehicle than, say, Digital Foundry. Their focus is always on the deliverable end product and value per dollar, so it's not completely out of pocket that they don't devote much coverage to technologies that can only really start to strut their stuff with $1000+ products.
No, I think it quite clearly does, and I think to some extent Steve knows that as well, hence the constant deferrals to Tim. I think without a doubt Steve's bias played far more into his earlier reviews, when Tim was just starting out on the channel, and HUB's work, while still spotty and click-baity, has increased in quality with Tim's increasing visibility. The early reviews where not even a single game was benchmarked with RT, even in isolation, were indeed ridiculous, as was effectively ignoring DLSS well past the point where its prevalence was impossible to deny.
I just don't think a video where he's very 'openly' discussing his bias, explaining that he's aware of it and knows he needs to consult with his other reviewer to ensure that bias isn't unfairly tainting his conclusions, is any kind of smoking gun. You can believe this is PR just fine, and given my considerable preference for Tim, I obviously don't fully trust Steve either (or just don't think he's nearly as competent), but this one video clip is anything but damning.
> if you're arguing that reviews must cater to raytracing because it's objectively better

Reviewers should cater to DXR though; it's the latest DX version, and it's widely used now. Ignoring that aspect is ignoring the future-proofness of your product.
You don't have to agree with their settings; you can ignore their reviews. Every reviewer has the right to choose whatever settings they want, in whatever games they want, with whatever reasoning they want. If they have an audience, there are clearly people who agree with their choice of settings; if there's no audience, they'll die away. Don't try to force your preferences on everyone.

Alright, I am going to summarize the criticism of HUB in a single post:
1-They don't test RT because they say it's useless, doesn't add much visual enhancement, and brings fps down (fps is king), yet they test Ultra settings, which are also useless and bring fps massively down (by their own logic). Either you test with Ultra, which includes RT at max settings, or you don't test RT or Ultra settings and instead test with the most visually optimized settings.
2-Even though they started testing with RT recently, they test it in the most superficial way possible, picking many games with useless implementations that don't add much visual enhancement (in contradiction with their own logic), while excluding titles with more potent implementations and actual visual quality upgrades.
3-They scolded GPUs with 8GB of VRAM based on testing that used Ultra settings, which are not suitable for such GPUs and not in line with HUB's high-fps mantra.
4-They didn't recognize the value of DLSS2 early on, and even after they did, they didn't do it enough justice, as they ignored the latency reduction associated with DLSS3.

This must be the most self-righteous mockery of reasoning I've ever read. A review(er) is trash because they don't praise something as much as you'd like!?
> Reviewers should cater to DXR though, it's the latest DX version, and it's widely used now. Ignoring that aspect is ignoring the future proofness of your product.

No, it's not: it's a feature in the latest DX version, among countless other features. And no, it wasn't the only new feature in the latest DX version either.
> 1-They don't test RT because they say it's useless, doesn't add much visual enhancement, and brings fps down (fps is king), yet they test Ultra settings, which are also useless and bring fps massively down (by their own logic).

I kind of agree with this. It's a general problem I have with reviews. Ultra settings have historically been almost low-optimization settings: the settings where you compromise performance for virtually undetectable improvements in image quality. They're not particularly useful, as they won't necessarily utilize hardware in a positive way, and it would be hard to discover the "best" settings to use. I'd almost prefer reviewers defaulted to 'high' settings, but you do occasionally get ultra/epic settings that make a huge difference. Like I think in Fortnite you get volumetric clouds if you set effects(?) to Epic.
> You don't have to agree with their settings, you can ignore their reviews.

A review that only tells half the story is hardly a review, and even less so if it is tailored to the reviewer's personal bias.
> A review that only tells half the story is hardly a review, and even less so if it is tailored to the reviewer's personal bias.

Would you please point to a review which covers all the possible angles, cases, and whatnot to tell the whole story? Just agreeing with what you want to see isn't the same as telling the whole story. Every reviewer has personal bias regardless of what settings they use. Again, don't just point out those who disagree with your point of view as some sort of outliers. You speak for yourself, not for everyone.
> A review that only tells half the story is hardly a review, and even less so if it is tailored to the reviewer's personal bias.

I get the feeling that the disappointed reviews of HUB's reviews here are largely based on forum members' personal GPU vendor bias.

And for the record, I'd say I'm an Nvidia fan myself. I just don't need everyone else to be one too.
> Is Spider-Man Into the Spider-Verse realism? It is when it wants to be, but mostly not.

That is not very relevant...