Value of Hardware Unboxed benchmarking *spawn

Watched the video. In the intro Steve states that the intent was to evaluate performance in the “latest and greatest” games, so there should be no debate about that. The game selection was fine, I think, even though some titles definitely don’t qualify as latest or greatest IMO. Avatar, CP, Immortals, RE4, and AW2 all count as new games at least.

It was useful as a guide for 5700 XT owners considering an upgrade to a 7700 XT or similar. It was also interesting to see the 5700 XT still being very playable at 1080p in recent games. But then there was the little cringey boast about being right that the 5700 XT didn’t suffer from a lack of RT or upscaling and was therefore a better purchase at $400 vs the 2070 or 2060 Super. So yeah, it does seem Steve is trying to take a victory lap for his recommendations from 5 years ago. I chuckled every time he mentioned how good FSR looks on the 5700 XT at 1440p. How times have changed.

It’s kinda pointless though. The 5700 XT was never competing with the 2060 Super even though they both launched at $400. That’s not how the GPU market works these days. If Steve wants, he can always point to a cheaper AMD card that provides the same raster performance as a more expensive Nvidia card and claim that it’s a bargain. I don’t see that changing anytime soon.
 
The primary reason to upgrade anything in a PC is when a new game doesn't work as well as you want on it.

Personally, my incentive to upgrade is tied to playing (or at least testing) virtually ALL of my games at higher settings, even the ones I've completed, as I get a kick out of simply going back and testing them out on the new hardware.

That said, I agree the main driver is when games that I want start to be released that perform unacceptably on my current hardware. Note, though, that there's a big difference between a game performing unacceptably (say sub-30 fps, very low resolution, or missing settings that I feel are must-haves) vs simply not working as well as I would like (native 4K, locked 144 Hz, max settings on everything). And obviously those definitions will vary greatly from person to person.

I would probably re-word your above statement to something like "The primary reason to upgrade any gaming-focused component in a PC is when newly acquired games no longer run acceptably". Nevertheless, there is still a big secondary incentive: all your existing or future games that do/would "run acceptably" will now run even better.

On the HUB review point, I've not watched the whole thing, but the intro made it clear the point of the video was to see how the 5700XT performs today. Personally I think if you want to test how any GPU performs today (in clear contrast to how it performed on release), then you should be testing it across a range of the most demanding and most popular games available today.

Providing actual benchmarks on a set of largely older games where the 5700XT is likely to do better, but then providing only game footage without cross-GPU benchmarks on newer games where it may well perform worse relative to its peers, seems somewhat bizarre to me. Although perhaps not quite as bizarre as the claim of "equivalence" to a more modern 3060, given that RT has been ignored in all the benchmarked games and that DLSS, which in some cases makes light RT effectively free at equivalent image quality settings, is absent. Not to mention the 50% larger frame buffer on the 3060.
 
How many years does it take for a game to be defined as old? 5 years? 10 years? 20 years? Many people have a 5-10 year old game installed that they revisit periodically (e.g. Skyrim, XCOM 2) simply because it offers open-ended play.

For me, anything that came out on PS3 is old.

So > 10 years
 
A GPU review is buyer's advice, and its quality lies not only in the accuracy of its numbers but also in its benchmark selection.

I wonder how much of a review's audience actually intends to buy the product in question. It wouldn't surprise me if it was a minority. Some people simply want to be informed, some even entertained, and who knows how many other motivations are out there. Buyer's advice seems especially improbable for a video like "five years later".
 
Completely disagree. The primary reason to upgrade anything in a PC is when a new game doesn't work as well as you want on it.
That's just not true. Plenty of people have an old GPU, play games, wish they played better, and get an upgrade. You might have bought a medium-spec GPU in 2019 that could only play a 2019 game moderately well, and then upgrade to a high-end 2024 GPU to play that game much better. A lot of the most popular games are old - Fortnite, Apex Legends, Counter-Strike 2, Dota 2, GTA V, Fallout 4 - where people want the same game they've been playing for years but at higher resolution and better graphics settings than they had before.

It is up to you to prove they don't.
I'm not trying to prove one way or another. It's just a consideration, and in light of an absence of proof either way, we have to use reason to factor in likelihood. A casual observation of the world and the people in it shows plenty will choose the path with the most personal payback. It's left for the reader to decide for themselves, equipped with the knowledge that AMD content makes more money, whether HUB is influenced by this or not. For me, this adds a little weighting to the notion of a 'biased narrator', not just for HUB but for all YT content, as there are definitely going to be some content creators picking subject material to maximise personal returns.

What's a good argument against that to suggest HUB isn't so influenced? Doesn't need to be proof, just good philosophy.
 
Watched the video. In the intro Steve states that the intent was to evaluate performance in the “latest and greatest” games, so there should be no debate about that. The game selection was fine, I think, even though some titles definitely don’t qualify as latest or greatest IMO. Avatar, CP, Immortals, RE4, and AW2 all count as new games at least.
The issue was that he didn't benchmark these games against other cards but just showed gameplay footage. Instead, he benchmarked older cross-gen games like Marvel's Spider-Man, Resi 4, and all the other games that don't make use of modern features which could pull Turing ahead of RDNA1.

But then there was the little cringey boast about being right that the 5700 XT didn’t suffer from a lack of RT or upscaling and was therefore a better purchase at $400 vs the 2070 or 2060 Super. So yeah, it does seem Steve is trying to take a victory lap for his recommendations from 5 years ago. I chuckled every time he mentioned how good FSR looks on the 5700 XT at 1440p. How times have changed.

Indeed, that is their modus operandi. I vividly remember an FSR vs DLSS 2 comparison in the olden days where they compared these upscalers in a Marvel title at 1080p. The DLSS image clearly looked leagues ahead of FSR and very close to native resolution, in terms of detail reconstruction even better, yet they dismissed the clear advantage by telling their audience that both upscalers are unusable at 1080p, which was clearly not true based on the footage they showed. I called this behavior out using a comparison I made from their own footage, and Steve got extremely aggressive towards me.

I don't care about IHV wars; if the positions were switched, I would call out mistakes like that as well. Something about that whole Turing vs RDNA1 thing really resonates with Steve, for reasons already mentioned in this thread.
 
I am the one with the agenda? Then tell me: why didn't the channel benchmark the most recent current-gen-only titles in a video about how the 5700XT performs today? Why does Steve feel the need to constantly tell people how much lower-end cards suck at ray tracing despite a ton of examples where even a 2060 can get 1080p at 60 FPS with ray tracing on? Is it so far-fetched to believe that they reached a significant audience in the days of the 5700XT and do not want to admit to them that they made a short-sighted purchase recommendation?

You're were clearly not following them for long enough. You have a very naive idea of this channel.

Let me just tell you that I have tried my very best to have some civil discussions on Twitter with Steve about different topics. Yes, I did speak to him directly, just as we do here in this very forum. Instead of being civil, that guy was incredibly unprofessional, called me names, and straight up insulted me, without me resorting to personal attacks even once.

There's a valid reason why I'm so wary of them. So I would appreciate it if you could refrain from twisting my words in a way that makes it look like I'm running a YouTube clown show.
I have been following them for plenty long enough. I'm just not watching their videos with a preconceived notion like you are.

The reason you are wary of them has more to do with yourself than anything they're doing. And that goes for all the people upvoting you, and all the clowns in the YouTube comment section who push this same clown rhetoric.

And this isn't to say I think they're perfect. I've literally had direct arguments with them on Reddit before, after criticizing some of their points or methodology. But at no point have I observed any kind of brand-based biases in their coverage at all like you guys accuse them of. Biases that are so very clearly coming from yourselves, coloring your perception of their coverage.

It's so incredibly blatant, and it's embarrassing to see it so rampant here, of all places. It really shows there is no place to find reasonable tech discussion on this stuff. Y'all are the kings of seeing everything you want to see in their videos while ignoring all the countless things that completely go against your claims. As somebody with a big interest in covering conspiracy-minded groups, you guys fit into this quite well. The exact same paranoia and confirmation-bias mental routines are at play.

EDIT: I've straight up debunked a number of claims y'all have made about them in the past (like the idea they were ever down on DLSS2), and absolutely NOTHING will deter you from continuing on with it anyway. It's so stuck in your head, you will see it no matter what reality shows.
 
But at no point have I observed any kind of brand-based biases in their coverage at all like you guys accuse them of. Biases that are so very clearly coming from yourselves, coloring your perception of their coverage.

Maybe you’re not very observant 😄 Seriously though, your opinion isn’t more valid than anyone else’s, so drop the holier-than-thou attitude. Everybody’s perception of the world is clouded by “something”.
 
I don't care about IHV wars; if the positions were switched, I would call out mistakes like that as well.

Therein lies the crux of the issue: the majority of tech media, even before the YouTube days (and metrics), realized that the majority of views and engagement comes from people looking to argue about this stuff.

It's not really oriented primarily towards actual purchasing guidance, which is why content aimed strictly at that (e.g. rtings) is actually in the minority, as the audience (and business) isn't there to support it.
 
I'm not trying to prove one way or another. It's just a consideration, and in light of an absence of proof either way, we have to use reason to factor in likelihood. A casual observation of the world and the people in it shows plenty will choose the path with the most personal payback. It's left for the reader to decide for themselves, equipped with the knowledge that AMD content makes more money, whether HUB is influenced by this or not. For me, this adds a little weighting to the notion of a 'biased narrator', not just for HUB but for all YT content, as there are definitely going to be some content creators picking subject material to maximise personal returns.

What's a good argument against that to suggest HUB isn't so influenced? Doesn't need to be proof, just good philosophy.

When a bias is proven, it is fine to be on guard and mute 'biased narrators', at least for a while, to keep public spaces clean.
Until a bias is proven, if you are merely skeptical, keeping your guard up is fine, as long as you keep in mind it is only a suspicion. And I find it reasonable to be more skeptical, up to a certain extent, where there are bigger incentives for bias. So we happen to agree, if I understand you correctly.
But something more happened here. Starting arguments on that premise and repeating its conclusion without substantiating the premise is another matter. The claim was made that HUB makes content like the last video only to satisfy an "AMD crowd". And then they ran with it without a single fact in the content being shown to be wrong. The result is a suspension of reason because of, essentially, ad homs.

It is safe to say most big tech reviewers make a living out of it and are therefore in it for the money. Are we supposed to be suspicious? I agree. Are we to consider them corrupt? I say no, because I don't see a benefit in the axiom of "they are biased".
 
How many years does it take for a game to be defined as old? 5 years? 10 years? 20 years? Many people have a 5-10 year old game installed that they revisit periodically (e.g. Skyrim, XCOM 2) simply because it offers open-ended play.
For a review of cards being done now? Anything older than the youngest card among those you're comparing would be "old", IMO. But then you'd have to look at the review to read their own reasons for choosing benchmarks. These can be arbitrary, but they should be there - if we're "looking at how a card is doing 5 years later", then we should also provide the rules for selecting the titles for that task.

I don't think Shifty needs to claim that it "drives the market". Just that it is a significant benefit and therefore that there is an audience for comparisons using older titles.
This is completely against the stated reason for the review in question. He's not looking at how the 5700XT does "in older titles" at the current moment.
Also, let's please stop moving the goalposts between what a review should consider "old" and what we as players may consider "old" - these are not the same thing.

It’s kinda pointless though. The 5700 XT was never competing with the 2060 Super even though they both launched at $400. That’s not how the GPU market works these days. If Steve wants, he can always point to a cheaper AMD card that provides the same raster performance as a more expensive Nvidia card and claim that it’s a bargain. I don’t see that changing anytime soon.
Which is why comparing GPUs by their MSRPs alone, completely ignoring the difference in feature sets, is the wrong way to compare GPUs these days - but Steve insists that his assessment "was correct" while simultaneously boasting about FSR2 on the 5700XT, ignoring the fact that 2060S owners could've used DLSS for a couple of years before FSR2 appeared. It's just a bunch of cringey self-confirmation and nothing more from him these days.

If he were willing to do a proper comparison, the review would've included the cards which were actually competing for owners' attention back then (5700/XT/2060S/2070S) and done a round of RT+DLSS testing on the Nvidia side (which would completely destroy his claim that "RT didn't matter" on these cards, so of course he didn't).

The issue was that he didn't benchmark these games against other cards but just showed gameplay footage. Instead, he benchmarked older cross-gen games like Marvel's Spider-Man, Resi 4, and all the other games that don't make use of modern features which could pull Turing ahead of RDNA1.
Or even more than that: with the help of DLSS they could actually run on Turing cards *with RT* at about the same performance level as they do on the 5700XT without RT. AMD "sponsored" titles are especially likely to do so thanks to their "lite RT" implementations - at least when they aren't running out of their 8GB of VRAM.
 
The claim was made that HUB makes content like the last video only to satisfy an "AMD crowd". And then they ran with it without a single fact in the content being shown to be wrong. The result is a suspension of reason because of, essentially, ad homs.
I agree, it was unfairly presented by DavidGraham as an absolute when it should have been presented as a consideration.
 
Does AMD outperform Nvidia? That’s what matters. The vast majority of Intel products are not worth buying, so it makes sense that AMD should garner more views.
 
Another techtuber testifying that AMD-related content gets 10x the views of Intel-related content.

I don't doubt that the vast majority of home builders are currently favoring AMD CPUs. But an 'all Intel' build? That's really not something I'd expect many people to care about in the slightest. It's a data point, sure, but the ratio of viewers of this particular comparison is likely far from representative.

With regard to engagement on GPU reviews though, it's hard not to notice how the comment sections tend to skew very pro-AMD (or perhaps anti-Ngreedia). It's an interesting inverse of the current market-share reality. The AMD crowd sure seems to be very vocal, sorta like the Amiga and Team OS/2 people of yore: relatively scarce in numbers (as far as we can tell, going by Steam numbers, Jon Peddie, 3DMark contributions, etc.) but you wouldn't know it from the sheer volume of their voices.
 
I think looking at the Reddit communities, their respective discourse, and sharing habits is enlightening here.
r/intel is half the size of r/AMD.
I do not need to say much about the posting style - if you frequent them, you know what they are like.
 