Value of Hardware Unboxed benchmarking

They do support DLSS but you don't want to benchmark competing GPUs when one is outputting a completely inferior image.

But when one can provide a far superior image with a performance uplift, and you don't account for that in the final assessment of your review, you're not giving an accurate conclusion about the actual overall value of the product in real gaming situations.

I get the logistics of video card reviews and the wrench reconstruction throws into them; it's not an easy problem to solve, and I'm not suggesting a weighting system for benchmarks or comparing 'equivalent' visual quality by benchmarking different reconstruction settings. But in a recent review (think it was comparing the 4060 to the 7600..?) Steve mentioned DLSS in his summary, and while a positive for Nvidia, it ultimately 'wasn't that much better' than FSR, or something to that effect. Of course, one of the most comprehensive comparative looks at FSR vs DLSS was done by his own partner, Tim, and it showed that was decidedly not the case.
 
But when one can provide a far superior image with a performance uplift, and you don't account for that in the final assessment of your review, you're not giving an accurate conclusion about the actual overall value of the product in real gaming situations.

I get the logistics of video card reviews and the wrench reconstruction throws into them; it's not an easy problem to solve, and I'm not suggesting a weighting system for benchmarks or comparing 'equivalent' visual quality by benchmarking different reconstruction settings. But in a recent review (think it was comparing the 4060 to the 7600..?) Steve mentioned DLSS in his summary, and while a positive for Nvidia, it ultimately 'wasn't that much better' than FSR, or something to that effect. Of course, one of the most comprehensive comparative looks at FSR vs DLSS was done by his own partner, Tim, and it showed that was decidedly not the case.
They mention DLSS superiority in virtually every video comparing an AMD and Nvidia GPU.


Not sure if this was the video you were referring to. I went and watched it, and the conclusion seems very reasonable to me.
 
They mention DLSS superiority in virtually every video comparing an AMD and Nvidia GPU.

Sure, they mention it, then basically toss it aside; it doesn't factor into the conclusion at all. That overall conclusion I think was ultimately fair, btw - neither of those cards impresses from a value perspective. But his comments on DLSS are ultimately pretty threadbare; Tim, for example, may have worded it differently.

Steve from HUB said:
For the most part, DLSS offers superior image quality to FSR, though we did note that both technologies are kind of poor at 1080p and lower, and really should only be used at 1440p and greater if possible. So I certainly think AMD has caught Nvidia in a lot of key areas...

First off, 1080p and 1440p are exactly where DLSS shows a significant quality advantage over FSR; the lower the output resolution, the worse FSR fares, to the point where it's all but pointless. Secondly, "4K" becomes possible (at 60fps, with some quality settings reduced) in a lot of games if you have a solid performance reconstruction mode on this class of cards, but that's also where FSR really starts to falter. So it's ultimately an acknowledgement of Tim's work that doesn't actually acknowledge the results in that work. It's simply tossed in at the end of the review so people can respond to complaints that it wasn't addressed - because hey, he 'mentioned' it.

The very real application of DLSS - at lower output resolutions and/or reconstructing from lower base resolutions - is exactly where the quality gap with FSR grows, so DLSS is more relevant for the 4060/7600 class of cards, where starting from 4K/Quality isn't really an option. If anything, at this budget I think it should have more prominence in a conclusion. It's fine to say you think the artifacts at 1080p are too much - I would probably agree - but then FSR is all but pointless below 4K.
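To make the resolution point concrete, here's a minimal sketch of the internal render resolutions involved, assuming the standard published per-axis scale factors for the FSR 2 / DLSS 2 quality modes (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the script is illustrative arithmetic, nothing more:

```python
# Internal render resolution per upscaler quality mode, assuming the
# standard FSR 2 / DLSS 2 per-axis scale factors.
MODES = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.7,
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

OUTPUTS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for out_name, (w, h) in OUTPUTS.items():
    for mode, scale in MODES.items():
        print(f"{out_name} {mode}: rendered at {round(w * scale)}x{round(h * scale)}")
```

1080p Quality reconstructs from roughly 1280x720 and 1080p Performance from just 960x540, which is why both upscalers struggle at low output resolutions, while 4K Performance still has a full 1920x1080 of input to work from.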

From the other side, Steve also mentions the superior experience of AMD's Adrenalin software compared to the myriad of apps (and account login) you need with Nvidia, and while I don't think that should necessarily be a major factor in the final assessment, I wish more reviewers would call attention to Nvidia's failings in this area as well.

 
They don’t support ultra settings and often make optimized-settings videos for big titles. Ultra settings are just so ubiquitous throughout benchmarking that it’s easy to settle on them as a standard approach. Every game has an ultra preset, for consistency. They do support DLSS but you don't want to benchmark competing GPUs when one is outputting a completely inferior image. It has also been long established what uplift you can typically expect. Benchmarks are meant to show comparative performance. Outside of RT, graphics settings and upscaling don’t typically change that.

Can’t have your cake and eat it too. Either IQ matters or it doesn’t.
 
Sure, they mention it, then basically toss it aside; it doesn't factor into the conclusion at all. That overall conclusion I think was ultimately fair, btw - neither of those cards impresses from a value perspective. But his comments on DLSS are ultimately pretty threadbare; Tim, for example, may have worded it differently.



First off, 1080p and 1440p are exactly where DLSS shows a significant quality advantage over FSR; the lower the output resolution, the worse FSR fares, to the point where it's all but pointless. Secondly, "4K" becomes possible (at 60fps, with some quality settings reduced) in a lot of games if you have a solid performance reconstruction mode on this class of cards, but that's also where FSR really starts to falter. So it's ultimately an acknowledgement of Tim's work that doesn't actually acknowledge the results in that work. It's simply tossed in at the end of the review so people can respond to complaints that it wasn't addressed - because hey, he 'mentioned' it.

The very real application of DLSS - at lower output resolutions and/or reconstructing from lower base resolutions - is exactly where the quality gap with FSR grows, so DLSS is more relevant for the 4060/7600 class of cards, where starting from 4K/Quality isn't really an option. If anything, at this budget I think it should have more prominence in a conclusion. It's fine to say you think the artifacts at 1080p are too much - I would probably agree - but then FSR is all but pointless below 4K.

From the other side, Steve also mentions the superior experience of AMD's Adrenalin software compared to the myriad of apps (and account login) you need with Nvidia, and while I don't think that should necessarily be a major factor in the final assessment, I wish more reviewers would call attention to Nvidia's failings in this area as well.


He does factor it into his purchase assessment. AMD needs to be at least 20% cheaper in this particular comparison to account for the feature disparity. I don't entirely agree with your DLSS/FSR value statement. While the quality gap does increase significantly at lower resolutions, both still become undesirable. At higher resolutions the gap is still large and FSR is still awful. For my money, the best use case for DLSS is to make higher resolutions more performant. While it's an interesting academic comparison to run 4K DLSS/FSR on these GPUs, I feel that's a poor use case for almost everyone, as the tradeoffs required become far too severe.
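As a rough illustration of that 20% figure (the prices here are hypothetical placeholders, not numbers from the review):

```python
# Hypothetical worked example of a feature-disparity discount.
# Both numbers are placeholders for illustration, not review figures.
nvidia_price = 300.0   # placeholder price for the Nvidia card, in dollars
discount = 0.20        # the ~20% disparity cited above

required_amd_price = nvidia_price * (1 - discount)
print(f"The AMD card would need to sell at <= ${required_amd_price:.0f}")  # $240
```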

Can’t have your cake and eat it too. Either IQ matters or it doesn’t.
Not sure what you mean by this.
 
For my money, the best use case for DLSS is to make higher resolutions more performant. While it's an interesting academic comparison to run 4K DLSS/FSR on these GPUs, I feel that's a poor use case for almost everyone, as the tradeoffs required become far too severe.

That's exactly what it's doing - making a higher resolution more performant. It's highly dependent on the game, of course, but there are many instances where 4K/60 is quite possible in modern games when DLSS Performance is used on 4060-class cards - it's largely possible even on my 3060. The tradeoffs can be 'severe' for FSR, yes, but not nearly as much for DLSS at that setting. I mean, we have plenty of visual evidence for this.

If that is too much of a tradeoff for 'almost everyone', then no one should be gaming on a PS5/SX either. No one is buying a 4060-class card without expecting some compromises.
 
I kinda struggle to think what "severe tradeoffs" DLSS has when implemented properly.
Ghosting? That's usually present with a game's own TAA too, so it's not a DLSS "tradeoff".
Some moiré patterns and blurring in motion? Again, that's not uncommon with TAA either.
AA itself tends to be better with DLSS.
All in all, in the majority of games you don't lose much IQ with DLSS while gaining 2-4x the performance. That's a "tradeoff" most people will make without a second thought.
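The 2-4x figure follows from simple pixel-count arithmetic, since GPU-bound frame cost scales roughly with the number of pixels rendered; a back-of-the-envelope sketch (real gains are lower because the upscaling pass itself adds a per-frame cost):

```python
# Upper-bound GPU-bound speedup from rendering fewer internal pixels:
# speedup ~= 1 / (per-axis scale)^2. Real-world gains are smaller since
# the upscaler adds its own per-frame cost.
MODES = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7,
         "Performance": 1 / 2.0, "Ultra Performance": 1 / 3.0}

for mode, scale in MODES.items():
    fraction = scale ** 2  # fraction of native pixels actually rendered
    print(f"{mode}: {fraction:.0%} of native pixels, up to ~{1 / fraction:.1f}x faster")
```

In this idealized model, Quality mode lands around 2.3x and Performance mode at 4x, which is where the 2-4x range comes from.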
 
I kinda struggle to think what "severe tradeoffs" DLSS has when implemented properly.
Ghosting? That's usually present with a game's own TAA too, so it's not a DLSS "tradeoff".
Some moiré patterns and blurring in motion? Again, that's not uncommon with TAA either.
AA itself tends to be better with DLSS.
All in all, in the majority of games you don't lose much IQ with DLSS while gaining 2-4x the performance. That's a "tradeoff" most people will make without a second thought.

Yes, and thus I rarely ever play a game that has TAA as a default with no way to disable it. Ghosting = not going to play that game. Same with blurring on movement. No game is good enough that I'd willingly suffer eye strain in order to play it.

I have mentioned previously that the only time DLSS is an improvement is when it is used in place of a game's own forced (bad) TAA. That doesn't mean I find either enjoyable.

I had been hoping that DLSS 3.5 might fix some of the (IMO) egregious shortcomings of DLSS up to that point, but while it addresses some, it reverts to even worse behavior and artifacts in some cases. It's one step forward and one step backward.

I keep hoping that the state of temporal reconstruction will improve as that is the future for AAA games, but while it does here and there, it's still overall not pleasant to game with.

Thankfully it is not yet an issue for me as the games I really enjoy playing and want to play don't force some form of temporal rendering. It helps that in most cases, AAA games are garbage anyway (IMO) from a gameplay perspective.

Regards,
SB
 
Thankfully it is not yet an issue for me as the games I really enjoy playing and want to play don't force some form of temporal rendering.
Really? Could you name the games from 2023 which don't use some sort of temporal accumulation in their frames? Or even games where you can use something other than TAA?
 
Can’t have your cake and eat it too. Either IQ matters or it doesn’t.
They literally had a whole community poll about this, and the overwhelming response was that they shouldn't benchmark with reconstruction. So they went with that, even though they didn't fully agree with it themselves.

They've never denied that IQ matters, and they've said on many occasions that DLSS looks better than FSR at equivalent base resolutions.

Some of y'all are really just resorting to any argument possible to trash them because you had your minds made up, rather than stepping back and reevaluating whether what you think is reasonable and backed by enough evidence. It's become a hate jerk, and, as is typical, many people will never admit they were wrong, so they double down even if they have to resort to ever more desperate arguments.
 
They literally had a whole community poll about this, and the overwhelming response was that they shouldn't benchmark with reconstruction. So they went with that, even though they didn't fully agree with it themselves.

They've never denied that IQ matters, and they've said on many occasions that DLSS looks better than FSR at equivalent base resolutions.

Some of y'all are really just resorting to any argument possible to trash them because you had your minds made up, rather than stepping back and reevaluating whether what you think is reasonable and backed by enough evidence. It's become a hate jerk, and, as is typical, many people will never admit they were wrong, so they double down even if they have to resort to ever more desperate arguments.

Do you think their handling of the 8GB issue was reasonable and consistent with their other arguments?
 
Do you think their handling of the 8GB issue was reasonable and consistent with their other arguments?
I think it was not only perfectly valid at the time, but the reasoning is even more valid given the obvious and true assumption that VRAM requirements will, on average, only go up. Just because a game like TLOU1 was patched to fix its clearly unoptimized VRAM demands doesn't mean what they said has no merit: not only are games going to increase in demands in general, but there will always be less-optimized games to worry about. And yes, when it comes to PC gaming, it definitely makes sense to take into account, when deciding what to buy, that not every game will be brilliantly optimized.

I was probably on the '8GB sucks' train even before they were, to be clear, especially when it comes to the pricing/tiering aspect. We had affordable, midrange 8GB GPUs in 2015. The idea that it's acceptable to be selling 8GB GPUs in 2023 for more than $300 is insane, and it should not be considered controversial to say so.
 
They literally had a whole community poll about this, and the overwhelming response was that they shouldn't benchmark with reconstruction. So they went with that, even though they didn't fully agree with it themselves.
What they call their community is a self-selected cesspool, and the feedback it provides is driven by strong convictions. It’s no excuse to say that they went the way they did because they were appeasing that crowd, all the more so if they really know better.
 
What they call their community is a self-selected cesspool, and the feedback it provides is driven by strong convictions. It’s no excuse to say that they went the way they did because they were appeasing that crowd, all the more so if they really know better.
It's pretty much a pot-calling-the-kettle-black situation both ways, even if you want to single out one specific community you don't happen to agree with.
 
Rather than complaining that other sites offer useless benchmarks on DLSS and RT, I instead look at their non-DLSS and non-RT benchmarks (if they have any; some don't), as well as going to review sites that don't spend overly much time on them.

Wait, what? What review site only looks at DLSS and RT benchmarks? I honestly can't think of one. The only place where you'll routinely see cards benchmarked with DLSS and RT enabled is in Nvidia's marketing.

DF is probably the outlet that gives the most attention to RT and DLSS, yet in their 7800XT review they spent just as much time on straight rasterization as on RT, if not more, and came out pretty positive on the 7800XT. Gamers Nexus's 7800XT review devoted only a small portion to RT. I honestly can't think of any major outlet that even prioritizes RT and DLSS, let alone covers them exclusively.

If people don't like Hardware Unboxed's reviews and opinions, then they obviously don't have to read or watch them. Easy peasy. No one is forcing them to read or watch them.

Reviewers can be reviewed; they're putting out a product for consumption that can be critiqued like any other. Hence this thread.

It's like this indignation of 'how dare someone value something differently'. It's honestly very strange that someone would feel that way rather than acknowledging that not everyone values the same things in the same way.

Everyone has a bias to varying degrees, but the objections raised recently with some of HUB's coverage seem more focused on their consistency. I don't necessarily disagree that strongly with their ultimate conclusions, and I think they produce work of value, but I do have some criticisms of their methodology and their respective knowledge bases (albeit I tend to separate out Steve and Tim; the latter, I think, presents more professionally and is just generally more analytical).
 
They literally had a whole community poll about this, and the overwhelming response was that they shouldn't benchmark with reconstruction.
That was the overwhelming response from the community HUB had already cultivated with its existing practices. Those who disagree with their general direction aren't going to be on Patreon voting in polls. It's very much a self-reinforcing feedback loop.
 
That was the overwhelming response from the community HUB had already cultivated with its existing practices. Those who disagree with their general direction aren't going to be on Patreon voting in polls. It's very much a self-reinforcing feedback loop.

That’s fine though. That’s their audience. It’s okay to serve a niche.
 
That’s fine though. That’s their audience. It’s okay to serve a niche.
I'm not saying it isn't fine, just that community polls will overwhelmingly support the status quo thanks to selection bias. If you don't benchmark with upscaling, you'll attract a community that isn't interested in it; then, when you later run a poll, that community will vote not to benchmark upscaling.
 
Moved to Upscaling discussion ....
 