Value of Hardware Unboxed benchmarking *spawn

Almost nobody cares about 1-2% differences except for people like you. A GPU is 1-2% faster? It's statistically insignificant.

Of course it is. Doesn't stop reviewers drawing conclusions on "which GPU is faster" though, even if it is only by 1-2% in a specific test suite. And many readers will pick up on, and amplify, that conclusion across the web without caring about the detail - because it's the message they want to hear.

I couldn't care less about a manufactured 1-2% performance difference. I do care, though, about reviewing practices that contrive such situations in order to pacify their reader base.
 
The difference is that DLSS can be set to use a lower quality mode and still match/beat FSR 2.2 in image quality.

So in reality it should be FSR 2.2 Quality vs DLSS Balanced - that would give Nvidia quite a big win in frame rate while still leaving a slight IQ advantage.

The power lines in FH5 at 2:50 show you how far ahead DLSS is, and how you can gain even more performance by dropping down to DLSS balanced mode.

So I still don't understand this video, as it still doesn't reflect what Nvidia owners actually use.
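For reference, here's a rough sketch of the internal render resolutions those modes imply at a 4K output, using the commonly published scale factors (illustrative numbers only; individual games can deviate):

```python
# Commonly published upscaler scale factors (fraction of the output
# resolution per axis). Illustrative only - individual games can differ.
MODES = {
    "Quality (DLSS / FSR 2)": 0.667,
    "Balanced (DLSS / FSR 2)": 0.58,
    "Performance (DLSS / FSR 2)": 0.50,
}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in MODES.items():
    print(f"{mode}: ~{round(out_w * s)}x{round(out_h * s)}")

# Quality     -> ~2560x1440
# Balanced    -> ~2227x1253
# Performance -> 1920x1080
# Balanced shades roughly 25% fewer pixels than Quality, which is where the
# frame-rate win described above comes from.
```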
Exactly. Often even DLSS performance has better image quality than FSR quality, and therefore it's nonsense to compare frame rates of DLSS quality to FSR quality. When both options are available, one would mostly use DLSS instead of FSR on an Nvidia GPU.
 
Exactly. Often even DLSS performance has better image quality than FSR quality, and therefore it's nonsense to compare frame rates of DLSS quality to FSR quality. When both options are available, one would mostly use DLSS instead of FSR on an Nvidia GPU.
Firstly, speak for yourself. DLSS and FSR are equally trash imo and my stance hasn't changed since both technologies launched. In fact, I famously/infamously called both of them crutches, to the outrage of many here. Secondly, testing with FSR on an Nvidia GPU is not pointless when you're attempting to compare different GPUs. The basic fundamentals of testing require that you keep all variables constant other than the things you're comparing. Testing AMD with FSR and Nvidia with DLSS is a stupid test. That data cannot be used for relative performance. If you cannot understand that, there's nothing else left to say as it's pretty basic. Thirdly, @davis.anthony's comment is senseless. How people are liking that ridiculous suggestion in a tech forum is frankly shocking. What a terrible take. Who decided that DLSS balanced is equivalent to FSR quality? All I see in that comment is fanboy stanning so that Nvidia can "win". Let's compare two unlike things at completely subjective settings to test what, exactly? What does such a test produce? Honestly, it's disappointing to read this trash here.
 
Of course it is. Doesn't stop reviewers drawing conclusions on "which GPU is faster" though, even if it is only by 1-2% in a specific test suite. And many readers will pick up on, and amplify, that conclusion across the web without caring about the detail - because it's the message they want to hear.

I couldn't care less about a manufactured 1-2% performance difference. I do care, though, about reviewing practices that contrive such situations in order to pacify their reader base.
Tell me which reviewer is declaring a 1-2 percent difference a win? It's not Gamers Nexus, it's not Hardware Unboxed, it's not Linus Tech Tips. Who is making these declarations? Because as far as I'm concerned, this is a classic slippery slope fallacy. Instead of assuming that people are incapable of basic analysis, maybe give them the benefit of the doubt that they've done their homework.
 
Testing an Nvidia GPU with DLSS vs an AMD GPU with FSR2 would be equally valid because that's how the GPUs are likely to be used. As long as it's clear that's what's being tested, then there's nothing wrong.

Testing fsr2 vs fsr2 is also fine because it’s keeping more things equal.

Ultimately they're both subjective decisions. DLSS, FSR and native are all trade-offs of image quality, which is very subjective, vs trade-offs in performance, which is also very subjective. Some insane people tell me 30 fps is good enough for some games. If they were a reviewer I'd basically skip those reviews.
 
Secondly, testing with FSR on an Nvidia GPU is not pointless when you're attempting to compare different GPUs. The basic fundamentals of testing require that you keep all variables constant other than the things you're comparing.

Which makes them completely useless then.

As an RTX owner I want to see DLSS tested on an Nvidia GPU, seeing how it performs and looks with FSR 2 is of no use to me as I never use it.

Testing AMD with FSR and Nvidia with DLSS is a stupid test.

No it's not. When I was looking to make a new GPU purchase, I looked for reviews that compared DLSS IQ/Perf on an Nvidia GPU to FSR IQ/Perf on an AMD one.

Seeing how FSR performed on an Nvidia GPU was of no use to me as a consumer.

Thirdly, @davis.anthony's comment is senseless.

Not as senseless as your comments that I addressed above.

How people are liking that ridiculous suggestion in a tech forum is frankly shocking. What a terrible take. Who decided that DLSS balanced is equivalent to FSR quality?

My eyeballs.

Let's compare two unlike things at completely subjective settings to test what, exactly?

They're both methods of upscaling a lower resolution input to a perceived higher resolution one.

Honestly, it's disappointing to read this trash here.

So go, as you're adding nothing to this forum.

DLSS quality mode trashes FSR quality mode, it always has.

DLSS balanced mode generally offers better reconstruction than FSR quality while giving quite a good increase in performance, and if that upsets you then it's your responsibility to teach yourself to get over it.

Look at the attached picture: only a complete idiot with an Nvidia RTX GPU would use FSR 2 over DLSS 2, and only an even bigger idiot can't see why Nvidia owners care about DLSS being used for testing Nvidia cards over FSR 2.

So factoring in the attached screenshot I'll say it again: seeing how FSR 2 performs on an Nvidia GPU is of no use to me as a consumer (and I'm guessing nearly every other Nvidia RTX owner feels the same way).
 

Attachment: DLSS vs FSR marked.png
This shit should be bannable. It’s literally tales from your behind. You lazily make accusatory claims with absolutely no evidence and then concoct some bs in your mind about how they’re biased.
What should be bannable are people who come into a thread with "shit" words demanding something from other posters instead of using Google for about 10 minutes to educate themselves beyond the tiny sample size of benchmarks they like because they fit their agenda. You, in other words.

Their data is different because they use different passes in the game.
Everyone is using different passes. Yet most benchmark results show comparable trends and tendencies, while Steve's are completely different about 50% of the time.
Also, if someone is using a "different pass" which is so different that everyone else's passes give different results, then maybe that "different pass" isn't actually a common occurrence in the game tested and shouldn't be used to represent the game's performance?

If you want to make ridiculous claims like this, go purchase the gpus and do the same benchmark pass they do, then make your claims.
I do. Do you?

I’m tired of seeing lazy internet whiners put down the work of others while not providing any evidence to support their claim. Again, this shit should be ban worthy. It’s libel.
I agree. You should be banned. Because you haven't provided any evidence to back Steve's claims yet decided to dismiss what I've said simply because you don't like it.

So Steve's video.

"Fun" stuff first

https://www.techspot.com/article/2583-amd-fsr-vs-dlss/ - Here in HUB's own previous article FH5 with DLSS is faster than FSR2, in contrast to HUB's own new video where it's almost on par with FSR2 ¯\_(ツ)_/¯
https://www.techspot.com/review/2464-amd-fsr-2-vs-dlss-benchmark/ - here DLSS is considerably faster than FSR2.
That's about it for DLSS vs FSR2 comparisons from HUB. So based on this they made the decision that DLSS doesn't matter?

Hitman 3 - 1 fps difference in favour of DLSS in Steve's new video
4 fps difference in HUB's own previous test of H3 here:

Spider-Man: Remastered - no difference in latest video
1 to 2 fps difference in favour of DLSS here in HUB's own test:

Death Stranding - 2-3 fps in favour of DLSS
Again, 4-5 fps here:

FH5 - no difference in the new video
1-2 fps faster in their own previous benchmark:

Other benchmarks

Deathloop - DLSS is noticeably faster in every benchmark out there besides HUB's, so "different pass" suddenly doesn't work

SMR
Confirms the first benchmark on HUB - DLSS is 1-2 fps faster
Same is true for SMMM btw: https://www.computerbase.de/2022-11..._nvidia_dlss_24_bildqualitaet_und_performance

Death Stranding
DLSS is about 7 fps faster here instead of Steve's 2-3

Destroy All Humans Remaster
DLSS is slightly faster

Uncharted
DLSS is about on par

The only game where DLSS doing worse than FSR2 can be corroborated by other benchmarks is F1 2022,
likely because DLSS integration in F1 titles has always been rather bad.

This trend is typical. I read through most of the benchmarks that appear out there, and FSR2 being faster on GF is the exception, not the rule.
It is in fact so obvious that most places which did DLSS vs FSR2 benchmarks regularly last year have basically dropped that in favour of testing DLSS on compatible GFs and FSR2 on the rest.

Now the "funniest" part is that Atomic Heart doesn't even support FSR2.
It's using FSR1.
Why did he even include it in the latest video? To make his point stronger or something?

Testing fsr2 vs fsr2 is also fine because it’s keeping more things equal.
What things?
GFs could run FSR2 math on the tensor units in parallel with FP32 shading and, if optimized properly, likely hide the majority of FSR2's cost that way - which no one will do, of course.
Radeons run it on the main SIMDs, eating cycles otherwise spent on graphics rendering.
It's not equal no matter how you slice it.
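To make that point concrete, here's a toy frame-time model of the difference (the millisecond figures and the degree of overlap are assumptions for illustration, not measurements of any real GPU):

```python
# Toy model: the upscaling pass either competes with shading for the same
# SIMDs (serial) or is partially hidden behind it (overlapped). All numbers
# here are assumptions for illustration.
def serial(shading_ms, upscale_ms):
    return shading_ms + upscale_ms                 # pass eats shader cycles

def overlapped(shading_ms, upscale_ms, hidden=0.7):
    return shading_ms + upscale_ms * (1 - hidden)  # most of the cost hidden

print(serial(14.0, 1.2))      # 15.2 ms when the pass shares the SIMDs
print(overlapped(14.0, 1.2))  # ~14.4 ms when most of it can be overlapped
```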
 
Which makes them completely useless then.

As an RTX owner I want to see DLSS tested on an Nvidia GPU, seeing how it performs and looks with FSR 2 is of no use to me as I never use it.
I own an RTX 4090 and DLSS is useless to me because I think it's trash. I specifically purchased a 4090 to avoid it because I consider upscaling to be an artifact-ridden crutch to compensate for underpowered hardware. Seeing how it performs is a waste of time, and even more so with DLSS 3, which inserts fake frames that have nothing to do with the game. So whose opinion is more important, yours or mine? Should reviewers cater to you or me when attempting to compare different GPUs? Thankfully HUB has finally seen the light and is getting rid of the upscaling comparison in benchmarks, just like the good old days. Leave upscaling where it belongs, in the trash.
No it's not. When I was looking to make a new GPU purchase, I looked for reviews that compared DLSS IQ/Perf on an Nvidia GPU to FSR IQ/Perf on an AMD one.

Seeing how FSR performed on an Nvidia GPU was of no use to me as a consumer.



Not as senseless as your comments that I addressed above.



My eyeballs.



They're both methods of upscaling a lower resolution input to a perceived higher resolution one.



So go, as you're adding nothing to this forum.

DLSS quality mode trashes FSR quality mode, it always has.

DLSS balanced mode generally offers better reconstruction than FSR quality while giving quite a good increase in performance, and if that upsets you then it's your responsibility to teach yourself to get over it.

Look at the attached picture: only a complete idiot with an Nvidia RTX GPU would use FSR 2 over DLSS 2, and only an even bigger idiot can't see why Nvidia owners care about DLSS being used for testing Nvidia cards over FSR 2.

So factoring in the attached screenshot I'll say it again: seeing how FSR 2 performs on an Nvidia GPU is of no use to me as a consumer (and I'm guessing nearly every other Nvidia RTX owner feels the same way).
The problem with your whole post is that you think the world revolves around you and reviewers should cater to your desires. Nothing you said is a fact; you built your argument on "facts" which are actually opinions in disguise. Reviewers don't have to cater to you but, as usual, it's easy to whine and complain. Finally, you arguing that DLSS is better than FSR is the equivalent of arguing in favor of dog shit over cow shit; they're both shit.
 
What should be bannable are people who come into a thread with "shit" words demanding something from other posters instead of using Google for about 10 minutes to educate themselves beyond the tiny sample size of benchmarks they like because they fit their agenda. You, in other words.


Everyone is using different passes. Yet most benchmark results show comparable trends and tendencies, while Steve's are completely different about 50% of the time.
Also, if someone is using a "different pass" which is so different that everyone else's passes give different results, then maybe that "different pass" isn't actually a common occurrence in the game tested and shouldn't be used to represent the game's performance?
Steve's are different 50% of the time? Now I know you're lying. I've watched every HUB video and very few reviewers test as many games as they do. Furthermore, even fewer go back and retest games based on newer drivers, etc. I've also watched every Gamers Nexus video since the RTX 2000 launch, and I watch Jayz2cents, Linus, even the new guy Daniel, and that's how I know you're lying.

I do. Do you?


I agree. You should be banned. Because you haven't provided any evidence to back Steve's claims yet decided to dismiss what I've said simply because you don't like it.

So Steve's video.

"Fun" stuff first

https://www.techspot.com/article/2583-amd-fsr-vs-dlss/ - Here in HUB's own previous article FH5 with DLSS is faster than FSR2, in contrast to HUB's own new video where it's almost on par with FSR2 ¯\_(ツ)_/¯
https://www.techspot.com/review/2464-amd-fsr-2-vs-dlss-benchmark/ - here DLSS is considerably faster than FSR2.
That's about it for DLSS vs FSR2 comparisons from HUB. So based on this they made the decision that DLSS doesn't matter?

Hitman 3 - 1 fps difference in favour of DLSS in Steve's new video
4 fps difference in HUB's own previous test of H3 here:

Spider-Man: Remastered - no difference in latest video
1 to 2 fps difference in favour of DLSS here in HUB's own test:

Death Stranding - 2-3 fps in favour of DLSS
Again, 4-5 fps here:

FH5 - no difference in the new video
1-2 fps faster in their own previous benchmark:

Other benchmarks

Deathloop - DLSS is noticeably faster in every benchmark out there besides HUB's, so "different pass" suddenly doesn't work

SMR
Confirms the first benchmark on HUB - DLSS is 1-2 fps faster
Same is true for SMMM btw: https://www.computerbase.de/2022-11..._nvidia_dlss_24_bildqualitaet_und_performance

Death Stranding
DLSS is about 7 fps faster here instead of Steve's 2-3

Destroy All Humans Remaster
DLSS is slightly faster

Uncharted
DLSS is about on par

The only game where DLSS doing worse than FSR2 can be corroborated by other benchmarks is F1 2022,
likely because DLSS integration in F1 titles has always been rather bad.

This trend is typical. I read through most of the benchmarks that appear out there, and FSR2 being faster on GF is the exception, not the rule.
It is in fact so obvious that most places which did DLSS vs FSR2 benchmarks regularly last year have basically dropped that in favour of testing DLSS on compatible GFs and FSR2 on the rest.

Now the "funniest" part is that Atomic Heart doesn't even support FSR2.
It's using FSR1.
Why did he even include it in the latest video? To make his point stronger or something?


What things?
GFs could run FSR2 math on the tensor units in parallel with FP32 shading and, if optimized properly, likely hide the majority of FSR2's cost that way - which no one will do, of course.
Radeons run it on the main SIMDs, eating cycles otherwise spent on graphics rendering.
It's not equal no matter how you slice it.
My guy, you're bringing 1-2 fps "differences" as the pillar of your argument? You're being intentionally obtuse. Go watch HUB's latest video where they compare FSR vs DLSS on the 4070 Ti. FSR is also faster by 1-2 fps and guess what? No one cares. Using statistically insignificant data points as the basis of your terrible argument makes your argument terrible. I also like how you selectively cherry-picked HUB's data to confirm your own bias. As if you know their data better than they do, smh.

Finally, I like how you moved the goalposts from how GPUs perform when using the same upscaling algorithm to how the GPUs handle each algorithm. All your arguments are in bad faith. Wake me up when you're interested in having an intellectually honest discussion. Until then, don't @ me.
 
Since I can't edit my original post, I'll just post this as a follow-up. Just so everyone understands the irrelevance of DLSS and FSR: according to Nvidia, DLSS is supported in approximately 270 games and apps. FSR support is even lower. On Steam today, there are approximately 30k games. That means DLSS-supported games account for less than 1% of games on Steam. DLSS and its equivalents will fade into irrelevance once GPUs become fast enough to run whatever new solution devs have come up with. We've seen it before and we'll see it again. All this noise about statistically irrelevant technology, smh.
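As a rough sanity check on that arithmetic (the 270 and 30k figures are the approximations quoted above, not exact counts):

```python
# Rough share of Steam games with DLSS support, using the approximate
# figures quoted above (not exact counts).
dlss_titles = 270        # approx. games/apps with DLSS, per Nvidia
steam_titles = 30_000    # approx. games listed on Steam

print(f"{dlss_titles / steam_titles:.1%}")  # -> 0.9%, i.e. under 1%
```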
 
I own an RTX 4090 and DLSS is useless to me because I think it's trash. I specifically purchased a 4090 to avoid it because I consider upscaling to be an artifact-ridden crutch to compensate for underpowered hardware. Seeing how it performs is a waste of time, and even more so with DLSS 3, which inserts fake frames that have nothing to do with the game. So whose opinion is more important, yours or mine? Should reviewers cater to you or me when attempting to compare different GPUs? Thankfully HUB has finally seen the light and is getting rid of the upscaling comparison in benchmarks, just like the good old days. Leave upscaling where it belongs, in the trash.
All reviews already do cater to you as all reviews test at native.

What I'm asking for is simple, if you're going to test upscaling then make it relevant and test what people actually use.

The problem with your whole post is that you think the world revolves around you and reviewers should cater to your desires. Nothing you said is a fact; you built your argument on "facts" which are actually opinions in disguise. Reviewers don't have to cater to you but, as usual, it's easy to whine and complain. Finally, you arguing that DLSS is better than FSR is the equivalent of arguing in favor of dog shit over cow shit; they're both shit.
The problem with you is you hate upscaling and get triggered every time it gets brought up.

And if reviewers want people to watch their videos or read their review articles, they'd absolutely better cater to those who need/use upscaling because, like it or not, upscaling is not going away.

You've literally added nothing to these discussions and looking at your posting history you've contributed very little other than whining about upscaling.

DLSS and its equivalents will fade into irrelevance once GPUs become fast enough to run whatever new solution devs have come up with. We've seen it before and we'll see it again. All this noise about statistically irrelevant technology, smh.

Please explain where we've seen temporal/AI based upscaling before and when it went away before coming back.
 
I'm done in this thread now and muted BitByte as I have enough people living under bridges on Twitter taking up my time without having them on here too.
 
Testing an Nvidia GPU with DLSS vs an AMD GPU with FSR2 would be equally valid because that's how the GPUs are likely to be used. As long as it's clear that's what's being tested, then there's nothing wrong.

Testing fsr2 vs fsr2 is also fine because it’s keeping more things equal.

Ultimately they’re both subjective decisions.
There are certainly valid arguments for either approach. As we've seen, this can lead to some impassioned advocacy.

That said, I think that there is a narrative that drives the pattern of choices that HUB has been making.
It seems to me that they are quite keen to de-emphasize qualitative differences, in order to reiterate a core message of their recent reviews: Everything is just way too bloody expensive, and we need to reward the cheaper choice, for the benefit of everyone.

Nothing wrong with that. But I personally prefer looking at reviews that take a less reductive approach and perhaps bring a bit more enthusiasm for the subject matter.
 
Tell me which reviewer is declaring a 1-2 percent difference a win? It's not Gamers Nexus, it's not Hardware Unboxed, it's not Linus Tech Tips. Who is making these declarations? Because as far as I'm concerned, this is a classic slippery slope fallacy. Instead of assuming that people are incapable of basic analysis, maybe give them the benefit of the doubt that they've done their homework.

The article at the start of this thread does a performance summary based on a highly questionable selection of games/settings, which concludes in the header of that summary that the XTX is 1% slower than the 4080 at 1440p and 1% faster at 4K. In the video commentary HUB does state they are "essentially the same performance", so they're certainly not making a big deal out of it. But they are highlighting it, which allows people to then go and use those charts to support a certain narrative. So just a 2% swing in this case changes the entire narrative of that chart - arguably why CoD: MW2 is in there twice at different settings.
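To illustrate how little it takes to flip a summary like that, here's a rough sketch of the kind of aggregation those headline percentages typically come from - a geometric mean over per-game ratios, with made-up numbers for illustration (not HUB's data):

```python
from math import prod

# Hypothetical per-game averages in fps (XTX, 4080) - purely illustrative.
games = {
    "Game A": (142, 138),
    "Game B": (97, 101),
    "Game C": (120, 123),
    "Game D": (88, 84),
}

ratios = [xtx / rtx for xtx, rtx in games.values()]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"XTX vs 4080: {geomean - 1:+.1%}")

# Swapping one borderline title, or one title's settings, can easily move
# this headline figure a couple of percent either way - and with it the
# "which GPU is faster" narrative.
```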

Again, I'm not suggesting HUB or anyone else actually tests every game in both FSR and DLSS and then only selects the faster of the two. But using DLSS across the board on NV (whether that results in an overall gain or loss in performance) is far more reflective of how the GPUs would be used in real life. Arguably each test should then be supported by a side-by-side / slider screenshot comparing the image quality you get from each solution. In fact, generally speaking, a comparison slider with a drop-down on each side, allowing the user to customise the comparison inclusive of native and all quality modes for DLSS/FSR, and supported by benchmarks at all those settings, would be the perfect way to do this.

That would allow users to compare the performance of DLSS Balanced vs FSR Quality for example based on their own subjective analysis which absolutely can be a valid comparison point, but I understand why review sites shouldn't default to it.
 
Native resolution rendering is an archaic concept and isn’t even an accurate description of how games work today. Lots of stuff already happens at lower than native resolution (shadow maps, reflections, GI).

With 8K on the horizon upscaling will only become more prevalent. UE5 performance targets assume upscaling is enabled. DRS, VRS, DLSS, FSR etc are only going to get better as engines are built from the ground up with those features in mind.

Any reviewer not named Digital Foundry will have a tough time as different tiers of hardware will achieve the same output resolution and performance and the difference will be how much DRS kicks in. I fully expect this to become the norm on PC in the future, just like it is today on consoles.
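For anyone unfamiliar with how DRS behaves in practice, here's a minimal sketch of the feedback loop it implies (a toy controller, not any particular engine's implementation):

```python
# Toy dynamic-resolution-scaling controller: nudge the render scale so the
# GPU frame time converges on a target. Not any engine's real implementation.
def update_render_scale(scale, gpu_frame_ms, target_ms=16.7,
                        step=0.05, lo=0.5, hi=1.0):
    if gpu_frame_ms > target_ms:            # too slow -> render fewer pixels
        scale -= step
    elif gpu_frame_ms < target_ms * 0.9:    # comfortably fast -> render more
        scale += step
    return max(lo, min(hi, scale))

# Two different GPUs hitting the same 60 fps target simply settle at different
# render scales - which is exactly what makes like-for-like benchmarking hard.
```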
 
The article at the start of this thread does a performance summary based on a highly questionable selection of games/settings, which concludes in the header of that summary that the XTX is 1% slower than the 4080 at 1440p and 1% faster at 4K. In the video commentary HUB does state they are "essentially the same performance", so they're certainly not making a big deal out of it. But they are highlighting it, which allows people to then go and use those charts to support a certain narrative. So just a 2% swing in this case changes the entire narrative of that chart - arguably why CoD: MW2 is in there twice at different settings.

Again, I'm not suggesting HUB or anyone else actually tests every game in both FSR and DLSS and then only selects the faster of the two. But using DLSS across the board on NV (whether that results in an overall gain or loss in performance) is far more reflective of how the GPUs would be used in real life. Arguably each test should then be supported by a side-by-side / slider screenshot comparing the image quality you get from each solution. In fact, generally speaking, a comparison slider with a drop-down on each side, allowing the user to customise the comparison inclusive of native and all quality modes for DLSS/FSR, and supported by benchmarks at all those settings, would be the perfect way to do this.

That would allow users to compare the performance of DLSS Balanced vs FSR Quality for example based on their own subjective analysis which absolutely can be a valid comparison point, but I understand why review sites shouldn't default to it.
I get you, but as far as I'm concerned it's a non-issue. Look at the reddit link below. On average, reviewers say the 7900 XTX is faster than the 4080, and HUB's data falls in line with that. Again, the aggregate is in the 1-2% range but, like I said, 1-2% is statistically insignificant. Furthermore, the 4080 is significantly faster when it comes to ray tracing, so I don't think a majority of people would pick the 7900 XTX based on a 1-2% difference. Yeah, HUB could run all Nvidia GPUs with DLSS but, given the number of GPUs they test, it's a lot of extra work. Like I said in an earlier post, DLSS- and FSR-supported games represent less than 1% of games on Steam. Why should any reviewer spend so much effort on the varying technologies when it's basically statistically irrelevant? I get it if they're doing an in-depth dive on a game, like they did for Hogwarts Legacy, but for relative performance I don't really know that it's worth all the effort.

 
Native resolution rendering is an archaic concept and isn’t even an accurate description of how games work today. Lots of stuff already happens at lower than native resolution (shadow maps, reflections, GI).

With 8K on the horizon upscaling will only become more prevalent. UE5 performance targets assume upscaling is enabled. DRS, VRS, DLSS, FSR etc are only going to get better as engines are built from the ground up with those features in mind.

Any reviewer not named Digital Foundry will have a tough time as different tiers of hardware will achieve the same output resolution and performance and the difference will be how much DRS kicks in. I fully expect this to become the norm on PC in the future, just like it is today on consoles.

Yep, I'm getting some pretty insane IQ improvements by downsampling from 4K to 1440p and using DLSS.

Dying Light 2 shows a much clearer and sharper image than even native 1440p, especially in the fine grass and trees in the distance.
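For context, here's the resolution chain that combination implies, using the commonly published DLSS Performance scale factor (my reading of the setup based on the attached screenshots, so treat the exact pipeline as an assumption):

```python
# "4K output via DLSS Performance, downsampled to a 1440p display" vs plain
# native 1440p. Scale factors are the commonly published ones; illustrative.
def pixels(w, h):
    return w * h

internal = pixels(1920, 1080)   # DLSS Performance renders at 50% of 4K per axis
dlss_out = pixels(3840, 2160)   # temporally reconstructed 4K output
display  = pixels(2560, 1440)   # then downsampled to the 1440p panel

print(internal / display)   # ~0.56x the shaded pixels of native 1440p...
print(dlss_out / display)   # ...but 2.25x the samples feeding the downscale
```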
 

Attachments: 4k DLSS Performance mode.png, Native 1440p.png