Value of Hardware Unboxed benchmarking

To me it comes across as a bit unprofessional.
It is. Gamers Nexus is mostly benchmarking games in 1440p and 4K. A few ray tracing benchmarks, a little bit of DLSS, no FG at all.

It isn't a product review, it's just bashing. I think it's time for companies like NVIDIA to move on and stop sampling these channels. There is no value here. Nothing. You don't learn anything.
 
The tone of 4060 Ti YouTube reviews seems weird to me, as if people were expecting some miracle they'd somehow dreamed up for themselves, one that couldn't happen, and got utterly disappointed when it didn't.
First warning sign: it's suddenly supposed to be a 1080p card, while the 3060 Ti was marketed as a 1440p card.
But let's look at some numbers, using TPU's 1080p results:
4060 Ti: +12.4% compared to 3060 Ti, doesn't beat 3070
3060 Ti: +31.6% compared to 2060 Super, +26.6% compared to 2070, beats even 2080 Super
2060 Super: +81.8% compared to 1060, +12.4% compared to 1080, even 2060 beats 1080 (barely, but still beats)

Starting to see why it's such a big disappointment? Previous updates in a similar performance range have been big, beating GPUs one to two tiers higher from the previous generation. The 4060 Ti doesn't.
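If you want to sanity-check how these figures chain together, here's a minimal sketch. The index values are illustrative stand-ins, not TPU's actual relative-performance data; only the +12.4% ratio is taken from the numbers above:

```python
# Gen-on-gen uplift from relative-performance indices (TPU-style).
# Index values are illustrative placeholders, not TPU's actual data;
# only the 4060 Ti / 3060 Ti ratio matches the +12.4% quoted above.
index_1080p = {
    "3060 Ti": 100.0,
    "3070":    114.0,  # placeholder: roughly a tier above the 3060 Ti
    "4060 Ti": 112.4,  # +12.4% over the 3060 Ti
}

def uplift(new: str, old: str) -> float:
    """Percentage gain of `new` over `old` at 1080p."""
    return (index_1080p[new] / index_1080p[old] - 1.0) * 100.0

print(f"4060 Ti vs 3060 Ti: {uplift('4060 Ti', '3060 Ti'):+.1f}%")  # +12.4%
print(f"4060 Ti vs 3070:    {uplift('4060 Ti', '3070'):+.1f}%")     # negative
```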
 
I don’t mind some emotion or investment as long as it doesn't compromise the data, and it’s about good PC products in general rather than being focused on a particular vendor. Reviews by people with no passion, or who don't even play games, wouldn't be an improvement.
 
It is. Gamers Nexus is mostly benchmarking games in 1440p and 4K. A few ray tracing benchmarks, a little bit of DLSS, no FG at all.

It isn't a product review, it's just bashing. I think it's time for companies like NVIDIA to move on and stop sampling these channels. There is no value here. Nothing. You don't learn anything.
Reviews are supposed to be apples to apples first; the rest is a bonus and can come later too. As for their choice of resolutions, is this different from their previous reviews? Or do you just think NVIDIA should dictate the resolution? The 3060 Ti was 1440p, but now the same class is supposed to go back to 1080p?

edit:
Our YouTube channel ran a quick poll, 1000+ answers. 40% didn't know what DLSS 2/3 or FSR/2 is. Of those who knew what they are, only 20% use them by default, 45% only if their FPS drops too low otherwise, and 35% never use them.
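To put those percentages in absolute terms, a quick sketch (assuming exactly 1000 respondents; the poll only says 1000+, so these counts are approximate):

```python
# The usage split (20/45/35) applies only to the subset who knew
# what the techniques are, not to all respondents.
respondents = 1000
unaware = int(0.40 * respondents)   # didn't know what DLSS 2/3 or FSR/2 is
aware = respondents - unaware

print(f"unaware of DLSS/FSR:    ~{unaware}")             # ~400
print(f"use by default:         ~{int(0.20 * aware)}")   # ~120 of 1000
print(f"only when FPS too low:  ~{int(0.45 * aware)}")   # ~270 of 1000
print(f"never use them:         ~{int(0.35 * aware)}")   # ~210 of 1000
```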
 
I think it's time for companies like NVIDIA to move on and stop sampling these channels. There is no value here. Nothing. You don't learn anything.

I would love to see Nvidia do that actually. I'm all for more comedy.

As for 'not learning anything': you learn that Nvidia's marketing claim, that its cache makes its 'effective' bandwidth superior to the previous wider-bus design, is far more nuanced (at best) than Nvidia would like you to believe. I like reviews that directly target companies' marketing claims myself.
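To illustrate why it's nuanced, here's a rough first-order sketch (my own numbers, not Nvidia's official figures): a larger L2 only multiplies bandwidth to the extent the workload actually hits the cache.

```python
# First-order model of cache-amplified "effective" bandwidth: if a
# fraction `hit` of memory requests is served from L2, DRAM only sees
# (1 - hit) of the traffic, so the sustainable request bandwidth is
# raw_bw / (1 - hit). The hit rate is entirely workload-dependent.
raw_bw = 288.0  # GB/s, roughly the 4060 Ti's raw DRAM bandwidth

for hit in (0.0, 0.30, 0.45, 0.60):
    effective = raw_bw / (1.0 - hit)
    print(f"L2 hit rate {hit:4.0%}: effective ~{effective:5.0f} GB/s")

# The catch: once the working set spills the L2 (large textures, higher
# resolutions), the hit rate drops and "effective" bandwidth slides back
# toward the raw 288 GB/s, well below the 3060 Ti's 448 GB/s raw figure.
```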

Reviews are supposed to be apples to apples first; the rest is a bonus and can come later too. As for their choice of resolutions, is this different from their previous reviews? Or do you just think NVIDIA should dictate the resolution? The 3060 Ti was 1440p, but now the same class is supposed to go back to 1080p?

Very rude of reviewers to deviate from Nvidia's designed framing!
 
Actually, you can go to AMD's site and see benchmarks with NVIDIA GPUs: https://community.amd.com/t5/gaming/building-an-enthusiast-pc/ba-p/599407?sf265824152=1

No different from Gamers Nexus' "reviews". Same value.
Reviews are supposed to be apples to apples first; the rest is a bonus and can come later too. As for their choice of resolutions, is this different from their previous reviews? Or do you just think NVIDIA should dictate the resolution? The 3060 Ti was 1440p, but now the same class is supposed to go back to 1080p?

edit:
Our YouTube channel ran a quick poll, 1000+ answers. 40% didn't know what DLSS 2/3 or FSR/2 is. Of those who knew what they are, only 20% use them by default, 45% only if their FPS drops too low otherwise, and 35% never use them.
So your viewers don't know what DLSS 2 is? Maybe, just maybe, you could educate them? That would have a positive effect for gamers.

And "apples to apples" doesnt make sense when new GPUs come to the market. But you can show how these new features work, what they can provide and what the drawbacks are.

/edit:
Computerbase did a whole Frame Generation test with "The Witcher 3 Next Gen": https://www.computerbase.de/2023-05...itt_nvidia_dlss_3_in_the_witcher_3_untersucht

This has real value; you can learn something from it. You can play TW3 NG at 90 FPS on a 4060 Ti in 1440p with DLSS 2 Quality and FG, and with lower latency than with DLSS 2 Quality alone without Reflex. A 7900 XTX (355 W, $999) gets less performance at native 1440p and will have higher latency, too. Or you can go to AMD.com and read a blog about how great AMD is, because that is exactly the same information you get from Gamers Nexus.
 
First warning sign: it's suddenly supposed to be a 1080p card, while the 3060 Ti was marketed as a 1440p card.
Why is that a warning sign? Two years have passed and requirements have risen. The cards are pretty close in processing power, and with the VRAM being the same too, the "sweet spot" resolution has fallen.

Starting to see why it's such a big disappointment?
No? You can't judge a new product on the gains previous products have shown. You should judge it based on what's available on the market right now and how it compares to that. Otherwise we're exactly in the territory of dreaming up something which couldn't happen, like expecting a new chip in 2023 to somehow bend the laws of physics and deliver gains similar to what chips from the 2010s showed on far less advanced node changes.
 
It is. Gamers Nexus is mostly benchmarking games in 1440p and 4K. A few ray tracing benchmarks, a little bit of DLSS, no FG at all.

It isn't a product review, it's just bashing. I think it's time for companies like NVIDIA to move on and stop sampling these channels. There is no value here. Nothing. You don't learn anything.

I don’t agree with this. The content is overall fine. I don’t think their conclusions are misrepresenting anything.

I don’t want to see these review channels disappear. I’d just prefer a more professional presentation. The Gamers Nexus one isn’t too bad. My overall impression is that the review channels are getting whiny. Outrage sells, so I expect it to continue regardless of the product. I just don’t see the same attitude in other review spaces, except for film and music. I think they believe they’re sticking it to the big companies, but I just don’t like the delivery.
 
I don’t agree with this. The content is overall fine. I don’t think their conclusions are misrepresenting anything.

I don’t want to see these review channels disappear. I’d just prefer a more professional presentation. The Gamers Nexus one isn’t too bad. My overall impression is that the review channels are getting whiny. Outrage sells, so I expect it to continue regardless of the product. I just don’t see the same attitude in other review spaces, except for film and music. I think they believe they’re sticking it to the big companies, but I just don’t like the delivery.
Which review came off as the whiniest to you?
 
Entertainment content is new every week, and it's so diverse that you can't hate everything. :D
A GPU is a GPU. It's the same one a week later.

The problem with these reviews is that they only present subjective "facts". With Frame Generation, latency is suddenly a new concern for them, but you only see them talk negatively about it in combination with Frame Generation. Where are the latency measurements comparing NVIDIA and AMD? You see 4K benchmarks with a 4060 Ti, and the GPU gets called an "embarrassment" because it doesn't deliver 60 FPS at $399. But where are the DLSS Performance benchmarks and quality comparisons?

A lot is missing. These reviewers are just picking settings to bash NVIDIA.
 
So your viewers don't know what DLSS 2 is? Maybe, just maybe, you could educate them? That would have a positive effect for gamers.
We have educated them, we do additional tests with them in each review, etc. These high-end enthusiast forums can easily cloud one's judgment of what the general gaming public is actually like and what they know and care about, even when educated.
And "apples to apples" doesnt make sense when new GPUs come to the market. But you can show how these new features work, what they can provide and what the drawbacks are.
Of course apples to apples makes sense; why wouldn't it? What I mean by the rest being a bonus that can come later is just that: they're new features and can and should be introduced, but hardware reviews should always start from apples-to-apples comparisons. Otherwise we're on a slippery slope where whoever dares to go lowest wins.
(and yes, we do have DLSS 3 tests in our 4060 Ti review too)
 
I think RX 7600 reviews are due tomorrow? So it will be interesting to see if anything changes from the review perspective.
 
I think RX 7600 reviews are due tomorrow? So it will be interesting to see if anything changes from the review perspective.
It's not aimed against the 4060 Ti, but based on known specs it will be just as disappointing an upgrade over last gen (Navi 23).
 
And yet they've stopped talking about TLOU since the VRAM issue was addressed.

Crazy huh?
No, they literally went over it again in their new video and showed how 8GB is still a liability in this game and others.

Y'all keep proving how unreasonable y'all are in your attempt to paint HUB as being unreasonable. It's embarrassing.
 
Except it isn't, and it's embarrassing how they still can't test properly and come to correct conclusions.
You're just hoping deliberately false claims will slip past those not paying attention or something, but everything you're saying is garbage. They have made a very clear, objective case for why 8GB isn't enough at this tier of product. The only people arguing otherwise are those with an agenda.

As is often the case, your accusations are just a form of projection.
 
You're just hoping deliberately false claims will slip past those not paying attention or something, but everything you're saying is garbage. They have made a very clear, objective case for why 8GB isn't enough at this tier of product. The only people arguing otherwise are those with an agenda.

As is often the case, your accusations are just a form of projection.

Stop going off on a tangent; we're discussing TLOU here, so stick to it.

The game is now perfectly playable with 8GB of VRAM, and any stutters are now caused by the latest update doing shader compilation during gameplay, whereas it never used to.

They're absolute clowns.

But as I asked them on Twitter before they ran away.

Name all the games an 8GB GPU can't run at 60fps due to VRAM.........go...
 
Stop going off on a tangent; we're discussing TLOU here, so stick to it.

The game is now perfectly playable with 8GB of VRAM, and any stutters are now caused by the latest update doing shader compilation during gameplay, whereas it never used to.

They're absolute clowns.

But as I asked them on Twitter before they ran away.

Name all the games an 8GB GPU can't run at 60fps due to VRAM.........go...
We are not just discussing TLOU. That's literally my whole point here. You want to limit the discussion to TLOU so it makes it seem like that's all HUB has talked about, but it's not at all. Which ruins your entire argument.

Not that TLOU is some winning argument for people like you trying to make the case that 8GB is perfectly fine, either.

The audacity of calling anybody else 'clowns' is truly just cringe-worthy.
 
You want to limit the discussion to TLOU so it makes it seem like that's all HUB has talked about, but it's not at all.

Well yes, since that's what I replied to; you just don't like it because I'm right.

Do you think it's OK to test GPUs with unrealistic settings and claim it's a VRAM problem when the GPUs simply can't handle the settings?

And at playable settings for 60fps, 8GB is fine for 1080p.

Prove me wrong.
 
No, they literally went over it again in their new video and showed how 8GB is still a liability in this game and others.

Unless they went into more detail than they did at the start of the video's dedicated VRAM chapter, they only tested it at the Ultra preset.

At least on my 3060, selecting the Ultra preset sets the texture streaming rate to "Fastest", which is the same level of streaming as at release. The main benefit of the new patches as it relates to VRAM was twofold:

1) The large increase in quality for medium textures, making them a viable option now: drastically lower VRAM use, while the new quality preserves much of the game's artistic vision.
2) The new streaming options, specifically "Fastest" (original), "Fast", and "Normal". Each of these significantly lowers VRAM usage without reducing texture quality, at the expense of potential pop-in (really only visible at certain points with "Normal", in my experience).

So basically, they ignored the very feature the patches introduced that significantly lessens the VRAM required. It's fine to test at Ultra and report the results, but to do that while ignoring the improvements the patches provide is disingenuous, at least for this game. You're not representing how the game actually runs on an 8GB card now.

Also, with A Plague Tale: with RT shadows enabled, I get the same massive stutters they get, even though when running at 4K, DLSS Quality, and every other setting at medium (console settings) to basically shoot for "4K" 30fps, I'm using under 8GB of my available 12GB of VRAM. So it's not certain that it's actually a VRAM issue; it's potentially more of a screwed-up RT implementation. Again, it's perfectly valid to report, especially at 1080p, since for $400 you should expect 1080p Ultra, but you're not giving the full context of the sacrifices an 8GB card owner may have to make in games now.

I don't have much disagreement with their overall review of the 4060 Ti, mind you, but I would not use them as the sole source for what running an 8GB card in 2023 actually means, at least in terms of what you're giving up. As they say, you shouldn't have to give up anything with a card in this price tier compared to similarly priced consoles, and you may very well have to do that with some texture quality settings, now and in the future. That's still 'very bad', but it's not quite as catastrophic as their benchmarks would indicate. They should provide a little more context.
 