Value of Hardware Unboxed benchmarking *spawn

The problem with a lot of review benchmarks is that they don't disclose the test scene, and they pick scenes that are consistently repeatable rather than demanding ones. I'm not sure if the little picture-in-picture of game footage in the top left of HUB reviews is the test scene or not.
 
It looks like these differences are seen specifically in CPU limited titles where AMD's utilization remains much higher.

Has any other site covered this?

Probably not. Most review sites are looking at performance at ultra settings. There aren't many reviewers like Optimum Tech who actually play games at a high level and are interested in how hardware behaves at low settings.
 
It looks like these differences are seen specifically in CPU limited titles where AMD's utilization remains much higher.

Has any other site covered this?

KitGuru has put every GPU through their test course: https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-7900-xtx-review/27/
ComputerBase has numbers at 60Hz and 144Hz.

There are still a lot of other differences between these GPUs, and yet you won't find anything about them on certain YouTube channels.
 
KitGuru has put every GPU through their test course: https://www.kitguru.net/components/graphic-cards/dominic-moass/amd-rx-7900-xtx-review/27/
ComputerBase has numbers at 60Hz and 144Hz.

There are still a lot of other differences between these GPUs, and yet you won't find anything about them on certain YouTube channels.

Doesn't really look like the same thing. Those are Ultra/Extreme/High presets. In the Optimum Tech video I think he's showing settings that give around 30-40% utilization on the Nvidia GPU, which is where this power difference really shows up. That's because he plays games competitively, so he's on low settings for better visual clarity when spotting enemies and so on. It would have been nice if he had shown clocking behaviour and voltage. I'm guessing the AMD GPUs just can't bring their voltage or frequency down as far when the performance isn't needed.
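If you want to check the clock and power behaviour yourself, a minimal sketch along these lines would log it while a game runs. It assumes an Nvidia card with nvidia-smi on the PATH; core voltage isn't exposed through that interface, so only utilization, clocks and power draw are captured.

```python
# Minimal telemetry logger: samples GPU utilization, clocks and power draw once
# per second via nvidia-smi and writes CSV to stdout. Assumes nvidia-smi is on
# the PATH; voltage is not available through this query interface.
import csv
import subprocess
import sys
import time

FIELDS = "timestamp,utilization.gpu,clocks.gr,clocks.mem,power.draw"

def sample() -> list[str]:
    """Return one row of GPU telemetry as a list of strings."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return [field.strip() for field in out.stdout.strip().split(",")]

if __name__ == "__main__":
    writer = csv.writer(sys.stdout)
    writer.writerow(FIELDS.split(","))
    try:
        while True:
            writer.writerow(sample())
            time.sleep(1.0)
    except KeyboardInterrupt:
        pass
```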
 
TechPowerUp's GPU reviews show related power discrepancies between Ada and the 7900 series in low GPU utilization scenarios (see the 60Hz results). For example:

https://www.techpowerup.com/review/asus-geforce-rtx-4060-strix-oc/39.html

I don't believe I've seen any in-depth analysis of the cause of this, or of the situation observed by Optimum Tech, but the chiplet structure seems a likely candidate, as @DavidGraham pointed out. Then again, the Arc A770/A750 cards burn a lot of juice at 60Hz too, and they aren't chiplets.
 

"We demand more VRAM, give us more VRAM, where is our VRAM?!?!?"
"Here it is"
"...A $500 joke!"

Yeah, VRAM isn't free. Steve doesn't know this or something?
 
Got your take on this, good can set my watch now
"My take on this" seems to be similar to, say, Intel's take on price difference between an 8 and 16GB A770.
I do wonder why it's so hard for people like Steve to see that increasing VRAM size = increasing price, and that outside of four (five?) benchmarks crafted specifically to showcase the issues appearing due to a lack of VRAM it would just be a complete detriment to the perf/price positioning of a product.

That being said, the 16GB version of the 4060 Ti will likely be one of the longest-relevant "cheap-ish" GPUs on the current market, especially if it drops by some $100 over time.
 
"My take on this" seems to be similar to, say, Intel's take on price difference between an 8 and 16GB A770.

And they think a decent value proposition for the 4060 Ti would be 8GB at $300, with an upper limit of $400 for 16GB (which is quite a bit more than the ASRock 16GB A770, though that is a fair bit below MSRP, to be fair). So the same retail price as the A770 models now; looks like you're in agreement then.

and that outside of the four (five?) benchmarks crafted specifically to showcase issues caused by a lack of VRAM, it would just be a complete detriment to the perf/price positioning of a product.

16GB is less relevant in a product of this class, yes, but that's the only option Nvidia had for increasing the VRAM given the bus-width decisions they took. Nvidia is 'hamstrung' by design decisions of their own making.
 
And they think a decent value proposition for the 4060 Ti would be 8GB at $300, with an upper limit of $400 for 16GB. So the same retail price as the A770 models now; looks like you're in agreement then.
So they think that a new product using a chip made on the "4N" process (whether that's closer to N5 or N4 doesn't really matter) should have the same retail price at launch as a product from a year ago using a GPU of similar complexity made on N6? And they base this line of thought on what exactly, besides the obvious "we want that" argument?
 
So they think that a new product using a chip made on the "4N" process (whether that's closer to N5 or N4 doesn't really matter) should have the same retail price at launch as a product from a year ago using a GPU of similar complexity
Hey, you're the one who brought Arc into the conversation. The point was that even they think a $100 premium for 8GB more VRAM would be acceptable; the issue is the starting price of the 4060 Ti, a card that shows little to even negative improvement over the previous-gen product at the same price.

GDDR6, by the way, is also considerably cheaper than it was a year ago, by all reports. It's not the only cost factor in adding more VRAM to a card, of course, but it is a rather significant one.
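To put rough numbers on how significant, here's a back-of-the-envelope sketch; the per-GB prices are placeholder assumptions, not any known AIB contract pricing:

```python
# Back-of-the-envelope memory cost for adding 8GB of GDDR6 to a card.
# The $/GB figures below are placeholder assumptions (rough spot-price ballpark),
# not what AIBs actually pay under contract, which is not public.
def extra_vram_cost(extra_gb: float, usd_per_gb: float) -> float:
    """Memory-chip cost only; ignores PCB changes, power delivery, margins."""
    return extra_gb * usd_per_gb

if __name__ == "__main__":
    for usd_per_gb in (3.0, 4.0, 6.0):  # assumed range, not reported pricing
        cost = extra_vram_cost(8, usd_per_gb)
        print(f"+8GB at ${usd_per_gb:.0f}/GB -> ~${cost:.0f} in memory chips")
```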

besides the obvious "we want that" argument?

They're reviewing a new product marketed to consumers and seeing if it's worth the asking price vs the previous generation.
 
Hey, you're the one who brought Arc into the conversation. The point was that even they think a $100 premium for 8GB more VRAM would be acceptable; the issue is the starting price of the 4060 Ti, a card that shows little to even negative improvement over the previous-gen product at the same price.
I brought up the price difference between models with an 8GB difference in memory capacity, nothing else. That difference seems awfully similar between the A770 and the 4060 Ti.

GDDR6, by the way, is also considerably cheaper than it was a year ago, by all reports. It's not the only cost factor in adding more VRAM to a card, of course, but it is a rather significant one.
We don't know how much AIBs - likely the biggest GDDR customers on the market - pay for their memory chips. It is unlikely to be the spot price at any given point, though.

They're reviewing a new product marketed to consumers and seeing if it's worth the asking price vs the previous generation.
So is it? The previous generation had an 8GB 3070 at $500, which is basically what a $400 4060 Ti 8GB is now. (That's if we forget for a moment that you couldn't get those 3070s anywhere near that price for the majority of their lifespan.)
If you're a 3070 owner, then it isn't worth upgrading at the same price. I don't see anything new here; it rarely is. Most upgrades below the very top end only make sense for people using GPUs that are two or more generations old.
But if you're not a 3070 owner and are simply looking to buy something at $500, then which product is better here - this "$500 joke" or a good old $500 3070? Or maybe something else entirely?
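For what it's worth, the perf-per-dollar arithmetic being argued over looks something like the sketch below; the performance indices are placeholders just to show the math, not benchmark results:

```python
# Perf-per-dollar arithmetic sketch. The relative performance indices are
# placeholders (100 = arbitrary baseline), NOT measured benchmark results.
from dataclasses import dataclass

@dataclass
class Card:
    name: str
    price_usd: float
    perf_index: float  # relative performance, placeholder value

    @property
    def perf_per_dollar(self) -> float:
        return self.perf_index / self.price_usd

cards = [
    Card("RTX 3070 8GB at an assumed $500", 500, 100),  # baseline index
    Card("RTX 4060 Ti 8GB at $400 MSRP", 400, 100),     # placeholder: treated as ~3070-class
    Card("RTX 4060 Ti 16GB at $500 MSRP", 500, 100),    # same chip, more VRAM
]

for card in sorted(cards, key=lambda c: c.perf_per_dollar, reverse=True):
    print(f"{card.name:35s} {card.perf_per_dollar:.3f} perf/$")
```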
 
Thanks to Reddit, these are a few of their latest topics:

Is 8GB Of VRAM Enough?
VRAM: Did Nvidia Mislead Gamers? RTX 4060 Ti, Worst GPU Release Ever?
The Last of Us Part I, RIP 8GB GPUs! Nvidia's Planned Obsolescence In Effect
Was 8gb of VRAM actually enough for NVIDIA's RTX 3070 and RTX 3080?
VRAM Issues, Crashing & Predicting The Future
Now look at the thumbnail of the 4060 Ti 16GB video. I think that says enough. This channel doesn't care about objective reviews anymore.
 
Now look at the thumbnail of the 4060 Ti 16GB video. I think that says enough. This channel doesn't care about objective reviews anymore.

Unlike other outlets, which have praised the 4060 series, right?

Also, a "few of the last topics" - that selection of videos is spread out over 4+ months. Like really, what is the point here - they said 8GB was limiting, so they should automatically praise a 16GB overpriced variant when it arrived? They also said during their VRAM-doom videos that 8GB is fine in 2023...for entry level cards. The problem is the price, something DF agrees with.

Digital Foundry said:
But the $100 price premium - is shocking.
<snip>
When really this is what the $400 offering should have looked like to begin with.


If you really think this card will garner anything other than mocking, you're deluded. You can hate on the clickbait thumbnails all you want, sure, and their presentation of 8GB VRAM cards as 'crippled' didn't give the full picture and perhaps pulled pressure away from where it should have been focused with some releases, but they are not outliers in the overall gist of their 4060-series coverage - Richard from DF literally came to the exact same conclusion they did. There is nothing hypocritical in being disappointed with the stagnation in VRAM limits for the midrange and then also decrying the 'solution' being an offering that's priced out of the midrange.
 
Everyone knows that the price of the 16GB model is not good. But when your YouTube channel has been actively lobbying for more than 8GB, I think price is not the most relevant point.
 