Value of Hardware Unboxed benchmarking

That in itself is disingenuous as well though.
We have no indication that "an architectural overhaul" would lead to anything being better. There are no better architectures on the market; on the contrary, everyone is chasing Nvidia's h/w.
And while moving to a more advanced node would allow for some power savings, it is highly likely that the same complexity increase would in fact cost even more to the end buyer, making perf/price on the 5090 even worse.

Yeah it’s more wishful thinking comparing reality against some better imaginary outcome. That’s why things will be a whole lot more interesting if AMD can significantly improve performance per CU on 4nm with RDNA 4. Then we can compare an actually better architecture and not fantasy.
 
Well that's not actually some universal truth at all, but regardless, it changes nothing about consumers being entirely right to complain.

Something being better for a company's financials doesn't mean that consumers should just swallow it and be ok with it. Do you realize how utterly absurd that line of thinking is?

You're basically saying, "So long as it helps make a company more money, you have no right to complain as a consumer". I can't stress enough how utterly bizarre this forum is in pushing this line of thinking constantly when it comes to defending Nvidia. Absolutely bonkers.
I never said anything about consumers not having a right to complain.

1.) Anyone can complain about anything at any time. Death, taxes, Donald Trump getting elected, etc.
2.) The addressable failure is a lack of competition in the graphics card market.
3.) Claiming the failure is caused by companies seeking profit is essentially saying the problem is that we live in a capitalistic economic system, which seems to me like complaining about death or taxes.
 
That’s why things will be a whole lot more interesting if AMD can significantly improve performance per CU on 4nm with RDNA 4. Then we can compare an actually better architecture and not fantasy.
That's if AMD will in fact price these at a level where they won't be just the usual +10% on perf/price (and only in "rasterization") and would instead provide a pricing scene disruption. Which I honestly doubt that they will.
 
That's if AMD will in fact price these at a level where they won't be just the usual +10% on perf/price (and only in "rasterization") and would instead provide a pricing scene disruption. Which I honestly doubt that they will.

From a pricing perspective sure I don’t expect AMD to leave money on the table unnecessarily. It’ll be interesting to see what they do with 64 CUs from an architecture standpoint against the 70 SM 5070 Ti and 96 CU 7900XTX.

The 5070 Ti may not quite catch the 4080 based on what we’re seeing from the 5090. Power budget is 20W lower and there’s no increase in perf/W.
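
A quick back-of-envelope sketch of that claim, assuming the commonly quoted board powers of 320W for the 4080 and 300W for the 5070 Ti (neither figure is stated above):

```python
# If perf/W is flat, expected performance scales roughly with the power budget.
# Board powers below are assumptions, not figures from this thread.
power_4080 = 320      # W (assumed)
power_5070_ti = 300   # W (assumed, 20W lower)

relative_perf = power_5070_ti / power_4080
print(f"Expected 5070 Ti vs 4080: {relative_perf:.0%}")  # ~94%, i.e. just short of the 4080
```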

If the 9070 XT can catch the 5070 Ti for a few dollars less, then AMD’s naming gamble would’ve worked out. Now whether it’ll sell is a whole other story.
 
From a pricing perspective sure I don’t expect AMD to leave money on the table unnecessarily. It’ll be interesting to see what they do with 64 CUs from an architecture standpoint against the 70 SM 5070 Ti and 96 CU 7900XTX.
Higher clocks, which in turn won't be that much higher than what Nvidia already did back in 2022 with Lovelace. So that's hardly an architectural advantage worth mentioning.

On that topic, the 5090 seems to hit 2.9GHz at max:

[Attached image: clock-vs-voltage.png]

The 5080, having the highest specced boost clock, will probably go higher. So it remains to be seen if RDNA4 will even have any sort of clock advantage.

The 5070 Ti may not quite catch the 4080 based on what we’re seeing from the 5090. Power budget is 20W lower and there’s no increase in perf/W.

If the 9070 XT can catch the 5070 Ti for a few dollars less, then AMD’s naming gamble would’ve worked out. Now whether it’ll sell is a whole other story.
Sure, but to be any sort of indication of a better architecture it will have to do so with a considerably smaller chip and a considerably lower retail price. Otherwise it will be just another confirmation that Nvidia's h/w is still the best on the market and everyone else is just adjusting to that.
 
In general, Steve focuses on consumer outcomes and not the “why” behind them, which is a valid approach to take. It certainly plays well with his fans.
How else are you supposed to review hardware???? Consumer outcomes are literally the only thing that should matter to a consumer at the end of the day, the rest is the concern of the manufacturer.
 
How else are you supposed to review hardware???? Consumer outcomes are literally the only thing that should matter to a consumer at the end of the day, the rest is the concern of the manufacturer.

Of course it’s fine for Steve to do that for his content but there are people in the broader community who are interested in the bigger picture. And thankfully other “journalists” cover other angles as well. You must know that at this forum in particular people care about other things besides raster perf/$.
 
Because AMD want to ride the coattails of Nvidia's rising margins, too? Because AMD can also be greedy? It's really not complicated guys.

Do I need to bring up Intel here, because you seem to be completely forgetting them?
It's unlikely that Intel makes much money at all on the GPUs they sell. It's unclear if they even break even on the B570/B580. Certainly with all the costs added up they are losing lots of money in the dGPU market. And big picture, Intel as a company is rapidly declining.
 
Yeah, Intel is anything but an indication that you can sell these chips at such prices. With how they are doing now, I don't know how long they'll be able to keep operating.
 
It seems to be 30-40% faster in most heavily GPU-limited scenarios. I don't see what all the fuss is about; that's a fairly decent gen-on-gen increase by historical standards. The 4090 was a massive outlier in terms of performance increase.
yeah the 4090 broke a lot of brains it seems
 
It seems to be 30-40% faster in most heavily GPU-limited scenarios. I don't see what all the fuss is about; that's a fairly decent gen-on-gen increase by historical standards. The 4090 was a massive outlier in terms of performance increase.
Lovelace had the luxury of a two full node jump.
 
Lovelace had the luxury of a two full node jump.
It's even less about that and more about the fact that the 3090 wasn't as big of a jump over the 2080Ti as it could have been if Nvidia hadn't switched to the 8N process for Ampere.
So the gains shown by the Lovelace top end are the result of gains missing from the Ampere top end.
 
Even with the 4090, it was "only" about 50% faster than the 3090Ti -- which, when considering the aforementioned two node jump and increase in transistors and outright speeds, shouldn't be surprising to anyone at all.

Given the 5090 is on functionally the same lithography generation as the 4090, I'm somewhat coming around to the thought of calling it a 4090Ti when speaking purely from a gamer's perspective. It's still not a 4090Ti, as integer workloads will absolutely get a significant boost since the underlying INT units have indeed changed. But I do get the vibe...
 
It's even less about that and more about the fact that the 3090 wasn't as big of a jump over the 2080Ti as it could have been if Nvidia hadn't switched to the 8N process for Ampere.
So the gains shown by the Lovelace top end are the result of gains missing from the Ampere top end.
It is the number one reason. Samsung 8nm is a full node jump from TSMC's 16nm. The 3090 was ~45% faster than the 2080Ti in pure rasterization and 60%+ faster in ray tracing.
 
It seems to be 30-40% faster in most heavily GPU-limited scenarios. I don't see what all the fuss is about; that's a fairly decent gen-on-gen increase by historical standards. The 4090 was a massive outlier in terms of performance increase.
It’s 30% faster for 30% more money. That’s why this is called a 4090 Ti; there’s not really a price/performance bump.
 
Depends on the benchmark, but most of the 4K benches seemed to show at least 30%, and a few of the ray-traced benches showed as much as 50%. Cyberpunk at 4K ultra was a good example. If you're playing at less than 4K then it's probably not a good upgrade for your gaming-only rig.

And MSRP to MSRP, it's a 25% price hike over the 4090. I'm not gonna tell anyone it's a great value, but it's not the flatness people are trying to say it is.
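
Rough math on that, using the $1,599 and $1,999 MSRPs and the ~30% average uplift being thrown around in this thread (assumptions, not benchmark data):

```python
# Perf-per-dollar at MSRP, using the thread's rough numbers.
msrp_4090, msrp_5090 = 1599, 1999
uplift = 1.30  # assumed ~30% average 4K performance gain

price_ratio = msrp_5090 / msrp_4090
print(f"Price hike: {price_ratio - 1:.0%}")               # ~25%
print(f"Perf/$ change: {uplift / price_ratio - 1:+.0%}")  # ~+4%, i.e. barely better than flat
```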
 
Depends on the benchmark, but most of the 4K benches seemed to show at least 30%, and a few of the ray-traced benches showed as much as 50%. Cyberpunk at 4K ultra was a good example. If you're playing at less than 4K then it's probably not a good upgrade for your gaming-only rig.

And MSRP to MSRP, it's a 25% price hike over the 4090. I'm not gonna tell anyone it's a great value, but it's not the flatness people are trying to say it is.
I suppose if one is playing almost 100% RT-heavy titles, yeah, this would be a decent improvement in value, but I don't think that is a reality for pretty much anyone. RT titles are more plentiful now than they were 5 years ago, but it's still only a handful of AAA releases every year.

Otherwise a 30% bump for even a 25% higher cost is essentially what you'd expect from a mid-gen refresh. I can't say I didn't expect this; we knew there wasn't going to be a node shrink, and frankly I would imagine there is not a lot of juice left to squeeze in terms of architectural improvements for raster rendering anyway.
 
I suppose if one is playing almost 100% RT-heavy titles, yeah, this would be a decent improvement in value, but I don't think that is a reality for pretty much anyone. RT titles are more plentiful now than they were 5 years ago, but it's still only a handful of AAA releases every year.

Otherwise a 30% bump for even a 25% higher cost is essentially what you'd expect from a mid-gen refresh. I can't say I didn't expect this; we knew there wasn't going to be a node shrink, and frankly I would imagine there is not a lot of juice left to squeeze in terms of architectural improvements for raster rendering anyway.
The reality is that the $2k price is going to be unavailable even to most people buying at MSRP - all the leaked AIB models are $150-$800 higher. Hopefully some will come in at the FE MSRP. And before people start yelling about AIB greed, we have no idea what they’re paying NVIDIA for the chips - although an $800 markup for a cooler is admittedly a bit silly. Same story with the 5080s.

I was thinking of picking one up but if I arrive on launch day and all they have is a $2449 TUF, I’ll be the one guy leaving empty handed. Just can’t stomach that much more for a 30% average bump.
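
Same math at a $2,449 AIB price instead of the FE MSRP (again assuming a ~30% uplift and the $1,599 4090 MSRP):

```python
# Perf-per-dollar if the only cards available are AIB models at leaked prices.
msrp_4090, price_aib = 1599, 2449
uplift = 1.30  # assumed ~30% average gain

price_ratio = price_aib / msrp_4090
print(f"Price hike: {price_ratio - 1:.0%}")               # ~53%
print(f"Perf/$ change: {uplift / price_ratio - 1:+.0%}")  # ~-15%, a clear regression
```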

As for HUB, approaching it from a consumer perspective makes sense - giving consumers no context for why the uplift is less than before doesn’t. Also, some of their pricing takes have been unrealistic and kinda silly, like Tim suggesting AMD should have released the 9700X for $249.
 