Value of Hardware Unboxed benchmarking

That in itself is disingenuous as well though.
We have no indication that "an architectural overhaul" would lead to anything being better. There are no better architectures on the market, on the contrary everyone is chasing Nvidia's h/w.
And while moving to a more advanced node would allow for some power savings, it is highly likely that the same complexity increase would in fact cost even more to the end buyer, making perf/price on the 5090 even worse.

Yeah it’s more wishful thinking comparing reality against some better imaginary outcome. That’s why things will be a whole lot more interesting if AMD can significantly improve performance per CU on 4nm with RDNA 4. Then we can compare an actually better architecture and not fantasy.
 
Well that's not actually some universal truth at all, but regardless, it changes nothing about consumers being entirely right to complain.

Something being better for a company's financials doesn't mean that consumers should just swallow it and be ok with it. Do you realize how utterly absurd that line of thinking is?

You're basically saying, "So long as it helps make a company more money, you have no right to complain as a consumer". I can't stress enough how utterly bizarre this forum is in pushing this line of thinking constantly when it comes to defending Nvidia. Absolutely bonkers.
I never said anything about consumers not having a right to complain.

1.) Anyone can complain about anything at any time. Death, taxes, Donald Trump getting elected etc.
2.) The addressable failure is a lack of competition in the graphics card market.
3.) Claiming the failure is caused by companies seeking profit is essentially saying the problem is that we live in a capitalistic economic system, which seems to me to be like complaining about death or taxes.
 
That’s why things will be a whole lot more interesting if AMD can significantly improve performance per CU on 4nm with RDNA 4. Then we can compare an actually better architecture and not fantasy.
That's only if AMD actually prices these at a level where they won't be just the usual +10% on perf/price (and only in "rasterization") and will instead deliver a genuine pricing disruption. Which I honestly doubt they will.
 
That's only if AMD actually prices these at a level where they won't be just the usual +10% on perf/price (and only in "rasterization") and will instead deliver a genuine pricing disruption. Which I honestly doubt they will.

From a pricing perspective sure I don’t expect AMD to leave money on the table unnecessarily. It’ll be interesting to see what they do with 64 CUs from an architecture standpoint against the 70 SM 5070 Ti and 96 CU 7900XTX.

The 5070 Ti may not quite catch the 4080 based on what we're seeing from the 5090: its power budget is 20W lower and there's no increase in perf/W.

If the 9070 XT can catch the 5070 Ti for a few dollars less then AMD’s naming gamble would’ve worked out. Now whether it’ll sell is a whole other story.
 
From a pricing perspective sure I don’t expect AMD to leave money on the table unnecessarily. It’ll be interesting to see what they do with 64 CUs from an architecture standpoint against the 70 SM 5070 Ti and 96 CU 7900XTX.
Higher clocks - which in turn won't be that much higher than what Nvidia already achieved back in 2022 on Lovelace. So that's hardly an architectural advantage worth mentioning.

On that topic, the 5090 seems to hit 2.9GHz at max:
(attached chart: clock-vs-voltage.png)

The 5080, having the highest specced boost clock, will probably go higher. So it remains to be seen whether RDNA4 will even have any sort of clock advantage.

The 5070 Ti may not quite catch the 4080 based on what we’re seeing from the 5090. Power budget is 20W lower and there’s no increase in perf/W.

If the 9070 XT can catch the 5070 Ti for a few dollars less then AMD’s naming gamble would’ve worked out. Now whether it’ll sell is a whole other story.
Sure, but to be any sort of indication of a better architecture it will have to do so with a considerably smaller chip at a considerably lower retail price. Otherwise it will be just another confirmation that Nvidia's h/w is still the best on the market and everyone else is just adjusting to that.
 
In general Steve focuses on consumer outcomes and not the “why” behind them which is a valid approach to take. It certainly plays well with his fans.
How else are you supposed to review hardware???? Consumer outcomes are literally the only thing that should matter to a consumer at the end of the day, the rest is the concern of the manufacturer.
 
How else are you supposed to review hardware???? Consumer outcomes are literally the only thing that should matter to a consumer at the end of the day, the rest is the concern of the manufacturer.

Of course it's fine for Steve to do that for his content, but there are people in the broader community who are interested in the bigger picture. And thankfully other "journalists" cover other angles as well. You must know that on this forum in particular people care about other things besides raster perf/$.
 
Because AMD want to ride the coattails of Nvidia's rising margins, too? Because AMD can also be greedy? It's really not complicated guys.

Do I need to bring up Intel here, because you seem to be completely forgetting them?
It's unlikely that Intel makes much money at all on the GPUs they sell. It's unclear if they even break even on the B570/B580. Certainly with all the costs added up they are losing lots of money in the dGPU market. And big picture, Intel as a company is rapidly declining.
 
Yeah, Intel is anything but an indication that you can sell these chips at such prices. With how they are doing now, I don't know how long they'll be able to keep operating.
 
It seems to be 30-40% faster in most heavily GPU-limited scenarios. I don't see what all the fuss is about; that's a fairly decent gen-on-gen increase by historical standards. The 4090 was a massive outlier in terms of performance increase.
 
It seems to be 30-40% faster in most heavily GPU-limited scenarios. I don't see what all the fuss is about; that's a fairly decent gen-on-gen increase by historical standards. The 4090 was a massive outlier in terms of performance increase.
Yeah, the 4090 broke a lot of brains, it seems.
 