AMD RX 7900XTX and RX 7900XT Reviews

> So what, exactly, is broken in the released RDNA3 hardware, apart from the inability to hit the forecasted clocks?
 
Nothing. The inability to hit the forecasted clocks is claimed to be due to a bug, but that remains to be confirmed if/when they do an A1 stepping for a 7950 refresh.

We can see the power bug pretty easily; it's definitely there.

AMD should use CES to pre-announce a fixed GPU, whichever one is closest to ready, even if only to show off performance and announce a much later release date. Get people thinking about other things.
 
After being on such a roll the last few years with Ryzen and RDNA2, it's sad to see them put themselves in a position where they lose all that ground and the good work they've done on their image.
 
> So what, exactly, is broken in the released RDNA3 hardware, apart from the inability to hit the forecasted clocks?

As far as I remember, there are a couple of documented bugs in the Linux driver, but nothing that should make the hardware count as "broken".
What I understand at the moment is that the clocks can't hit higher values due to power usage in graphics workloads, but the fact that they can reach more than 3GHz in compute tasks tells me (and I'm not a VLSI hardware expert) that the ALU/shader core itself is not the culprit; some other part of the graphics pipeline is likely responsible for the power consumption. Other than that, I can also see that the architectural change to dual-issue ALUs is probably impacting driver optimization, so there is a lot of work to do on that side.
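To illustrate why dual-issue puts pressure on the compiler/driver side, here's a toy model (the pairing rule is hypothetical and far simpler than AMD's actual VOPD constraints) showing how instruction-level dependencies limit how often two ALU ops can co-issue:

```python
# Toy model of dual-issue pairing. Hypothetical rule: two adjacent ops
# can co-issue only if the second doesn't read the first's destination.

def can_pair(a, b):
    # a, b are (dest, src1, src2) register tuples
    return a[0] not in b[1:]  # no read-after-write dependency

def dual_issue_rate(instrs):
    """Greedily pair adjacent independent ops; return the fraction of
    issue slots that end up dual-issued."""
    i, slots, pairs = 0, 0, 0
    while i < len(instrs):
        if i + 1 < len(instrs) and can_pair(instrs[i], instrs[i + 1]):
            pairs += 1
            i += 2
        else:
            i += 1
        slots += 1
    return pairs / slots if slots else 0.0

# A dependent chain can't pair at all; independent work pairs fully.
chain = [("v0", "v1", "v2"), ("v3", "v0", "v2"), ("v4", "v3", "v2")]
independent = [("v0", "v1", "v2"), ("v3", "v4", "v5"),
               ("v6", "v7", "v8"), ("v9", "v10", "v11")]
print(dual_issue_rate(chain))        # 0.0 -> no gain from dual issue
print(dual_issue_rate(independent))  # 1.0 -> ideal 2x ALU throughput
```

Real shaders sit somewhere in between, and it's the compiler's job to reorder and allocate registers so more pairs qualify, which is why the doubled ALUs don't show up as doubled performance out of the box.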
 
This really seems to have upset Hardware Unboxed. For a while now they seemed to be saying RT is not important, FSR is as good as DLSS, etc., and as of this review they have gone nuclear, using the very things they previously deemed unimportant to sink the boot in. Like in a reply to that tweet, he said Nvidia didn't really lie because of frame generation, and when asked about AIB cards he brushed it off with "I've got other things to work on before I bother with them". They are one of the places I expected to be a bit more upbeat about the 7900XTX.
 
Upbeat about what? This thing is a huge letdown. AMD has completely failed to capitalize on the 4080's absurd pricing. And the timing of this failure is the worst I can recall. At least R600 didn't come at a time when NVIDIA was raising prices by 70% ($699 3080 to $1,199 4080).
 
Which should maybe point us all to the actual reason behind current-day pricing, instead of just bashing Nvidia or AMD or anyone else who doesn't actually produce the goods in question. Eh, here's hoping.
 
> This really seems to have upset Hardware Unboxed. For a while now they seemed to be saying RT is not important, FSR is as good as DLSS, etc., and as of this review they have gone nuclear, using the very things they previously deemed unimportant to sink the boot in. Like in a reply to that tweet, he said Nvidia didn't really lie because of frame generation, and when asked about AIB cards he brushed it off with "I've got other things to work on before I bother with them". They are one of the places I expected to be a bit more upbeat about the 7900XTX.

Yeah, it's a weird turnaround. I actually think the card is pretty decent for its pricing. We will really need to see what the $900 Nvidia part is like to judge this properly. Anyone thinking this was going to be on par across the board with a 4080 16GB when it was priced $200 cheaper was out of their mind.
 
> RDNA2 was much closer to the RTX3000 in raster than anyone expected them to get.

> The 6900 was a curveball no one expected.

RDNA2 being close to RTX3000 was mostly due to the RTX3000 series staying at the same clocks Nvidia had already achieved with the GTX1000 series more than four years prior.

RDNA2 clocks were quite likely to be a few hundred MHz higher than the PS5's, and the relatively small architectural change meant that you could easily gauge where it would land. OTOH, RTX3000 pulling over 300W to keep itself afloat around 1.8-1.9GHz was so far out of left field that AMD was able to push the 6900XT at a $1k price point.
 
> Which should maybe point us all to the actual reason behind current-day pricing, instead of just bashing Nvidia or AMD or anyone else who doesn't actually produce the goods in question. Eh, here's hoping.
What exactly are we looking at in terms of BOM for the 4080 vs the 3080? TSMC 5nm is far more expensive than Samsung 8nm, but GA102 is far larger than AD103; the 3080 has a wider memory bus but less memory, etc., so I'm not sure how this shakes out. The 4080 probably does cost more to make, but 70% seems like o_O
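For a rough sense of scale, here's a back-of-the-envelope sketch. The die sizes are public; the wafer prices are figures floating around in press reports, not confirmed numbers, and the model ignores yield, binning, memory, board components, and packaging:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation: gross wafer area divided by
    die area, minus a correction for partial dies lost at the edge."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# (die area in mm^2, assumed wafer price in USD -- placeholders!)
chips = {
    "GA102 / 3080 (Samsung 8nm)": (628, 5_000),
    "AD103 / 4080 (TSMC 4N)":     (379, 16_000),
}

for name, (area, wafer_cost) in chips.items():
    dpw = dies_per_wafer(area)
    print(f"{name}: ~{dpw:.0f} dies/wafer, ~${wafer_cost / dpw:.0f} per die")
```

Even with these placeholder wafer prices, the bare-silicon difference comes out to tens of dollars per die, nowhere near the $500 jump in sticker price, though memory, power delivery, and margin structure all muddy the picture.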
 
What exactly are we looking at in terms of BOM for the 4080 vs the 3080? TSMC 5nm is far more expensive than Samsung 8nm, and GA102 is far larger than AD103; 3080 has bigger memory bus but less memory etc., so I'm not sure how this shakes out. 4080 probably does cost more to make, but 70% seems like o_O
Rough rule of thumb is that cost-per-transistor has been flat or rising slightly (note: NOT cost-per-mm^2, which has gone up massively), so you can get a rough ballpark estimate of the relative cost of each die from its transistor count. It's all very fuzzy, though, because of vendor negotiations, discounts, long-term relationships, etc.
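Taking that heuristic at face value, the relative-cost estimate reduces to a transistor-count ratio. The counts below are the publicly stated figures; the flat cost-per-transistor assumption is the fuzzy part:

```python
# Under a flat cost-per-transistor assumption, relative die cost is just
# the ratio of transistor counts (publicly stated figures, in billions).
transistors_bn = {
    "GA102 (3080/3090)": 28.3,
    "AD103 (4080)": 45.9,
    "AD102 (4090)": 76.3,
}

base = transistors_bn["GA102 (3080/3090)"]
for chip, count in transistors_bn.items():
    print(f"{chip}: ~{count / base:.1f}x GA102's die cost")
```

That puts AD103 at roughly 1.6x GA102 silicon, in the same ballpark as the wafer-price sketch above, and again well short of explaining a 70% board-price increase on its own.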

Also, NV wouldn't necessarily increase the sticker price of each SKU in exact proportion to the cost of its die. They clearly rebalanced the entire stack, shifting margin away from the 90 toward the 80. So at least some part of that 70% increase is paying for the 4090. This sort of shuffling is all par for the course.

Remember that this is a company that took a major haircut on its lingering GA102 inventory due to a demand drop. So this stupid narrative that NV is "expecting gamers to pay pandemic/crypto prices" doesn't make any sense. They obviously know their immediate demand profile. Prices have likely been set out of necessity and with careful planning, not by some cartoon sales chief going "muhahaha". Obviously some bets have been made, and the market has the final say on whether those bets pan out.
 
This makes me wonder why CPU prices are not skyrocketing. Possible reasons: Intel has no advanced manufacturing :mrgreen: and AMD is using the premium process sparingly via chiplets. Also, I guess CPU performance is not really increasing at a comparable rate.
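The chiplet point has a concrete cost mechanism behind it: defect-limited yield falls roughly exponentially with die area, so small chiplets on the expensive node waste far less silicon. A minimal sketch using the simple Poisson yield model (the defect density is an illustrative figure, not a disclosed foundry number):

```python
import math

D0 = 0.1  # defects per cm^2 -- illustrative, not a disclosed figure

def poisson_yield(area_cm2, defect_density=D0):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-area_cm2 * defect_density)

# One 600 mm^2 monolithic die vs. 100 mm^2 chiplets of the same total
# area (ignoring inter-chiplet links and packaging costs).
print(f"600 mm^2 monolithic yield: {poisson_yield(6.0):.0%}")  # ~55%
print(f"100 mm^2 chiplet yield:    {poisson_yield(1.0):.0%}")  # ~90%
```

On top of the yield win, only the compute dies need the premium node; the cache and I/O can stay on a cheaper, mature process, which is exactly the "using the premium process sparingly" part.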

If chiplets are the answer, too bad they fucked it up with N31. Whether it was a bug or just lackluster design, N31 is offering a similar value proposition to the abysmal 4080. This won't be earning AMD any market share.

Also, can anyone source Jen-Hsun on that conference call where he talked about trying to get GPU prices back to pandemic/mining levels? I know he said it on an investor call, but I can't find it.
 