AMD RX 7900XTX and RX 7900XT Reviews

All IHVs tend to know very well what's coming from their competitors long before it's unveiled to the public. They can easily change cooler designs and power limits, at least.
Then their position doesn't make much sense to me! They used assertive wording like "the leader in perf/watt", "best in class in perf/watt", and "the most advanced GPU", despite having neither the performance nor the power advantage. It's the first time I've seen them do this; it's as if their marketing was dissociated from reality!
 
I find it hard to believe. All IHVs tend to know very well what's coming from their competitors long before it's unveiled to the public. They can easily change cooler designs and power limits, at least.
I think Nvidia has spies inside AMD... Jensen hinted at that when he trashed Vega.
 
I think Nvidia has spies inside AMD... Jensen hinted at that when he trashed Vega.
They don't need spies. They work in the same industry, which means they use the same tools, and people move from one IHV to another all the time. It is fairly easy to predict/model what the competition will be able to achieve if you know the base they are working from and, more or less, the target they are aiming at.
 
But what is wrong with the hardware? I have found two things which are totally strange:

1. The mesh shaders are not working, or working badly, in synthetic benchmarks.
2. The compute shaders show only half of their potential output compared to the 6900 XT.

See the pictures below of the 4090 and the 7900 XTX from the 3DCenter forum:
[Images: Mashshader.jpg, polygons5.jpg]
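The "half the compute" observation may simply come down to RDNA 3's dual-issue ALUs not being exploited by the benchmark's shaders. A back-of-the-envelope sketch in Python, using the commonly quoted spec-sheet shader counts and clocks (assumptions for illustration, not what any given benchmark actually sustains):

```python
# Theoretical FP32 throughput: ALUs * FLOPs-per-ALU-per-clock * clock (GHz) -> TFLOPS.
# An FMA counts as 2 FLOPs; RDNA 3 dual-issue doubles that WHEN the compiler
# can actually pair instructions. Shader counts/clocks below are spec-sheet
# assumptions, not measured values.

def fp32_tflops(alus, flops_per_clock, ghz):
    """Peak FP32 TFLOPS for a given ALU count, per-clock FLOP rate, and clock."""
    return alus * flops_per_clock * ghz / 1000.0

rx6900xt        = fp32_tflops(5120, 2, 2.25)  # single-issue FMA
rx7900xtx_dual   = fp32_tflops(6144, 4, 2.5)  # dual-issue FMA exploited
rx7900xtx_single = fp32_tflops(6144, 2, 2.5)  # dual-issue NOT exploited

print(f"6900 XT:            {rx6900xt:.2f} TFLOPS")
print(f"7900 XTX (dual):    {rx7900xtx_dual:.2f} TFLOPS")
print(f"7900 XTX (single):  {rx7900xtx_single:.2f} TFLOPS")
```

If the shaders in a synthetic test can't be dual-issued, the 7900 XTX lands at exactly half its advertised peak, which would look like "half the compute" next to expectations.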
 
This is interesting. I don’t recall seeing this behavior on N21. Can anyone else confirm N31 is pulling power from RAM to keep board power in check?
TL;DW: The power bug is so bad the card starts massively underclocking the memory to maintain die power.

Which is... :oops: how tf did they ship this? Well I guess all those attempts at trying to parse out performance per silicon have to take this into account as well.
 
My guess is this GPU was a 450+ W monster with a massive cooler and 3 GHz+ clock speeds.

Then the 4090 released, and AMD saw all the hate and bad press about the sheer size of the thing and its power draw.

Wanting to avoid the same negative feedback, AMD dropped the power requirements, neutered the cooling, and in the process destroyed the GPU's performance, just so they could brag on stage that you don't need to change your PC case or upgrade your PSU.
 
My guess is this GPU was a 450+ W monster with a massive cooler and 3 GHz+ clock speeds.

Then the 4090 released, and AMD saw all the hate and bad press about the sheer size of the thing and its power draw.

Wanting to avoid the same negative feedback, AMD dropped the power requirements, neutered the cooling, and in the process destroyed the GPU's performance, just so they could brag on stage that you don't need to change your PC case or upgrade your PSU.
Even at 450 watts it loses to a 4090, in both performance and efficiency. AMD can't talk about efficiency anymore; Nvidia will be more efficient across the whole product stack.
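Perf/watt claims are easy to sanity-check once reviews publish average framerates and board power. A trivial sketch of the arithmetic (the fps and wattage figures here are hypothetical placeholders, not review data):

```python
# Performance per watt: average frames per second divided by board power.
# The numbers below are made up for illustration only.

def perf_per_watt(avg_fps, board_power_w):
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

cards = {
    "Card A": (100, 355),  # hypothetical: 100 fps at 355 W
    "Card B": (125, 430),  # hypothetical: 125 fps at 430 W
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.3f} fps/W")
```

Note that perf/watt depends entirely on the operating point: a card pushed past its efficiency sweet spot can lose an fps/W comparison it would win at a lower power limit.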
 
This is interesting. I don’t recall seeing this behavior on N21. Can anyone else confirm N31 is pulling power from RAM to keep board power in check?
Probably the GPU gets so unstable that the memory downclocks due to stalling pipelines, uncorrectable errors, etc. Considering it's Jay, he probably made a slapdash attempt at overclocking and it eventually got so unstable that this weird behaviour appeared. We'll have to wait until the people at Igor's Lab or OCN figure out how to circumvent the new layer of (idiotic) blocks against power fiddling (why on earth AMD chose to do this now, when they're on the back foot, I can't fathom).
 
My guess is this GPU was a 450+w monster with a massive cooler and 3Ghz+ clock speeds.

Then the 4090 released and AMD saw all the hate and bad press from people about the sheer size of the thing and it's power draw.

AMD wanting to avoid the same negative feedback dropped the power requirements down, neutered the cooling and in the process destroyed the GPU's performance just so they could brag on stage how you don't need to change your PC case or upgrade your PSU.

You can’t “neuter the cooling” in just a few weeks. Companies don’t turn on a dime.
 
What exactly is broken in N31? I understand that it was supposed to clock higher, but is there also something else going on here? Some functionality that is completely broken and turned off? There's a lot of information in this thread, but I find myself largely unable to sort the legitimate theories and evidence from baseless speculation or FUD.
 
What exactly is broken in N31? I understand that it was supposed to clock higher, but is there also something else going on here? Some functionality that is completely broken and turned off? There's a lot of information in this thread, but I find myself largely unable to sort the legitimate theories and evidence from baseless speculation or FUD.

I think the clue is in the split clocks. It just seems really weird to do that, and to me it only makes sense if there's a part of the chip that has a problem and is holding back the rest.
 