AMD RX 7900XTX and RX 7900XT Reviews

Joker Productions with a nice comparison between the 7900XTX and the 4080 using an overlay:

No clue which clock rate this overlay reads, but in F1 22 the 7900XTX can clock under 2000MHz. That explains the low performance in a few games:
Why does the 7900 show 0.5-2.5GB more “VRAM used” than the 4080, especially with RT? It also clocks way lower in some RT tests vs. the rasterized tests.

Guardians of the Galaxy and F1 2022 show the biggest VRAM deltas. It clocks quite high in both Guardians tests but is much slower than the 4080 in both; it clocks quite low in both F1 tests yet is only slow with RT.
 
Why does the 7900 show 0.5-2.5GB more “VRAM used” than the 4080, especially with RT? It also clocks way lower in some RT tests vs. the rasterized tests.
Likely VRAM allocation. Cards with more VRAM almost always show higher "usage" than cards with less, despite running identical settings and resolution; see the sketch below for where overlays typically read that number.
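
For context on allocation vs. actual use: a minimal sketch, assuming Windows and DXGI 1.4, of the per-process figure most overlays and tools report. `QueryVideoMemoryInfo` returns committed VRAM, so a 24GB card will routinely show more "used" than a 16GB card at identical settings, simply because the driver and engine keep more resident.

```cpp
// Minimal sketch (Windows, DXGI 1.4): query the OS-tracked VRAM figure.
// CurrentUsage counts committed allocations, not bytes the GPU actually touches.
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter = nullptr;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // adapter 0

    IDXGIAdapter3* adapter3 = nullptr;
    if (FAILED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) return 1;

    // LOCAL segment group = dedicated VRAM (NON_LOCAL would be shared sysmem)
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    printf("VRAM budget: %.1f GB, committed: %.1f GB\n",
           info.Budget / 1073741824.0, info.CurrentUsage / 1073741824.0);

    adapter3->Release(); adapter->Release(); factory->Release();
    return 0;
}
```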
 
The 7900XTX is an OC monster: 3.2GHz…
Too bad they don't show power consumption numbers. Plus the 4090 in their test uses a 5800X. They updated the 4080 to a 13900K but didn't bother updating the 4090.
 
Joker Productions with a nice comparison between the 7900XTX and the 4080 using an overlay:

No clue which clock rate this overlay reads, but in F1 22 the 7900XTX can clock under 2000MHz. That explains the low performance in a few games:

Wow, the clockspeeds are atrocious in that video with F1 RT. Is it thermally throttling with core temps in the 80s? It dropped down to 1.5GHz, which is OG Vega territory. :no:

I found another video with an overlay, though from a channel not as reputable as Joker's, showing an almost 500-600MHz clock deficit for the 7900XTX against the 4090 in rasterized games.


TPU's review OTOH showed a 2.6GHz median, though again AMD's clocks are a bit confusing this gen.

 
AMD and Nvidia are doing their absolute best to dissuade gamers from upgrading. I've upgraded at almost every major performance jump since the 9800 Pro days, but here I am going on 7 years without an upgrade and with little desire to do so.

Seems that's a trend not limited to NV and AMD, or even to the PC space.
 
I found another video with an overlay, though from a channel not as reputable as Joker's, showing an almost 500-600MHz clock deficit for the 7900XTX against the 4090 in rasterized games.
Didn't AMD change their power sensor reading to total board power? The overlay value never reaches 355W at all.
 
Didn't AMD change their power sensor reading to total board power? The overlay value never reaches 355W at all.
No, they didn't. On that subject, Igor tested power draw with his advanced equipment: he measures the power in all tested games, then averages the values. The 7900XTX consumed 26% more power than the 4080.

He breaks down the power for raster and ray-traced games as well: in RT the 7900XTX burns 20% more power, but in raster it burns 30% more.
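
To make the methodology concrete, a tiny sketch of that averaging arithmetic; the watt values are placeholders for illustration, not Igor's measured data:

```cpp
// Sketch of the per-game averaging Igor describes: average each card's power
// across all tested games, then express the gap as a percentage delta.
// Values below are PLACEHOLDERS, not measured data.
#include <cstdio>
#include <numeric>
#include <vector>

static double mean(const std::vector<double>& watts) {
    return std::accumulate(watts.begin(), watts.end(), 0.0) / watts.size();
}

int main() {
    std::vector<double> xtx_watts     = {356.0, 348.0, 361.0, 352.0};  // placeholder
    std::vector<double> rtx4080_watts = {282.0, 277.0, 289.0, 280.0};  // placeholder

    // e.g. a result of 26 means the XTX average is 1.26x the 4080 average
    double delta_pct = (mean(xtx_watts) / mean(rtx4080_watts) - 1.0) * 100.0;
    printf("7900XTX draws %.0f%% more power on average\n", delta_pct);
    return 0;
}
```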

 
No, they didn't. On that subject, Igor tested power draw with his advanced equipment: he measures the power in all tested games, then averages the values. The 7900XTX consumed 26% more power than the 4080.

He breaks down the power for raster and ray-traced games as well: in RT the 7900XTX burns 20% more power, but in raster it burns 30% more.

8nm Ampere more efficient in RT than the 5nm 7900 XT 😁💻
 
Nice review and an overall nice conclusion.

hihi, Barbie girl enthusiast gamer niche : )

RDNA's RT implementation requires >2x the BLAS memory compared to Nvidia's.
Seems AMD didn't fix it.
I take this as a hint that NV uses treelet compression, with spatial branches of the BVH requiring only low precision for the bounding boxes.
RDNA2 instructions would not allow this, due to the precision not being flexible, afaict.

Would be very interesting to see Intel's numbers in this regard too, to improve the speculation. It could also just be a matter of the branching factor, which we only know for AMD: it's 4. (A rough sketch of the compression idea follows below.)
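
To make that speculation concrete: a rough sketch of what treelet compression with low-precision child bounds could look like for a 4-wide node (AMD's known branching factor). Every layout detail here is an assumption for illustration, not NV's or AMD's actual node format.

```cpp
// Speculative BVH4 node layouts: full-precision child AABBs vs. child bounds
// quantized to 8-bit offsets inside one full-precision parent box. The point
// is the rough 2x size difference, echoing the >2x BLAS memory gap above.
#include <cstdint>
#include <cstdio>

struct BVH4NodeFull {              // naive layout: fp32 bounds per child
    float    child_min[4][3];      // 48 bytes
    float    child_max[4][3];      // 48 bytes
    uint32_t child[4];             // 16 bytes -> 112 total
};

struct BVH4NodeQuantized {         // compressed layout
    float    origin[3];            // 12 bytes: parent AABB min (fp32)
    float    scale[3];             // 12 bytes: (parent max - min) / 255
    uint8_t  child_min_q[4][3];    // 12 bytes: child mins as 8-bit grid coords
    uint8_t  child_max_q[4][3];    // 12 bytes: maxes, rounded outward (conservative)
    uint32_t child[4];             // 16 bytes -> 64 total
};

int main() {
    printf("full: %zu bytes, quantized: %zu bytes per node\n",
           sizeof(BVH4NodeFull), sizeof(BVH4NodeQuantized));
    return 0;
}
```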
 