AMD RDNA3 Specifications Discussion Thread

Well, everyone round here who says that 3080(ti) ray tracing performance with DLSS speed-up is good enough to play max RT in all games will be happy with 7900XTX RT with FSR 3.

Yuck.

Well the vast, vast majority of the PC gaming public have to make those compromises too, as they're not buying $1600 cards. Anyone buying a card under $1000 - still a very high price for a GPU - will have to accept such levels of RT performance, even from Nvidia.

The upcoming 4070 is where it gets far more interesting.
 
Kinda but that performance has been on the market for 2 years now, and it was launched at $700...

The biggest elephant in the room now is how a $1000 7900XTX will compete against a $1200 4080. 355W vs 320W should also be interesting.
Yes, that's why I reckon 7900XTX is a $700 card. But AMD thinks it can sell this for $999.

Also, notice I said 3080 with (Ti) in brackets :)

As far as future games and ray tracing are concerned, I think AMD's only escape route is UE5...

Not happy to hear AMD seemingly talk about proprietary RT code for RDNA 3.

For me personally it wouldn't be enough of an increase over my 3080 to warrant a purchase. The big games I'm looking forward to are Starfield, Fable, Elder Scrolls 6, Avowed, the Dead Space remake and The Callisto Protocol. I think I can safely hold off and wait to see what the new cards look like performance-wise in those games as they come. Think my 3700 and AM4 socket might get an upgrade. The 5800X3D officially dropped to $330, so I could see that sub-$300 for Black Friday, and maybe we'll see a good 3D cache 7x00 series and I'll just jump onto that.

I think these cards are a good jump relative to the previous AMD line-up. However, I think it's tough compared to the current Nvidia line-up. I do wonder, if we actually had benchmarks of the 4080 runt, whether this wouldn't have looked better.
Your CPU would not be fast enough for a 4090, it seems, and maybe not for a 4080 16GB either... I can't tell what resolution you use for games.

On the other hand, I hope that ray tracing in more games will make talk about CPU-limited graphics cards redundant. RTX 4090 should have no trouble demonstrating real value against RX 7900XTX.

I can't help thinking that RDNA 4 needs to arrive autumn 2023 or else AMD can join Intel in the irrelevant graphics card tech business.
 
Which "fat margins"? A 533mm^2 chip loses to a 378mm^2 chip is bad. The 4080 will be so much better. Good luck trying to convince enough developers not to put Raytracing in your games.

More and more games will 'have' ray tracing, of course. The question is: when the biggest install base of modern GPUs - that being the consoles - has the weakest implementation, how much effort will developers put into actually making it an integral experience, or at least at a level where the RTX advantages can really shine through?

Maybe we'll get a lot more Metro Exoduses, sure - where consoles get ~1080p but you're getting 3-4x the performance on higher-end Nvidia cards due to the huge RT advantage. But I suspect we'll also continue to see a lot of stuff like Sackboy, where the advanced levels of RT are hilariously unoptimized, or something like Re:Village, where they don't bother to up the fidelity nearly enough over the console-optimized settings. Enhancements that actually take a good degree of developer effort seem much rarer, as opposed to just dialing up the precision.

That's an MCP.
Most people won't turn the RT on for years to come.
It's the age of HRR monitors, which have steadily gained in res over the past 2 years.
The upcoming 4k@240 panels do be kinda snazzy.

This is one angle I think the "RT or nothing" crowd downplays a bit - gamers expect much higher frame rates these days from their high-end hardware too. These two aspects of image quality are in conflict. RT becoming the basis for new games has not only the consoles pushing against it, but also the demands of high-refresh-rate gaming.
 
Figured out the answer to my own question... 1.5x RT perf per CU. But CU counts didn't increase very much due to the new design, so roughly 1.575x overall for the XT and 1.8x for the XTX. RIP.
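For anyone checking the arithmetic, a minimal sketch of that scaling, assuming the 80-CU 6950 XT as the previous-gen baseline and taking the 1.5x per-CU claim at face value:

```python
# Sketch of the RT scaling math above. Assumptions (not from the thread):
# baseline is the 80-CU RX 6950 XT, and the 1.5x per-CU uplift scales linearly with CU count.
PER_CU_RT_GAIN = 1.5
BASELINE_CUS = 80                                # RX 6950 XT
NEW_CARDS = {"7900 XT": 84, "7900 XTX": 96}      # announced CU counts

for name, cus in NEW_CARDS.items():
    uplift = PER_CU_RT_GAIN * cus / BASELINE_CUS
    print(f"{name}: ~{uplift:.3f}x the 6950 XT in RT")
# -> 7900 XT: ~1.575x, 7900 XTX: ~1.800x
```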
 
More and more games will 'have' ray tracing, of course. The question is: when the biggest install base of modern GPUs - that being the consoles - has the weakest implementation, how much effort will developers put into actually making it an integral experience, or at least at a level where the RTX advantages can really shine through?

Maybe we'll get a lot more Metro Exoduses, sure - where consoles get ~1080p but you're getting 3-4x the performance on higher-end Nvidia cards due to the huge RT advantage. But I suspect we'll also continue to see a lot of stuff like Sackboy, where the advanced levels of RT are hilariously unoptimized, or something like Re:Village, where they don't bother to up the fidelity nearly enough over the console-optimized settings. Enhancements that actually take a good degree of developer effort seem much rarer, as opposed to just dialing up the precision.
There is literally no effort on the developer side. Just activate it in UE4. Why do you think there are so many UE4 games with ray tracing? AAA developers are the ones who don't care, because the publishers have the money to spend on faking everything.
 

Some simple calculated performance figures.
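For a rough idea of how figures like that are usually derived, a minimal sketch from the announced CU counts and boost clocks - the 64-SPs-per-CU and dual-issue assumptions are mine, not from the post:

```python
# Back-of-the-envelope FP32 throughput from announced specs.
# Assumptions: 64 stream processors per CU, RDNA3 dual-issue counted as 2x,
# and the announced boost clocks (~2.4 / ~2.5 GHz).
CARDS = {
    # name:      (CUs, boost clock in GHz)
    "7900 XT":  (84, 2.4),
    "7900 XTX": (96, 2.5),
}

for name, (cus, clock_ghz) in CARDS.items():
    sps = cus * 64                              # stream processors
    tflops = sps * 2 * 2 * clock_ghz / 1000     # FMA * dual-issue * GHz -> TFLOPS
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32 (theoretical peak)")
# -> 7900 XT: ~51.6 TFLOPS, 7900 XTX: ~61.4 TFLOPS
```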

This is EXCELLENT performance for the price. Wow! AMD is killing it.

While I really love RT, I realize that for the vast majority of gamers, AMD is the far better choice compared to Nvidia. Most people don't care about RT.

Most games will be based on UE5 in the future, and as we know from the UE5 thread, it does have a very competent Software RT solution which still looks great in most instances. So theoretically, if devs were to use that, the 7900XTX would still perform very close to the 4090 even in a next generation game because no Ray accelerators / RT cores are in use then.
 
There is literally no effort on the developer side
jayzus.
PLEASE DON'T SAY IT DON'T SAY IT.
we already have DLSS 2/3 and FSR 2, with FSR 3 in development
Yes, those are ways to cope with the perf hits, but most people actually use them just to get moar FPS to drive their HRR monitors.
I remember when people were begging for DLSS in Warzone and that's a game where absolutely no one ever touched that RT setting.
 
Performance figures match up with 20 Gbps RAM.

The 7950 XT (gods, this naming is a mess) is next year then: 24 Gbps, higher wattage, etc., 25% faster performance overall. Yay bandwidth limitations.
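For reference, the bandwidth math behind that, assuming the announced 384-bit (XTX) and 320-bit (XT) GDDR6 buses; 24 Gbps is the speculated refresh speed, not a confirmed spec:

```python
# Peak GDDR6 bandwidth = bus width (bits) * per-pin speed (Gbps) / 8 bits-per-byte.
# Bus widths assumed from the announced specs: 384-bit XTX, 320-bit XT.
def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * speed_gbps / 8

for speed in (20, 24):                      # 20 Gbps announced, 24 Gbps speculated
    xtx = bandwidth_gb_s(384, speed)
    xt = bandwidth_gb_s(320, speed)
    print(f"{speed} Gbps: XTX {xtx:.0f} GB/s, XT {xt:.0f} GB/s")
# -> 20 Gbps: 960 / 800 GB/s; 24 Gbps: 1152 / 960 GB/s (+20% bandwidth)
```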

Oh well, sounds like a good launch for what they could get out in time, given the price. Probably really wanted to get something out for the holidays.

More importantly: AI engines and FSR3 just straight out. Lolwut.
 