AMD RDNA3 Specifications Discussion Thread

Doesn't look better - found this in one of the reviews:
fortnite.png

 
Doesn't look better - found this in one of the reviews:
fortnite.png


Not great, but not bad. Going to the 4080 costs +20% more and gets you +13% (2160p), +9% (1440p), or +3% (1080p) more performance, depending on resolution.
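
To put that value proposition in perspective, here's a quick back-of-the-envelope sketch (assuming launch MSRPs of $999 for the 7900 XTX and $1199 for the 4080, and taking those performance deltas at face value):

```python
# Rough perf-per-dollar comparison of the 4080 relative to the 7900 XTX.
# Prices are assumed launch MSRPs; deltas are the figures quoted above.
price_xtx, price_4080 = 999.0, 1199.0          # roughly +20% price
perf_delta = {"2160p": 0.13, "1440p": 0.09, "1080p": 0.03}

for res, delta in perf_delta.items():
    # 1.0 would mean identical value; below 1.0 the 4080 is worse value.
    value = (1.0 + delta) / (price_4080 / price_xtx)
    print(f"{res}: 4080 offers {value:.2f}x the performance per dollar")
```

which works out to roughly 0.94x, 0.91x and 0.86x respectively.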

Still not a great product, but then neither is the 4080.

Regards,
SB
 
Doesn't look better - found this in one of the reviews:
I'm not expecting it to look better; it's not about to suddenly gain compute performance in UE5 compared to all the other engines/games out there. That graph also seems to be using HW RT rather than SW, though. With HW RT I would have expected the gap with the 4080 to be larger.

What it does show, albeit with very limited data from 1 game so far, is that the AMD GPUs contend rather well with a UE5 title, especially given current pricing.
 
Now that we have a production game using UE5.1, I'd really like to start seeing benchmarks comparing performance for Nanite and Lumen. With so many developers moving to UE5, I think the reliance on hardware RT isn't going to be as high as where the industry was previously heading.

UE5 supports hardware RT too, and it looks better with it, so...
 
Still not a great product, but then neither is the 4080.
Well, IMO there can't be a great product in this price range; it's just too expensive for a toy. We have to wait for the smaller chips.

But... Fortnite needing such a thing to run at 60 fps at 1440p? A multiplayer game? With toon graphics? Wtf?

Whom do you blame? Devs or IHVs? It's like they're shaking hands and congratulating each other on how crazy they are, lol.
 
But... Fortnite needing such a thing to run at 60 fps at 1440p? A multiplayer game? With toon graphics? Wtf?
With Nanite and Lumen, the style of the graphics is largely irrelevant.

UE5 supports hardware RT too, and it looks better with it, so...
See above, the performance difference seems to be negligible. Obviously not enough data yet, though.
 
With Nanite and Lumen, the style of the graphics is largely irrelevant.


See above, the performance difference seems to be negligible. Obviously not enough data yet, though.

DF did 1 or 2 videos on UE5 and RT, and the SW version doesn't look as good as the HW one; they have to sacrifice quality to get decent performance, there is no magic here.

Now, it doesn't matter a lot since games are not there yet, but I don't buy the "SW is enough" stuff. And RDNA 3 is behaving pretty well in RT now, so there is no reason to skip HW RT altogether.
 
DF did 1 or 2 videos on UE5 and RT, and the SW version doesn't look as good as the HW one; they have to sacrifice quality to get decent performance, there is no magic here.

Now, it doesn't matter a lot since games are not there yet, but I don't buy the "SW is enough" stuff. And RDNA 3 is behaving pretty well in RT now, so there is no reason to skip HW RT altogether.
I'm not referring to any quality differences between SW and HW Lumen in UE5. I'm referring to the lack of any performance difference between 7900XTX and 4080 in HW Lumen.
 
Whom do you blame? Devs or IHVs? It's like them shaking hands and congratulating each other how crazy they are, lol.
I take offense at your use of the word "blame". The 1978 Mercedes-Benz S-Class was the first production car with ABS. Today ABS is present in everything down to a Yaris. The transition didn't happen in one year; it gradually trickled down through the tiers. And it would never have happened if people had been looking to "blame" Mercedes, its suppliers, or its customers for introducing new technology in the topmost tiers.
 
I don't buy the "SW is enough" stuff. And RDNA 3 is behaving pretty well in RT now, so there is no reason to skip HW RT altogether.
I don't see much of a difference between Portal RTX and the stuff I do in SW (except mine is fast on prev-gen HW). If you saw that too, you would change your mind.
And I want RT too, that's not the point.

The point is: I do not buy a >$1000 GPU just to play silly video games. Even less so if it's a multiplayer shooter, which is meant to be played by everyone, at high frame rates. This is crazy.

Idk what gfx options in Fortnite can do, but it's clear everybody will prefer high FPS over visuals in a competitive shooter. So if even the biggest monster GPUs cannot run this, why do they offer such high settings at all?
The only expected outcome is this:

"I have worked two months, just to afford my new 4080 \:D/
Plugin it in...
Yes! It still works! It did not melt!
Playing a round of Fortnite against my Friends. HAHA, i'll frag them all. They have just 1070, those poor Bastards.
Launching the game...
Oh... update to UE5! nice...
It's running! It looks a bit better... but wait... um... just 45 FPS on my new 4K panel? Whaaat??? I got already fragged??? OUTRAGE!!!!"
Killing family, neighbors, their dog, committing suicide, quitting gaming. In that order.

No, no, guys. This won't work. It's no fun, just frustration. And assuming it would work is crazy.
 
I'm not referring to any quality differences between SW and HW Lumen in UE5. I'm referring to the lack of any performance difference between 7900XTX and 4080 in HW Lumen.
Where do we know this from exactly?

Theoretically speaking, Nanite and (s/w) Lumen are both heavy compute users, which means both are highly likely to run as wave32 on RDNA, which probably means they won't be able to use VOPD on RDNA3 and will run at about 1/2 of peak FP32 compute throughput. This would put N31 at ~30 TFs against AD103's ~50 TFs.
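
For reference, a rough sketch of where those figures come from (the unit counts and ~2.5 GHz boost clocks are the commonly published specs, used here as assumptions rather than measurements):

```python
# Ballpark peak FP32 throughput behind the "~30 TFs vs ~50 TFs" comparison.
def peak_tflops(units, lanes_per_unit, flops_per_lane_per_clock, clock_ghz):
    # FMA counts as 2 FLOPs per lane per clock; VOPD dual-issue doubles that.
    return units * lanes_per_unit * flops_per_lane_per_clock * clock_ghz / 1000.0

# N31 (7900 XTX): 96 CUs x 64 lanes, ~2.5 GHz boost.
n31_dual_issue = peak_tflops(96, 64, 4, 2.5)   # ~61 TFLOPS with dual-issue
n31_no_vopd    = peak_tflops(96, 64, 2, 2.5)   # ~31 TFLOPS without it

# AD103 (RTX 4080): 76 SMs x 128 FP32 lanes, ~2.5 GHz boost.
ad103 = peak_tflops(76, 128, 2, 2.5)           # ~49 TFLOPS

print(n31_dual_issue, n31_no_vopd, ad103)      # 61.44 30.72 48.64
```

If those compute-heavy passes can't dual-issue, the effective comparison is roughly 30 vs 50, as above.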

And as for h/w Lumen, I don't see why it would run any differently than any other h/w RT approach. I.e. it can certainly be limited by something other than actual RT, in which case RDNA may be on par or even win.
 
Idk what gfx options in Fortnite can do, but it's clear everybody will prefer high FPS over visuals in a competitive shooter. So if even the biggest monster GPUs cannot run this, why do they offer such high settings at all?
Many don't mind running at 60 fps. Users can also turn off the Nanite and Lumen enhancements to fall back to regular geometry and lighting.
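
These are engine-level toggles in UE5; as a sketch, the relevant console variables look roughly like this (cvar names are from Epic's UE5 documentation; how Fortnite's own settings menu maps onto them is my assumption):

```ini
; Illustrative DefaultEngine.ini-style sketch, not Fortnite's actual config
[/Script/Engine.RendererSettings]
; Disable Nanite virtualized geometry (fall back to regular LOD meshes)
r.Nanite=0
; Global illumination method: 0 = none/baked, 1 = Lumen
r.DynamicGlobalIlluminationMethod=0
; Reflections: 2 = screen-space reflections instead of Lumen
r.ReflectionMethod=2
; Keep Lumen in software mode even when Lumen itself is enabled
r.Lumen.HardwareRayTracing=0
```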
 
The point is: I do not buy a >$1000 GPU just to play silly video games. Even less so if it's a multiplayer shooter, which is meant to be played by everyone, at high frame rates. This is crazy.

This is a somewhat arbitrary stance. People spend far more money on far sillier things all the time.

Idk what gfx options in Fortnite can do, but it's clear everybody will prefer high FPS over visuals in a competitive shooter. So if even the biggest monster GPUs cannot run this, why do they offer such high settings at all?

Maybe Epic decided that an extremely popular game showing off Nanite and Lumen is a good way to shore up developer confidence in their new engine? It’s a huge marketing win for UE5, Nanite and Lumen.

Results matter. Promises don’t.
 
This is a somewhat arbitrary stance. People spend far more money on far sillier things all the time.
Yes, but a multi-billion-dollar industry can't live off a few Ferrari fans.
Results matter. Promises don’t.
False promises sell better than terrible results.

But well, if they think it makes sense, it's their decision. IMO, some single-player third-party game with impressive gfx would make a better impression. Looking back, making BFV the first RT game wasn't well thought out either.
Game devs... never realizing it's broken. And when they finally do, they make a day one patch.
 
@JoeJ Fortnite is a stable 60 fps with SW Lumen and Nanite on current-gen consoles, but also has a 120 fps mode with them disabled. The game scales from the Switch all the way to an RTX 4090 and runs well. I agree that people will basically play with lower settings in a game like that, but I think Epic is incredibly smart to build new technologies into games that ship to millions of players. Basically, since they released Fortnite, everything that goes into UE is play-tested, and the quality of UE has only increased because of it. Plus the game looks absolutely awesome with Nanite and Lumen, so it's fun to play around with, even if I won't play it that way primarily.
 
Yes, but a multi-billion-dollar industry can't live off a few Ferrari fans.

Sure and that’s why there’s a whole range of price and performance options to choose from.

False promises sell better than terrible results.

But well, if they think it makes sense, it's their decision. IMO, some single-player third-party game with impressive gfx would make a better impression. Looking back, making BFV the first RT game wasn't well thought out either.
Game devs... never realizing it's broken. And when they finally do, they make a day one patch.

The Lumen and Nanite implementation in Fortnite isn’t a terrible result by any stretch of the imagination. It’s very much a playable and enjoyable demonstration of cutting edge rendering tech.
 
Sure and that’s why there’s a whole range of price and performance options to choose from.
Which is fine.
But I've been seeing this go on for many years across the whole industry: they show us awesome gfx demos, but we don't get games that look that good. And when we do, the HW requirements are over the moon, and it still barely runs even on that.
This is a short-lived marketing advantage, but in the long run it accumulates disappointment among the audience and customers.

The same happens here: they show us the awesome "Lumen in the Land of Nanite" demo. Then we get a Matrix demo which is less impressive than CP and runs much worse, and now Fortnite at highest settings at 45 fps on a 4080.
I'm not saying the tech is bad, but the marketing is. It feels over-promising, setting standards they cannot hold, turning cutting edge into bleeding edge.

It shouldn't be like that. It's interesting for us here to discuss, but IMO nobody should show gfx demos to the public, only final games you can actually buy right now. Less disappointment, fewer inflated expectations, better business.
This culture of 'impress' doesn't do us any favors anymore. The impression is short-lived too. By the time I play the first UE5 game with awesome detail, it won't be anything new anymore. The disappointment of low FPS will outweigh the impression, which is a missed opportunity, and a really big one.

It's not really a big problem. You guys say I could run Fortnite at 60 fps, so that's fine. If I can use it to push the latest HW, nothing wrong with that either.
But any disappointment adds up over time. And that's currently the dominating impression of the games industry, it feels.
 