AMD: RDNA 3 Speculation, Rumours and Discussion

Status
Not open for further replies.
So one thing I'm not seeing people talk about is what features (aside from RT) will demand big boy performance in the next 2-3 years. In the same way that people are skeptical of RT's performance hit, I have similar questions about why some games are so demanding. For example, what are Borderlands 3, Control, and Star Wars doing that's so heavy?

[Attached chart: 6900xt.png]

Your chart really shows why RT will be extremely important next gen, if the speculated performance figures are right. Even in more demanding games, N31 should land around 150 FPS average at 4K on this chart, if not CPU limited. But who cares about frame rates above 120 FPS? Nearly no one.
AMD winning rasterization by 20% (say 144 FPS vs. 120 FPS) while Nvidia still wins RT by 10% (77 FPS vs. 70 FPS) would give Nvidia the overall win. People care about speed in situations where they need it, not about speed in undemanding games.
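The relative-performance math above can be sanity-checked with a quick sketch (the FPS figures are the hypothetical ones from this post, not real benchmarks):

```python
def lead_pct(a_fps: float, b_fps: float) -> float:
    """Return how much faster a_fps is than b_fps, as a percentage."""
    return (a_fps / b_fps - 1.0) * 100.0

# Hypothetical raster numbers: AMD 144 FPS vs. Nvidia 120 FPS
raster_lead = lead_pct(144, 120)  # AMD leads raster by 20%

# Hypothetical RT numbers: Nvidia 77 FPS vs. AMD 70 FPS
rt_lead = lead_pct(77, 70)        # Nvidia leads RT by 10%

print(f"AMD raster lead: {raster_lead:.0f}%, Nvidia RT lead: {rt_lead:.0f}%")
```

The point being made: a 10% lead at 70-77 FPS matters more than a 20% lead at 120+ FPS, because only the former is in the range where players actually feel the difference.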
 
Your chart really shows why RT will be extremely important next gen, if the speculated performance figures are right. Even in more demanding games, N31 should land around 150 FPS average at 4K on this chart, if not CPU limited. But who cares about frame rates above 120 FPS? Nearly no one.
AMD winning rasterization by 20% (say 144 FPS vs. 120 FPS) while Nvidia still wins RT by 10% (77 FPS vs. 70 FPS) would give Nvidia the overall win. People care about speed in situations where they need it, not about speed in undemanding games.

IMO, that just shows how almost all AAA games are designed around the performance and capabilities of consoles, with a few developers putting in some extra effort to add features or higher-quality settings on PC.

RT on PC is, I guess, a nice way for some developers to tack on something extra that pushes PC GPUs, but as this console generation goes on, we'll see that, outside of a few studios, almost all AAA developers will be mostly limited by what the consoles can do, while a few might try to take advantage of the extra performance and/or features on PC.

Basically, no different from any previous console generation.

Color me skeptical (perhaps overly so) that there will be many developers who do what 4A Games did in revamping their engine to be more of an RT engine (albeit still heavily reliant on rasterization for many things), versus a rasterization engine with RT tacked on.

IMO, it's a little concerning that DICE, who are known for pushing technology on PC, have seemingly pulled back from RT for their next game. But perhaps they are saving the RT unveil for when the game launches. Still odd, as DICE have never been shy about trumpeting the technology they are pushing in their engines.

Regards,
SB
 
Your chart really shows why RT will be extremely important next gen, if the speculated performance figures are right. Even in more demanding games, N31 should land around 150 FPS average at 4K on this chart, if not CPU limited. But who cares about frame rates above 120 FPS? Nearly no one.
AMD winning rasterization by 20% (say 144 FPS vs. 120 FPS) while Nvidia still wins RT by 10% (77 FPS vs. 70 FPS) would give Nvidia the overall win. People care about speed in situations where they need it, not about speed in undemanding games.

I would be surprised if RDNA 3 doesn't significantly improve on RDNA 2's RT performance. AMD doesn't want their PC hardware to be associated with "console settings". Also, why would they invest in expensive multi-chiplet GPUs just to run console games at 300 fps? We're at the very start of a console generation, and the new machines already seem tapped out.

IMO, it's a little concerning that DICE, who are known for pushing technology on PC, have seemingly pulled back from RT for their next game.

Are you thinking the raytraced surfel GI stuff they presented at Siggraph was just a toy project?
 
Your chart really shows why RT will be extremely important next gen, if the speculated performance figures are right. Even in more demanding games, N31 should land around 150 FPS average at 4K on this chart, if not CPU limited. But who cares about frame rates above 120 FPS? Nearly no one.
AMD winning rasterization by 20% (say 144 FPS vs. 120 FPS) while Nvidia still wins RT by 10% (77 FPS vs. 70 FPS) would give Nvidia the overall win. People care about speed in situations where they need it, not about speed in undemanding games.
You're somehow assuming the only way games get heavier is by adding RT, and that we've reached some sort of ceiling for rasterization.
 
You're somehow assuming the only way games get heavier is by adding RT, and that we've reached some sort of ceiling for rasterization.

There are also expensive features like volumetric fog that are purely compute based and have nothing to do with rasterizing triangles. Mesh shaders and VRS are the most recent additions to the rasterization pipeline but haven't made an impact yet. It's strange that there are far more console games using RT than mesh shaders, even though it's theoretically easy to swap between mesh shaders and the classic pipeline.
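Volumetric fog is a good illustration of a heavy, triangle-free workload: it's typically a compute pass that ray-marches a density field, accumulating in-scattering and transmittance per pixel. A toy Python sketch of the core loop, using constant density and Beer-Lambert absorption (illustrative parameters, not any engine's actual implementation):

```python
import math

def march_fog(depth: float, density: float, steps: int = 64) -> tuple[float, float]:
    """Ray-march a homogeneous fog volume along one view ray.

    Returns (transmittance, fog_amount): how much of the background
    survives the fog, and how much in-scattered fog accumulates.
    """
    dt = depth / steps
    transmittance = 1.0
    fog = 0.0
    for _ in range(steps):
        absorb = math.exp(-density * dt)       # Beer-Lambert over one step
        # In-scattering for this step, attenuated by the fog in front of it
        fog += transmittance * (1.0 - absorb)
        transmittance *= absorb
    return transmittance, fog

t, f = march_fog(depth=50.0, density=0.05)
```

In a real engine this loop runs per froxel or per pixel on the GPU, which is where the cost comes from; note that nothing in it touches the rasterizer.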
 
Well, rasterization in Battlefield V worked well from day one; raytracing support delayed the release, and raytracing optimizations continued for a full two quarters after release to reach acceptable performance (even at the expense of reducing particle effects, etc.). So adding raytracing may be easy, but optimizing it definitely isn't.
 
Optimizing is totally different from implementing. UE4 has native raytracing support, so every UE4 game can use it. Performance is another question. So ignoring much better raytracing performance in future products will make them obsolete.
 
Well, rasterization in Battlefield V worked well from day one; raytracing support delayed the release, and raytracing optimizations continued for a full two quarters after release to reach acceptable performance (even at the expense of reducing particle effects, etc.). So adding raytracing may be easy, but optimizing it definitely isn't.
That was the first RT game - of course there would be learning to be done to understand best practices and performance pitfalls. The entire industry learned a lot by virtue of BF V and its presentations.
 
Optimizing is totally different from implementing. UE4 has native raytracing support, so every UE4 game can use it. Performance is another question. So ignoring much better raytracing performance in future products will make them obsolete.

Are you kidding? When you implement something, you do it with optimization in mind; you don't write shitty code and then optimize it later. Only junior devs do that.
Of course profiling is a must, but if you have experience, no significant changes should be needed.
 
I meant only that deciding to market such a design means they are confident in being able to produce it
They always planned it, from day like -1.
Much the same way the Zen 3 CCD had all the funny pads for V$ long before even Intel's die photography teams spotted them.
or the packaging will make it longer.
Shit can always go wrong with advanced packaging at scale, but overall nothing should really be any different.
 
I noticed that Greymon's accuracy is slowly declining
Looks like somebody clipped his sources :)
 