AMD: RDNA 3 Speculation, Rumours and Discussion

If AMD can get to Ampere-level RT and hold a >=1.25x raster perf/W advantage with RDNA3 vs Ada, which looks possible from the limited info we have, it could be a very interesting generation, especially for laptops. Dragon Range + RDNA3 could be killer.
Right now, if AMD can get similar performance to a 4090 but at 4080 pricing, they could have a really big hit on their hands, especially with 16 GB and 24 GB on their cards versus 12 GB/16 GB on the GeForce 4080s.
 
I don't see a reality where AMD can compete with 4090 performance, particularly when RT is enabled, but hopefully they can offer something of better value than the 4080 duo. If DLSS frame generation really looks and feels like a natively rendered higher framerate, AMD is dead in the water; if not, AMD has a chance to claw back some good market share.
 
We will have to see. AMD was pretty close last gen in traditional rasterization, so it could be the case again, and they could also have improved ray tracing; after all, RDNA2 was their first attempt.

You are right that frame generation could be an issue, but it also depends on how much support it gets. RAM pools could also end up being important this gen of cards. We just have to see what Nov 3rd brings, I guess.
 
That is what I meant. Pure rasterization performance is equal, but I don't pay $1000 to disable raytracing in three-year-old games. And the 6950XT was released 18 months after the 3090.
I can agree with you on enabling RT on expensive cards.
I want the best eye candy on such a card, but I certainly wouldn't like to have to activate DLSS or FSR just to get playable FPS.
The RDNA2 architecture is not bad, but its RT performance is abysmal. Let's hope RDNA3 will bring a massive improvement in RT.

BTW, please enlighten me: why do you compare the 6950XT vs the 3090 when you could compare the 6950XT vs the 3090Ti, or the 6900XT vs the 3090? Just so you can write that it was released 18 months later and still lost? :rolleyes:
 
Unless you're playing at 1080p, you are disabling RT in some games if you want at least 60 FPS, even with a 3090.
This is not really true. If you check TPU's 6950XT review, you will see six games managed >60 FPS with RT even at 4K on a 3090 Ti. The only exceptions were three games (Cyberpunk 2077 misses at two resolutions):
2160p Control: 39.6 FPS
1440p Cyberpunk 2077: 47.3 FPS
2160p Cyberpunk 2077: 23.6 FPS
2160p Watch Dogs Legion: 48.3 FPS

BTW, for RT a large shader count is very important, right? Then RDNA3 should improve a lot based on the leaks (quick arithmetic check below):
Shaders: 12288 vs 5120 = +140%
Clocks: 2.31 GHz -> 3.5-3.6 GHz = +52-56%
TFLOPS: 86-88.47 vs 23.65 = +264-274%
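
For anyone who wants to verify those percentages, here's a minimal sketch. The 12288-shader count and 3.5-3.6 GHz clocks are leaked/rumoured figures, not confirmed specs; only the 6950XT numbers (5120 shaders, 2.31 GHz game clock) are shipping specs, and the 2-FLOPs-per-shader-per-clock factor is the usual FP32 FMA convention.

```python
# Sanity check of the leaked-spec arithmetic above. The 12288 shaders
# and 3.5-3.6 GHz clocks are RUMOURED figures; the 6950XT numbers
# (5120 shaders, 2.31 GHz game clock) are shipping specs.

def tflops(shaders: int, clock_ghz: float) -> float:
    """FP32 TFLOPS assuming 2 FLOPs (one FMA) per shader per clock."""
    return shaders * 2 * clock_ghz / 1000

rdna2 = tflops(5120, 2.31)       # ~23.65 TFLOPS
rdna3_lo = tflops(12288, 3.5)    # ~86.0 TFLOPS
rdna3_hi = tflops(12288, 3.6)    # ~88.5 TFLOPS

print(f"Shaders: +{(12288 / 5120 - 1) * 100:.0f}%")                        # +140%
print(f"Clocks:  +{(3.5 / 2.31 - 1) * 100:.0f}% to "
      f"+{(3.6 / 2.31 - 1) * 100:.0f}%")                                   # +52-56%
print(f"TFLOPS:  {rdna2:.2f} -> {rdna3_lo:.1f}-{rdna3_hi:.1f} "
      f"(+{(rdna3_lo / rdna2 - 1) * 100:.0f}% to "
      f"+{(rdna3_hi / rdna2 - 1) * 100:.0f}%)")                            # +264-274%
```

Of course, TFLOPS scaling rarely translates 1:1 into RT performance, since BVH traversal and ray/box intersection aren't pure FP32 throughput.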
 
You really don't. I've played most games with RT at 4K (+DLSS sometimes) on a 3080 and they all ran fine at 60+.
Without DLSS!
I just don't find it logical to enable RT for the best eye candy but then, because of low FPS, also need to enable DLSS in some games, which improves FPS but takes a hit on visuals.
DLSS and FSR are great for weaker cards; there I can accept it.
 
Regarding the 6950XT vs 3090 vs 3090Ti question: AMD compared both: https://community.amd.com/t5/gaming/advancing-performance-per-watt-to-benefit-gamers/ba-p/545586
 
If you read my comment, I said in some games...
 
DLSS doesn't hurt image quality nearly enough for it to be an issue. In many cases it actually improves on the game's own AA.
99.9% of the time you're getting better graphics with RT+DLSS than without RT and DLSS.
FSR1 is useless trash IMO.
FSR2 looks good, but so far I haven't seen a game with RT where FSR2 would be the preferable option on a GeForce.
 
You can't make those decisions for others. The universe doesn't revolve around you, and you don't decide for anyone but yourself.
 
AMD has a good opportunity to gain some ground given the price points and SKUs of the announced RTX 40 series.

They've got to keep that price point below $700, IMO, for this to work for them.
 
They need to have seriously improved ray tracing performance, as otherwise it'll be easy for Nvidia to justify their pricing.

Nvidia will just say, "Yeah, they're just as fast in raster performance, but we offer higher RT performance, so we're worth the extra."
 
You stated "DLSS doesn't hit the image quality nearly enough for it to be an issue" as if it were some kind of universal fact. It's a matter of opinion. For some it is an issue, for others it isn't.
Everything you and I say in our posts and messages is a "matter of opinion" unless it's a link/reference to something which can be seen as fact.
 
Navi 32 is only supposed to be about 200mm² of TSMC 5nm plus 150mm² of 6nm. There's definitely a lot of room there for AMD to press a pricing/value advantage in the upper midrange compared to Nvidia. I'd expect performance similar to the 12GB "4070" 4080.

Though this isn't expected till early next year.

Navi 31, at 308mm² of 5nm and 225mm² of 6nm, is still not going to be cheap and definitely won't get below $700, especially since it has a very strong chance of outperforming the AD103 4080 16GB. I could see $1300 for the full version and $1000 for a cut-down version.
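
To put the die-area argument in rough dollar terms, here's a hedged back-of-the-envelope sketch. Only the die areas come from the post above; the wafer prices, defect density, and the six/four-MCD split are illustrative assumptions pulled from the usual chiplet rumours, not known figures.

```python
import math

# Rough relative-cost sketch for the chiplet argument above. Die areas
# (mm^2) come from the post; wafer prices, defect density, and the
# MCD split are illustrative ASSUMPTIONS, not known figures.
WAFER_PRICE = {"N5": 16000, "N6": 9000}  # USD per 300 mm wafer (assumed)
DEFECT_DENSITY = 0.1                     # defects per cm^2 (assumed)

def dies_per_wafer(area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation with an edge-loss term."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / area_mm2
               - math.pi * d / math.sqrt(2 * area_mm2))

def cost_per_good_die(area_mm2: float, node: str) -> float:
    """Cost per yielded die using a simple Poisson yield model."""
    yield_rate = math.exp(-DEFECT_DENSITY * area_mm2 / 100)  # area in cm^2
    return WAFER_PRICE[node] / (dies_per_wafer(area_mm2) * yield_rate)

# Navi 31: 308 mm^2 N5 GCD + 225 mm^2 of N6, assumed as six ~37.5 mm^2 MCDs
navi31 = cost_per_good_die(308, "N5") + 6 * cost_per_good_die(225 / 6, "N6")
# Navi 32: 200 mm^2 N5 GCD + 150 mm^2 of N6, assumed as four ~37.5 mm^2 MCDs
navi32 = cost_per_good_die(200, "N5") + 4 * cost_per_good_die(150 / 4, "N6")

print(f"Navi 31 silicon: ~${navi31:.0f} per GPU")
print(f"Navi 32 silicon: ~${navi32:.0f} per GPU")
```

The absolute numbers are only as good as the assumptions, but the ratio illustrates the point: the small dies and cheap N6 MCDs give Navi 32 room for aggressive pricing, while Navi 31's larger N5 die keeps it out of sub-$700 territory.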
 