AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

You're looking at perf/watt in very specific benchmarking suites, which in itself isn't a "best case" for anything, just a representation of currently available gaming performance without RT.
It can probably be considered a "best case" for RDNA2, but not for Ampere.
TPU uses Cyberpunk at 4k and CB uses Control, Doom Eternal and Gears Tactics at 3 different resolutions. How is that a best case scenario for RDNA 2?
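For reference, the perf/watt figure these suites report is simply average benchmark FPS divided by average board power draw during the run. A minimal sketch (all numbers are hypothetical, for illustration only, not measured results):

```python
def perf_per_watt(avg_fps: float, avg_board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    if avg_board_power_w <= 0:
        raise ValueError("board power must be positive")
    return avg_fps / avg_board_power_w

# Hypothetical cards in the same test scene -- not real measurements.
card_a = perf_per_watt(avg_fps=60.0, avg_board_power_w=230.0)
card_b = perf_per_watt(avg_fps=62.0, avg_board_power_w=320.0)

# Relative efficiency of card A vs card B, as a percentage.
relative_pct = card_a / card_b * 100.0
```

Which games and settings feed the FPS numerator is exactly what's being argued about here: the ratio says nothing about RT or DX12U workloads unless those are in the test.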
 
TPU uses Cyberpunk at 4k
Without RT, as there are cards which don't support it.

and CB uses Control, Doom Eternal and Gears Tactics at 3 different resolutions
Probably also without RT in Control? I can't seem to find them specifying what settings they're using, but there are also cards like the GTX 1060 in their results. Also, all three are DX12/Vulkan titles AFAIR.

How is that a best case scenario for RDNA 2?
Current-gen gaming without RT or any of the DX12 Ultimate features is the best case scenario for RDNA2.
Add RT or production workloads into the mix and the results will change dramatically.
 
Without RT, as there are cards which don't support it.


Probably also without RT in Control? I can't seem to find them specifying what settings they're using, but there are also cards like the GTX 1060 in their results. Also, all three are DX12/Vulkan titles AFAIR.


Current-gen gaming without RT or any of the DX12 Ultimate features is the best case scenario for RDNA2.
Add RT or production workloads into the mix and the results will change dramatically.
Three Nvidia-sponsored games and one UE4 game are not a best case for RDNA 2, with or without RT.
 
It seems RDNA 2 does very well at lower resolutions with RT off in most games regardless of “sponsor”. The architecture will likely age very well on UE5 games using TSR.

It’s interesting that even after RDNA 2 and the new consoles hit the market, most reviewers are still treating RT as this exotic feature that is disabled in testing. Reminds me of the early anti-aliasing days. I wonder when RT enabled will be considered the default.
 
It seems RDNA 2 does very well at lower resolutions with RT off in most games regardless of “sponsor”. The architecture will likely age very well on UE5 games using TSR.

It’s interesting that even after RDNA 2 and the new consoles hit the market, most reviewers are still treating RT as this exotic feature that is disabled in testing. Reminds me of the early anti-aliasing days. I wonder when RT enabled will be considered the default.
It took a long time for some reviews to test tessellation on even ground, so it might be a while, due to AMD's performance disparity.
 
Three Nvidia-sponsored games and one UE4 game are not a best case for RDNA 2, with or without RT.
What does a game being "Nvidia sponsored" have to do with how it runs? Or with perf/watt metrics, for that matter.

It's only "AMD sponsored" games which tend to have some seriously peculiar performance issues lately.
CP2077 without RT runs slightly better on AMD h/w, for example.
Eternal and Tactics are more or less vendor-agnostic on recent h/w.
Control is the only game which runs better on Ampere even without RT, AFAIR.
 
It seems RDNA 2 does very well at lower resolutions with RT off in most games regardless of “sponsor”. The architecture will likely age very well on UE5 games using TSR.

It’s interesting that even after RDNA 2 and the new consoles hit the market, most reviewers are still treating RT as this exotic feature that is disabled in testing. Reminds me of the early anti-aliasing days. I wonder when RT enabled will be considered the default.
I think it will become more prevalent when we start getting more titles where the visual uplift aligns with the performance hit.
 
Your seethe is both unwarranted and unnecessary; you always have other threads to take a dump in.
Please stop the childish remarks and stick to basic arguments.

So far in laptops, we don't see any massive power efficiency advantage for RDNA2; it didn't even translate into a performance advantage.

If you have some numbers, please point me to them.
Nothing besides maximum FPS pew-pew matters.
Nope, only in your la-la land perhaps. Visuals count too, especially with the current roster of next gen games.
I think it will become more prevalent when we start getting more titles where the visual uplift aligns with the performance hit.
That's not an objective point. Ultra settings in general add very little IQ compared to High settings while costing a considerable amount of performance; should we discard Ultra settings too?

Control and Cyberpunk use RT effectively, yet no one bothers to check them. Minecraft is the most popular game ever, yet no one bothers to include Minecraft RTX in their suite.

Clearly this needs to change. Most AAA games now have some form of RT; if we're going to ignore it just because we want to pat AMD on the back, then that's not very fair now, is it?
 
Please stop the childish remarks and stick to basic arguments.
It will only get worse next year.
So far in laptops, we don't see any massive power efficiency advantage for RDNA2
Do the needful and powercap NV23 for me.
Or someone else does it.
I'm all patience.
Visuals count too, especially with the current roster of next gen games.
Too bad, everyone wants dem frames, hence why nearly every next-gen game will ship a performance mode that everyone will use.
Control and Cyberpunk use RT effectively
The what.
They're just morbidly expensive.
Minecraft is the most popular game ever, yet no one bothers to include Minecraft RTX in their suite.
Because no one plays Minecraft with RTX.
At least have the decency to mention Metro Exodus or idk.
 
Do the needful and powercap NV23 for me.
Or someone else does it.
Power cap them and use RT too, and let's see who would win.
Too bad, everyone wants dem frames, hence why nearly every next-gen game will ship a performance mode that everyone will use.
RT modes need those pew pew fps too, you know? If AMD is going to offer fps only in RT-off modes and ignore the IQ modes, then they are in for a world of hurt. Their competitor already offers both.

Because no one plays Minecraft with RTX.
Clearly you don't watch Minecraft RTX streamers or content creators at all.

Metro Exodus or idk.
Yeah, that too. Now who is testing it with RT on and factoring the results into their general consensus again?
 
and use RT too
Use what?
On a laptop GPU?
I know you probably hold bags of $NVDA, but there's not really a reason to go off your rocker just yet.
Next year?
See you at the mountains of madness.
RT modes need those pew pew fps too, you know?
Too bad most of them have horrific perf everywhere and are thus basically useless.
Dial down the res and/or lose the shit out of your FPS count, in a world where HRR panels are being shoehorned into gaming everywhere.
Their competitor already offers both.
Shit I wish I was this good at fartsniffing.
Clearly you don't watch Minecraft RTX streamers or content creators at all.
Who cares, it's all about that sweet, sweet TAM.
The Minecraft player base isn't even unified enough under Bedrock to slurp the RTX juice as is.
Now who is testing it with RT on and factoring the results into their general consensus again?
Idk, no one cares about another eurojank game anymore.
 
Our green friends will mutate in desperation into an even angrier type of night goblin as early as late this year.
Navi 33 is coming Q4 2021? Or announced in Q4 2021 to release in Q1/Q2 2022?


Dudebro one-liner stream of consciousness posts are awesome. Just as good as last time.
To be honest, I'm fine with one-liners, as not everyone has English as their first language, nor has everyone had English classes at school. I don't recall seeing any written-text standards in the forum rules either.
The fan/bot accusations are the only thing derailing the thread here. Everyone is doing it ATM, but we can easily see where it started.
 