AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

And why not?

And that's before factoring in the DLSS boost on RTX cards.

People should really stop acting as if enabling RT is somehow impossible unless you own a 3090.

Those are pretty selective benchmarks. From that exact same review (https://www.techpowerup.com/review/powercolor-radeon-rx-6600-fighter/33.html), to hit at least 60FPS at 1080p you will need AT LEAST the following:
Control - RX 6800 XT
Cyberpunk - RTX 3090
Deathloop - RX 6700 XT
Doom - RX 6600
F1 2021 - RTX 2060
Far Cry 6 - RX 6600
Metro Exodus - RX 6600 XT
RE8 - RTX 3060
Watch Dogs Legion - RTX 3060 Ti

And at 1440p you will at least need:
Control - RTX 3080
Cyberpunk - NA
Deathloop - RTX 3060 Ti
Doom - RX 6600 (beats RTX 3070 — probably due to a driver bug)
F1 2021 - RTX 2060
Far Cry 6 - RTX 3060
Metro Exodus - RTX 3060 Ti
RE8 - RTX 3060
Watch Dogs Legion - RX 6900 XT

These aren’t exactly cheap cards, and this is just to hit 60 FPS.
 
Techpowerup tests at ultra settings with all RT effects enabled at maximum quality. You don’t need a 6800 XT to play Control with RT at 1080p. A 2060 is just fine.

The ultra settings hype is real.
 
That almost makes me wonder if the NV driver is doing something differently on the 3060s and not rendering the same things as on the 3070.

In the other games benchmarked by Techpowerup the 3060 Ti/3070 are where they should be, with the 3070 slightly faster than the 3060 Ti.

Perhaps there is some reason why the 3060 Ti stays just above some memory threshold at 1440p in Doom at those settings and the 3070 doesn't. At 4K RT they both crash hard and the 3070 is a tiny bit faster. The RTX 2060 is already struggling at 1080p RT with its 6GB.
 
According to the first result from Google, the RTX 2060 can't hit an average of 60 fps in Control with medium RT + high other settings. Maybe someone can dig up something similar on the current version of the game + drivers:
https://www.game-debate.com/news/27...rks-and-performance-cost-geforce-rtx-2060-6gb
 
The 6600 fails really hard at 1080p in Far Cry 6 versus the 3060 :)

AMD should have at least launched the card before Far Cry 6!

This is possibly the first game that really destroys the 32MB Infinity Cache at 1080p. Sure, it could be a driver bug. Will anyone notice if the bug is fixed?
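
As a rough, hypothetical back-of-envelope for why 1080p can already spill out of a 32MB cache: a handful of full-resolution render targets adds up quickly. The buffer list and formats below are illustrative assumptions, not Far Cry 6's actual G-buffer layout.

```python
# Rough back-of-envelope: size of a few full-resolution render targets at
# 1080p versus a 32 MiB last-level cache. The target list and formats are
# illustrative assumptions, not Far Cry 6's actual G-buffer layout.

WIDTH, HEIGHT = 1920, 1080
MIB = 1024 * 1024

# (name, approximate bytes per pixel) -- hypothetical deferred-renderer targets
targets = [
    ("albedo (RGBA8)",           4),
    ("normals (RGB10A2)",        4),
    ("material params (RGBA8)",  4),
    ("motion vectors (RG16F)",   4),
    ("depth/stencil (D32S8)",    5),
    ("HDR lighting (RGBA16F)",   8),
]

total = 0
for name, bpp in targets:
    size = WIDTH * HEIGHT * bpp
    total += size
    print(f"{name:26s} {size / MIB:6.1f} MiB")

print(f"{'total':26s} {total / MIB:6.1f} MiB  (vs. a 32 MiB Infinity Cache)")
```

Even this simplified working set is roughly twice the cache size, so a game that adds more full-resolution buffers can push the 1080p hit rate down noticeably; whether that is what's happening in Far Cry 6, as opposed to a driver bug, is exactly the open question above.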
 
Without DLSS. This site got the same result. Medium + medium RT is playable on a 2060.

https://www.pcgamesn.com/control/nvidia-rtx-dlss-ray-tracing-performance-benchmarks
Computerbase's results seem to put the 2060 on the barely playable side at 42.4 fps avg. with RT @ medium. They do their test in the so-called corridor of doom, an area that's notoriously heavy on the graphics card. Apparently that's an important factor in their assessment, in contrast to pcgamesn.com (who, btw, say they use the high base setting + the respective RT setting).
Hoch = high
Mittel = medium
Aus = off

[Attached: Computerbase benchmark chart]
 
All these articles appear to have been written before the DLSS2.0 patch, and so it makes sense that they all tested primarily with DLSS disabled. But DLSS2.0 changed the playability equation. @Dictator's video shows the 2060 getting 75fps+ at 1080p/DLSSQ with his optimized settings and all RT enabled.

In writing a single-line summary on playability it's probably best to succinctly lay out all the information, e.g., "on a 2060, it's playable at 1080p/60fps with all-RT/DLSSQ or no-RT/no-DLSS, and at 1080p/30fps with all-RT/no-DLSS". Take your pick.
 
That's something rendered under 1080p and scaled up to 1080p, not native 1080p.
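
For concreteness, here is a small sketch of the internal render resolutions DLSS 2.0 is commonly reported to use; the scale factors are assumptions based on publicly cited figures, and individual games/presets can deviate from them.

```python
# Commonly cited DLSS 2.0 per-axis scale factors (approximate; individual
# games and presets can deviate from these).
DLSS_SCALE = {
    "Quality":           2 / 3,   # ~0.667x per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually shades before DLSS reconstructs the output."""
    scale = DLSS_SCALE[mode]
    return round(output_w * scale), round(output_h * scale)

for mode in DLSS_SCALE:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"1080p output, DLSS {mode:17s} -> shades ~{w}x{h}")
```

So "1080p/DLSSQ" on a 2060 means shading roughly 1280x720 and reconstructing to 1080p, which is the distinction being drawn here.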
 
Trigger Warning: these results have no raytracing or DLSS enabled AFAICT. It's just rasterization without upscaling or reconstruction.

Please do move along if games without DLSS or raytracing offend you so much that you feel the need to derail the discussion with endless DLSS and raytracing propaganda.

Shamelessly reposted from here, but here's how RDNA2 behaves in games according to their release date.

[Chart 1 of 2: RDNA2 relative performance by game release date (7FPrvP9.png)]

[Chart 2 of 2: RDNA2 relative performance by game release date (d4PN07k.png)]

And for 2021 I noticed something additional:

[Chart: RDNA2 relative performance in 2021 games (NZLyO9m.png)]

Another thing to note is the evolution from games to their sequels that came out during/after the 9th-gen console launches, using the same engine and developer.
Here's the 6800 XT compared to the 3080 in these games:

Anvil Next 2.0 - Ubisoft Montreal / Milan:
2018 Assassin's Creed Odyssey DX11 vs. 2020 Assassin's Creed Valhalla DX12
1080p: -8.9% vs. +17.2%
1440p: +1.4% vs. +13.6%
2160p: -1.6% vs. +0.9%

EGO - Codemasters:
F1 2020 DX12 vs. F1 2021 DX12
1080p: +7.1% vs. +25.7%
1440p: +7.6% vs. +17.9%
2160p: +0.1% vs. +10.8%

Dunia - Ubisoft Montreal / Toronto:
Far Cry 5 DX11 vs. Far Cry 6 DX12
1080p: -10.4% vs. -0.7%
1440p: -9.7% vs. +2%
2160p: +6.8% vs. +5.6%

RE Engine - Capcom:
Resident Evil 3 (2020) vs. Resident Evil Village
1080p: +17.5% vs. +24%
1440p: +9.8% vs. +20.1%
2160p: +5.8% vs. +8%



In all these game/engine evolutions the RDNA2 GPU got a performance boost relative to its closest Nvidia competitor, on the order of ~10% at 1440p. The only regression I see is in Far Cry 6 at 4K, but in the other games we already see a reduced advantage at that resolution, probably because 128MB of GPU LLC isn't ideal for native 4K rendering anyway.
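
For anyone reading along, a quick sketch of how the percentage deltas above are presumably derived: each one is taken to be the 6800 XT's average fps relative to the 3080's at that resolution. The fps values below are made-up placeholders, not TechPowerUp's data.

```python
# Relative delta of card A vs. card B from average fps; positive means A leads.
def relative_delta(fps_a: float, fps_b: float) -> float:
    return (fps_a / fps_b - 1.0) * 100.0

# Placeholder numbers only, chosen so the output matches a +17.2% entry.
fps_6800xt, fps_3080 = 117.2, 100.0
print(f"{relative_delta(fps_6800xt, fps_3080):+.1f}%")   # -> +17.2%
```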
 