AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Some of these numbers are odd. The 6900 XT is performing exactly like the 6800 XT. A bandwidth limitation? We haven't seen that in practically any other game, even with very high VRAM/texture usage. Possibly more of a geometry issue.

Well, they even say they got (software?) RT working on a GTX 1660. Lol.
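
On the 6900 XT / 6800 XT parity above, a quick back-of-the-envelope sketch using the public reference specs (boost clocks, so actual boards will differ) shows how little separates the two cards outside of shader throughput. Both are Navi 21 with the same 256-bit GDDR6, the same 128 MB Infinity Cache and the same four shader engines, so a workload limited by bandwidth or by the geometry front end would be expected to land both cards in roughly the same place:

# Reference specs for both Navi 21 cards (boost clocks; actual boards vary).
RX_6800_XT = {"cus": 72, "boost_ghz": 2.25, "bandwidth_gbs": 512}  # 256-bit GDDR6 + 128 MB Infinity Cache
RX_6900_XT = {"cus": 80, "boost_ghz": 2.25, "bandwidth_gbs": 512}  # same memory subsystem, same 4 shader engines

def fp32_tflops(card):
    """Peak FP32 = CUs * 64 shaders/CU * 2 FLOP (FMA) * clock."""
    return card["cus"] * 64 * 2 * card["boost_ghz"] / 1000

print(f"6900 XT compute: {fp32_tflops(RX_6900_XT):.1f} TFLOPS")  # ~23.0
print(f"6800 XT compute: {fp32_tflops(RX_6800_XT):.1f} TFLOPS")  # ~20.7
print(f"compute ratio:   {fp32_tflops(RX_6900_XT) / fp32_tflops(RX_6800_XT):.2f}x")  # ~1.11x, the only on-paper gap
print(f"bandwidth ratio: {RX_6900_XT['bandwidth_gbs'] / RX_6800_XT['bandwidth_gbs']:.2f}x")  # 1.00x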
 
I wouldn't put much value in current Cyberpunk 2077 benchmarks at the moment, much less treat them as representative of what to expect from general PC performance between the two architectures.
It's a (deeply) nvidia-funded game, meaning nvidia had access to the game's code much earlier than AMD did for driver development, and in traditional manner nvidia got their own trojan-horse engineers on site to skew performance on competing architectures, er, "enhance performance on their own architecture".
Most multiplatformers will be built for RDNA2 first due to consoles, so Cyberpunk isn't setting any trends.

Disclaimer: I already own the game and will love it nonetheless. I have nothing against it or CDPR. A Polish independent studio getting money and engineering resources from wherever they can to make a game that breaks the scale on production values is just doing what needs to be done. I would probably do the same under similar circumstances, as getting a paycheck is more important than tech tyranny.




Regarding the various degrees of RT adoption, a year ago I'd have said RT is the future and everyone needs more RT performance yesterday if we want to survive.
Nowadays, I look at the most impressive real-time demo I saw all year and it's Unreal Engine 5, without a shred of RT. It looks like voxel GI makes RT GI worthless for anything but RT benchmarks.
I look at the released game that IMO has the best graphics out there and it's Demon's Souls. It has no RT.

Cyberpunk looks great, but it seems to look great without RT, or with only a few RT effects enabled. It looks like nvidia ordered, sorry, "suggested" that CD Projekt RED use as much RT as possible to make a showcase out of the game, and the result is a wholly unoptimised mess that doesn't perform very well on any graphics card. Some of the effects could be traded for non-RT implementations with an immense advantage in performance and only a residual difference in visual quality (which is pretty much what always happens when we set PC settings to Ultra anyway).


Consoles are the current limiting factor. If it's more performant than RT on consoles, then it will be leveraged more in games than RT on consoles; if it isn't, it won't be. And if it's used more on consoles, it'll become the de facto way to do a given thing regardless of whether RT can do it better on PC.

Lower-end PCs have been the limiting factor for high-end PCs at least ever since the 8th-gen consoles released in 2013, because that coincided with Intel getting increasingly relaxed about their CPUs and IGPs due to lack of competition.
7 years later and the XBone/PS4 are still running circles around the dreaded Gen9 GT2 IGP that is in most PCs nowadays. Or even the newer Iris Plus in Ice Lake.
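
To put rough numbers on that comparison (peak FP32 only, which ignores bandwidth, drivers and everything else that actually matters; the clocks below are typical sustained figures and vary by SKU):

def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000   # GCN: 64 shaders/CU, FMA counts as 2 FLOP

def intel_gen_tflops(eus, clock_ghz):
    return eus * 8 * 2 * clock_ghz / 1000    # Gen9/Gen11: 8 FP32 lanes/EU, FMA counts as 2 FLOP

parts = {
    "PS4 (18 CU @ 0.80 GHz)":        gcn_tflops(18, 0.80),        # ~1.84 TFLOPS
    "Xbox One (12 CU @ 0.853 GHz)":  gcn_tflops(12, 0.853),       # ~1.31 TFLOPS
    "Gen9 GT2 / UHD 630 (24 EU)":    intel_gen_tflops(24, 1.15),  # ~0.44 TFLOPS
    "Ice Lake Iris Plus G7 (64 EU)": intel_gen_tflops(64, 1.10),  # ~1.13 TFLOPS
}
for name, tflops in parts.items():
    print(f"{name:32s} {tflops:.2f} TFLOPS")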
 
It's a (deeply) nvidia-funded game, meaning nvidia had access to the game's code much earlier than AMD did for driver development, and in traditional manner nvidia got their own trojan-horse engineers on site to skew performance on competing architectures, er, "enhance performance on their own architecture".
Too bad those Nvidia engineers couldn't be bothered to do any optimizing for Pascal owners.
 
Some of these numbers are odd. The 6900 XT is performing exactly like the 6800 XT. A bandwidth limitation? We haven't seen that in practically any other game, even with very high VRAM/texture usage. Possibly more of a geometry issue.

Well, they even say they got (software?) RT working on a GTX 1660. Lol.
The same goes for the 3090; it is probably a CPU limitation. The game looks horribly optimized, though.
 
I wouldn't put much value in current Cyberpunk 2077 benchmarks at the moment, much less treat them as representative of what to expect from general PC performance between the two architectures.

But we can with AC Valhalla?

in traditional manner nvidia got their own trojan-horse engineers on site to skew performance on competing architectures, er, "enhance performance on their own architecture".

Claims and conspiracy theories like these really need to be backed up by evidence.

Most multiplatformers will be built for RDNA2 first due to consoles, so Cyberpunk isn't setting any trends.

Titles being developed for consoles or not, there will be huge differences due to RT and DLSS and the overall more performant Ampere arch, aside from 4K becoming more of a thing.

I would probably do the same under similar circumstances, as getting a paycheck is more important than tech tyranny.

So, first you say you don't have anything against the studio, then go on to claim they got paid to botch performance on other hardware.

7 years later and the XBone/PS4 are still running circles around the dreaded Gen9 GT2 IGP that is in most PCs nowadays. Or even the newer Iris Plus in Ice Lake.

I'm sure the average gaming PC is more powerful than the One S/PS4 from 2013.

Anyway, I don't think there's any conspiracy, or that the developers got paid to lower performance, etc. DF noted in their video that this is another Crysis: a game developed for PC and then scaled down. Not only do the PS5/XSX suffer, but so do low-end PCs and the 2013 consoles.

I look at the released game that IMO has the best graphics out there and it's Demon's Souls

Nah, I'll side with DF: CP2077 is topping the list now.
 
Maybe one should ask AMD how happy they are about where their performance is right now with regard to driver optimizations.

That said, I miss performance numbers for my Vega 56. Only the unbuyable 6800 and one 5700 XT at Tom's against a whole slew of GeForces.
 
The pervasiveness of conspiracy theories about GPU vendors intentionally sabotaging performance in games made by independent studios is annoying. I doubt CD Projekt RED is not optimizing for AMD. Their biggest platform will probably be consoles.
It has happened in the past. And can the studios really be considered independent if what they are developing is sponsored...?
 
At 4K Ultra settings? Unlikely; then we should have close numbers for the 3000-series cards. BTW, no word on the drivers used on either side, and only with Intel CPUs.

Looks clearly CPU-limited on a 10900K at 1080p medium and 1440p medium. The top-end cards get held back by the CPU. Really want to see CPU scaling tests for this game. Have a feeling it'll be a bloodbath.
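
A crude way to sanity-check the "CPU-limited" read from published numbers: if the average frame rate barely moves when the resolution drops, the GPU isn't the bottleneck. A minimal sketch, with a hypothetical helper and placeholder fps values rather than anyone's real benchmark data:

def looks_cpu_bound(fps_by_setting, tolerance=0.05):
    # Heuristic: if fps varies by less than `tolerance` across resolutions,
    # the GPU is not the limiting factor.
    lo, hi = min(fps_by_setting.values()), max(fps_by_setting.values())
    return (hi - lo) / hi < tolerance

hypothetical = {"1080p medium": 112.0, "1440p medium": 109.0}  # placeholder values
print(looks_cpu_bound(hypothetical))  # True -> frame rate flat across resolutions, pointing at the CPU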
 
I wouldn't put much value in current Cyberpunk 2077 benchmarks at the moment, much less treat them as representative of what to expect from general PC performance between the two architectures.
It's a (deeply) nvidia-funded game, meaning nvidia had access to the game's code much earlier than AMD did for driver development, and in traditional manner nvidia got their own trojan-horse engineers on site to skew performance on competing architectures, er, "enhance performance on their own architecture".

I disagree with this. There is no secret: Nvidia has much more die area for RT. Without RT, the game performs well on RDNA 2 GPUs, a bit better than Nvidia's Ampere GPUs at 1080p and 1440p and a bit behind at 4K, like in some other games.

The game doesn't have RT on AMD GPUs yet, but they will probably develop it for the consoles and AMD GPUs at the same time. Afterwards, with RT, the game will probably perform much better on Nvidia GPUs.
 