AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

So going by what I'm reading here, there is a huge discrepancy between RDNA2 and Ampere based cards with RT disabled?
Just how big is it?

Although RT disabled is unfair to Nvidia, interesting data point.
 
So going by what I'm reading here, there is a huge discrepancy between RDNA2 and Ampere based cards with RT disabled?
Just how big is it?
If that's about CP2077 it seems to run slightly better on NV h/w than on AMD - but I'm sure that AMD will be able to get a 5-10% boost via driver optimizations.
Otherwise there's nothing shocking like 3070 being faster than 6900XT going on here - in contrast to what you may see in AC Valhalla which for some reason is fine among the same people.


https://www.guru3d.com/articles-pages/cyberpunk-2077-pc-graphics-perf-benchmark-review,1.html

Other benchmarks released so far show similar results:
https://www.purepc.pl/test-wydajnosci-cyberpunk-2077-jakie-sa-wymagania-sprzetowe
https://www.tomshardware.com/news/cyberpunk-2077-pc-benchmarks-settings-performance-analysis
https://www.pcgameshardware.de/Cybe...k-2077-Benchmarks-GPU-CPU-Raytracing-1363331/

Saying that the game is running bad on Radeons would be a lie.
 
Please give examples?
Plenty of examples and even developers' first-hand experiences have been provided in this forum tens or perhaps hundreds of times on numerous occasions (perhaps in this very same thread, even).
Then the usual negationists keep asking for the same proof that was handed to them several times already, because that way they get to shift the goalposts and discuss to death the one point (among the many presented) they see as potentially flawed.

I eventually grew tired of these tactics, and I honestly don't have the time or will to, for the 100th time, dig up the sub-pixel triangles in Geralt's hair and the interviews with AMD personnel all but confirming it was being done on purpose. On the off chance you're genuinely interested in examples, look up Robert Hallock's 2015 statements on that issue, for instance.
There's a reason I put those tidbits in strike-through: I just want to write what I think, but I'm not in the mood to fight over every word in that sentence yet again.


Regardless, those who've been paying attention and remember history saw nvidia leaning heavily on RT features in Cyberpunk coming from a mile away.




So first you say you don't have anything against the studio, then go on to claim they got paid to botch performance on certain hardware.
I guess this goes against your polarized with-us-or-against-us view of the world, but it is possible to #gasp# criticize something that you actually like.
Mind. Blown.



DF noted in their video that this is another Crysis: a game developed for PC first and then scaled down.
Unlike most games, which are developed for consoles first (since, you know, most of the revenue comes from those) and only then scaled up/down for PC hardware. Which is exactly why this game isn't setting any trends.
In your quest to try to dismiss my opinions you actually managed to reinforce them. Wow, thanks.



I disagree with this. There's no secret: Nvidia devotes much more die area to RT. Without RT, the game performs well on RDNA 2 GPUs, better than Nvidia Ampere GPUs at 1080p and 1440p, and a bit behind at 4K, like some other games.
Who suggested nvidia has no RT performance advantage?
 
If that's about CP2077 it seems to run slightly better on NV h/w than on AMD - but I'm sure that AMD will be able to get a 5-10% boost via driver optimizations.
Otherwise there's nothing shocking like 3070 being faster than 6900XT going on here - in contrast to what you may see in AC Valhalla which for some reason is fine among the same people.


https://www.guru3d.com/articles-pages/cyberpunk-2077-pc-graphics-perf-benchmark-review,1.html

Other benchmarks released so far show similar results:
https://www.purepc.pl/test-wydajnosci-cyberpunk-2077-jakie-sa-wymagania-sprzetowe
https://www.tomshardware.com/news/cyberpunk-2077-pc-benchmarks-settings-performance-analysis
https://www.pcgameshardware.de/Cybe...k-2077-Benchmarks-GPU-CPU-Raytracing-1363331/

Saying that the game is running bad on Radeons would be a lie.
What am I missing?
There seemed to be a whole page of huge concern over the normal kind of result where one vendor is slightly favored over the other.
Not the huge differences we have seen in the past.
 
I guess this goes against your polarized with-us-or-against-us view of the world

Keep personal attacks elsewhere, perhaps. It's a technical discussion about hardware and games, not personal views of the world and all.

Otherwise there's nothing shocking like 3070 being faster than 6900XT going on here - in contrast to what you may see in AC Valhalla which for some reason is fine among the same people.

Valhalla favoured AMD hardware. CP2077 seems to be good for both, seeing how a 3070 and 6800 perform relative to each other: a slight advantage to the 3070, but that's not surprising given the specs of the hardware.

Unlike most games that will be developed for consoles first (since you know, most of the revenue is coming from those and all) and only then they're scaled up/down for PC hardware. Which is exactly why this game isn't setting any trends.

Most games are developed with console specs as a baseline, which means low/mid-end PC hardware too. Games are then scaled up to take advantage of more powerful systems. Since scaling is doing so well these days, there's nothing to worry about there.
CP2077 is the most impressive-looking game so far, it's selling like nothing else on PC right now and it moves a lot of hardware. The gameplay is Deus Ex... It's setting a lot of trends, I think.
 
It doesn't mean it's wrong either. And it especially doesn't warrant calling it a conspiracy theory as a way to dismiss it without any actual thorough analysis/investigation, or at least a lookout for warning signs.

Isn’t that the usual refrain of conspiracy theorists? That it’s your job to prove that they are crazy?

There are no signs of nefarious behavior in Cyberpunk so the fear mongering is unnecessary.
 
If that's about CP2077 it seems to run slightly better on NV h/w than on AMD - but I'm sure that AMD will be able to get a 5-10% boost via driver optimizations.
Otherwise there's nothing shocking like 3070 being faster than 6900XT going on here - in contrast to what you may see in AC Valhalla which for some reason is fine among the same people.


https://www.guru3d.com/articles-pages/cyberpunk-2077-pc-graphics-perf-benchmark-review,1.html

Other benchmarks released so far show similar results:
https://www.purepc.pl/test-wydajnosci-cyberpunk-2077-jakie-sa-wymagania-sprzetowe
https://www.tomshardware.com/news/cyberpunk-2077-pc-benchmarks-settings-performance-analysis
https://www.pcgameshardware.de/Cybe...k-2077-Benchmarks-GPU-CPU-Raytracing-1363331/

Saying that the game is running bad on Radeons would be a lie.

At Ultra it runs better on Nvidia; at other settings it runs better on AMD RDNA2 GPUs. The first benchmark I saw was on High.
 
At Ultra it runs better on Nvidia; at other settings it runs better on AMD RDNA2 GPUs. The first benchmark I saw was on High.

What he means is that in the same performance segments (6800 vs 3070), they perform close enough, which tells us the game runs well on both architectures. It's not like Valhalla where things don't align as well (like his example of a 3070-class product outperforming a 6900 XT, which doesn't happen here).
 
So going by what I'm reading here, there is a huge discrepancy between RDNA2 and Ampere based cards with RT disabled?
Just how big is it?

Although RT disabled is unfair to Nvidia, interesting data point.
Seems like only on Ultra quality, however. Which may make some sense: if shader complexity increases with quality, then having more available ALU may be beneficial here, since the extra ALUs can saturate further. IIRC around 30% of a frame's instructions are typically INT32; once those sections are complete, Ampere will flip that pipeline back to FP32. So perhaps we're seeing a good amount of use from that, as long as the workload is very large here.

When the shaders are less complex, the front end is going to be a larger fraction of the frame time, and this will benefit the faster 6000-series cards, as the additional ALU on Ampere is not taken advantage of in the lighter loads. Wish we had 2080 Ti metrics to compare here (too lazy to search), but I think on lower quality presets the 2080 Ti should compete well with some of the 3000-series cards.
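
To put rough numbers on that idea, here's a toy back-of-envelope sketch, assuming an idealised Ampere-style SM partition where one datapath is FP32-only and the other can issue either FP32 or INT32, and a perfectly schedulable instruction mix. The ~30% INT32 figure is just the rule of thumb quoted above, not something measured from this game:

```python
# Back-of-envelope: effective FP32 issue rate for an Ampere-style datapath pair,
# where datapath A is FP32-only and datapath B can issue FP32 or INT32.
# This is an idealised toy model, not measured behaviour of any real shader.

def effective_fp32_per_cycle(int_fraction: float) -> float:
    """FP32 instructions issued per cycle per datapath pair, given the
    fraction of the instruction stream that is INT32."""
    fp_fraction = 1.0 - int_fraction
    if int_fraction <= 0.5:
        # Total issue rate (2 instructions/cycle) is the bottleneck:
        # N instructions take N/2 cycles, so the FP32 rate is 2 * (1 - int_fraction).
        return 2.0 * fp_fraction
    # Otherwise INT32 work saturates datapath B: N * int_fraction cycles for N instructions.
    return fp_fraction / int_fraction

if __name__ == "__main__":
    for i in (0.0, 0.3, 0.5):
        print(f"INT32 fraction {i:.0%}: ~{effective_fp32_per_cycle(i):.2f} FP32/cycle per pair")
```

On this toy model a pure-FP32 workload hits the full 2x rate, a ~30% INT32 mix lands around 1.4x, and a Turing-style pairing (one FP32 plus one dedicated INT32 datapath) tops out at 1 FP32/cycle per pair regardless, which is roughly the intuition behind heavier presets favouring Ampere's extra ALU.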
 
I think when the halo GPU can't even run 4K native at a 60 fps average on max settings and can only get there using medium, it seems that CP2077 is tailor-made to show off not only the ray tracing advantage but also DLSS. And you practically need DLSS when running ultra with RT (regardless of medium or ultra RT) at 1920x1080 even with the halo GPU. There is not much AMD can do in this case.
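
As a rough illustration of why DLSS matters so much here: the commonly quoted DLSS 2.x modes render internally at a fixed fraction of the output resolution per axis and reconstruct the rest, so shading cost drops roughly with the square of that factor. A small sketch using those commonly quoted scale factors (approximate, and not specific to Cyberpunk):

```python
# Approximate internal render resolutions for the commonly quoted DLSS 2.x modes.
# Scale factors are per-axis; pixel count (and roughly shading cost) scales with their square.
DLSS_MODES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.50,         # 50% per axis, i.e. a quarter of the pixels
    "Ultra Performance": 1 / 3,  # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_MODES[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for mode, s in DLSS_MODES.items():
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode}: renders at {w}x{h} (~{s * s:.0%} of the pixels)")
```

At 1920x1080 output, Quality mode drops the internal render to roughly 1280x720, which is the kind of relief the heavier RT presets apparently need even on the halo GPU.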
 
Produce a GPU with strong RT and ML Upscaling or their own comparable tech?

I'm actually glad that CDPR pushed the boat far out especially if this game is intended to last a long time.
As long as the settings are there to make it a playable experience on a range of products.

I'm super happy about CDPR pushing the envelope as well. This is what PC gaming is supposed to be: push to the max, even to the point that future hardware is needed for max settings. It's OK to play with non-maxed-out settings.
 
Wow, the game actually punishes 4c CPUs by a lot, and even the 6c models lose significant performance compared to the 8c models.

Hey guys remember when the official PC requirements came out and CD Projekt put the Core i7 4790 side-by-side with a Ryzen 5 3600?

[Image: Cyberpunk 2077 official PC system requirements chart]





PCGamesN actually pitted the 4790K against the Ryzen 5 3600 in their CPU benchmark:

[Image: PCGamesN CPU benchmark chart, Core i7-4790K vs Ryzen 5 3600]




:runaway::runaway::runaway::runaway:



What am I missing?
There seemed to be a whole page of huge concern over the normal kind of result where one vendor is slightly favored over the other.
Not the huge differences we have seen in the past.
There's not much worth discussing in the game's performance results without RT. (I wonder if bringing up RT-less results that no one was questioning is just another way to skew the narrative.)

Now if you turn RT all the way up, the game becomes an unusable mess in terms of performance, with questionable IQ enhancements, unless you have a recently released top-end card from only one IHV (working together with an upsampling technique that is exclusive to that same IHV).
Of course the negationists will be swearing up and down that it's all in our heads and CDPR or nvidia would never do something like that, because they said so and they always tell the truth. It is known.
Just like that time Jensen Huang told an audience they could buy a laptop with a ~8TF RTX 2080 mobile and it would perform the same as or better than the new consoles. Always the truth.


The fact that RT-enabled Cyberpunk is running like ass on the Turing graphics cards that cost over $1000 half a year ago is just ~~planned obsolescence from nvidia~~ an unfortunate coincidence that propels Turing owners to upgrade to the new and much better GPUs.
And it's totally not the same as the Kepler-killing super geometry we saw in some games right after Maxwell came out. :nope:





Isn’t that the usual refrain of conspiracy theorists? That it’s your job to prove that they are crazy?
Now you're up to the point of calling people crazy? Here's a post in this very same thread from not even a week ago.

My experience as an AAA dev is the opposite: AMD providing code that works well on NV/AMD, but NV providing code that works well on NV and is slow on AMD. (And changes could be made to improve AMD perf with little to no impact on NV perf...)
 