So going by what I'm reading here, there is a huge discrepancy between RDNA2 and Ampere based cards with RT disabled?
Just how big is it?
Although RT disabled is unfair to Nvidia, interesting data point.
If that's about CP2077, it seems to run slightly better on NV h/w than on AMD - but I'm sure that AMD will be able to get a 5-10% boost via driver optimizations.
Please give examples?
Plenty of examples and even developers' first-hand experiences have been provided in this forum tens or perhaps hundreds of times, on numerous occasions (perhaps in this very same thread even).
So, first you say you don't have anything against the studio, then go on to claim they got paid to botch performance on different hardware.
I guess this goes against your polarized with-or-against-us view of the world, but it is possible to #gasp# criticize something that you actually like.
DF noted in their video that this is another Crysis, a game developed for PC and then downscaled.
Unlike most games, which are developed for consoles first (since, you know, most of the revenue comes from those) and only then scaled up/down for PC hardware. Which is exactly why this game isn't setting any trends.
I disagree with this. There is no secret: Nvidia has much more die area for RT. Without RT, the game performs well on RDNA 2 GPUs - better than Nvidia Ampere GPUs at 1080p and 1440p, and a bit behind at 4K, like some other games.
Who suggested Nvidia has no RT performance advantage?
Otherwise there's nothing shocking going on here, like a 3070 being faster than a 6900 XT - in contrast to what you may see in AC Valhalla, which for some reason is fine among the same people.
https://www.guru3d.com/articles-pages/cyberpunk-2077-pc-graphics-perf-benchmark-review,1.html
Other benchmarks released so far show similar results:
https://www.purepc.pl/test-wydajnosci-cyberpunk-2077-jakie-sa-wymagania-sprzetowe
https://www.tomshardware.com/news/cyberpunk-2077-pc-benchmarks-settings-performance-analysis
https://www.pcgameshardware.de/Cybe...k-2077-Benchmarks-GPU-CPU-Raytracing-1363331/
Saying that the game is running badly on Radeons would be a lie.
It's a game from NV's program, so we should all be very concerned, apparently for no reason.
What am I missing? There seemed to be a whole page of huge concern about what looks like the normal kind of result that slightly favors one vendor over the other, not the huge differences that we have seen in the past.
Just because something is plausible doesn't mean it's true.
It doesn't mean it's wrong either. And it especially doesn't warrant calling it a conspiracy theory as a way to dismiss it without any actual thorough analysis/investigation, or at least a lookout for warning signs.
Please give examples?
Without going into specifics: DX 10.1 and HairWorks.
It runs better on Nvidia at Ultra; at other settings it runs better on AMD RDNA 2 GPUs. The first benchmark I saw was on High.
Seems like only on Ultra quality, however. Which may make some sense: if shader complexity increases with quality, then having more available ALU may be beneficial here, since the ALUs can be saturated further. IIRC, typically around 30% of a frame's instructions are INT32; once those sections are complete, Ampere will flip that pipeline back to FP32. So perhaps we're seeing a good amount of use from that, as long as the workload is very large.
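To put rough numbers on that idea, here's a minimal back-of-the-envelope sketch, not taken from any profiling of this game. It assumes the publicly described Ampere SM partition layout (one dedicated FP32 datapath plus one shared FP32/INT32 datapath), perfect scheduling with no other stalls, and treats the ~30% INT32 share mentioned above as a given rather than a measured value; the function name is just illustrative.

```python
# Back-of-the-envelope: how much of Ampere's "doubled" FP32 throughput is
# realized as a function of the INT32 share of the shader instruction mix.
# Assumptions (illustrative only, not profiled from Cyberpunk 2077):
#   - one dedicated FP32 datapath + one shared FP32/INT32 datapath per SM partition
#   - perfect scheduling, no memory or occupancy stalls

def ampere_fp32_per_clock(int32_fraction: float) -> float:
    """Effective FP32 ops issued per clock (peak is 2.0 with a pure-FP32 mix)."""
    # All INT32 work is forced onto the shared pipe, so issuing N ops takes
    # max(N / 2, int32_fraction * N) cycles; FP32 ops fill whatever is left.
    cycles_per_op = max(0.5, int32_fraction)
    return (1.0 - int32_fraction) / cycles_per_op

for frac in (0.0, 0.15, 0.30, 0.50):
    rate = ampere_fp32_per_clock(frac)
    print(f"INT32 share {frac:.0%}: {rate:.2f} FP32/clk ({rate / 2.0:.0%} of peak)")
```

Under those idealized assumptions, a 30% INT32 mix still leaves about 1.4 FP32 ops per clock per partition (roughly 70% of the doubled peak), and the realized FP32 rate climbs as the mix gets more FP-heavy - which is the kind of effect being speculated about for the heavier Ultra-quality shaders.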
The game seems to be hitting CPUs pretty hard too, so if you, say, test it on a Zen 3 platform with SAM enabled on N21 cards, there may be wins in less GPU-limited scenarios because of this.
There is nothing much AMD can do in this case.
Produce a GPU with strong RT and ML upscaling, or their own comparable tech?
I'm actually glad that CDPR pushed the boat far out especially if this game is intended to last a long time.
As long as the settings are there to make it a playable experience on a range of products.
There's not much worth discussing in the game's performance results without RT. (I wonder if bringing up RT-less results that no one was questioning is just another means to skew the narrative.)
Isn't that the usual refrain of conspiracy theorists? That it's your job to prove that they aren't crazy?
Now you're up to the point of calling people crazy? Here's a post in this very same thread from not even a week ago.
My experience as a AAA dev is the opposite: AMD provides code that works well on both NV and AMD hardware, while NV provides code that works well on NV and is slow on AMD. (And changes could be made to improve AMD perf with little to no impact on NV perf...)