Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Do we have any evidence which suggests otherwise with any degree of certainty?
If it’s performing noticeably better on an aging R9 390, I’d say that’s some evidence. This is DX11, even. It’s crazy that people are so averse to this. Everyone accepted that AMD’s driver sucked back when the reverse was true.
 
If it’s performing noticeably better on an aging R9 390, I’d say that’s some evidence. This is DX11, even. It’s crazy that people are so averse to this. Everyone accepted that AMD’s driver sucked back when the reverse was true.
Yeah, which means that this may be an application-specific issue.
 
Other than AMD GPUs being faster when CPU limited? Using less of the CPU when not? What commonality is required?
The one which would explain why some titles show the issue while others don't. As I've said, more testing is required.

Note that "AMD GPUs being faster when CPU limited" is a false statement already.
 
The one which would explain why some titles show the issue while others don't. As I've said, more testing is required.

Note that "AMD GPUs being faster when CPU limited" is a false statement already.
Actually the count is at 8 now. I forgot the Fortnite video that was posted.
 
Those several thousand haven't been tested. Anyway, I don't think anything short of a PR statement from Nvidia would convince you.
Right, and unless you want to test those several thousand, you need to provide something which would allow you to say which games, on which systems, will actually exhibit this behavior. Which means further investigation is needed. I don't think Nvidia PR will be able to handle that, do you?
 
Right, and unless you want to test those several thousand, you need to provide something which would allow you to say which games, on which systems, will actually exhibit this behavior. Which means further investigation is needed. I don't think Nvidia PR will be able to handle that, do you?
You sure didn’t require several thousand games be tested when referring to AMD’s poor driver.
 
You sure didn’t require several thousand games be tested when referring to AMD’s poor driver.
I sure did, and I urge you to find anything which would suggest otherwise.
And since you won't be able to, I suggest you stfu about Nv PR and the rest of those stupid personal attacks.
 
Here's another one: COD Modern Warfare, 3080 vs 6800 XT on an i7-8700K. For pretty much equal frame rates/frame times over the same run, the AMD GPU uses less CPU.


Framerate and frametimes are very similar, though the Radeon seems to have less variance and better stability.
[image: upload_2021-3-14_19-57-30.png — frametime comparison]

If we look at CPU usage, AMD has a very clear advantage.

[image: upload_2021-3-14_19-58-20.png — CPU usage comparison]
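For anyone who wants to replicate this kind of comparison, here is a minimal sketch of how the numbers in those charts (average FPS, 1% lows, frametime variance, average CPU usage) could be pulled out of a per-frame log. The column names and file layout are assumptions, modelled loosely on CapFrameX/HWiNFO-style CSV exports, not the exact capture files used here.

```python
import csv
import statistics

def summarize(rows):
    """Summarize one run; each row is a dict with hypothetical
    'frametime_ms' and 'cpu_pct' columns."""
    ft = sorted(float(r["frametime_ms"]) for r in rows)
    cpu = [float(r["cpu_pct"]) for r in rows]
    worst = ft[int(len(ft) * 0.99):]  # slowest 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.mean(ft),
        "p1_low_fps": 1000.0 / statistics.mean(worst),
        "frametime_stdev_ms": statistics.pstdev(ft),
        "avg_cpu_pct": statistics.mean(cpu),
    }

def load_run(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# Usage (hypothetical file names):
# nv = summarize(load_run("3080_run.csv"))
# amd = summarize(load_run("6800xt_run.csv"))
# Compare nv["avg_cpu_pct"] vs amd["avg_cpu_pct"] at similar avg_fps.
```

The point of normalising like this is that "uses less CPU" only means something when the frame rates over the run are roughly equal, which is exactly the condition shown above.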
 
Again, a title which generally favors Radeons, even in GPU-limited scenarios.

I think that the titles which tend to favor GeForces in GPU-limited scenarios are a more interesting point of research here. Control, for example.

I imagine we'll see follow up testing by a variety of reviewers, so we'll find out soon enough.
 
Tried Control - DX11 is faster than DX12. DX11 has better multi-threading, too... like WoW.
I guess developers either ignore nVidia's DX11 driver and live with a slower DX12 path, or they go the "max out every core" route to "optimize" the DX12 path.
 
I imagine we'll see follow up testing by a variety of reviewers, so we'll find out soon enough.
That's what I'm hoping for really.

Tried Control - DX11 is faster than DX12. DX11 has better multi-threading, too... like WoW.
I guess developers either ignore nVidia's DX11 driver and live with a slower DX12 path, or they go the "max out every core" route to "optimize" the DX12 path.
Or they just don't optimise their games for Nv h/w at all. That worked fine for the period when Nv had the top end to themselves, but now it's starting to backfire on them.
(I'm not very fond of developers, yeah.)
 
Quake 2 RTX comparison between 3080 and 6900 XT:

Looks fine for both GPUs. The 3080 is ~2x as fast with the same CPU utilisation.
 