Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Not really. 1080p 360 Hz monitors exist. Try pushing max frames in COD Warzone. You need a pretty high-end GPU paired with a high-end CPU to make it work.
If I want to play an "FPS" online I will play ARMA 3 (I am a veteran)... COD is too "arcade-ish" for me... same with the Battlefield games.
The best "army" FPS I tried was the OG "Flashpoint"... the worst must be split between COD/BF.
 
The CPU is unlikely to be the bottleneck in any way at those settings; however, that's completely irrelevant to this discussion. It's not an RTX 3090 issue, not even an Ampere issue, since the RTX 2080 Ti behaved the same way.

So NVIDIA does better than AMD at maximum settings/4K... when I play games I'm mostly GPU limited... aka doing it the right way.

If I remember correctly AMD is sub-30 FPS in WDL at 4K/DXR/Ultra settings, while the 3090 is 60 FPS with DLSS Quality /shrugs
 
So, instead of showing how well nVidia and AMD can handle ray tracing at 1080p, HUB is doing low-end CPU tests with >$800 graphics cards in games which have broken DX12 paths on nVidia hardware...
They could have used Call of Duty Black Ops. It runs at >200 FPS on a 3090...
I'm pretty sure neither the 5700 XT nor the 5600 XT ever sold anywhere near $800, let alone over it; the 3070 might be over it in the current situation though.
You're actually suggesting an NVIDIA-sponsored title, with RT and all, has a broken DX12 path on NVIDIA hardware? You really think they'd let that past QA at NVIDIA?

Do we really have users here who simply can't ever see any fault in their favorite companies, or wth is going on?

So NVIDIA does better than AMD at maximum settings/4K... when I play games I'm mostly GPU limited... aka doing it the right way.
This isn't about what some company is doing better than some other company at whatever settings.
This is about one company's video cards across at least two generations showing higher CPU usage with low-level APIs compared to another company's video cards, also across two generations.
Shouldn't be that hard a concept to grasp, even when it doesn't apply to you. Not everyone is running a high-end CPU and GPU.

Also, I'm pretty sure that, for example, an RTX 2060 won't do much better than a 5600 XT at 4K max settings, so maybe use, you know, a specific model instead of a company?
If I remember correctly AMD is sub-30 FPS in WDL at 4K/DXR/Ultra settings, while the 3090 is 60 FPS with DLSS Quality /shrugs
Apples to oranges is always fun.
 
Interesting how people are reacting to HUB's inquiry into Nvidia's driver overhead, and how they reacted to stories of AMD's high driver overhead during the GCN era. :D

Has nothing to do with driver overhead and everything to do with the fact that these "low-level APIs" happen to be in AMD-sponsored titles. (Haven't watched the video yet.)

Jeez. So now the Nvidia Defence Force are protesting things without even considering the evidence.

Of course, because NVIDIA can never be at fault, right? :rolleyes:

Nvidia is never wrong. Don't you know that?

https://forum.beyond3d.com/posts/1009246/

If you have worse results in DX12 than in DX11 then it's the software's fault since it is worse at managing the h/w than the DX11 driver.
So to answer your question - yes, this isn't a "fault" of Nvidia.
Interesting take.

https://forum.beyond3d.com/posts/2087957/
DegustatoR said:
RDNA's D3D11 driver may be rather poor considering that it's been less than half a year since its launch and AMD generally isn't that good in D3D11.
 
I've watched the video and the findings themselves are interesting but it's a bit of a stretch to go from them to "Nvidia has a driver overhead problem, lol".

They should've done more testing in DX12 and Vulkan. They say that they did but no results are presented.
They should've done CPU thread load diagrams. They are comparing 6C/12T CPUs which are progressively faster in single-thread performance - this means that if it is a driver limitation it should show up as a single thread being 100% loaded.
They should've compared these results to DX11 results in the same games with the same graphics settings.

As it is, it's not an indication of a driver issue and can just as easily be an application-side issue.
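For what it's worth, here's a rough way to check the "single thread pegged at 100%" idea while a benchmark scene runs. This is just a minimal sketch of my own, not anything from the video, and it assumes Python with the psutil package installed:

```python
# Sample per-core CPU load while the game/benchmark is running.
# If a driver limitation is serializing work, one core should sit near 100%
# while the all-core average stays much lower; an application-side limit
# tends to spread the load more evenly.
import psutil

SAMPLES = 30  # roughly 30 seconds of data

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % load per logical core
    hottest = max(per_core)
    average = sum(per_core) / len(per_core)
    print(f"hottest core: {hottest:5.1f}%   all-core average: {average:5.1f}%")
```

A proper profiler would obviously give far better data, but even a crude check like this would show whether one thread is the wall.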

Jeez. So now the Nvidia Defence Force are protesting things without even watching or reading things.
Jeez, now the AMD defense force is going through posting history to prove that someone is wrong instead of actually presenting anything of value.
 
I think I have to agree that there isn't nearly enough testing here to determine a true issue with Nvidia DX12 driver overhead. There need to be multiple games in both DX12 and Vulkan with a similar testing format, but almost the entire video is focused on one game.
 
I've watched the video and the findings themselves are interesting but it's a bit of a stretch to go from them to "Nvidia has a driver overhead problem, lol".

They should've done more testing in DX12 and Vulkan. They say that they did but no results are presented.
They should've done CPU thread load diagrams. They are comparing 6C/12T CPUs which are progressively faster in single-thread performance - this means that if it is a driver limitation it should show up as a single thread being 100% loaded.
They should've compared these results to DX11 results in the same games with the same graphics settings.

As it is, it's not an indication of a driver issue and can just as easily be an application-side issue.

So, just to confirm, you do actually acknowledge Hardware Unboxed's findings and journalistic integrity now?

A couple months ago you were leading the charge of basically calling them AMD mouthpieces.

Jeez, now the AMD defense force is going through posting history to prove that someone is wrong instead of actually presenting anything of value.

Cute.
I'm just trying to figure out why you're in every thread rabidly defending Nvidia, shaming independent tech journalists, and claiming that anyone who says anything remotely critical of Nvidia is an AMD shill.
 
I'm pretty sure neither the 5700 XT nor the 5600 XT ever sold anywhere near $800, let alone over it; the 3070 might be over it in the current situation though.
You're actually suggesting an NVIDIA-sponsored title, with RT and all, has a broken DX12 path on NVIDIA hardware? You really think they'd let that past QA at NVIDIA?

Do we really have users here who simply can't ever see any fault in their favorite companies, or wth is going on?

Sure. I guess I have just missed the $250 GTX 1060, or GTX 1660 Ti, or whatever nVidia card was released prior to Ampere.
Neither Ubisoft nor DICE has cared to optimize their base DX12 path for nVidia hardware. One is broken (BF5), the other (Watch Dogs) is just unoptimized for nVidia hardware.
The same will happen with other games. But that is what all these developers wanted: more control over the GPU.
 
So, just to confirm
You can confirm whatever you want, you're fully able to do so without me.

I'm just trying to figure out why you're in every thread rabidly defending Nvidia
AMD fans have a weird view on what other posters are doing. Tough luck I guess.

Apparently saying that the RDNA DX11 driver could be bad for now, since it's a new architecture, is "rabidly defending Nvidia" in your view - which says enough about you and anything you could say, really.
 
I'm not that knowledgeable about GPU architectures, but it sounds like Nvidia does more work upfront in the driver before submitting work to the GPU, and AMD's hardware is better at scheduling work submitted from the driver. It does make an interesting case for esports gamers who are looking for max performance at 1080p or 1440p with low settings. AMD might offer a path to a much cheaper solution in that particular case.
 
I'm not that knowledgeable about GPU architectures, but it sounds like Nvidia does more work upfront in the driver before submitting work to the GPU, and AMD's hardware is better at scheduling work submitted from the driver.
The CPU (average on all cores) has a higher load on NV than on AMD.
It's rather unlikely that it has anything to do with any kind of scheduling.
It looks as if the driver is doing some sort of runtime validation/translation, the reason for which is unknown and which is absent on AMD's side.
But it's certainly something NV should look into. If they are doing something extra in the driver there, it's possible they should disable it in CPU-limited scenarios - as the only reason I can think of for them to do this is to improve GPU-limited performance.
 
Pairing $800+ GPUs with lower-end CPUs and then playing at 1080p without any form of reconstruction, and without using the headroom to enable next-gen features like ray tracing.

Guess users with that kind of setup would be interested in DF's findings on CRT screens.
 
Pairing $800+ GPUs with lower-end CPUs and then playing at 1080p without any form of reconstruction, and without using the headroom to enable next-gen features like ray tracing.

Guess users with that kind of setup would be interested in DF's findings on CRT screens.
Regardless of how many people would be playing with these configurations and settings, investigations like these are interesting. Just like CPU reviews test games at 720p and 1080p low with the fastest video cards available.

If these new cards were available at RRP, I could see people matching an RTX 3070/3080 or RX 6800/XT with Zen+ parts.

As for more testing into the issue, yes there are more avenues to investigate, but don't forget (I think) Steve said in the video he already did over 1000 benchmark runs. There is also the previous video which shows more games.

I don't think there should be any outrage at these results, but they are interesting, and shouldn't be shot down as wrong or pointless just because they could show that Nvidia isn't all conquering :???:
 
While 1080p performance is moot for cards like the 3090, the bigger problem is that the fps gets capped lower than what high-refresh-rate displays can do, along with lower 1% lows. Gaming at 144 Hz and beyond means many people are not trying to induce a GPU bottleneck by loading their cards with every uber-ultra setting to play at 30-60 fps.
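To put rough numbers on that (my own back-of-the-envelope math, not figures from the video), the per-frame budget shrinks quickly at high refresh rates, so a couple of extra milliseconds of CPU/driver work per frame is enough to cap fps well below what the display can show:

```python
# Frame-time budget per refresh rate, and how a small per-frame CPU cost
# increase lowers the CPU-limited fps ceiling regardless of the GPU.
for hz in (60, 144, 240, 360):
    print(f"{hz:3d} Hz -> {1000.0 / hz:5.2f} ms per frame")

# e.g. CPU-side frame cost growing from 5 ms to 7 ms:
print(1000.0 / 5.0, 1000.0 / 7.0)  # ceiling drops from 200 fps to ~143 fps
```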

That's why the nvidia subreddit is full of people with Zen2 and below CPUs who found their nvidia 'upgrade' didn't pan out as they wanted.

 
I think I have to agree that there isn't nearly enough testing here to determine a true issue with Nvidia DX12 driver overhead. There need to be multiple games in both DX12 and Vulkan with a similar testing format, but almost the entire video is focused on one game.

They test Watch Dogs Legion, Horizon Zero Dawn, and Shadow of the Tomb Raider. So 3 games.

I recommend this video that HU references btw:


It's from 2017 but gives a good background on why we may be seeing this discrepancy now. This may be down to architectural differences that hurt AMD in the past but may be helping them now.
 
Pairing $800+ GPUs with lower-end CPUs and then playing at 1080p without any form of reconstruction, and without using the headroom to enable next-gen features like ray tracing.

Using 'reconstruction' would just further push the bottleneck to the CPU, so that really wouldn't apply here. Regardless, we can recognize where Nvidia has advantages and also recognize where perhaps they may fall short. And as the video shows, it's not simply pairing "$800 GPUs with lower-end CPUs" - there were cases where a 5600 XT actually was more performant than a 3070, and that was at 1080p with framerates just above 60 fps and a Ryzen 5 2600X, which is absolutely not a rare CPU for gamers. It's worth noting that.
Guess users with that kind of setup would be interested in DF's findings on CRT screens.

It's interesting because it's interesting, and points to bottlenecks - just because your CPU isn't completely maxed out doesn't mean it's not an issue; Windows can frequently kick off a background process, and it helps to have more cycles free. HU noted this in their SOTTR benchmarks: even limiting it to 60 fps, the Radeon 'felt' smoother, as there were fewer small stutters compared to Nvidia, where the CPU would hit 90%+ more frequently.
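For anyone unfamiliar with why a run can 'feel' worse at the same average fps, this is roughly how 1% low style metrics fall out of a frame-time log. A minimal sketch with made-up numbers, using one common definition, not HUB's actual data or method:

```python
# Mostly 60 fps frames (16.7 ms) with a handful of 30 ms stutters.
frame_times_ms = [16.7] * 990 + [30.0] * 10

# Average the slowest 1% of frames and convert back to fps.
worst_1_percent = sorted(frame_times_ms)[-len(frame_times_ms) // 100:]
one_percent_low_fps = 1000.0 / (sum(worst_1_percent) / len(worst_1_percent))

average_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average: {average_fps:.1f} fps, 1% low: {one_percent_low_fps:.1f} fps")
# The few long frames barely move the average (~59 fps) but drop the
# 1% low to ~33 fps, which is what shows up as visible stutter.
```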

This kind of silly hand-waving platform-fanboy stuff is this forum at its worst; this exact kind of detail-oriented discussion is what Beyond3D should be about. If you think it's irrelevant, maybe just move on.

All that being said, I want more data - testing with GTX 1660s/1060s/2060s and very common CPUs like the 8400.
 
You can confirm whatever you want, you're fully able to do so without me.

and I’m glad you got around to actually watching the video, rather than sticking to wild speculation.

AMD fans have a weird view on what other posters are doing. Tough luck I guess.

Apparently saying that the RDNA DX11 driver could be bad for now, since it's a new architecture, is "rabidly defending Nvidia" in your view - which says enough about you and anything you could say, really.

I’m just going by your record on this forum, e.g., the HUB thread, where you: condemned HUB, implying that they have no credibility; and defended, and indeed encouraged, the disturbing trend of Nvidia strong arming journalists over how they conduct reviews.

But sure, I guess defending independent tech journalism makes me an AMD fanboy.
 