Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Maybe, but here I'm at everything maxed including DXR at 4K60 in 64-player matches on a 2080 Ti and a Zen 2 CPU. No idea what the fuss is about; BFV doesn't run badly, and gaming at 1080p on modern high-end hardware isn't all that interesting outside of esports, perhaps.
There are two points which Nvidia should be concerned about:
A) People do upgrade GPUs more often than CPUs. Which means that a situation where someone with a 9600K swaps a Vega 64 for a 3070 and ends up with worse performance is very possible.
B) As this generation kicks into higher gear, the consoles' Zen 2 8C/16T ~3.7 GHz CPUs will be pushed to 100% more often just to reach not 120 but 60 and even 30 fps. This means there will be A LOT of people with pre-2020 4C-8C CPUs who may run into CPU limitations.
Point B is important because this may start showing up outside of theoretically engineered situations where a 3080 runs 1080p at 200 fps and loses to a 5600 XT at 240 fps. It may start showing up at 4K and at around 30-60 fps, which is the main playable territory.
In fact I wonder if it's already happening in games like AC Valhalla. I still wonder if that's actually CPU related though.
 
High refresh rate monitors are becoming more and more common. If you're a 60Hz gamer, this probably won't be an issue. If you're a high refresh gamer, this information is going to be pretty valuable. I don't think it's fair to assume that a person with low to midrange hardware is necessarily going to only play at 60Hz. There are options like 1080p low to try to hit 144. I know because that's what I did. Playing at low settings in Battlefield 5 multiplayer is probably the most common setup.

At the lower end of the spectrum, perhaps yes. The problem then seems to lie in gamers using older generations of CPUs, I think. Any decent 8-core Zen 2 CPU coupled with, say, a 2070 or higher won't have all that much trouble achieving 1080p low/144, I think.
I actually don't know about that last point; the people I play with/who play on our servers do not play on lowest settings, nor at 1080p. Anyway, people getting, say, a 6800 XT or 3080-class GPU are usually sitting with under-powered old CPUs, right? :p
This is not to say NV shouldn't (try to) fix the higher CPU load issue for those that don't want to or can't upgrade their CPUs to 2020/21 or at least console standards, though.

There are two points which Nvidia should be concerned about:
A) People do upgrade GPUs more often than CPUs. Which means that a situation where someone with a 9600K swaps a Vega 64 for a 3070 and ends up with worse performance is very possible.
B) As this generation kicks into higher gear, the consoles' Zen 2 8C/16T ~3.7 GHz CPUs will be pushed to 100% more often just to reach not 120 but 60 and even 30 fps. This means there will be A LOT of people with pre-2020 4C-8C CPUs who may run into CPU limitations.
Point B is important because this may start showing up outside of theoretically engineered situations where a 3080 runs 1080p at 200 fps and loses to a 5600 XT at 240 fps. It may start showing up at 4K and at around 30-60 fps, which is the main playable territory.
In fact I wonder if it's already happening in games like AC Valhalla. I still wonder if that's actually CPU related though.

In short, people need to upgrade. Just like with storage, people on pre-NVMe drives should perhaps upgrade. It's a new generation, time for new hardware...
 
High refresh rate monitors are becoming more and more common. If you're a 60Hz gamer, this probably won't be an issue. If you're a high refresh gamer, this information is going to be pretty valuable. I don't think it's fair to assume that a person with low to midrange hardware is necessarily going to only play at 60Hz. There are options like 1080p low to try to hit 144. I know because that's what I did. Playing at low settings in Battlefield 5 multiplayer is probably the most common setup.
It can still affect 60Hz gaming though - for example on my system with the 1660, when Horizon Zero Dawn stutters below 60fps at 1080p, the vast majority of the time it's because it spikes a CPU core to 100%. Now certainly it's still not a great port and a Radeon could very well have the same result, but instead of the ~60% CPU the game is constantly taking before these spikes, what if it had 10-20% more headroom to weather them? On a non-VRR display with a framerate cap, these drops are also going to be far more noticeable when they occur.

It's similar to what HU found when running SOTTR capped at 60fps with a 3080 vs a 6800 on a Core i3-10100. The "avg frames" readout from both was basically identical at 59.9fps, but HU reported there were noticeable stutters that could be felt on the 3080, likely due to the avg CPU requirements being 10-20% higher across the run.

[Image: HU's SOTTR 60fps-capped comparison, 3080 vs 6800 on the i3-10100]
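To put rough, made-up numbers on that headroom argument (these are purely illustrative figures, not measurements from either game): at 60Hz you have a ~16.7 ms budget per frame, so the same CPU spike that a lighter baseline load can absorb will blow the budget when the baseline load is higher, and on a non-VRR display that's a visible hitch.

```python
# Illustrative only: how a higher baseline CPU cost turns the same spike
# into a missed 60 Hz frame. All numbers are made up, not measured.
FRAME_BUDGET_MS = 1000 / 60            # ~16.7 ms per frame at 60 Hz

def misses_vsync(baseline_cpu_ms, spike_ms):
    """True if a CPU spike pushes the frame past the 60 Hz budget."""
    return baseline_cpu_ms + spike_ms > FRAME_BUDGET_MS

spike = 5.0                             # hypothetical worst-case spike, in ms
for baseline in (10.0, 12.0):           # lighter vs heavier baseline CPU load
    verdict = "dropped frame" if misses_vsync(baseline, spike) else "fits the budget"
    print(f"{baseline:.0f} ms baseline + {spike:.0f} ms spike -> {verdict}")
```

The absolute numbers don't matter; the point is that the margin before a missed refresh shrinks as the baseline CPU cost grows.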
 
...
In short, people need to upgrade. Just like with storage, people on pre-NVMe drives should perhaps upgrade. It's a new generation, time for new hardware...

Or save money and buy the GPU that will perform better on your lower-end CPU? Like, if you have infinite money, sure, just upgrade.
 
Or save money and buy the GPU that will perform better on your lower-end CPU? Like, if you have infinite money, sure, just upgrade.
Not sure how you would "save money" though. Let's assume that you get a cheaper Radeon instead of a more expensive GeForce because the Radeon runs faster on your CPU. What happens after you upgrade the CPU? You'll need to buy another GPU to get the performance the more expensive GeForce would now deliver on the new CPU.
I don't think you can save money here. You could get a Radeon which would be as fast as a GeForce while costing the same, though, if you're unwilling to upgrade your CPU any time soon.
 
Not sure how you would "save money" though. Let's assume that you get a cheaper Radeon instead of a more expensive GeForce because the Radeon runs faster on your CPU. What happens after you upgrade the CPU? You'll need to buy another GPU to get the performance the more expensive GeForce would now deliver on the new CPU.
I don't think you can save money here. You could get a Radeon which would be as fast as a GeForce while costing the same, though, if you're unwilling to upgrade your CPU any time soon.

Yah, there are tons of scenarios. Some people keep CPUs for 6-8 years, especially budget-conscious people. Really depends on what they have and how long they want it to last. You can come up with endless upgrade scenarios and it's hard to know which ones are the most prevalent. Ultimately I do think reviewers should be helping users save money. Just buying upgrades all the time isn't financially responsible or feasible for most people.

It's kind of interesting. The methodology for GPU testing makes sense, where you use the fastest CPU possible to avoid CPU bottlenecks. So you might review an RTX 3060 with a 10th-gen i7 or i9. That'll show you how fast the GPU can go. But is that really who is buying a 3060? Many of them probably have older i5s or Ryzen 5s. In a sense this part of the story was missed by what seems like good methodology. I'm curious how a 3060 review changes if you pair it with a 3600X or a 9600K. There may be a space for someone doing reviews based on the most common CPUs - a budget reviewer that tests the mid and low-end cards with the most common CPUs of the last two or three years.
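A crude way to picture why the usual methodology can miss this (my own sketch with invented per-frame costs, not benchmark data): treat frame time as roughly the longer of the CPU and GPU cost per frame. Two cards can tie on a fast CPU and still split apart on a slow one if one of them needs more CPU time per frame.

```python
# Toy bottleneck model, not real benchmark data: frame time is roughly
# whichever of the CPU or GPU cost per frame is longer.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

gpu_ms = 8.0                                  # assume both cards render a frame in 8 ms
fast_cpu = {"card_A": 5.0, "card_B": 4.0}     # high-end CPU: both stay under the GPU cost
slow_cpu = {"card_A": 12.0, "card_B": 9.5}    # older CPU: card A's extra CPU cost now limits

for card in ("card_A", "card_B"):
    print(f"{card}: fast CPU {fps(fast_cpu[card], gpu_ms):.0f} fps, "
          f"slow CPU {fps(slow_cpu[card], gpu_ms):.0f} fps")
```

With the fastest CPU both cards land on the same number, which is exactly what a standard review would report.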
 
@troyan He's using DX11 with future frame rendering on and all settings to low, which is basically the esports setup for Battlefield. Not sure why you wouldn't believe that this person is being honest about their video. It's from 2019, well before this issue was highlighted by Hardware Unboxed to the broader community. It's likely he posted it hoping for real answers. If you spend money on a new GPU and it seemed to run much worse, you'd probably be very disappointed.

Because more people would have seen this problem in BF5. On YouTube there are videos with GeForce cards and 4C Intel processors which don't have problems:

To be clearer, the user goes from GPU limited to fully CPU limited with much lower performance. The R9 390 is quite a bit faster and is bottlenecked on GPU utilization. He switches to the GTX 1660 Ti and performance drops quite a bit because the CPU hits 100% and can't feed the GPU fast enough. GPU utilization is in the 55-60% range.

Because something went wrong. Wrong settings, a game update, a broken driver at the time, etc. It's still a PC platform.

/edit: BTW: He found his fix:
 
Because more people would have seen this problem in BF5. On YouTube there are videos with GeForce cards and 4C Intel processors which don't have problems:



Because something went wrong. Wrong settings, a game update, a broken driver at the time, etc. It's still a PC platform.

Those two videos are consistent with the other BFV video ...

In both cases the GPU is bottlenecked by the CPU, showing low GPU utilization and sub-100fps performance. The 2060 is even running about the same as the 1660, which it obviously shouldn't. Not to mention the CPUs are better in these BFV videos than in the one comparing the 390 to the 1660. The system with the Radeon 390 actually seems to run better than the two vids you posted.
 
...

/edit: BTW: He found his fix:

His "fix" was disabling fullscreen optimizations and overclocking his cpu and ram. He went from 4.5 to 4.6 GHz on his cpu and overclocked his ram from 1866 to 2133 including tweaking primary and secondary timings. Memory latency can have a huge impact on game performance. Did he find a "fix" or did he work around it with overclocking?
 
Overclocking will not decrease the CPU utilisation like this.

He's given the CPU enough headroom that the GPU is now the bottleneck, which is why CPU utilization is lower. The CPU core overclock is very small, around 2.2%. The improvement in memory latency might be more significant: it could be a 12% latency reduction or more, which saves a lot of clock cycles every time the CPU accesses RAM. Unfortunately the guy never showed results with each change independently and never tested the CPU changes with the AMD card installed, so we don't know how it would have affected CPU usage there.

I don't know what disabling fullscreen optimizations does, but I tried it when I had my 1060 and it never did much of anything. Personally I think the RAM overclock and timings are likely why he saw a big jump in performance, since he was hard CPU limited.
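For a rough idea of what the RAM change alone could be worth, assuming CL10 at both speeds (a guess, since his actual timings weren't posted): first-word latency is roughly CAS cycles × 2000 / transfer rate in MT/s.

```python
# Back-of-envelope only: first-word latency ~= CAS cycles * 2000 / (MT/s).
# CL10 at both speeds is an assumption; his actual timings weren't posted.
def first_word_latency_ns(cas_cycles, mt_per_s):
    return cas_cycles * 2000 / mt_per_s

before = first_word_latency_ns(10, 1866)   # ~10.7 ns
after = first_word_latency_ns(10, 2133)    # ~9.4 ns
print(f"{before:.1f} ns -> {after:.1f} ns ({(1 - after / before) * 100:.1f}% lower)")
```

That alone is already in the ballpark of the ~12% figure above, before counting tighter secondary timings.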
 
Yah, there are tons of scenarios. Some people keep CPUs for 6-8 years, especially budget-conscious people. Really depends on what they have and how long they want it to last. You can come up with endless upgrade scenarios and it's hard to know which ones are the most prevalent. Ultimately I do think reviewers should be helping users save money. Just buying upgrades all the time isn't financially responsible or feasible for most people.

It's kind of interesting. The methodology for GPU testing makes sense, where you use the fastest CPU possible to avoid CPU bottlenecks. So you might review an RTX 3060 with a 10th-gen i7 or i9. That'll show you how fast the GPU can go. But is that really who is buying a 3060? Many of them probably have older i5s or Ryzen 5s. In a sense this part of the story was missed by what seems like good methodology. I'm curious how a 3060 review changes if you pair it with a 3600X or a 9600K. There may be a space for someone doing reviews based on the most common CPUs - a budget reviewer that tests the mid and low-end cards with the most common CPUs of the last two or three years.
Assuming that some people just won't upgrade their CPU is a dangerous road to take. I'm thinking the days when you could use your CPU for 6-8 years have passed (and that's already stretching it, really; I'm in my 5th year on my current CPU and it's showing its age, although I bought one which would last me as long as possible - a 6850K in 2016). We're looking at Intel and AMD (and Apple, kinda) battling between themselves for the next several years, which means that CPUs will improve every couple of years or so enough to warrant an upgrade. So one has to look at a GPU purchase considering not only the current CPU platform but what one may upgrade to in the near future as well.

And as for methodology, we still need more investigation into the issue. I'm not too sure it's as clear as "NV GPUs need a faster CPU" yet. It may be app dependent, it may not be the CPU at fault here, DX11 is still prevalent on average, etc.
 
@DegustatoR Yah, I agree. Stretching a CPU that long is hard, but for some people that's the budget they have. They'll have to play at low/ultra-low settings and deal with whatever performance they can get, which is why knowing how GPUs perform on low-spec CPUs matters. You might have a case where two GPUs perform equally on a high-end CPU, but one of the two is clearly better on a low-end CPU. If those GPUs are at the upper end of what you can afford, then the choice becomes obvious.

@DavidGraham This issue is really only the specific case where you're lowering GPU settings enough that the CPU becomes the bottleneck, so most likely esports-type players. I'd be curious to see comparisons between a 3060 and the 5700 XT and 6700 XT on a 3700X. The AMD cards may come way ahead in that scenario; who knows. Reviewers typically don't make that kind of comparison when doing CPU reviews. They'll usually pick one GPU and test a range of CPUs with games running at various resolutions at high settings. In a sense it comes down to some issues I have with the way reviewers operate. They think they're benchmarking hardware, but they're actually benchmarking software and using a range of software to try to make assumptions about hardware performance. Essentially you're benchmarking Windows + drivers + game, not the hardware. I think this is why this edge case may have been overlooked. Reviewers make a lot of assumptions about how consumers put their hardware together, and then they make assumptions about how the user will configure the software (graphics settings etc.).
 
Because more people would have seen this problem in BF5. On YouTube there are videos with GeForce cards and 4C Intel processors which don't have problems:



Because something went wrong. Wrong settings, a game update, a broken driver at the time, etc. It's still a PC platform.

/edit: BTW: He found his fix:
The middle video, which shows Conquest (the mode that really hammers the CPU), has framerates wildly bouncing between 60 and 115. That's a horrible experience. CPU constantly pegged at 100%. Constant frametime stutters during action. The reason this hasn't ever been brought to light is that benchmarkers all use the fastest CPU possible while also testing single-player mode. I can tell you that on forums people have complained about performance since launch. We all just thought it was DICE issues and not Nvidia's poor driver.
 