@Frenetic Pony 8gb at what resolution? I don't feel like combing a 16 minute video to figure it out.
Since people were asking for 8gb+ vram usage... well here it is. Call of Duty Black Ops... whatever number (Cold War) using over 8gb even without raytracing:
See, ok, that's done. There have been other examples all around this forum, of course. So the 3060 Ti and 3070 aren't perfectly ideal GPUs, and perhaps neither is the 3080, as that game is already getting pretty close to its 10gb.
Yep, using more than 8GB of VRAM doesn't automatically mean requiring over 8GB. Some games hog VRAM automatically, maybe to improve streaming. 8GB+ VRAM usage is not a good indicator of a game "requiring" over 8GB. I have seen that game using over 16GB of VRAM on a 3090 in particular scenes, but a 3080 has no problem handling the same scene with a mere 10GB of VRAM.
But then, we don't live in an ideal world. After all, it's not like you can buy any of these new GPUs anyway, or a console for that matter. Heck, maybe by the time you can, if you want to go to the extra expense and cards such as the 3080 Ti and 3070 Ti even exist, you can buy one of those. My only hope is to give the best buying advice I can to whoever comes around here curious. Hell, maybe AMD will suddenly put out better GPUs as well, and Intel's new CEO will somehow make their GPUs worthwhile. We'll wait and see.
Reminds me a bit of the time when Windows Vista was released and people were complaining that it would suck up all the RAM in the system it could find. No more free RAM *doh*
"We will have next gen before we can buy current gen"
Doubtful. Next gen is more than a year away.

"See, ok, that's done."
No, it's not. Call of Duty games are well known to use VRAM to cache streaming assets; that hardly means that they need that much VRAM.
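The allocation-vs-requirement distinction can be sketched in a few lines of Python (the numbers and the caching policy are made up for illustration, not how any real engine works): a streaming cache opportunistically fills whatever share of VRAM it is allowed, so an overlay reports high "usage" even though the actual working set is much smaller.

```python
# Sketch (hypothetical numbers): why a streaming cache's reported VRAM
# "usage" tracks the card's capacity rather than what the game needs.

def vram_usage_mb(working_set_mb, cache_fraction, card_vram_mb):
    """working_set_mb: assets that must be resident for the scene.
    The rest of the configured budget is filled opportunistically
    with cached streaming assets."""
    budget = card_vram_mb * cache_fraction       # e.g. a "use 80% of VRAM" setting
    cache = max(0, budget - working_set_mb)      # cache soaks up the leftover budget
    return working_set_mb + cache

working_set = 6_500  # MB actually required for the scene (made-up figure)

# Same scene, same settings -- only the card's VRAM capacity differs:
on_3090 = vram_usage_mb(working_set, 0.8, 24_000)  # reports 19200 MB "used"
on_3080 = vram_usage_mb(working_set, 0.8, 10_000)  # reports 8000 MB "used"
print(on_3090, on_3080)
```

Both cards run the scene fine; neither reported number is the requirement, which is why an overlay reading on a 3090 says little about whether 8GB or 10GB cards are short on memory.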
In Cold War there's a setting which allows you to directly control the percentage of VRAM the game will allocate.
And from what benchmarks there are, I see zero indication that it actually requires more than 8GB of VRAM, even with RT.
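For reference, that slider also shows up in the game's config file. The fragment below is only a sketch: the key name (`VideoMemoryScale`) is an assumption carried over from the Modern Warfare-era config format, and the exact name and file may differ in Cold War.

```
; Sketch -- key name assumed from the Modern Warfare-era config, not verified for Cold War
VideoMemoryScale = "0.80"   ; allow the game to allocate roughly 80% of total VRAM
```

Whatever the exact key, the point stands: the game fills the fraction you give it, so its allocation scales with the card rather than with the scene.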
Example one:
https://i.postimg.cc/7ZfGNFnH/Screenshot-2021-01-23-144224.png
https://www.pcgameshardware.de/Call...-Cold-War-Benchmark-Release-Review-1362383/2/
Example two:
https://overclock3d.net/gfx/articles/2020/11/19113412366l.jpg
https://www.overclock3d.net/reviews...r_pc_performance_review_optimisation_guide/11
It does seem to require more than 6GB at 4K with RT Ultra. But considering that you get under 60 fps in that mode even on a 2080 Ti, and even with DLSS, that seems like a non-issue.
"The difference between 3090 and 3060 Ti is strangely low..."
Not really, if you consider the difference in their launch dates.
We’ve tested a total of 17 graphics cards for this performance look. While many developers behind much of this software would quickly suggest using a Radeon Pro or Quadro over a gaming GPU, there are many cases when those gaming GPUs offer the better bang-for-the-buck. It’s up to you or your company whether or not the gaming counterparts will suffice, but the professional models feature advanced support, sometimes ECC memory, and in some cases, driver optimizations that propel viewport performance forward in CAD suites (some of which will be seen here).