The ALU-to-memory-access ratio will shift even more towards the ALUs in the future. It has to; bandwidth just isn't following the sharp rise in computational power.
But by that time we had the GTX 680/7970, and 3 GB/4 GB cards were a reality (4 GB GTX 680s). However, 1.5 GB cards were more than enough, and you only needed more if you planned to use resolutions above 1080p, at least in those three games (GTA V, Max Payne 3, Rage).

The 4 GB GTX 680 was released much later. Most GTX 680s on the market are 2 GB models, and the 670 was also very popular. The 2 GB GTX 670/680 has become a very common minimum requirement in current games.
In 2013, things got out of hand with the release of the likes of COD Ghosts. Now we enter the territory where video games use a crazy amount of VRAM irrespective of the user's resolution, and they use it just to store textures. At that time we saw the necessity of 3 GB and 4 GB cards. BF4 recommended 2 GB+ cards for large maps, Arma 3 did the same, and then we had Thief do the same thing as well! Then we had Watch_Dogs, Wolfenstein, Titanfall, Daylight, Dead Rising 3, Ryse, COD Advanced Warfare, Dying Light, Dragon Age Inquisition, Far Cry 4, The Evil Within, Just Cause 3, COD Black Ops 3 and Evolve all do the same, sometimes even pushing the 3 GB limit into 4 GB! Then we pushed further into the territory of 4 GB+ games like Shadow of Mordor, Batman Arkham Knight, Assassin's Creed Unity, Rise of the Tomb Raider, Mirror's Edge Catalyst, Rainbow Six Siege, Doom, GTA V, and many of the AAA games released in 2016: Dishonored 2, Gears of War 4, Forza Horizon 3, Deus Ex Mankind Divided, COD Infinite Warfare. Some of these games already push 7 GB+ VRAM utilization!

Every single game in this list runs well on a 2 GB card at 1080p. Higher than 1080p resolutions might not be available and you can't select the highest texture quality settings. You might also see some occasional stuttering from VRAM swapping (overcommitted).
DX11 plus the graphics driver handles overcommitted graphics memory surprisingly well. It has become common practice to overcommit graphics memory on PC. DX12/Vulkan doesn't support automatic overcommitment; you need to program the CPU<->GPU memory swapping on your own. That's the main reason why some DX12/Vulkan ports need more GPU memory than the DX11 versions.

What does it help if the engine doesn't do it? I guess he's suggesting that taking the matter into your own hands and buying more RAM gives guaranteed improvements, instead of praying (to equal effect) that _all_ engines are smart with memory.
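To make "program the CPU<->GPU memory swapping on your own" concrete, here is a minimal sketch of the kind of residency management a DX12 renderer has to do itself. ID3D12Device::Evict/MakeResident and IDXGIAdapter3::QueryVideoMemoryInfo are real D3D12/DXGI calls; the TrackedHeap bookkeeping, the LRU list and the TrimToBudget/EnsureResident helpers are hypothetical app-side scaffolding, not the approach of any particular engine.

```cpp
// Minimal residency-management sketch (assumptions noted above, not production code).
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>

struct TrackedHeap {
    ID3D12Heap* heap;       // pageable object backing some streamed textures
    UINT64      sizeBytes;  // how much we charge against our budget
    bool        resident;   // our own bookkeeping of residency state
};

// Hypothetical app-maintained list, least recently used heaps first.
std::vector<TrackedHeap*> g_heapsLruOrder;

// Evict least-recently-used heaps until we fit in the OS-provided budget.
// Real code must make sure the GPU is no longer touching a heap (fence/wait)
// before evicting it.
void TrimToBudget(ID3D12Device* device, IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    UINT64 usage = info.CurrentUsage;
    for (TrackedHeap* h : g_heapsLruOrder) {
        if (usage <= info.Budget)
            break;                            // back under budget, stop evicting
        if (!h->resident)
            continue;
        ID3D12Pageable* pageable = h->heap;
        device->Evict(1, &pageable);          // demote backing memory out of VRAM
        h->resident = false;
        usage -= h->sizeBytes;
    }
}

// Before the streaming system uses a heap again, bring it back.
// MakeResident is synchronous and can fail (E_OUTOFMEMORY) if the budget is blown.
void EnsureResident(ID3D12Device* device, TrackedHeap* h)
{
    if (!h->resident) {
        ID3D12Pageable* pageable = h->heap;
        if (SUCCEEDED(device->MakeResident(1, &pageable)))
            h->resident = true;
    }
}
```

Under DX11 the driver does a rough equivalent of this behind your back, which is why overcommitting "just works" there.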
How is HBM2/3 going to help your 8 GB RX 480 or 8 GB GTX 1080?

A GTX 480 (first NV GDDR5 board) had 7.5 flops per byte, the GTX 680 was at 16 flops per byte and the GTX 980 is at 20 flops per byte. With the GTX 1080 we're at 25.7 flops per byte (it's kind of fun thinking about this, since a "flop" needs 12 bytes of input and produces 4 bytes of output). Of course, the Tesla P100 with HBM2 shifts it back a bit to 13.2 flops per byte. But P100 is strongly compute centric, meaning the double precision arithmetic and larger register file take their (transistor) toll. If you were to scale GP102 up to GP100 size you could probably get a GPU with 40 SMs and 13 TFLOPS, which combined with HBM2 would net you 18 flops per byte. Honestly, I'm amazed that some people want to see P100 in gaming.

Relative to current levels, this could mean much larger L2 SRAM caches on die. HBM3 is expected to be released in 2020; 8 GB of the highest performing HBM3 offers up to 4 TB/s of theoretical bandwidth. Hopefully HBM2/3 costs will be low enough for $250-300 cards. If HBM2/3 can be affordably placed in the PS5 and the next Xbox, we could see the needed bandwidth increase nearly match the ALU increase.
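If you want to sanity-check those ratios, the arithmetic is just peak FP32 throughput divided by peak DRAM bandwidth. The sketch below is my own back-of-the-envelope version (cores x 2 flops per FMA x base clock, against the commonly published bandwidth specs), so treat the output as ballpark figures rather than exact matches for the numbers quoted above.

```cpp
#include <cstdio>

// Back-of-the-envelope flops-per-byte: peak FP32 rate / peak DRAM bandwidth.
// Specs are the publicly listed base clocks and bandwidths; boost clocks would
// push the ratios even further towards the ALUs.
struct Gpu { const char* name; int cores; double baseClockGHz; double bandwidthGBs; };

int main() {
    const Gpu gpus[] = {
        {"GTX 480 (GDDR5)",    480, 1.401, 177.4},
        {"GTX 680 (GDDR5)",   1536, 1.006, 192.2},
        {"GTX 980 (GDDR5)",   2048, 1.126, 224.0},
        {"GTX 1080 (GDDR5X)", 2560, 1.607, 320.0},
        {"Tesla P100 (HBM2)", 3584, 1.328, 720.0},
    };
    for (const Gpu& g : gpus) {
        double gflops = g.cores * 2.0 * g.baseClockGHz;  // FMA counts as 2 flops
        std::printf("%-18s %6.0f GFLOPS / %5.1f GB/s = %4.1f flops per byte\n",
                    g.name, gflops, g.bandwidthGBs, gflops / g.bandwidthGBs);
    }
    return 0;
}
```

Running it reproduces roughly the 7.5 / 16 / 20 / 25.7 / 13.2 progression mentioned above, which is the whole point: compute keeps growing much faster than bandwidth.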
That article is a year old, to be fair... A test with newer games would be better.

If 2 GB was acceptable for 1440p + ultra details one year ago, 4 GB should be acceptable for 1440p + ultra details this year. I don't believe game memory consumption has doubled in just one year.
A quote from Geforce.com about enabling the highest texture cache size in Rage: "Be aware, however, that you will almost certainly require a video card with 1.5GB of Video RAM to enable 8K textures". With normal settings a 1 GB graphics card was just fine.
At 1080p you couldn't select the highest texture setting without suffering massive hitching and stutters; you couldn't even do that at 720p! You had to select a lower texture level and suffer texture pop-ins or blurry textures, depending on the game. For high-end cards, that is not acceptable.
Doom, Mirror's Edge Catalyst, COD Black Ops 3, COD Infinite Warfare, Dishonored 2, Gears of War 4 and Rise of the Tomb Raider (in extended gameplay sessions) use more than 4 GB of VRAM at the highest texture/shadow/reflection levels; they stutter like crazy or suffer frame drops when running on 4 GB cards. Again, that is not acceptable.

I am arguing that currently there are close to zero gains from having more than 4 GB.
The problem is: PC games use a much higher texture resolution than consoles and use more memory-heavy effects (alpha transparency effects, higher LOD, higher-resolution reflections and shadows, etc.). That, combined with a bit of overcommitment, often results in abnormally high VRAM consumption. Consoles don't suffer that issue because developers are much more careful and less liberal in their approach there.
Sadly, it wasn't fine. At normal settings the game suffered texture pop-ins and generally slow loading of textures, even with the slightest turning of the camera; we had to load the highest textures to avoid these problems.
Using 4k textures resulted in significantly more blurring and pop-ins than 8k textures; 8k consumed upward of 1.5 GB of VRAM. Here is the comparison made through NV's guide:

http://international.download.nvidi.../ragetexturetweak/RageComparison-Animated.gif

- With 8k textures enabled the game was extremely crisp and nearly every object was highly detailed.
- Unexpectedly, the 4k textures were of an extremely low quality, a result we verified by visiting other outdoor areas.
- With the configuration file removed the Auto-Balancer was given free reign. The result was textures of a higher resolution, compared to the forced 4K textures, but well below that of the forced 8K textures. As such, we'd estimate that this represents 5-to-6k.

Don't think that was a result of not enough VRAM. You could enable CUDA texture transcoding and any noticeable pop-in would be significantly reduced. In my case, even with a GTX 950, it's nearly imperceptible.
I remember my 1 GB 560 Ti had some problems with Rage, but only at 2720x1536 in some game areas. Let's not talk about how my 2 GB 6970 fared with that game, though.
GPU transcode just minimizes pop-in. You can still spin around and see it in Rage, Wolf and Evil Within.
My experience with the GTX 980 isn't like this. It doesn't stutter like crazy in all new games (at 1440p). The GTX 980 / 980 Ti were still Nvidia's flagship GPUs half a year ago (before the GTX 1080 was released last summer).