Nvidia Ampere Discussion [2020-05-14]

It's now up on the US site as well: https://www.nvidia.com/en-us/geforce/gaming-laptops/mx-450/
Mention of PCIe 4.0 support is present there as well.
… which makes it even less likely that this is a typo. But it raises the question: why? Everyone else and their dog seems to think that PCIe 4.0 (or x16 interfaces, for that matter) is unnecessary in that performance bracket and just a waste of power. Maybe a PCIe 4.0 x2 link instead of the more usual 3.0 x4?
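For reference, a quick back-of-the-envelope sketch of that trade-off (theoretical per-direction rates with 128b/130b encoding, protocol overhead ignored): a Gen4 x2 link lands at roughly the same bandwidth as a Gen3 x4 link, just with half the lanes.

```python
# Rough PCIe link-bandwidth arithmetic behind the "4.0 x2 vs 3.0 x4" idea.
# Per-direction, theoretical rates; real-world throughput is a bit lower.

GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers/s per lane
ENCODING = 128 / 130                     # 128b/130b line-coding efficiency

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Usable GB/s for a PCIe link of the given generation and width."""
    return GT_PER_LANE[gen] * ENCODING * lanes / 8  # bits -> bytes

print(f"PCIe 3.0 x4: {link_bandwidth_gbs('3.0', 4):.2f} GB/s")
print(f"PCIe 4.0 x2: {link_bandwidth_gbs('4.0', 2):.2f} GB/s")
# Both come out to ~3.94 GB/s: same bandwidth, half the lanes to route and power.
```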
 
In raw fps it doesn't show a lot, but there are a lot fewer stutters and "freezes" in some games (I have a 16GB Vega).

Anyway, I hope a 20GB version is in the works.
 
In raw fps it doesn't show a lot, but there are a lot fewer stutters and "freezes" in some games (I have a 16GB Vega).
This I haven't seen either. Recent benchmarks show that even 4GB cards are doing fine at the moment and you need to go down to 3GB to actually start seeing noticeable hitching due to the lack of VRAM in some titles.
 
Well, that's not what I experience / experienced. Even some DF videos are showing that 8GB is too little in some titles... 4GB? I come from a 4GB Fury; try playing Andromeda, or even The Witcher or XCOM 2 (so no crazy recent games either), etc. with it, and a lot of swapping happens. Yeah, the average fps is fine, but the experience is not good at all, to me at least :/
 
Well, that's not what I experience / experienced. Even some DF videos are showing that 8GB is too little in some titles... 4GB? I come from a 4GB Fury; try playing Andromeda, or even The Witcher or XCOM 2 (so no crazy recent games either), etc. with it, and a lot of swapping happens. Yeah, the average fps is fine, but the experience is not good at all, to me at least :/

Old engines using outdated APIs. I think once we see true next-gen engines flexing their muscles, they should be using textures much more efficiently. Sampler Feedback, Windows DirectStorage, etc. should really help keep VRAM requirements sane.
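To put a rough number on the "using textures more efficiently" point, here's a toy sketch (my own illustrative figures, not from any engine) of what a texture's full mip chain costs versus keeping only the mip levels actually being sampled, which is the kind of saving sampler-feedback-driven streaming is aiming at:

```python
# Toy VRAM estimate: full mip chain resident vs. only the mips actually needed.
# Assumes a square power-of-two texture and ~1 byte/texel block compression (BC7-class).

def mip_chain_bytes(size: int, bytes_per_texel: float, first_mip: int = 0) -> int:
    """Total bytes for mips [first_mip .. 1x1] of a size x size texture."""
    total, dim, level = 0, size, 0
    while dim >= 1:
        if level >= first_mip:
            total += int(dim * dim * bytes_per_texel)
        if dim == 1:
            break
        dim, level = dim // 2, level + 1
    return total

full    = mip_chain_bytes(4096, 1.0)               # everything resident
partial = mip_chain_bytes(4096, 1.0, first_mip=2)  # e.g. a distant object only needs mip 2 and up
print(f"full chain:   {full / 2**20:5.1f} MiB")
print(f"mip 2 and up: {partial / 2**20:5.1f} MiB ({full / partial:.0f}x less)")
```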

Not that I wouldn't mind a doubling of VRAM throughout the entire stack. What's up with Samsung's and Micron's 16Gb-density modules anyway? Their websites have reported the 16Gb parts as still in "sampling" for almost a year now. Have they fallen asleep or something?
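For what it's worth, the arithmetic behind "a doubling throughout the entire stack": GDDR6 puts one device on each 32-bit slice of the bus, so going from 8Gb to 16Gb devices doubles capacity at any given bus width. A small sketch with generic numbers, not any particular SKU:

```python
# Capacity from bus width and device density: one GDDR6 device per 32-bit channel.

def vram_gb(bus_width_bits: int, density_gbit: int) -> int:
    devices = bus_width_bits // 32
    return devices * density_gbit // 8  # Gbit per device -> GB total

for bus in (256, 320, 384):
    print(f"{bus}-bit: {vram_gb(bus, 8):>2} GB with 8Gb devices, "
          f"{vram_gb(bus, 16):>2} GB with 16Gb devices")
# 256-bit: 8 -> 16 GB, 320-bit: 10 -> 20 GB, 384-bit: 12 -> 24 GB
```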
 
The 3090 with 24GB is already using the chips which could be used to create a 3080 with 20GB. Or is there any other way for the 3090 to have 24GB?
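Building on the bus-width arithmetic above, there is another route: clamshell mode hangs two devices off each 32-bit channel (each running at x16), so 24GB on a 384-bit bus can come from twenty-four 8Gb chips just as well as from twelve 16Gb chips. A hypothetical sketch of the options, not a bill of materials:

```python
# Ways to reach 24GB on a 384-bit bus, plus the 20GB analogue on 320-bit.

def config_gb(bus_width_bits: int, density_gbit: int, clamshell: bool = False) -> int:
    devices = (bus_width_bits // 32) * (2 if clamshell else 1)
    return devices * density_gbit // 8

print("384-bit, 24x 8Gb  (clamshell):", config_gb(384, 8, clamshell=True), "GB")
print("384-bit, 12x 16Gb (single)   :", config_gb(384, 16), "GB")
print("320-bit, 20x 8Gb  (clamshell):", config_gb(320, 8, clamshell=True), "GB")
print("320-bit, 10x 16Gb (single)   :", config_gb(320, 16), "GB")
```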
 
For all the hand-wringing about only 10GB on the 3080, there have been many rumors about a 20GB 3080 (Ti). If AMD's high end comes out with the expected 16GB, then I fully expect Nvidia to release a 3080 Ti with 20GB.
 
For all the hand-wringing about only 10GB on the 3080, there have been many rumors about a 20GB 3080 (Ti). If AMD's high end comes out with the expected 16GB, then I fully expect Nvidia to release a 3080 Ti with 20GB.

Seeing the XSX (and probably the PS5) sit at around 10GB for VRAM, a 10GB version would make sense. And higher for higher-end gaming.
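For context, the "around 10GB" figure on the Series X comes from its split memory pool (publicly stated configuration: 16GB of GDDR6 on a 320-bit bus, with a 10GB pool at full bus speed, a 6GB pool at a lower rate, and a chunk of that reserved for the OS). A quick sketch of the math:

```python
# Series X memory split, using the publicly stated configuration:
# 320-bit bus, GDDR6 at 14Gbps/pin, 10GB "GPU-optimal" + 6GB "standard" pools.

PIN_RATE_GBPS = 14
FULL_BUS_BITS = 320
STD_POOL_BITS = 192   # the 6GB pool spans only 6 of the 10 chips (6 x 32-bit)

full_bw = PIN_RATE_GBPS * FULL_BUS_BITS / 8
std_bw  = PIN_RATE_GBPS * STD_POOL_BITS / 8
print(f"GPU-optimal pool: 10 GB @ {full_bw:.0f} GB/s")   # 560 GB/s
print(f"standard pool:     6 GB @ {std_bw:.0f} GB/s")    # 336 GB/s
print(f"left for games after ~2.5 GB OS reservation: {10 + 6 - 2.5} GB")
```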
 