Will GPUs with 4GB VRAM age poorly?

It is a hindrance NOW even at 1080p.

Well, after this crap I'm not even going to feed the tr.. the thread anymore.
First it was "RX480 is the worst card ever, Raja should commit seppuku" circlejerk, then the powergate talks taking over all AMD discussions everywhere, now this.


I'm sorry mods, but these things are progressively turning B3D into quite the fanboy forum.
I expected so much better from the community..
 
Boosting Texture Quality to "Hyper" quadruples the quality of Image-Based Lighting, and further improves Environment Mapping fidelity, creating more detailed and vibrant scenes, especially at night.

For the technically-minded, here’s a quick overview of the many features enhanced by Hyper:


  • Lighting Quality: Shadow Map Resolution, Spotlight Shadow Map Resolution, and Shadow Draw Distance greatly increased
  • Mesh Quality: Object Draw Distance, Character Draw Distance, Object LOD Quality, and Object Shadow Draw Distance greatly increased
  • Reflection Quality: Fidelity further improved
  • Texture Quality: Environment Mapping Quality improved and Image-Based Lighting Resolution increased
  • Motion Blur: Sample count increased
  • Render Scaling: Downsampling upgraded to Lanczos Separable from Bicubic Sharper (see the sketch below)
Most current gen cards should be able to handle these visual improvements.
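
For reference, that last bullet (Lanczos Separable downsampling) is just a windowed-sinc resampling filter run once along rows and once along columns. Below is a minimal, illustrative sketch of one separable pass; the function names and the 3-lobe kernel are assumptions for illustration, not the game's actual implementation.

Code:
#include <algorithm>
#include <cmath>
#include <vector>

// Lanczos window: a windowed sinc with 'a' lobes (a = 2 or 3 is typical).
double LanczosKernel(double x, int a)
{
    const double kPi = 3.14159265358979323846;
    if (x == 0.0) return 1.0;
    if (std::abs(x) >= a) return 0.0;
    const double px = kPi * x;
    return a * std::sin(px) * std::sin(px / a) / (px * px);
}

// One separable pass: resample a row of samples down to dstW entries.
// A full 2D downscale is this pass over every row, then over every column.
std::vector<float> ResampleRow(const std::vector<float>& src, int dstW, int a = 3)
{
    const int srcW = static_cast<int>(src.size());
    const double scale = static_cast<double>(srcW) / dstW;    // > 1 when downsampling
    const double support = a * std::max(scale, 1.0);          // widen the kernel when minifying
    std::vector<float> dst(dstW);

    for (int i = 0; i < dstW; ++i)
    {
        const double center = (i + 0.5) * scale - 0.5;
        const int lo = std::max(0, static_cast<int>(std::ceil(center - support)));
        const int hi = std::min(srcW - 1, static_cast<int>(std::floor(center + support)));
        double sum = 0.0, wsum = 0.0;
        for (int j = lo; j <= hi; ++j)
        {
            const double w = LanczosKernel((j - center) / std::max(scale, 1.0), a);
            sum += w * src[j];
            wsum += w;
        }
        dst[i] = (wsum != 0.0) ? static_cast<float>(sum / wsum) : 0.0f;
    }
    return dst;
}

Separable filtering keeps the cost at a handful of taps per axis rather than a full 2D kernel per pixel, which is why it's a common choice for high-quality downscaling.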
 
Well, after this crap I'm not even going to feed the tr.. the thread anymore.
First it was "RX480 is the worst card ever, Raja should commit seppuku" circlejerk, then the powergate talks taking over all AMD discussions everywhere, now this.
I'm sorry mods, but these things are progressively turning B3D into quite the fanboy forum.
I expected so much better from the community..
Actually, the one who keeps trolling here is you. What does this have to do with the RX 480 or the powergate? Does stating facts now equal trolling? What exactly should people do? Stop technical discussions and stop openly speaking about observations and problems so as not to anger the "mob"? The post leaking fanboyism in this thread is obvious. Is appeasing the mob now part of the moderator job description?

The title change is funny. The title was chosen to target Fury cards specifically, to show that driver-level optimizations (which AMD claimed would suffice) are not enough to shield them from the extreme memory requirements of some games, even at normal resolutions like 1080p. Changing it to "cards with limited VRAM" does nothing but state the obvious!
And what the heck does "limited" mean? For all I care, 1GB of VRAM is also limited! Or do you mean 2GB? Maybe 3GB? No, 4!
 
The title change is funny. The title was chosen to target Fury cards specifically, to show that driver-level optimizations (which AMD claimed would suffice) are not enough to shield them from the extreme memory requirements of some games, even at normal resolutions like 1080p. Changing it to "cards with limited VRAM" does nothing but state the obvious!
+1
Whoever is changing titles, would you please ask the OP for permission.
 
I don't know who edited the forum title but I edited it again. In general let's tone down the fanboy talk. It's extremely silly. I think there's real value in evaluating at what point (if any) 4GB is not enough and what the ramifications are in those situations. I'll let you guys discuss whether these situations exist in any meaningful/practical sense. :D
 
In general let's tone down the fanboy talk. It's extremely silly.

I think there's real value in evaluating at what point (if any) 4GB is not enough and what the ramifications are in those situations. I'll let you guys discuss whether these situations exist in any meaningful/practical sense. :D

Code:
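// Tongue-in-cheek rule of thumb: GPU longevity tracks how your VRAM
// compares with what console titles are budgeted for.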
if (GPUMem == ConsoleTitleMem)
  UseCompressedTexturesFFS();
else if (GPUMem < 2*ConsoleTitleMem)
  Life = IsShorter();

KillCaptainObvious();

 
I don't know who edited the forum title but I edited it again. In general let's tone down the fanboy talk. It's extremely silly. I think there's real value in evaluating at what point (if any) 4GB is not enough and what the ramifications are in those situations. I'll let you guys discuss whether these situations exist in any meaningful/practical sense. :D

Why limit the discussion, when at some point 3GB and 6GB cards will face the same limits, just at different points on the spectrum? Naturally the lower-RAM cards hit them first.

Also, it seems disingenuous not to spell out directly in the posts that the settings used for the benchmarks are at levels no typical gamer would ever use, because of how extreme and how far beyond the breaking point they really are.
 
Actually, the one who keeps trolling here is you. What does this have to do with the RX 480 or the powergate? Does stating facts now equal trolling? What exactly should people do? Stop technical discussions and stop openly speaking about observations and problems so as not to anger the "mob"? The post leaking fanboyism in this thread is obvious. Is appeasing the mob now part of the moderator job description?

The title change is funny. The title was chosen to target Fury cards specifically, to show that driver-level optimizations (which AMD claimed would suffice) are not enough to shield them from the extreme memory requirements of some games, even at normal resolutions like 1080p. Changing it to "cards with limited VRAM" does nothing but state the obvious!
And what the heck does "limited" mean? For all I care, 1GB of VRAM is also limited! Or do you mean 2GB? Maybe 3GB? No, 4!

Why do you want to have such a limited discussion? Why not investigate 3GB and 6GB cards from other vendors too?
 
Why limit the discussion, when at some point 3GB and 6GB cards will face the same limits, just at different points on the spectrum? Naturally the lower-RAM cards hit them first.
6GB cards don't suffer.

Also, it seems disingenuous not to spell out directly in the posts that the settings used for the benchmarks are at levels no typical gamer would ever use, because of how extreme and how far beyond the breaking point they really are.
Only they are NOT. Almost all tests were run at 1080p using the maximum visual settings allowed by the developer, which includes things like texture, shadow, and reflection resolution. People buy high-end cards so they can crank visual quality to the max, not the other way around.
 
6GB cards don't suffer.


Only they are NOT. Almost all tests were run at 1080p using the maximum visual settings allowed by the developer, which includes things like texture, shadow, and reflection resolution. People buy high-end cards so they can crank visual quality to the max, not the other way around.
So they don't suffer yet, but in time they will. How far out do you define "age poorly"? Does that only mean within 1 year, or does it go out to 3 years or even 5 years?
 
Exactly. They will age in time, after 2 or 3 generations. They might last a long time, they might not. But what we have here is an unprecedented occurrence: a vendor claimed that through driver manipulation, 4GB of VRAM is all that is needed. That claim didn't last a few months before being challenged in several games, beginning with last year's Black Ops 3 and Rainbow Six.
 
6GB cards don't suffer.
At one point I ran with dual NV 8800 GTXes with 768MB GDDR3 each. Coming from a plain-jane GeForce 6800 with 256MB, the difference was quite noticeable... :)

Some years later however, those same 8800s could not handle TESV:Skyrim's memory demands (even though rasterizing power was more than sufficient to run the game at 60fps@1440P), requiring me to knock both textures and shadows almost to their lowest settings.

Undoubtedly, 6GB graphics cards will end up the same way.
 
Exactly. They will age in time, after 2 or 3 generations. They might last a long time, they might not. But what we have here is an unprecedented occurrence: a vendor claimed that through driver manipulation, 4GB of VRAM is all that is needed. That claim didn't last a few months before being challenged in several games, beginning with last year's Black Ops 3 and Rainbow Six.

Oh, I missed that claim. I don't doubt they said it.

It just seems so blatantly obvious that this would never hold in all cases. I never saw any value in the 4GB Fury cards, due to the same VRAM limitations you're pointing out now. So that's why this entire thread seemed overly obvious to me.

Now I fully understand your motivation for keeping this thread so narrowly focused. Also, I'm surprised this thread wasn't created at product launch. The claim seems counter-intuitive to all our past experiences.



Though as a feature piece, it might be useful to have a more generalized article or discussion dealing with VRAM limitations and their breaking points.

Carry on. Sorry for any misconceptions.
 
Why limit the discussion, when at some point 3GB and 6GB cards will face the same limits, just at different points on the spectrum? Naturally the lower-RAM cards hit them first.

Also, it seems disingenuous not to spell out directly in the posts that the settings used for the benchmarks are at levels no typical gamer would ever use, because of how extreme and how far beyond the breaking point they really are.

Because you have to draw the line somewhere. There will always come a point where we'll need more VRAM than in the past. The question is: where is that cutoff today? The OP of this thread believes it to be greater than 4GB (even with AMD's memory usage improvements). The discussion should center on whether that statement is true, what situations must exist to make it true (or not), and what the ramifications are of it being (always/sometimes/never) true.

Restricting the discussion to just Fury (and AMD's memory usage improvements) is not very interesting. I think we can agree there comes a point where no driver magic will help you. But if the sole purpose of this thread is just "will 4GB + AMD driver magic be enough for every situation?" then we can close this thread now because obviously the answer is no.
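
To make the "where is that cutoff today?" question a bit more concrete, here's a rough back-of-the-envelope tally of where VRAM goes at 1080p. Every number below is an assumption chosen for illustration (buffer counts, formats, the streamed texture budget), not a measurement of any particular game or driver.

Code:
#include <cstdio>

int main()
{
    const double MiB = 1024.0 * 1024.0;
    const int w = 1920, h = 1080;

    // Render targets: swap chain, depth, a deferred G-buffer and an HDR buffer.
    double renderTargets = 0.0;
    renderTargets += 3.0 * w * h * 4;   // triple-buffered 32-bit swap chain
    renderTargets += 1.0 * w * h * 4;   // depth/stencil
    renderTargets += 4.0 * w * h * 8;   // four 16-bit-per-channel G-buffer targets
    renderTargets += 1.0 * w * h * 8;   // HDR lighting buffer (RGBA16F)

    // Shadow maps: assume four 4096x4096 cascades with 32-bit depth.
    const double shadowMaps = 4.0 * 4096.0 * 4096.0 * 4;

    // The big, game-dependent chunks: streamed textures and geometry.
    const double texturePool = 2.5 * 1024 * MiB;   // assumed "max quality" texture budget
    const double geometry    = 0.5 * 1024 * MiB;   // vertex/index buffers

    const double total = renderTargets + shadowMaps + texturePool + geometry;
    std::printf("Render targets : %6.0f MiB\n", renderTargets / MiB);
    std::printf("Shadow maps    : %6.0f MiB\n", shadowMaps / MiB);
    std::printf("Texture pool   : %6.0f MiB\n", texturePool / MiB);
    std::printf("Geometry       : %6.0f MiB\n", geometry / MiB);
    std::printf("Total          : %6.0f MiB of a 4096 MiB card\n", total / MiB);
    return 0;
}

Under these assumptions the fixed-cost buffers turn out to be comparatively small; it's the streamed texture pool, exactly the thing max-quality settings inflate, that decides whether a 4GB card stays inside its budget.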
 
Why do you want to have such a limited discussion? Why not investigate 3GB and 6GB cards from other vendors too?
I see merit in having separate discussions for different memory sizes, not IHVs. It makes it easier to identify and track, over time, when the limitations kick in.

Maybe you could spin off separate threads for what is relevant now: 2, 3, 4, 6, 8 and 12 GiB?
 
So those cards run great now and might, at some unknown point in the future, likely after their performance has already become an issue, suffer from a lack of memory. Is that it?

How often do you replace your video card?
How soon will the new desktop PCs from Sony & MS have games needing more than 4GiB of VRAM?
Which resolution do/will you play at?

Most of those questions cannot be answered...
 
Also, I'm surprised this thread wasn't created at product launch.

That thread was created and has no fewer than 141 posts; it's right on the second page of this sub-forum.
https://forum.beyond3d.com/threads/is-4gb-enough-for-a-high-end-gpu-in-2015.56964/
There are actually informative posts with VRAM usage numbers and several benchmarks in that thread, plus many others in the Fury X and 300 series threads.


But if the sole purpose of this thread is just "will 4GB + AMD driver magic be enough for every situation?" then we can close this thread now because obviously the answer is no.
That much is obvious, yes.
Has the aforementioned "driver magic" successfully masked the RAM deficit compared to 6GB and 8GB models so far? The overwhelming majority of new game releases seem to show that it has. But sure, it won't last forever.
 
I'm going to help this train wreck of a thread out: http://www.hardwareluxx.de/index.ph...ten/39700-amd-radeon-rx-480-8-gb-vs-4-gb.html

This review compares the two versions (4GB and 8GB) of the RX 480. With the help of Google Translate and bar graphs, it would seem that at the moment 4GB is still "good enough" for the majority of use cases. There was only one big standout, Tomb Raider, where the 8GB version was about 15% faster (and interestingly at 1080p!). The other games were all within a ~5% performance delta. I'm curious to know why Tomb Raider needs the extra VRAM.

In general though, I'd personally recommend going >4GB if you're buying a high(er)-end GPU today. I suspect cases like Tomb Raider will start becoming more of the norm. The question is whether that will happen before your next GPU purchase. :p
 
There was only one big standout, Tomb Raider, where the 8GB version was about 15% faster (and interestingly at 1080p!). The other games were all within a ~5% performance delta. I'm curious to know why Tomb Raider needs the extra VRAM.

Apparently it uses just north of 4GB? http://www.dsogaming.com/pc-performance-analyses/rise-of-the-tomb-raider-pc-performance-analysis/

Xbox One games have access to 5GB of RAM, so... *shrug*, ¯\_(ツ)_/¯

I mean, how much non-graphics-related memory does a title normally need? Can't be too much if the game can run on a 360. :V
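
On the "Xbox One games have access to 5GB" point, here's the trivial arithmetic being implied, as a sketch. The CPU-side figures are pure assumptions; the point is only that if a modest slice of that unified 5GB is CPU-side game data, the remainder available for GPU-visible assets lands uncomfortably close to a desktop card's 4GB.

Code:
#include <cstdio>

int main()
{
    const double consoleBudgetGiB = 5.0;               // app-available RAM on Xbox One
    const double cpuSideGiB[] = { 1.0, 1.5, 2.0 };     // assumed CPU-side game/engine data

    for (double cpu : cpuSideGiB)
    {
        const double gpuVisibleGiB = consoleBudgetGiB - cpu;
        std::printf("CPU-side data %.1f GiB -> up to %.1f GiB of GPU-visible assets\n",
                    cpu, gpuVisibleGiB);
    }
    return 0;
}

Which would line up with Rise of the Tomb Raider reportedly using just north of 4GB on PC.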
 