Why is the amount of graphics ram suddenly so important?

No, they're not, not by a long shot. You will get significant stuttering in many titles, especially 360 ports, on cards with less than 256 megs of RAM.

Perhaps we are moving to 256 as the target. That is still lagging way behind the amount of available RAM.

I remember when I actually had to upgrade my video card because of RAM rather than performance elsewhere. I haven't had to do that in a long time. I think EQ2 made me go from a 128 meg 9700 Pro to a 512 meg X800 XT, if I recall right.

Weren't people at Epic talking about using 2 gigs of RAM for their Unreal Engine 3 games at one point, before the next-gen consoles launched?
 
Perhaps we are moving to 256 as the target. That is still lagging way behind the amount of available RAM.

No different from the way we're lagging behind in shader processing power/functionality. PC gaming isn't dying; it's just becoming console gaming, with the same mandatory installs, manual patching, and non-games-oriented interface we always had. Some companies are also doing away with dedicated servers and modding, and raising the retail price to match the consoles too.

But as long as Blizzard continues to enjoy huge profit margins with WoW, PC gaming is healthy, right? :|
 
I've been running the Heaven benchmark recently on my 4850 512MB card, and it is fine and scales downwards logically with increasing AF and AA at 1024x768 in DX9 mode. It also runs fine at 1920x1200 in DX9 mode.

But when I go to DX10 mode at 1920x1200 with AA and AF, it falls off a cliff. Can someone with a 1GB 4850 card run it at 1920x1200, max AA, max AF, DX10 and tell me the fps? I get less than 3!

Thanks
 
Perhaps we are moving to 256 as the target. That is still lagging way behind the amount of available RAM.
You can play console ports on a 256 megabyte card only if you use a low resolution and no MSAA. Console games are designed for 1280x720 (and most have no AA). The common PC monitor resolutions are 1680x1050 and 1920x1200, and PC gamers tend to use 4xMSAA.

Assuming a basic 16 bit per channel float HDR color buffer and a 32 bit depth buffer: the common 1920x1200 PC resolution takes about 15 megabytes more graphics memory than the common 1280x720 console resolution. If we add 4xMSAA to the mix, the PC backbuffer takes about 95 megabytes more. All the new games using deferred rendering need at least double backbuffers, so the memory scaling with resolution & MSAA is even more noticeable (the backbuffers take about 182 megabytes more memory at the most common PC settings). So a 512 megabyte card is very much advisable on PC if you want to enjoy the newest direct console ports. Many console ports also have optional HQ settings with higher shadow map and texture resolution. If you want to use those super high quality settings at 1920x1200 with 4xMSAA, you basically need a 1024 megabyte card.
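The arithmetic above is easy to reproduce. A minimal sketch, assuming an RGBA16F color buffer (8 bytes/pixel) plus a 32-bit depth buffer (4 bytes/pixel), with MSAA multiplying the per-pixel storage by the sample count:

```python
def backbuffer_bytes(width, height, msaa=1):
    # RGBA16F color = 8 bytes/pixel, 32-bit depth = 4 bytes/pixel;
    # each MSAA sample stores its own color + depth.
    return width * height * (8 + 4) * msaa

MIB = 1024 * 1024
console = backbuffer_bytes(1280, 720)             # 720p, no AA
pc_noaa = backbuffer_bytes(1920, 1200)            # common PC res, no AA
pc_4x   = backbuffer_bytes(1920, 1200, msaa=4)    # common PC res, 4xMSAA

print(f"console 720p:        {console / MIB:6.1f} MiB")
print(f"PC 1920x1200:        {pc_noaa / MIB:6.1f} MiB  (+{(pc_noaa - console) / MIB:.1f})")
print(f"PC 1920x1200 4xMSAA: {pc_4x   / MIB:6.1f} MiB  (+{(pc_4x   - console) / MIB:.1f})")
```

This reproduces the ~15 MB and ~95 MB deltas quoted above (15.8 and 94.9 MiB with these exact buffer formats); the 182 MB deferred-rendering figure additionally depends on how many G-buffer targets the engine allocates, which isn't pinned down here.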
 
I've been running the Heaven benchmark recently on my 4850 512MB card, and it is fine and scales downwards logically with increasing AF and AA at 1024x768 in DX9 mode. It also runs fine at 1920x1200 in DX9 mode.

But when I go to DX10 mode at 1920x1200 with AA and AF, it falls off a cliff. Can someone with a 1GB 4850 card run it at 1920x1200, max AA, max AF, DX10 and tell me the fps? I get less than 3!

Thanks

My 4850 is a 1GB card, but its core is overclocked to 750 MHz.

[attachment: 4850_1gb_heaven_dx10.jpg]
 