Why should 128MB be better than 64MB for a VGA card?

I doubt there's any real use for 128MB.
Isn't the impact of fast memory much more important?
Take, for example, these recent GeForce4 Ti4200 offers:
GeForce4 Ti4200, 250MHz core, 128MB DDR, TV-Out, DVI, Dual-Head: € 215.00

GeForce4 Ti4200 Special Edition Retail, TV-Out, 64MB DDR at 3.3ns (standard is 4.0ns). The core clock speed is 250MHz and the memory clock speed is 550MHz: € 199.00
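To put rough numbers on the "fast memory" question, here's a quick back-of-the-envelope sketch in Python. The 128-bit memory bus is the Ti4200's spec; the ~500MHz effective rate I use for the standard 4.0ns memory is my own estimate from the cycle time, not something from the listing.

```python
# Peak memory bandwidth for the two listings above (rough sketch).
# Assumption: 128-bit bus; standard 4.0ns DDR taken as ~500MHz effective.

def ddr_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak theoretical bandwidth in GB/s for a DDR memory interface."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

BUS_WIDTH = 128  # bits, GeForce4 Ti4200 memory bus

for label, eff_clock in [("64MB SE, 3.3ns memory at 550MHz", 550),
                         ("128MB, standard 4.0ns memory at ~500MHz (assumed)", 500)]:
    print(f"{label}: {ddr_bandwidth_gb_s(BUS_WIDTH, eff_clock):.1f} GB/s")
```

That works out to roughly 8.8 GB/s versus 8.0 GB/s peak, so the cheaper 64MB card would get about 10% more raw bandwidth to play with.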
 
Performance at stock speeds is pretty even, in all honesty. Between 128MB and 64MB there really isn't much difference except at higher resolutions and FSAA settings.
 
The performance disparity shows up when games start to use very large or extremely detailed textures. An example would be the Code Creatures benchmark.
 
The extra 64MB is a safety net in case you want to use "HRAA" (thanks, nVidia!), or use very high res textures. You can save money and get a slightly faster card by foregoing the extra memory, at the cost of missing out on using insanely detailed textures if future games support them. RtCW, however, can make a 64MB card stutter in some areas--it's thought this is because of texture swapping.
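If it helps to see why the resolution/FSAA point matters, here's a rough sketch of how much memory the frame buffers alone eat as you crank both up, leaving the rest for textures. The layout assumed below (32-bit color, 32-bit Z/stencil, double buffering, back buffer and Z scaled by the FSAA sample count) is a simplification, not the GF4's exact scheme.

```python
# Rough frame-buffer footprint at various resolutions and FSAA levels.

def framebuffer_mb(width, height, fsaa_samples=1):
    """Approximate memory used by the frame buffers alone, in MB."""
    pixels = width * height
    front = pixels * 4                  # 32-bit front buffer (downsampled)
    back = pixels * 4 * fsaa_samples    # 32-bit back buffer, one sample per FSAA sample
    zbuf = pixels * 4 * fsaa_samples    # 24-bit Z + 8-bit stencil per sample
    return (front + back + zbuf) / (1024 * 1024)

for width, height in [(1024, 768), (1280, 1024), (1600, 1200)]:
    for samples in (1, 2, 4):
        mb = framebuffer_mb(width, height, samples)
        print(f"{width}x{height} at {samples}x FSAA: {mb:5.1f} MB in buffers")
```

In this rough model, 1600x1200 with 4x FSAA already pushes the buffers alone past 60MB, which is exactly where the extra 64MB stops being a luxury.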
 
Pete said:
RtCW, however, can make a 64MB card stutter in some areas--it's thought this is because of texture swapping.
The same is true for Jedi Knight 2 and SOF 2. Both would benefit from the additional 64MB of RAM.
 
Pete said:
The extra 64MB is a safety net in case you want to use "HRAA" (thanks, nVidia!), or use very high res textures. You can save money and get a slightly faster card by foregoing the extra memory, at the cost of missing out on using insanely detailed textures if future games support them.

Actually, at higher resolutions with FSAA, the 64MB Ti 4200 usually does even better, since its faster memory counts for more there. The reason is that most modern games don't really stress 64MB of memory, let alone 128MB.

In the end, the main thing to keep in mind is that games won't truly stress 64MB of video memory until developers start targeting video cards with 64MB as a minimum spec. Currently, most games are still being developed with 16MB cards as the minimum spec, and only a very few with 32MB as the minimum.
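To put a rough number on "truly stress 64MB", here's a little sketch of how many unique textures fit into the memory left over after the frame buffers. The 48MB leftover figure and the texture sizes are just examples I picked; the mipmap and DXT1 factors are the usual rules of thumb.

```python
# How many unique textures fit in a given video-memory budget (rough sketch).

def texture_mb(size, bytes_per_pixel, mipmapped=True):
    """Approximate memory for one square texture, in MB."""
    mb = size * size * bytes_per_pixel / (1024 * 1024)
    return mb * 4 / 3 if mipmapped else mb   # a full mip chain adds ~33%

BUDGET_MB = 48  # hypothetical memory left for textures after frame buffers

for size in (256, 512, 1024):
    uncompressed = texture_mb(size, 4)   # 32-bit RGBA
    dxt1 = texture_mb(size, 0.5)         # DXT1 compression, 4 bits per pixel
    print(f"{size}x{size}: about {BUDGET_MB / uncompressed:4.0f} uncompressed "
          f"or {BUDGET_MB / dxt1:5.0f} DXT1 textures fit in {BUDGET_MB}MB")
```

With compression, even a game shipping hundreds of unique 512x512 textures fits comfortably, which is why 64MB mostly goes unstressed today.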

The main reason is simply that it's not easy for developers to produce multiple texture packs for various hardware, so not many do (I only know of two, Neverwinter Nights and Unreal Tournament...do you know of any more?).

Thus, in 95% of the games released over the next year, a 64MB Ti 4200 with higher-clocked memory will almost certainly have superior performance in nearly any situation (i.e. at any quality level). A few games will show a difference, but there it's a simple matter of turning texture quality down a notch (and the textures should still look quite good).
 