Nvidia GT300 core: Speculation

I played most of Far Cry at 1024 with full details and 4x TSAA, on former high-end and midrange cards with identical performance: 6800GT, 7600GT. Very workable.
Now, to gain framerate, I've set it to 2xQ with 2x TSAA; it's rather nice (regular 2x is still noisy).
 
It would be nice to have flexibility with TA supersampling on single GPUs when using 8xQ or 16xQ, instead of being forced into 8x TA supersampling on alpha tests. The ability to add 2x, 4x, or 8x TA supersampling when using 8xQ or 16xQ would be most welcome, or 2x/4x TA supersampling when using 4x, 8x, or 16x CSAA, as sketched below.
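
To make the wish concrete, here's a hypothetical sketch of the pairing matrix being asked for. The mode names follow NVIDIA's control panel terminology, but the allowed combinations are purely the wish list above, not actual driver behaviour:

```python
# Hypothetical model of the requested AA control panel flexibility.
# The pairings below are the poster's wish list, NOT what the driver
# actually exposes (today it forces 8x TA supersampling).

DESIRED_TA_SS_OPTIONS = {
    # base MSAA/CSAA mode -> selectable TA supersampling levels
    "8xQ":      (2, 4, 8),
    "16xQ":     (2, 4, 8),
    "4x CSAA":  (2, 4),
    "8x CSAA":  (2, 4),
    "16x CSAA": (2, 4),
}

def ta_ss_choices(base_mode: str) -> tuple[int, ...]:
    """Return the TA supersampling levels the poster would like to
    pair with a given base AA mode."""
    return DESIRED_TA_SS_OPTIONS.get(base_mode, ())

print(ta_ss_choices("16xQ"))  # -> (2, 4, 8)
```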

I think ATI cards let you do that. If you select 'Performance' adaptive AA it uses 1/2 the number of samples you are using for MSAA. E.g. 4xMSAA would give you 2xTSAA, a good compromise IMO.
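
As a quick sketch of that rule, assuming the half-the-MSAA-samples behaviour described above (this is the described behaviour, not documented driver internals):

```python
def performance_aaa_samples(msaa_samples: int) -> int:
    """'Performance' adaptive AA sample count, per the rule above:
    half the selected MSAA sample count."""
    return max(1, msaa_samples // 2)

for msaa in (2, 4, 8):
    print(f"{msaa}x MSAA -> {performance_aaa_samples(msaa)}x adaptive AA")
# 4x MSAA -> 2x adaptive AA: the compromise mentioned above
```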
 
Yep, but it's a shame they removed the option to choose between SS- and MS-based AAA.
 
Unless you use Rivatuner and programs like it...

FWIW, for the few levels of Far Cry I played on my 8500 GT, I ran it at 1024x768 with max in-game settings, 16xQ CSAA and 16x AF. Solid frame rate too, IIRC. Then again, didn't the entire 8 series have a massive AA speed-up?
 
I do ;)

At those settings it's a locked 60fps on my GPU. I seem to recall the game being very playable with AA and high resolutions even on 6800-level hardware, though.

6800-level hardware went into the high-50 fps range after the 65.xx ForceWare drivers at 1280, in fairly empty scenes; you needed that headroom to hold a decent framerate in heavy firefights. With the 7800 after that, you could go up to 1600x1200 without sweat. As I said, IMHO it took a G80 to make Far Cry playable at ultra-high resolutions with all the bells and whistles on.

My point was and is that we will see a healthy performance increase with the coming DX11 single GPUs, but I doubt we'll see absolute high-resolution, maxed-settings playability on those.
 
If Far Cry is any example to judge Crysis performance on future platforms by...
I'd say that Far Cry performed better on average on a typical high-end gaming platform at the time it was released.
So it'll probably take more time for Crysis to become playable at DX10 Very High at 1920x1200 with AA/AF than it took Far Cry to reach its limits.
NV40/R420 were pretty much enough for Far Cry's highest settings with AA/AF at ~1.3 MPixel resolutions, and they came out less than two months after the game was released.
For Crysis (which will be two years old soon) we still need mGPU configs to play at anything higher than 1680x1050 with AA/AF and DX10 VH settings.
And I'm personally not so sure that even the first DX11-generation GPUs will be enough to play Crysis DX10 VH at 1920x1200 with AA/AF...
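
For reference, the back-of-the-envelope pixel counts behind those resolution classes:

```python
# Quick arithmetic behind the resolution comparison above.
def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

for w, h in ((1280, 1024), (1680, 1050), (1920, 1200)):
    print(f"{w}x{h}: {megapixels(w, h):.2f} MPixels")
# 1280x1024: 1.31 MPixels  (the ~1.3 MPixel class NV40/R420 handled)
# 1680x1050: 1.76 MPixels
# 1920x1200: 2.30 MPixels  (~75% more pixels than the 1.3 MP class)
```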
 
That's what I actually meant all along. The next obvious question would be: who cares anyway? I mean, how often can you replay a certain game until you reach playability under X settings?
 
Well, even a game with low replayability value, if it's any good, can be fun to come back to months or years later.
 
I agree. My favourite game is still the 10-year-old Unreal Tournament (1999)... it's interesting how today's features (like 16x wide-tent MSAA, 16x HQ AF and adaptive AA), which were non-existent 10 years ago, can change the look of the old graphics.
 
Sounds like when I went through a number of Unreal maps some time ago with 16xS/16xAF; I didn't really "play" the game per se, just marveled at the resulting IQ and wished it had looked that good on the Voodoo2 when I first played it.

However, some games can be called "classics" for a reason. Would you do the same with, say, Unreal 2?

I know we're completely OT at the moment and this could turn into a very long debate. I simply can't personally accept that we've reached a point where gameplay and storylines are considered good enough and don't need any revolutionary or original ideas, while the only other goal in today's games (and yes, ignore the following exaggerations) is to make every rock as shiny as possible so that even the blind can see shaders are being used, to push an unbelievable number of polygons, and so on, while aspects like aliasing consistently take a back seat. How about a more modest approach for a change with the aforementioned IQ-improving aspects (and others), investing some of the shader resources into antialiasing within the shader code, applied selectively wherever shader aliasing appears?
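
One established way to do that last bit, for what it's worth (not necessarily what anyone here has in mind): band-limit hard edges in the pixel shader using screen-space derivatives, i.e. replace step() with smoothstep() over a fwidth()-sized window. A minimal sketch of the idea in Python, with the derivative passed in explicitly since there's no hardware fwidth here:

```python
# Sketch of in-shader antialiasing via band-limited thresholding.
# In a real pixel shader this is smoothstep(edge - w, edge + w, v)
# with w = fwidth(v); here the screen-space derivative is passed in
# explicitly for illustration.

def smoothstep(e0: float, e1: float, x: float) -> float:
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def aa_step(edge: float, value: float, fwidth_value: float) -> float:
    """Antialiased replacement for a hard step(edge, value): fades
    over roughly one pixel instead of flipping instantly, which is
    what tames shader aliasing on procedural edges."""
    w = max(fwidth_value, 1e-6)  # guard against zero-width windows
    return smoothstep(edge - w, edge + w, value)

# A hard step would jump 0 -> 1 between adjacent pixels; this ramps:
for v in (0.49, 0.50, 0.51):
    print(f"value {v}: {aa_step(0.5, v, 0.02):.3f}")
```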
 
I did it with UT2004 on an 8800GTX. Does that count?
 
Heh, I remember doing this way back with games going as far back as Doom/Quake. :) And time permitting I'll continue to do so if there are new ways to enhance classics.

The holy grail would be if some card came out with a way to take 2D sprites and levels and turn them into fully 3D-rendered, textured, playable scenes. ;) Impossible, I know, but I can always dream of a Fallout 1 in full glorious top-down isometric 3D. :D

Regards,
SB
 
Dungeon Keeper 2 for me. I can't count the number of times I've completed that, but at least 3 of them were due to playing with new settings on the latest gen of graphics cards. :)
 
Oooh, good one also. But it already does half the job for you by having the world in 3D, although still using 2D sprites. Still, that would be a good stepping stone on the way to my holy grail. ;)

OK 3D accelerated graphics card driver writers. Hop to it. That's your next assignment. :)

Regards,
SB
 
Well, it really would be a minor improvement, but I wonder if they could manage something akin to what Diablo II did for its 3D acceleration?

The really difficult thing would be determining which objects should be upright, and which should be lying flat.
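
A minimal sketch of that upright-vs-flat split, with a made-up category list: upright sprites get cylindrically billboarded toward the camera around the vertical axis, while floor-type sprites are simply laid into the ground plane. Names and the classification rule are illustrative, not from any actual engine:

```python
import math

# Illustrative sketch: characters and props billboard toward the
# camera around the vertical (Y) axis, while floor-type sprites lie
# flat in the ground plane. The category list is made up.

FLAT_CATEGORIES = {"floor", "shadow", "rug", "blood_decal"}

def sprite_yaw_to_camera(sprite_xz, camera_xz) -> float:
    """Yaw (radians) that rotates an upright sprite quad to face the
    camera around the vertical axis (cylindrical billboarding)."""
    dx = camera_xz[0] - sprite_xz[0]
    dz = camera_xz[1] - sprite_xz[1]
    return math.atan2(dx, dz)

def orient_sprite(category: str, sprite_xz, camera_xz):
    if category in FLAT_CATEGORIES:
        return ("flat", 0.0)  # lie in the ground plane
    return ("upright", sprite_yaw_to_camera(sprite_xz, camera_xz))

print(orient_sprite("rat", (0.0, 0.0), (3.0, 3.0)))     # upright, faces camera
print(orient_sprite("shadow", (0.0, 0.0), (3.0, 3.0)))  # flat
```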
 
I did it with UT2004 on an 8800GTX. Does that count?

Of course it counts; speaking of which, I had never finished Oblivion before the 8800GTX, so I loaded a couple of additional texture packs and started it all over again ;)
 
Tell us you actually loaded the mods to fix the gameplay as well. Rats dropping Daedric armor and weapons :rolleyes:
 