Articles about GDC

Interesting quote from one of the tutorial summaries on that site:
You can't always assume that hardware T&L is going to be faster. Operations like skinning and blending can sometimes be faster in software (due to result caching).
Wash that man's mouth out with soap!
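For what it's worth, the result-caching idea in that quote can be illustrated with a minimal sketch (all the types, sizes, and names below are hypothetical, not from any real engine): if the bone matrices haven't changed since the last call, the CPU can hand back the previously skinned vertices instead of re-blending them, a shortcut a fixed-function hardware T&L unit can't take.

```c
#include <math.h>
#include <string.h>

/* Hypothetical vertex and bone-palette layout for illustration only. */
typedef struct { float x, y, z; } Vec3;

#define NUM_BONES 2
#define NUM_VERTS 3

/* Per-vertex skinning data: two bone indices, two blend weights. */
typedef struct { int bone[2]; float weight[2]; Vec3 pos; } SkinVert;

/* Apply a row-major 3x4 bone matrix to a position. */
static Vec3 transform(const float m[12], Vec3 v) {
    Vec3 r;
    r.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
    r.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
    r.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
    return r;
}

/* Software skinning with result caching: when the bone palette is
   unchanged since the last call, reuse the cached output - the saving
   the tutorial quote alludes to. */
static int   cache_valid = 0;
static float cached_bones[NUM_BONES][12];
static Vec3  cached_out[NUM_VERTS];

void skin_vertices(const SkinVert *verts,
                   const float bones[NUM_BONES][12],
                   Vec3 *out)
{
    if (cache_valid && memcmp(cached_bones, bones, sizeof cached_bones) == 0) {
        memcpy(out, cached_out, sizeof cached_out);  /* cache hit: no math */
        return;
    }
    for (int i = 0; i < NUM_VERTS; i++) {
        Vec3 a = transform(bones[verts[i].bone[0]], verts[i].pos);
        Vec3 b = transform(bones[verts[i].bone[1]], verts[i].pos);
        out[i].x = a.x * verts[i].weight[0] + b.x * verts[i].weight[1];
        out[i].y = a.y * verts[i].weight[0] + b.y * verts[i].weight[1];
        out[i].z = a.z * verts[i].weight[0] + b.z * verts[i].weight[1];
    }
    memcpy(cached_bones, bones, sizeof cached_bones);
    memcpy(cached_out, out, sizeof cached_out);
    cache_valid = 1;
}
```

A hardware T&L path would re-run the full blend every frame regardless; the software path above skips it entirely on a cache hit, which is how software skinning can win when the pose is static for a few frames.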
 
Simon F said:
Interesting quote from one of the tutorial summaries on that site:
You can't always assume that hardware T&L is going to be faster. Operations like skinning and blending can sometimes be faster in software (due to result caching).
Wash that man's mouth out with soap!

:D

Ack, I'm about to leave my GeForce 256 DDR behind without ever having gotten any advantage out of the hardware T&L in games (although 3ds Max likes it)!

Talk about paying for progress, I guess. (And now they want me to pay for DX9 - forget about it! ;) )

Regards LeStoffer
 
Hmm, now that you say it..... I sold my GF256 DDR this November and got myself an R8500. And looking back and thinking hard, no, none of my games ever ran faster with hardware T&L than without.

So extrapolating, I guess I will not see a game that exploits pixelshaders to great effect during the lifetime of my r8500. From what I've heard of the DOOM3 and the Unreal2 engines, this actually seems _very_ likely. And with the release of the GF4MX, practically guaranteed.

Oh well. I hope the manufacturers won't stop innovating, because I'll pay for coolness anyway. I'm a sucker for technology. :)

Entropy
 
Entropy:
So extrapolating, I guess I will not see a game that exploits pixelshaders to great effect during the lifetime of my r8500. From what I've heard of the DOOM3 and the Unreal2 engines, this actually seems _very_ likely. And with the release of the GF4MX, practically guaranteed.
IIRC the GF4MX doesn't support pixel shaders :(
 
pascal said:
IIRC the GF4MX doesn't support pixel shaders :(

That was his point. We'll see games that exploit pixel shaders 1-2 years after pixel-shader-capable cards go mainstream. And that hasn't happened yet.
 