GeForce FX & ATI demos?

Don't you think all this looks a lot like the old T&L (GeForce1 vs. Voodoo3) controversy? Sure, it took years for the industry to catch up and start taking advantage of it, but someone has to make the first step sometime... and it seems to me NVIDIA has always tried to be a pioneer with new technologies/trends, as is the case with shaders now... of course, one can question whether they tend to push the envelope a little too far ahead of what is ideal/practical...
Rodrigo

I think the situation is similar, but with one big difference: NVIDIA's more powerful shader capabilities are being eclipsed by its poor performance, relative to the R300, on current 3D workloads.

Back in the GF256 days, they delivered an even bigger stinker. It was nowhere near the jump from the Ti4600 to the NV30. But the competition had nothing better, so instead of a stinker, it was a 'winner.' If the Voodoo5 hadn't feature-crept and had instead been released alongside it, the GF256 would have been a clear fiasco, its then-useless T&L killed by the Voodoo5's higher effective fillrate.

The situation today is the same, HOWEVER, ATI didn't feature-creep and produced something that's superb for the present and well balanced for the future. This basically spoiled the NV30, and it now looks like a stinker.

In all fairness, the NV30 ain't that bad; its forward-looking shaders hurt its current performance ONLY in comparison to the R300. If you look at the numbers, it's still hellishly fast.
 
ben6 said:
Hrm, the ATI demos should work on GeForce FX. What drivers are you using? 42.68?
I'm running 42.86 on a Quadro FX 2000

Animusic runs nicely. The others run partially but report missing shaders - i.e. the car body does not get rendered.
 
pocketmoon_ said:
ben6 said:
Hrm, the ATI demos should work on GeForce FX. What drivers are you using? 42.68?
I'm running 42.86 on a Quadro FX 2000

Animusic runs nicely. The others run partially but report missing shaders - i.e. the car body does not get rendered.

It may be a texture format problem. The car demo uses 16-bit integer texture values, and I think NVIDIA only supports 12-bit integers. ATI said that even 12 bits was not enough to get good precision, so they packed two normal components into an R16G16 texture and reconstructed the third as z = sqrt(1 - x^2 - y^2).
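
For illustration, here's a minimal C++ sketch of that packing scheme (the function names are mine, not from the demo; z is assumed non-negative, as with tangent-space normals):

#include <cmath>
#include <cstdint>

// Pack the x and y components of a unit normal into two 16-bit
// integers, mapping [-1, 1] onto [0, 65535].
void pack_normal_r16g16(float x, float y, uint16_t &r, uint16_t &g)
{
    r = static_cast<uint16_t>((x * 0.5f + 0.5f) * 65535.0f + 0.5f);
    g = static_cast<uint16_t>((y * 0.5f + 0.5f) * 65535.0f + 0.5f);
}

// Unpack and reconstruct z from the unit-length constraint
// x^2 + y^2 + z^2 = 1 (clamped against rounding error).
void unpack_normal_r16g16(uint16_t r, uint16_t g,
                          float &x, float &y, float &z)
{
    x = r / 65535.0f * 2.0f - 1.0f;
    y = g / 65535.0f * 2.0f - 1.0f;
    float t = 1.0f - x * x - y * y;
    z = std::sqrt(t > 0.0f ? t : 0.0f);
}

The same reconstruction runs per pixel in the shader, which is cheap compared to the precision gained.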

High-precision integer formats are important, so I hope NVIDIA supports them rather than forcing you to work with 32-bit floats.
 
Hmm, that texture format should be supported by GF3 and up; it's the same as the HILO16 format in OpenGL, which comes with the NV_texture_shader extension.
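
As a rough sketch, uploading a signed two-component 16-bit texture through that extension might look like this (C++ with OpenGL; I'm quoting the HILO tokens from the NV_texture_shader spec from memory, so double-check the exact names):

#include <GL/gl.h>
#include <GL/glext.h> // for GL_SIGNED_HILO16_NV and GL_HILO_NV

// Upload width*height HI/LO pairs stored as signed 16-bit shorts.
void upload_hilo16(const GLshort *data, int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_SIGNED_HILO16_NV,
                 width, height, 0,
                 GL_HILO_NV, GL_SHORT, data);
}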
 
Jallen - nice to see you here :)

IIRC one of the demos requires Multiple Render Targets, which NV3x doesn't (at the moment?) support.
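
For context, using MRTs in D3D9 just means binding more than one color buffer before drawing; a minimal sketch (the surface names are hypothetical, not from the demo):

#include <d3d9.h>

// Bind two color buffers so a single ps_2_0 pass can write to both
// (the shader outputs to oC0 and oC1).
void bind_mrt(IDirect3DDevice9 *device,
              IDirect3DSurface9 *albedo,   // hypothetical surfaces
              IDirect3DSurface9 *normals)
{
    device->SetRenderTarget(0, albedo);
    device->SetRenderTarget(1, normals);
    // ...draw calls go here...
    // Hardware without MRT support reports
    // D3DCAPS9::NumSimultaneousRTs == 1, so index 1 would fail.
}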

The ATI demos pack all the shaders, images, etc. into one big resource file. Would be nice to unpack them to take a look at the shader code.
 
jallen said:
Hmm, that texture format should be supported by GF3 and up; it's the same as the HILO16 format in OpenGL, which comes with the NV_texture_shader extension.

NVIDIA seems to have a history of supporting certain features and formats in GL and not in D3D. (RT-patches and palettized textures come to mind.) Don't know exactly why. (And of course, I don't know if that's the case here.)
 
pocketmoon_ said:
The ATI demos pack all the shaders, images, etc. into one big resource file. Would be nice to unpack them to take a look at the shader code.

Just hit Enter on them in Windows/Total Commander, or use WinZip.
 
I was wrong, the GeForce FX can't use the ATI demos with the current drivers. That doesn't mean it can't run them, however ;)
 