DirectX 3/5/6 difference?

I don't think the GeForce DDR had a DXTC "bug". I think they decoded dxt1 to 16bit rgb instead of 32bit rgba like everybody else. Thus the banding this caused was by design, not a bug.
The AF of R200 was indeed bad. The angle dependency would have been quite acceptable at that time, but the problem was that it could only do "bilinear AF" (so all samples came from the same mipmap). As an upside, there was generally no real performance loss compared to ordinary trilinear sampling, though arguably the quality wasn't really better either...
Personally, though, I thought the Radeon 9000 (RV250) was a much better card all things considered: not much more than half the transistors with ~85% of the performance, only cutting nearly useless stuff like TruForm, and without any problems at all with its (multi-monitor) outputs. But obviously it couldn't keep up with the GeForce 4 either.
 
Personally, though, I thought the Radeon 9000 (RV250) was a much better card all things considered: not much more than half the transistors with ~85% of the performance, only cutting nearly useless stuff like TruForm, and without any problems at all with its (multi-monitor) outputs.
RV250 and RV280 seemed very successful. They went into many video cards and notebooks, and a two-pipe version was built as an IGP.
 
I remember that IGP on a friend's laptop; I thought it was impressive, smoothly running some recent or late-90s stuff at 800x600 or 1024x768 - it didn't take much to impress me with an IGP in those days.
I tried Doom 3 and it was not really playable even at 512x384, but it ran. I don't remember what the CPU was.

The 9200 was good as a low-end card, common in OEM PCs. The AF could simply be left permanently on, and you lived with the filtering weirdness.
I had a Ti 4200 instead, permanently left at 2x AA / 4x AF (with a profile to set 8x AF on undemanding games), and then a 6800 GT permanently at 4x AA / 16x AF.
The Ti 4200 had perfect but slow filtering; too bad its 4x MSAA was useless.
 
I don't think the GeForce DDR had a DXTC "bug". I think they decoded dxt1 to 16bit rgb instead of 32bit rgba like everybody else. Thus the banding this caused was by design, not a bug.

It was just a pain in the ass in Quake 3, which is a game that came out before the GeForce, I think.
Btw, texture compression "saved" the Voodoo5 for gaming (too bad it would get outperformed by a GeForce 2 MX in geometry- or feature-demanding games): you set the game to 16-bit output (16bit@22bit, which prevents banding a lot) with 32-bit compressed textures, and there's a free performance increase over 32-bit output and uncompressed textures.
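For a rough sense of why that trade worked, here's a back-of-the-envelope sketch (purely illustrative, not tied to any specific driver) comparing the memory footprint of a mipmapped texture stored as 32-bit, 16-bit and 4-bpp compressed (DXT1/FXT1 class); the roughly 8:1 saving over 32-bit uncompressed is where the bandwidth headroom comes from:

```python
# Back-of-the-envelope only: footprint of a full mip chain at different
# texel sizes. Real drivers add alignment/padding; numbers are illustrative.

def mip_chain_texels(width, height):
    """Total texel count of a mip chain down to 1x1."""
    total = 0
    while True:
        total += width * height
        if width == 1 and height == 1:
            return total
        width, height = max(1, width // 2), max(1, height // 2)

def footprint_kib(width, height, bits_per_texel):
    return mip_chain_texels(width, height) * bits_per_texel / 8 / 1024

for w, h in [(256, 256), (512, 512)]:
    print(f"{w}x{h}: 32-bit {footprint_kib(w, h, 32):.0f} KiB, "
          f"16-bit {footprint_kib(w, h, 16):.0f} KiB, "
          f"DXT1 (4 bpp) {footprint_kib(w, h, 4):.0f} KiB")
```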
 
Actually, Quake 3 and the GeForce 256 came out at almost the same time. I remember later games would use DXT5 instead of DXT1 on NV10-20 cards. I did some reading, and NV17 and NV25 added dithering to DXT1 to improve quality.
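As a rough illustration of where the banding comes from (not a statement of exactly what NV1x silicon did), here is a small sketch comparing a DXT1 color-block decode that expands the RGB565 endpoints to 8:8:8 before interpolating with one that interpolates at 565 precision first; the palette_* helpers are made up for this example:

```python
# Illustrative only: how much precision a 565-space interpolation loses
# versus expanding the DXT1 endpoints to 8:8:8 first.

def expand565(c):
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    # replicate high bits so 31 -> 255 and 63 -> 255
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

def palette_888(c0, c1):
    """Expand endpoints to 8:8:8, then derive the 2/3 and 1/3 colors."""
    e0, e1 = expand565(c0), expand565(c1)
    mix = lambda num: tuple((e0[i] * num + e1[i] * (3 - num)) // 3 for i in range(3))
    return [e0, e1, mix(2), mix(1)]

def palette_565(c0, c1):
    """Derive the 2/3 and 1/3 colors in 5:6:5, then expand -- coarser steps."""
    a = ((c0 >> 11) & 0x1F, (c0 >> 5) & 0x3F, c0 & 0x1F)
    b = ((c1 >> 11) & 0x1F, (c1 >> 5) & 0x3F, c1 & 0x1F)
    pack = lambda t: (t[0] << 11) | (t[1] << 5) | t[2]
    mix = lambda num: tuple((a[i] * num + b[i] * (3 - num)) // 3 for i in range(3))
    return [expand565(c0), expand565(c1), expand565(pack(mix(2))), expand565(pack(mix(1)))]

c0, c1 = 0x001F, 0x0000   # blue -> black endpoints, a Quake 3 sky-like gradient
print("888 interpolation:", palette_888(c0, c1))
print("565 interpolation:", palette_565(c0, c1))
```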
 
You could check a box in Rivatuner and then DXT1 textures were recompressed to DXT5, fixing the problem.
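The format-level idea behind that workaround can be sketched like this (this is not what RivaTuner actually does internally, just the gist): a DXT5 block is the same 8-byte color block as DXT1 with an 8-byte alpha block in front of it, so wrapping each DXT1 block with a constant opaque alpha block routes the colors through the higher-precision DXT5 decode path, at twice the texture size. Blocks that use DXT1's 3-color/transparent mode would need real re-encoding, which the sketch skips:

```python
import struct

# DXT5 alpha block with alpha0 = alpha1 = 255 and all indices 0 -> fully opaque
OPAQUE_ALPHA_BLOCK = bytes([0xFF, 0xFF, 0, 0, 0, 0, 0, 0])

def dxt1_to_dxt5(dxt1_data):
    """Repack raw DXT1 block data (8 bytes/block) as DXT5 (16 bytes/block)."""
    assert len(dxt1_data) % 8 == 0
    out = bytearray()
    for off in range(0, len(dxt1_data), 8):
        block = dxt1_data[off:off + 8]
        c0, c1 = struct.unpack_from('<HH', block)
        if c0 <= c1:
            # 3-color/transparent mode: DXT5 always decodes its color block in
            # 4-color mode, so such blocks really need re-encoding (skipped here).
            pass
        out += OPAQUE_ALPHA_BLOCK + block
    return bytes(out)

# one dummy 4x4 block: blue/black endpoints plus arbitrary 2-bit indices
dummy = struct.pack('<HHI', 0x001F, 0x0000, 0xE4E4E4E4)
print(len(dummy), "->", len(dxt1_to_dxt5(dummy)), "bytes per block")
```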

Late in the Voodoo's life (i.e. years after 3dfx's demise) there was also MesaFX, an open-source OpenGL implementation that I guess is derived from the Linux drivers in a way. It was good at running anything, such as Call of Duty 1 or one demo from Humus that used vertex shaders (IIRC).
One setting forced the compression of all textures, and I thought that was really cool.
The stock driver panel allowed forcing old games to 32-bit rendering, too.
Universal RGSS AA was the best, of course. Before killing my last compatible PC, I played the first Turok game and it looked like a Silicon Graphics tech demo.
 
You could check a box in Rivatuner and then DXT1 textures were recompressed to DXT5, fixing the problem.

Late in the Voodoo's life (i.e. years after 3dfx's demise) there was also MesaFX, an open-source OpenGL implementation that I guess is derived from the Linux drivers in a way. It was good at running anything, such as Call of Duty 1 or one demo from Humus that used vertex shaders (IIRC).
One setting forced the compression of all textures, and I thought that was really cool.
The stock driver panel allowed forcing old games to 32-bit rendering, too.
Universal RGSS AA was the best, of course. Before killing my last compatible PC, I played the first Turok game and it looked like a Silicon Graphics tech demo.
 
And then there were the drivers... With the driver issues that card had, I wonder how broken the GPU was. I remember the ugly-and-slow antialiasing would get broken, fixed, broken, on and on.

I don't remember the AA being ugly, quite the opposite. Nor broken, but I used the 8500 later in its lifetime. Maybe it was harder to support because of how many modes it offered.

The angle dependency would have been quite acceptable at that time, but the problem was that it could only do "bilinear AF" (so all samples came from the same mipmap). As an upside, there was generally no real performance loss compared to ordinary trilinear sampling, though arguably the quality wasn't really better either...

The AF was much better than trilinear at smoothing mipmap transitions, a blessing for older games especially.
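A toy sketch of the distinction being discussed (nothing like real R200 hardware; ToyTexture and bilinear_tap are invented stand-ins for a mipmapped texture and the hardware's 4-texel bilinear fetch): trilinear blends taps from two adjacent mip levels, while this kind of "bilinear AF" takes several bilinear taps from a single mip level spread along the axis of anisotropy, so mip transitions still show but detail along the view direction is kept:

```python
# Toy model only -- not R200 microcode.

class ToyTexture:
    def sample(self, level, u, v):
        cells = max(1, 16 >> level)            # coarser checker at higher mips
        return float((int(u * cells) + int(v * cells)) % 2)

def bilinear_tap(tex, level, u, v):
    """One bilinear-filtered fetch from mip `level` at (u, v)."""
    return tex.sample(level, u, v)

def trilinear(tex, u, v, lod):
    # blend bilinear taps from two adjacent mip levels -> smooth mip transitions
    lo = int(lod)
    frac = lod - lo
    return (1 - frac) * bilinear_tap(tex, lo, u, v) + frac * bilinear_tap(tex, lo + 1, u, v)

def bilinear_af(tex, u, v, lod, aniso_dir, taps=4):
    # all taps come from one mip level -> mip transitions still pop, but the
    # taps are spread along the axis of anisotropy, keeping detail that a
    # single bilinear/trilinear lookup would blur away
    level = int(lod)
    du, dv = aniso_dir
    acc = 0.0
    for i in range(taps):
        t = (i + 0.5) / taps - 0.5
        acc += bilinear_tap(tex, level, u + t * du, v + t * dv)
    return acc / taps

tex = ToyTexture()
print(trilinear(tex, 0.30, 0.70, 1.4))
print(bilinear_af(tex, 0.30, 0.70, 1.4, aniso_dir=(0.1, 0.0)))
```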
 
Did the R200 have working MSAA? I only know of ordered SSAA support that crippled the depth-buffer optimizations, i.e. it was really slow.
 
I don't remember the AA being ugly, quite the opposite. Nor broken, but I used the 8500 later in its lifetime. Maybe it was harder to support because of how many modes it offered.
The details are sketchy, but yes, the behavior of the AA settings changed with driver development. It also had problems in some situations and there were artifacts in some cases. For example, fogging apparently caused problems for the jittered-grid feature.
 
Did the R200 have working MSAA? I only know of ordered SSAA support that crippled the depth-buffer optimizations, i.e. it was really slow.


Ixbt has screenshots and details of the many modes.
http://ixbtlabs.com/articles/pmradeon2/

It is SSAA and can use a programmable jittered pattern or an ordered grid. Apparently ATI sometimes called it multisampling and confused people, but they were referring to the fact that it can change sample patterns.
"SMOOTHVISION™ is a super-sampling technique that gives users a huge number of different sampling options, by varying the number of samples taken per pixel and changing the sample processing technique. In both the High-Quality settings and High-Performance settings a total of 5 different anti-aliasing levels are possible, giving a total of 10 different settings. ATI’s implementation of anti-aliasing provides a substantial improvement over other forms of anti-aliasing found in other GPUs. Instead of shifting every pixel in the same direction, groups of pixels can be individually shifted in different directions. This technique can be used to generate a more random distribution of pixel samples, leading to higher quality anti-aliased images."

Some historical recollection that seems to indicate the AA hardware wasn't quite fully baked:
http://www.rage3d.com/board/showthread.php?t=33660401
In the 1st smoothvision driver set 2x and 4x quality modes were both non ordered (although 4xQ appeared to be just 2xQ with a 2x horizontal OGSS). There was some evidence of adaptiveness/randomness but it only really showed in d3d. Opengl also showed some very strange tearing artefacts in 2x and 4xQ if you looked closely. All other modes were OGSS.

In the 6xxx/9xxx drivers opengl became all OGSS and the d3d non ordered modes worked only with fog off. The adaptiveness/randomness was also reduced. In these drivers mipmap levels are also only adjusted correctly for the 2xq non ordered mode.
 