Crusher said:
If there was a problem with S3TC on the GeForce cards when it was introduced with the 5.08 drivers, it was fixed in driver updates long ago. How old are those screenshots you are posting? Two years? I can tell you without a doubt that I do not see ANYTHING like that, whatsoever, on my GeForce 2 while playing the Quake 3 Arena demo. The wall textures look fine, and do not have the blocky color tinting that screenshot shows.
Crusher,
I don't want to get into another nvidia S3TC flame war, but this was discussed a long time ago. The problem was that nvidia cards looked noticeably worse on the Quake 3 sky textures than everyone else's. After some investigation, the general consensus was that nvidia was doing 16-bit interpolation on DXT1 textures as opposed to 24-bit. This caused heavy banding on certain textures (like the sky), which generally looked awful. If you forced the use of DXT3 textures instead, the result was greatly improved. As I stated a little while ago, it seems that nvidia decompresses DXT1 textures as 16-bit into their texture cache to save space, but they have to decode DXT3 as 32-bit, so you get better results there.
Now, if you aren't getting blocky lightmaps on walls in Quake 3, there are several possibilities: the lightmaps are not being compressed (either because the game isn't compressing them, which was an option in the pre-release source code, or because the drivers aren't), compression is off completely, or you are just looking in the wrong place. The reason I say this is that I know for a fact that many of the lightmaps in Quake 3 show serious artifacts when compressed. Most textures that contain lightmaps are 128x128, but the individual lightmaps are only small regions within those textures (4x4, 2x6, etc.). Because these little lightmaps are not block aligned (i.e. aligned on a 4x4 S3TC block), you get artifacts when they are compressed: the edge texels do not completely cover a block, so they get encoded together with whatever texels happen to share that block. It's not a problem with S3TC at all, just a poor texture to compress.
As far as I can tell, the only image quality problem that exists now is not a texture compression problem with the GeForce 2; it's a fundamental result of using S3TC to compress textures. This is still evident in the sky texture, and it is not specific to NVIDIA cards. Other games that use texture compression exhibit little to no loss in image quality, and have none of the artifacts you are claiming exist.
As I said before, nvidia was looking significantly worse on the Quake 3 sky textures. I can't speak for other games, but neither can you: you don't know what they are doing internally. For all you know, they compress textures as DXT3 on nvidia cards, or avoid compressing problem textures entirely on all cards. Without a deep investigation, you can't tell.
Now, if you want to do some experiments for yourself, it's not that difficult to create a small OpenGL program that uploads textures. You can create a 24-bit texture with a nice smooth gradient (kinda like the Quake 3 sky textures), upload it compressed and uncompressed and compare the results. You can also create your own precompressed texture and upload that (I think nvidia supports the extension for this).
Sorry for the long-windedness of this post, but this is a subject that I am very familiar with and I don't want people to get the facts wrong.