GF4 has inflated 3DMark scores, so says the INQ...

Have you even seen a GeForce 2 run 3DMark2001SE using texture compression? It looks just fine.

Why yes, to answer your question. Ahmmm, maybe you didn't have a Radeon at the time; no, not by a long shot would I say it looked fine. I was the unlucky (or stupid) buyer of a GF2 MX400 64MB card (which is outside in my shed getting moldy). Yes, if you do a hack and change the texture compression from DXTC1 to 3, it clears it up. But then you are comparing DXTC1 results to DXTC3 results, aren't you?? ;) Hey, something else 3DMark isn't checking for either. All sorts of different conditions are being inserted into what is supposedly a benchmark. NOT! :)
 
Always falling back to the Quake 3 sky texture to try and prove the horrible texture compression of the GeForce 2. Never bothering to mention that this is a fault of the texture compression itself, not the GeForce card. The sky in Quake 3 should not be compressed. DXT1/S3TC degrades the quality of all textures, and it's especially noticeable when a texture is stretched over a large area (such as, say, the sky). You also don't bother to mention the fact that the quality of the sky with texture compression enabled is reduced on ALL hardware, not just NVIDIA cards.

http://www.ultimatehardware.net/gfvkyro2/q3tc.htm

Texture compression reduces the sky quality on both the GeForce and the Kyro.

As you can see, the sky in Quake 3 becomes very ugly. This is a problem with texture compression and Quake 3, and it doesn't seem to affect most other games this badly.

http://www.reactorcritical.com/review-savage4/review-savage4_9.shtml

Unfortunately there are only a few games with S3TC support as of late. One is Unreal. The specialized levels created by S3 look almost perfect. Just as attractive is Unreal Tournament (which I have to judge from screenshots made by others). The second one is Quake 3. Alas, the S3TC implementation in it is little different from autocompression.

When you select "Compressed textures", Quake 3 starts to compress all of the various textures, making no distinction between them. This of course includes textures that are not designed to be compressed, e.g. the sky texture. Usenet newsgroups are filled with messages about a striking difference between 16 and 32-bit textures. The sky texture is the most preferred example because it makes that difference obvious.

Matrox has (or had) a PDF file explaining the lossy nature of S3TC, with examples of how it degrades texture quality. I don't have access to the developer relations section of their website, but a cached HTML copy of the document can be found here:

http://216.239.35.100/search?q=cach...Quake+3+Sky+Texture+Compression&hl=en&ie=UTF8

Even real game textures that are near-perfect candidates for low-loss compression show a visible difference:

(two pictures of Quake 3, one without TC, one with TC)

Notice the missing bright red spots in the compressed texture, and the washed out look of the image on the far right.

Quake 3's sky is not an indication of NVIDIA's texture compression being bad, it's an indication of S3TC's texture compression algorithms creating bad looking textures.

Do you still have your GeForce 2, Doomtrooper? I challenge you to run 3DMark2001SE with compressed textures, evaluate it objectively, and try to claim the texture compression quality is bad. I guarantee you it won't be. In fact, I doubt you'll even notice the difference in texture compression quality between that and the Radeon.

Yes, if you do a hack and change the texture compression from DXTC1 to 3, it clears it up.

I haven't done any hacking, and I haven't changed any settings. With regular NVIDIA drivers and a normal installation of 3DMark2001SE, there is no noticeable image quality degradation due to using compressed textures. Mayhaps you should scrub the mold off your MX and install it again and have another look.
 
(RivaTuner screenshot: riva tuner12.jpg)


Notice the options to force DXT1 and DXT5, or you can do it in the registry manually.
 
Note what? Where does it say "Force the use of DXTC3 textures in all D3D applications even if the texture isn't in that format"?
 
There is NO problem with S3TC on a Radeon Vivo, so where did you get that bunch of horse poop from? Are you denying that there ever was, or is, a texture compression problem with a GeForce 2??

Radeon Compressed
q3dm1_radeon.png


Geforce 2 Compressed

q3dm1_geforce.png


There is no Sky here :-?


Soldier of Fortune, GeForce 2, Texture Compression
sof_geforce_comp.png


Soldier of Fortune, GeForce 2, No Compression

sof_geforce.png
 
are you denying that there ever was, or is, a texture compression problem with a GeForce 2??

If there was a problem with S3TC on the GeForce cards when it was introduced with the 5.08 drivers, it was fixed in driver updates long ago. How old are those screenshots you are posting? Two years? I can tell you without a doubt that I do not see ANYTHING like that, whatsoever, on my GeForce 2 while playing the Quake 3 Arena demo. The wall textures look fine, and do not have the blocky color tinting that screenshot shows. Either those screenshots were taken with a defective card or extremely old drivers that have long since been fixed, or possibly they were a result of excessive overclocking of the video card's memory.

As far as I can tell, the only image quality problem that exists now is not a texture compression problem with the GeForce 2, it's a fundamental result of using S3TC to compress textures. This is still evident in the sky texture, and is not specific to NVIDIA cards. Other games that use texture compression exhibit little to no loss in image quality, and have no artifacts such as the ones you are claiming exist.
 
Open your registry to:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\DISPLAY\0000\NVidia\OpenGL

What's the DWORD: S3TCQuality Value:XXX ??
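
(If anyone would rather check that value from code than dig through regedit, here's a minimal sketch using the plain Win32 registry API. It assumes the exact key path quoted above, the "0000" display index is whatever your card happens to be on your machine, and error handling is mostly skipped, so treat it purely as an illustration.)

[code]
/* Sketch only: read the S3TCQuality DWORD from the key quoted above
 * instead of hunting for it in regedit.  No real error handling. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    DWORD value = 0, type = 0, size = sizeof(value);
    const char *path =
        "System\\CurrentControlSet\\Services\\Class\\DISPLAY\\0000\\NVidia\\OpenGL";

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_READ, &key) != ERROR_SUCCESS) {
        printf("couldn't open HKLM\\%s\n", path);
        return 1;
    }
    if (RegQueryValueExA(key, "S3TCQuality", NULL, &type,
                         (LPBYTE)&value, &size) == ERROR_SUCCESS && type == REG_DWORD)
        printf("S3TCQuality = %lu\n", (unsigned long)value);
    else
        printf("no S3TCQuality value found\n");

    RegCloseKey(key);
    return 0;
}
[/code]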
 
That DWORD value is 0x00000000 (0). I assume you're going to make some kind of claim like "if it were set to 1, the driver would be using DXT3 instead of DXT1". I'm also willing to bet you'll try to make that claim anyway, just because you seem insistent on proving to me that the texture compression of my video card looks awful when I know for a fact it doesn't :) In all honesty, I don't know what the S3TCQuality key is used for, and I doubt you do either, since it's pretty apparent from your comments that you don't work for NVIDIA and as such would not have access to such information. Any claim that you DO know what it's used for has about as much credibility as your argument that the two-year-old screenshots you posted, from a site that went out of its way to make NVIDIA look bad, are proof that the texture compression of the GeForce 2 is bad/wrong/ugly. Anyone who has one of these cards can easily load up Quake 3 and see for themselves that it doesn't look like that, and no other game that uses texture compression that I've seen looks like that either.

So I'll say it again: if there ever was a problem with the GeForce 2's texture compression, it was fixed in a driver update long ago. Blame S3 for the sky in Quake 3 Arena, or even better, blame id for using S3TC compression on all of the textures in their game regardless of image quality degradation.
 
1==DXT3 (Good, but less compression)
0==DXT1 (Bad, but more compression)


So now take some screenshots... otherwise you've got some SUPER GeForce 2, because the option to fix the issue isn't in RivaTuner (and every other NVIDIA tweaker, in fact) for nothing, and there are about 2000 articles on the net about it ;)
 
Crusher said:


So I'll say it again: if there ever was a problem with the GeForce 2's texture compression, it was fixed in a driver update long ago. Blame S3 for the sky in Quake 3 Arena, or even better, blame id for using S3TC compression on all of the textures in their game regardless of image quality degradation.

I'm not blaming S3 for anything, dude; my old Radeon looked just fine with the sky compressed, as did the Voodoo 5...

3dfxvsatvsnvidia4xOGL.jpg
 
I don't have anywhere to upload screenshots at the moment. I've been talking with a friend, and hopefully will be able to put some on one of the servers where he works. If you have an FTP server I can upload them to, I will upload a couple of screenshots from EverQuest and Quake 3. In EverQuest you can barely notice the difference with texture compression enabled. Quake 3 looks nothing like the screenshot you posted. I can also post some examples of artifacts that are a result of overclocking the memory too high. If I bump my RAM from 333 MHz to 366 MHz, the memory will start to overheat after a while and some textures will get splotches of color flickering over them. The light colored walls in No One Lives Forever are especially prone to this. If I clock the memory back to the default 333 MHz, I can play the game for hours on end and never see the artifacts.

I would also like to see what kind of proof you can offer that proves that registry entry is a switch between DXT1 and DXT3.

I assure you I do not have a super video card of any sort. It's a stock, retail Creative Labs Annihilator Pro GeForce 2 GTS 32MB video card, which I bought from Buy.com for about $220.
 
Crusher,

Don't talk to me; talk to Anandtech, Gamebasement, Tom's Hardware and tell them your GTS does texture compression right, along with Epic Games and even Nvidia fan sites. I had one, I saw it with my own eyes, and the only fix was using the 'hack'... unless the 'hack' is now included in the new drivers.
 
Crusher said:
If there was a problem with S3TC on the GeForce cards when it was introduced with the 5.08 drivers, it was fixed in driver updates long ago. How old are those screenshots you are posting? Two years? I can tell you without a doubt that I do not see ANYTHING like that, whatsoever, on my GeForce 2 while playing the Quake 3 Arena demo. The wall textures look fine, and do not have the blocky color tinting that screenshot shows.
Crusher,

I don't want to get into another nvidia S3TC flame war, but this was discussed a long time ago. The problem was that nvidia cards looked definitely worse on the Quake 3 sky textures than anyone else's. After some investigation, there was a general consensus that nvidia was doing 16-bit interpolation on DXT1 textures as opposed to 24-bit. This made for heavy banding on certain textures (like the sky) and generally looked awful there. If you forced the use of DXT3 textures instead, the result was greatly improved. As I stated a little while ago, it seems that nvidia decompresses DXT1 textures to 16-bit in their texture cache to save space, but they have to decode DXT3 as 32-bit, so you get better results there.
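
(To make the 16-bit vs. 24-bit interpolation point concrete, here's a rough sketch of the arithmetic. The endpoint values are made up, and this is only a guess at what "interpolating in 16-bit" means in practice, not a dump of NVIDIA's actual hardware path.)

[code]
/* Sketch only: compare a DXT1 2/3 palette colour computed after
 * expanding the RGB565 endpoints to 8 bits per channel ("24-bit path")
 * against one interpolated directly in 5/6/5 precision ("16-bit path").
 * Endpoint values are invented; the point is the extra quantisation. */
#include <stdio.h>

/* Expand a 5- or 6-bit channel value to 8 bits. */
static int expand(int v, int bits) { return (v << (8 - bits)) | (v >> (2 * bits - 8)); }

int main(void)
{
    int r0 = 12, g0 = 30, b0 = 28;   /* endpoint c0, 5/6/5 bits */
    int r1 = 14, g1 = 34, b1 = 31;   /* endpoint c1, 5/6/5 bits */

    /* 24-bit path: expand first, then interpolate at 8-bit precision. */
    printf("24-bit 2/3 colour: %d %d %d\n",
           (2 * expand(r0, 5) + expand(r1, 5)) / 3,
           (2 * expand(g0, 6) + expand(g1, 6)) / 3,
           (2 * expand(b0, 5) + expand(b1, 5)) / 3);

    /* 16-bit path: interpolate the raw 5/6/5 channels, then expand.
     * The intermediate result snaps to 5/6/5 steps, so smooth gradients
     * collapse onto far fewer colours, which shows up as banding. */
    printf("16-bit 2/3 colour: %d %d %d\n",
           expand((2 * r0 + r1) / 3, 5),
           expand((2 * g0 + g1) / 3, 6),
           expand((2 * b0 + b1) / 3, 5));
    return 0;
}
[/code]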

Now, if you aren't getting blocky lightmaps on walls in Quake 3, there are several possibilities: the lightmaps are not being compressed (either because the game isn't compressing them, which was an option in the pre-release source code, or because the drivers aren't), compression is off completely, or you are just looking in the wrong place. The reason I say this is that I know for a fact that many of the lightmaps in Quake 3 show serious artifacts when compressed. Most textures that contain lightmaps are 128x128; however, the individual lightmaps are only small areas within those textures (like 4x4, 2x6, etc.). Because these little lightmaps are not block aligned (i.e. aligned on a 4x4 S3TC block), you get artifacts when they are compressed, since the edge texels do not completely cover a block. It's not a problem with S3TC at all, just a poor texture to compress.
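
(A toy example of the block-alignment problem; the offset and lightmap size below are invented, but they show how a small, unaligned lightmap ends up sharing 4x4 S3TC blocks with its neighbours.)

[code]
/* Sketch only: a 2x6 lightmap placed at an arbitrary offset inside a
 * 128x128 atlas straddles 4x4 S3TC blocks that also contain texels
 * from its neighbours.  Offset and size here are made up. */
#include <stdio.h>

int main(void)
{
    int x = 10, y = 6, w = 2, h = 6;          /* a 2x6 lightmap at (10,6)   */
    int bx0 = x / 4, by0 = y / 4;             /* first 4x4 block it touches */
    int bx1 = (x + w - 1) / 4, by1 = (y + h - 1) / 4;
    int blocks = (bx1 - bx0 + 1) * (by1 - by0 + 1);

    printf("lightmap touches blocks x %d..%d, y %d..%d (%d blocks)\n",
           bx0, bx1, by0, by1, blocks);
    /* It owns w*h texels, but every touched block holds 16 texels; the
     * rest belong to adjacent lightmaps, and the encoder has to pick
     * 4 palette colours that cover all of them at once. */
    printf("texels owned: %d, texels in touched blocks: %d\n", w * h, blocks * 16);
    return 0;
}
[/code]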

As far as I can tell, the only image quality problem that exists now is not a texture compression problem with the GeForce 2, it's a fundamental result of using S3TC to compress textures. This is still evident in the sky texture, and is not specific to NVIDIA cards. Other games that use texture compression exhibit little to no loss in image quality, and have no artifacts such as the ones you are claiming exist.
As I said before, nvidia was looking significantly worse on the Quake 3 sky textures. I can't speak for other games, but you don't even know what they are doing! For all you know, they are compressing the textures as DXT3 on nvidia cards, or even avoiding compression of bad textures completely on all cards. Without deep investigation, you cannot tell.

Now, if you want to do some experiments for yourself, it's not that difficult to create a small OpenGL program that uploads textures. You can create a 24-bit texture with a nice smooth gradient (kinda like the Quake 3 sky textures), upload it compressed and uncompressed and compare the results. You can also create your own precompressed texture and upload that (I think nvidia supports the extension for this).
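
(Something along these lines, assuming GLUT and the GL_EXT_texture_compression_s3tc extension are available. There's no error checking, and note that the read-back goes through the driver's decompressor, so it shows the compression loss itself rather than any card-specific decode path.)

[code]
/* Sketch only: upload a smooth gradient both uncompressed and as DXT1,
 * read both back, and compare the results. */
#include <GL/glut.h>
#include <stdio.h>
#include <stdlib.h>

#ifndef GL_COMPRESSED_RGB_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0
#endif

#define W 256
#define H 64

int main(int argc, char **argv)
{
    static unsigned char src[W * H * 3], plain[W * H * 3], dxt1[W * H * 3];
    long diff = 0;
    int x, y, i;

    glutInit(&argc, argv);              /* need a GL context first */
    glutCreateWindow("s3tc test");

    /* Smooth horizontal gradient, similar in spirit to the Q3 sky. */
    for (y = 0; y < H; y++)
        for (x = 0; x < W; x++) {
            unsigned char *p = src + (y * W + x) * 3;
            p[0] = (unsigned char)x;
            p[1] = (unsigned char)(x / 2);
            p[2] = 200;
        }

    /* Upload uncompressed and read it back. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, W, H, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, src);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, plain);

    /* Upload again, letting the driver compress to DXT1, and read back. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT, W, H, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, src);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, dxt1);

    /* Crude metric: summed absolute per-channel error. */
    for (i = 0; i < W * H * 3; i++)
        diff += abs((int)plain[i] - (int)dxt1[i]);

    printf("total error, uncompressed vs DXT1: %ld\n", diff);
    return 0;
}
[/code]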

Sorry for the long-windedness of this post, but this is a subject that I am very familiar with and I don't want people to get the facts wrong.
 
or you are just looking in the wrong place

Apparently I was looking in the wrong spot, because I did manage to find that archway just now, and it does appear to have those blocks. I guess I just never noticed that particular area before. It's certainly not indicative of the entire game, since the majority (nearly all) of the textures do not have that kind of problem (which is what it seemed like Doomtrooper was trying to suggest). Even the identical archway on the other side of that courtyard doesn't have the same artifacts, and I didn't notice it in the other maps at all (which are what I usually play the few times I actually play the demo). So NVIDIA does 16-bit interpolation on DXT1 textures, and this causes some graphical anomalies in isolated areas, and presumably gives them a performance boost over doing 24-bit interpolation due to the smaller texture size, correct? That would explain how other cards could have a better looking sky comparatively (although you can't deny that it still loses quality with S3TC on any hardware). I'll agree with that.

I won't agree with a blanket statement like "the GeForce's texture compression looks horrible", because that implies all texture compression, and it's apparently limited to DXT1. And for the most part DXT1 looks fine too, the sky and that one wall area being the only places I've seen otherwise. I'm guessing EverQuest uses DXT3, because it looks just as good with TC on as it does without it (you can notice a slight degradation in the screenshots, but no anomalies are present, and you can't tell the difference unless you're directly comparing screenshots).

Thanks for clearing that up for me.

edit:

btw, changing that registry key didn't appear to have any effect on Quake 3
 
which is what it seemed like Doomtrooper was trying to suggest


I wasn't suggesting anything other than what the hundreds of screenshots and reviewers noticed :rolleyes: Please don't try to paint a picture that what I was saying wasn't true... thx
The really sad thing is that reviewers left compression on in those days, with IQ looking like that, and put up the graphs with total disregard for IQ :rolleyes:
 
Recent interesting URLs with some bearing on the discussion.

Exemplifying the lack of quality criteria in 3DMark:
http://www.digit-life.com/articles/triplexxabre/index.html

Unverified quote from an ATI engineer demonstrating the perceived industry importance of 3DMark scores:
http://www.hardforums.com/showthread.php?s=db81f267d6af28dea5699773d86a2161&threadid=427743
The relevant part:
"You have no idea. The sample that Doom]I[ ran on was both ALPHA hardware as well as ALPHA drivers....the way the card is now (referring to its alpha stage) it can do 14XXX+ 3Dmarks...we will definately do our best to get it higher.
 
Crusher said:
So NVIDIA does 16-bit interpolation on DXT1 textures, and this causes some graphical anomalies in isolated areas, and presumably gives them a performance boost over doing 24-bit interpolation due to the smaller texture size, correct?

That would likely depend on the texture and its size (hence whether it fits in the cache). However, it's not necessarily even a given then; I think nVIDIA have done it this way because the decompression algorithms are in the wrong place in the hardware!

There's been some talk recently that NV2A only stores decompressed textures in the cache, not the compressed textures; I would assume this is the same for GF3/GF4 as well, and probably those that came before. It would explain why it interpolates to 16-bit, though, as more texture could fit into the cache. Now, it appears that the GameCube stores the compressed textures in the cache and decompresses them once they are called from the cache; I would assume other people will follow a similar implementation, as it seems to make more sense.
 
I got an email from a contact who used to work with Real3D in the drivers department, and he seems to think the 3DMark 2001SE "bug" with the Detonator drivers is really an optimization, based on his experience. Here's what he wrote to me:


Saw your comment on the nVidia "bug" that causes results to drop in 3DMark 2001 without splash screens. That's an optimization for sure; we used to do the same sort of tricks at Real3D in Winbench 2D and 3D. We would detect the splash screen (check for a texture of a certain size and format, etc) and at that time we would flush all our buffers and stuff. At that point you know that your scores aren't being measured, so that's a REAL good time to do any critical "clean up and get ready" work or anything that takes more time than normal.
The question is, is that a good thing, or a bad thing? You be the judge.

From 3dgpu.com
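
(Purely to illustrate the kind of check he's describing, here's a hypothetical sketch; the structure, names, and splash-screen dimensions are all made up and have nothing to do with actual Detonator or Real3D code.)

[code]
/* Sketch only: recognise a benchmark's splash-screen texture by its
 * size and format, and use that un-timed moment for housekeeping. */
#include <stdio.h>
#include <stdbool.h>

/* Hypothetical texture descriptor -- not a real driver structure. */
struct texture { int width, height, format; };

enum { FMT_RGBA8888 = 1 };                /* made-up format id     */
enum { SPLASH_W = 512, SPLASH_H = 512 };  /* made-up splash size   */

static bool looks_like_splash_screen(const struct texture *t)
{
    /* "check for a texture of a certain size and format, etc." */
    return t->width == SPLASH_W && t->height == SPLASH_H &&
           t->format == FMT_RGBA8888;
}

static void on_texture_upload(const struct texture *t)
{
    if (looks_like_splash_screen(t)) {
        /* Nothing is being timed right now, so this is when you would
         * flush buffers, defragment video memory, and so on. */
        printf("splash screen detected: doing housekeeping now\n");
    }
}

int main(void)
{
    struct texture splash = { SPLASH_W, SPLASH_H, FMT_RGBA8888 };
    struct texture normal = { 256, 256, FMT_RGBA8888 };

    on_texture_upload(&normal);   /* no match, nothing happens     */
    on_texture_upload(&splash);   /* match, housekeeping triggered */
    return 0;
}
[/code]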
 
The GF3 still has bad texture quality when using DXTC1 and I presume the GF4 isn't much better. Here are some images I took today showing the differences between DXTC 1, 3 and no compression.

Quake 3 version 1.30
GF3 Ti200 at 245/530 core/mem
Win2K
T-Bird 1.2GHz @ 1.4GHz
NVIDIA 29.42 drivers
All images 800x600x32 JPEG

DXTC 1

DXTC 3

DXTC none

Using timedemo at 1600x1200x32 at max settings, with no anisotropic filtering or AA, I get:

With compressed textures
DXTC 1 108.7FPS
DXTC 3 106.1FPS

Without compressed textures
99.6FPS

Note that the registry switch for the 29.42 drivers works differently. To enable S3TC quality mode for DXTC3 textures, you have to set the registry value to this:

OGL_S3TCQuality 01 00 00 00

Unituner wouldn't work for setting this registry value; I had to do it manually and figure out what activated the better-looking compressed textures.
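
(If anyone wants to script that manual edit, here's a rough Win32 sketch. I'm assuming the same key path quoted earlier in the thread and writing the value as 4 raw bytes to match the "01 00 00 00" above; whether the 29.42 drivers actually want REG_BINARY or a REG_DWORD, and where the key lives on a Win2K install, you'd have to check for yourself.)

[code]
/* Sketch only: write OGL_S3TCQuality = 01 00 00 00 under the NVIDIA
 * OpenGL key.  The path below is the one quoted earlier in the thread;
 * adjust it for your own registry layout. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    BYTE data[4] = { 0x01, 0x00, 0x00, 0x00 };
    const char *path =
        "System\\CurrentControlSet\\Services\\Class\\DISPLAY\\0000\\NVidia\\OpenGL";

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
        printf("couldn't open HKLM\\%s\n", path);
        return 1;
    }
    if (RegSetValueExA(key, "OGL_S3TCQuality", 0, REG_BINARY,
                       data, sizeof(data)) == ERROR_SUCCESS)
        printf("OGL_S3TCQuality set; restart the game so the driver re-reads it\n");

    RegCloseKey(key);
    return 0;
}
[/code]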

The real point is that this registry hack wasn't available when the GF2 and the Radeon were being tested back then. On a 32MB card this gave a pretty unfair advantage to the GF2 over the Radeon: even though the GF2 looked like ass, it was still evaluated against the much better looking Radeon, and most sites didn't even inform their readers of the difference (that is, if they even noticed it) but then called the GF2 the winner in performance! :devilish:
 