nVidia releases new all-singing, all-dancing Dets.

All texture sharpening does is raise the anisotropy by one level.

That is not what it did in my images above. I was already supposedly at 8x anisotropy, and when I selected Texture Sharpening it did something else. So what is it really doing? It also helps take out the faded look in Quake III.
 
For example: it's a known fact that Quincunx AA tends to blur textures. With Texture Sharpening, the textures are not blurred; that's about it, I guess... Obviously it can be used in conjunction with all the other AA methods, such as 4xS, but the IQ improvement won't be as noticeable there.

What I'd like to know is whether or not this affects performance, since I'm too lazy to check it out myself! :D
 
Noko, make sure that the Q3 texture slider is all the way to the right; it tends to get knocked down a notch after a new driver install.
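
For reference, that slider just drives the r_picmip cvar, so you can check it from the console too (0 = slider fully right, i.e. full texture detail; the setting is latched, so a restart is needed):

/r_picmip 0
/vid_restart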

If you haven't figured out that there is something wrong with the control applet's aniso settings in these beta drivers, I don't know what to say... Texture sharpening does what I already commented on; nVidia reps have already made the same comments...

I don't see any difference in texture sharpness between your 8x and 8x+sharpening shots, just a difference in the brightness of a few textures (was there a lightning flash?).
 
Hehe, noko is the only one seeing a difference in the Q3A textures. Are you sure they don't look blurry because you have the brightness turned way up?

Q3A looks dead similar here. But I can't take screenshots anymore; they come out all messed up.
 
That looks like a 16-bit versus 32-bit texture problem. There isn't any chance that nVidia has the equivalent of "Convert 32bit textures to 16bit" but it just isn't labelled clearly (i.e. texture sharpening turns it off)? Hopefully this suggestion can be tested and just dismissed or supported instead of sparking some sort of pointless attack/defense that doesn't disprove or prove it...*hint* *hint*. o_O
 
noko said:
This is interesting: the frame buffer shots look great, but when playing it looks all dull and faded????? Here is a frame buffer shot which looks like it should:

Quake III, 40.41 drivers, 2xAA, 8x aniso, frame buffer


When playing, it doesn't look like this at all!! Any ideas? Hmmm, there is a trick I am going to try. . .

Try fiddling with r_ignorehwgamma... I've long had issues with that setting and screenshots. I think Quake 3's handling of hardware gamma may be outdated or incompatible with newer drivers (as it has been to some degree with ATi drivers for a while), at least in regard to taking screenshots.
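
A quick way to test it, assuming the stock Q3 cvars (the setting is latched, so the renderer has to restart):

/r_ignorehwgamma 1
/vid_restart

With it set to 1, Q3 leaves the hardware gamma ramp alone, which is generally the part that newer drivers and screenshots disagree about.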
 
demalion said:
That looks like a 16-bit versus 32-bit texture problem. There isn't any chance that nVidia has the equivalent of "Convert 32bit textures to 16bit" but it just isn't labelled clearly (i.e. texture sharpening turns it off)? Hopefully this suggestion can be tested and just dismissed or supported instead of sparking some sort of pointless attack/defense that doesn't disprove or prove it...*hint* *hint*. o_O

Ok Dad :rolleyes:
 
There is some speculation that these drivers may be dropping down to 16-bit mode for the Nature test. This is a screenshot of 16-bit vs. 32-bit on a Radeon 8500... can you tell the difference??

16-32.jpg


Testing shows as much as a 10-13 fps difference when switching and comparing images... which side is 32-bit, anyone??
 
demalion said:
That looks like a 16-bit versus 32-bit texture problem. There isn't any chance that nVidia has the equivalent of "Convert 32bit textures to 16bit" but it just isn't labelled clearly (i.e. texture sharpening turns it off)? Hopefully this suggestion can be tested and just dismissed or supported instead of sparking some sort of pointless attack/defense that doesn't disprove or prove it...*hint* *hint*. o_O

Q3 requests S3TC textures for the most part (which generally hold up better than 16-bit). If it were 16-bit you would see banding in certain areas, like the sky... unless that is being detected and it's selectively using 16-bit. I don't see why it would be, though; S3TC has better quality.

The old nVidia panel for OpenGL had a texture depth option (16, 32, auto). To test it: turn off texture compression in GL, and force 16-bit GL textures in the panel.

I don't think it's a 16 vs 32 bit problem.
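
If anyone wants to rule that out from the game side rather than the panel, the stock Q3 cvars should do it (both latched, so a restart is needed):

/r_ext_compressed_textures 0
/r_texturebits 32
/vid_restart

If the washed-out look survives with compression off and 32-bit textures forced, texture depth isn't the culprit.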
 
Doomtrooper said:
Testing shows as much as a 10-13 fps difference when switching and comparing images... which side is 32-bit, anyone??

uh, it's in the title bar of your dialog box.
 
Doom, the one on the right is 32 bit :).

Try fiddling with r_ignorehwgamma...

Tried it; it helps, but it's still lousy. Tried other settings such as r_overbrightbits 0, which didn't help either. Yet the frame buffer being generated by the GPU looks great; on screen it's pure CRAP. Tried Digital Vibrance Control and ended up with a very orange-looking game :(. I will try RTCW to see if it has the same problem.
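
For anyone else chasing this, here are the gamma/overbright cvars with what I believe are their stock defaults, in case comparing from a clean slate helps (restart required):

/r_gamma 1
/r_overbrightbits 1
/r_mapoverbrightbits 2
/r_intensity 1
/vid_restart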
 
Doomtrooper said:
demalion said:
That looks like a 16-bit versus 32-bit texture problem. There isn't any chance that nVidia has the equivalent of "Convert 32bit textures to 16bit" but it just isn't labelled clearly (i.e. texture sharpening turns it off)? Hopefully this suggestion can be tested and just dismissed or supported instead of sparking some sort of pointless attack/defense that doesn't disprove or prove it...*hint* *hint*. o_O

Ok Dad :rolleyes:

:LOL: Well, Son, you aren't the only one I had in mind. Takes more than one to go back and forth and to sustain the "pointless" aspect. :D
 
Each shot is clearly marked 16-bit and 32-bit, eesh... c'mon.


Well, by zooming the 8500's 16-bit and 32-bit shots by 300 percent, you have gotten to the bottom of the Nature cheat. Kudos to Doomtrooper.

ehehehehe.
 
As for your example, Son ;), you're doing a good job so far.

I can say that I can tell a difference, but I couldn't have without forewarning that there was one. I can only tell in one spot, too... the tall stone. There are discoloration artifacts (which I don't think I would spot in motion or without zoom... though I might be able to pick up on the hint after a few viewings) that aren't present (or I should say aren't evident) in the 32-bit shot. That's odd; I'd expect more artifacts to be evident. The "overbright" scenes would be more conducive to spotting differences, I think.
 
Not my pic...I don't even have it installed :p

I guess I should have said "Can you really tell the difference with 32-bit without closely examining the shot @ 300%?"

vs.

"Which side is 32-bit, anyone??"

OK then... carry on.
 
"Each shot is clearly marked 16bit and 32bit, eesh -cmon"

Ermm... DOH!!... Just got out of bed... brains are still a bit mushy :)
 