nVidia releases new all-singing, all-dancing Dets.

Crap, crap, crap... RTCW has the same problem as QuakeIII with the 40.41 drivers. Brightness (really gamma) has virtually no significant control; you could call it a fade control rather than a brightness control. I guess my VisionTek GF3 Ti200 doesn't like these drivers.
 
Might have been a wee bit simpler to compare 16-bit vs. 32-bit nVidia shots instead, which would quickly prove/disprove another one of your senseless speculations, Doom.
 
walkndude said:
Might have been a wee bit simpler to compare 16-bit vs. 32-bit nVidia shots instead, which would quickly prove/disprove another one of your senseless speculations.

You aren't tracking very well, at least for anything except trying to bash Doom... reminds me of a post of mine in the not-so-distant past in this thread...
It was I who made the speculation based on the NOLF shot. Doom provided some support based on the (in)ability to tell 16-bit from 32-bit on the R200. The next step is for someone to provide GeForce shots to demonstrate that 16-bit textures are not being used when texture sharpening is not turned on and a performance increase is noted. You are not facilitating the quick resolution of that goal.

All that remains to be seen is whether Doom can resist the temptation to be sucked into exactly the type of pointless back-and-forth I might have mentioned recently.
 
I'm tracking fine; the post was a speculation that the increase in Nature is due to forcing 16-bit in order to increase fps. Plain and simple.

Doom then posted shots of an 8500 at 16-bit and 32-bit, and while they are hard to tell apart (other than being labelled :)), that neither proves nor disproves anything.

I merely stated that two nVidia shots at 16-bit and 32-bit would have quickly gotten to the bottom of his speculation.

I don't need to bash Doom; he does that just fine on his own.
 
I did not have a problem telling the difference between the two shots; I didn't even notice the labelling at the top :oops:.

I restarted using the W2K partition on my backup hard drive with the 29.42 drivers, and QuakeIII looks great; so does RTCW. My video card has issues with the 40.41 betas; the IQ using them just plain sucks. I reloaded the 40.41s after a complete uninstall and there was no change in my problems with QuakeIII-engine games. So I will be going back to the 29.42 drivers, plain and simple. For me these drivers suck: nice options, but overall not worth it :(.
 
Demalion, my replies to Doom had absolutely nothing to do with texture sharpening or its effects at all. Who's having a problem staying on track?

--------------------------------------------------------------------------------

There is some speculation that these drivers may be dropping down to 16-bit mode for Nature... this is a screenshot of 16-bit vs. 32-bit on a Radeon 8500... can you tell the difference??
 
walkndude said:
Demalion, my replies to Doom had absolutely nothing to do with texture sharpening or its effects at all. Who's having a problem staying on track?

--------------------------------------------------------------------------------

There is some speculation that these drivers may be dropping down to 16-bit mode for Nature... this is a screenshot of 16-bit vs. 32-bit on a Radeon 8500... can you tell the difference??

I recommend you look back until you find mention of the NOLF 2 screenshots, then re-read the thread carefully from there.
 
Gamma in Quake 3 has been working correctly on pretty much all cards since the Q3Test days. All of a sudden Nvidia releases some drivers that seem to break Quake 3. The logical conclusion that Prime pointed out on page 18 is that this is id Software's problem.

/me laughs out loud

Haha. I can't control myself here. id Software's code must be broken in some unique way, since the code works on countless driver revisions from all the hardware vendors but breaks on a beta driver release by Nvidia. Yeah, right.

The obvious real conclusion is simply that Nvidia screwed up gamma correction in these drivers. It's really quite difficult (for id Software) to screw up the call to SetDeviceGammaRamp(). It either works for everyone, or it doesn't work for anyone.
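
For the curious, the whole interface really is that small; here's a minimal sketch of setting a plain power-curve gamma through it (my own illustration, not id's code, assuming a Win32 screen DC):

    /* Hypothetical sketch: set a simple gamma curve via the Win32 API.
       This is an illustration, not id's code. The whole interface is
       one call taking a 3 x 256 table of 16-bit values. */
    #include <windows.h>
    #include <math.h>

    BOOL set_gamma(float gamma)
    {
        HDC hdc = GetDC(NULL);      /* DC for the whole screen */
        WORD ramp[3][256];

        for (int i = 0; i < 256; i++) {
            /* standard power curve, expanded to 16 bits per entry */
            WORD v = (WORD)(65535.0 * pow(i / 255.0, 1.0 / gamma) + 0.5);
            ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
        }

        BOOL ok = SetDeviceGammaRamp(hdc, ramp);
        ReleaseDC(NULL, hdc);
        return ok;                  /* succeeds or fails wholesale */
    }

If that call fails on the 40.41s, or the driver mangles the table it's given, no amount of application code can fix it.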
 
noko said:
I did not have a problem telling the difference between the two shots, I didn't even notice the top labelling :oops:.

Actually, on a second look without "morning eyes", there are plenty of clues that it is 16-bit, mostly in the water. There are other slight differences as well, but they aren't readily identifiable as 16-bit except in the water (due to dithering) or where colour artifacting visible in 16-bit is absent in 32-bit (such as in the stone). It still remains an interesting question what happens when the rendering is 32-bit and the textures are 16-bit, and how that compares to previous drivers, since that "Texture sharpening" setting reminds me of the "Texture Preference" settings in the Catalyst drivers (which cycle through 16-bit and 32-bit textures).

By the way, I'm not proposing that this would be a cheat, since the setting is exposed in the control panel (that is a key point); just that it is worth investigating and (likely?) eliminating as an explanation for performance increases while texture sharpening is disabled.
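
If anyone wants a check that doesn't depend on trained eyes, here's a rough sketch of one (entirely hypothetical, and it assumes a lossless RGB dump of the screenshot, not a JPEG): count the distinct per-channel levels. A 565 framebuffer can only produce 32 or 64 levels per channel, while a true 32-bit shot will normally show far more:

    /* Hypothetical sketch: detect 16-bit quantisation in a raw,
       headerless 8-bit-per-channel RGB dump of a screenshot.
       (Lossless source assumed; JPEG would smear the levels.) */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) { fprintf(stderr, "usage: %s dump.rgb\n", argv[0]); return 1; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        int seen[3][256] = {{0}};
        int c, channel = 0;
        while ((c = fgetc(f)) != EOF) {
            seen[channel][c] = 1;
            channel = (channel + 1) % 3;    /* bytes cycle R, G, B */
        }
        fclose(f);

        for (int ch = 0; ch < 3; ch++) {
            int levels = 0;
            for (int i = 0; i < 256; i++) levels += seen[ch][i];
            /* 5 bits -> at most 32 levels, 6 bits -> at most 64 */
            printf("%c: %d distinct levels\n", "RGB"[ch], levels);
        }
        return 0;
    }

Dithering spreads the error around spatially, but it can't create levels the framebuffer can't store, so the count should still give it away.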
 
Demalion, read the quote; that's what my replies were to, not the NOLF texture sharpening issue. I can't make it any clearer to you.

This "speculation" that the inflated nature scores with 40.41s are due to 16bit being forced is what my comments are addressing. This speculation was started in a thead at rage3d in which another user was having refresh rate oddities while running 3dmark and believed it may be roote in the 40.41's forcing 16bit.

k?
 
Colourless said:
Gamma in Quake 3 has been working correctly on pretty much all cards since the Q3Test days. All of a sudden Nvidia releases some drivers that seem to break Quake 3. The logical conclusion that Prime pointed out on page 18 is that this is id Software's problem.

/me laughs out loud

Haha. I can't control myself here. id Software's code must be broken in some unique way, since the code works on countless driver revisions from all the hardware vendors but breaks on a beta driver release by Nvidia. Yeah, right.

The obvious real conclusion is simply that Nvidia screwed up gamma correction in these drivers. It's really quite difficult (for id Software) to screw up the call to SetDeviceGammaRamp(). It either works for everyone, or it doesn't work for anyone.

Well, the drivers are in beta status, after all! :D
I'm sure that the next Detonator revision (the official one) will fix all these problems!
 
Colourless said:
Gamma in Quake 3 has been working correctly on pretty much all cards since the Q3Test days. All of a sudden Nvidia releases some drivers that seem to break Quake 3. The logical conclusion that Prime pointed out on page 18 is that this is id Software's problem.

/me laughs out loud

Haha. I can't control myself here. id Software's code must be broken in some unique way, since the code works on countless driver revisions from all the hardware vendors but breaks on a beta driver release by Nvidia. Yeah, right.

Umm, re-read my post. I've pretty consistently had problems with Quake 3 on my ATi card with that setting (which I may have labelled incorrectly... it has "hw", "ignore", and "gamma" in it) and with taking screenshots, and the terms I used were "I think Quake 3's handling of hardware gamma settings may be outdated or incompatible with newer drivers (and has been to some degree for ATi drivers for a while), at least in regard to taking screenshots." That is hardly the statement you are implying was made. :rolleyes:

The obvious real conclusion is simply that Nvidia screwed up gamma correction in these drivers. It's really quite difficult (for id Software) to screw up the call to SetDeviceGammaRamp(). It either works for everyone, or it doesn't work for anyone.

There is an inconsistency between the in-game display and screenshots, for the one person who mentioned it here, and for me on ATi drivers with r_ignorehwgamma (or whatever it is) set to 0. I'm not sure why, but I do think the inconsistency is because of Quake 3 behavior. I'm not calling it a failing of Quake 3, but I did assert that Quake 3 may possibly be outdated or incompatible with some changes in those drivers (not that the incompatibility or being outdated is the fault of Quake 3 or id). Is that really so ridiculous when it seems to be a problem unique to Quake 3, at least for this user?
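
For what it's worth, here is one plausible mechanism (my speculation, nothing confirmed about these drivers): the ramp set through SetDeviceGammaRamp() is applied at scanout, after the framebuffer, so a screenshot grabbed from the framebuffer never sees it. A game that wants screenshots to match the monitor has to bake the ramp back in itself, something like this sketch using the standard GetDeviceGammaRamp() call:

    /* Hypothetical sketch: apply the current hardware gamma ramp to a
       captured 8-bit RGB buffer, so a screenshot matches the display.
       The ramp acts at scanout, so glReadPixels() doesn't see it. */
    #include <windows.h>

    void bake_gamma_into_screenshot(HDC hdc, unsigned char *rgb, int numPixels)
    {
        WORD ramp[3][256];
        if (!GetDeviceGammaRamp(hdc, ramp))
            return;                 /* no ramp in effect; leave as-is */

        for (int i = 0; i < numPixels; i++)
            for (int ch = 0; ch < 3; ch++)
                /* ramp entries are 16-bit; scale back to 8 bits */
                rgb[i * 3 + ch] = (unsigned char)(ramp[ch][rgb[i * 3 + ch]] >> 8);
    }

If an engine skips that step, or the driver reports a different ramp than it applies, the in-game display and the screenshots will disagree, which would look exactly like what we're describing.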
 
walkndude said:
Demalion, read the quote; that's what my replies were to, not the NOLF texture sharpening issue. I can't make it any clearer to you.

Is this a thread, or is that one post an article standing by itself? You are treating it as if it stands alone, when it was made in the context of what I referred you to. That is what I meant by not tracking. Without context, Doom's post would fit your understanding, but in the context of the thread it has another meaning that doesn't warrant your reply. That is all I'm trying to draw your attention to, and I think it would be pretty apparent if you did as I asked. But to avoid being more guilty of the thing I tried to prevent, I'll end my replies on that here.
 
hax said:
demalion said:
That looks like a 16-bit versus 32-bit texture problem. There isn't any chance that nVidia has the equivalent of "Convert 32-bit textures to 16-bit" that just isn't labelled clearly (i.e. texture sharpening turns it off)? Hopefully this suggestion can be tested and simply dismissed or supported instead of sparking some sort of pointless attack/defense that doesn't disprove or prove it... *hint* *hint*. o_O

Q3 requests S3TC textures for the most part (which generally have higher effective precision than 16-bit). If it were 16-bit you would see banding in certain areas, like the sky... unless that is being detected and 16-bit is being used selectively. I don't see why it would be, though; S3TC has better quality.

I was referring to the NOLF 2 shots.
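
On hax's S3TC point, though, there is a more direct check than eyeballing banding: ask the driver what internal format it actually allocated. A minimal sketch (assuming an OpenGL context with the texture in question bound, and trusting the driver to answer honestly):

    /* Hypothetical sketch: query the internal format the GL driver
       actually gave a bound texture (16-bit vs 32-bit vs S3TC). */
    #include <GL/gl.h>
    #include <stdio.h>

    void report_texture_format(void)
    {
        GLint fmt = 0, r = 0, g = 0, b = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &r);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE, &b);
        /* e.g. 0x8056 = GL_RGBA4, 0x8058 = GL_RGBA8,
           0x83F1 = GL_COMPRESSED_RGBA_S3TC_DXT1_EXT */
        printf("internal format 0x%04X, bits R%d G%d B%d\n", fmt, r, g, b);
    }

A driver that silently downgrades could lie here too, of course, but it's a better starting point than a screenshot.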
 
demalion, I would say to re-read your post, if it was actually your post I was replying to. I actually recommend you re-read mine. :) This is a comment from my post:

The logical conclusion that Prime pointed out on page 18 is that this is id Software's problem

I wasn't replying to you.

Of course, now replying to you: Quake 3 does do 'odd' things with gamma. Quake 3 increases the screen's dynamic range from 0-1 to 0-2 by making specific adjustments to the gamma table. If you disable hardware gamma, or you disable overbright bits (r_overbrightbits, from memory), Quake 3 is unable to increase the dynamic range, so things will look pretty different. This occurs on all hardware.
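
A rough sketch of the idea (my reconstruction of it, not id's actual code): with overbright bits enabled, the framebuffer holds colour values at a reduced scale, and the gamma table multiplies them back up at the DAC, clamping at white; that's how values above 1.0 survive into the 0-2 range:

    /* Hypothetical sketch of overbright gamma-table construction
       (a reconstruction of the idea, not id's actual code). With
       'shift' overbright bits the framebuffer holds values divided
       by 2^shift; the ramp scales them back up at scanout. */
    #include <math.h>

    void build_overbright_ramp(unsigned short ramp[256], float gamma, int shift)
    {
        for (int i = 0; i < 256; i++) {
            /* user gamma curve first */
            int v = (int)(255.0f * powf(i / 255.0f, 1.0f / gamma) + 0.5f);
            v <<= shift;                /* scale back up by 2^shift */
            if (v > 255) v = 255;       /* past 1.0 clamps to white */
            ramp[i] = (unsigned short)((v << 8) | v);  /* expand to 16 bits */
        }
    }

Kill the ramp (or the overbright bits) and the game has to squeeze everything back into 0-1, which is why it looks so washed out or dark when a driver's gamma handling breaks.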
 
Oh brother, demalion; Doom's post about Nature being forced into 16-bit mode has absolutely nothing to do with your texture sharpening posts. The way you make it sound, the whole thread revolves around your texture sharpening post.

This has become childish.
 
Colourless said:
demalion, I would say to re-read your post, if it was actually your post I was replying to. I actually recommend you re-read mine. :) This is a comment from my post:

The logical conclusion that Prime pointed out on page 18 is that this is id Software's problem

:LOL: Well, it seemed that I was being swept into the same category. It seems I was just plain wrong, no two ways about it. ;) I hope you can see why I made my mistake, though.

I wasn't replying to you.

Of course, now replying to you: Quake 3 does do 'odd' things with gamma. Quake 3 increases the screen's dynamic range from 0-1 to 0-2 by making specific adjustments to the gamma table. If you disable hardware gamma, or you disable overbright bits (r_overbrightbits, from memory), Quake 3 is unable to increase the dynamic range, so things will look pretty different. This occurs on all hardware.

How does some hardware get proper screenshots, then? Is there some relationship between hardware gamma and in-game gamma settings that is necessary?
 
The QuakeIII issue also carried over to RTCW for me, and going back to NOLF2, the 40.41s affected the gamma there too, at least for me, just not as badly as in QuakeIII-engine games. SeriousSam had no issues with the new drivers. Texture sharpening did have a dramatic effect on a few textures in NOLF2 in my case; I'm not really sure what the setting does. I don't buy that it increases anisotropic filtering one notch, since I was at 8x anyway during the test. I am back on the 29.42 drivers, thank goodness for those. They don't have many options, but at least they work right for me, except for the TV-out stuff.
 
When will someone post screens and Nature benchmarks with 16-bit color, with both driver sets included, the 30.82s and the new 40s?
 