nVidia releases new all-singing, all-dancing dets.

breez said:
When will someone post screens and Nature benchmarks with 16 bit color? Both driver sets included: 30.82 and the new 40s.

Well, "16 bit color textures" would be more helpful, and in contrast with 32 bit color textures.
 
demalion said:
breez said:
When will someone post screens and Nature benchmarks with 16 bit color? Both driver sets included: 30.82 and the new 40s.

Well, "16 bit color textures" would be more helpful, and in contrast with 32 bit color textures.

Yeah, whatever :)
 
Nature 1024x768x32, system specs below.

Compressed textures - 38.2 FPS

DXT1 textures - 38.0 FPS

16 bit textures - 40.1 FPS (more banding in the sky)

32 bit textures - 37.4 FPS

Looks like for a GF3 it is a vertex-limited benchmark and not memory bandwidth limited. The GF4 may show much different results due to its dual vertex shader units.

Edit: maybe CPU limited in my case; we need a more capable machine :eek:

I am not going to waste my time reloading the 40.41 drivers to see the difference in this benchmark. Oh well.
 
demalion said:
How does some hardware get proper screenshots, then? Is there some relation between hardware gamma and in-game gamma settings that is necessary?

Quake 3 will gamma correct the screenshots it takes. The TGAs it creates should match what's on screen; if they don't, there is a problem. It must also be noted that the TGAs Quake 3 creates may not actually match what is in the frame buffer. If hardware gamma and overbrights are enabled, as they are by default, the frame buffer is not the same as what is written to disk.
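To make that concrete, here's a minimal C sketch of what such a screenshot path looks like: read the framebuffer back, remap it through a gamma lookup table, then write the TGA. The table math here is my own illustration of the idea, not id's actual source.

#include <math.h>
#include <GL/gl.h>

static unsigned char gammatable[256];

/* Build a 256-entry lookup table for the current gamma setting. */
static void BuildGammaTable(float gamma)
{
    int i;
    for (i = 0; i < 256; i++) {
        int v = (int)(255.0f * powf(i / 255.0f, 1.0f / gamma) + 0.5f);
        if (v > 255)
            v = 255;
        gammatable[i] = (unsigned char)v;
    }
}

/* Read back the framebuffer, then remap every byte through the table
   so the saved image matches what the gamma-ramped screen showed. */
static void GammaCorrectReadback(int width, int height, unsigned char *rgb)
{
    int i, n = width * height * 3;
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, rgb);
    for (i = 0; i < n; i++)
        rgb[i] = gammatable[rgb[i]];
    /* ...then write 'rgb' out as an uncompressed TGA as usual. */
}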

Let me show you. (Click on images for larger version)

This first image shows what you are normally supposed to get on screen, and in a screenshot taken by Quake 3.

This second screenshot is a direct framebuffer dump. Notice how much darker this shot is than the previous one. This is what happens when Quake 3's gamma correction/modification isn't applied.



For completeness' sake, here are a few more shots showing what various settings do.

This shot is what you get when you disable overbright bits. You should notice that much of the screen is the same as the normal shot, except that the brightly lit areas aren't anywhere near as bright.


This last shot shows what you get with Hardware Gamma Correction Disabled. It is effectively the same as the shot with no overbright bits.
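For reference, these are the console variables involved; the values shown are the usual defaults, but double check them on your own install, as they can differ between point releases:

r_gamma 1.0          // in-game gamma; drives the correction baked into TGAs
r_overBrightBits 1   // set to 0 to disable overbright bits
r_ignorehwgamma 0    // set to 1 to disable hardware gamma correction
vid_restart          // restart the renderer to apply the changes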
 
Thanks, your illustration is very clear. My last question is to verify the reason for the difference between the framebuffer dump and the corrected TGA images: is it that Quake 3 artificially adjusts the data in the process of creating a screen capture, to simulate the effect of its custom gamma handling (of the hardware gamma, I presume), and that this adjustment is not present when the data is viewed independently? I tested various in-game gamma settings and none matched the direct framebuffer dumping that was the source of the discrepancy I experienced. EDIT: To be clear, I read the answer as yes and just want to make sure I read correctly.

I was aware of the basic principle here (but not of the significance of overbright, or the full import of enabling the ignorehwgamma setting), and not fully cognizant that all other screenshots depended on being taken by the in-game routine (I just used PrtScrn).
 
The reason Quake 3 applies the gamma is that the hardware gamma table is set up to brighten the images by a factor of 2. Without brightening the images, they will not look anything like they are supposed to.
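On Windows that doubling is done through the hardware gamma ramp. Here's a rough C sketch of the idea, using the real SetDeviceGammaRamp API; the exact curve Quake 3 builds may differ, and some drivers reject or clamp extreme ramps:

#include <windows.h>
#include <math.h>

/* Program a gamma ramp that also scales brightness by an overbright
   factor (2 in the case described above). Values past the clamp simply
   saturate to white, which is the "overbright" look. */
static void SetOverbrightRamp(HDC hdc, float gamma, int overbright)
{
    WORD ramp[3][256];
    int i;
    for (i = 0; i < 256; i++) {
        int v = (int)(65535.0f * powf(i / 255.0f, 1.0f / gamma)) * overbright;
        if (v > 65535)
            v = 65535;
        ramp[0][i] = ramp[1][i] = ramp[2][i] = (WORD)v;
    }
    SetDeviceGammaRamp(hdc, ramp);
}

Note that the scaling happens on the way to the monitor, after the framebuffer, which is exactly why a raw framebuffer dump (or PrtScrn) looks so much darker than the screen.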

So, the answer is a 'yes'.

You should NOT use PrtScrn with Quake 3 engine games. Your screenshots will most likely be wrong.
 
Colourless said:
Gamma in Quake 3 has been working correctly on pretty much all cards since the Q3Test days. All of a sudden nVidia releases some drivers that seem to break Quake 3. The logical conclusion, as Prime pointed out on page 18, is that this is id Software's problem.

/me laughs out loud

Hmmm, laugh all you want, and go ahead and put words in my mouth while you're at it... I don't remember saying that it's conclusively id's problem. My speculation was based on the posts of a friend of mine who pointed out on another board that Q3 possibly doesn't like the newer OpenGL 1.4 code in these newest drivers. I'm hopeful (note the "Hopefully this will be fixed" part in my previous post) that this is the case and that it will be fixed in the newly announced soon-to-be-released update to Q3 (even though before this, id said there would be no more updates). Or that nVidia will fix it in the next revision. I guess I should have put that last bit in my other post so as not to cause you such hysterical fits. :rolleyes:
 
Prime said:
My speculation was based on the posts of a friend of mine who pointed out on another board that Q3 possibly doesn't like the newer OpenGL 1.4 code in these newest drivers.

It's highly unlikely to be an API issue, since it's incumbent on the API to maintain backwards compatibility with previous versions of the API, such that the raft of older applications that utilise those previous versions do not need such changes. Do DX7 games suddenly stop working because DX8 or 9 is about? No.

It's more likely to be an issue with the drivers.

Also, I've heard that the same issue does not present itself with Medal of Honour, which is another Q3-based title likely using similar engine code, so this would appear to be something specific.
 
Well, this problem with Q3A seems to affect only some people. Maybe they have to reinstall the drivers or something.

The brightness slider works fine, and I can easily distinguish between 32 and 16 bit color, and between 16 and 32 bit textures.
 
DaveBaumann said:
It's more likely to be an issue with the drivers.

Also, I've heard that the same issue does not present itself with Medal of Honour, which is another Q3-based title likely using similar engine code, so this would appear to be something specific.

I agree with your point, but it's no more certain than my speculation. You can turn the logic over and say that if it works in MoH but not Q3, it's more likely Q3 is at fault, since MoH is similar but works. :)
Also, my original post did not intend to blame any party. Could there not simply be an incompatibility between the new drivers and the old game without either being "broken" or "wrong"? Maybe id doesn't need to "fix" Q3, but rather "modify" it? Ditto for the drivers. Regardless, this is turning into too long a post to state something as simple as: "Maybe it will work better after the Q3 point release" or the next driver release.

noko: I also have a GF4 and my Q3 looks bad too. It's definitely not just a GF3 problem. I would state that hopefully the next point release will help matters, but apparently that's not the right thing to say...
 
Prime said:
Could there not simply be an incompatibility between the new drivers and the old game without either being "broken" or "wrong"?
no, there could not

And you logic is flawed.
 
Prime..

If Noko uses an older driver everything looks OK, if he goes back to the Det 40s things get bad, and then you state you hope id fixes this driver issue in their next point release..

Ok then.. :-?
 
The new dets contain support for the ARB_vertex_program extension. NitoGL posted an R9700 demo over at Rage3D which runs fine (300+ fps); if you enable NV30 emulation, this drops to ~6 fps without any visible change in quality. Does this mean the NV30 will be 50x faster than the NV25? :D That is a joke; before the SCUDs start flying...
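If anyone wants to check what their driver exposes, the standard extension-string test works. A quick C sketch (with the usual caveat: strstr can false-positive on extensions whose names are prefixes of others, so a truly robust check tokenises the string):

#include <string.h>
#include <stdio.h>
#include <GL/gl.h>

/* Requires a current OpenGL context. */
static int HasExtension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void CheckVertexProgram(void)
{
    if (HasExtension("GL_ARB_vertex_program"))
        printf("GL_ARB_vertex_program supported\n");
    else
        printf("GL_ARB_vertex_program NOT supported\n");
}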
 
At NVNews, after the first 10 pages of fanboyism, the problems are starting to get recorded for these drivers. Video out seems to be broken, as well as a number of other issues. Well, nVidia has definitely released a beta driver on their website that truly is BETA. There are some nice features with these new drivers, but still too many problems. Why release a driver that can imitate an NV30 when the NV30 is supposed to be around the corner :eek:?
 
I think we all know the answer to that question. But it seems that if you try to point it out, you'll be labeled an "insert competing company name here" f@nboy. :rolleyes:
 
Doomtrooper said:
Prime..

If Noko uses an older driver everything looks OK, if he goes back to the Det 40s things get bad, and then you state you hope id fixes this driver issue in their next point release..

Ok then.. :-?

OK, OK, I guess I should have worded my original post differently. I had it stuck in my head about a possible OGL 1.4 incompatibility from a discussion with someone else, and then id comes out with news of a new Q3 point release. As I stated in my earlier response, my original post did not have the intent of saying that it's obviously a Q3 problem, nor even that it was likely a Q3 problem. The intent was that hopefully it was a Q3 problem/incompatibility, and that it would be fixed quickly with the upcoming point release. That would get the problem resolved more quickly, and I'd still have all the goodies in my new driver panels. :)

Little did I know that an innocent post would provoke such a response from certain individuals on this board who take video cards WAY too seriously. (Not talking about your post, Doom. It was pretty tame under the circumstances)

Althornin said:
no, there could not

And you logic is flawed

Well geez, I stand corrected. Such compelling arguments have swayed me over to your side... :rolleyes:
PS. And you grammar is flawed

noko said:
Video out seems to be broken, as well as a number of other issues.

Can't speak for the other issues, but vid-out on my card (GF4) actually works better with the newer drivers. (Sorry if that offends some of you :rolleyes:)
 
Here is an example of just how deluded the internet press is:

http://www.hothardware.com/hh_files/S&V/det40.shtml

While even the most jaded nVidia supporter here will admit that there are issues with this driver, this is the kind of garbage that the majority of the internet passes on as "real" journalism. Here's the final conclusion:

"It is certainly good to see NVIDIA putting work into its drivers. Of course, we'd much rather see NV30 or even a functional nForce2 board for that matter. The new Detonator 40.41 drivers deliver on what NVIDIA claims, though. Mainly, gamers get a little extra performance and more business-oriented users have a couple new features to increase productivity. The inclusion of anisotropic filtering support has been long overdue, but for the most part, the feature is still too slow to be used at high resolutions. From a logistical point of view, the new control panel is much easier to navigate and should be very easy to adapt to. Now, if only ATI could take a hint and simplify the myriad of quality and performance settings cluttering the Catalyst drivers..."

This is the reason to applaud B3D for their insightful reporting. I bet HotHardware is on the "A" list to receive the newest nVidia hardware.....
 
martrox said:
Here is an example of just how deluded the internet press is:

http://www.hothardware.com/hh_files/S&V/det40.shtml

Well... Chris, the author of that HotHardware article, used to be the main hardware writer for SharkyExtreme, a site I've never had much respect for. But plain ole' logic and that article's conclusion are definitely polar opposites of one another (insofar as its comments comparing the layout and functionality of these new beta drivers to the Catalyst control panels go).
 