NVIDIA releases new all-singing, all-dancing Dets.

1. For those interested (or for those who think the image is more blurry with these drivers), here are three shots from Tiger Woods:

30.82:
http://129.241.138.186/3082normal.jpg

40.41 default without Texture Sharpening:
http://129.241.138.186/4041nosharpening.jpg

40.41 with Texture Sharpening:
http://129.241.138.186/4041sharpening.jpg

It's obvious from these pictures that the IQ is just as sharp (or blurry) with the new drivers as with the old. Enabling Texture Sharpening does make the image sharper, but BEYOND the default "sharpness" of both driver sets.

2. The overclocking tab is only available if overclocking is enabled in RivaTuner (or similar) OR by using Coolbits.reg. The pictures I posted on the first page might be misleading.

3. As some people here mentioned (and I have now verified it many times myself): the default aniso setting is actually 1X (or "application controlled") and not 0X, even though the slider shows 0X. The first time you change the slider, "application controlled" goes away and aniso follows the slider, with 1X meaning "no aniso" and 0X meaning no filtering whatsoever. To get "application controlled" back, press the Restore Defaults button. Or just use 1X.
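The slider behaviour described above is confusing enough that it may help to spell it out. This is only a toy sketch of how the panel *appears* to behave from the posts in this thread, not the driver's actual logic:

```python
# Hypothetical model of the 40.41 D3D aniso slider, as described above.
# "touched" = has the user ever moved the slider since install/Restore Defaults.
def aniso_mode(slider_value, touched):
    if not touched:
        # Fresh install or after Restore Defaults: the slider *shows* 0x,
        # but the driver actually leaves filtering to the application.
        return "application controlled"
    if slider_value == 0:
        return "point sampling (no filtering)"
    if slider_value == 1:
        return "no aniso (plain bilinear)"
    return "%dx anisotropic" % slider_value
```

So a freshly installed driver reports `aniso_mode(0, False)` = "application controlled" even though the UI reads 0x, which is exactly the trap point 3 warns about.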
 
I just have to say, Galilee, that those shots are not particularly good at showcasing texture quality.

Try finding shots that include highly regular textures that are also high-contrast. Those make the best test. I can e-mail you a tiny level for UT that I made to compare texture filtering on horizontal/vertical surfaces (i.e. it doesn't show the 8500's rotation probs...).
 
Reverend said:
Er, has it been confirmed that the improvements are for the pixel shaders and not the vertex shaders? Insofar as 3DMark (Nature) and Aquanox are concerned, I don't think we can separate the two (vertex and pixel shaders).

In any case, your second question probably answered your first, ben :).

When I tested my GF4, I noticed that the advanced pixel shader test score almost doubled and the Nature test almost doubled, while the other 3DMark tests involving vertex shaders stayed about the same. I thought the Nature test was more stressful on the pixel shaders, so I assumed (which is a bad idea)... Still, a 2x gain is a very healthy optimization.
 
Chalnoth said:
I just have to say, Galilee, that those shots are not particularly good at showcasing texture quality.

Try finding shots that include highly regular textures that are also high-contrast. Those make the best test. I can e-mail you a tiny level for UT that I made to compare texture filtering on horizontal/vertical surfaces (i.e. it doesn't show the 8500's rotation probs...).

Well, I know, but it's quite easy to see on the green grass if you switch back and forth.
But please e-mail me that map if you can: vidaralm@stud.ntnu.no
 
When I tested my GF4, I noticed that the advanced pixel shader test score almost doubled and the Nature test almost doubled, while the other 3DMark tests involving vertex shaders stayed about the same. I thought the Nature test was more stressful on the pixel shaders, so I assumed (which is a bad idea)... Still, a 2x gain is a very healthy optimization.
The toughest thing in the Nature test isn't the vertex or pixel shaders - it's the stack of alpha textures all over the trees and ground. The first couple of frames are predominantly textures - there's no pixel shading going on until you reach the lake surface, and the vertex work only involves fairly basic movement - and the newer drivers have a significant effect right from the off.

The VS test only uses a shader for the skinning; everything else is standard pipeline stuff, which is probably why there's no increase here. The PS/APS tests, though, both use multiple texture stages (4 in each test), which makes me wonder if this is where the "boost" has been applied.
 
Also, Galilee, did you post fps comparisons for that game already? I think it would be necessary to include frame rates for the image quality comparisons, hopefully in the same post for clarity.
 
Slightly off-topic, but does anyone remember the console command in Quake 3 that shows the different mipmap boundaries using different colors? If you know the command for Serious Sam 2 as well, that'd be great too. :) TIA.
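For what it's worth (and if memory serves), the Quake 3 cvar is `r_colorMipLevels`. It's cheat-protected, so the map has to be loaded with `devmap` first, e.g.:

```
\devmap q3dm1
\r_colorMipLevels 1
```

I don't recall a Serious Sam 2 equivalent offhand, so no promises there.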
 
demalion said:
Also, Galilee, did you post fps comparisons for that game already? I think it would be necessary to include frame rates for the image quality comparisons, hopefully in the same post for clarity.

Hi. I forgot to use FRAPS with the 30.82 drivers (and I won't install them again - too much work). But I tested with the new drivers and got 71 fps at the point where the shot is taken. With sharpening I got 43 fps. According to Reverend, performance in Tiger Woods is the same with the new drivers. No idea how to run a timedemo in that game, but it feels smoother with the new drivers (it was pretty choppy with the old ones for me).
 
Neeyik said:
Shouldn't that settle the IQ (cheating) question?
There's no "cheating" to settle. Go back a few pages and you'll see that I've posted test results for Nature with the 30.30s and the 40.41s - not surprisingly, the latter are considerably higher. I then posted scores for all the tests in 3DMark with the AF setting at 0x and 1x (and yes, it looks rough at 0x)... result? No difference - which should come as no surprise if people actually remembered their hardware stats. Each pipeline TMU, on a GF4 at least, is designed to fetch a bilinearly filtered texel in a single clock cycle in the first place (so the memory bandwidth hit has already been taken into consideration) - so what is there to be gained by using nearest-point?
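To illustrate why point sampling buys nothing on this hardware: a bilinear fetch blends the four texels around the sample point, while point sampling just grabs the nearest one - but since the TMU delivers the bilinear result in one clock anyway, the cheaper-looking mode is no faster, only uglier. A toy software sketch of the two filters (single-channel texture as a list of rows, `tex[y][x]`, nothing hardware-specific):

```python
# Point sampling: snap to the nearest texel.
def point_sample(tex, u, v):
    h, w = len(tex), len(tex[0])
    x = min(int(u + 0.5), w - 1)
    y = min(int(v + 0.5), h - 1)
    return tex[y][x]

# Bilinear: weighted blend of the four surrounding texels.
# Four reads instead of one, but a GF4-class TMU still does this
# in a single cycle, so there is no speed to gain by dropping it.
def bilinear_sample(tex, u, v):
    h, w = len(tex), len(tex[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy
```

Sampling halfway between a black and a white texel gives 0.5 with bilinear, whereas point sampling snaps to one of the two - which is exactly the "rough" look at 0x mentioned above.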

For clarification, I certainly was not accusing NVIDIA of "cheating," but simply pointing out that others are, and the only way to get to the bottom of it is to do direct IQ comparisons in the specific applications showing the most dramatic effect.

This happens to be the "Nature" test, and I personally would be interested in seeing the two IQ screenshots generated by the Pro version if someone would post them (one for each driver... and possibly the advanced shader test as well).

I couldn't care less about the 0x or 1x settings, since I thought it was already established that the driver defaults to 1x and the slider labeling is just misleading.

Again: can anyone post the screenshots of Nature for comparison between 30.xx and 40.xx drivers? If people are "suspicious" of the method in which performance in this test was increased, then direct screenshot comparisons would relieve their doubts. I'd do it myself, but I have neither the pro version nor a GF4.
 
Alrighty, I snapped off a bunch of screenshots in both Quake 3 and Serious Sam 2 showing the different colored mipmap boundaries at every anisotropic filtering level, for both D3D and OGL. Suffice it to say, I'm quite baffled by 8x in D3D, and I don't see the point of using 0x at all. Too grainy. See the results here: http://www.3dgpu.com/yabb_se/index.php?board=2;action=display;threadid=1360

Just be forewarned, you'll have to download about 836KB of data for all the pictures to load.

Bigus Dickus:

Hang on tight, I'll do it. I'm assuming here you want screenshots of both Det 40.41 and 30.82 at default D3D settings....
 
I'd also recommend the most similar-looking image quality shots that show a performance increase, if you can spare the time to establish how similar they look. Perhaps just ensure 1x sampling is enabled and use that for the comparison?
 
Reverend said:
You sure 1x = bilinear and not as per application setting?

I'm sure - I've verified it in our own engine.

Upon installing the driver, it's in application settings mode (although it displays 0x).

After you change the anisotropic setting, it always overrides the application, and there's no way to change it back to "application preference" (except by uninstalling the driver).

---

(attached image: detxp.png)

The new standard in image quality!
 
Use Restore Defaults in the D3D driver panel to set aniso 0x back to application preference.
 
Galilee said:
1X is the same as application control. (bilinear/trilinear).

Not entirely - 1x is bilinear, although in most cases it leads to the same results.
 
Okay, finally... takes forever for me to upload on this blasted 56k modem. :-?

I've taken nearly identical screenshots in Nature with Detonator 30.82 and 40.41. I made sure I clicked Restore Defaults on the D3D properties page, and that everything was left at its default. I didn't touch anything in 3DMark 2001SE except to tell it to only run Nature. I used F12 to capture, which creates a .bmp file, and I've converted each to .jpg with the compression ratio at the lowest setting (1%). Each screenshot is quite large, at about 700 KB apiece.

Detonator 30.82: http://www.3dgpu.com/misc_images/det3082.jpg

Detonator 40.41: http://www.3dgpu.com/misc_images/det4041.jpg

Looking at them, and taking into account that the shots are about 0.05 seconds apart, I really can't notice any difference at all. Can you?
 