nVidia releases new all-singing, all-dancing Dets.

Flaws always come out. Why put them in on purpose?

How can you consider "blind benchmarks" improving by a large measure in the single most controversial performance issue a flaw?

A website reviews driver version X at max AF and scores 56 fps; then driver version Y at max AF suddenly scores 76 fps... and there is much rejoicing, and the topic slowly dies, since nobody bothers to actually look at driver version Y and notice it does absolutely nothing IQ-wise between the max and second-from-max settings... with the only notable difference being a very small change in performance, to discourage anyone who might think the topmost setting isn't "taking" or being set at all.

But then again, I don't expect any reasonable discussion on this topic, as we've already been over this: you gloriously announced that the difference between 4x and 8x was "plainly obvious" and that you just needed to "know where to look", yet scurried away into silence when this link was given during the last set of "monumental AF performance improvements at 8x!" driver revisions:
http://shark_food.tripod.com/aftest/aftest.html

So in reality, NVIDIA can keep playing their benchmark games for the "blind benchmarks", creating spurious and fictional improvements in AF performance (and now spurious non-AF performance too, by using point sampling as the no-AF mode), and the owners of the hardware can just recognize two things:
1) Benchmark scores published for AF (and now no-AF) have absolutely zero basis in true hardware performance.
2) NVIDIA still has the "real McCoy" reachable by Unwinder or others making tweak utilities, so we at least have the option to use this functionality, albeit at the proper, non-website-benchmark performance levels.

It is just evolving into things like:
Bill: How fast does the Geforce4 run in benchmark XYZ?
Ralph: 126 FPS by website method, 97 fps for "real"
Bill: How about with anisotropic filtering?
Ralph: Well, website default for 8x would be 78 fps, but with "real" 8x it's 63 fps.

So it does all simply become a matter of PR... but also one of missed expectations, if anyone comes to believe that the performance shown in website benchmarks can be used to extrapolate real performance once the REAL features are actually used.
 
NVIDIA has obviously done something wrong with the 8X setting. What makes you think that is the way it's supposed to be?
There is absolutely no difference in IQ between 4X and 8X in most games, and the performance is very near the same:
3dmark Low Detail Car: 8X 123.9fps
3dmark Low Detail Car: 4X 125.0fps

Dragothic Low Detail: 4X: 112.5fps
Dragothic Low Detail: 8X: 111.3fps

In UT 4X looks similar to 8X.

To me it looks like they are trying to optimize the settings somehow (maybe by using a trick similar to the one RivaTuner can do), but it looks like they are either doing something wrong at this point, or they are not "there" yet.
I think it's a little too early to yell "cheat!".


1) Benchmark scores published for AF (and now no-AF) have absolutely zero basis in true hardware performance.

I don't understand this. Do benchmarks that don't use AF not have any real basis? Please elaborate. And do benchmarks that use 2X or 4X AF not have any real basis?
 
Could the fact that the drivers are for the XP/2000 OS only have an effect on the increases* seen in them?
Disregarding the 8x aniso for a moment - couldn't they have figured out a way to squeeze some extra speed out of the card by using certain optimisations found in 2k-based OS's?

Or am I just grasping at straws... and we will see a W9x release of the drivers soon?
I feel that if we do not see a W9x version of the drivers, then this may be the case.

*actual increases are subject to opinion
** edited for content
 
Galilee - the difference between 4x and 8x AF is dependent on the levels of anisotropy in those tests. If the algorithm produces no more taps than what is available at the 4x level, then performance should be the same at 8x.
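That point can be sketched as a toy model (purely illustrative - the function names and the per-pixel "required level" are my assumptions, not NVIDIA's actual algorithm): if the driver clamps the tap count to what each pixel's anisotropy ratio actually requires, the 4x and 8x settings cost exactly the same whenever no pixel in the scene needs more than 4:1.

```python
# Toy model of adaptive anisotropic filtering cost.
# Illustrative only - NOT NVIDIA's actual algorithm.

def taps_used(requested_level, required_level):
    """Taps taken for one pixel: the driver never samples
    beyond what the pixel's anisotropy ratio requires."""
    return min(requested_level, required_level)

def frame_cost(requested_level, required_levels):
    """Total taps across a frame's pixels."""
    return sum(taps_used(requested_level, r) for r in required_levels)

# A scene whose surfaces never exceed 4:1 anisotropy...
scene = [1, 2, 2, 4, 4, 4]
# ...costs the same at the 4x and 8x settings:
print(frame_cost(4, scene) == frame_cost(8, scene))  # True

# A scene with steeper viewing angles does show a difference,
# which matches the drops seen in the high-detail tests:
steep = [1, 4, 8, 8]
print(frame_cost(4, steep), frame_cost(8, steep))  # 13 21
```

This is consistent with the observation that only the high-detail Dragothic and Nature tests, which presumably contain steeper surfaces, show a drop across all AF levels.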

[attached: gameresults2.gif]


From my own testing, the high detail dragothic and nature tests show a drop across all the AF levels.
 
Okay, but then what the f*** are they doing? :)
My guess is that they have optimized some levels, and that results in 4X quality in some games but better in others.
?

And an optimization like that might mess up the display of mip-levels like the SS:SE shots Matt took.
 
Top half of screen 8x, bottom half 4x, that could be an adaptive method that would work somewhat between the 4x and 8x setting. Not that Nvidia is doing this.
 
The bottom line here is that something's wrong with these drivers. I would not accuse nVidia of anything... yet. When these drivers make WHQL, then there should be major screaming... with the exception that so many sites will use these drivers like they are the second coming of Christ... which very well may be nVidia's intent. While they can stand back and say "hey, that's why they're beta", notice they don't issue any statements explaining just what's happening. Hence my original question of another "nVidia half-truth".
 
Yeah, they're just borking 8x to improve benchmarks... that's why Neeyik's benchmarks show the 40.41 performing worse than the 30.xx.

Same tiring arguments over and over and over.
 
walkndude said:
Yeah, they're just borking 8x to improve benchmarks... that's why Neeyik's benchmarks show the 40.41 performing worse than the 30.xx.

Sorry you didn't understand this... I will try to be more clear.

8X was "borked" earlier (actually in the first month after the GF4 was released - the "AF performance has been fixed!" drivers), but people are only now actually looking at it (the link I gave is over a month old). It is now additionally "borking" more to close the gap, and I'm sure Anand is busily doing a "Radeon 9700 vs GF4 Ti 40.41 shootout" as we speak, where point-sample mode is compared to trilinear and 4x on the GF4 (labeled as 8x) is benchmarked against 16x on the 9700.
 
1600*1200, TC disabled, 40.41

car test, high quality

4x: 36.6 fps - aniso registry value is set to 3 -> 4x aniso
8x: 36.7 fps - aniso registry value is set to 4 -> 4x aniso (bug)
8x: 31.4 fps - RivaTuner 8x; aniso registry value is set to 8 -> 8x aniso

lobby test, high quality

4x: 28.1 fps - aniso registry value is set to 3 -> 4x aniso
8x: 28.2 fps - aniso registry value is set to 4 -> 4x aniso (bug)
8x: 24.3 fps - RivaTuner 8x; aniso registry value is set to 8 -> 8x aniso
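The pattern in those dumps can be summed up in a few lines (the function names below are mine, and only the 4x/8x numbers come from the registry observations above - this is a restatement of the bug, not driver code):

```python
# Restating the registry dumps above (names are mine; only the
# observed 4x/8x mappings come from the measurements).

def panel_to_registry(panel_setting):
    """Registry value the 40.41 D3D panel appears to write
    for a given aniso slider choice."""
    observed = {
        4: 3,  # panel 4x writes value 3 -> 4x aniso
        8: 4,  # panel 8x writes value 4 -> still 4x aniso (the bug)
    }
    return observed[panel_setting]

def registry_to_aniso(value):
    """Actual aniso level run for a given registry value,
    per the RivaTuner comparison above."""
    return {3: 4, 4: 4, 8: 8}[value]

# The panel's "8x" ends up running 4x aniso:
print(registry_to_aniso(panel_to_registry(8)))  # 4
# Forcing the registry value to 8 with RivaTuner gives real 8x:
print(registry_to_aniso(8))  # 8
```

Which is exactly why the panel's 4x and 8x scores are within 0.1 fps of each other, while RivaTuner's forced 8x drops roughly 4-5 fps in both tests.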


I think it's too obvious a bug to be a cheat, at least IMO. The funny thing is that Brian Burke doesn't mention this 8x aniso issue, while for him the point-sampling thing is a bug, which I think it's not.
 
Maybe the anisotropic issue is a bug (I personally don't believe it is; as Shark pointed out, it has been like this for a while)... it still doesn't explain the point-sampling offering, which lowers IQ yet inflates 3DMark scores (a default graphics setting, I might add), or the obvious optimizations for BENCHMARK apps only...

When is the last time any of us played 3DWINBENCH...no comment.. :rolleyes:

The only good thing about this release is the improved driver panel.
 
Doomtrooper said:
Maybe the anisotropic issue is a bug (I personally don't believe it is; as Shark pointed out, it has been like this for a while)... it still doesn't explain the point-sampling offering, which lowers IQ yet inflates 3DMark scores (a default graphics setting, I might add), or the obvious optimizations for BENCHMARK apps only...

When is the last time any of us played 3DWINBENCH...no comment.. :rolleyes:

The only good thing about this release is the improved driver panel.

Why did they even implement aniso settings like this -> 0, 1, 2, 4, 8? Of course it's confusing people. And aniso 0 is the same as it is in RivaTuner, and that is point sampling, so nothing seems wrong; they just forgot to also add the specific "D3D application determined" option, which is also in RivaTuner. Actually, the default setting is "D3D application determined", but as soon as you change the aniso setting for the first time, aniso 0 will be point sampling. You can set it back to "D3D application determined" if you hit the "Restore defaults" button, and there is no need to uninstall the drivers as Brian Burke stated.
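That sequence of states can be sketched as a tiny state machine (illustrative only - the class and method names are mine; only the behaviour is from the description above):

```python
# Sketch of the 40.41 D3D aniso panel behaviour as described above.
# The class is hypothetical; only the state transitions come from the post.

class AnisoPanel:
    APP_DETERMINED = "d3d application determined"

    def __init__(self):
        # Fresh install: default is application-determined, not point sampling.
        self.setting = self.APP_DETERMINED

    def set_aniso(self, level):
        # Once the user touches the slider, 0 means point sampling -
        # there is no way back to application-determined from the slider.
        self.setting = "point sampling" if level == 0 else f"{level}x aniso"

    def restore_defaults(self):
        # "Restore defaults" recovers application-determined;
        # no driver reinstall needed.
        self.setting = self.APP_DETERMINED

panel = AnisoPanel()
print(panel.setting)     # d3d application determined
panel.set_aniso(0)
print(panel.setting)     # point sampling
panel.restore_defaults()
print(panel.setting)     # d3d application determined
```

The missing piece, in this reading, is simply a slider position for "application determined" - which is why the only way back is the "Restore defaults" button.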
 
...

Well, can't you just call it "All singing, all dancing display driver, that - being rather drunken - unfortunately wet itself" and be done with it?

;)
 
Doomtrooper said:
still doesn't explain the point-sampling offering which lowers IQ yet inflates 3DMark scores (default graphic setting I might add),

The mode selected when you choose "aniso 0" is NOT the default. After installing the drivers, the default is always "application preference". Now nVidia just needs to fix the driver control panels so that there is an option to switch back to it after selecting another AF mode.

In other words, right now, the D3D aniso settings in the driver panel are useless. However, it's still good that they're there, as the problems are so simple that they're almost certain to be fixed with the next release.

I think it's kind of silly, though...it seems that all the intelligent people at nVidia are doing things other than making the driver panels. They probably have one intern doing all the work for the driver panels...
 
Since I'm trying to do a review of MotoGP at the moment (and it didn't like the 40.41 drivers at all), I've been playing about with AF in the 30.83s - rather than just use the usual benchmarks, I wanted something with a shed load of texture layers and/or wide range of anisotropy and ended up using the PVR Temple demo.

1024 x 768 @ 32-bit
Default drivers = 132.5 fps
RivaTuner 0x = 132.6 fps
1x = 132.9 fps
2x = 112.1 fps
4x = 92.7 fps
8x = 81.5 fps

So let me get this straight - somebody wants to claim that using nearest-point sampling is an excellent "cheat" for a GF4? When the memory interface is perfectly capable of coping with 4 32-bit texels per memory bus clock? Oops, my mistake - there's a 0.1 fps gain to be had!

And as for AF 4x & 8x...well, I have to say that I couldn't see much difference myself but there again the areas where it would matter didn't stay visible for very long. There's certainly a performance difference between the two though.
 
Ok, I'll post what I edited away. It's flammable, but who gives a shit.

:

I'm sorry, but this is getting way out of hand. ATI people like Doom keep saying they are sacrificing image quality for performance and that the default setting is point sampling.

That is just plain wrong. The default setting is application controlled, and it looks like it always has looked.

It is actually pretty irritating to hear people say this over and over again. Not just here (because here people are intelligent enough to know BS when they hear it), but when these people say it over and over again at ATI fansites, it pisses me off. I know that scandals like this are essential for a fan (and essential for a fansite - you know, to pull the crowd together), but try to stick to the truth. I know that you get some sort of respect at Rage3D when you badmouth NVIDIA, but try to use real stories instead of making them up.

NVIDIA's drivers have problems, but instead of raising real problems that are actually worth discussing, you make up problems where there are none.
 