DaveH: NVIDIA 44.03 Texture Filtering

Bouncing Zabaglione Bros. said:
StealthHawk said:
When Performance mode is selected in the driver control panel, does it do tri/bi filtering of the detail textures, or only bilinear? If the driver really is just dropping down to Performance mode, isn't this likely to be a driver bug? In which case, wasn't the whole fuss made over nothing?

But it's app-detected. Haven't people seen the same thing with any app renamed to UT2K.exe? It seems too much of a coincidence that this "bug" would happen at the same time as Nvidia marketing is crowing about a "30 percent increase in speed" on this driver set, and in UT2K in particular.

It would have to be a pretty amazing bug that causes the same speed increase in the same game that Nvidia happened to put into their marketing slides.

I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.
 
It's an old image, but it's something that Kyle will understand.

joke.jpg
 
StealthHawk said:
I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.

StealthHawk, how can anyone consider this a bug and say it with a straight face, when renaming the Direct3D Anisotropic tester to UT 2003 changes the slider option back to 'Performance' filtering? When you don't rename the filtering tester, it shows trilinear or quality :LOL:

If that is not evidence, what is... a 'bug' only when it sees UT 2003... sure.
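
For anyone who wants to reproduce the rename test themselves, here is a minimal sketch in Python (the tester's file name is an assumption; point it at wherever Xmas' D3D AF tester actually lives):

Code:
# Minimal sketch of the rename test (file names are assumptions).
# Copy the AF tester under the name the driver reportedly detects,
# then run both copies and compare the filtering pattern each displays.
import shutil
import subprocess

TESTER = "D3DAFTester.exe"   # placeholder name for Xmas' filtering tester
RENAMED = "UT2003.exe"       # the name the 44.03 driver appears to detect

shutil.copyfile(TESTER, RENAMED)
for exe in (TESTER, RENAMED):
    print("Running", exe, "- note the filtering mode it displays")
    subprocess.run([exe])

If the two copies show different filtering with identical control panel settings, the only variable left is the executable's name.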
 
Doomtrooper said:
StealthHawk, how can anyone consider this a bug and say it with a straight face, when renaming the Direct3D Anisotropic tester to UT 2003 changes the slider option back to 'Performance' filtering? When you don't rename the filtering tester, it shows trilinear or quality :LOL:

If that is not evidence, what is... a 'bug' only when it sees UT 2003... sure.

I seem to remember that you believed that the old Quack problem was just a bug. And that was also just a renamed .exe and a driver "not recognising the texture slider" problem. And just in that specific game.

Now, let's not go there yet again, but just see if Nvidia can get the performance back without this "optimization" in the new set of drivers that are supposedly coming out soon.

If they can, then surely it must be just a bug? :)
 
Well, obviously they can't, Bjorn, we have been down that road before... forcing trilinear in UT 2003 has been done using the anti-detect scripts, and the performance hit was significant.

Funny how people bring up Quack a lot when losing arguments. Quack spawned from that same biased site that has not mentioned the NV 'hacking/optimizations' found not in just one title, but in all titles... including synthetic benchmarks.

The ATI optimizations affected some maps and a total of 5 textures; here we have multiple titles with very poor filtering, and guess what... no mention of it besides 3DVelocity.
Quack is mild compared to the crap image quality below (and that's going up against ATI's worst image quality):

http://www.3dvelocity.com/reviews/gffx5800u/gffx_5.htm
 
StealthHawk said:
I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.

I do, and all the evidence you need is here:


Example

Antalus
1280x1024 Aggressive - 148.3 FPS
1280x1024 Balanced - 134.2 FPS
1280x1024 Application - 92.8 FPS

1600x1200 Aggressive - 108.5 FPS
1600x1200 Balanced - 95.5 FPS
1600x1200 Application - 65 FPS

http://www.beyond3d.com/forum/viewtopic.php?t=4772&highlight=anisotropic+filtering

35%+ performance gains...
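
For reference, here's a quick sketch that works out those gains from the Antalus numbers above:

Code:
# Percentage gains of Aggressive/Balanced over Application,
# computed from the Antalus figures quoted above.
results = {
    "1280x1024": {"Aggressive": 148.3, "Balanced": 134.2, "Application": 92.8},
    "1600x1200": {"Aggressive": 108.5, "Balanced": 95.5, "Application": 65.0},
}
for res, fps in results.items():
    base = fps["Application"]
    for mode in ("Aggressive", "Balanced"):
        gain = (fps[mode] - base) / base * 100
        print(f"{res} {mode}: +{gain:.0f}% over Application")

Balanced alone comes out 45-47% faster than Application at these two resolutions, so 35%+ is, if anything, conservative.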
 
Bjorn said:
I seem to remember that you believed that the old Quack problem was just a bug. And that was also just a renamed .exe and a driver "not recognising the texture slider" problem. And just in that specific game.
....

Considering that "Quack" is separated from the present by a considerable span of time, dealt with different APIs and API versions, different software and different hardware, and was corrected to everyone's satisfaction long ago, what has "Quack" remotely got to do with the current situation regarding nVidia--which is ongoing, unacknowledged by the company, and unresolved?

I can talk about other things ATi did years ago relative to benchmarks back when 2d was the norm for games--so what? It's ancient history. What is compelling is what is going on *now*, don't you think?

I mean, basically, prior cheating by nVidia, ATi, 3dfx, whomever--is absolutely no defense, nor is it a justification for the things occurring now relative to these *current* issues, which have their own distinct identity and their own unique ramifications in the present.

So, whatever one's interpretation might be of historical events that have long been resolved, it should never be confused with or compared to current events. To that end I object to even raising the subject, as its only possible contribution is to cloud and obscure a clear and precise understanding of the present events.
 
Doomtrooper said:
Well, obviously they can't, Bjorn, we have been down that road before... forcing trilinear in UT 2003 has been done using the anti-detect scripts, and the performance hit was significant.

Funny how people bring up Quack a lot when losing arguments. Quack spawned from that same biased site that has not mentioned the NV 'hacking/optimizations' found not in just one title, but in all titles... including synthetic. ..

Maybe they can't. I don't know and I don't really care that much either. Btw, the reason that I brought up Quack had nothing to do with the problem itself, only with your opinion about it and how you seemed to have changed your mind lately.

I can talk about other things ATi did years ago relative to benchmarks back when 2d was the norm for games--so what? It's ancient history. What is compelling is what is going on *now*, don't you think?

See above..
 
Bjorn said:
Maybe they can't. I don't know and I don't really care that much either. Btw, the reason that I brought up Quack had nothing to do with the problem itself, only with your opinion about it and how you seemed to have changed your mind lately.

....See above..

I'm really not trying to split hairs here; I think this is an important point.

As we all know, the "Quack" issue and the issue of the improprieties nVidia has manifested over the last several months are two separate issues, not identical in character or substance, and separated by a lot of time.

If Doom, or anybody else, has a different opinion on each of these issues, how is that surprising considering that they are not the same issue in the first place? Different issues often elicit different opinions, don't they?

Therefore, having different opinions on different issues is not contradictory. Right?
 
Doomtrooper said:
StealthHawk said:
I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.

StealthHawk, how can anyone consider this a bug and say it with a straight face, when renaming the Direct3D Anisotropic tester to UT 2003 changes the slider option back to 'Performance' filtering? When you don't rename the filtering tester, it shows trilinear or quality :LOL:

If that is not evidence, what is... a 'bug' only when it sees UT 2003... sure.

I'm not saying it's a bug. I'm saying there is a possibility that it is a bug.

It's called a software side effect. It's one reason I'm against application-specific optimizations.
 
StealthHawk said:
I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.

Yes they would, and they are. This is not a bug that would pass their competitive analysis, let alone QA.
 
SpellSinger said:
StealthHawk said:
I don't think even NVIDIA would just lower the slider on purpose. I would actually believe it's a bug if that were what was happening.

Yes they would, and they are. This is not a bug that would pass their competitive analysis, let alone QA.

Why not? According to [H] there is no discernible IQ loss :LOL:

Seriously though, I never said it absolutely was a bug. I said it could be a bug. Big difference. You guys are jumping on the bandwagon way too early, with no evidence. Benchmark scores from old drivers don't prove what NVIDIA is or is not doing in these drivers.

As I outlined at R3D, the claims being made are simple to prove.

1) Take a screenshot of Xmas' filtering program with the driver set to Quality and the executable renamed to UT2003.exe.
2) Take a screenshot of Xmas' filtering program with the driver set to Performance.
3) Compare; if they are exactly the same, then more testing needs to be done. If they are not exactly the same, then it should be clear that NVIDIA is not just moving the IQ slider down, but is possibly lowering quality to around the level of Performance (see the sketch after this list).
4) Benchmark UT2003 with the driver set to Quality.
5) Benchmark UT2003 with the driver set to Performance.
6) If the filtering patterns are the same, and the benchmark scores are the same, then NVIDIA is lowering the slider down to Performance mode in the case of UT2003. Otherwise, they are doing something else.
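
For steps 1-3, the pixel-exact comparison is easy to script. Here's one way to do it, a minimal sketch using Pillow (the screenshot file names are just placeholders):

Code:
# Sketch of step 3: check whether two filtering screenshots are
# pixel-identical. File names are placeholders for the shots from
# steps 1 and 2.
from PIL import Image, ImageChops

quality_renamed = Image.open("quality_ut2003_renamed.png").convert("RGB")
performance = Image.open("performance.png").convert("RGB")

diff = ImageChops.difference(quality_renamed, performance)
if diff.getbbox() is None:
    print("Pixel-identical: proceed to the benchmark runs in steps 4-6.")
else:
    print("Not identical: NVIDIA is not simply dropping to Performance.")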

It should not take that long to verify whether your theory is correct or not, but until you or someone else takes these steps you have not scientifically proven anything. Right now you are comparing apples (new driver, new filtering algorithms) to oranges (old driver, old filtering algorithms).

Specifically, it is not clear right now whether 44.03 is dropping straight down to Performance in UT2003 when Quality is set, or whether it is approximately dropping down to Performance, or whether it is doing something else entirely. AFAIK Performance provides bi/tri, period. AFAIK Quality in UT2003 is doing trilinear on default textures and bilinear on detail textures. Please correct me if I'm wrong.

edit: fixed quote

edit: Well, it looks like it is not dropping down to Performance, because Performance behaves differently: http://www.beyond3d.com/forum/viewtopic.php?t=7072 So much for that theory :LOL:
 
StealthHawk, the 44.03 drivers effectively removed the "Performance" option, it seems. Look at Tom's review: the Performance and High Performance options both nearly drop back to full bilinear, not the half-and-half mode that was the old "Balanced" mode, or what they are doing in UT2003.

[edit] - oh, didn't notice your last edit!
 
Well, that's very interesting. For some reason I thought "Balanced" was still doing tri/bi as well as having an improved algorithm :oops:

However, I would still hesitate to say that what we're seeing in UT2003 is old school Balanced for the following reasons:

Comparing the two texture filtering shots Doom posted, you can see that they look very close, but they are not the same.

Back with the pre-43.51 drivers, Balanced's AF algorithm was much poorer than Quality's at clearing up textures. The new Balanced setting is much better, except it apparently is not doing tri/bi anymore. The new Balanced setting clears up textures almost as well as Quality/Application, trilinear notwithstanding.

In conclusion, we are seeing better IQ than what "Balanced" provides in 44.03, as well as better IQ than what you would get from the old driver's Balanced. I had this typed up a bit better, but the forums ate my post :( Anyway, it certainly doesn't look like there is even a possibility of this being a bug :)
 
StealthHawk said:
Seriously though, I never said it absolutely was a bug. I said it could be a bug. Big difference. You guys are jumping on the bandwagon way too early, with no evidence. Benchmark scores from old drivers don't prove what NVIDIA is or is not doing in these drivers.

I don't see how it could be a bug. Bugs are unexpected errors that are discovered either during or after the development/QA process. The fact that Nvidia PR is *advertising* the effects (the same 30 percent frame-rate increase, in the same affected application, UT2K) means that they knew about this "bug" in detail, well in advance. This cannot be an accidental error. It must be a deliberate cheat.
 
Dave H said:
While I would think they might mention it elsewhere, it bears noting that in the excerpt you posted, Nvidia doesn't even mention bilinear vs. trilinear, only their AF kernel. (Incidentally, I'm sure you're not supposed to post the whole thing, but that reviewer's guide looks interesting. It's always fun to see the sleaze Nvidia tries to smear around.)

Sorry to harp on about this, but I found the quote again whilst digging through the NV35 documentation for the NV35 preview.

The following grab is from the document: "GFFX_5900_Overview_041803_v2.pdf" handed out to press:

5900_overview.gif
 
official Nvidia marketing said:
NVIDIA Intellisample HCT includes a "quality" mode that delivers true anisotropic filtering...

Ok, that's pretty explicit. I didn't doubt such a statement was made, although it's interesting to see it doesn't offer even a nitpicky semantic loophole. And, if I've got the timeline correct, this document was released at the same time as the 44.03 drivers, correct?

DaveBaumann said:
Sorry to harp on about this,

Not at all. Informative posts like this are tons more valuable than page after page of outraged screeds before all the facts are in.
 
Dave H said:
And, if I've got the timeline correct, this document was released at the same time as the 44.03 drivers, correct?

I believe so, but I think we had various builds of the drivers. Either that or my mind is failing me, since Balanced is not an option on the current 44.03 drivers.
 