DaveH: NVIDIA 44.03 Texture Filtering

CapsLock said:
"Interesting..." ??

The words that come to my mind are more like: sad, outrageous, and pathetic. Of course these are merely subjective reactions and are not indicative of actual gameplay.

Well, I got banned from the [H] forums for stating the same thing that Dave Baumann exposed in this thread; I didn't have this evidence, I just used common-sense reasoning.

Bottom line is there has been a cover-up, even when Brent admits here on this forum that he saw the difference between 'performance/balanced' and 'application/quality' filtering. And with all the history with Quack, and now apparently knowing all along for months, they continued to benchmark the ATI cards using full trilinear while letting NVIDIA get away with driver hacks.

If that is not evidence enough for ATI PR to put them on the bottom of the list for the next card release, I don't know what is.
What do they have to do to get blacklisted? They got caught cheating in the Pentium 4 article, and they got caught again here.
:ROFL:


So "interesting" was the best I could come up with; I want to say lots more.
 
rubank said:
It seems clear that low AF setting together with aggressive LOD gives at least as good IQ while giving clearly higher scores.

Aggressive LOD will lead to texture aliasing in motion.
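
To put numbers on that (this is just the standard mip-selection math from the GL/D3D specs, nothing driver-specific):

```latex
% Mip-level selection with an LOD bias (standard GL/D3D formulation):
\[
\lambda = \log_2 \rho + \mathrm{bias}
\]
% \rho is the pixel's screen-space texel footprint. With bias = -2 the
% hardware picks a level whose texels are 2^2 = 4 times finer than the
% footprint supports, so the texture is undersampled, hence the shimmer
% the moment anything moves.
```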
 
You're right of course, but 4xAA seems to take care of most of that problem, and it is really hard to tell the difference between 2xAF/-2 and 16xAF/0 even in motion (apart from the higher framerate in the first case), and not only in ut2003.

Could be my bad eyesight of course.

rubank

EDIT: oops made a mistake, forget about it :oops:
 
rubank said:
You're right of course, but 4xAA seems to take care of most of that problem, and it is really hard to tell the difference between 2xAF/-2 and 16xAF/0 even in motion (apart from the higher framerate in the first case), and not only in ut2003.

Err, multisampling AA doesn't do anything for texture aliasing.

Could be my bad eyesight of course.

Could be. :)
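
For anyone wondering why: MSAA evaluates the shader, texture fetch included, once per pixel and reuses that one color for all coverage samples, so it only helps edges. A toy model (made-up numbers, obviously not driver code) shows the difference versus supersampling:

```c
/* Toy model: MSAA fetches the texture once per pixel and reuses that
   color for every coverage sample, so an undersampled texture shimmers
   exactly as it does with no AA. Supersampling (SSAA) fetches once per
   sub-sample and does average the aliasing away. */
#include <stdio.h>
#include <math.h>

static double texture_fetch(double u)
{
    return 0.5 + 0.5 * sin(50.0 * u);  /* ~1 cycle per pixel below:  */
}                                      /* well past the pixel Nyquist */

int main(void)
{
    const int pixels = 8, sub = 4;
    for (int p = 0; p < pixels; p++) {
        double msaa = texture_fetch((p + 0.5) / pixels); /* 1 fetch/pixel */
        double ssaa = 0.0;
        for (int s = 0; s < sub; s++)                    /* 1 fetch/sample */
            ssaa += texture_fetch((p + (s + 0.5) / sub) / pixels);
        ssaa /= sub;
        printf("pixel %d  msaa=%.3f  ssaa=%.3f\n", p, msaa, ssaa);
    }
    return 0;  /* the msaa column swings 0..1; ssaa sits near 0.5 */
}
```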
 
Easy AF test?

I just tried out the Tiger Woods Golf 2003 demo, and the bilinearly filtered MIP levels and shimmery AF are quite distinct on my 9100, particularly towards the end of the tee shot, when the camera follows the ball back toward the curvy greens. I'm curious to know how Quality AF on both an ATi and nVidia DX9 card would look--it seems to be an easy test of how both Quality and Performance modes of each IHV fare (in terms of AF'ing at all angles, and their various pseudo-trilinear MIP-map blends).

Edit: I wonder if renaming the executable to "ut2003.exe" would have an effect, too. :)
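
Worth a try: this kind of application detection generally needs nothing more than the executable's file name. Here's a hypothetical Win32 sketch of the check, as an illustration of the idea, not NVIDIA's actual driver code:

```c
/* Hypothetical sketch of an executable-name check a driver could use
   for app-specific behavior. Win32-only; not NVIDIA's actual code. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    char path[MAX_PATH];
    GetModuleFileNameA(NULL, path, sizeof(path)); /* path of this .exe */

    const char *name = strrchr(path, '\\');       /* strip directories */
    name = name ? name + 1 : path;

    if (_stricmp(name, "ut2003.exe") == 0)
        printf("matched ut2003.exe: app-specific filtering path\n");
    else
        printf("%s: default filtering path\n", name);
    return 0;
}
```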
 
Doomtrooper said:
Bottom line is there has been a cover up done even when Brent admits here on this forum that he saw the difference between 'performance/balanced' and 'application/quality' filtering

What does that have to do with the current UT2003 issue? Is the level of filtering being done with Quality selected the same as the filtering done with Performance selected?

I assumed what was going on was that we were getting something between full trilinear (what you normally get with Quality) and what you get with bi/tri when the slider is set to Performance.

Quality          UT2003 on Quality          Performance          High Performance
.........................|........................................................
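
To make "something in between" concrete, here is one hypothetical way a driver could implement a reduced trilinear blend, mixing mip levels only in a narrow band around the transition (the band width here is invented for illustration, not taken from the driver):

```c
/* Hypothetical sketch of a reduced trilinear mip blend: blend the two
   mip levels only inside a narrow band around the transition and use
   plain bilinear everywhere else. Not NVIDIA's actual algorithm. */
#include <math.h>
#include <stdio.h>

/* Fraction of level k+1 mixed in, given the LOD. band = 0.5 reproduces
   full trilinear; band -> 0 degenerates to bilinear with a hard mip line. */
static double mip_blend(double lod, double band)
{
    double f = lod - floor(lod);                  /* fractional LOD  */
    if (f < 0.5 - band) return 0.0;               /* pure level k    */
    if (f > 0.5 + band) return 1.0;               /* pure level k+1  */
    return (f - (0.5 - band)) / (2.0 * band);     /* short ramp      */
}

int main(void)
{
    for (double f = 0.0; f < 1.0; f += 0.125)
        printf("f=%.3f  trilinear=%.3f  reduced=%.3f\n",
               f, mip_blend(f, 0.5), mip_blend(f, 0.15));
    return 0;
}
```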
 
Dave H said:
Err, multisampling AA doesn't do anything for texture aliasing.

Yeah, I said I made a mistake.
But still, I see very little difference, and while this may bother others, I can just let my sore eyes continue to deceive me :)

rubank
 
StealthHawk said:
What does that have to do with the current UT2003 issue? Is the level of filtering being done with Quality selected the same as the filtering done with Performance selected?

I assumed what was going on was that we were getting something between full trilinear (what you normally get with Quality) and what you get with bi/tri when the slider is set to Performance.

Quality          UT2003 on Quality          Performance          High Performance
.........................|........................................................

Looks like Balanced, or the now-renamed Performance mode, to me... Performance is a mixed bag of bilinear and trilinear... and amazingly it looks just like Balanced when looking at Dave's filtering test.

D3D tester renamed to ut2003.exe: Bilinear/Trilinear

[image: ut2003.jpg]



FX 5800 Balanced Mode From Beyond3D's Preview

[image: bal_1x.gif]


D3D tester not renamed: Trilinear

[image: d3d.jpg]


So it has LOTS to do with it.

http://www.beyond3d.com/previews/nvidia/gffxu/index.php?p=20
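
For anyone unfamiliar with how these filtering testers produce those shots: each mip level is filled with a flat, distinct color, so whatever blend the driver actually applies becomes directly visible. Hard color bands mean bilinear; smooth gradients mean trilinear. A minimal sketch of the idea (colors and sizes are arbitrary):

```c
/* Sketch of a colored-mip filtering tester: flag each mip level with a
   flat color, then render a receding plane. The driver's real blend
   paints the screen; hard bands = bilinear, gradients = trilinear. */
#include <stdio.h>

static const unsigned char mip_color[][3] = {
    {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
    {255, 255, 0}, {0, 255, 255}, {255, 0, 255},
};

/* Fill one level of an RGB mip chain with its flag color. */
static void colorize_level(unsigned char *texels, int count, int level)
{
    const unsigned char *c = mip_color[level % 6];
    for (int t = 0; t < count; t++) {
        texels[t * 3 + 0] = c[0];
        texels[t * 3 + 1] = c[1];
        texels[t * 3 + 2] = c[2];
    }
}

int main(void)
{
    unsigned char level0[16 * 16 * 3], level1[8 * 8 * 3];
    colorize_level(level0, 16 * 16, 0);
    colorize_level(level1, 8 * 8, 1);
    printf("level 0 texel: %u %u %u\n", level0[0], level0[1], level0[2]);
    return 0;
}
```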
 
Yes, it looks close... I was wondering more how it compares to the current Performance/Balanced option, since the algorithms have changed since 43.51. Any chance you could test this for us, Dave? :D
 
Funny thing is, didn't [H] mention in an earlier review that they could see the difference between Balanced and Quality?
And now they apparently cannot....
 
Yes they did: :rolleyes:

http://www.hardocp.com/article.html?art=NDQ0LDI=



However, when running in the recommended setting of "Aggressive" mode on the NVIDIA graphics cards we found they do not use Trilinear Filtering. Instead, the card utilizes NVIDIA's own per-pixel adaptive algorithm whose resulting image quality is better than Bilinear Filtering, but certainly not up to Trilinear Filtering. This Aggressive setting positioned the NVIDIA GFFX cards at a distinct frame rate performance advantage, compared even to their own "Balanced" mode that is lacking true Trilinear Filtering, due to the fact that they are not saddled with the workload of Trilinear Filtering. As additional testing revealed, the average frame rate performance of the NVIDIA graphics cards dropped significantly once Trilinear Filtering was forced by application settings.

We soon realized that NVIDIA's Aggressive settings did not reflect the IQ we have come to expect and it surely did not equate to fair "apples-to-apples" benchmarks. So we set out to figure what exactly was going to represent a fair comparison between our ATI and NVIDIA cards while preserving the IQ we all find so important, or better yet that quality that gives us a "cinematic experience".

Moving to the "Balanced" screenshot in the middle, we see that the situation has improved. Here, we are beginning to see the drastic line of distinction between each mip-map gradually fade. This indicates that a higher level of filtering is present, though the degree of that filtering level is questionable. Though some blending is taking place, the colors remain very concentrated and fail to mix enough to avoid showing the same line of distinction. Unfortunately, things remain roughly the same once we enable 8X AF. Although NVIDIA claims that Balanced mode will use Trilinear Filtering when AF is present, we can clearly see that "full" Trilinear Filtering is not present. Instead, we are faced with NVIDIA's own adaptive algorithm which fails to provide the desired level of image quality.

Moving to "Application" mode, we are forcing the application itself to dictate the level of quality which will be used. Through using this mode and selecting Trilinear Filtering within the game, we are assured that this level of image quality will be used. Viewing the first image above, we finally see a situation where Trilinear Filtering is present in its full capacity. With this setting selected, the mip-map colors effectively meld with one another to create a smooth transition throughout the scene. Once 8X AF is selected, we are again presented with the level of image quality we have been searching for. Here, each color seamlessly blends into the next creating a very clean image. This is the setting that illustrates that "application" mode is the only true way to compare the NVIDIA cards with the ATI cards running Trilinear Filtering.


[Screenshots: ATI vs. NVIDIA Comparison, Quality Settings (Apples-To-Apples At Last)]

As you can see above we have now found a level of image quality that we can say is "fair" for benchmarking. You can see the GeForce FX 5600 Ultra's image is now on par with the Radeon 9500 Pro by referencing the above mip-map paths and transitions. True Trilinear Filtering is enabled within the game along with maximum image quality settings. The 9500 Pro is running in "Quality" mode whereas the GeForce FX 5600 Ultra is utilizing "Application". Now that we know that NVIDIA is handling filtering in ways that we are used to, be it wrong or right, we can adjust and make sure that we have comparable frame rate data.


In an effort to pull this together a bit more, we have spliced the screenshots together to form one image to contrast the filtering used. You can click on this image to see the full 1024x768 version which highlights this issue even further. In each case, the image was run using each of the three quality modes with maximum image quality settings and Trilinear Filtering selected within the game. For this series of screenshots, no level of FSAA or AF was enabled.

[image: 1047311589S4MzihRI0V_2_10.jpg]



[H] --> :LOL:
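
For reference, the "full" Trilinear Filtering those colored-mip screenshots test for is the textbook blend of bilinear fetches from the two nearest mip levels (standard formulation, not quoted from the article):

```latex
% Textbook trilinear filtering: bilinear fetches from the two nearest
% mip levels, blended by the fractional LOD.
\[
k = \lfloor \lambda \rfloor , \qquad f = \lambda - k
\]
\[
C_{\mathrm{tri}} = (1 - f)\, C_{\mathrm{bilin}}(k) + f\, C_{\mathrm{bilin}}(k+1)
\]
% Any "adaptive" scheme that clamps or narrows f leaves stretches where
% f sticks at 0 or 1, which show up as the hard bands in the shots above.
```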
 
Good memory. I like the last paragraph in that article...

Before you leave this page, please keep in mind that NVIDIA is in no way bound to use Trilinear Filtering. There is no doubt in my mind that we will see better and different ways of doing these things in the future and this is possibly a beginning of that. But we are in the position to make sure that when we deliver comparative data to you, that we do it in the fairest possible manner. So for this article, we are going to show you how the GFFX family plays out using several different IQ settings and then we will look at some images. You can then make up your own mind on whether or not NVIDIA's lack of Trilinear Filtering in their Aggressive and Balanced modes really makes a difference that you care about.

The emphasis is mine. Had they made mention of the adaptive trilinear filtering in UT2003 in the first place and added something similar to the paragraph above, we wouldn't be where we are today. BTW, Sean Pelletier wrote that article. We've heard Kyle's and Brent's comments on this latest issue, but I would be interested to also hear what Sean had to say.

Tommy McClain
 
Here's another one that makes you wonder...

http://www.hardocp.com/article.html?art=NDQ0LDc=

It becomes very clear to us now why NVIDIA would suggest that the reviewers of their GFFX products use the "Aggressive" quality setting as opposed to any other. Clearly if this was the only setting that you used to qualify the card, you might be writing a very positive conclusion for your readers that could be very possibly off target. It is our opinion that it would be morally objectionable to compare the NVIDIA GFFX at Aggressive settings with an ATI 9500Pro utilizing Trilinear Filtering without having full disclosure of the facts surrounding image quality.

Emphasis again is mine. One could assume that NVIDIA took these comments to heart and figured they would no longer give the user a choice in the matter by special-casing UT2003 and removing the Application setting from the drivers. I really like the last sentence [H] made (not being sarcastic). Unfortunately they weren't consistent with this belief when they tested the GeForce FX 5900 Ultra. Maybe NVIDIA was hoping this would happen? Either way, this is the main reason why I, ATI, and others had a problem with the original review.

Tommy McClain
 
AzBat said:
BTW, Sean Pelletier wrote that article. We've heard Kyle's and Brent's comments on this latest issue, but I would be interested to also hear what Sean had to say.

Tommy McClain


You know, Sean came from NVNews, and I'm sure some people here expressed the opinion that he would be more nVidia-centric than Brent.

Funny to think about that now.
 
You know, Sean came from NVNews, and I'm sure some people here expressed the opinion that he would be more nVidia-centric than Brent.

Times change. Community support changes.

The only thing that has really changed between now and the past 5+ years is that there are more than 3-4 people pointing out the facts and lobbying the result. That environment was much more accommodating for someone to let their true colors fly and go without challenge.

Today, everything is pretty much the same, but people are a bit more cautious and have to yield to what the audience understands. Subtlety has become key as the focus has shifted and the public mindset has changed. Stating even esoteric things results in an almost lynch-mob style response (see the prior [H] articles). Whereas in the past, publishing such things would only require the effort of banning/squelching a handful of people.
 
So you guys are 100% sure that when Quality is selected it is in fact dropping down to Performance mode?

The case is not that it looks "more or less" like Performance mode, but that it is Performance mode? All I will say is that so far that has not been established scientifically. What has been presented are some old shots from an old driver, back when NVIDIA used poorer AF algorithms, and the observation that in UT2003 the filtering pattern looks close (but not exact!) to that old pattern.

When Performance mode is selected in the driver control panel, does it do tri/bi filtering of the detail textures, or only bilinear? If the driver really is just dropping down to Performance mode, isn't this likely to be a driver bug? In which case, the whole fuss was made over nothing?
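
Establishing it scientifically wouldn't be hard, for what it's worth: dump the same frame once with Quality and once with Performance selected, then compare them pixel for pixel. Identical dumps would prove Quality really is Performance; "looks similar" proves nothing. A rough sketch (the file names and raw-RGB dump format are assumptions):

```c
/* Sketch of the comparison being asked for: byte-diff two raw RGB
   framebuffer dumps of the same frame, one captured in Quality mode
   and one in Performance mode. File names/format are hypothetical. */
#include <stdio.h>

int main(void)
{
    FILE *a = fopen("quality.raw", "rb");
    FILE *b = fopen("performance.raw", "rb");
    if (!a || !b) { perror("open"); return 1; }

    long total = 0, diffs = 0;
    int ca, cb;
    while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
        if (ca != cb) diffs++;
        total++;
    }
    printf("%ld of %ld bytes differ\n", diffs, total);
    fclose(a); fclose(b);
    return 0;  /* 0 differing bytes: the two modes render identically */
}
```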
 
StealthHawk said:
When Performance mode is selected in the driver control panel does it do tri/bi filtering of the detail textures, or only bilinear? If the driver really is just dropping down to Performance mode, isn't this likely to be a driver bug? in which case the whole fuss was made over nothing?

But it's app-detected. Haven't people seen the same thing with any app renamed to ut2003.exe? It seems too much of a coincidence that this "bug" would happen at the same time as NVIDIA marketing is crowing about a "30 percent increase in speed" on this driver set, and in UT2003 in particular.

It would have to be a pretty amazing bug that causes the same speed increase in the same game that NVIDIA happened to put into their marketing slides.
 