ATi is Cheating in Filtering

DaveBaumann said:
christoph said:
ATI's answer is just PR... no surprise here. Microsoft WHQL? Well, you don't even need trilinear to pass that test, AFAIK.

There are actually specific IQ tests where you have to be within certain tolerances, and these include trilinear filtering.

But for the most part, you'd have to expect that D3D only requires bilinear samples on most mip map transitions. Point sampling on mip map transitions would obviously not get certification.
 
DaveBaumann said:
christoph said:
ATI's answer is just PR... no surprise here. Microsoft WHQL? Well, you don't even need trilinear to pass that test, AFAIK.

There are actually specific IQ tests where you have to be within certain tolerances, and these include trilinear filtering.

Those are the tests DrawPim and you were talking about a page or two earlier? The ones with wildly different mip levels of vertical and horizontal bars?
 
FUDie said:
How narrow minded! Trilinear is on by default, however the driver/hardware will optimize the filtering when possible. Not real complicated. If the mipmaps aren't box filtered mipmaps (i.e. they are colored), then it reverts to the default setting of trilinear.

So rendering with color mipmaps on is now considered "default" while image rendered normally is special case?
 
Geeforcer said:
So rendering with color mipmaps on is now considered "default" while image rendered normally is special case?

No, by default the GPU/driver does trilinear - as long as no mip-mapped texture's mip levels differ from each other by more than some unknown contrast threshold.

Possibly any "real" texturing at all switches off trilinear? Who knows?
 
The specific suggestion that reviewers use colored mipmaps to verify the behaviour seems kinda slimy.

OK, can anyone confirm for sure that this behaviour only occurs when the application requests autogenerated mipmaps?

EDIT: i.e. if I do my downfiltering myself, will I get full trilinear on all texture stages where I request it?
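For what "box filtering" actually means here, a minimal NumPy sketch (my own code, not anything from ATI or D3D): autogenerated mipmaps are the "typical case" because each level is just the 2x2 average of the level above it.

```python
import numpy as np

def box_filter_mip_chain(texture):
    """Generate a mip chain by 2x2 box filtering - the 'typical case'
    the driver reportedly detects. `texture` is an HxWx3 float array
    with H and W equal powers of two."""
    chain = [texture]
    while chain[-1].shape[0] > 1:
        prev = chain[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        # Average each 2x2 block into one texel of the next level.
        next_level = prev.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))
        chain.append(next_level)
    return chain

levels = box_filter_mip_chain(np.ones((8, 8, 3)))
print(len(levels))  # 4 levels: 8x8, 4x4, 2x2, 1x1
```

If you downfilter yourself with anything other than a box filter (sharpening, gamma-correct averaging, hand-painted levels), your levels would presumably no longer match this model.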
 
Can someone force this "optimized trilinear" on color mip-maps?

Everybody here seems to forget that true trilinear is a trade-off. You get rid of mip map transitions, but the image also gets blurred compared with bilinear, because a lower mip map level is blended in.

Most FX users said "bri" gives a better-looking image, due to bilinear being sharper.

If ATI can offer a filtering mode that deals with mip map transitions (and not just borders) and gives the added "crispness" of bilinear, I'm all for it.
 
Even bilinear is a tradeoff - information from different texture samples is averaged.

The sharpest possible pictures are rendered with point sampling - albeit with a huge tradeoff in overall quality. Not because it isn't sharp, but because it looks unnatural.
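For reference, textbook trilinear is just a linear blend of two bilinear taps weighted by the fractional LOD - which is exactly where the blur comes from, since the coarser level is softer. A sketch (function and parameter names are mine):

```python
import math

def trilinear(sample_bilinear, u, v, lod):
    """Textbook trilinear: blend bilinear taps from the two nearest
    mip levels by the fractional LOD. The blend toward the coarser
    (blurrier) level is what costs sharpness versus plain bilinear."""
    lo = math.floor(lod)
    frac = lod - lo
    a = sample_bilinear(u, v, lo)       # finer level
    b = sample_bilinear(u, v, lo + 1)   # coarser level
    return a * (1.0 - frac) + b * frac
```

"Brilinear" and ATI's variant effectively shrink the range of `frac` values over which any blending happens, keeping pure bilinear over most of each mip band.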
 
db wrote:
There are actually specific IQ tests that you have to be within certain tolerences, these include Trilinear filtering.

All I can find as a requirement concerning textures in the current Microsoft version of the WHQL test specifications is this:

http://www.microsoft.com/whdc/hwtest/pages/specs.mspx

3D Graphics Assertions: Textures
5.554.1 Graphics adapters must support Direct3D compliant MIP-mapped textures.
Reference Documents: PC 2001 System Design Guide, GRPH-0185.1
PC99 Notes: Carries forward assertion 5.2.1; carries forward intent of PC 99 System Design Guide Requirement 14.27.2.

5.554.2 Graphics adapters must support Direct3D compliant bilinear or better filtered textures.
Reference Documents: PC 2001 System Design Guide, GRPH-0185.1
PC99 Notes: Carries forward assertion 5.2.2; carries forward intent of PC 99 System Design Guide Requirement 14.27.2.

5.554.3 Graphics adapters properly support the required Direct3D texture sizes: 8x8, 16x16, 32x32, 64x64, 128x128, 256x256, 512x512, and 1024x1024.
Reference Document: PC 2001 System Design Guide, GRPH-0188
PC99 status: No PC 99 assertion available; carries forward intent of PC 99 System Design Guide Requirement 14.31

5.554.4 Graphics adapters must support the following Direct3D texture combination operations: MODULATERGB, MODULATEALPHA, ADD, and BLEND.
Reference Document: PC 2001 System Design Guide, GRPH-0186
PC99 status: No PC 99 assertion available; carries forward intent of PC 99 System Design Guide Requirement 14.29.

5.554.5 Graphics adapters must support Direct3D texture combination operations simultaneously with fogging and alpha blending.
Reference Document: PC 2001 System Design Guide, GRPH-0186
PC99 Notes: Supersedes PC 99 assertion 5.1.2; supersedes intent of PC 99 System Design Guide Requirement 14.27.1.

5.554.6 Graphics adapters must implement the following color texture formats:
16 bpp non-palletized with a bit pattern of 1:5:5:5 Alpha:Red:Green:Blue
16 bpp non-palletized with a bit pattern of 4:4:4:4 Alpha:Red:Green:Blue
32 bpp non-palletized with a bit pattern of 8:8:8:8 Alpha:Red:Green:Blue
Reference Document: PC 2001 System Design Guide, GRPH-0187
PC99 Notes: No PC 99 assertion available; carries forward intent of PC 99 System Design Guide Requirement 14.30.

and this:
TD-5.22 Texture Tests Description
MipMapped Textures
To verify proper implementation of MIP-mapped textures, the test program first queries the graphics driver via the DirectX HAL to ensure support is present.

Next, the test program conducts an automated verification that MIP maps can be initialized and displayed properly. The following steps are performed using the DirectX HAL device:

A series of three different striped color texture MIP maps are initialized.
The three-level MIP-mapped texture is applied to nine meshes of various sizes and each rendered to the primary surface buffer.
Pixel color values in the primary surface buffer are compared to identical images generated by the Direct3D Reference rasterizer. This comparison confirms that each MIP map is present and can be activated in the rendered image.
A 15% error tolerance is allowed between each hardware-generated image and each Reference rasterizer image. This error value is calculated as the cumulative pixel value differences between any two scenes.
Setup

This information is preliminary. For complete procedures for this test in the current HCT, see the HCT Documentation.

No special hardware setup is required for this test. Microsoft Windows XP operating system must be installed, as well as DirectX version 8 or higher.

Results Interpretation

A pass/fail indication of compliance with this requirement is displayed and logged by the test program. Additional information is generated on failure cases, which includes the details of the original MIP map textures and the specific comparison step that failed.

Texture Filter
This test verifies that the graphics adapter supports bilinear (or better) texture filtering with perspective correction, as opposed to simple point-sampled texturing methods.

The test first verifies that linear texture filtering and perspective corrections are reported capabilities of the hardware. The DirectX HAL device capabilities are queried for confirmation of this feature.

The next phase of this test renders and displays two textured scenes. A striped texture pattern is used for these images.

The leftmost scene activates bilinear texture filtering (HAL device with filtering), while the rightmost scene is generated by the Direct3D Reference rasterizer. The test then automatically confirms that the HAL texture-filtered scene closely resembles the Reference rasterizer for various mesh sizes.

To confirm support for perspective correction, the DirectX HAL device is used to render a square mesh with various texture images applied. The z-value of each individual corner of the mesh is varied to skew each image. A pixel comparison to an identical scene created using the DirectX6 Reference rasterizer is performed.

For both the filter test and the perspective correction test, a 15% error tolerance is allowed between each hardware-generated image and corresponding Reference rasterizer image—this error value is calculated as the cumulative pixel value differences between any two scenes.
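One plausible reading of that 15% rule - the spec quoted above doesn't spell out the exact metric, so the normalization here is my assumption:

```python
import numpy as np

def within_tolerance(hw_img, ref_img, tolerance=0.15):
    """Plausible reading of the WHQL 15% rule (the exact metric is not
    given in the quoted spec): sum the absolute per-pixel differences
    between hardware and reference images, and compare against 15% of
    the maximum possible cumulative difference."""
    hw = hw_img.astype(np.int64)
    ref = ref_img.astype(np.int64)
    cumulative = np.abs(hw - ref).sum()
    max_possible = 255 * hw.size  # 8-bit channels assumed
    return cumulative <= tolerance * max_possible
```

Whatever the precise formula, a cumulative tolerance this loose is why a reduced-trilinear mode could plausibly still pass against the reference rasterizer.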

Texture Management tests
This test assesses driver texture management when run with the following command line switches: "texman -HWTexMan -dx7". This verifies that texture memory is managed correctly for the following types of textures, in the following order:

16 bit RGB textures
16 bit RGB mip mapped textures
8 bit palletized textures
8 bit palletized mip mapped textures
Set LOD on 16 bit mip mapped RGB textures
16 bit RGB Cube mapped textures
16 bit RGB Cube mapped mip mapped textures
8 bit palletized Cube mapped textures
8 bit palletized Cube mapped mip mapped textures.
This test generates 120 textures of each texture type. So for mip mapped 16 bit RGB textures, 120 seven-level mip maps are created, starting at a 256*256 texture. This means that during this test the amount of texture memory allocated ranges from around 7.8 M to over 40 M. Set LOD is tested with 16 bit RGB mip mapped textures only, and purely tests that the Level of Detail or mip level can be set.
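Rough arithmetic behind those figures - a sketch, where the exact per-type sizes are my assumption from the description (the low end lines up with 120 plain 8-bit 256*256 textures):

```python
def chain_bytes(base, levels, bpp, faces=1):
    """Approximate memory for one mip chain: a base x base top level,
    halving the dimensions for each of `levels` levels, at `bpp`
    bytes per texel (times `faces` for cube maps)."""
    total = 0
    size = base
    for _ in range(levels):
        total += size * size * bpp * faces
        size //= 2
    return total

# 120 textures of each type (sizes assumed from the test description):
low = 120 * chain_bytes(256, 1, 1)   # 8-bit palettized, no mips
mips = 120 * chain_bytes(256, 7, 2)  # 16-bit RGB, 7-level mip chains
print(low / 1e6, mips / 1e6)  # ~7.9 MB and ~21.0 MB
```

The cube-mapped cases (six faces per texture) would account for the upper end of the quoted range.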

Setup

This information is preliminary. For complete procedures for this test in the current HCT, see the HCT Documentation.

No special hardware setup is required for this test. Microsoft Windows 98, ME or Windows 2000 operating system must be installed, as well as DirectX version 7 or higher.

Results

A pass or fail indication of compliance with this requirement is displayed and logged by the test program.

Textured lines and textured point tests
Two short tests to verify that the driver applies textures to line and point list primitives correctly. The application renders a number of textured lines or points to completely fill a rectangle in the center of the screen.

Setup

This information is preliminary. For complete procedures for this test in the current HCT, see the HCT Documentation.

No special hardware setup is required for this test. Microsoft Windows XP operating system must be installed, as well as DirectX version 8 or higher.

Results Interpretation

A pass/fail indication of compliance with these requirements is displayed and logged by the test program. All results are recorded in a detailed log file.

Care to point me to the tests you mentioned?
 
ATi's response

Our algorithm for image analysis-based texture filtering techniques is patent-pending. It works by determining how different one mipmap level is from the next and then applying the appropriate level of filtering. It only applies this optimization to the typical case – specifically, where the mipmaps are generated using box filtering. Atypical situations, where each mipmap could differ significantly from the previous level, receive no optimizations. This includes extreme cases such as colored mipmap levels, which is why tests based on color mipmap levels show different results. Just to be explicit: there is no application detection going on; this just illustrates the sophistication of the algorithm.

http://www.theinquirer.net/?article=15971
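A hedged sketch of the idea in that statement - the real, patent-pending algorithm is not public, so the box-filter comparison and the threshold below are my assumptions:

```python
import numpy as np

def looks_box_filtered(finer, coarser, threshold=0.02):
    """Assumed detection step: box-filter the finer level down and
    check whether the result is close to the supplied coarser level.
    Hand-authored or colored mip levels would fail this check."""
    h, w = coarser.shape[0], coarser.shape[1]
    predicted = finer.reshape(h, 2, w, 2, 3).mean(axis=(1, 3))
    return float(np.abs(predicted - coarser).mean()) <= threshold

def choose_filtering(mip_chain):
    # Optimize only if every level-to-level transition matches the
    # box-filter model; otherwise fall back to full trilinear.
    typical = all(looks_box_filtered(mip_chain[i], mip_chain[i + 1])
                  for i in range(len(mip_chain) - 1))
    return "optimized" if typical else "full trilinear"
```

On this reading, colored-mipmap tests would always see full trilinear, exactly as ATI describes - which is also why such tests can no longer tell you what games get.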

Doesn't address why the final output is different from the Software Reference Rasteriser. What are the implications of this?
 
This has been very interesting. Of course, the opinions here seem to be split as you would expect... with maybe the exception that many of the "fanATIcs" seem to be more circumspect about this than most "nVidiots" ever were about any of the stuff nVidia has pulled. While there are some things about this "optimization" that are somewhat alike between what ATI and nVidia do, it needs to be brought into a bit sharper relief...

nVidia's use of brilinear was an attempt to add performance to a sub-par architecture, and it caused major IQ degradation that was noticeable within seconds. ATI has added a feature to the hardware that improves performance and is not noticeable in use... in fact, it has been used for the last year on the RV products with absolutely no complaints, and the RV series is notable - as are all of ATI's current products - for having the best IQ in its class. ATI is pushing the envelope of videocard technology here, not trying to cover up a bad product design.

The fact is that what ATI is doing here, technology-wise, is a very good thing. If what they say is true, they are giving us as good, if not better, quality while increasing the speed. And the fact that the RVs have been doing this for the last year without any complaints only reinforces that it works incredibly well. Now, how they have handled it, and are handling it, is the real crux of the biscuit. The fact is it should be defeatable, period - if for no other reason than so users can get a feel for the technology. And they badly need to explain this new technology as fully as possible. The enthusiast community needs the facts about this ASAP, and that can only help ATI's case here.

I kind of look at this the same way I do adaptive AF, only it's better - much better. Where adaptive AF makes a completely obvious change to IQ compared with regular AF, here the differences are, so far, not visible. More speed without any IQ loss... kind of a holy grail thing. It's a case of ATI using intelligence to solve a problem, and I applaud them. And I'm sure that nVidia will follow suit at some point, just as they have done with adaptive AF. It's a no-brainer.
 
vb said:
If ATI can offer a filtering mode that deals with mip map transitions (and not just borders) and gives the added "crispness" of bilinear, I'm all for it.

The main issue is not the actual filtering imo. It's that Ati is telling everyone to use (look at the pictures from the pdf's in this thread) "coloring mip map" tools for checking the quality of their filtering. And then they implement a new mode of filtering that invalidates these tools. Without any explanation and without offering any way of turning these optimizations off.
 
Bjorn said:
vb said:
If ATI can offer a filtering mode that deals with mip map transitions (and not just borders) and gives the added "crispness" of bilinear, I'm all for it.

The main issue is not the actual filtering imo. It's that Ati is telling everyone to use (look at the pictures from the pdf's in this thread) "coloring mip map" tools for checking the quality of their filtering. And then they implement a new mode of filtering that invalidates these tools. Without any explanation and without offering any way of turning these optimizations off.

Indeed. ATI know damned well that the filtering used in the coloured mip-map synthetic tests will more than likely be different from the actual filtering applied under normal game conditions.

They may not think it looks different, they may or may not be right, that's not the issue. The synthetic tests they are holding up as demonstrating their IQ is demonstrating the IQ under the best possible conditions, conditions which do not hold for a wide range of real-world situations.
 
Bjorn said:
vb said:
If ATI can offer a filtering mode that deals with mip map transitions (and not just borders) and gives the added "crispness" of bilinear, I'm all for it.

The main issue is not the actual filtering imo. It's that Ati is telling everyone to use (look at the pictures from the pdf's in this thread) "coloring mip map" tools for checking the quality of their filtering. And then they implement a new mode of filtering that invalidates these tools. Without any explanation and without offering any way of turning these optimizations off.

That's a pretty valid question, actually, and we should put it to ATI to see what they have to say about it.
 
DaveBaumann said:
Fodder said:
What about the app detection both ATI and Nvidia are doing to avoid AA issues in Far Cry, Splinter Cell, Prince of Persia, etc.?

These applications have issues when AA is forced through the control panel (which has always been a slightly sticky issue with WHQL), and they don't have AA selections in the game. If a game offers AA selections but there are still issues, ATI will not turn it off, because it's something the application can control (and in this case they will try to resolve the issue).

On the subject of application detection, I think there should be a checkbox option in the driver control panel to allow or disallow the detection, with an explanation beside it that some games require the detection to operate correctly (the explanation could link to the driver help, listing the specific games/apps being detected). This is something that should apply to all companies; I'm not singling anyone out.

The same applies to brilinear filtering and variants. A checkbox enabling or disabling the filtering optimisation should be provided to the user and obeyed by the driver at all times.

Give the user the choice, then respect that choice. If the user then suffers poor IQ or poor performance it is because of his choice and his choice alone and he can remedy it if he so chooses.

It really is that simple.
 
All this optimizing stuff is just silly. Instead of investing R&D time in improving texture filtering quality, IHVs develop more and more techniques to degrade basic trilinear filtering even further. This generation's hardware offers much worse filter quality than a plain old 2001 GeForce3.

Hell, not even all the filtering types in D3D are supported in current hardware. IMO there would be a lot of things to improve: bicubic filters, Gaussian filters, smarter filtering and, last but not least, single-cycle filters.
 
If we add a checkbox for everything, you are looking at a total usability nightmare though. And big corporations take these things VERY seriously; we are a minority, and the millions of ordinary users do not want a control panel filled with lots of different (and often confusing) options.

edit: That does not mean that we shouldn't have the option to do stuff btw. Just looking at it from a different angle.
 
OK, about checking ATI's filtering:

1) Bench an application with normal textures vs. colored mipmaps vs. monocolored mipmaps.
2) Use your own texture and let the driver autogenerate mipmaps. The texture has to be prepared so that every generated mipmap differs a lot from the last. Then check whether the transitions are bri- or trilinear.

Do either of these methods seem valid/possible to you guys?
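A texture for method (2) is easy to build: a one-texel checkerboard, whose box-filtered mip level collapses to flat grey, so every autogenerated level differs maximally from its parent. A NumPy sketch (names are mine):

```python
import numpy as np

def checker(size):
    """High-frequency test texture for suggestion (2): a one-texel
    checkerboard. A 2x2 box filter averages it to flat grey, so each
    autogenerated mip level differs strongly from the level above."""
    yy, xx = np.mgrid[0:size, 0:size]
    pattern = ((xx + yy) % 2).astype(float)  # 0/1 checkerboard
    return np.repeat(pattern[:, :, None], 3, axis=2)

tex = checker(8)
# Box-filter one level down: every 2x2 block averages two 0s and
# two 1s, giving uniform 0.5.
mip1 = tex.reshape(4, 2, 4, 2, 3).mean(axis=(1, 3))
print(float(mip1.mean()))  # 0.5
```

If the driver's heuristic keys on level-to-level contrast, a texture like this should force full trilinear, which you could then compare against a normal texture's transitions.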
 
Bjorn said:
The main issue is not the actual filtering imo. It's that Ati is telling everyone to use (look at the pictures from the pdf's in this thread) "coloring mip map" tools for checking the quality of their filtering. And then they implement a new mode of filtering that invalidates these tools. Without any explanation and without offering any way of turning these optimizations off.

I have to agree with that, but from a marketing POV: if they had gone public with this at launch, how many sites would have checked "disable trilinear optimisations" on NV40 drivers - and which would have been fairer?
 
radar1200gs said:
DaveBaumann said:
Fodder said:
What about the app detection both ATI and Nvidia are doing to avoid AA issues in Far Cry, Splinter Cell, Prince of Persia, etc.?

These applications have issues when AA is forced through the control panel (which has always been a slightly sticky issue with WHQL), and they don't have AA selections in the game. If a game offers AA selections but there are still issues, ATI will not turn it off, because it's something the application can control (and in this case they will try to resolve the issue).

On the subject of application detection, I think there should be a checkbox option in the driver control panel to allow or disallow the detection, with an explanation beside it that some games require the detection to operate correctly (the explanation could link to the driver help, listing the specific games/apps being detected). This is something that should apply to all companies; I'm not singling anyone out.

The same applies to brilinear filtering and variants. A checkbox enabling or disabling the filtering optimisation should be provided to the user and obeyed by the driver at all times.

Give the user the choice, then respect that choice. If the user then suffers poor IQ or poor performance it is because of his choice and his choice alone and he can remedy it if he so chooses.

It really is that simple.

Cripes! For about the first time ever I totally agree with radar! ;)

I wonder if the ATI hardware is capable of doing 'true' trilinear in addition to the optimised filtering?
 