Gamespot: ATI admits to same 'tricks' as Nvidia

Bunnyman

Newcomer
http://www.gamespot.com/news/2004/06/03/news_6099873.html


ATI admits to same "tricks" as Nvidia
Graphics company ATI, a longtime critic of competitor Nvidia's video card driver optimizations, concedes to using undocumented optimizations of its own.

After criticizing rival Nvidia for using optimized trilinear filtering algorithms in its PC graphics card drivers--calling them "unacceptable tricks" in a presentation to hardware reviewers last month--ATI Technologies has acknowledged using similar optimizations itself.
 
This is basically a summary of the whole filtering affair with nothing new. Move along, move along.
 
In an online chat hosted by ATI to discuss the allegations, the company admitted that its graphics drivers do contain adaptive filtering optimizations, but vehemently denied using application-specific cheats. Company representatives stated, "Our target is also to avoid any need to detect applications, and as such we have to try to be sure that our image quality remains high in all cases. To achieve this we spent a lot of effort developing algorithms to make the best use of our quality tuning options. This isn’t a performance enhancement applied to popular gaming benchmarks."

When asked about trilinear optimizations in light of ATI's recent disclosure, Nvidia's Brian Burke commented, "In our view, if an optimization produces the correct image while speeding up performance then it is beneficial to the end user and is legitimate. If a change in the driver does not produce the correct image, or functions only in the benchmark, it is either a bug and must be fixed, or a cheat."
 
Basically, ATI is not looking at, say, Far Cry and seeing how they can lower image quality or advanced features to speed up the rendering. They just have this one method that is the same in every game.
 
that "news" from gamespot was complete yellow journalism. What a croc of a headline and first paragraph. Guess when they aske BB they decided to run with it. :rolleyes: what a duff.
 
When asked about trilinear optimizations in light of ATI's recent disclosure, Nvidia's Brian Burke commented, "In our view, if an optimization produces the correct image while speeding up performance then it is beneficial to the end user and is legitimate. If a change in the driver does not produce the correct image, or functions only in the benchmark, it is either a bug and must be fixed, or a cheat."
Oh, man, that's rich. I think that one broke this here irony meter I just cobbled together from an old Palm Pilot Pro and some tape. :LOL: Another sig-worthy quote in a week chock full of them!
 
jvd said:
Basically, ATI is not looking at, say, Far Cry and seeing how they can lower image quality or advanced features to speed up the rendering. They just have this one method that is the same in every game.
Which would make this no different from nVidia's "brilinear."

There's really nothing about ATI's "adaptive trilinear" that makes it more valid to use in benchmarks than nVidia's. They may have some algorithm that selectively disables the method for "optimal image quality," but it's still the same filtering algorithm.
 
Chalnoth said:
Which would make this no different from nVidia's "brilinear."
But I was able to look at screenshots and see the difference in brilinear, I'm even getting a new card just so I can see what it looks like up close and personal.

I think it comes down to whether it's noticeable or not, and I ain't in a position to judge both yet so I'm not going to...but I haven't been able to tell between ATi's trilinear and trylinear so far, and I have been trying.
 
digitalwanderer said:
But I was able to look at screenshots and see the difference in brilinear, I'm even getting a new card just so I can see what it looks like up close and personal.

I think it comes down to whether it's noticeable or not, and I ain't in a position to judge both yet so I'm not going to...but I haven't been able to tell between ATi's trilinear and trylinear so far, and I have been trying.
But that's mostly just because ATI's version, when image quality is set to "high" in the drivers, applies it to a lesser degree.
 
Chalnoth said:
But that's mostly just because ATI's version, when image quality is set to "high" in the drivers, applies it to a lesser degree.
True, but it works out pretty good just for that reason...ATi cheats smarter. ;)
 
Bunnyman said:
When asked about trilinear optimizations in light of ATI's recent disclosure, Nvidia's Brian Burke commented, "In our view, if an optimization produces the correct image while speeding up performance then it is beneficial to the end user and is legitimate. If a change in the driver does not produce the correct image, or functions only in the benchmark, it is either a bug and must be fixed, or a cheat."


Is this quote even related to ATI trilinear optimisations? IIRC, this is a quote from last year when Nvidia was trying to justify their shader replacement policy and how they *never* cheated on application detected benchmarks.

Is Gamespot just cobbling together old quotes from unrelated subjects in order to try and sensationalise a molehill into a mountain? Well, I suppose you have to do something when your "news" is a month behind everyone else's...
 
digitalwanderer said:
Chalnoth said:
But that's mostly just because ATI's version, when image quality is set to "high" in the drivers, applies it to a lesser degree.
True, but it works out pretty good just for that reason...ATi cheats smarter. ;)

Not really, if you look at the xbit.com conclusion.
While in motion it's definitely noticeable.
 
Could you link that conclusion? I can't seem to locate it, and I find it a bit odd that they would claim it is noticeable just now when it has been going on for over a year.
 
Chalnoth said:
There's really nothing about ATI's "adaptive trilinear" that makes it more valid to use in benchmarks than nVidia's. They may have some algorithm that selectively disables the method for "optimal image quality," but it's still the same filtering algorithm.
You're absolutely right, but at the moment there are some good things about ATI's method.

There are some graphics algorithms that rely on a linear gradient across mip-map levels. zeckensack suggested using a trilinear texture lookup in a texture with different shades of grey to get a number that could half compensate for ps2.0's lack of gradient instructions (and it could be faster too depending on what you want to do with it).
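Something like the following CPU-side Python sketch captures the idea (this isn't zeckensack's actual code; the texture size, mip values, and helper names are invented purely for illustration):

```python
# A minimal CPU-side sketch of the "grey mip chain as gradient" trick
# (hypothetical illustration; a ps2.0-era shader would do this on the GPU).
import math

NUM_LEVELS = 8  # mip levels of a 128x128 texture: 128, 64, ..., 1

def grey_mip_value(level):
    """Each mip level stores a constant grey equal to its own index,
    normalized to [0, 1]."""
    return level / (NUM_LEVELS - 1)

def trilinear_sample(lod):
    """Emulate a trilinear fetch: linearly blend the two nearest mip levels."""
    lod = max(0.0, min(lod, NUM_LEVELS - 1))
    lo = int(math.floor(lod))
    hi = min(lo + 1, NUM_LEVELS - 1)
    frac = lod - lo
    return grey_mip_value(lo) * (1.0 - frac) + grey_mip_value(hi) * frac

def hardware_lod(du_dx, dv_dx, du_dy, dv_dy, tex_size=128):
    """Roughly what the rasterizer does: LOD = log2 of the texel footprint."""
    fx = math.hypot(du_dx * tex_size, dv_dx * tex_size)
    fy = math.hypot(du_dy * tex_size, dv_dy * tex_size)
    return math.log2(max(fx, fy, 1e-8))

# The trilinear result encodes the LOD, so it can stand in for a gradient
# magnitude that ps2.0 has no instruction to compute directly.
for deriv in (1 / 128, 1 / 32, 1 / 8):
    lod = hardware_lod(deriv, 0.0, 0.0, deriv)
    grey = trilinear_sample(lod)
    recovered_lod = grey * (NUM_LEVELS - 1)
    print(f"derivative={deriv:.5f}  lod={lod:.2f}  recovered={recovered_lod:.2f}")
```

The reason this relies on a linear gradient across mip levels is clear from the sketch: the recovered value only behaves sensibly if the hardware really does a linear blend between levels, so a reduced or "adaptive" blend would quantize it.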

I also remember reading about a caustics algorithm (from NVidia's developer site, I think) that uses trilinear filtering to get the shade of light correct. You take a mesh and displace the vertices to the locations a ray of light would have refracted, and render the mesh to a lighting texture. If the resulting triangle is small it should be bright, and vice versa.

Finally, some depth of field algorithms use texldb to vary the blurriness, though I'm not a big fan of this method. There could be some vaguely noticeable differences in how the effect behaves.
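As a rough illustration of that kind of texldb-based depth of field, here's a hypothetical Python sketch; the constants and function names are made up, standing in for what the shader and pre-blurred mip chain would do:

```python
# Hypothetical sketch of a texldb-style depth of field: blur is controlled by
# a per-pixel mip bias derived from the circle of confusion.

def circle_of_confusion(depth, focal_depth=10.0, focal_range=5.0, max_bias=4.0):
    """0 at the focal plane, growing with distance from it (clamped)."""
    return min(abs(depth - focal_depth) / focal_range, 1.0) * max_bias

def sample_with_bias(mip_chain, base_lod, bias):
    """Emulate texldb: add the bias to the base LOD, then blend the two
    nearest pre-blurred mip levels with a plain trilinear blend."""
    lod = max(0.0, min(base_lod + bias, len(mip_chain) - 1))
    lo = int(lod)
    hi = min(lo + 1, len(mip_chain) - 1)
    frac = lod - lo
    return mip_chain[lo] * (1.0 - frac) + mip_chain[hi] * frac

# Pretend mip chain of the scene texture: each level is a single value,
# standing in for a progressively blurrier version of the framebuffer.
mips = [1.00, 0.80, 0.60, 0.40, 0.20]

for depth in (10.0, 12.5, 20.0):
    bias = circle_of_confusion(depth)
    print(f"depth={depth:5.1f}  bias={bias:.2f}  "
          f"colour={sample_with_bias(mips, 0.0, bias):.2f}")
```

The blur falloff depends entirely on how the two mip levels get blended, which is where an "optimized" trilinear blend could produce the vaguely noticeable differences mentioned above.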

At least ATI's method doesn't give you any different results here, because they check if it's just an ordinary box filtered mip-map chain or not. However, ATI is potentially missing out on a lot of speed boosts because they're not applying it universally.
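For what it's worth, that "ordinary box filtered mip-map chain" check could look something like this in spirit; ATI hasn't documented the actual heuristic, so the tolerance, names, and structure below are pure guesswork:

```python
# A guess at the shape of such a check (ATI's real heuristic isn't public):
# compare each uploaded mip level against a 2x2 box downsample of the level
# above it; only if every level matches would the adaptive filter kick in.

def box_downsample(level):
    """2x2 box filter: average each 2x2 block of the higher-resolution level."""
    h, w = len(level), len(level[0])
    return [[(level[y][x] + level[y][x + 1] +
              level[y + 1][x] + level[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def looks_box_filtered(mip_chain, tolerance=1.0):
    """True if every mip level is (close to) a box-filtered copy of its parent."""
    for parent, child in zip(mip_chain, mip_chain[1:]):
        expected = box_downsample(parent)
        for row_e, row_c in zip(expected, child):
            for e, c in zip(row_e, row_c):
                if abs(e - c) > tolerance:
                    return False  # custom mip content: fall back to full trilinear
    return True  # standard chain: the adaptive filtering is considered "safe"

# Example: a 4x4 base level and a 2x2 level that matches its box downsample.
base = [[10, 10, 20, 20],
        [10, 10, 20, 20],
        [30, 30, 40, 40],
        [30, 30, 40, 40]]
chain = [base, box_downsample(base)]
print(looks_box_filtered(chain))  # True: the optimization could apply here
```

A developer who uploads hand-authored mip levels (coloured mips, sharpened mips, and so on) would fail a check like this and get full trilinear, which is presumably the point.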

What do you think? Should ATI just go ahead and do like NVidia?
 
1. By default, the driver should always render exactly as the application suggests.
2. The driver may have additional, user-selectable options that enhance image quality or performance.
 
Oh, so you're of the opinion that NVidia should also make normal trilinear default, and then offer the optimizations as a driver option? That's a bit of a surprise to me, but I was probably being presumptuous about your stance on this issue.

Maybe I'm being cynical, but I seriously doubt that will ever happen. If there are enough textures in upcoming games where ATI can't check for box filtering, I'm sure they'll also revert to NVidia's method of globally applying the optimization by default.
 
Mintmaster said:
Oh, so you're of the opinion that NVidia should also make normal trilinear default, and then offer the optimizations as a driver option? That's a bit of a surprise to me, but I was probably being presumptuous about your stance on this issue.

Maybe I'm being cynical, but I seriously doubt that will ever happen. If there are enough textures in upcoming games where ATI can't check for box filtering, I'm sure they'll also revert to NVidia's method of globally applying the optimization by default.

There is no "by default" for ATI, there is only always.
 