ATI cheating on anisotropic filtering performance

Snarfy said:
Oh, god, if you'd actually read any of the other fifty threads that have brought up the Far Cry pictures, you'd know the answer lies in how the 1.1 patch changed the NV paths to use PS1.x instead of 2.0. This is game specific. If the card's IQ were really that bad, have you not wondered why it doesn't occur in other games? Being uninformed, quick to judge, and finding evidence to reinforce your prejudices isn't hard. Neither is cutting'n'pasting other people's findings without understanding the context. Well done :)
 
I think most of the reviews I read used brilinear for the GFFX and the trilinear/bilinear mix for the Radeons. Most reviewers don't really mention what they did.

3DCenter used full trilinear on all stages for both cards; that's why I liked it.
 
This is game specific.

Hey, you're right! Kind of like how renaming farcry.exe to fartcry.exe lowers performance on nVidia drivers! It's game specific!

Hmm, maybe they did that because they were getting spanked in Far Cry? :?
 
Hmm, this is the first time I've heard that the forced trilinear anisotropic filtering on my Radeon 9700 is actually only trilinear on the first texture stage.

I remember times when people here were furious about bilinear being forced on people. Yet the lack of trilinear filtering beyond the first stage is being put aside like it's nothing. Sorry to say, but it smells around here.
 
Chalnoth said:
With quotes like:
Tomshardware.com said:
As we will see repeatedly throughout our benchmark section, anisotropic filtering is the greatest strength of the new Radeon X800 cards.
You'd think that some reviewers would have bothered to pay attention to how each card applied its anisotropic filtering. Fortunately, one did:
http://www.beyond3d.com/reviews/ati/r420_x800/index.php?p=13#tex

Notice that while the benchmarks around the web are sure to have used nVidia's "highest-quality" setting, a setting which removes the "brilinear," ATI is still using full bilinear filtering on texture stages other than the base texture.

Chalnoth:

What happened? We all have our preferences, and judging from your past posts yours happens to be nVidia, which is fine. I don't believe I've seen you make quite this inflammatory a post before, though. You are a smart guy, and given the number of posts you've made (compared to my measly 300-400), I'd assume you knew this was the case with ATI drivers since well before the X800XT was launched. Now, I'm not saying this is a non-event, but it is certainly not the huge revelation your thread title makes it out to be.

So anyway, for now I'll ignore the juvenile tone of your post and actually focus on the issue. I have mixed feelings about it. Ever since the driver developers moved from the "bilinear" and "trilinear" labels to "performance" and "high quality" ones, this kind of stuff has started happening. In that respect it's not exactly lying or cheating: they're simply offering a "high quality" setting that may or may not be the highest quality the card can produce. Obviously one goal is to make the high quality setting actually display high-quality IQ, but the other is to make it as fast as possible. ATI must have felt that dropping trilinear on the other texture stages didn't hurt image quality enough to justify its inclusion.

What is an important distinction for me is that if a program requests trilinear, the driver does in fact deliver trilinear. In this way, ATI is making a distinction between real trilinear being requested and the "high quality" mode in the driver control panel.

So where does this leave us? Well, ever since the move to the ambiguous labels in the drivers, it's been pretty much impossible to benchmark nVidia's "quality" settings against ATI's "quality" settings, as each one takes different shortcuts. Neither is really "cheating" per se (in this case) because they never claim to be doing anything specific. It just means that, unfortunately, reviewers trying to compare the different high quality modes will have to note that they do slightly different things, and check whether or not the images being produced are comparable.

On the other hand, if the application (or, for that matter, a user) requests trilinear filtering on all texture stages and doesn't get it, then we have a problem. The same goes for replacing shaders, inserting clip planes, or other mischief meant to deceive the user. I don't think the case mentioned in this thread is in the same category.
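For what it's worth, reviewers don't have to take either control panel's word for it. The classic way to see which filter you're actually getting is a colored-mipmap test, which is essentially what tools like the one behind the Beyond3D page linked above do. Here's a minimal sketch of the idea in old fixed-function OpenGL (my own illustration, not anyone's actual tester): each mip level is a solid color, so true trilinear fades smoothly between colors on a receding floor, while bilinear mip snapping shows hard bands.

#include <GL/glut.h>
#include <GL/glu.h>
#include <vector>

// Each mip level gets a distinct solid color. With true trilinear the floor
// fades smoothly from one color to the next; with bilinear (point sampling
// between mips) you see hard bands at the mip transitions.
static void uploadColoredMipmaps()
{
    static const unsigned char colors[9][3] = {
        {255,0,0},{0,255,0},{0,0,255},{255,255,0},{255,0,255},
        {0,255,255},{255,255,255},{128,128,128},{64,64,64}
    };
    int size = 256; // levels 256..1 -> 9 mip levels
    for (int level = 0; size >= 1; ++level, size /= 2) {
        std::vector<unsigned char> texels(size * size * 3);
        for (int i = 0; i < size * size; ++i)
            for (int c = 0; c < 3; ++c)
                texels[i*3 + c] = colors[level][c];
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, &texels[0]);
    }
}

static void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_QUADS); // a long textured floor receding into the distance
    glTexCoord2f(0, 0);  glVertex3f(-5, -1, -1);
    glTexCoord2f(8, 0);  glVertex3f( 5, -1, -1);
    glTexCoord2f(8, 64); glVertex3f( 5, -1, -200);
    glTexCoord2f(0, 64); glVertex3f(-5, -1, -200);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("mip band test");
    glEnable(GL_TEXTURE_2D);
    uploadColoredMipmaps();
    // GL_LINEAR_MIPMAP_LINEAR = trilinear; swap in GL_LINEAR_MIPMAP_NEAREST
    // to see what bilinear-between-mips looks like.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glMatrixMode(GL_PROJECTION);
    gluPerspective(60.0, 1.0, 0.1, 500.0);
    glMatrixMode(GL_MODELVIEW);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

Run it, then force the various driver modes from the control panel and watch whether the color transitions blur or stay sharp.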

Nite_Hawk
 
Snarfy said:
Natoma said:
Nick Spolec said:
But I still think they should give the user the option to enable full Trilinear across all texture stages.

Maybe someone *cough* Kombatant *cough* can spill the beans on a future CAT release. :D

You mean the new Control Center? I saw a pic of it; looks nice. Haven't heard if it lets you enable tri across the board. Could be; the Cats have always been great drivers. :D

This thread title should read "Chalnoth, upset ATI released a part on par with the revolutionary NV40 that will beat NVIDIA to market, tries to incite a riot in a Beyond3D thread."

haha, amen.

silly Chalnoth. i bet in prison, he'd be bubba's favorite cell mate :oops:

If IQ isn't noticeably different then I don't see a problem. However, I'd expect the people who agree with this to give Nvidia the same benefit when they 'optimise' without any noticeable difference (noticeable not being blow-ups of individual frames with microscopic differences).

I suppose this counts as noticeably different, then:

http://www.tomshardware.com/graphic/20040414/geforce_6800-46.html

You can tell the 6800's quality instantly. In motion, I'm sure it's positively nauseating. Since the 6800 won't be out till sometime in June, we have some time before the horrible videos show up. Thank god.

But if you also look at this Splinter Cell screenie, you can see that ATI has flaws as well. Look at the shadows in the NV pic; they are missing in all the ATI pics:
http://hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNV80X2wuanBn
 
I do not see a missing shadow in the ATI picture, just my opinion. It does have some shadows that are softer, but they are there.
 
Stryyder said:
I do not see a missing shadow in the ATI picture, just my opinion. It does have some shadows that are softer, but they are there.

If you look at the rock, ATI isn't producing the shadows.
So you don't see anything different in any of these screenies?
 
Chalnoth said:
With quotes like:
Tomshardware.com said:
As we will see repeatedly throughout our benchmark section, anisotropic filtering is the greatest strength of the new Radeon X800 cards.
You'd think that some reviewers would have bothered to pay attention to how each card applied its anisotropic filtering. Fortunately, one did:
http://www.beyond3d.com/reviews/ati/r420_x800/index.php?p=13#tex

Notice that while the benchmarks around the web are sure to have used nVidia's "highest-quality" setting, a setting which removes the "brilinear," ATI is still using full bilinear filtering on texture stages other than the base texture.
That's funny. I read at Tom's (I need to get the link) that in their AF tester tests the X800 clearly did a better job of covering all the angles than nVidia's new method.
 
sonix666 said:
I remember times when people here were furious about bilinear being forced on people. Yet the lack of trilinear filtering beyond the first stage is being put aside like it's nothing. Sorry to say, but it smells around here.

It's been repeatedly discussed, and ATI has been upfront about it too: CP settings mean trilinear AF on the first stage and bilinear on all the others; application-specified settings mean trilinear on all stages.
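To make that distinction concrete, here's a rough D3D9-style sketch (my own illustration, assuming an already-created device) of the two behaviors. An application that sets these sampler states itself gets honored on every stage; the CP override effectively only gives stage 0 the full treatment:

#include <d3d9.h>

// What the control-panel "quality" override effectively delivers, per the
// posts above: trilinear AF on stage 0, bilinear AF on the rest.
void CPStyleFiltering(IDirect3DDevice9* dev, DWORD stagesUsed, DWORD maxAniso)
{
    for (DWORD s = 0; s < stagesUsed; ++s) {
        dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        // MIPFILTER is the trilinear/bilinear switch: LINEAR blends between
        // mip levels, POINT snaps to the nearest one.
        dev->SetSamplerState(s, D3DSAMP_MIPFILTER,
                             s == 0 ? D3DTEXF_LINEAR : D3DTEXF_POINT);
        dev->SetSamplerState(s, D3DSAMP_MAXANISOTROPY, maxAniso);
    }
}

// What an application that wants full trilinear everywhere asks for -- and,
// per the posts above, actually gets when it requests it itself.
void TrilinearAllStages(IDirect3DDevice9* dev, DWORD stagesUsed, DWORD maxAniso)
{
    for (DWORD s = 0; s < stagesUsed; ++s) {
        dev->SetSamplerState(s, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(s, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(s, D3DSAMP_MAXANISOTROPY, maxAniso);
    }
}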
 
What's the matter with Chalnoth? Started the thread and took off. Kinda like a hit and run. Anyway, the rest has been covered by others. Interesting that Tom's noticed that ATI's filtering is better than nVidia's, though.
 
Honestly yeah, it depends on preference.

Trilinear AF has nothing to do with trilinear filtering.

You could actually say the ATi method is superior because it filters more of the screen. The "most accurate" AF would have been the old 5950-type AF, but it couldn't go to 16x at any appreciable speed. Even then, I still preferred the star-shaped AF (much less accurate) to the round AF, simply because it applied the effect to that much more of the screen.
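For anyone wondering what "accurate" means here: the degree of anisotropy is just the ratio between the long and short axes of a pixel's footprint in texel space, and the mip LOD is picked from the short axis so extra samples can be taken along the long one. A rough sketch of the textbook computation (this is the ideal the hardware approximates, not either vendor's actual circuit; the angle-dependent "star" patterns in the testers come from evaluating it less exactly away from the major axes):

#include <algorithm>
#include <cmath>

// Degree of anisotropy from the texture-coordinate derivatives of a pixel.
// The footprint's extent along screen x and y gives the long and short axes;
// the hardware takes up to maxAniso bilinear/trilinear taps along the long
// axis and chooses the mip level from the short one.
float AnisoDegree(float dudx, float dvdx, float dudy, float dvdy, float maxAniso)
{
    float px = std::sqrt(dudx*dudx + dvdx*dvdx); // extent along screen x
    float py = std::sqrt(dudy*dudy + dvdy*dvdy); // extent along screen y
    float major = std::max(px, py);
    float minor = std::min(px, py);
    return std::min(major / std::max(minor, 1e-6f), maxAniso);
}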
 
webmedic said:
What's the matter with Chalnoth? Started the thread and took off. Kinda like a hit and run. Anyway, the rest has been covered by others. Interesting that Tom's noticed that ATI's filtering is better than nVidia's, though.

He went out to buy an X800 Pro :LOL:
 
webmedic said:
What's the matter with Chalnoth? Started the thread and took off. Kinda like a hit and run.

Hey, the guy's not a robot; he deserves some time off from being online from time to time :LOL:
 
Yeah, it "is being put aside like it's nothing" because it's damn old news; it's been this way since the Cat 3.4 drivers or so, I believe.
 
Natoma said:
Nick Spolec said:
But I still think they should give the user the option to enable full Trilinear across all texture stages.

Maybe someone *cough* Kombatant *cough* can spill the beans on a future CAT release. :D


You can, but you either need to go digging in the registry and change the anisotype value to either 2 or 0 (whichever one the control panel won't set it to), or use one of the many tweaking programs such as Rage3D Tweak, rTool, etc.
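If you'd rather script the registry poke than click through regedit, here's a heavily hedged Win32 sketch. The "anisotype" value name comes from the post above; the key path below is a placeholder (the real location varies by driver version and adapter, so search for the value in regedit first), and the DWORD type is my assumption:

#include <windows.h>
#include <stdio.h>

int main()
{
    // PLACEHOLDER path -- the real key varies by driver version and adapter
    // index; locate the "anisotype" value with regedit and substitute it here.
    const char* path = "SYSTEM\\CurrentControlSet\\Control\\Video\\{adapter-guid}\\0000";

    HKEY key;
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
        printf("Key not found -- find the real path with regedit first.\n");
        return 1;
    }

    // Per the post: set anisotype to 2 or 0, whichever the control panel
    // won't set it to. Assuming REG_DWORD; verify the type regedit shows.
    DWORD anisotype = 2;
    RegSetValueExA(key, "anisotype", 0, REG_DWORD,
                   (const BYTE*)&anisotype, sizeof(anisotype));
    RegCloseKey(key);
    return 0;
}

The tweak tools mentioned above are the safer route; this is just what they're doing under the hood, more or less.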
 
Stryyder said:
webmedic said:
What's the matter with Chalnoth? Started the thread and took off. Kinda like a hit and run. Anyway, the rest has been covered by others. Interesting that Tom's noticed that ATI's filtering is better than nVidia's, though.

He went out to buy an X800 Pro :LOL:

:oops: :LOL:

Yes, possibly that's what he did when he saw the NV30. I'm not sure which ATI card he bought, but I know he has one.
 
ATI launches the card; ATI ships the cards. They gave us what they said they would give; they made no false claims. I have nothing but the utmost respect for them as a company.
 