Ways to resolve the complaints about ATI's filtering?

DSC said:
Afraid of the truth? If ATI really has any merit, they should allow full trilinear so that sites can retest and rebench. But since you'll never ever get full trilinear again on newer R4xx and RV4xx cards..... enjoy your new "trylinear" filtering then.
Well, other than the fine reviews at BD3 and the new style by Brent, I don't really care about benchmarks/"shoot-outs". I kind of matured some time ago, so it's not all that. But what I do do is play games, and I want the best IQ, which ATI has given me for a few years now. You talk about merit (something you have none of) for ATI? ATI doesn't do benchmarks; review sites do benches, and it's up to the sites to make the call. As for trylinear, well, it seems to be just fine and plays great on my AIW 9600 Pro, and it will be just fine on my X800s when they show up. If you really understood what the whole issue was about, and weren't working for NVIDIA's PR department (so sad), you would worry more about the 6800s getting to retail and not about a cool new rendering tool.
 
If you get the same image quality from this optimization and full trilinear, then why would you want ATI to waste precious GPU cycles on full trilinear? Why should these games be benchmarked again just to show slower results? Do you think people are going to want to disable this feature when they play games, for no other reason than to slow down the card?
 
Options are always good; a situation or condition could expose a weakness in this filtering, and the ability to go back to full-quality trilinear just makes sense. I like that Nvidia has done that.
 
noko said:
Options are always good; a situation or condition could expose a weakness in this filtering, and the ability to go back to full-quality trilinear just makes sense. I like that Nvidia has done that.

Apparently, in an interview with 3dcenter.de, Nvidia stated they are going to offer even more filtering options in addition to what they have now, presumably something in between their brilinear and full trilinear filters.
 
Ostsol said:
Easiest way I can think of is for ATI's texture quality slider to force full trilinear on everything when it is set to full. One notch down, and the optimizations are put in place. Further notches down, do whatever else they've already been doing. Default the drivers to the second notch from maximum. That way the casual gamer will be able to get the best performance with arguably just as good quality. The hardcore users can tweak all they want to get what they want. Also, full documentation of each setting and what it does would be helpful.

Would anyone complain about that?
I think the best solution would be that when the slider is set to full it utilizes the adaptive trilinear, and when the slider is one notch down it uses full trilinear. That's only because speed is also a part of IQ in games, so it would be fair to say that equal IQ with higher speed is better.
 
nelg said:
I think the best solution would be that when the slider is set to full it utilizes the adaptive trilinear, and when the slider is one notch down it uses full trilinear. That's only because speed is also a part of IQ in games, so it would be fair to say that equal IQ with higher speed is better.
Well, the problem with that is that the ends of the sliders are labeled "Performance" and "Quality". The adaptive mode is meant to increase performance, so. . .
 
Ostsol said:
nelg said:
I think the best solution would be that when the slider is set to full it utilizes the adaptive trilinear, and when the slider is one notch down it uses full trilinear. That's only because speed is also a part of IQ in games, so it would be fair to say that equal IQ with higher speed is better.
Well, the problem with that is that the ends of the sliders are labeled "Performance" and "Quality". The adaptive mode is meant to increase performance, so. . .
Well there you go. You have outed me. I am using a Rage Fury. :oops:
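For concreteness, here is a minimal sketch of the notch-to-mode mapping discussed above, assuming a five-notch Texture Quality slider with full trilinear at the top and the adaptive optimization one notch down; the mode names are invented for illustration and are not ATI's actual driver internals.

```c
/* Minimal sketch, assuming a five-notch Texture Quality slider. The mode
 * names are invented for illustration; they are not ATI's driver internals. */
enum filter_mode {
    FILTER_BILINEAR,
    FILTER_BRILINEAR,           /* existing lower-quality optimizations  */
    FILTER_ADAPTIVE_TRILINEAR,  /* the new adaptive ("trylinear") filter */
    FILTER_FULL_TRILINEAR
};

enum filter_mode mode_for_notch(int notch) /* 0 = Performance ... 4 = Quality */
{
    switch (notch) {
    case 4:  return FILTER_FULL_TRILINEAR;      /* top notch forces true trilinear    */
    case 3:  return FILTER_ADAPTIVE_TRILINEAR;  /* proposed default: same IQ, faster  */
    case 2:  return FILTER_BRILINEAR;
    default: return FILTER_BILINEAR;
    }
}
```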
 
Someone pointed out that the CP should be kept simple. While I agree, I think more options are inevitable, and over time we will see them regardless of what ATI is saying. I actually like NVIDIA's approach: in the Performance and Quality Settings tab there is a checkbox which enables Advanced Options. Once you enable it, you have a few more parameters to look at. Users with less knowledge would simply not turn it on (okay, some would, and they would then complain that their quality is messed up); you could even add a tooltip saying not to enable it if you do not know what the settings change.

Andy/Raja: We try to keep the control panel as simple as possible (even as its complexity increases), and if the image quality is identical or better there doesn't seem to be a need to turn it off. However, if our users request otherwise and can demonstrate that it's worthwhile, we would consider adding an option to the control panel.

That's a poor excuse, since the majority of us here would want the option to disable it. Remember how long it took NVIDIA to add that? :?
 
This excuse of keeping the CP simple is inane. Most people probably don't even muck about with the CP (heck, it's a *separate download*), and those that happen to grab the broadband driver set probably have no friggin' clue what the Texture and Mipmap Quality sliders do, let alone what "vsync" means. IMO, ATI should default to HQ, and show users three "switches" in the same "3D" tab:

1. Texture + Mipmap Quality: Performance, Balanced, Quality.
2. AF: separate On/Off switch, and a 2x-16x on a slider that is enabled when the switch is On (and otherwise darkened out).
3. AA: same as AF.

In addition, ATi can include an "Advanced" toggle at the bottom of that same tab that will let people futz around with vsync, triple buffering, Truform, Temporal AA, etc. They should also set Truform per hardware (say, on with the 9100 and 9500+, off with the 9200 and below) and default the 3D refresh rate override to be the same as the desktop refresh rate.

These are simple and seemingly obvious things to me. Surely these suggestions can't be that hard to implement, along with detailed pop-up explanations for each section (e.g., pausing the mouse anywhere in section 1, 2, or 3 pops up a help window that explains the whole section, not just each individual setting)?
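Purely as an illustration of the layout proposed above (all names are invented for the example; this is not ATI's actual control panel code), the three switches plus an Advanced toggle might reduce to a settings block like this:

```c
#include <stdbool.h>

/* Illustrative sketch of the proposed "3D" tab; all names are invented. */
enum texture_quality { TQ_PERFORMANCE, TQ_BALANCED, TQ_QUALITY };

struct panel_3d_settings {
    enum texture_quality texture_quality;  /* 1. Texture + Mipmap Quality      */
    bool af_enabled;                       /* 2. AF on/off switch              */
    int  af_level;                         /*    2x-16x, greyed out when off   */
    bool aa_enabled;                       /* 3. AA on/off switch              */
    int  aa_level;                         /*    greyed out when off           */
    bool show_advanced;                    /* vsync, triple buffering, Truform */
};

/* Default to high quality, with AF and AA off until their switches are set. */
static const struct panel_3d_settings defaults = {
    .texture_quality = TQ_QUALITY,
    .af_enabled = false, .af_level = 2,
    .aa_enabled = false, .aa_level = 2,
    .show_advanced = false,
};

/* A slider value only applies when its switch is on. */
static int effective_af_level(const struct panel_3d_settings *s)
{
    return s->af_enabled ? s->af_level : 0;
}
```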
 
Are you all losing it? Why would ATi give the option to disable a feature they have worked very hard to design and are awaiting patent approval on? That is a step backwards. If you don't like how ATi does AF then don't buy the card. Simple. You would think everyone would be giving props to ATi for finding effective ways to increase performance, but instead everyone cries foul and screams CHEAT, CHEAT, turn it off.

Unless someone can actually show that ATi is indeed doing ineffective rendering, as if the filtering were set to bilinear instead of trilinear, how can anyone cry foul here?

Someone enlighten me; I just don't get this...
 
SiliconAbyss said:
Are you all losing it? Why would ATi give the option to disable a feature they have worked very hard to design and are awaiting patent approval on? That is a step backwards. If you don't like how ATi does AF then don't buy the card. Simple. You would think everyone would be giving props to ATi for finding effective ways to increase performance, but instead everyone cries foul and screams CHEAT, CHEAT, turn it off.

Unless someone can actually show that ATi is indeed doing ineffective rendering, as if the filtering were set to bilinear instead of trilinear, how can anyone cry foul here?

Someone enlighten me; I just don't get this...

That's a blatant statement. NVIDIA also worked "hard" to design their shader parser and other "optimizations". They finally decided to include trilinear filtering in their driver options -- we moaned, we got it.
 
And if enough people moan, ATi will include an option to do "true" tri AF. But it will not benefit anyone. I can't believe you are comparing ATi's AF routines with nVidia not doing tri AF AT ALL ON ANY LEVEL, and not even attempting to do so, even though the setting was on tri.

Logic and reason have totally run away on this one. I guess there are people out there that want to believe ATi is cheating so badly that they will go to any lengths to see it, even staring at screenshots until they "see" a cheat.

I am not sticking up for ATi just for the sake of it; I am only trying to be reasonable here. It could be proven that ATi's AF routines do in fact have problems, but I have seen no such evidence.
 
You guys don't get it. ATI will do everything possible not to allow a full trilinear option because of the not-so-wonderful benchmark results it would produce. Most sites would start benchmarking with trilinear set to ON, and the purpose of the new optimizations would be defeated. Benchmarks sell a product, like it or not; that's why it is called benchmarketing. ATI and nVIDIA are out on the market to make money and sell products. Benchmarks sell products. End of story.
 
The Right Way (TM) for such amazing new features would be to expose them via OGL extensions and let developers use them if they like them.
But leave the legacy options untouched altogether, i.e. execute them exactly like the many generations of graphics chips before.
If it's SOOO important, you can later implement a nifty feature, turned on via the control panel, that would let users "API override: substitute ATIoptimized(TM) texture filtering for trilinear" and, if they really want, "substitute ATIoptimized(TM) texture filtering for bilinear" too.
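A minimal sketch of how that could look from a developer's side, assuming a hypothetical GL_ATI_adaptive_trilinear extension and token (both invented here; no such extension actually exists): the legacy GL_LINEAR_MIPMAP_LINEAR request keeps its traditional meaning, and the optimized filter is used only when explicitly requested.

```c
#include <GL/gl.h>
#include <string.h>

/* Hypothetical token value; a real extension would register an official one. */
#define GL_ADAPTIVE_TRILINEAR_ATI 0x87F0

/* Naive substring check of the extension string (good enough for a sketch). */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

void set_min_filter(GLuint texture)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    if (has_extension("GL_ATI_adaptive_trilinear")) {
        /* Developer explicitly opts in to the optimized filter. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_ADAPTIVE_TRILINEAR_ATI);
    } else {
        /* Legacy request keeps its legacy meaning: full trilinear. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);
    }
}
```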
 
Honestly, it's not a hard technical feat to force the card to do trilinear all the time; we're talking about a checkbox option, for Christ's sake. The only reason it's not on the driver tab page is purely a PR exercise and making sure their card is ahead when AF is being used. Simple as that.

ATI fans are a bunch of hypocrites, honestly: you cry bloody murder when NVIDIA implements "optimisations", and yet when ATI does the same underhanded things you turn a blind eye. Pot, kettle, black!
 
volt said:
They finally decided to include trilinear filtering in their driver options -- we moaned, we got it.

Tell that to the FX5900 Ultra users. $499, wasn't it?
 
ChrisW said:
If you get *the same image quality* from this optimization and full trilinear, then why would you want ATI to waste precious GPU cycles on full trilinear? Why should these games be benchmarked again just to show slower results? Do you think people are going to want to disable this feature when they play games, for no other reason than to slow down the card?

Bolding mine.

With their behavior for "developer-specified" mip maps (a case where literal, mathematical trilinear filtering seems necessary and useful, and which seemingly allows the other behavior to be judged by the criterion of image quality alone), I think this statement and sentiment is quite accurate. But do we have the same minimum image quality the rest of the time?

I think this is a possibility as far as ATI personnel are concerned: they might indeed have information and analysis of this X800 methodology that could prove the bolded statement true in every case.

Consumers don't have that information, though. The most they might be able to verify easily is ATI's description of a universal behavior for developer-specified mip maps, and therefore they can evaluate the virtues of when it decides not to apply the optimized methodology. That might or might not be enough to set ATI apart from the competition, but it doesn't address the question of image quality in all cases, which is the justification ATI offers for not letting users turn it off; it addresses only some cases.

...

I think protecting the methodology might be a valid reason not to specify this information in exact detail, but that helps ATI's situation, not that of consumers. There isn't necessarily anything wrong with ATI looking out for themselves, but good marketing is about keeping consumers accurately informed about the beneficial aspects of your product while doing just that, and there is a marked absence of that in relation to this issue. It might be true that they are not cheating, but it is certainly true that they haven't done much to explain why that is the case.

It might be difficult to do so without divulging some sort of intellectual property, but it is people within ATI, not outside, who have familiarity with the methodology and the most information/control on how to:
  • demonstrate its image quality equivalence
  • come up with an explanation or a testable aspect of its behavior that protects the overall methodology while making it easier for independent verification
  • provide the consumer with tools to control it until that happens
There might be nothing lacking in the methodology, but currently only people within ATI know that with any general applicability. This is not the best case for ATI or consumers...it seems to me ATI should work towards one of the available ways of improving the situation.
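As one illustration of the kind of independent check a consumer could run, here is a minimal sketch that compares two same-sized raw RGB screenshots (one captured with the optimization active and one with full trilinear, however obtained) and reports how much they differ. The raw dump format and the fixed resolution are assumptions made purely for the example.

```c
/* Minimal sketch: diff two same-sized raw RGB888 screenshots and report the
 * largest per-channel difference. The raw format and the fixed resolution
 * are assumptions made only for this example. */
#include <stdio.h>
#include <stdlib.h>

#define WIDTH  1024
#define HEIGHT 768
#define BYTES  (WIDTH * HEIGHT * 3)

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s adaptive.raw trilinear.raw\n", argv[0]);
        return 1;
    }

    unsigned char *a = malloc(BYTES), *b = malloc(BYTES);
    FILE *fa = fopen(argv[1], "rb"), *fb = fopen(argv[2], "rb");
    if (!a || !b || !fa || !fb ||
        fread(a, 1, BYTES, fa) != BYTES || fread(b, 1, BYTES, fb) != BYTES) {
        fprintf(stderr, "failed to read screenshots\n");
        return 1;
    }

    long differing = 0;
    int max_delta = 0;
    for (long i = 0; i < BYTES; i++) {
        int d = abs((int)a[i] - (int)b[i]);
        if (d > 0) differing++;
        if (d > max_delta) max_delta = d;
    }
    printf("differing bytes: %ld of %d, max per-channel delta: %d\n",
           differing, BYTES, max_delta);
    return 0;
}
```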
 
A checkbox option would be fine with me. Hide it away under the Compatibility section if they like, but it would still be comforting to have it there.
 