ATI and App Specific Optimisations

Bjorn said:
Bouncing Zabaglione Bros. said:
ninelven said:
What is there to improve if you're already doing it right?

When games like Doom 3 seem to be incapable of telling the card how to filter textures properly, as per this thread :rolleyes:

Nothing wrong with Doom 3, just with ATI's driver. A problem that has also been fixed without any performance loss:

http://www.techreport.com/etc/2004q3/ati-doom3/index.x?pg=1
Ah, let me guess .... the problems are due to a "driver BUG"

How familiar ... don't you think? The most anticipated game... a special "Doom 3-only patch", and a "bug" connected to brilinear ....
That is reaallllly familiar to me.
On NV cards there is at least an option to disable that shit.
 
chavvdarrr said:
Ah, let me guess .... the problems are due to a "driver BUG"

How familiar ... don't you think? The most anticipated game... a special "Doom 3-only patch", and a "bug" connected to brilinear ....
That is reaallllly familiar to me.
On NV cards there is at least an option to disable that shit.

You do know there is a fix that displays trilinear correctly and gives you the same performance?

Or did you miss that huge thread that Humus started?
 
chavvdarrr said:
Ah, let me guess .... the problems are due to a "driver BUG"

How familiar ... don't you think? The most anticipated game... a special "Doom 3-only patch", and a "bug" connected to brilinear ....
That is reaallllly familiar to me.
On NV cards there is at least an option to disable that shit.

I don't see what the problem is, since they fixed it while keeping the same performance. Though it could of course have been a "let's buy us some time to fix the problem" bug. But that's something Nvidia has surely been guilty of as well.
 
jvd said:
We could have avoided all of this if Nvidia had just put in one option:

* disable shader replacements

That would have stopped all this crap :oops:

And we rightfully did flame Nvidia for all the bullcrap they put out, didn't we? That was a big reason why we stuck with ATI, wasn't it?

But now ATI goes down that same slope. Do I need to defend ATI for what they did? Hell no. I am not going to defend ATI for this nonsense, because it's just that: nonsense. I would prefer that all of this be brought to light, and that ATI clean up its act and go back to being the company we grew to respect. Defend them all you want. I'm an ATI supporter, but I'm doing them no good by helping them cover up their mistakes.

Besides, the real shame is that for all the finger-pointing we do at Nvidia, can we really prove that Nvidia did any shader replacements in Doom 3?

edit: Grammar mistakes.
 
I think optimising drivers to give gamers better performance and/or IQ sucks! What we want are badly performing drivers that are honest. I don't care if I get 10 fps less; I want it to be an honest frame rate. I don't want games to look and run better without me having to spend at least two hours tweaking the registry, ini files and re-compiling the driver control panel. Oh, and can I just add that it's all Nvidia's fault. If Nvidia had never blown up the WTC this would never have happened. I'm shocked that ATI have now sold their immortal souls to THE DEVIL and joined Nvidia in the 7th ring of Hell! :devilish:
 
Smurfie said:
Besides, the real shame is that for all the finger-pointing we do at Nvidia, can we really prove that Nvidia did any shader replacements in Doom 3?

When JC himself says nVidia is probably "cheating" by replacing shaders in the driver, and that the game "drops off the fast lane" when he makes some innocuous changes to the shaders....

Anyway, I'm a little disappointed in ATi's decision but I can't say I don't understand their reasoning. I do hope that any special optimisations can be turned off, as they now promise. I'm still waiting for their "Fast Trilinear" toggle. I always prefer IQ over framerate and I really don't care if my card is running behind in some benchmark published half-a-world away.

I want to be able to install new drivers with the confidence that they won't drop IQ behind my back. I suffered through that until about a month ago, when I bought my new Radeon. I hope my peace of mind lasts a little longer.
 
Mordenkainen said:
When JC himself says nVidia is probably "cheating"

He never said cheating. I think the general consensus is that app-specific optimizations are OK as long as they do not specifically target benchmark modes (timedemos, etc.) or impact image quality.

The problem is that when app-specific optimizations are used, it is not possible to draw general performance conclusions from the benchmark, as you cannot assume that all games/mods are going to get the full treatment.
 
volt said:
Are the updated beta CATS 4.9 available to the public?

No, but the official 4.9s are due out in the next week or so. Even if the Inq and its rumoured date of 2nd September is wrong, new Cats would normally be due around the second week of the month.
 
chavvdarrr said:
Ah, let me guess .... the problems are due to a "driver BUG"

How familiar ... don't you think? The most anticipated game... a special "Doom 3-only patch", and a "bug" connected to brilinear ....
That is reaallllly familiar to me.
On NV cards there is at least an option to disable that shit.

The bug's not connected to "brilinear" at all; go read the article again. The bug in ATI's drivers is related to Doom 3 requesting filtering on a texture-by-texture basis. The TechReport guys did a pretty thorough job researching and testing ATI's claims, and found them to be on the level. Not only does their explanation make sense, but the claim that the fix won't impact performance holds up as well. Please get your facts straight before you start slinging mud next time.
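For anyone unsure what "requesting filtering on a texture-by-texture basis" actually means: in OpenGL the filter mode is state attached to each texture object, so a game can ask for trilinear on one texture and plain point sampling on another, and the driver has to honour each request separately. A generic sketch (not Doom 3's actual code; the texture handles are made up):

#include <GL/gl.h>

// Each texture object carries its own filter state, so the driver must
// honour (or, as in the bug, mis-handle) these requests per texture.
void setFilters(GLuint detailTexture, GLuint lookupTexture)
{
    glBindTexture(GL_TEXTURE_2D, detailTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR); // trilinear
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glBindTexture(GL_TEXTURE_2D, lookupTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); // no mipmap blending wanted here
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}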
 
Tim said:
Mordenkainen said:
When JC himself says nVidia is probably "cheating"

He never said cheating,

Actually, he did:

http://www.beyond3d.com/forum/viewtopic.php?p=331833#331833

Here's the relevant part:

John Carmack said:
The quote is from me. Nvidia probably IS "cheating" to some degree, recognizing the Doom shaders and substituting optimized ones, because I have found that making some innocuous changes causes the performance to drop all the way back down to the levels it used to run at.
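To put the "recognizing the Doom shaders and substituting optimized ones" part in concrete terms: a driver could do something as crude as matching the shader text it receives against a table of known programs. The names below are invented purely for illustration; this is not claimed to be what nVidia actually ships.

#include <string>
#include <unordered_map>

// Invented example of shader detection by exact source match.
static const std::unordered_map<std::string, std::string> kHandTunedShaders = {
    { "<original interaction shader text>", "<hand-tuned replacement>" },
};

std::string maybeReplace(const std::string& submittedSource)
{
    // An exact-match lookup is exactly why "innocuous changes" to the shader
    // would stop the replacement from triggering and drop performance back down.
    auto it = kHandTunedShaders.find(submittedSource);
    return (it != kHandTunedShaders.end()) ? it->second : submittedSource;
}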
 
Smurfie said:
I'm referring to the conclusion in that article.

ATI's decision to use app detection was also apparently influenced by its use of adaptive filtering algorithms. After the world learned of ATI's adaptive trilinear filtering algorithm used in the Radeon 9600 Pro and newer GPUs, the company challenged people to point out obvious image quality problems caused by this algorithm. Some folks apparently found some cases where ATI's filtering isn't as good as "full" trilinear filtering, so ATI will use application detection to address those problems on a case-by-case basis.

It's everything we flamed Nvidia over for two years. ATI's just doing it in its own shade of red.

"Apparently"... "apparently"? It's just reporter slant. If TechReport has some information showing ATI's Trylinear not doing sufficient filtering and causing IQ problems, then post some links, shots and info about it. You know why they didn't? Because they can't; it's just hearsay. This is the second TechReport article in the last few months calling ATI's Trylinear into question with no evidence.

With the FX series, NV was so far behind the R300 technically that they had to reduce IQ drastically to try to keep up in the benchmarks. The problem is that they pushed their Tri so far towards Bilinear that it became an IQ problem people could spot. NV couldn't ship such lousy filtering on the 6800, so they had to ratchet things back towards the quality side (newer drivers apparently... :D ... have better Brilinear). If you examine a set of mipmaps produced by ATI's Trylinear versus NV's Brilinear, the Bri on the 6800 is easy to spot and is certainly not doing the same level of filtering/blending. Here is a set of mipmap shots from iXBT, marked ANSO 8 APP, 30d.

9800-Bilinear-16AF
X800-Trylinear-8AF
6800-Brilinear-8AF

Download them to your desktop, load them up in MS Photo Editor, size them to 100% and control-tab between the shots. I threw in a 9800 Bilinear shot so you can gauge the difference and really see where the mipmaps are. Here are the 9800 and 6800 Trilinear shots if you want to compare those too …

9800-Trilinear-8AF
6800-Trilinear-8AF

If hardware sites want to complain, why not complain about NV's standard Bri filtering, which they are all benching with? NV is still pushing filtering much farther towards the Bilinear side, where it may be an IQ problem; that's why NV has to put an OFF switch on their Trilinear optimizations. ATI's adaptive Tri is so close in quality to the old Tri that there really isn't any point in putting a switch on the control panel to turn it off (going by the iXBT shots).
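If the "pushed their Tri so far towards Bilinear" point isn't clear, here is a rough way to picture it: full trilinear blends two adjacent mip levels across the whole transition, while "brilinear" only blends inside a narrow band around the switch point and is effectively bilinear everywhere else. A simplified illustration of the idea, not any vendor's actual algorithm:

#include <cmath>

// Full trilinear: the blend weight between mip N and mip N+1 ramps linearly
// across the entire fractional LOD range.
float trilinearBlend(float lod)
{
    return lod - std::floor(lod);
}

// "Brilinear" (simplified): use a single mip level over most of the range and
// only ramp inside a narrow band, here 25% wide.
float brilinearBlend(float lod, float band = 0.25f)
{
    float f = lod - std::floor(lod);
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;
    if (f <= lo) return 0.0f;   // stay entirely on the nearer mip
    if (f >= hi) return 1.0f;   // switch entirely to the next mip
    return (f - lo) / (hi - lo);
}

The narrower that band, the fewer texels need samples from two mip levels (so it's cheaper) and the more visible the mip transitions become, which is the IQ complaint against the FX-era Brilinear.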
 
ATI has been using app detection already. Anyone remember that ati3duag.dll recognises CT3.exe, pop.exe, RaceDriver.exe, SplinterCell2.exe, SplinterCell.exe? That's app detection. Nvidia's are apparent, ATI's are hidden.
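For anyone wondering how a driver "recognises" CT3.exe and friends: it only has to look at the name of the executable that loaded it, roughly like this (a generic Win32 sketch, not ATI's actual code):

#include <windows.h>
#include <string.h>

// Returns true if the current process executable matches the given name,
// e.g. appIs("SplinterCell.exe").
bool appIs(const char* exeName)
{
    char path[MAX_PATH] = "";
    GetModuleFileNameA(NULL, path, MAX_PATH);   // full path of the host .exe
    const char* name = strrchr(path, '\\');     // strip the directory part
    name = name ? name + 1 : path;
    return _stricmp(name, exeName) == 0;        // case-insensitive compare
}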
 
Those are apps that disable control panel FSAA because the games have issues with Multisampling FSAA.
 
DaveBaumann said:
Those are apps that disable control panel FSAA because the games have issues with Multisampling FSAA.

Yes, I know what they do. The point is that it is app detection, no matter what its purpose, and ATI said it would never do that. Is it not app detection?
 
weeds said:
DaveBaumann said:
Those are apps that disable control panel FSAA because the games have issues with Multisampling FSAA.
Yes, I know what they do. The point is that it is app detection, no matter what its purpose, and ATI said it would never do that. Is it not app detection?
Where did ATI state they would not do app detection? Seems like app detection to fix problems with games is a good thing, so why are you making a big deal over this? You don't like ATI fixing bugs?

-FUDie
 
weeds said:
and ATI said it would never do that.

Actually, I don't recall ATI ever once telling me they "won't do app detection" for any reason, since in the conversations I've had they actively point out that the R200 drivers have all kinds of app-specific fixes (not least due to SuperSampling). The line they previously used is that they preferred optimisations that were as generic as possible.
 
Well, I like anything that gives me better performance with my card at the same IQ, so I like it and I think it is about frikin time they got off their lazy butts and did it :p Really, though, I am glad they are doing it, especially with the little checkbox to turn it off.

Now if only they had a checkbox to turn off trylinear, which you'll notice doesn't give the same results anyway...
 
weeds said:
.....ATI said it would never do that...
Actually, CM said they WOULD do this if their customers needed them to (hint: Dell/OEMs).
What they did say they would NOT do is this sort of thing for 3DMark.
 
So the current line of crud from Nvidia fans is that they can turn off Nvidia's trilinear optimizations but you can't do that on ATI cards.

What are they going to say when you can turn off shader replacement and app-specific optimizations on ATI cards but not on Nvidia cards?
 