Post Mortem on ATI's Q3 Drivers ("cheat"/"bug")

jb

Veteran
Folks, if at all possible I would like to have an honest discussion about this. Please leave all <bleep> crap out of this thread. I am an EE and enjoy analyzing all of the facts. I know from my EE background that making a guess about what is going on with a system by only observing the results is not a good idea. To really understand, you need to go inside. Here that is going to be hard. Until we have the source code, we will never really know what, if anything, really happened with this whole situation.


We all know that ATI did have specific Q3 optimizations in their drivers. We all know that the first set of retail drivers had an issue with the IQ in Q3. In a FS review ATI said it was a driver bug that did not handle the texture slider. We all know that when you removed the Q3 reference from the drivers, the FPS dropped and the IQ was fixed. Or if you modified Quake3 itself and changed the executable name, the same thing happened (IQ fixed with an FPS drop). This is what most of the world knows about this. However, these are only observations from tests, and we all know that does not really show us much. But it's obvious something was going on.



The best report I saw on this was:
http://www.tech-report.com/etc/2001q4/radeon-q3/index.x?pg=1

Their explanation:

"The answer: They are futzing with the mip map level of detail settings in the card's driver whenever the Quake III executable is running. Mip maps-lower resolution versions of textures used to avoid texture shimmer when textured objects are far away-are everywhere; they're the product of good ol' bilinear filtering. ATI is simply playing with them. When the quake3.exe executable is detected, the Radeon 8500 drivers radically alter the card's mip map level of detail settings"


My question is: does this accurately explain the FPS "increase" the original drivers showed? Maybe we should test Q3 at each setting of the texture slider and see what the FPS change is at each setting. In some reviews they showed that the "cheat"/"buggy" drivers netted almost a 24 fps increase at some settings. If it were just pushing around mip maps, a 24 fps increase seems like an awful lot. But this is what I think is the "best evidence" of a cheat. I applaud TR for trying to make sense of it and doing more detailed tests. That is the only way to help find out what is going on.
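Just to make the idea concrete, here is a rough sketch (purely hypothetical; the function, names and bias values are made up and this is not ATI's actual driver code) of what "detect quake3.exe by name and shove the mip LOD around" could look like inside a driver:

/* Hypothetical illustration only -- not ATI's actual code.
 * Shows the general shape of "detect quake3.exe by name, then push
 * the mip LOD bias toward smaller (blurrier) mip levels". */
#include <string.h>

#define DEFAULT_LOD_BIAS 0.0f
#define Q3_LOD_BIAS      2.0f   /* assumed value; positive bias = smaller mips, less texture bandwidth */

float choose_lod_bias(const char *exe_name)
{
    /* crude plain-text match, like the "quake3" string people found in the drivers */
    if (strstr(exe_name, "quake3") != NULL)
        return Q3_LOD_BIAS;     /* smaller mips -> fewer texels fetched -> higher fps, worse IQ */

    return DEFAULT_LOD_BIAS;
}

Renaming the executable would obviously defeat a crude name match like that, which fits the observation that a renamed quake3.exe restored the IQ and dropped the FPS.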


Then with the first set of beta drivers this whole issue was fixed. No more IQ issues, and the FPS were within 5 FPS or less of the old drivers in most reviews. Heck, in the Anandtech review he showed a 3 fps increase with the new "non-cheating" drivers. Wow. Here I was almost sure that ATI had been caught with their hands in the cookie jar. However, we all know driver bugs can cause all sorts of weird things, including issues with textures. And since it was fixed with less than a 5 fps loss in most cases, it almost seems like it was a driver bug in their optimized code path. OK, now I am confused....


We all know that HyperZ II was not working well (if at all) in the first set of drivers for OpenGL. I thought it was Dave who suggested here that part of ATI's optimizations could have been that HyperZ II was only working for Q3 at the time, and that running the modified q3.exe would avoid ATI's optimizations, which means it "may" have avoided running HyperZ II. From my testing, HyperZ II can help by around 15% in some cases. Any more thoughts on this?


Also, I have heard from David or someone like him on the Rage3D board that those Q3-specific optimizations are now working for all Q3-based games. Is this true? Finally, when (if they have) did the references to Quake3 leave the 8500 drivers?


Honestly, I do not know what to believe. In the FS interview they said that the truth lies somewhere in the middle. That is my best guess on this whole mess. Again, please try to keep any fanboy claims or whatnot out of this. If you have facts, then let's hear them. I am not a 3D expert. But being an electrical engineer on consumer retail products for the last 10 years, I have worked in about every area (software, digital/analog IC design, RF, QA, manufacturing and system design). I have seen some really weird stuff happen before, and the actual causes were just as weird. I am just wondering how much mip map "playing" could really affect frame rates, among other things. The observations themselves could be made to support both claims, that it was a bug or a cheat.


I also wanted to leave out of this thread whether or not it was right of them to do that. All software is optimized at one point in time. Whether it was right or wrong is a matter of personal opinion, and I really don't care as long as their product works for what I need it to....
 
Yeah, I saw that too. But have you tried it to see if it's really true? I won't have a chance for a few days. Traveling now and won't be back home till late in the weekend. I always try to verify if I can before I believe something. Sorry, I've been burned before by just believing...

Damn typos

[ This Message was edited by: jb on 2002-02-28 20:28 ]
 
On 2002-02-28 19:51, jb wrote:
Also, I have heard from David or someone like him on the Rage3D board that those Q3-specific optimizations are now working for all Q3-based games. Is this true? Finally, when (if they have) did the references to Quake3 leave the 8500 drivers?

I checked just now; there is at least no reference to Quake3 in the current driver.
 
No offense, but David pretends he knows a lot more than he does (my guess is a student who has a friend with a dev relationship with ATI). He has been caught lying many times in the two years I spent at Rage3D, so take whatever he says with a grain of salt.
The Quake3-specific optimizations have been in the drivers since the original Radeon; this DISCOVERY by HardOCP was very OLD news :rolleyes:

[ This Message was edited by: Doomtrooper on 2002-02-28 20:42 ]
 
Thanks for the info, folks. Do we know when it was removed, or if it was removed in their first set of betas?
 
Checked more ... it was removed between the 3281 and 3286 drivers.

[ This Message was edited by: Humus on 2002-02-28 21:08 ]
 
Well, some "facts" (please verify or disprove, everyone):

The FPS was increased in subsequent revisions even without HyperZ II working.

Drivers with HyperZ II working in OpenGL increased the FPS even more.

Even with these facts, I still think the initial fiasco was the result of a desire for the most expedient way to get higher benchmark results upon initial release. Note, I make a distinction between objecting to the presence of Quake3 text in the drivers (I don't object) and objecting to the result of Quake3 detection being lowered IQ and increased FPS (I do object).

I think the rather broken state of the drivers upon initial release shows that the initial drivers were a hurried release. This could be a factor in supporting either a defense of their claim that it was an honest mistake, or a claim that they had performance problems and used the Quake3 code path to "cheat" to make up for these problems for a well known benchmark.

I don't think, even with all of the detective work that has been done, we can conclusively decide one way or the other (but perhaps I haven't seen all the detective work). I do think, however, that you can conclusively state that ATi screwed up big time in Quality Assurance.

EDIT: change HyperZ to HyperZ II

EDIT: PS: there are other ways to detect whether a certain executable is running...the text match was just a primitive (and easily exposed) way, which added to the sensationalism.

[ This Message was edited by: demalion on 2002-03-01 18:11 ]
 
There were many theories and non-conclusive tests performed, but there is nothing out there that substantially "proves" one theory over another.

I performed my own "detective" work and saw nothing but varied results from other owners: claims of the same results from some, differing results from others. It got to be all messy and annoying because people weren't paying strict attention to details and to differences in OS/driver revisions.

Most of the websites only illustrated why teenagers who are incompetent with hardware and graphics really shouldn't attempt to prove/test a concept, let alone be running websites and hosting public content.

One thing I strongly disagree with is the HyperZ-II claims about the initial drivers. At least in the case of the Win98SE drivers, HyperZ-II enjoyed some of its greatest performance gains in the early drivers (the 7191 CD drivers with the "cheat"). Using VillageMark OGL, you can see a much wider variance when enabling/disabling the HyperZ-II key in the early drivers than in the next "betas" that fixed the alleged "cheat" and in the successive driver revisions past that. I've seen the benefits of HyperZ-II steadily decrease with each new driver revision in both OGL and D3D with the Win9x drivers (XP/2K results may vary), but I've also seen clipping bugs and occlusion bugs fixed in the process.

In general, my opinion on the whole fiasco is simply that ATI's drivers are absolutely deplorable when it comes to driver settings, the bugs within them, and drawing conclusions given how much crap happens "automagically" when adjusting these settings... ESPECIALLY with LOD Bias and the title-specific variances that really confuse the heck out of anyone trying to test/prove concepts about them. It's 100% snafu in most cases.

In the 7206 drivers, you had LOD Bias *but* the slider (or manually changing the registry key and ensuring it was taking effect) had little to no effect on SOME OGL games. Adding anisotropy caused an automatic LOD Bias shift, adding AA added yet another automatic LOD Bias shift (additive), so you wound up with overly aggressive LOD Bias, and the slider's behavior or failure to function varied depending upon the combinations used, by title, etc. etc.

Right now, in the current drivers, LOD Bias is also completely "stuck" in Quake3, but stuck the opposite way (the sharpest setting). It appears to "unstick" when applying anisotropy at a lower setting (go figure), then automatically shifts negative back the other way when using AA... but no longer an additive shift when using both... *sigh*. This does yield good IQ (i.e. soften a bit for rip-mapping/aniso, sharpen a bit for supersampling), but I'd prefer if the damn slider were allowed to override or actually work.

I don't see anything more or less broken today than in the betas, just broken "opposite of cheating", which would yield lower framerates if the theory behind the original concept is true. You just won't see any websites mentioning anything about this either.

Lastly, I found it interesting that no "cheating" fiasco was brought to the mainstream with the first Detonator 4s, which boosted 3DMark scores by largish margins. Of course, fog and effects in Dragothic were banded beyond 16-bit ugliness and many alpha effects were *totally* missing in these drivers (rocket trails, etc. etc.), and this only seemed to affect 3DMark2001... fog and alpha effects in Direct3D games were fine. If only someone had made a bin/hex editor to rename it to 3DMUCK or similar, then maybe we could have put this to the test as well.
 
On 2002-03-01 19:11, Sharkfood wrote:
Lastly, I found it interesting that no "cheating" fiasco was brought to the mainstream with the first Detonator 4s, which boosted 3DMark scores by largish margins. Of course, fog and effects in Dragothic were banded beyond 16-bit ugliness and many alpha effects were *totally* missing in these drivers (rocket trails, etc. etc.), and this only seemed to affect 3DMark2001... fog and alpha effects in Direct3D games were fine. If only someone had made a bin/hex editor to rename it to 3DMUCK or similar, then maybe we could have put this to the test as well.

I sincerely doubt that nvidia relies on the application name to detect the program... just as ATi no longer checks for Quake 3.

nvidia has made cheating into a way of life, IMO. In the early days, I thought nvidia had good drivers. I have a TNT and a TNT2 Ultra, and both worked great and looked good. When the Det 3s came out, I was amazed that my TNT2 Ultra gained in performance. Then I realized that some features weren't being implemented properly (polygon offset being an obvious one). When the Det 5s came out, my TNT2 Ultra again gained in performance, but the IQ went down even further (more problems with polygon offset, among others). Also, driver stability became dodgy around then as well.

When nvidia was working hard to become #1, it seemed they cared about stable drivers that looked good. Now it seems they are more concerned about the extra 1 pt in 3D Mark.
 
Why doesn't someone with an R8500 run some tests in Quake3 with the latest drivers?
Specifically, run one test at the highest quality Quake3 settings, and another with the same settings EXCEPT change the texture LOD bias to as low as it will go. What's the performance difference? If it is a 30 FPS gain, then ATI MIGHT have been cheating. If it's more like 5 or 10, then they weren't.
 
Absolute performance difference isn't a good metric; you'd be better off using a relative one.
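For example (toy numbers, just to illustrate the point), a 24 fps jump on a 160 fps baseline is only a 15% swing, which says more than the raw number does:

/* Toy numbers only, to show why a relative figure is more telling than "X fps". */
#include <stdio.h>

int main(void)
{
    float fps_normal = 160.0f;  /* hypothetical score, neutral LOD bias    */
    float fps_lowlod = 184.0f;  /* hypothetical score, LOD pushed way down */

    printf("absolute gain: %.1f fps\n", fps_lowlod - fps_normal);
    printf("relative gain: %.1f%%\n",
           100.0f * (fps_lowlod - fps_normal) / fps_normal);
    return 0;
}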
 
run one test at the highest quality Quake3 settings, and another with the same settings EXCEPT change the texture LOD bias to as low as it will go. What's the performance difference?

That's the whole point, there is absolutely no difference... performance-wise or IQ-wise. The slider's been totally broken for a long time in Q3, at least in the Win9x/ME drivers. It's just broken at full sharpness now, versus broken at full blur like before.
 
Sharkfood, I never said USE the slider...use your BRAIN!
You can adjust LOD in any of the radeon tweakers, right............

And as for that, fine, make it a percentage, who really cares? If someone would just run these tests, it could easily be solved. If an extremely low LOD gives a low performance increase, then it was a bug....
 
Why is such a big deal being made about ATI optimizing drivers for Q3. Doesn't nVidia do this as well?
 
Sharkfood, I never said USE the slider...use your BRAIN!
You can adjust LOD in any of the radeon tweakers, right............

Brain is in use, might want to turn yours on for a second. :smile:

No, not in tweakers. No, not in RegEdit. No, not using the driver slider. The OpenGL OGLLODBias key is completely ignored... as in broken, as in not working, as in the exact same thing as the 7191 drivers but "stuck" at a different default, nothing more.

Quake 3 Benchmark:
LOD Bias "Fuzzy" (OGLLODBias = 0): 187.6 fps
LOD Bias "Sharp" (OGLLODBias = 6): 187.6 fps

I've also made a demo, lod.dem, 10 seconds staring at the grainiest floor texture with a far view... captured with all methods (tweakers, ATI tools & RegEdit) and the OGLLODBias value, all of which have no effect in Quake3.

I'd be interested to hear if this makes any difference in Win2K & XP, but only if the tester pays attention to my first post. I've seen enough false or misleading scores posted where the user didn't ensure identical settings or tried to rationalize a difference in scores due to a fresh reboot or some systray doodad affecting the score.

The best thing to do is to make a demo. Sit and stare at a floor for like 300k worth of demo, then reuse this demo to take screenshots at the various levels- then compare the screenshots side by side. On Win9x/ME, there is absolutely NO difference. I can throw out some JPG's if anyone would like.
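
If anyone wants to compare the captures without eyeballing JPGs, even a dumb byte-for-byte diff of the raw screenshot files (the filenames below are just examples) will tell you whether the setting changed anything at all; a minimal sketch:

/* Crude byte-for-byte diff of two screenshots grabbed from the same lod.dem
 * frame at different LOD settings.  Filenames are examples only; identical
 * files are a strong hint the setting did nothing. */
#include <stdio.h>

int main(void)
{
    FILE *a = fopen("shot_lod0.tga", "rb");
    FILE *b = fopen("shot_lod6.tga", "rb");
    long differing = 0, total = 0;
    int ca, cb;

    if (a == NULL || b == NULL) {
        fprintf(stderr, "missing screenshot file\n");
        return 1;
    }
    while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
        if (ca != cb)
            differing++;
        total++;
    }
    printf("%ld differing bytes out of %ld compared\n", differing, total);
    fclose(a);
    fclose(b);
    return 0;
}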

Most people don't notice this broken behavior because they play with anisotropy on to at least some degree, which starts with a substantially lower LOD Bias, but for now LOD Bias (along with Game Gamma and several other registry keys) is totally ignored in the 7191/7206/900x/90xx drivers. Whether ATI has renamed the key to an unknown key, changed the scale but didn't update their own tools (unlikely, as I've tried all sorts of different values with no visual or performance change whatsoever), or whether it is just plain broken is unknown.
 