My thread @ Futuremark (Re: Wait for Unwinder)

Himself said:
I think optimizing for benchmarks is ok, even a positive thing, the more driver testing the better, so long as it's generic optimizations that will show up elsewhere. If it's just hacking benchmarks then it's useless.

Ok, I agree with you there, if there are generic optimizations made for benchmarks that help everything else out too, that's a good thing. But what are the chances of that happening? Isn't a driver team more likely just to eke out the most possible performance for a benchmark and call it a day?

Specifically, we are seeing that certain benchmarks/games are being detected, which is hardly a generic optimization. How much do you want to bet that the UT2003 detection in the Detonator drivers only helps out timedemos and not the actual game?
 
Sharkfood said:
I would be very interested in two things with this process:
1) The IQ differences explored. This would be the highest priority - as in researching the differences and establishing that they are 100% due to the application detection and not a bug introduced as a side effect of the process (i.e. possibly a mistake in props/caps from the wrapper).

2) Creating a "control" wrapper to isolate how much of the performance difference might be caused by the wedge itself. I have NO idea how Unwinder's tool works, but if it's anything like past attempts from others, it creates a wrapper wedge between the application and the API, which in turn adds an extra level of indirection to every API call. That by itself causes a very measurable drop in performance. A meatless version of the same (same wedge and indirection, but doing absolutely nothing) should be benchmarked to remove that overhead from the performance results (IF this applies to the implementation of the tool). A sketch of the sort of thing I mean follows below.
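For illustration only, here is roughly what such a meatless pass-through wedge could look like. The API table, function names and signatures below are all hypothetical; they are not taken from Unwinder's tool or any real D3D driver, and the only point is that every call pays one extra indirection and nothing else.

Code:

/* Minimal sketch of a "meatless" control wedge, assuming a C-style API
 * exposed through a table of function pointers. Everything here is
 * hypothetical; the wedge forwards each call and does nothing else. */

typedef struct {
    int (*DrawPrimitive)(void *ctx, const void *data, int count);
    int (*CreatePixelShader)(void *ctx, const unsigned *tokens, int len);
    /* ... every other entry point, forwarded the same way ... */
} ApiTable;

static ApiTable real_api;   /* filled in with the driver's real entry points */

/* Pass-through wrappers: same signatures, no extra work. */
static int wedge_DrawPrimitive(void *ctx, const void *data, int count) {
    return real_api.DrawPrimitive(ctx, data, count);
}

static int wedge_CreatePixelShader(void *ctx, const unsigned *tokens, int len) {
    return real_api.CreatePixelShader(ctx, tokens, len);
}

/* Install the wedge: save the real table and hand out pass-throughs. */
void install_wedge(ApiTable *table) {
    real_api = *table;
    table->DrawPrimitive     = wedge_DrawPrimitive;
    table->CreatePixelShader = wedge_CreatePixelShader;
}

/* Benchmarking with this table installed, versus the real table, isolates
 * the cost of the indirection itself from the cost of whatever the real
 * anti-detection tool does on top of it. */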

Ditto & ditto! 8) Let others explore & mimic what UW actually did before jumping on the "cheat" bandwagon. At best this is just a theory & since we don't presently know what controls were in place & what was specifically 'disabled' ... :?

Good call Sharkfood. ;)
 
Here's another piece to the thread over there...
SAS_SIMON said:
Unwinder, I don't understand all the nitty-gritty as you do, and you have probably talked about this before, but I would like to know: do ATI's "cheats"* degrade quality for performance? If the answer is yes, then that is a pie in the face of that lying scumbag, the Catalyst guy; if not, who cares....
Unwinder said:
Just read the first post in this thread carefully:

Blocking the application detection code caused a dramatic performance drop in
3DMark2001/Nature on both NV (74->42 FPS) and ATI (66->42 FPS) boards. IQ in this test changed on both systems.
I cannot say that IQ is worse, it just looks different. Both NV and ATI seem to render the leaves on the trees in a different manner when 2K1/Nature is detected (NV seems to use a different vertex shader, because the geometry is noticeably shifted compared to the non-patched version).
 
At Rage3D:

StealthHawk said:
He is saying ATI recoded some PS1.0 program into PS2.0.....you're right, normally there SHOULDN'T be any PS2.0 in 3dmark2001

It's not true; at least those are not my words. I'm saying that I've identified one 1.1 and two 2.0 shader detections in the driver's code, and there is no relation between those shaders and 3DM2K1.

__________________
Alex Nicolaychuk aka Unwinder, RivaTuner creator
 
just me said:
Unwinder said:
It's not true; at least those are not my words. I'm saying that I've identified one 1.1 and two 2.0 shader detections in the driver's code, and there is no relation between those shaders and 3DM2K1.

Why would there be PS 1.1 and PS 2.0 detection mechanisms in the driver when the Nature test uses PS 1.0? Or are these app detections not for the Nature test?
 
demalion said:
Interesting. There are some significant things still missing, though, at least in the English commentary on the topic...

What kind of detection mechanism was defeated? Shader file name, or actual code analysis effort?

Only the checksum calculation code is corrected by the NVAntiDetector script. Luckily (for us, but not for NVIDIA, of course) they use a unified checksum calculation routine in many (if not all) application detection mechanisms. I located this code while investigating the 3DMark/3DMurk issue (basically I wanted to check which applications are also detected via the command line). So I found the code that calculates a 128-bit checksum (a 64-bit checksum for the executable name + a 64-bit checksum for the window caption) for application identification in the driver's context creation routine. This checksum is also referenced in D3DDP2OP_CREATEPIXELSHADER/D3DDP2OP_CREATEVERTEXSHADER and some other tokens. In both shader-related tokens the checksum is calculated from the tokenized shader code (BTW, if somebody can share the DX9-specific D3DDP2OP_ token names from the DX9 DDK with me, I would be happy. Checksum calculation is also referenced in some DX9-specific tokens, which I cannot identify yet).
So I simply created a small script that distorts the seed in the checksum generation code (this results in different checksums being generated and blocks all pixel/vertex shader/command line detections).
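In rough, invented pseudocode the mechanism looks something like this. The hash, the seed, the constants and the hardwired values below are all made up; only the overall shape (checksum the executable name and window caption, compare against values stored in the driver) follows what is described above.

Code:

#include <stdint.h>
#include <string.h>

/* Hypothetical seed inside the driver's checksum routine. Patching this one
 * constant changes every checksum the driver computes. */
static const uint64_t kSeed = 0xcbf29ce484222325ULL;

static uint64_t checksum64(const void *data, size_t len) {
    const unsigned char *p = data;
    uint64_t h = kSeed;
    for (size_t i = 0; i < len; i++)
        h = (h ^ p[i]) * 0x100000001b3ULL;   /* FNV-style mixing, for illustration */
    return h;
}

/* Hardwired checksums of a known application (values invented here). */
static const uint64_t KNOWN_EXE_SUM     = 0x1122334455667788ULL;
static const uint64_t KNOWN_CAPTION_SUM = 0x99aabbccddeeff00ULL;

int detect_application(const char *exe_name, const char *window_caption) {
    uint64_t a = checksum64(exe_name, strlen(exe_name));
    uint64_t b = checksum64(window_caption, strlen(window_caption));
    return a == KNOWN_EXE_SUM && b == KNOWN_CAPTION_SUM;   /* the 128-bit match */
}

/* Distorting kSeed means none of the hardwired checksums ever match again,
 * so one small patch disables every detection that goes through this routine,
 * including the shader-code and command-line checksums. */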

The ATIAntiDetector script is a bit more complicated. ATI use several different ways of detecting applications, so it's much more difficult to collect and block _all_ the application detection related pieces of code. At this time I was able to identify and block at least 8 application detection routines in the driver, but I cannot give you any guarantee that there are no more detections left (this applies to NVAntiDetector too). ATI don't use checksums for application identification; they simply hardwired the shaders they detect (in tokenized format) into the driver's binary. The input shader in D3DDP2OP_CREATEPIXELSHADER is simply compared with one of the hardwired shaders (I've found two 2.0 pixel shaders and one 1.1 pixel shader in the code). Some texture patterns are also stored in the driver, and they are used for some application identification in the D3DDP2OP_TEXBLT handler as well.
All these application detections result in storing a unique ID in an internal driver variable (a sort of application ID), which is zero by default. So in ATIAntiDetector I simply blocked all attempts to store anything different from zero into this variable.
The two ps 2.0 shader detections are also blocked by the script (the first 8 bytes of the hardwired shader samples (0xffff0200 + the first four bytes) are replaced with the 'SUCKERS' string :)). These detections don't result in appID identification, and they do modify the shader code in some way.
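Again in invented pseudocode, to show the mechanism only; the shader bytes, lengths and names below are made up, but the memcmp-against-hardwired-patterns comparison and the appID variable correspond to what is described above.

Code:

#include <stdint.h>
#include <string.h>

/* A tokenized pixel shader starts with a version token: 0xffff0200 for
 * ps_2_0, 0xffff0101 for ps_1_1. The rest of these patterns is invented. */
static const uint32_t known_ps11[]   = { 0xffff0101, 0x01234567 /* ... */ };  /* sets an appID */
static const uint32_t known_ps20_a[] = { 0xffff0200, 0xdeadbeef /* ... */ };  /* triggers shader edits */

static uint32_t g_app_id = 0;   /* zero by default = nothing detected */

static void modify_shader(uint32_t *tokens, size_t bytes) {
    (void)tokens; (void)bytes;  /* placeholder: the driver rewrites tokens here */
}

void on_create_pixel_shader(uint32_t *tokens, size_t bytes) {
    if (bytes >= sizeof(known_ps11) &&
        memcmp(tokens, known_ps11, sizeof(known_ps11)) == 0) {
        g_app_id = 1;                       /* application identified */
    } else if (bytes >= sizeof(known_ps20_a) &&
               memcmp(tokens, known_ps20_a, sizeof(known_ps20_a)) == 0) {
        /* the two 2.0 detections modify the shader instead of setting an ID */
        modify_shader(tokens, bytes);
    }
    /* ... create the shader as usual ... */
}

/* Blocking then amounts to two small patches: never let g_app_id become
 * non-zero, and corrupt the first 8 bytes of each stored pattern (the
 * 'SUCKERS' overwrite) so the memcmp above can never match. */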

demalion said:
Is it a tiered "turn on this optimizer functionality when you detect this application" approach or a tiered "replace this shader completely with this shader sequence stored in the driver"?

I cannot answer this question because I haven't studied the application-specific rendering code yet. I only studied the application detection paths and blocked them to see the real D3D performance; the changes the driver makes after detecting an application are a topic for further investigation.

demalion said:
What is the difference in image quality? Higher, lower? By what criteria? Since this is the most direct of the important issues, and I can hardly believe this won't be a focus of the article, I'm pretty confident that at least one of my questions will be answered in it. But I don't know why the others haven't been answered yet...the precedent of the major issues in IHV application detection has already been established.

Not better, not worse. Just *different* on both NV and ATI in 3DMark2K1/Nature. Judging by the differences I see on the screenshots, NVIDIA probably alters the vertex shaders that render the leaves on the trees (some of them are just rotated slightly compared to the original screenshot).
Differences on the ATI screenshot are less noticeable.
I'll try to grab screenshots and send them to Anthony in a few hours (currently I'm at the office with an NV17-based display adapter).

Finally, I'd like to make some comments about the current test results. NVAntiDetector hurts NV performance much more than ATIAntiDetector hurts ATI results. Currently ATIAntiDetector affected performance in both 3DMark2001 and 2003 (the performance drop in 2003 is similar to the result of installing the 330 patch).
NVAntiDetector caused a performance drop in a lot of 3D applications including UT2003, CodeCreatures, AquaMark etc. The performance drop in 3DMark2003 is not comparable to the 330 results; the results are way *lower*, so it seems like FM missed some detections:

Digit-Life results on FX5900 Ultra, 3DM2K3 330 (Detonator driver build - score):

4403 - 4806
4461 - 5996
NVAntiDetector + 4403 - 3198
NVAntiDetector + 4461 - 3920

Digit-Life results on FX5900 Ultra, 3DM2K3 320 (Detonator driver build - score):

4403 - 5850

To Anthony and Dave:

I've replied to emails from both of you with no response back. Did you receive them?
 
Sorry, I've been very busy finishing up a review. I'm gonna wade through a bunch of mails tonight.
 
You never PM'ed me back over at FM :devilish:, thanks for stopping in though. I didn't think you got my message.
 
Alexey, yes, I got your email.

You have very good driver disassembly skills :)

Anyway, I'm just interested to know when we can see your cheat-busting app make its debut in RivaTuner and even if it does, I'd really want it to have the same checks for ATI and NV boards. Otherwise, it's tough and can be viewed as "unfair" by folks.
 
That's more like it.

AFAICS...

For nVidia:

You've more thoroughly defeated application detection for 3dmark 03, further substantiated the additional cheats in 4403 beyond those defeated by the 330 patch, and uncovered a whole collection of other application detection routines using the checksum method you've identified (this last part is completely new ground, AFAIK). We need further analysis to determine what new mechanisms were circumvented, and whether they have potential validity outside of their context (i.e., whether they are cheats because of being applied only to benchmarks, cheats by simply lowering quality in a clearly defined manner for benchmarks, or cheats because of application detection that is useless for anything but benchmarking).

For the application detections, however, you've only enabled us to examine the validity of the application detection routines (which are not necessarily invalid in any regard) more thoroughly and effectively, to evaluate "cheat" or "optimization" on a case by case basis. The game application detection is not a bad thing in and of itself.

For 3dmark 2001, you begin to give some answer as to how nVidia could cheat besides changing precision, but it isn't yet clearly established (IMO, in the absence of screenshots) that they are doing something worse than what ATI did in 3dmark 03 (yes, the image quality is different here, whereas there was no difference for ATI in 3dmark 03, but I'm not aware of there being as rigid a standard of reference established for the 3dmark 2001 tests that disqualifies the verbal descriptions provided...anyone with contradicting information, feel free to correct me).
There is, however, the clipping plane question based on other nVidia behavior...considering the vertex shader modification used to achieve that in 3dmark 03, I think that warrants investigation. By "script", I assume you mean what I'd consider a "wrapper"...how sophisticated is it? Can you modify camera angles, and then check fps figures? Is there an "off the rail" version of 3dmark 2001? This also establishes an unsavory precedent for the application detections, but it seems clear that clipping plane adjustments aren't the only thing nVidia is doing, and there could be some completely valid things as well.


For ATI:

It sounds like you rediscovered the 3dmark 03 GT 4 shaders ATI has acknowledged ("water and sky") and found some new 1.1 shader. It would be nice to figure out what that goes to...you're asking for the token lookup to examine the code, right? Hopefully, you'll share that here for some discussion.

Again, the application detections are not necessarily undesirable unless they are specific to benchmarks alone (because of their non general nature). Come to think of it, maybe you should check the interaction of "OptimizePVSCode" and "OptimizeTexStages" registry entries (under the "dxhal" registry path) with the detections.

I'm not too familiar with the rules for 3dmark 2001 on this, so the details of the image quality changes become more important for determining what degree of cheating (beyond it being bad to target benchmarks specifically) is occurring. For example, I'm not aware that you can clearly say "different" is "bad" wrt 3dmark 2001 output with any clearly established criteria...it depends on how stringent the standards are compared to those established for 3dmark 03.
I'm not aware of ATI having done any clipping plane futzing, so "invalidly activated valid optimizations" seems likely...but there are other possible methods of cheating that can be "invalid optimizations" in all contexts as well, even if not as clearly as the "off the rail" behavior of nVidia.

...

If you have a wrapper or rewrite the driver in memory or on disk, logging of activation of each cheat would be very handy for investigation, or some other way of alerting an investigator that a detection has been activated.
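Something as simple as the following would already do; the variable and function names are hypothetical, it just illustrates the kind of activation log I mean (e.g. watching the internal application-ID variable Unwinder described):

Code:

#include <stdio.h>
#include <stdint.h>

static uint32_t g_app_id = 0;          /* 0 = no detection fired */
static uint32_t g_last_logged_id = 0;

/* Called from wherever the driver would normally store a new app ID. */
void set_app_id_logged(uint32_t new_id, const char *where) {
    g_app_id = new_id;
    if (new_id != 0 && new_id != g_last_logged_id) {
        FILE *log = fopen("antidetect.log", "a");
        if (log) {
            fprintf(log, "detection fired: appID=%u in %s\n", new_id, where);
            fclose(log);
        }
        g_last_logged_id = new_id;
    }
}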

This functionality would be very handy, as independent determination and/or evaluation is important for issues like these...share as soon as possible. ;)
 
Ok, per the original translation in this thread: http://www.beyond3d.com/forum/viewtopic.php?t=6377

At least 3 pixel shaders are detected (1@1.1 and 2@2.0), plus texture detections (that's how GT4 in 3dmark2001 is detected) ....
Full texture patterns & shader code are in the driver, and although it's easy to see what they do, it's very hard to make a FULL anti-detect patch.....
.... what ATi make with these 2 p.sh. 2.0 is hardly "shifting" ......
....

So, yeah. I thought he was talking about the pixel shaders being replaced in 3dmark2001 based on comments in said thread, but apparently that is incorrect :oops:

Ok, so I have to ask this question then, why did people say ATI was replacing shader programs with a higher version of shader program? Did Unwinder say this somewhere, or is this total speculation that someone on these boards came up with?

Also, for clarification, ATI is detecting 3 different shader programs that hypothetically already exist in some program, is that correct? Or is it that Unwinder has detected 3 hardwired shader programs in the ATI drivers, and these shaders are replacing shaders used in some application (but not 3dmark2001)?

Where does the "what ATi make with these 2 p.sh. 2.0 is hardly 'shifting'" comment come into play? How exactly do we know that?
 
Reverend said:
Alexey, yes, I got your email.

You have very good driver disassembly skills :)

Anyway, I'm just interested to know when we can see your cheat-busting app make its debut in RivaTuner and even if it does, I'd really want it to have the same checks for ATI and NV boards. Otherwise, it's tough and can be viewed as "unfair" by folks.

Unfortunately I don't know when DL will prepare the review. If you wish, I can give you both scripts for testing right now; email me for details.
And unfortunately it's impossible to provide the same checks for both IHVs, because the "cheat busting" is based on driver code analysis, so I could simply miss something in either of the drivers.

BTW I've just uploaded the 3DMark2001/Nature screenshots on both NVIDIA and ATI boards (original Detonator/Catalyst drivers and patched with AntiDetector scripts) on www.nvworld.ru:

http://www.nvworld.ru/temp/antidetect.rar

The file is about 5MB and our server is rather slow, so if anybody can upload it to other server it would be really nice.
 
demalion said:
By "script", I assume you mean what I'd consider a "wrapper"...how sophisticated is it? Can you modify camera angles, and then check fps figues? Is there a "off the rail" version of 3dmark 2001? This also establishes an unsavory precedent for the application detections, but it seems clear that clipping plane adjustments aren't the only thing nVidia is doing, and there could be some completely valid things as well.

Nope, I don't mean a wrapper; I mean an .RTS (patch script) file, which can be executed by RivaTuner. It just slightly modifies the driver's binary to block the code I was talking about.
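Conceptually such a patch boils down to finding a short, unique byte pattern in the driver binary and overwriting it. The patterns below are placeholders, not real Detonator or Catalyst code, and an actual .RTS script is of course not C; this is just to show the idea:

Code:

#include <stdio.h>
#include <string.h>

/* Hypothetical: bytes around the checksum seed (or an appID store)... */
static const unsigned char pattern[]     = { 0x8B, 0x45, 0x08, 0x35, 0x78, 0x56, 0x34, 0x12 };
/* ...and the same bytes with the constant zeroed out. */
static const unsigned char replacement[] = { 0x8B, 0x45, 0x08, 0x35, 0x00, 0x00, 0x00, 0x00 };

int patch_file(const char *path) {
    FILE *f = fopen(path, "r+b");
    if (!f) return -1;

    /* Naive sliding-window search; slow but fine for a sketch. */
    unsigned char buf[sizeof(pattern)];
    long off = 0;
    while (fseek(f, off, SEEK_SET) == 0 &&
           fread(buf, 1, sizeof(buf), f) == sizeof(buf)) {
        if (memcmp(buf, pattern, sizeof(pattern)) == 0) {
            fseek(f, off, SEEK_SET);
            fwrite(replacement, 1, sizeof(replacement), f);
            fclose(f);
            return 0;            /* patched */
        }
        off++;
    }
    fclose(f);
    return 1;                    /* pattern not found */
}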
 
StealthHawk said:
Ok, so I have to ask this question then, why did people say ATI was replacing shader programs with a higher version of shader program? Did Unwinder say this somewhere, or is this total speculation that someone on these boards came up with?

Those are not my words ;)

StealthHawk said:
Also, for clarification, ATI is detecting 3 different shader programs that hypothetically already exist in some program, is that correct?

Correct

StealthHawk said:
Or is it that Unwinder has detected 3 hardwired shader programs in the ATI drivers, and these shaders are replacing shaders used in some application (but not 3dmark2001)?

Incorrect. The driver compares the submitted shader code with these hardwired shaders during shader creation.

StealthHawk said:
Where does the "what ATi make with these 2 p.sh. 2.0 is hardly 'shifting'" comment come into play? How exactly do we know that?

Probably it's a rough translation of my posting on iXBT. The driver does alter two 2.0 pixel shaders, as ATI stated, but I can hardly say that it's instruction shuffling. To me that code looks like partial shader replacement (a series of AND'ing/OR'ing/XOR'ing of tokens). Anyway, this question requires further investigation too.
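Purely as a guess at what such code could amount to (the masks and layout below are invented, not taken from the driver): applying per-token AND/OR/XOR masks rewrites parts of a tokenized shader in place, which is why it looks more like partial replacement than reordering.

Code:

#include <stdint.h>
#include <stddef.h>

struct token_edit {
    size_t   index;     /* which token in the shader to touch */
    uint32_t and_mask;  /* bits to keep  */
    uint32_t or_mask;   /* bits to force */
    uint32_t xor_mask;  /* bits to flip  */
};

/* Apply a list of edits to a detected shader before it is compiled. */
void rewrite_shader(uint32_t *tokens, size_t count,
                    const struct token_edit *edits, size_t n_edits) {
    for (size_t i = 0; i < n_edits; i++) {
        if (edits[i].index >= count)
            continue;
        uint32_t t = tokens[edits[i].index];
        t = (t & edits[i].and_mask) | edits[i].or_mask;
        t ^= edits[i].xor_mask;
        tokens[edits[i].index] = t;   /* e.g. swap a register or drop a modifier */
    }
}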
 
Unwinder said:
Currently ATIAntiDetector affected performance in both 3DMark2001 and 2003 (the performance drop in 2003 is similar to the result of installing the 330 patch).

So what you are saying is that the performance using the ATIAntiDetector is the same as using the 330 patch. So ATI didn't do more optimizations/cheats than have already been exposed?
 
Unwinder said:
BTW I've just uploaded the 3DMark2001/Nature screenshots on both NVIDIA and ATI boards (original Detonator/Catalyst drivers and patched with AntiDetector scripts) on www.nvworld.ru:

http://www.nvworld.ru/temp/antidetect.rar

The file is about 5MB and our server is rather slow, so if anybody can upload it to other server it would be really nice.

Hrmh... those frames are supposed to be the same on ATI and NVIDIA. Though there is a difference between with and without the cheat script, there is also a large difference between ATI and NVIDIA. Actually, ATI seems to drop to a lower mip level much sooner than NVIDIA. Did you use the same settings for both cards, or did NVIDIA have Aniso enabled or something?

LOD tricks can help performance quite a lot, so I would worry more about such a large change than about small pixel-based changes.

K-
 
Kristof said:
Actually, ATI seems to drop to a lower mip level much sooner than NVIDIA. Did you use the same settings for both cards, or did NVIDIA have Aniso enabled or something?

I grabbed screenshots with SoftR9500 Pro today, and the ATI cards definitely use default driver settings. I cannot comment on the settings on the NV cards; probably AF was turned on when the screenshots were grabbed. I'll install a Ti4600 and grab the screenshots once more in the evening.
Anyway, your question can be answered right now if somebody has a Ti4600 and can compare original screenshots w/ and w/o AF with nv.bmp from this archive.
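If it helps, a trivial byte-for-byte comparison of two same-resolution, same-format BMP files is already enough to see whether they match and roughly how much differs. This is just a quick sketch, not a proper image-diff tool:

Code:

#include <stdio.h>

int main(int argc, char **argv) {
    if (argc != 3) {
        fprintf(stderr, "usage: %s a.bmp b.bmp\n", argv[0]);
        return 2;
    }
    FILE *a = fopen(argv[1], "rb"), *b = fopen(argv[2], "rb");
    if (!a || !b) {
        fprintf(stderr, "cannot open input files\n");
        return 2;
    }
    long total = 0, diff = 0;
    int ca, cb;
    while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
        total++;
        if (ca != cb) diff++;
    }
    fclose(a);
    fclose(b);
    printf("%ld of %ld bytes differ (%.2f%%)\n",
           diff, total, total ? 100.0 * diff / total : 0.0);
    return diff != 0;
}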
 
Unwinder said:
4403 - 4806
4461 - 5996
NVAntiDetector + 4403 - 3198
NVAntiDetector + 4461 - 3920

Digit-Life results on FX5900 Ultra, 3DM2K3 320 (Detonator driver build - score):

4403 - 5850
These are the first 4461 + GFFX + 3dmark03 330 results I've seen. That means NVIDIA put all the, umm, "optimizations" right back in there and just "fixed" the detection (and I'm sure they now have a detection for when the camera goes "off the rail"). I really think Futuremark should do something about this (at least if the next WHQL'd driver contains the optimizations too), since they have clearly stated that application-specific optimizations are not allowed.
Well, at least the driver seems to have some quite significant performance improvements which are legit too. Though that 3920 score is still nothing to write home about - at least it beats a 9600 Pro...
 
mczak said:
These are the first 4461 + GFFX + 3dmark03 330 results I've seen.

Hmnn, last I heard nobody with a 5900 was able to publish results on the ORB.

EDIT: Seems they can now, but the server is down for maintenance & I can't have a bit of fun looking at the individual test results.
 