OMG HARDOCP REVIEW OF UT2003 AND FILTERING

Sorry, should've been more elaborate. Yes, I do know exactly what the code change does and I don't think he was/is aware of the side effects on performance. My "useless" statement was with regard to performance and not image quality. I should've been a bit more verbose about that. What I meant to say is that using AntiDetector is useless for performance comparisons.

-- Daniel, Epic Games Inc.

FUDie said:
Is that what you think is going on here or is that what NVIDIA is telling you? Have you examined Unwinder's code? Do you understand what he is doing?
 
vogel said:
Maybe because AntiDetector is useless? It just shows that you can make the driver perform worse by changing a couple of bytes. Big surprise there - might as well have inserted a sleep(1) while messing with it ;-)

-- Daniel, Epic Games Inc.

Following up on Doom's question, your description sounds very much like the patch FutureMark did for 3DMark03, in which a "couple of bytes" changed in a recompile completely corrected nVidia's clip planes, buffer overrun artifacts, and unrendered frame segments--while it also dropped the performance of the hardware running the benchmark by 30%. According to one description by nVidia (there were several), these artifacts were originally caused by a "driver bug" which the 3DMark03 recompile patch was miraculously able to cure simply by "moving a couple of bytes"--no assistance from nVidia required. Of course, the '03 recompile patch had but a single purpose--to defeat the driver's detection routines for the benchmark--which it seems to have accomplished handily.

But as to anti-detector I guess your position is that it doesn't work, even though oddly enough many of the ATi driver scores are unaffected by it....?

Edit: Following up:

vogel said:
...My "useless" statement was with regard to performance and not image quality. I should've been a bit more verbose about that. What I meant to say is that using AntiDetector is useless for performance comparisions.

Well, how do you separate IQ from performance, exactly? I think Doom's point was that using the anti-detector was the only way to enable a certain level of IQ from the Dets--an IQ level which should ordinarily be available (without having to use anti-detection). Generally speaking, when you increase IQ you lower performance, no anti-detection code required. So it would appear to me that anti-detection in that case did precisely what it was intended to do: the driver simply ran the game as it would run any other application for which no performance optimization was coded into it. Hence, there was no difficulty in getting the proper IQ--but only after anti-detector was run. The drop in performance comes from the increased IQ--not from the anti-detector code.
 
vogel has a good point regarding performance comparisons, as the Anti-detector patch disables both "valid optimisations" (in the Sweeney/Carmack definition, ie faster result for identical output) as well as "cheats^H^H^H^H^H^Hoptimisations" (in the FutureMark non-definition).
 
vogel said:
Sorry, should've been more elaborate. Yes, I do know exactly what the code change does and I don't think he was/is aware of the side effects on performance. My "useless" statement was with regard to performance and not image quality. I should've been a bit more verbose about that. What I meant to say is that using AntiDetector is useless for performance comparisons.
Why? If you are CPU limited, then I can see how Unwinder's code could *possibly* hurt performance, but if you are graphics card limited, I don't see how this can be so, except for the fact that you are forcing different rendering (i.e. the requested trilinear instead of bilinear). What I am saying is that the reason performance is different with Unwinder's program is not because of the program itself but because things are being rendered as they should be. Of course there's a performance difference: the application's settings wouldn't have been overridden if there weren't, right?

-FUDie
 
My point is that it doesn't work as intended and that you therefore can't draw any performance conclusions from it. I don't really believe in any driver hacks like this, as unless you have the source you don't really know what's going on behind your back, and you don't really know what you are changing if you just look at the assembly.

It's hard to make a serious point if you use a hacked driver, and there certainly are better ways to investigate issues. I'm surprised I haven't seen many people using small unlit cube test maps with a basic single-texture material applied to them, as this sounds like a much more scientific approach to me.

-- Daniel, Epic Games Inc.
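
For concreteness, here is a minimal, self-contained sketch of the kind of controlled test being suggested above - not Epic's tooling, just plain OpenGL 1.x with GLU and GLUT (assumed available): one unlit surface with a single checkerboard texture, trilinear filtering requested through the API, and a rough frames-per-second readout. Run the same scene under a renamed executable (or with and without a driver's application profile active) and compare the numbers and the image.

Code:
// Minimal sketch of an unlit, single-textured filtering/fill-rate test.
// Assumes OpenGL 1.x, GLU and GLUT; build with e.g.
//   g++ filtertest.cpp -o filtertest -lglut -lGLU -lGL
// Disable vsync in the driver so the fps readout is not capped.
#include <GL/glut.h>
#include <cstdio>
#include <vector>

static int g_frames = 0;
static int g_lastMs = 0;

static void initTexture()
{
    // 256x256 greyscale checkerboard with GLU-generated mipmaps.
    std::vector<unsigned char> tex(256 * 256 * 3);
    for (int y = 0; y < 256; ++y)
        for (int x = 0; x < 256; ++x) {
            unsigned char c = ((x / 16 + y / 16) & 1) ? 255 : 32;
            unsigned char* p = &tex[(y * 256 + x) * 3];
            p[0] = p[1] = p[2] = c;
        }
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR); // ask for trilinear
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, 256, 256, GL_RGB, GL_UNSIGNED_BYTE, &tex[0]);
    glEnable(GL_TEXTURE_2D);
}

static void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    // One unlit quad stretching away from the camera so every mip level gets used.
    glBegin(GL_QUADS);
        glTexCoord2f(0.f,  0.f); glVertex3f(-2.f, -1.f,   -1.f);
        glTexCoord2f(8.f,  0.f); glVertex3f( 2.f, -1.f,   -1.f);
        glTexCoord2f(8.f, 64.f); glVertex3f( 2.f, -1.f, -200.f);
        glTexCoord2f(0.f, 64.f); glVertex3f(-2.f, -1.f, -200.f);
    glEnd();
    glutSwapBuffers();

    ++g_frames;
    const int now = glutGet(GLUT_ELAPSED_TIME);
    if (now - g_lastMs >= 1000) {                        // report roughly once per second
        std::printf("%.1f fps\n", g_frames * 1000.0 / (now - g_lastMs));
        g_frames = 0;
        g_lastMs = now;
    }
    glutPostRedisplay();                                 // render continuously
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(1024, 768);
    glutCreateWindow("filter test");
    initTexture();
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    gluPerspective(60.0, 1024.0 / 768.0, 0.1, 500.0);
    glMatrixMode(GL_MODELVIEW);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}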

WaltC said:
But as to anti-detector I guess your position is that it doesn't work
 
vogel said:
Sorry, should've been more elaborate. Yes, I do know exactly what the code change does and I don't think he was/is aware of the side effects on performance. My "useless" statement was with regard to performance and not image quality. I should've been a bit more verbose about that. What I meant to say is that using AntiDetector is useless for performance comparisons.

-- Daniel, Epic Games Inc.


No offense, but it's likely that you don't know what the code change is doing.

The app detection in NV drivers seems to occur at context-create time and/or shader-create time. All the script is doing is preventing a "positive" app/shader detect from happening.

Basically you are running the driver in fully API compliant mode when you run the script.

Granted, some of the app detects and shader detects are being used for working around application or hardware issues. But that doesn't make the script invalid for performance measurements.
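
For anyone unfamiliar with what a "positive app detect at context-create time" means in practice, here is a purely illustrative sketch - not actual NVIDIA driver code; the executable names and the Profile fields are invented for the example - of the general mechanism being described: look up the running executable when the context is created and, on a hit, switch to an app-specific profile instead of the generic API-compliant path.

Code:
// Illustrative only: a caricature of per-application detection at context
// creation. The executable names and the Profile fields are invented for the
// example; real drivers are obviously far more involved.
#include <map>
#include <string>

struct Profile {
    bool forceBilinearOnDetailTextures;   // hypothetical per-app override
    bool useReplacementShaders;           // hypothetical per-app override
};

static const std::map<std::string, Profile>& profileTable()
{
    static const std::map<std::string, Profile> table = {
        { "ut2003.exe",   { true,  false } },
        { "3dmark03.exe", { false, true  } },
    };
    return table;
}

// Called once when the rendering context is created. If the lookup never
// produces a hit (which is what an anti-detection patch aims for), only the
// generic, API-compliant path below is ever taken.
Profile selectProfile(const std::string& exeName)
{
    auto it = profileTable().find(exeName);
    if (it != profileTable().end())
        return it->second;                // positive detect: app-specific behaviour
    return Profile{ false, false };       // default: render exactly what the app asked for
}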
 
One example from our engine - we have a hashing function as well and if you were to mess with it you'd end up uploading resources all the time.

-- Daniel, Epic Games Inc.
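
As a rough illustration of the resource-hash cache Vogel describes (an assumed shape, not Epic's actual code; hashValue() here is a stand-in FNV-1a hash): uploads are keyed by a hash of the data, so an unchanged resource is never re-uploaded.

Code:
// Sketch of a resource-hash upload cache; hashValue() is a stand-in FNV-1a
// hash, not whatever hash the engine actually uses.
#include <cstddef>
#include <cstdint>
#include <unordered_map>

static std::uint64_t hashValue(const void* data, std::size_t size)
{
    const unsigned char* p = static_cast<const unsigned char*>(data);
    std::uint64_t h = 14695981039346656037ull;           // FNV offset basis
    for (std::size_t i = 0; i < size; ++i) {
        h ^= p[i];
        h *= 1099511628211ull;                           // FNV prime
    }
    return h;
}

class ResourceCache {
public:
    // Returns true if an upload actually happened.
    bool uploadIfChanged(int resourceId, const void* data, std::size_t size)
    {
        const std::uint64_t h = hashValue(data, size);
        auto it = lastUploadedHash_.find(resourceId);
        if (it != lastUploadedHash_.end() && it->second == h)
            return false;   // identical data already resident: skip the upload
        // ...the actual driver/API upload (glTexImage2D, a D3D Lock/Unlock, ...) would go here...
        lastUploadedHash_[resourceId] = h;
        return true;
    }
private:
    std::unordered_map<int, std::uint64_t> lastUploadedHash_;
};

If a binary patch broke hashValue() so that the same data no longer hashed to the same value, uploadIfChanged() would return true every time and everything would be re-uploaded - which is the kind of side effect being warned about, and why patching a driver can slow it down without any "cheat" being involved.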

FUDie said:
but if you are graphics card limited, I don't see how this can be so
 
reever said:
" will not sit here and watch people slander HardOCP with our own bandwidth. There are places they can go and be welcomed to do that. Rage3D will allow you to do it all day long and B3D has a pretty good track history there as well. The mods will let you say pretty much anything and launch all sorts of vicious attacks that are personal and make all sorts of outlandish statements. "

Funny how he bans Wavey and then subsequently makes comments like these - at least there's an open house here. Of course, if he continues to ban people from voicing their opinions when they disagree with him, naturally they are going to want to vent elsewhere.
 
vogel said:
My point is that it doesn't work as intended and that you therefore can't draw any performance conclusions from it. I don't really believe in any driver hacks like this, as unless you have the source you don't really know what's going on behind your back, and you don't really know what you are changing if you just look at the assembly.

It's hard to make a serious point if you use a hacked driver, and there certainly are better ways to investigate issues. I'm surprised I haven't seen many people using small unlit cube test maps with a basic single-texture material applied to them, as this sounds like a much more scientific approach to me.
But this completely fails to work because the changes are being made per application - a generic test map wouldn't trigger the app-specific code paths.

-FUDie
 
Mr Vogel, would you mind pointing out some specific issues you see/anticipate with Unwinder's AntiDetect "hack"?

I am with you on the statement that one cannot use it for realistic performance comparisons; however, you must also admit that higher IQ will be reflected in performance, and as such, some of the performance drop is most surely coming from the increased IQ (else, why would nvidia lower it in the first place?).

I also don't quite understand your comment about a simple cube test map being a more scientific approach - an approach to what? I don't understand.
 
vogel said:
One example from our engine - we have a hashing function as well and if you were to mess with it you'd end up uploading resources all the time.
I can buy that. But what I can't buy is how said "hashing function" can result in lower image quality per application without a deliberate act on the part of the coder. Also, benchmarking applications are the ones where the biggest changes in performance were noted and I don't think that's coincidental.

-FUDie
 
vogel said:
My point is that it doesn't work as intended and that you therefore can't draw any performance conclusions from it. I don't really believe in any driver hacks like this, as unless you have the source you don't really know what's going on behind your back, and you don't really know what you are changing if you just look at the assembly.

It's hard to make a serious point if you use a hacked driver, and there certainly are better ways to investigate issues. I'm surprised I haven't seen many people using small unlit cube test maps with a basic single-texture material applied to them, as this sounds like a much more scientific approach to me.

-- Daniel, Epic Games Inc.

But DV, the subject of concern here is an IHV who hacks his drivers to provide substandard IQ simply to inflate benchmark framerate scores. (Let's not even discuss the IHV LOD and AF hacks done to accomplish the same thing.) The inability of the end user to select full trilinear filtering in UT2K3 (detail textures included) with the latest Dets is the result of IHV driver hacking--and has nothing to do with anti-detection. In fact, IHV driver hacks of this type not only predate UW's anti-detector but are the primary reason you see "anti-detection" software at all these days. Right? I mean, let's not pretend that an IHV can't hack his drivers--and hack them better than anyone else...;)

That's why I brought up the issue of how the ATi drivers seem almost unaffected by the anti-detector code. Nobody is saying that anti-detection software is the "perfect" approach--of course it's not. But right now it's about the only approach to ferret out whether or not an IHV is hacking his drivers to return inflated benchmark scores that are incongruent with average-case 3D game performance as supported in those drivers.

I agree that other, better methods are needed. One thing I can think of that would really be nice is to see Epic code its own "anti-cheat" checks directly into the game engine, so that if the game calls for full trilinear, either the driver provides it or the game doesn't run (until you set it to bilinear). That would be nice--but very difficult to do, I'm sure. But there it is. If developers were doing their part we wouldn't have to worry about UW's anti-detection scripts...right?...;) This is basically nothing more than an effort being made by people to force a level of truth in advertising that some vested interests in the industry seem reluctant to provide. No mystery here.
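
As a rough sketch of what such an in-engine check could look like (an illustration only, assuming an OpenGL 1.x context is already current - not something Epic or any IHV actually ships): give every mip level a distinct solid colour, draw a receding textured surface, and read back a column of pixels. Genuine trilinear produces blends between the level colours; a bilinear substitute produces only the pure level colours.

Code:
// Illustration of a coloured-mipmap check. Assumes a current OpenGL 1.x
// context and a receding textured surface like the one in the earlier sketch.
#include <GL/gl.h>
#include <vector>

// Give every mip level of a 64x64 texture a distinct flat colour.
void uploadColouredMipChain()
{
    const unsigned char colours[6][3] = {
        {255,0,0}, {0,255,0}, {0,0,255}, {255,255,0}, {255,0,255}, {0,255,255}
    };
    int size = 64;
    for (int level = 0; size >= 1; ++level, size /= 2) {
        std::vector<unsigned char> pixels(size * size * 3);
        for (int i = 0; i < size * size; ++i) {
            pixels[i * 3 + 0] = colours[level % 6][0];
            pixels[i * 3 + 1] = colours[level % 6][1];
            pixels[i * 3 + 2] = colours[level % 6][2];
        }
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, &pixels[0]);
    }
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

// After drawing the receding surface, read back one column of the framebuffer
// and look for pixels that blend two adjacent level colours. With flat-coloured
// levels, bilinear-only filtering can only ever produce the pure level colours.
bool looksLikeTrilinear(int x, int height)
{
    std::vector<unsigned char> strip(height * 3);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(x, 0, 1, height, GL_RGB, GL_UNSIGNED_BYTE, &strip[0]);
    for (int y = 0; y < height; ++y) {
        const unsigned char r = strip[y * 3 + 0];
        const unsigned char g = strip[y * 3 + 1];
        const unsigned char b = strip[y * 3 + 2];
        // e.g. (178, 77, 0) can only come from blending the red and green levels.
        if ((r > 16 && r < 240) || (g > 16 && g < 240) || (b > 16 && b < 240))
            return true;
    }
    return false;
}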
 
So what are we saying about the anti-cheat detector, exactly?

It removes all app-specific code paths and makes all cards run the standard, API-compliant rendering path?

It makes little or no difference when running on ATi cards.

It makes a huge difference to nVidia cards.

My real questions are -

Is this programme biased in any way towards ATi cards?
Is anybody here competent to know exactly why we see these results?
If they are, can they prove it?
 
The anti-detector is not biased; it's just that one IHV (starts with an N and ends with an A) has 70 references in its drivers for 'optimizations'.
I can't see how anyone can argue against using those scripts: without them, Nvidia users could not get full trilinear AF - something the 44.03 driver is supposed to deliver when 'quality' is selected.
 
CorwinB said:
vogel has a good point regarding performance comparisons, as the Anti-detector patch disables both "valid optimisations" (in the Sweeney/Carmack definition, ie faster result for identical output) as well as "cheats^H^H^H^H^H^Hoptimisations" (in the FutureMark non-definition).

This is definitely a valid point, and also applies to game-bug fixes coded to require game-recognition to function. These would be disabled as well.

But you know, nVidia hacking its drivers to substitute a performance trilinear mode for full trilinear, even when UT2K3 calls for full trilinear, was something I did not anticipate. With no corrective action coming from nVidia, or from the game developers of the affected software, what else can you do besides something like "Anti-Detector"...? As DV points out, there are other ways to investigate the problem apart from AD, but none of them that I'm aware of does anything to solve the problem in actual game play as AD does, imperfect as it is. Perhaps if game developers "develop" more of a sense for what their markets want they'll be of more assistance here. The relationship between 3D IHVs and 3D game developers is certainly symbiotic. But both of them depend more on their customer base than they do on each other. It may take developer pressure on nVidia to correct their present course--as nVidia seems fairly deaf to its (potential) customer base on these issues. I'll tell you, I haven't seen conduct like this on the part of an IHV since I bought the V1 long years ago. I've never seen the like.
 
vogel said:
One example from our engine - we have a hashing function as well and if you were to mess with it you'd end up uploading resources all the time.

Good point... But the impact of disabling the hash detect seems to be higher in HW-limited cases (with 4xAA / 8x aniso) than in the default cases... So I seriously doubt the performance degradation is coming from inefficient resource loading.

Also, the fact that trilinear works fully in UT2003 and the shaders in 3DMark03 look like higher precision indicates that this hash function is being used for a lot more than optimizing away redundant resource loading.
 
WaltC said:
This is definitely a valid point, and also applies to game-bug fixes coded to require game-recognition to function. These would be disabled as well.

But you know, nVidia hacking its drivers to substitute a performance trilinear mode for full trilinear, even when UT2K3 calls for full trilinear, was something I did not anticipate. With no corrective action coming from nVidia, or from the game developers of the affected software, what else can you do besides something like "Anti-Detector"...?

I agree, the driver dropping quality in spite of what the app asks for is, shall we say, a "convenient bug". But, while we should never again take any benchmark number at face value, we should also be very careful when toying with things like the Anti-Detector, and not always conclude "the result drops by 30% with AD enabled, hence the increase came from cheats only" (an increase may come from "cheats", or from genuine optimizations, or from bug fixes/workarounds...). Only the combination of testing various drivers, AD testing, and actual IQ analysis (with and without AD) can be of any help. If anything, this whole cheating/optimizing fiasco puts even more burden on reviewers. On the positive side, perhaps it will be a bit easier to distinguish good reviewers from "PR specialists".
 
WaltC said:
changed in a recompile completely corrected nVidia's clip planes, buffer overrun artifacts, and unrendered frame segments

Sorry WaltC, but you keep saying this, and I think if there were buffer overruns it would more likely be crashing the video card or CPU. The artifacts come from a failure to clear the buffers, not from writing past the end of a buffer.
 
I'm not familiar with UT2K3, but wouldn't it be possible to force bilinear (on purpose) in the driver or app, benchmark it, do it again with the same settings but with Unwinder's script running, and then compare the results to see if there's a difference in fps? That way you could isolate the performance difference between forcing trilinear via the script and any adverse effects the script may have on legitimate optimizations. Would this work?
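
The arithmetic behind that proposal is straightforward; here is a small sketch with placeholder numbers (the fps values are invented and would be replaced by real measurements from the three runs):

Code:
// Placeholder fps numbers; substitute real measurements from the three runs.
#include <cstdio>

int main()
{
    const double fpsBilinearStock   = 100.0; // bilinear forced, stock driver            (placeholder)
    const double fpsBilinearScript  =  98.0; // bilinear forced, AntiDetector applied    (placeholder)
    const double fpsTrilinearScript =  80.0; // trilinear honoured, AntiDetector applied (placeholder)

    // Filtering is identical in the first two runs, so any gap there can only be
    // script side effects (disabled bug workarounds, broken caching, etc.).
    const double scriptOverheadPct =
        100.0 * (fpsBilinearStock - fpsBilinearScript) / fpsBilinearStock;

    // With the script held constant, the remaining gap is the cost of actually
    // doing the filtering the application asked for.
    const double filteringCostPct =
        100.0 * (fpsBilinearScript - fpsTrilinearScript) / fpsBilinearScript;

    std::printf("script overhead: %.1f%%, filtering cost: %.1f%%\n",
                scriptOverheadPct, filteringCostPct);
    return 0;
}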
 