Cheating and its implications

Status
Not open for further replies.
boobs,

I expect that whispers of UT 2k3 in association with this 3dmark03 issue might have something to do with analysis of these files already, if something can be definitively established by analyzing them. At least, if the issue is actually reflected in UT 2k3 timedemos.
 
Dammit, OGL Guy, do I have to spell it out for you with a graph?

Amusement Meter:
Truth [-----|----X] Conspiracy Theory
Reflection [-----|----X] Impulse

YeuEmMaiMai, not to belabor the point, but Japan has its share of questionable automakers. Both Mitsubishi and Nissan were in financial trouble recently, IIRC (though it may have been only the former who got into it by lying, I'm not sure). Whee, everyone has problems, big deal. Now get back to the serious discussion of Drivers Gone Wild!
 
The driver core is 6.3MB, and when you expand it it is 7MB, so I decided to take a look at what it all is.

4.3MB of driver files for ALL OF THE RADEON CARDS

setup 136 KB
setup.bmp 302 KB
setup.inx 144 KB
data1.zip 1.17 MB
ikernel.exe 331 KB
driver.dll 88 KB
driver.dat 6 KB

Wow, certainly looks like they are hiding something in there alright....

so I definitely can account for ATI's files

BTW, if you install the control panel and the drivers, you wind up with about 14MB of stuff, NOT 20MB
 
You guys should stop inserting your own preconceived notions and take statements at face value.

I downloaded the drivers yesterday to see if it fixed my "display goes whacky when changing the forced AA setting" problem (it didn't by the way) and noticed they were 20 meg total.

Demalion says "hey, analyzers would add code bloat"

And I say "well, with 20 meg, you could put analyzers in there and not be noticed".

I did not say, or imply, that there ARE analyzers in there. Not with a winky, or a nudge, or even vaguely. You thought that, not me.
 
demalion said:
boobs,

I expect that whispers of UT 2k3 in association with this 3dmark03 issue might have something to do with analysis of these files already, if something can be definitively established by analyzing them. At least, if the issue is actually reflected in UT 2k3 timedemos.

If somebody played through UT2K3 on the two most recent Nvidia drivers, got that type of data, and showed that the performance increase in the timedemo is not reflected in actual gameplay, then that would be damning.

I'd expect heads to roll at Nvidia for something like that.
 
RussSchultz said:
And I say "well, with 20 meg, you could put analyzers in there and not be noticed".

Actually, that's not completely what you said. You said that you downloaded ATI's drivers....and after that said that you could put that stuff in there.

In short, I can see how you didn't mean it in a negative way, but I can also see how it was interpreted that way.

Had you said "It's common for Driver builds to be 20 megs or more..." then there would be no room for misinterpretation.

Just clearing the air. ;)
 
If somebody played through UT2K3 on the two most recent Nvidia drivers, got that type of data, and showed that the performance increase in the timedemo is not reflected in actual gameplay, then that would be damning.

I'd expect heads to roll at Nvidia for something like that.


I wouldn't expect it. I would however expect a lot of people making excuses for them.
 
OGL Guy works for ATi, and I do not blame him for trying to stop someone from insinuating that ATi is placing "inappropriate" software into their drivers.

Well, Nissan and Mitsubishi aren't exactly known for building the same calibre of cars as Toyota or Honda. Most people rank them the same quality as American cars.

Pete said:
Dammit, OGL Guy, do I have to spell it out for you with a graph?

Amusement Meter:
Truth [-----|----X] Conspiracy Theory
Reflection [-----|----X] Impulse

YeuEmMaiMai, not to belabor the point, but Japan has its share of questionable automakers. Both Mitsubishi and Nissan were in financial trouble recently, IIRC (though it may have been only the former who got into it by lying, I'm not sure). Whee, everyone has problems, big deal. Now get back to the serious discussion of Drivers Gone Wild!
 
YeuEmMaiMai said:
Wow, certainly looks like they are hiding something in there alright....

so I definitely can account for ATI's files

BTW, if you install the control panel and the drivers, you wind up with about 14MB of stuff, NOT 20MB

Blah blah blah. Why do I get dragged into this shit?

Look: according to this page (where I downloaded my drivers) http://www.ati.com/support/drivers/winxp/radeon-ml.html, the two pieces add up to 19 meg. So I rounded up. Shoot me.

Now, addressing your itemized list. Wow, you looked at the file names. You must know exactly what's in them? Gorsh. That was a zip file. Did you look in there? Oooh, maybe you did, and it's more files. Whoop-de-'-n-do.

Do you know what's in those files? Did you expect to see a file that says "hey look at me, I'm a cheater.exe"? Or "runmetocheatat3dmk3.exe"?

But, beyond that, I am and was speaking in generalities, using the latest drivers I downloaded (which were about 20 meg) and stated that analyzer code would likely be small enough that it could hide with no problem.

And my latest discourse about your inability to see what's inside those files is, again, not attacking ATI, but talking generally about how difficult it would be for anybody to find the kinds of things we're talking about.

BUT I SURE AS HELL DIDN'T SAY ATI IS CHEATING.

I can only imagine that if I had explained all of this ahead of time, you'd probably say I was just TRYING to appear unbiased, or that because I mentioned ATI's drivers and used them as a reference to discuss how easy it would be to hide small code in a big file, I must be attempting a smear campaign.

Feh on you. And you. And you over there in the corner. Feh on you too. :p
 
Dude, you were definitely trying to smear ATI there; otherwise you could have used a generic statement such as "You know, with driver downloads averaging 20MB, I can definitely see the possibility of secretly adding some malicious code into the drivers."


Oh wait a minute, Nvidia already did that now didn't they...........
 
Russ,

I would have figured you would have learned to keep your mouth shut when speaking in generalities. Especially considering the volatility of the situation. Everybody is very touchy when cheating is being discussed. :)

Tommy McClain
 
It ain't brain surgery simulation or something; I doubt that any such cheat would take more than 32KB at most, so the size of the archive is irrelevant here. :LOL:
 
I still think Nissan is making the best-looking non-luxury cars ATM. :) They've been on a design tear recently.
 
I registered so that I can ask a question about these benchmark "optimizations". Forgive me if this is not the right topic or this question has already been covered elsewhere. There are too many posts to wade through lately. ;)

Back to the question. Some people seem to allege that to insert clipping planes and whatnot, the driver developers need to 'sit there and study it', which implies that it is a time-consuming and labor-intensive task. Now, I don't want to give any potential cheaters ideas (at least not too many :LOL: ). But why would the developers need to actually sit there? If the input to an equation is known, couldn't all the values of the intermediate steps be calculated in advance? I mean, all the data are there for the plucking, either from hardware or software.

If the harddisk were faster than the GPU, someone might even try to load pre-rendered images from it. :devilish:
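The precalculation idea above can be sketched roughly like this. Everything here is invented for illustration (no real driver or benchmark code): if the input is fully deterministic, the intermediate results can be computed once offline and replayed from a lookup table at run time.

```python
# Hypothetical sketch (all names invented): replaying precomputed results
# for a known, fixed input sequence instead of doing the work each run.

def expensive_step(x):
    """Stand-in for some per-frame computation whose input is known in advance."""
    return sum(i * x for i in range(1000))

# Offline: precompute results for the scripted input sequence
# (e.g. a benchmark's fixed camera path).
known_inputs = [1, 2, 3, 4, 5]
lookup = {x: expensive_step(x) for x in known_inputs}

# At "run time": answers come from the table instead of being recomputed,
# falling back to real work only if the input deviates from the script.
def fast_step(x):
    return lookup.get(x, expensive_step(x))
```

The catch, of course, is exactly what's discussed in this thread: the moment the input deviates from the script (a different demo, a free camera), the table misses and the shortcut disappears.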
 
I'm curious, would it even be possible to pull off this type of hack in-game (if it isn't a bug)?

I can understand how you could say that a driver could be adjusted to look for, say, Demo Four from Q3 or Flyby from UT2K3, but how would the driver know when to enable them within any sort of useful parameters?

If you are running Quake 3, just to pick a generic example, and you suspect that Demo4 is hacked, there isn't anything stopping reviewers from running Demo1 or Quaver (after reinstalling the game, anyway) to check its authenticity. Then there are benches they could record themselves. What would the driver do in those cases? Attempt to calculate out the path for each demo? If it did that on the fly, the FPS would be absolutely slaughtered. If it did it at load, the load times would be obscenely long.

With 3DMark2K3 it would be very simple: just check the exe and you know exactly what it is going to do. 'Quack' would be another good example of how they could pull it off. I don't see how using the clip-planes hack to boost scores would work in games.
 
It would simply work with some demos and not others.

There is the possibility of triggering static clip plane assumptions based on commonality. For example, branching along a "tree" of clip plane sequences could be activated based on certain parameters (receiving certain checksums of vertex data in a certain sequence and time period, among other things that could be precalculated by analysis), suited to what the drivers are assuming based on those triggers.
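A toy sketch of that trigger idea, purely illustrative (no real driver code or checksums; every name here is invented): hash incoming vertex batches and walk a precomputed sequence of expected checksums, switching the "optimisation" branch on only when the whole sequence matches.

```python
# Purely hypothetical illustration of a checksum-sequence trigger.
import zlib

# Pretend these are vertex batches captured from a benchmark's fixed stream;
# the checksums would be precalculated offline by analysing the benchmark.
BENCH_BATCHES = [b"frame0-vertices", b"frame1-vertices", b"frame2-vertices"]
EXPECTED_SEQUENCE = [zlib.crc32(b) for b in BENCH_BATCHES]

class TriggerDetector:
    """Walks a precomputed checksum sequence; activates only on a full match."""

    def __init__(self, expected):
        self.expected = expected
        self.pos = 0          # position in the expected sequence
        self.active = False   # whether the special-case branch is enabled

    def on_vertex_batch(self, data: bytes) -> bool:
        if self.active:
            return True       # a real scheme might keep verifying and bail out
        if zlib.crc32(data) == self.expected[self.pos]:
            self.pos += 1
            if self.pos == len(self.expected):
                self.active = True
        else:
            self.pos = 0      # sequence broken: stay (or go) inactive
        return self.active
```

Note how this matches the behaviour described in the thread: any demo that happens to produce the expected sequence trips the branch, and anything off-script (a custom demo, a free camera) simply never activates it.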

Some demos could meet the triggering criteria and activate the branch. This could, of course, turn off once the triggering sequence stops, and the quickest fix for using pre-calculated clipping planes for the 3dmark 03 "rail" is to update the triggering sequence for it in this fashion (what seems likely is that there was no branching-and-failing triggering sequence because nVidia was not aware of the possibility of jumping off the "rail"). What is unique to nVidia and GT 4 is that their architectures prior to the NV35, based on what has been indicated, simply could not compete with a general floating point pixel shader workload, and nVidia objected to having that accurately represented. However, this is not purely circumstantial. It is circumstantial with a clear motive: prior, clearly established and specifically targeted benchmarking shortcuts taken to gain performance; evidence linking nVidia with a specific act; nVidia on the record with prior comments attacking the benchmark, and actions taken in that direction; and plenty of witnesses to something very specific and indicative (anyone with the card and the development version directly, and the rest of us looking at photographs of what they witnessed).

Condemning 3dmark 03 (speaking to the general case) is a red herring, because GT 4 does truly represent this floating point processing limitation (the NV35 would then be just a side beneficiary of artificial inflation of its score, in addition to having more competitive floating point processing ability). The behavior in the other GTs in 3dmark 03 would just be contempt for 3dmark and for truthful representation of their architecture, since the NV3x are all much better suited for them than for GT 4. That evidence is more heavily circumstantial, except that it is directly linked to the other evidence for GT 4 (that act facilitates these acts, so the additional factors there actually serve to correlate with them, even though only some of the same things directly apply).

The occurrence for other timedemos and in games would likely be more sophisticated (and should be less likely, given the NV30-34's greater suitability to current games, as they are not DX 9 yet), and some other form of analysis would likely be required to expose whether they are doing something similar. Though nVidia's penchant for quoting certain timedemo results in particular, and the apparent failure of those results to apply realistically in the general case, does, in the context of 3dmark 03 behavior, make it look likely that such is occurring. Certain game rendering and performance issues, again more purely circumstantially indicated than for 3dmark 03, do point towards this type of triggering mechanism being used and simply misfiring in some driver releases/leaks.

There is also the possibility I've mentioned before of the drivers containing code to analyze timedemos on load. It is possible for some timedemo formats to contain information in a way that would lend itself fairly easily to that, but the question is how slow that might be (given the rapid growth of system performance beyond what is necessary for many tasks besides 3D gaming, that is a big question).
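For the analyze-on-load possibility, a rough sketch might look like the following. The flat "demo format" here is invented purely for illustration (it is not any real game's demo layout): if a format stores the camera path explicitly, a loader could parse it and derive per-frame culling hints before playback even starts.

```python
# Hypothetical sketch of analysing a timedemo at load time.
import struct

def parse_demo_positions(blob: bytes):
    """Read a flat list of little-endian (x, y, z) float camera positions."""
    count = len(blob) // 12  # 3 floats * 4 bytes each
    return [struct.unpack_from("<3f", blob, i * 12) for i in range(count)]

def precompute_clip_hints(positions):
    """Toy 'analysis': emit one cull hint per frame from the known camera path."""
    return [("cull_behind_x", x) for (x, _, _) in positions]

# Build a fake two-frame demo and "analyse" it at load time.
demo = struct.pack("<3f", 1.0, 0.0, 0.0) + struct.pack("<3f", 2.0, 0.0, 0.0)
hints = precompute_clip_hints(parse_demo_positions(demo))
```

The cost question raised above is exactly where this would bite: parsing a small path is trivial, but deriving real visibility data for thousands of frames at load time is the part that could make load times balloon.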

I tried to emphasize the different places where I think it is fair to conclude, and where it is more fair to just suspect (strongly or not).
 
ATI, fighting fire with fire?

Now, I know this is going to attract a lot of heat, but as Nvidia is allowed to win the benchies due to certain and perhaps unethical optimisations, what if ATI added the same optimisations to their drivers? They could be selected in the control panel, and of course should be broadcast and documented in full. Although I think this may cause unintentional damage to 3DMark's credibility, it would show all the sites that two can play at that game. At the very least, the sites would have to state that the optimisations aren't really going to translate into improved game framerates. But hey, at least Nvidia would be shown to be using the same optimisations, and as they didn't allow the option to be controlled, they would certainly lose a lot of credibility! Of course, I'm assuming that if ATI added the same style of optimisations to their drivers we'd see a marked improvement in the scores, which would probably level the playing field.

Other than that, I don't see ATI with any options. At the moment Nvidia are clearly enjoying most people seeing them win the 3dmark benchies, and won't care if a few specialised sites throw a little mud because of the optimisations in place. As I hate to see a company pretty much pull the wool over its consumers' eyes, I really do hope this whole affair isn't allowed to rest.

I hope ATI has the creativity to turn Nvidia's underhanded tactics into Nvidia's worst nightmare!
 