3DMurk03 new cheats?

It is trivial for the driver to check the device ID as well as the module filename and "optimize" the anisotropic settings.
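To illustrate the point, here is a minimal sketch of how a driver-side component *could* key behaviour off the running executable's name. The real Detonator internals are not known; `detect_benchmark()` and the `"3dmark"` pattern are assumptions for illustration only.

```python
# Illustrative sketch: matching the module filename against known benchmark
# names, case-insensitively. ntpath handles Windows-style paths portably.
import ntpath

def detect_benchmark(module_path: str, patterns=("3dmark",)) -> bool:
    """Return True if the module filename matches a known benchmark name."""
    name = ntpath.basename(module_path).lower()
    return any(p in name for p in patterns)

print(detect_benchmark(r"C:\Program Files\3DMark03\3DMark03.exe"))  # True
print(detect_benchmark(r"C:\Program Files\3DMark03\3DMurk03.exe"))  # False
```

Note how renaming the executable (the "3DMurk" trick) defeats exactly this kind of check, which is why the renamed benchmark behaves differently.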

No hardware bug would manifest with one filename and not another.
 
The problem is with the Drivers and NVIDIA wanting to make the FX appear faster.

AF quality is restored if the detection scheme is bypassed.

Comparing screenshots between a GeForce 4 (maybe a GeForce 3 too) and a GeForce FX card will show the differences NVIDIA introduced to make the FX appear faster.
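The simplest way to do such a comparison is a per-pixel diff of the two shots. A minimal sketch, assuming both screenshots have already been decoded to raw RGB byte buffers of identical dimensions (e.g. from uncompressed BMPs):

```python
# Per-pixel comparison of two same-sized raw RGB images.
def count_differing_pixels(a: bytes, b: bytes, channels: int = 3) -> int:
    """Return how many pixels differ between two raw image buffers."""
    assert len(a) == len(b) and len(a) % channels == 0
    diffs = 0
    for i in range(0, len(a), channels):
        if a[i:i + channels] != b[i:i + channels]:
            diffs += 1
    return diffs

# Identical renders give 0 differing pixels; degraded filtering would not.
img1 = bytes([10, 20, 30] * 4)               # four identical pixels
img2 = bytes([10, 20, 30] * 3 + [0, 0, 0])   # last pixel differs
print(count_differing_pixels(img1, img1))  # 0
print(count_differing_pixels(img1, img2))  # 1
```

A nonzero count between two cards rendering the same frame at the same settings is the kind of evidence being discussed here.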

The old Application setting in the drivers worked well but was slower in games (plus UT really sucked with that setting).
 
DaveBaumann said:
The silicon is coded to recognise "3DMark03.exe"?

Why not? I wouldn't put it past nV to have made the silicon recognise anything with the name of '3dmark' in general. ;) :LOL:
 
radar1200gs said:
No, the silicon is buggy.

The only way this could be a problem with the silicon is if something in the silicon is recognizing when "3DMark.exe" is being run instead of another .exe file. I cannot describe how horrible an engineering design this is. And I can see no way in which the silicon could "accidentally" be noticing the filename, which is what would have to be the case if it was indeed a bug and not an intentional part of the design.
 
Nazgul said:
radar1200gs said:
No, the silicon is buggy.

The only way this could be a problem with the silicon is if something in the silicon is recognizing when "3DMark.exe" is being run instead of another .exe file. I cannot describe how horrible an engineering design this is. And I can see no way in which the silicon could "accidentally" be noticing the filename, which is what would have to be the case if it was indeed a bug and not an intentional part of the design.

Maybe I am mistaken, but have you, or has anyone for that matter, provided evidence that proves beyond doubt that the NV3x line of cards (the silicon) can detect whether 3DMark is running or not?

I'm sorry for not reading the ENTIRE thread, but I just want to know if there is any hard evidence presented.
 
K.I.L.E.R said:
Maybe I am mistaken, but have you, or has anyone for that matter, provided evidence that proves beyond doubt that the NV3x line of cards (the silicon) can detect whether 3DMark is running or not?

I'm sorry for not reading the ENTIRE thread, but I just want to know if there is any hard evidence presented.
No. Just one person guessing/speculating, and everyone else disagreeing.
 
madshi said:
K.I.L.E.R said:
Maybe I am mistaken, but have you, or has anyone for that matter, provided evidence that proves beyond doubt that the NV3x line of cards (the silicon) can detect whether 3DMark is running or not?

I'm sorry for not reading the ENTIRE thread, but I just want to know if there is any hard evidence presented.
No. Just one person guessing/speculating, and everyone else disagreeing.

Well, people want proof of these things. Suggesting something without providing any evidence is like name-calling. Everyone knows that name-calling is just a bunch of adjectives, verbs and nouns without any basis whatsoever (in most cases, anyway).
 
How would the GPU know what application is running without the drivers passing that information to it? And why would the drivers pass the application name to the GPU?
 
Tokelil said:
How would the GPU know what application is running without the drivers passing that information to it? And why would the drivers pass the application name to the GPU?

I wouldn't rule out something along the lines of the driver passing a value to a certain GPU register that activates something along the lines of poor IQ. :LOL:

No really, I wouldn't say it's impossible.

I just want proof of such a thing occurring.
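For what it's worth, the mechanism being speculated about here is easy to sketch. Everything below is purely hypothetical and simulated in plain Python: the register name, values and flow are invented for illustration, and say nothing about what the actual drivers or silicon do.

```python
# Hypothetical simulation of the speculated flow: the driver detects the
# application at startup and writes a flag to a "register" that the
# filtering path later consults. All names and values are invented.
REGISTERS = {"AF_MODE": 0}     # 0 = application-controlled, 1 = forced-down

def driver_on_app_start(exe_name: str) -> None:
    """Driver-side hook: degrade AF if a benchmark name is detected."""
    if "3dmark" in exe_name.lower():
        REGISTERS["AF_MODE"] = 1

def effective_af_level(requested: int) -> int:
    """What the 'hardware' actually applies, given the register state."""
    return min(requested, 2) if REGISTERS["AF_MODE"] == 1 else requested

driver_on_app_start("3DMark03.exe")
print(effective_af_level(8))   # 2 -> the app asked for 8x AF but gets less
```

The point of the sketch: nothing in this flow requires the *silicon* to know the filename; the driver alone can translate a name match into a different hardware state.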
 
K.I.L.E.R said:
Tokelil said:
How would the GPU know what application is running without the drivers passing that information to it? And why would the drivers pass the application name to the GPU?

I just want proof of such a thing occurring.

Just what would constitute "proof"?
Note that US courts require only "shown beyond reasonable doubt" in order to sentence someone to death.
"Proof" as such only exists in formal logical systems.

If we were a jury, would we say that it has been shown beyond reasonable doubt that nVidia....

Of course, we are not a jury here. We are, for the most part, consumers. For us the question is the reverse:
Do we trust nVidia NOT to have cheated and thus made this comparative data invalid?

Entropy
 
radar1200gs said:
No, the silicon is buggy.

The response was a glib one, because that doesn't explain what is occurring here - or at least, it's highly unlikely to.

If the silicon were buggy, it would exhibit the same issues in all uses of anisotropic filtering, so the driver would be coded to use the 'fixed' path on a global basis (since the drivers are the interface between the API calls and what the hardware needs to do). It would be relatively stupid to put in game-specific workarounds for something that generically affects everything.
 
DaveBaumann said:
OK, here are some shots from the 3DMark texture filtering tests (2.6MB):

http://www.beyond3d.com/misc/3dmrendering/3dmurk.zip

If you compare the 1x 3dmurk shot to the 8x 3dmurk shot, you'll note that the quality of filtering improves through all the texture sections. Contrast that with the 3DMark 1x and 8x shots, and you'll see that only every other texture section gets improved filtering. With the name as '3DMark03', the red/green and pink/yellow sections always appear to stay at the same filtering level as 1x, while the black/white and blue/pink sections do see improved texture filtering.
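Dave's section-by-section observation can be quantified rather than eyeballed: split each shot into horizontal sections and compare the mean intensity per section between the two runs. The sketch below is illustrative only; the section layout and pixel data are invented, and real shots would of course be decoded from the actual screenshots.

```python
# Split a grayscale image into horizontal sections and report the mean
# intensity of each section. Sections whose mean does not change between
# the 1x and 8x shots received no improved filtering.
def section_means(pixels, width, section_height):
    rows = [pixels[i:i + width] for i in range(0, len(pixels), width)]
    means = []
    for s in range(0, len(rows), section_height):
        block = [p for row in rows[s:s + section_height] for p in row]
        means.append(sum(block) / len(block))
    return means

# Two toy 4x4 "shots" with two sections each; only the second section
# changes, mimicking filtering that improved in some sections but not others.
shot_1x = [100] * 8 + [100] * 8
shot_8x = [100] * 8 + [140] * 8
print(section_means(shot_1x, 4, 2))  # [100.0, 100.0]
print(section_means(shot_8x, 4, 2))  # [100.0, 140.0]
```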

Thanks Dave, I needed that to form an opinion on the matter. :? The 3dmurk_8x shot is clearly better than the 3dmark_8x one in a way that the shots at tech-report don't really reflect, IMO.

Anyway, I'm now left wondering whether nVidia added some detection to force Application mode to Quality mode? They might have felt that it didn't degrade image quality... (not an excuse, just a possible explanation)
 
DaveBaumann said:
But the Quality mode wouldn't selectively apply it to texture in that fashion.

Good point, I missed that entirely. So what is the most likely explanation, in your view? Custom-made AF for 3dmark.exe or a tweaked, more aggressive Quality mode?
 
Good point, I missed that entirely. So what is the most likely explanation, in your view? Custom-made AF for 3dmark.exe or a tweaked, more aggressive Quality mode?
Or both?
In my mind the most regrettable part of all of this is the ennui surrounding it, because people are rightly fed up with the NVIDIA vs. 3DMark saga. If this had been discovered spontaneously, before ET's first exposé, I think attitudes would be quite different.
 
man this is getting insane. will we ever be able to trust benchmarks again? for now the only benches I trust are custom-made timedemos, like B3D is using.
 
radar1200gs said:
I get identical screen shots for murk and mark using your method.

In fact the screenshots are identical to setting the level of anisotropy in the driver panel.

I can provide the .bmps if you want.

Bear in mind I have a GF3, not a GF-FX.


Man, you are really confused about some things... don't you realize that unified driver packages contain lots of individual drivers and configurations for the different chips they support? If you download a set of unified drivers and install them for nv20, you aren't even running the same drivers that are set up for an nv3x GPU on the same Detonator version. "Unified driver" means "all drivers in one package"; it does not mean "one driver for everything." (Think about it for a minute and you'll understand why.) Looking at a GF3 tells you nothing about the 3DMark03 320/330-Detonator/nv3x issue. Might as well be running a Parhelia and making comparisons... ;)
 
LeStoffer said:
DaveBaumann said:
But the Quality mode wouldn't selectively apply it to texture in that fashion.

Good point, I missed that entirely. So what is the most likely explanation, in your view? Custom-made AF for 3dmark.exe or a tweaked, more aggressive Quality mode?

I think this one is an older driver cheat that's been in the code for a while and overlooked--but maybe not, as people say it doesn't happen with the GF4/GF3. Tying driver recognition to the executable name just seems like an older, less technically adept method of cheating, dating back a couple of years. It's certainly not in the same league as the kind of camera-track cheating nVidia was nailed doing in the 320 drivers. It almost seems like a leftover cheat somebody forgot about... perhaps it had to do with the nv30 fiasco, as it does seem apparent on the 5800U according to TR, even with the 330 patch--which I don't think dealt with name-recognition cheats at all(?)
 