Reply from nVidia about the 3DMark anomaly

Joe DeFuria said:
I think that's more or less Doom's point. ;) He's just trying to convince certain nVidia faithful that "most companies" includes nVidia.
Apparently not, because he doesn't seem to understand that ATI, in turn, tells every board maker who will listen that "NVIDIA is overcharging you, single-sourcing is dangerous, and what happens when they miss a cycle?".

Naivete.

Every company tries to press their own perceived advantages while insinuating weaknesses amongst their competitors. What's wrong with that?
 
[attached image: 74.jpg] <--- Oompa Loompa :LOL:


I'm sorry, after watching Willy Wonka and the Chocolate Factory I can't fathom an Oompa Loompa being a computer junkie... oh well, times change :)
 
I guess that for their next driver release they have found another way of detecting the various 3DMark scenes so the optimised driver paths can be activated :)

K-
 
Kristof said:
I guess that for their next driver release they have found another way of detecting the various 3DMark scenes so the optimised driver paths can be activated :)

K-

That would be my guess too.

On the other hand, how do we know that ATI and other companies don't do this too?
 
Kristof said:
I guess that for their next driver release they have found another way of detecting the various 3DMark scenes so the optimised driver paths can be activated :)
You took the words right out of my mouth! But I would have written optimised as "optimized". ;)
 
There are a thousand ways to detect an application if you really want to. Take, for example, the shader programs: the vertex programs used in the Nature scene are unique, so it is easy to detect that scene by checking the vertex programs at load time.
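
To illustrate the point, here is a minimal, hypothetical sketch of that kind of detection: hash each vertex program the application uploads and compare it against known fingerprints of the Nature scene. The hash choice, the function names and the fingerprint values are all invented for illustration; nothing here comes from an actual driver.

```cpp
// Hypothetical sketch: a driver could fingerprint scene-specific vertex
// programs by hashing their source text at load time. Illustration only.
#include <cstdint>
#include <string>
#include <unordered_set>

// Simple FNV-1a hash of the vertex program source text.
static uint64_t HashProgram(const std::string& src) {
    uint64_t h = 1469598103934665603ull;   // FNV offset basis
    for (unsigned char c : src) {
        h ^= c;
        h *= 1099511628211ull;             // FNV prime
    }
    return h;
}

// Hashes of the (unique) vertex programs used by the Nature scene.
// These values are placeholders, not real fingerprints.
static const std::unordered_set<uint64_t> kNatureProgramHashes = {
    0x1234567890abcdefull,
    0xfedcba0987654321ull,
};

// Called when the application uploads a vertex program; on a match the
// driver could switch to a scene-specific optimised path.
bool IsNatureSceneProgram(const std::string& programSource) {
    return kNatureProgramHashes.count(HashProgram(programSource)) != 0;
}
```

A driver taking this route would never need to look at window titles or splash screens at all, which is why renaming the tests alone would not necessarily defeat it.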

My guess, too, is that they'll just change the way 3DMark is detected. If they're wise they won't do it in the very next driver release: let performance go down so everybody thinks the cheat is gone, and some time later reintroduce it with a different detection mechanism.
 
Kristof said:
I guess that for their next driver release they have found another way of detecting the various 3DMark scenes so the optimised driver paths can be activated :)

K-

As others have said previously.... I couldn't have said it better myself. :)

They're definitely doing per-test optimizations, at least for the game tests. Not sure I'm going to be able to do up a modified 3DMark to confirm that, though...
 
BoardBonobo said:
There's already a SurfMonkey registered here. No men in black theories over that one :D

Not that I want to drag myself into this little debate, but there isn't, actually.
 
DaveBaumann said:
BoardBonobo said:
There's already a SurfMonkey registered here. No men in black theories over that one :D

Not that I want to drag myself into this little debate, but there isn't, actually.

Oh well, I'm pretty sure that when I tried, it told me there was. All this subterfuge for nothing :rolleyes:
 
Well, from my own tests on a couple of different systems here, the *closest* tie was from this P4 system:
Splash screens on: [attachment: splash-on.txt]
Splash screens off: [attachment: splash-off.txt]

So roughly a 500 3dmark improvement between the two, with the biggest score impact likely coming from the Nature test. My worst case was a P4-2.4 with a 920 3dmark spread. I consider +/- 50 3dmarks a "wash" as far as benchmark accuracy.

The only thing that really bugs me about this whole thing is the intense double standard that has occurred with an alleged "benchmark cheat" when compared to the same allegations for ATI previously with the whole Quake/Quack issue.

I'd really like to know why this isn't on the front page of HardOCP, Firing Squad, Sharkey's, Toms and everywhere else. I'd also like to know why there are no bin/hex edited binaries or patches floating around to change the text "Game 4 - Nature" to "Fart 9 - Muddle" and to etch some pattern into the splash screen bitmaps to throw off the "bug"... It just seems to me that if NVIDIA has an allegation against it, there will be little to no effort put into isolating what exactly the truth is concerning the allegation. The accepted standard is to just let their PR department answer in an email and then let it lie at that.
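
For what it's worth, that kind of hex-edit test would be easy to knock together. Below is a rough, hypothetical sketch of a string patcher that renames "Game 4 - Nature" to the same-length "Fart 9 - Muddle" inside a copy of the benchmark executable; the file name is made up, it should only ever be run on a copy, and perturbing the splash bitmaps would be a separate step.

```cpp
// Rough sketch of the test described above: patch the test name inside a
// *copy* of the benchmark executable and see whether the score changes.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
#include <vector>

int main() {
    const std::string path = "3dmark_copy.exe";   // hypothetical copy of the benchmark
    const std::string from = "Game 4 - Nature";
    const std::string to   = "Fart 9 - Muddle";   // same length on purpose

    std::ifstream in(path, std::ios::binary);
    if (!in) { std::cerr << "cannot open " << path << "\n"; return 1; }
    std::vector<char> data((std::istreambuf_iterator<char>(in)),
                            std::istreambuf_iterator<char>());
    in.close();

    // Replace every occurrence; keeping the length identical avoids
    // shifting offsets inside the executable.
    size_t patched = 0;
    auto it = data.begin();
    while (true) {
        it = std::search(it, data.end(), from.begin(), from.end());
        if (it == data.end()) break;
        std::copy(to.begin(), to.end(), it);
        it += to.size();
        ++patched;
    }

    std::ofstream out(path, std::ios::binary);
    out.write(data.data(), data.size());
    std::cout << "patched " << patched << " occurrence(s)\n";
    return 0;
}
```

If the score then drops back to the splash-off level, that would point to name-based detection; if it doesn't, the detection lies elsewhere (the shaders or the splash bitmaps, say).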

Be it a cheat, bug, optimization or whatever - why is there this bias against delving to the "bottom" of what the true situation is? Wouldn't such interest "clear" NVIDIA if this were indeed a bug? Or is it something else? Just a bit confused myself.
 
Sharkfood wrote:
Be it a cheat, bug, optimization or whatever - why is there this bias against delving to the "bottom" of what the true situation is? Wouldn't such interest "clear" NVIDIA if this were indeed a bug? Or is it something else? Just a bit confused myself.

I always thought the core of the whole uproar over the Quack issue was that it seemed as though ATI was intentionally lowering video quality to gain FPS through application-specific optimization. As we all know, this turned out not to be the case.

If it is the case that these sites thought that application specific optimization was 'teh devil', then they ought to wake up to themselves...
 
Sharkfood said:
So roughly a 500 3dmark improvement between the two, with the biggest score impact likely coming from the Nature test. My worst case was a P4-2.4 with a 920 3dmark spread. I consider +/- 50 3dmarks a "wash" as far as benchmark accuracy.

The only thing that really bugs me about this whole thing is the intense double standard that has occurred with an alleged "benchmark cheat" when compared to the same allegations for ATI previously with the whole Quake/Quack issue.

I'd really like to know why this isn't on the front page of HardOCP, Firing Squad, Sharkey's, Toms and everywhere else. I'd also like to know why there are no bin/hex edited binaries or patches floating around to change the text "Game 4 - Nature" to "Fart 9 - Muddle" and to etch some pattern into the splash screen bitmaps to throw off the "bug"... It just seems to me that if NVIDIA has an allegation against it, there will be little to no effort put into isolating what exactly the truth is concerning the allegation. The accepted standard is to just let their PR department answer in an email and then let it lie at that.

Be it a cheat, bug, optimization or whatever - why is there this bias against delving to the "bottom" of what the true situation is? Wouldn't such interest "clear" NVIDIA if this were indeed a bug? Or is it something else? Just a bit confused myself.

I couldn't agree more. Nvidia gets a slap on the wrist; ATi got savaged over its Radeon 8500 launch. I don't think it's any sort of error on the part of the sites mentioned. For some reason ATi faces what seems to be incredible bias... is it because they aren't centered in Santa Clara? Possibly because they are Canadian? I don't know; what I do know is that it isn't right, whatever the root of the bias is.

Sabastian
 
Sabastian - I would suggest taking it up with the sites who were most vocal about it; writing it here will do little good.
 
DaveBaumann said:
Sabastian - I would suggest taking it up with the sites who were most vocal about it; writing it here will do little good.

I am sorry, Dave, I just thought I was posting along in this thread like everyone else. I didn't realise MO was not welcome. You're right, at any rate.

Sabastian
 
I don't mind you posting about it, but I'm just suggesting you may get more of a response as to why these sites haven't taken issue with it by actually contacting the sites themselves. I'm sure you'll get a glib response back, but someone may actually take it up.

Although, HardOCP were one of the sites to run up the 'ATi Cheating' flag, and I note Brent posts here but has been silent on this subject so far.
 
When a major company goes on record and claims that a specific issue is a certain way, then I start accepting it more.

Hmm, if I remember right, you did not seem to agree with the FS interview where ATI said, in their defense over the Q3 issue, that it was a bug in the way the texture slider was being read. However, that was on the old board and I was not able to find that thread, so sorry if I have misquoted :(

Optimizing for a game is fine, as that's what it's all about: playing games.

Optimizing for a benchmark is wrong, but we know everyone does it.

Optimizing for a game that's used as a benchmark is very grey, as it's right in the middle of the two.
 
http://firingsquad.gamers.com/hardware/8500int/page2.asp

Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users disable them. The current RADEON 8500 driver revision has an issue that prevents it from correctly interpreting the texture quality slider setting in Quake 3. This issue will be corrected in the next driver release.
 
DaveBaumann said:
http://firingsquad.gamers.com/hardware/8500int/page2.asp

Most of our optimizations for Quake 3 and other applications have no impact at all on image quality, and therefore it would be pointless to allow users disable them. The current RADEON 8500 driver revision has an issue that prevents it from correctly interpreting the texture quality slider setting in Quake 3. This issue will be corrected in the next driver release.
And it was, and performance didn't suffer.
Ok. We know this.

RE: 3dmark, there are two possibilities.
1) Scores stay up when the bug is "fixed". If this happens, without investigative reporting :) no one will know whether they are indeed optimizing and just changed the detection method or not.
2) Scores go down to the lower (splash-off) level. NVIDIA takes heavy flak, etc. I view this as unlikely.
 