New 3DMark03 Patch 330

John Reynolds said:
martrox said:
Bottom line will be that ATI is optimising, while nVidia is cheating......AND, there is a difference!

Don't be too quick to assume that. An 8% difference in just one test is worth investigating.

John, it should be investigated...... If ATI is cheating, then they should be exposed, too. And I will be just as quick to condemn them as I am with nVidia..... The difference, ATM, is the extent to which nVidia has been doing it...and that they have definitely been caught cheating....
 
Nite_Hawk said:
mczak:

Do you know how the test is performed, though? Is it running the Mother Nature pixel shaders, or doing something else?
Sorry, I'm probably wrong - I typed faster than I thought. It could well run the same shaders as GT4 (don't know, my R9000 won't do it...).
 
galperi1 said:
oh god...

I just read on 2 forums and guess what....

now that the report is out, everyone is switching from "Nvidia didn't cheat because you can't see it" to "ATI is cheating too."

Now they have yet another mindless defense to spew in favor of Nvidia.
:(

>o< Yeah, I just finished reading the forum at NVNews... and the very ones that were saying that NVidia wasn't cheating before (mostly because they couldn't see it) are jumping at ATI's throat for cheating too. It is really pretty sad. It is obvious that ATI is using some program recognition for game test four, but those optimizations don't necessarily affect the rendering of GT4, but rather just optimize the driver specifically for that test... That said, even if this is true, I'm not too thrilled that ATI is making a game-specific optimization like that... though in the case of 3DMark I feel it is excusable, in that, unlike engine-specific optimizations, 3DMark does not use a shared game engine, so any optimizations would have to be game-specific.
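For anyone wondering what that kind of program recognition looks like in practice, here's a purely hypothetical Python sketch of the general idea - detect the running executable by name and switch to a tuned code path. None of the names or strings below come from any actual driver:

```python
# Purely hypothetical sketch of application detection. All names and strings
# are made up for illustration; real drivers do this in native code.
KNOWN_APPS = {
    "3dmark03.exe": "gt4_tuned_path",
    "quake3.exe": "q3_tuned_path",
}

def select_code_path(process_name: str) -> str:
    """Pick a per-application driver path, or fall back to the generic one."""
    return KNOWN_APPS.get(process_name.lower(), "generic_path")

print(select_code_path("3DMark03.exe"))  # -> gt4_tuned_path
print(select_code_path("notepad.exe"))   # -> generic_path
```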
 
Brent said:
Wow, I didn't realize the extent of cheating that was going on in 3dmark03

It's unacceptable

Whether it be a game or a synthetic test, it's unacceptable


Good job Futuremark, it was a very well laid out and easy to understand PDF.

Thanks Brent. I agree with your comments and really appreciate you saying that.

Cheers,

AJ
 
martrox said:
Hmmmm...1.9% for ATI, and 24.1% for nVidia.... Keep grabbing at those straws, ED. ;) I guess that just goes to prove how much better nVidia's drivers are than ATI's.... :rolleyes:

Bottom line will be that ATI is optimising, while nVidia is cheating......AND, there is a difference!
1.9% because of just GT4 ;)
 
Kid_Crisis:

if you reverse the numbers, you get 5806/4679 = 1.241, which is a 24.1% increase for the old build over the new build, using the new build as a baseline. Thus, you can say that there is a drop of 24.1% (of the new build's score) going from the old build to the new build.

Nite_Hawk
 
kid_crisis said:
What Is the Performance Difference Due to These Cheats?
A test system with GeForceFX 5900 Ultra and the 44.03 drivers gets 5806 3DMarks with 3DMark03 build 320. The new build 330 of 3DMark03 in which 44.03 drivers cannot identify 3DMark03 or the tests in that build gets 4679 3DMarks – a 24.1% drop.

4679 / 5806 = 80.6% or a 19.4% drop
lol, very true. They probably calculated a 24.1% increase from 4679 to 5806 (which is correct) and assumed the drop percentage is the same, a mistake often made.
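For reference, here is the arithmetic spelled out both ways, using only the two scores from the FM paper - nothing else assumed:

```python
old_build = 5806  # build 320, 44.03 drivers (detection/cheats active)
new_build = 4679  # build 330 (detection defeated)

gain_over_new = (old_build - new_build) / new_build  # 0.241 -> 24.1% higher
drop_from_old = (old_build - new_build) / old_build  # 0.194 -> 19.4% lower

print(f"Old build scores {gain_over_new:.1%} above the new build")
print(f"New build scores {drop_from_old:.1%} below the old build")
```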
 
About Nvidia cheating, I'm sure no one here is surprised. Aside from the regular f@nboys (we all know who they are), that is.
The argument that ATI was found "cheating" as well is not correct either, since it is yet to be determined what exactly is being done at the driver level (other than application detection).
On the other hand, Nvidia has summarily been caught CHEATING at the driver level (not optimizing but, in addition to application detection, inserting clip planes, changing shader precision, etc.).

Kudos to FM for setting a new precedent in benchmarking fairness. Now get cracking and find out how exactly ATI is getting that 1.9% increase with the old unpatched version.
 
OICAspork said:
It is obvious that ATI is using some program recognition for game four

It will be shader recognition - if it wasn't clear from the document, it's very, very easy to detect a shader program and replace it with another. This could be occurring anywhere...
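A rough, purely hypothetical sketch of how trivial that substitution is - fingerprint the shader the application submits and swap in a hand-tuned replacement. The shader strings here are placeholders, not anything from a real driver:

```python
import hashlib

def fingerprint(shader_text: str) -> str:
    """Hash the shader as submitted; a real driver might hash compiled bytecode."""
    return hashlib.md5(shader_text.encode("utf-8")).hexdigest()

# Placeholder shaders for illustration only.
ORIGINAL_SHADER = "ps_2_0 ... water shader as the benchmark submits it ..."
TUNED_SHADER    = "ps_2_0 ... hand-scheduled equivalent ..."

REPLACEMENTS = {fingerprint(ORIGINAL_SHADER): TUNED_SHADER}

def compile_shader(shader_text: str) -> str:
    # If the incoming shader is recognised, silently compile the replacement.
    return REPLACEMENTS.get(fingerprint(shader_text), shader_text)

print(compile_shader(ORIGINAL_SHADER) == TUNED_SHADER)          # True: swapped
print(compile_shader("ps_2_0 ... other ...") == TUNED_SHADER)   # False: untouched
```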
 
martrox said:
John Reynolds said:
martrox said:
Bottom line will be that ATI is optimising, while nVidia is cheating......AND, there is a difference!

Don't be too quick to assume that. An 8% difference in just one test is worth investigating.

John, it should be investigated...... If ATI is cheating, then they should be exposed, too. And I will be just as quick to condemn them as I am with nVidia..... The difference, ATM, is the extent to which nVidia has been doing it...and that they have definitely been caught cheating....

It doesn't matter if a student cheats on all the exam questions or just one. He still gets punished if he gets caught.
And there is less of a reason for ATI to do these "optimisations", as you call them, since they are clearly in the lead here. Unless both IHVs were trying to out-cheat each other :LOL:
 
Evildeus said:
martrox said:
Hmmmm...1.9% for ATI, and 24.1% for nVidia.... Keep grabbing at those straws, ED. ;) I guess that just goes to prove how much better nVidia's drivers are than ATI's.... :rolleyes:

Bottom line will be that ATI is optimising, while nVidia is cheating......AND, there is a difference!
1.9% because of just GT4 ;)

Let me just repost what Dave said:

DaveBaumann said:
That's because it's not visible anywhere.

My suspicion is that they are replacing the shader code with one that is optimised for their Vec3 / parallel Scalar pixel shader. Because ATI only have one internal precision, unless they start ripping chunks of the shader out, it's not going to be visible.

Keep grabbing at those straws...... ;)
Bottom line is nVidia is GUILTY; ATI may be guilty but, ATM, that's not proven... and there is a pretty good chance that ATI will not be proven guilty.....
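To make Dave's point a bit more concrete: R3x0's pixel shader ALUs can co-issue a 3-component vector op with an independent scalar op, so a mathematically identical shader that is re-ordered to pair them well finishes in fewer cycles with no change to the output. Here is a toy cycle count - my own simplification for illustration, not anything from ATI documentation:

```python
# Toy model: assume each ALU cycle can issue one vec3 op plus one scalar op.
# A deliberate simplification, just to show why re-scheduling the same math
# can save cycles without changing the rendered result.
def serial_cycles(ops):
    return len(ops)  # no co-issue: one op per cycle

def coissued_cycles(ops):
    # Ideal pairing: every scalar op rides along with a vec3 op.
    return max(ops.count("vec3"), ops.count("scalar"))

shader = ["vec3", "vec3", "vec3", "scalar", "scalar", "scalar"]
print(serial_cycles(shader), coissued_cycles(shader))  # 6 vs 3
```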
 
Nite_Hawk said:
Kid_Crisis:
if you reverse the numbers, you get 5806/4679 = 1.241 which is a 24.1% increase with the old build over the new build using the new build as a baseline. Thus, you can say that there is a drop of 24.1% (of the new build's score) going from the old build to the new build.
Nite_Hawk

Yeah, no kidding. I know exactly the math mistake they made; it's quite common, and still wrong. So if the performance dropped from 5806 3DMarks to 2903 (exactly half), would you say performance dropped by 100%? No, it dropped 50%.

They can either say that the old build had INCREASED performance by 24.1%, or that the new build DECREASED or DROPPED performance by 19.4%. They can't mix and match whatever numbers they like.

Well, actually they can, but then they look stupid.
 
Wow - first they lose Dawn (faster and better IQ on R3x0), then the Doom 3 results look increasingly suspicious as a very unlevel playing field for comparisons, and now they are caught badly cheating in 3DMark03.

Is trying to falsify a major industry benchmark considered a deceptive practice in your country? Legally, that looks like the sort of practice that might end one up in a class action. They are presenting something as much better than it really is for commercial gain == deceptive practices in Australia.
 
mczak:

Yeah, it's misleading. They should have said that there is a 24% increase in speed when the cheats are enabled, rather than a 24% decrease in speed using the new build. I think they wanted to have the non-cheating build be the baseline.

Nite_Hawk
 
martrox,

I'm right with you on that one...

Remember, this is the guy that put up that lame post about this whole thing and said...

"Two days after Extremetech was not given the opportunity to benchmark DOOM3, they come out swinging heavy charges of NVIDIA intentionally inflating benchmark scores in 3DMark03...."

Which is to say, "let me invalidate all of Extremetech's findings because they're simply pissed that they're not kool enough to be invited to participate in the Doom III testing process."

I'm _still_ waiting to see a single thing about any of this crap on Anandtech...and for that matter, the followup article on the FX drivers that Anand promised by the end of last week...
 
martrox said:
Hmmmm...1.9% for ATI, and 24.1% for nVidia....
Or, ATI did much cleverer cheats which couldn't be detected by FM...
I'm in no way suggesting this is actually the case, but we just have to trust FM that they in fact did a thorough examination and didn't only look for the already-suspected cheats in nvidia's drivers. Though I think it's really in FM's interest to detect all cheats; it would be really bad for their reputation if they were seen as pro-ATI, anti-Nvidia.
 
kid_crisis said:
What Is the Performance Difference Due to These Cheats?
A test system with GeForceFX 5900 Ultra and the 44.03 drivers gets 5806 3DMarks with 3DMark03 build 320. The new build 330 of 3DMark03 in which 44.03 drivers cannot identify 3DMark03 or the tests in that build gets 4679 3DMarks – a 24.1% drop.

Well, darn good thing you had a lawyer look over that release. That way there weren't any embarrassing errors, such as not being able to calculate a simple percentage correctly..... :rolleyes:

4679 / 5806 = 80.6% or a 19.4% drop

Yes, it's a 19.4% drop, but the cheating overstated the performance by
(5806 - 4679)/4679 = 24.1%

(Getting slightly OT here, I heard that NVidia's stock went up by 1/3 with the release of the NV35. I can see this issue coming up in a shareholder's lawsuit if the stock tanks again.)

It's looking more and more like much of the supposed advantage of the NV35 over the NV30 (aside from the legitimate advantage of increased bandwidth) is smoke and mirrors. The apparent pixel shader cheats are most disturbing.
 