Futuremark Announces Patch for 3DMark03

Reverend said:
The point is that NVIDIA appears to have a total lack of respect for FM wrt their beta partner agreement. I think history has proven why this appears to be the case. Unless FM has the balls, I doubt this scenario will change.

IMO, Futuremark is doing the right thing. No use getting into a semantics game. Circumvent the cheats and throw the ball into nV’s court.

BTW I am glad the Dell guys read B3D. :LOL:
 
DaveBaumann said:
http://www.beyond3d.com/articles/3dmark03/340/

Thanks for that Dave, reading through it now. Slight corrections for you :-

-I think the index for Game Test 4's high end graph may be the wrong way round.
-"It is possible to detect the shader for the trees and then and a pre-computed on that replaces it which is executed over the CPU, the difference in precisions between the CPU and graphics chip vertex shader could account for these tiny variations." I think an extra "and" crept in.
-"comparison toll" no doubt you meant tool.
-tome should perhaps be tone on the conclusion page?

All small picky stuff.

Out of interest, what makes you suspect that the vertex shader differences are related to extra precision from being executed on the CPU and not, as we seem to have seen in the past, simply a lower-precision shader substitution?
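For what it's worth, the precision hypothesis is easy to illustrate on its own: a CPU fallback path typically computes in full double precision, while rounding every intermediate result to FP32 mimics a GPU-style vertex shader unit. A minimal Python sketch (the vector and matrix-row values are made up for illustration, not taken from 3DMark):

```python
import struct

def to_f32(x):
    # Round a Python float (64-bit) to 32-bit precision, as a GPU-style
    # FP32 vertex unit would store every intermediate result
    return struct.unpack('f', struct.pack('f', x))[0]

def dot3(a, b, f=lambda x: x):
    # 3-component dot product (one row of a vertex transform),
    # applying precision function f after every operation
    s = 0.0
    for x, y in zip(a, b):
        s = f(f(s) + f(f(x) * f(y)))
    return s

pos = (0.1234567, 0.7654321, 0.9999999)  # illustrative vertex position
row = (0.3333333, 0.6666667, 0.1111111)  # illustrative transform row

full = dot3(pos, row)          # CPU path: full double precision throughout
gpu  = dot3(pos, row, to_f32)  # GPU-style path: every intermediate in FP32

print(abs(full - gpu))  # tiny but nonzero discrepancy
```

The discrepancy is on the order of the FP32 epsilon, which is exactly the scale of "tiny variations" the article's image diffs would pick up.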
 
DaveBaumann said:
http://www.beyond3d.com/articles/3dmark03/340/
A good article, though I must say that I had been hoping for some benches of the synthetic tests in 3DMark03. Some confirmation of NordicHardware's results would have been nice, as well as some investigation into how the PS2.0 scores managed to remain the same.
 
I only wish they would make an option to enable AF in 3DMark03 :/ If they did, no NVIDIA drivers would pass.
 
DaveBaumann said:
http://www.beyond3d.com/articles/3dmark03/340/

At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that states "An optimization must accelerate more than just a benchmark". Derek's reply was "But 3DMark03 is only a benchmark" -- it was suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark unless the application is just a benchmark"!

Oh....my.....God....
 
It would seem that vertex shaders are being tinkered with (from my understanding of Dave's article)... and that was the only stat truly affected by the 340 patch....

From that article... NVIDIA's response to Dave is truly sickening in what it reveals about what NVIDIA wants to do with benchmarking in general... (the PR guys are really running the show these days... arg)
 
Great work B3D :!:

It would be interesting to have Futuremark make another set of reordered shaders to see if they produce the exact same results (image) as the new patched version.
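That test would make sense, because the detection the patch presumably defeats is likely keyed on the exact shader code an application submits. A hypothetical sketch of such a scheme (the shader text, substitution table, and function names are all invented for illustration, not NVIDIA's or Futuremark's actual mechanism):

```python
import hashlib

# Hypothetical shader text as an application might submit it
# (assembly-style, purely illustrative)
ORIGINAL = """dcl_position v0
mul r0, v0, c4
add r1, r0, c5
mov oPos, r1"""

# Mathematically equivalent shader with instructions reordered and a
# register renamed -- the kind of shuffle a benchmark patch might do
# to invalidate byte-exact detection
REORDERED = """dcl_position v0
mov r2, c5
mul r0, v0, c4
add r1, r0, r2
mov oPos, r1"""

def fingerprint(shader_text):
    # A driver could key a substitution table on a hash of the submitted shader
    return hashlib.sha1(shader_text.encode()).hexdigest()

# Hypothetical detection table: fingerprint -> hand-tuned replacement
SUBSTITUTIONS = {fingerprint(ORIGINAL): "hand_tuned_replacement"}

def select_shader(shader_text):
    # Detection fires only on an exact match; any reordering defeats it
    return SUBSTITUTIONS.get(fingerprint(shader_text), "compile_as_submitted")

print(select_shader(ORIGINAL))   # detected -> substituted
print(select_shader(REORDERED))  # same math, not detected
```

If a second, differently reordered set of shaders produced pixel-identical output to the 340 patch, that would support the idea that only detection (not legitimate optimisation) was being defeated.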
 
Joe DeFuria said:
At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that states "An optimization must accelerate more than just a benchmark". Derek's reply was "But 3DMark03 is only a benchmark" -- it was suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark unless the application is just a benchmark"!

Oh....my.....God....

Yep. Just when you think you can't lose any more respect, you're proven wrong.
 
John Reynolds said:
Joe DeFuria said:
At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that states "An optimization must accelerate more than just a benchmark". Derek's reply was "But 3DMark03 is only a benchmark" -- it was suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark unless the application is just a benchmark"!

Oh....my.....God....

Yep. Just when you think you can't lose any more respect, you're proven wrong.

That guy should run for office. (Not that I would ever vote for his ^%$^*& butt)
 
banksie said:
It certainly is a doozy isn't it?

Futuremark are going to have their work cut out for them evidently.

Or they could just refuse to slap their seal of approval on any Nvidia drivers until the IHV ceases their current practices. But that's not going to happen, IMO, and looking at how few sites even use 3DMark in their reviews these days. . . .
 
So here's the most important question :

Does anyone know if any sort of "optimizations at the expense of changing the image output that differs from what developers have deemed as what is to be expected based on drivers provided by IHVs to them during the course of the development of their games" happens in any games they know of?

Note that I did not say the comparison is between the drivers' image output and the refrast's image output, which probably shouldn't be the comparison anyway (regardless of the fact that B3D has used this method) because of the nature of development (i.e. devs use the drivers supplied to them, "workarounds" are used, and hence "the reference image" can be disputed... is it the refrast, or the images produced by the IHV drivers the devs used as a reference?)
 
BTW, call me extremely stupid, but can anyone explain what to look for in those "image difference" screenies? Do I have to look at both sets of images (one per IHV) and then at this "difference in quality" image to know where they diverge? Kinda tiresome if this is the case... (sorry Dave :oops: ).

PS. Those images are the result of using that ATI tool Dave brought up recently? I haven't used that tool yet... it should explain my ignorance/stupidity.
 
Reverend said:
Does anyone know if any sort of "optimizations at the expense of changing the image output that differs from what developers have deemed as what is to be expected based on drivers provided by IHVs to them during the course of the development of their games" happens in any games they know of?
Can't comment on this as I'm not sure exactly what you mean.
Note that I did not say the comparison is between the drivers' image output and the refrast's image output, which probably shouldn't be the comparison anyway (regardless of the fact that B3D has used this method) because of the nature of development (i.e. devs use the drivers supplied to them, "workarounds" are used, and hence "the reference image" can be disputed... is it the refrast, or the images produced by the IHV drivers the devs used as a reference?)
I seriously doubt people use the refrast much... I get far too many "bugs" from ISVs that I can prove are not bugs by showing them the refrast image. People seem to assume that the behavior they get from their development platform is the proper behavior...
 
John Reynolds said:
Joe DeFuria said:
At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that states "An optimization must accelerate more than just a benchmark". Derek's reply was "But 3DMark03 is only a benchmark" -- it was suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark unless the application is just a benchmark"!

Oh....my.....God....

Yep. Just when you think you can't lose any more respect, you're proven wrong.
Oh....my.....Gods.... :oops:

Great article Dave, I'm just still a little stunned by that last page. :oops:
 
DaveBaumann said:
http://www.beyond3d.com/articles/3dmark03/340/
I think you should also have provided screenshots from a Radeon or the refrast to see if there are image quality differences between versions 330 and 340 on other cards.

Not that I think there will be, but it'll help rule out "application error" as a cop out.
 
Reverend said:
PS. Those images are the result of using that ATI tool Dave brought up recently? I haven't used that tool yet... it should explain my ignorance/stupidity.

I think they are, and it would be a great timesaver over manual inspection. The comparison tool should quickly give you a heads-up on where to look, although, as Dave has noted, even mild differences can be quite telling -- like the speckling that occurs only on the trees, which is visually not a dramatic difference.
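For anyone unclear on what such a tool actually computes: a difference image is just the per-pixel absolute difference of two renders, usually amplified so that tiny deviations (like the tree speckling) stand out instead of appearing near-black. A minimal sketch, with grayscale images modelled as nested lists -- this is not the actual ATI tool, just the idea behind it:

```python
def difference_image(img_a, img_b, gain=16):
    """Per-pixel absolute difference, multiplied by `gain` and clamped
    to 255, so a 1-2 level deviation shows up clearly."""
    return [[min(255, abs(a - b) * gain) for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def changed_pixels(diff, threshold=0):
    # Count pixels where the two renders disagree at all
    return sum(1 for row in diff for p in row if p > threshold)

# Two tiny "renders": identical except one pixel that is off by 2 levels
ref    = [[100, 100], [100, 100]]
test_r = [[100, 102], [100, 100]]

diff = difference_image(ref, test_r)
print(diff)                  # [[0, 32], [0, 0]] -- the deviation amplified
print(changed_pixels(diff))  # 1
```

So you don't need to eyeball both full screenshots first: the difference image tells you where to look, and you then go back to the originals to judge whether the deviation matters visually.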
 
John Reynolds said:
Or they could just refuse to slap their seal of approval on any Nvidia drivers until the IHV ceases their current practices. But that's not going to happen, IMO, and looking at how few sites even use 3DMark in their reviews these days. . . .

And you know what? That's what really kills me.

All of these sites are looking for "fair" benchmarks. And here we have one (3DMark) whose makers are actively doing something about cheating, and in some perverse way it's turned against Futuremark. :devilish:

I've read things like "If the benchmark can be manipulated and it needs patches, what's it worth?"

AARRRRGHHH!!

Hello?! EVERY BENCHMARK IS BEING MANIPULATED... it's just that Futuremark (well, and the ShaderMark crew) are the only ones doing something about it. It amazes me how many dumb-ass sites bought into the NVIDIA FUD about 3DMark... NVIDIA has these sites so "confused" about 3DMark that they just don't use it.

Though some of the blame lies with Futuremark in how they handled the whole "cheat statement retraction" fiasco.
 