Futuremark Announces Patch for 3DMark03

That PR looks as though it was meant to downplay just this sort of finding, even though a lot of these sites have already been picked apart for their findings. Bah, the same old tricks from nVidia.
 
I am not so sure that FutureMark will hold this line of objective results. The stars are aligned against them, so to speak. Unless there is substantially more press coverage, the whole issue will be buried under piles of dung unloaded off the back of a very large green dump truck.

Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”

http://www.xbitlabs.com/news/video/display/20031112031947.html
 
DaveBaumann said:
digitalwanderer said:
Are you being serious or sarcastic with this one Dave? :|

No, I actually had a call that went along those lines this morning.
Oh...my...gods... :oops:

BTW-That Inquirer blurb is just fucking shameful!
The fucking shameful Inquirer said:
NORDIC HARDWARE has figures on its site showing how the latest build of 3DMark03 affects performance on Nvidia cards, and on ATI based graphics cards.
Yeah, ATi had about a 0.01% change and nVidia had a 13% change...that means it affected both their performance. :rolleyes:
 
Doomtrooper said:
There will be another Futuremark patch following the next driver release from Nvidia, so once we figure out how many Nvidia drivers releases are left this year, we can then extrapolate that into how many patches will be released from Futuremark :D

Well, that's easy...:) Based on recent and very strange comments from nVidia, the number of official drivers they will release annually will range from a maximum of 4 to a minimum of 1...:)

Personally, the only drivers I think FM should ban are the so-called nVidia "beta" drivers routinely released from mystery sources, which are actually not official beta drivers released by nVidia as beta drivers, nor a part of any official "beta-driver program" open to the public which nVidia uses to chase down bugs (if you send nVidia a bug report based on any driver other than its officially released drivers it will be ignored.) nVidia officially recommends against using these so-called "beta" releases of its drivers (which is reason enough in itself to ban them from the FM results tabulation, certainly.) To that end, I think that all so-called modded drivers for all IHV products should be banned as well, and that on general principle, if nothing else.

OTOH, it is not reasonable to expect that FM can regulate nVidia's driver activities, as nVidia employs many more people and is much wealthier than FM. What FM can do, though, is defend its own software regardless of what nVidia does with the Forcenators. That's why I think releasing a recompile patch every time nVidia releases an official set of Forcers is exactly the right way to handle this situation.

I'm not at all surprised to see that nVidia is continuing its practice of 3dmk03 application-specific driver modification ("optimization") for nV3x. What I am surprised about, and pleased about, is that FM is going the patch route! I didn't think they'd do it, and this has surprised me--pleasantly.

There's no doubt in my mind that releasing these kinds of patches is just the ticket for them. It puts them in control of their own software. No need to make a fuss and spend hundreds of man hours chasing down every little infraction--that situation is like a rat chasing its tail...:)

Simply releasing regular recompile patches designed to defeat 3dmk03-specific driver optimizations is the perfect approach, IMO. It gets the job done, and it is absolutely the most cost-effective way to do it. And since nVidia will begin to get the picture that spending all of that money and time optimizing for benchmarks that will be patched is chasing its own tail, it's barely possible nVidia will lose its incentive to special-case its drivers for 3dmk at some point in the future (nVidia is not a quick study and so I expect this will take time.)
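The reason a simple recompile defeats this kind of "optimization" is worth spelling out: a driver that special-cases a benchmark typically recognizes its shaders by fingerprinting the compiled bytecode, and a recompile that merely reorders independent instructions produces equivalent output with different bytes, so the fingerprint no longer matches. Here is a minimal sketch of that idea; the shader text, function names, and the use of an MD5 hash as the fingerprint are all invented for illustration, not a description of any actual driver.

```python
import hashlib

# The shader as shipped in the "330" build of the benchmark (invented example).
ORIGINAL_SHADER = "mul r0, v0, c0\nmul r1, v1, c1\nadd oC0, r0, r1"

# The hypothetical driver stores a hash of the shader it wants to replace.
KNOWN_FINGERPRINT = hashlib.md5(ORIGINAL_SHADER.encode()).hexdigest()

def driver_compile(shader_source):
    """Return a hand-tuned replacement if the shader is recognized,
    otherwise compile it honestly."""
    if hashlib.md5(shader_source.encode()).hexdigest() == KNOWN_FINGERPRINT:
        return "hand-tuned replacement shader"
    return "honestly compiled shader"

# Build 330: the fingerprint matches, so the replacement kicks in.
assert driver_compile(ORIGINAL_SHADER) == "hand-tuned replacement shader"

# Build 340: the two independent `mul` instructions are swapped. The result
# on screen is identical, but the bytes differ, so detection fails and the
# card has to run the real shader again.
REORDERED_SHADER = "mul r1, v1, c1\nmul r0, v0, c0\nadd oC0, r0, r1"
assert driver_compile(REORDERED_SHADER) == "honestly compiled shader"
```

This is also why the patch is so cheap for FM to produce: the shaders themselves don't change in any meaningful way, only their byte-level representation does.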

It is not, and never has been, FM's fault that IHVs ran all over its benchmarks with drivers riddled with special-case optimizations. FM's responsibility was to formulate a procedure to stop it, as much as the company could, seeing that FM doesn't control driver development inside any IHV. IMO, releasing a recompile patch hot on the heels of each official Forcenator release seems to be the very best approach, bar none. It would be nice if nVidia, like ATi, would just stop special-casing for 3dmk in its drivers when FM asked them to; but obviously, nVidia is not prepared to ever do so. So the patch approach is something that has to be done to protect the integrity of their software and to keep its control firmly in FM's hands. FM cannot control what IHVs do, nor can it control the manner in which individual web sites use its software. But FM can control what FM does, and doing patches like this is a giant step in the right direction going forward.

Now, if FM wanted to be "perfect," it would also ban from its database all drivers aside from those officially released by the IHVs.
 
Sabastian said:
I am not so sure that FutureMark will hold this line of objective results. The stars are aligned against them, so to speak. Unless there is substantially more press coverage, the whole issue will be buried under piles of dung unloaded off the back of a very large green dump truck.

Hans-Wolfram Tismer, a Managing Director for Gainward Europe GmbH said today: “According to my information patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance. This may not reflect gaming performance and may point in the wrong direction. To me 3DMark03 may look less and less suitable to be used for benchmarking. By the end of the day the end users are running games and applications, not 3DMark03.”

http://www.xbitlabs.com/news/video/display/20031112031947.html

Heh-Heh...:) Indeed a load of dung...:) What, is it now going to come out that nVidia's "GPU compiler" is nothing but a fancy name for special-case driver optimizations done on an application-specific basis for 3dmk03? IMO, that's the only way the patch could render it impotent.
("running on the cpu"--man, that's funny! Wouldn't be much of a "GPU compiler" if it stopped working simply because nVidia drivers could no longer recognize that 3dMk03 was the application being run...:)) I wonder if people realize how stupid their remarks sound before they make them?...:)

Anyway, same-old, same-old--let's shoot the messenger so that we can preserve our delusions. FM is on the right course here, and will be OK as long as they do not deviate from it, IMO.
 
It is also noteworthy that there is no word about this as of yet from ExtremeTech. Isn't ExtremeTech a part of FutureMark's beta-tester team? That is remarkable when you consider that they are in that PR released today by nVidia with regard to the ForceWare drivers.
 
nggalai said:
The same applies to theories claiming the "re-ordering of the code to disable application detection" is the culprit--there's no logical reason why such a re-ordering, even if it leads to sub-optimal code for the GFFX architecture, should produce such IQ differences.
That was not the theory.
We have two very obvious facts:
1) 3DMark03 build 340 differs from build 330
2) NVidia cards show a lower score in build 340

This leads me to the following possible explanations:
1) The driver uses application-specific hacks but fails to recognize build 340,
2) Build 340 contains changes that generally cause NVidia cards to lose performance, or
3) both of the above.

Almost everyone here is instantly pointing to 1), not without good reason I might add. But I don't think we have enough information yet to dismiss 3) entirely.

Just an - admittedly far-fetched - example: FM could have noticed that some PS1.4 shaders don't give the desired results on GFFX cards because they use FX12 precision, so they rewrote them using PS2.0. That would be a reasonable workaround in GT4 which already requires PS2.0. But there would be no cheating involved.

btw, I might be good at splitting hairs, but counting hairs isn't one of my strengths. That girl might have a slightly different haircut, but I don't really see more hair.

Regarding the glowing sword, the fire and the smoke, we have screenshots showing they're different, but not whether they're wrong.
 
Xmas said:
btw, I might be good at splitting hairs, but counting hairs isn't one of my strengths. That girl might have a slightly different haircut, but I don't really see more hair.

Regarding the glowing sword, the fire and the smoke, we have screenshots showing they're different, but not whether they're wrong.

The output was changed, that is pretty clear. If the output is changed from what the developer intended, then indeed it is wrong. And isn't it horribly ironic that when the correct output is produced, the frame rate is dramatically reduced in some instances? It doesn't take a rocket scientist to see what is going on here, Xmas.
 
Just an - admittedly far-fetched - example: FM could have noticed that some PS1.4 shaders don't give the desired results on GFFX cards because they use FX12 precision, so they rewrote them using PS2.0. That would be a reasonable workaround in GT4 which already requires PS2.0. But there would be no cheating involved.

No. Categorically, absolutely not.

WRT shaders, all we are talking about is moving instructions around in order to defeat complete shader replacements.

Regarding the glowing sword, the fire and the smoke, we have screenshots showing they're different, but not whether they're wrong.

One or the other must be, as every other comparison I've tried between the two patches shows no differences.
 
Has anyone done a before/after performance comparison on a GF3 or GF4ti?

I don't know why, but I'm really interested in hearing if there was a performance change on those also. :?
 
Hi Xmas,

I agree with your posting, but:
Xmas said:
btw, I might be good at splitting hairs, but counting hairs isn't one of my strengths. That girl might have a slightly different haircut, but I don't really see more hair.

Regarding the glowing sword, the fire and the smoke, we have screenshots showing they're different, but not whether they're wrong.
Well, Futuremark approved of 52.16 in combination with patch 340, but NOT with patch 330, due to the differences shown in the screenshots and, quite possibly, NV's stance on application optimisations as alluded to by Wavey in his article. Hence, those 330 screenshots show different and, yes, wrong output from GFFX, as far as Futuremark are concerned.

93,
-Sascha.rb

P.S. I am aware of the fact that one bloke in our forums reported different output on his Radeon, going from 330 to 340. But so far, nobody has come up with an explanation for, or in support of, that report.rb
 
P.S. I am aware of the fact that one bloke in our forums reported different output on his Radeon, going from 330 to 340. But so far, nobody has come up with an explanation for, or in support of, that report.rb

Do you know which test and which frame?
 
DaveBaumann said:
P.S. I am aware of the fact that one bloke in our forums reported different output on his Radeon, going from 330 to 340. But so far, nobody has come up with an explanation for, or in support of, that report.rb

Do you know which test and which frame?
GT2, frame 1043. The smoke billows differently. Otherwise, the output is identical.

Correction: it's not different smoke from 330 to 340, but with 340, two different runs. i.e. one run 340, take screenshot, do another run, take screenshot -> different smoke.

93,
-Sascha.rb
 
nggalai said:
Correction: it's not different smoke from 330 to 340, but with 340, two different runs. i.e. one run 340, take screenshot, do another run, take screenshot -> different smoke.

Hmmm...does the score for GT2 change from one run to the next?
 
Joe DeFuria said:
nggalai said:
Correction: it's not different smoke from 330 to 340, but with 340, two different runs. i.e. one run 340, take screenshot, do another run, take screenshot -> different smoke.

Hmmm...does the score for GT2 change from one run to the next?
apparently not, i.e. just in the .x realm (<1%)

93,
-Sascha.rb
 
DaveBaumann said:
No. Categorically, absolutely not.
That's why I said it is a far-fetched example. I hoped it would get the point across, though.

WRT shaders, all we are talking about is moving instructions around in order to defeat complete shader replacements.
Which can have an impact, even leaving shader replacements out of the picture.

Regarding the glowing sword, the fire and the smoke, we have screenshots showing they're different, but not whether they're wrong.

One or the other must be, as every other comparison I've tried betwen the two patches shows no differences.
What do the refrast and the Radeons show?
 