Futuremark Announces Patch for 3DMark03

DerekBaker said:
Dave,

Why isn't that article mentioned on Beyond's front page?


Derek


Currently nvidia is in negotiations with Dave over the article.

Dave- "Yea, I'm the Dave that wrote that article"
nvidia- "Have we told you we are interested in massive amounts of advertising on your site?"
Dave- "Ahh no, but what has that got to do with my article?"
nvidia- "We would have editing rights to your entire site content."
Dave- "I don't think so, good day gentlemen."
nvidia- "Ok Bubba, go break his legs."
 
All these differences could be valid, as they may have made slight shader modifications in the new patch which could modify the renderings. I believe the diffs are between the FX 330 vs 340, not 340 vs refrast.
 
bloodbob said:
All these differences could be valid, as they may have made slight shader modifications in the new patch which could modify the renderings. I believe the diffs are between the FX 330 vs 340, not 340 vs refrast.
Pray tell, what good would comparisons of FX 340 vs refrast do?
All it would do is verify that the 52.16 drivers aren't cheating in 3DMark03 patch 340 - which is totally NOT what your post implied. It implies that we'd find out whether the differences between the 330 and 340 rendering on the GFFX are valid by comparing the 340 renders against the refrast, which would not do that at all.


What we need to establish is proof; this can be done by comparing the refrast to BOTH the 330 and 340 images.
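The proof being asked for here is essentially a per-pixel comparison of captured frames. A minimal sketch with NumPy (the arrays below stand in for refrast/330/340 screen captures; the threshold value is an arbitrary illustration, not anything from 3DMark itself):

```python
import numpy as np

def diff_stats(ref, test, threshold=8):
    """Compare two RGB frame captures (H x W x 3 uint8 arrays).

    Returns the count of pixels whose largest per-channel absolute
    difference exceeds `threshold`, plus the mean absolute error.
    """
    ref = ref.astype(np.int16)   # avoid uint8 wrap-around on subtract
    test = test.astype(np.int16)
    per_pixel = np.abs(ref - test).max(axis=-1)
    return int((per_pixel > threshold).sum()), float(np.abs(ref - test).mean())

# Two identical captures differ in zero pixels; flip one pixel and
# the count picks it up.
a = np.zeros((4, 4, 3), dtype=np.uint8)
b = a.copy()
b[0, 0] = (200, 0, 0)
print(diff_stats(a, a))     # (0, 0.0)
print(diff_stats(a, b)[0])  # 1
```

Running something like this against refrast-vs-330 and refrast-vs-340 captures would show whether the 330 renders were already off before the patch, which is the comparison being argued for above.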
 
ByteMe said:
DerekBaker said:
Dave,

Why isn't that article mentioned on Beyond's front page?


Derek


Currently nvidia is in negotiations with Dave over the article.

Dave- "Yea, I'm the Dave that wrote that article"
nvidia- "Have we told you we are interested in massive amounts of advertising on your site?"
Dave- "Ahh no, but what has that got to do with my article?"
nvidia- "We would have editing rights to your entire site content."
Dave- "I don't think so, good day gentlemen."
nvidia- "Ok Bubba, go break his legs."



^^^....Simply amazing quote =)

keep em coming folks
 
OpenGL guy said:
Did you see any performance difference on Radeons? Did you see any image quality differences? No? Why do you suppose that is?
Two possibilities:
a) there are no changes in the rendering code
b) the changes don't affect the Radeons (much)
I "favor" neither of those over the other.

Even simple code reordering can make a driver choke on your application, but that doesn't mean all drivers will.
Who said anything about code reordering? Shouldn't the driver/hardware give the same correct result in either case?
Code reordering was an example. As an experienced programmer, you should certainly be aware that even slight changes can result in big differences.
 
Reverend said:
Here's what a certain OEM wrote me and Dave in mid-October :

An OEM personnel said:
Anthony/Dave,

My name is [removed], and I’m an engineer for the Performance/Architecture team at [name of OEM removed], with one responsibility being to define benchmarks that we use internally at [name of OEM] for our internal qualification of product (graphics or platform). I (as well as many others, I realize) ran across your Tomb Raider benchmark last month when an IHV had problems with people using it as a benchmark.

We would like to implement it as a standard DX9 test that we use in our labs for our fall/spring graphics refreshes and wondered if you’d be willing to share the Prague3a demo & batch files that you had created for your own tests. I thought these were on your site at one time, but just found out yesterday that is not the case, assuming it was removed by request.

I’d appreciate the help and look forward to hearing back from you.

Thanks!

[name]
Performance & Architecture
[name of OEM]
Anthony,
thanks for sharing this, it had me laughing out loud.
Maybe OEMs deserve some credit after all..

Oh, and no, we didn't grant him his request, naturally.
Sorry if I don't get this completely.
But I suppose you want to keep the custom demos/batches strictly to yourselves to avoid a certain IHV getting hands on them and optimizing for them, right?

Since I think this OEM's on the right track, I hope you were able to help them in one way or another.

On the general topic:
While I kinda understand FM's "clean slate" approach, I doubt it'll work out all right.
Some predictions:
- we'll see a new "BetaDet" soon that'll be back up to pre-340 "performance"
- certain websites will (continue to) use those betas, as they are practically identical to the new WHQL drivers which will be released very shortly
- FM will be accused of favouring ATI 'cause their score didn't change
Those aren't my opinions, just what I expect will happen.

Cheers,
Mac
 
banksie said:
Thanks for that Dave, reading through it now. Slight corrections for you...

Thanks, updated.

banksie said:
Out of interest what makes you suspect that the vertex shader differences are perhaps related to extra precision by being executed on the CPU and not, as we seem to have seen in the past, simply a lower precision shader substitution?

That was a theory that could explain why there were only minute differences. However, there is only one precision for Vertex Shaders AFAIK, as everything is done in FP32. If it were down to some kind of large-scale precision difference then you would probably see differences across all the leaves etc. Here we only have tiny, single-pixel differences, which could be accounted for by differences in how a CPU and the graphics chip handle rounding, meaning that some vertex positions end up in one pixel rather than another.
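As a toy illustration of that rounding point (this is not the actual 3DMark transform, just a made-up viewport mapping with contrived numbers): the same coordinate pushed through the same arithmetic at different precisions can land in neighbouring pixel columns.

```python
import numpy as np

def to_pixel(x_ndc, width, dtype):
    """Map a normalised device coordinate in [-1, 1] to an integer
    pixel column, doing the arithmetic at the given precision."""
    x = dtype(x_ndc)
    half = dtype(width) * dtype(0.5)
    return int(np.floor(x * half + half))

# A coordinate sitting almost exactly on a pixel boundary: float32
# rounds it onto the boundary, float64 keeps it just below, so the
# vertex ends up one pixel column over depending on precision.
x = 0.49999999
print(to_pixel(x, 1024, np.float32))  # 768
print(to_pixel(x, 1024, np.float64))  # 767
```

The boundary value is deliberately chosen to sit on a rounding edge; the point is only that CPU-vs-GPU rounding differences can move a vertex by one pixel, which matches the single-pixel differences described above.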

Ostsol said:
DaveBaumann said:
http://www.beyond3d.com/articles/3dmark03/340/
A good article, though I must say that I had been hoping for some benches of the synthetic tests in 3DMark03. Some confirmation of NordicHardware's results would have been nice, as well as some investigation into how the PS2.0 scores managed to remain the same.

Look on page 4; there is a CSV download of all the results I've run.

nelg said:
Dave, I cannot open up the image showing the difference in the hair (game test 3, centre image).

Fixed.
 
Joe DeFuria said:
At one point we asked Derek how this sat with the optimisation guidelines that were given to the press by NVIDIA, specifically the guideline that suggests "An optimization must accelerate more than just a benchmark". To which Derek's reply was "But 3DMark03 is only a benchmark" -- it was suggested that this particular guideline should read "An optimization must accelerate more than just a benchmark, unless the application is just a benchmark"!

Oh....my.....God....

Note that I suggested that - Derek was kinda like "Uhh, weeeeell, I guess so...".
 
IIRC, when build 330 was released, it was later found that FutureMark had missed one of the cheats in the nVidia drivers (I recall OpenGL Guy talking about compressed textures or something). How can anyone have faith in this pathetic way of going about things, where we know nVidia are cheating (and don't confuse cheats being blocked with not cheating at all), yet have no way of verifying that every cheat was hit (and from the PS 2.0 tests it's obvious they weren't)?

This is a feeble response after months of stalling, and I fail to see how it validates 3DMark03 in any fashion when it's obvious from the PS 2.0 test results (and the cheat that slipped by 330) that this patch approach is not an effective method of dealing with the problem.
 
bloodbob said:
I only wish they would make an option to enable AF in 3DMark2k3 :/ if they did, no nvidia drivers would pass.
If you have the Pro version, then you can - just go into Settings and change the texture filtering. Do you mean 2001 rather than 03?
 
Quitch said:
This is a feeble response after months of stalling, and I fail to see how it validates 3DMark03 in any fashion when it's obvious from the PS 2.0 test results (and the cheat that slipped by 330) that this patch approach is not an effective method of dealing with the problem.

It's all well and good to say this, but what more could they do?
 
First of all let me say that I generally like what FutureMark is doing now. However, there's one thing I do not understand:

4. Generic optimizations that do not violate the above rules and benefit applications in general are acceptable only if the rendering is mathematically consistent with that of Microsoft® DirectX® reference rasterizer.
The "brilinear" filter of the 52.16 drivers (see http://www.3dcenter.de/artikel/2003/10-26_a_english.php) is a generic optimization which is mathematically not consistent with the reference rasterizer. So could anyone please explain to me why FutureMark approved the 52.16? The 52.16 drivers definitely violate the rule quoted above. Or is there someone who does not agree with me here?
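For reference, the difference between true trilinear filtering and a "brilinear" scheme can be sketched in a few lines. The band width below is purely illustrative (the actual value used by the 52.16 drivers isn't public); the point is that inside the band the mip blend factor is snapped to pure bilinear, which is not what the reference rasterizer computes:

```python
def trilinear_weight(lod):
    """Blend factor between the two nearest mip levels: the
    fractional part of the LOD, as the reference rasterizer uses it."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    """A 'brilinear' approximation: snap the blend factor to 0 or 1
    inside a dead band around each integer LOD, so large parts of
    the screen get cheap bilinear filtering instead of a true blend.
    The band width is a made-up illustrative value."""
    f = lod - int(lod)
    if f < band:
        return 0.0          # pure bilinear from the lower mip
    if f > 1.0 - band:
        return 1.0          # pure bilinear from the upper mip
    # rescale the remaining range so the blend still spans 0..1
    return (f - band) / (1.0 - 2.0 * band)

for lod in (1.1, 1.5, 1.9):
    print(lod, trilinear_weight(lod), brilinear_weight(lod))
```

With a band of 0.25, half of each mip transition renders with plain bilinear filtering, which is why the output cannot be mathematically consistent with a refrast that always computes the full trilinear blend.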
 
Althornin said:
bloodbob said:
All these differences could be valid, as they may have made slight shader modifications in the new patch which could modify the renderings. I believe the diffs are between the FX 330 vs 340, not 340 vs refrast.
Pray tell, what good would comparisons of FX 340 vs refrast do?
All it would do is verify that the 52.16 drivers aren't cheating in 3DMark03 patch 340 - which is totally NOT what your post implied. It implies that we'd find out whether the differences between the 330 and 340 rendering on the GFFX are valid by comparing the 340 renders against the refrast, which would not do that at all.


What we need to establish is proof; this can be done by comparing the refrast to BOTH the 330 and 340 images.

My bad, it should be 330 vs refrast, but the fact that there might be shader alterations still holds true.

Neeyik said:
bloodbob wrote:
I only wish they would make an option to enable AF in 3DMark2k3 :/ if they did, no nvidia drivers would pass.

If you have the Pro version, then you can - just go into Settings and change the texture filtering. Do you mean 2001 rather than 03?

Thanks, I didn't know that.

May I ask how 3DMark03 stops nvidia using brilinear/pseudo-trilinear in D3D? This comes under the generic optimisation header, but it does not produce the correct image. I'm sure lots of people would love to know how to force AF and proper trilinear filtering.

It's been said several times already, I know.
 
For immediate release

Recently, FutureMark released a new patch, version 340, for its once industry-leading 3DMark2003 benchmark. We at Nvidia are part of FutureMark's beta program (a program that can cost up to hundreds of lawyers per year to bully your way into), and we would like to say that we completely agree with FutureMark's optimization guidelines (we internally refer to those as "wishful thinking"). Since FutureMark are our good pals now, we will not say they did something to make us look bad with the 340 patch, but we will instead remind them that their Finnish headquarters are mostly made of wood, and we would hate it if something bad were to happen.

Here at Nvidia we are dedicated to providing our 120 450 465 406 546 504 654 065 461 0425 406 546 876 406 903 541 658 706 013 210 601 300 304 650 465 044 056 106 473 customers with the best benchmark results available. We would love to see FutureMark release "3DMark 2003: the game" so we could optimize for both it and the benchmark and be consistent with our own guidelines (we internally refer to those as "PR crap for gullible reviewers"), but they don't want to, so we have to settle for optimizing for the benchmark itself. Regarding the recently released numbers, here is what our superhuman chief scientist David Kirk has to say: "We don't object to those performance numbers as released by FutureMark, but don't forget our direct competitor is cheating in the benchmark: since they are using only 24 of those complicated "bits" thingies, it is likely they use the remaining 8 bits (remember, bits can only come in powers of 2) to hide their 3DMark2003 cheats. We tried to raise the problem with FutureMark, but so far they haven't listened to us.
I still personally believe that patching benchmarks is the wrong answer, since it can disrupt the results of perfectly legit and totally generic optimizations. It is just a slight bug in our software that our perfectly legit and totally generic optimizations only kick in when 3DMark 2003 is started, but our next driver revision should bring back the exact same level of performance, together with bilinear filtering (which looks just as good as the real thing in a variety of situations)"

Nvidia is a global cheater in the communication age, and our goal is to "deface every pixel on the planet".
 
Hanners said:
Quitch said:
This is a feeble response after months of stalling, and I fail to see how it validates 3DMark03 in any fashion when it's obvious from the PS 2.0 test results (and the cheat that slipped by 330) that this patch approach is not an effective method of dealing with the problem.

It's all good and well to say this, but what more could they do?

What they said they would do: not approve drivers which cheat. And none of this rubbish about how they don't... they do, they try so hard, but 340 stops them. Remove 340 and the cheats get through. It's like saying computer A isn't trying to crack computer B because computer B has a firewall stopping the attacks.

Nor do I like these sympathetic takes on the FutureMark situation. If it were a free product, sure. But it isn't; I paid money, others paid money, the beta partners pay money... I want the service that was initially laid out, where cheats were a no-no.
 
XForce said:
Oh, and no, we didn't grant him his request, naturally.
Sorry if I don't get this completely.
But I suppose you want to keep the custom demos/batches strictly to yourselves to avoid a certain IHV getting hands on them and optimizing for them, right?
We keep demos to ourselves but do avail them to certain websites we trust. However, we treat OEMs the same as IHVs (basically) for obvious reasons.

Since I think this OEM's on the right track, I hope you were able to help them in one way or another.
We offered the OEM an alternative (record a demo similar, but not identical, to the one we use in our reviews/articles and give this alternative demo to the OEM), but it appears they are able to do this themselves.
 
Quitch said:
What they said they would do: not approve drivers which cheat. And none of this rubbish about how they don't... they do, they try so hard, but 340 stops them. Remove 340 and the cheats get through. It's like saying computer A isn't trying to crack computer B because computer B has a firewall stopping the attacks.

As I understand it, the purpose of the 340 patch is to give everyone a 'clean slate' - no working cheats and a set of approved drivers. The ball is now firmly in the IHVs' court; it's up to them to 'do the right thing' from here on in, and face the consequences if they don't.

From what Derek Perez said to this site, it appears that nVidia won't be having another set of FutureMark approved drivers, 52.16 will be as good as it gets.
 
Hanners said:
As I understand it, the purpose of the 340 patch is to give everyone a 'clean slate' - no working cheats and a set of approved drivers. The ball is now firmly in the IHVs' court; it's up to them to 'do the right thing' from here on in, and face the consequences if they don't.

Reading this thread, there are still doubts that this "clean slate" has been achieved, because the performance drop in 340 does not mirror the drops we see in other benchmarks.
 
Bouncing Zabaglione Bros. said:
Reading this thread, there are still doubts that this "clean slate" has been achieved, because the performance drop in 340 does not mirror the drops we see in other benchmarks.

That remains to be seen, I guess, but it's a 'cleanish' slate at least - if nothing else it gives us a starting point for nVidia's almost guaranteed future deceptions in this application.
 