Futuremark Announces Patch for 3DMark03

There's a little bug with rev340 and the det 52.16 drivers: the "noses" of the planes in GT1 are transparent.

Hasn't Futuremark checked this kind of thing? The det 52.16 drivers are approved, yet they don't render GT1 correctly?
 
tb said:
Tridam said:
tb said:
Could someone with a GFFX card try 3D-Analyze's "Anti-Detect-Mode" (shaders option only) with the new 340 patch and the official drivers? I only want to know if the Pixel Shader 2.0 test performance will decrease... (two bmp/png screenshots would be nice, one with and one without "ADM" -> thomas@tommti-systems.com)

Best Regards,
Thomas

The result is the same with 330 and with 340, with ADM and without ADM: 50.9 FPS.
So, the shader compiler seems to do a good job. Thanks.
A little blue bird told me that it's possible the ADM is not working ;)

-FUDie
 
Tridam said:
There's a little bug with rev340 and the det 52.16 drivers: the "noses" of the planes in GT1 are transparent.

Hasn't Futuremark checked this kind of thing? The det 52.16 drivers are approved, yet they don't render GT1 correctly?
Err - the nose cones are supposed to be transparent! Or do you mean that the alpha textures just aren't there full stop?
 
Neeyik said:
Tridam said:
There's a little bug with rev340 and the det 52.16 drivers: the "noses" of the planes in GT1 are transparent.

Hasn't Futuremark checked this kind of thing? The det 52.16 drivers are approved, yet they don't render GT1 correctly?
Err - the nose cones are supposed to be transparent! Or do you mean that the alpha textures just aren't there full stop?

I need to take a closer look at this. Maybe I'm wrong.
 
digitalwanderer said:
Heck, could someone try the 52.70 drivers that someone found on MSI's site and posted up over at www.guru3d.com with the 3.40 patch, to see if it scores the same as the 52.16 set or the same as the pre-3.30 build? (The files in the beta are dated October 23; how long was the 3.33 out for, and did FM let nVidia have a copy of it? If so, when-ish?)

Results are the same with an FX5950 and 52.16 or 52.70.
 
Neeyik said:
Tridam said:
There's a little bug with rev340 and the det 52.16 drivers: the "noses" of the planes in GT1 are transparent.

Hasn't Futuremark checked this kind of thing? The det 52.16 drivers are approved, yet they don't render GT1 correctly?
Err - the nose cones are supposed to be transparent! Or do you mean that the alpha textures just aren't there full stop?

Actually, it seems to render differently on ATI and NV boards.
 
Here are those test results I was on about in full:

[image: 3dmark03test.png]

Drops all round on the FX with the drivers. Aww but look, the little old GF4 gets a wee boost with the new patch 8)...
 
Tridam said:
digitalwanderer said:
Heck, could someone try the 52.70 drivers that someone found on MSI's site and posted up over at www.guru3d.com with the 3.40 patch, to see if it scores the same as the 52.16 set or the same as the pre-3.30 build? (The files in the beta are dated October 23; how long was the 3.33 out for, and did FM let nVidia have a copy of it? If so, when-ish?)

Results are the same with an FX5950 and 52.16 or 52.70.
Thank you! :)
 
Neeyik said:
Drops all round on the FX with the drivers. Aww but look, the little old GF4 gets a wee boost with the new patch 8)...
Could NVIDIA be using app. detect to artificially hold back GF4 performance in an effort to make the GFFX parts look better? :D

-FUDie
 
Could NVIDIA be using app. detect to artificially hold back GF4 performance in an effort to make the GFFX parts look better?

Could be that their app-specific optimizations for 3DMark03 just don't work very well with the GF4, as they were designed to help out the GFFX, possibly by taking advantage of quirks in its architecture.
 
dan2097 said:
Could NVIDIA be using app. detect to artificially hold back GF4 performance in an effort to make the GFFX parts look better?
Could be that their app-specific optimizations for 3DMark03 just don't work very well with the GF4, as they were designed to help out the GFFX, possibly by taking advantage of quirks in its architecture.
That would be pretty silly, doncha think? I mean, if you're making HW-specific-app.-specific changes (say that 5 times fast), then they should affect the relevant HW only, right?

-FUDie
 
Thx Neeyik for the results.
I'm left wondering why the PS test score is the same for the 330 and 340 with the 52.16 driver, though. :?
 
DaveBaumann said:
Quitch said:
So obviously FutureMark are trying to make nVidia look bad (for reasons we can't understand) by disabling their GPU compiler but not ATI's ;)

Actually, according to a call from one journo I had this morning, it's all Dell's fault... You know, Dell, who are currently carrying all of ATI's high end, asked for a patch from Futuremark that purposefully disables NVIDIA's shader optimiser... :!:

I'm curious... Is nVidia giving you any flak over your article?

Regards,

Taz
 
There was a point made regarding nVidia Unified Compiler on another forum that I figured I'd toss up here and let people investigate:

Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.

http://www.nvidia.com/object/IO_9292.html - That will explain it.


I'm both over-busy and under-capable enough to delve through and pick out the appropriate points, so I figured I'd abuse the technical knowledge of you guys to shed light on it. That's right, I'm USING you! :p ;)
 
cthellis42 said:
There was a point made regarding nVidia Unified Compiler on another forum that I figured I'd toss up here and let people investigate:

Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.

http://www.nvidia.com/object/IO_9292.html - That will explain it.


I'm both over-busy and under-capable enough to delve through and pick out the appropriate points, so I figured I'd abuse the technical knowledge of you guys to shed light on it. That's right, I'm USING you! :p ;)

Nowhere does it say anything about bypassing the optimisations; if anything, the block diagram shows it is impossible to do so. So either this is FUD, or I have gone blind from my study this morning, which I should get back to. (Probably the latter, but can someone verify that I'm not going crazy?)
 
cthellis42 said:

That document reads more like marketing FUD to me.

Besides, the application can't "force" anything on the driver. The driver works at a much lower level and sits between the application and the graphics card. All the application can do is request things, and the driver responds to those requests however it wants to. This is why Nvidia are capable of outputting changed screenshots, low precision shaders, or giving bilinear when trilinear is requested.
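For what it's worth, here's a minimal C++ sketch of how thin the application's side of that relationship is. SubmitWork and ps20ByteCode are invented placeholder names, not 3DMark's actual code; the point is simply that everything the app does goes through requests like these, and the driver decides what really happens with them.

```cpp
// Minimal sketch, assuming a valid IDirect3DDevice9* and some compiled
// PS 2.0 bytecode. SubmitWork/ps20ByteCode are made-up names, not 3DMark's
// code; they just show that the app only *requests* things via the D3D9 API.
#include <windows.h>
#include <d3d9.h>

void SubmitWork(IDirect3DDevice9* device, const DWORD* ps20ByteCode)
{
    // Request trilinear filtering on sampler 0. The driver may still give
    // back bilinear (or "brilinear"); the app has no way to force it.
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // Hand over the PS 2.0 token stream. The driver compiles or replaces it
    // however it sees fit; the app only gets an opaque handle back.
    IDirect3DPixelShader9* shader = NULL;
    if (SUCCEEDED(device->CreatePixelShader(ps20ByteCode, &shader)))
    {
        device->SetPixelShader(shader);
        // ... draw calls, Present(), Release() the shader, etc.
    }
}
```

The app never finds out whether the driver honoured the trilinear request or substituted its own shader; it only sees the results on screen.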
 
Tokelil said:
Thx Neeyik for the results.
I'm left wondering why the PS test score is the same for the 330 and 340 with the 52.16 driver, though. :?
That struck me as very odd, too.
Either the NV driver guys are wizards, or something slipped past FM.

cthellis42 said:
Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.
Can't be bothered to sift through this tech-FUD .pdf, but I suppose by "allowing the compiler to work" they really mean "allowing the driver to detect the app". :?
Besides, everyone in the know (which is not me, at least not to this extent) says that corrupting the compiler is not possible from the app layer.
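Just to put some shape on what "allowing the driver to detect the app" can amount to, here's a purely hypothetical C++ sketch. The function names are invented and this is nobody's real driver code, but keying a code path off the host executable's name is about all it would take.

```cpp
// Hypothetical sketch only; invented names, not any vendor's actual code.
// A driver-side check like this is enough to turn app-specific paths on;
// the application itself cannot see or veto it.
#include <windows.h>
#include <cctype>
#include <string>

static bool LooksLike3DMark03()
{
    char path[MAX_PATH] = { 0 };
    GetModuleFileNameA(NULL, path, MAX_PATH);   // full path of the host .exe
    std::string exe(path);
    for (size_t i = 0; i < exe.size(); ++i)
        exe[i] = static_cast<char>(std::tolower(static_cast<unsigned char>(exe[i])));
    return exe.find("3dmark03") != std::string::npos;
}

bool UseHandTunedReplacements()
{
    // If the benchmark is recognised, hand-tuned shader replacements (or
    // quality reductions) could be switched in silently.
    return LooksLike3DMark03();
}
```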
 
With regards to Gainward's comments earlier. (This is good to see, BTW.)

Futuremark has made an official comment on Gainward's accusations:
The accusation is totally wrong because what it suggests is not even technically feasible. 3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable the GPU compiler.

The only change in build 340 is the order of some instructions in the shaders or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

Let's also repeat that 3DMark specific driver optimizations are forbidden in our run rules because they invalidate the performance measurement and the resulting score is not comparable to other hardware.

Thus, the right conclusion is that the new version of 3DMark03 is now very suitable for objective performance measurement between different hardware.
Will the optimization madness ever stop?


You will probably have noticed that there is a list of approved drivers on Futuremark's website. You may be wondering why the ForceWare 52.16 is in that list, bearing in mind that it still contains benchmark optimizations. Here is Futuremark's answer:
The reason why any given driver is listed there is that those drivers produce a valid performance measurement result with 3DMark03 build 340. From our point of view, the most important thing for us to do is to enable our customers to get a comparable score.

http://www.dvhardware.net/article2116.html
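To put Futuremark's point above about reordered-but-equivalent shaders into something concrete, here's a rough C++ stand-in (Shade330/Shade340 are invented names, and this is plain arithmetic, not actual shader code): both routines compute exactly the same thing, but an "optimisation" that pattern-matches the exact old instruction sequence will no longer recognise the new one.

```cpp
// Plain-C++ illustration, not real shader code: Shade330/Shade340 are
// invented. Build 340 only reorders instructions and renames registers,
// so the maths is unchanged: a genuine compiler should handle both equally
// well, while a shader-specific lookup/replacement scheme stops matching.
#include <cstdio>

// "Old" ordering: r0 = a*b; r1 = r0 + c; result = r1*d
float Shade330(float a, float b, float c, float d)
{
    float r0 = a * b;
    float r1 = r0 + c;
    return r1 * d;
}

// "New" ordering: same arithmetic, different instruction order and temporaries
float Shade340(float a, float b, float c, float d)
{
    float t1 = c;          // operand staged first this time
    float t0 = a * b;
    return (t0 + t1) * d;  // identical result
}

int main()
{
    std::printf("%f %f\n", Shade330(1.5f, 2.0f, 0.5f, 3.0f),
                           Shade340(1.5f, 2.0f, 0.5f, 3.0f));
    return 0;
}
```

Same numbers out of both, so a real compiler should run them equally fast; only detection keyed to the exact old shader is affected.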
 
Sabastian said:
With regards to Gainward's comments earlier. (This is good to see, BTW.)

Futuremark has made an official comment on Gainward's accusations:
The accusation is totally wrong because what it suggests is not even technically feasible. 3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable the GPU compiler.

The only change in build 340 is the order of some instructions in the shaders or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

Let's also repeat that 3DMark specific driver optimizations are forbidden in our run rules because they invalidate the performance measurement and the resulting score is not comparable to other hardware.

Thus, the right conclusion is that the new version of 3DMark03 is now very suitable for objective performance measurement between different hardware.
Will the optimization madness ever stop?


You will probably have noticed that there is a list of approved drivers on Futuremark's website. You may be wondering why the ForceWare 52.16 is in that list, bearing in mind that it still contains benchmark optimizations. Here is Futuremark's answer:
The reason why any given driver is listed there is that those drivers produce a valid performance measurement result with 3DMark03 build 340. From our point of view, the most important thing for us to do is to enable our customers to get a comparable score.

http://www.dvhardware.net/article2116.html
Isn't NVIDIA still compressing textures in GT1? If so, how could FutureMark miss it?

-FUDie
 