tb said:
Could someone with a gffx card try 3D-Analyze's "Anti-Detect-Mode" (shaders option only) with the new 340 patch and the official drivers? I only want to know if the Pixel Shader 2.0 test performance will decrease... (two bmp/png screenshots would be nice, one with and one without "ADM" -> thomas@tommti-systems.com)

Best Regards,
Thomas

Tridam said:
Result is the same with 330, with 340, with ADM and without ADM: 50.9 FPS.

tb said:
So, the shader compiler seems to do a good job. Thanks.

A little blue bird told me that it's possible the ADM is not working.
Tridam said:
There's a little bug with rev340 and det 52.16: the "nose" of the planes in GT1 is transparent. Hasn't Futuremark checked this kind of thing? The det 52.16 drivers are approved, yet they don't render GT1 correctly?

Neeyik said:
Err - the nose cones are supposed to be transparent! Or do you mean that the alpha textures just aren't there, full stop?
digitalwanderer said:Heck, could someone try the 52.70 drivers that someone found on MSI's site and posted up over at www.guru3d.com out on the 3.40 patch to see if it scores the same as the 52.16 set or the same as the pre-3.30 build? (The files in the beta are dated October 23, how long was the 3.33 out for and did FM let nVidia have a copy of it? If so, when-ish? )
Tridam said:
Results are the same with a FX5950 and 52.16 or 52.70.

Thank you!
Neeyik said:
Drops all round on the FX with the drivers. Aww, but look, the little old GF4 gets a wee boost with the new patch 8)...

Could NVIDIA be using application detection to artificially hold back GF4 performance in an effort to make the GFFX parts look better?
dan2097 said:
Could be their app-specific optimizations for 3DMark03 just don't work very well with the GF4, as they were designed to help out the GFFX, possibly by taking advantage of the quirks in its architecture.

That would be pretty silly, doncha think? I mean, if you're making HW-specific, app-specific changes (say that 5 times fast), then they should affect the relevant HW only, right?
Quitch said:
So obviously FutureMark are trying to make nVidia look bad (for reasons we can't understand) by disabling their GPU compiler but not ATI's.

DaveBaumann said:
Actually, according to a call from one journo I had this morning, it's all Dell's fault... You know, Dell, who are currently carrying all of ATI's high end, asked for a patch from Futuremark that purposefully disables NVIDIA's shader optimiser...
Tokelil said:
Thx Neeyik for the results.
I'm left wondering why the PS test score is the same for the 330 and 340 with the 52.16 driver, though. :?
cthellis42 said:
There was a point made regarding nVidia's Unified Compiler on another forum that I figured I'd toss up here and let people investigate:
Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.
http://www.nvidia.com/object/IO_9292.html - That will explain it.
I'm both over-busy and under-capable enough to delve through and pick out the appropriate points, so I figured I'd abuse the technical knowledge of you guys to shed light on it. That's right, I'm USING you!
Tokelil said:
Thx Neeyik for the results.
I'm left wondering why the PS test score is the same for the 330 and 340 with the 52.16 driver, though. :?

That struck me as very odd, too.
cthellis42 said:
Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.

Can't be bothered to sieve through this tech-FUD .pdf, but I suppose by "allowing the compiler to work" they really mean "allowing the driver to detect the app". :?
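For anyone trying to picture the call path being argued over here, below is a minimal C++ / Direct3D 9 sketch of how an application gets a pixel shader to the driver. It is my own illustration, assuming the common D3DX route; the helper name and the "main" entry point are made up, and this is not Futuremark's actual code. The point it shows is simply that the application produces bytecode and hands it to the DirectX runtime; nothing in this path exposes a switch for a driver-internal compiler, whatever the NVIDIA PDF means by applications having to "allow" it.

[code]
#include <cstring>
#include <d3d9.h>
#include <d3dx9.h>

// Hypothetical helper (not Futuremark's code): compile an HLSL pixel shader
// and hand the resulting ps_2_0 bytecode to the Direct3D 9 runtime.
IDirect3DPixelShader9* CreatePixelShaderFromHlsl(IDirect3DDevice9* device,
                                                 const char* hlslSource)
{
    LPD3DXBUFFER bytecode = NULL;
    LPD3DXBUFFER errors   = NULL;

    // HLSL -> ps_2_0 compilation happens in D3DX, inside the application,
    // with no vendor driver involved at this stage.
    if (FAILED(D3DXCompileShader(hlslSource, (UINT)strlen(hlslSource),
                                 NULL, NULL, "main", "ps_2_0", 0,
                                 &bytecode, &errors, NULL)))
    {
        if (errors) errors->Release();
        return NULL;
    }

    // The finished bytecode goes to the runtime, which passes it on to the
    // driver; the driver is free to re-schedule instructions or re-allocate
    // registers internally, and the app never sees (or controls) that step.
    IDirect3DPixelShader9* shader = NULL;
    device->CreatePixelShader(
        static_cast<const DWORD*>(bytecode->GetBufferPointer()), &shader);

    bytecode->Release();
    if (errors) errors->Release();
    return shader;
}
[/code]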
Sabastian said:
With regards to Gainward's comments earlier. (This is good to see, BTW.)

Futuremark has made an official comment on Gainward's accusations:

The accusation is totally wrong because what it suggests is not even technically feasible. 3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable a GPU compiler.

The only change in build 340 is the order of some instructions in the shaders, or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

Let's also repeat that 3DMark-specific driver optimizations are forbidden in our run rules, because they invalidate the performance measurement and the resulting score is not comparable to other hardware.

Thus, the right conclusion is that the new version of 3DMark03 is now very suitable for objective performance measurement between different hardware.

Will the optimizations madness ever stop?

You will probably have noticed that there is a list of approved drivers on Futuremark's website. You may be wondering why ForceWare 52.16 is in that list, bearing in mind that it still contains benchmark optimizations. Here is Futuremark's answer:

The reason why any given driver is listed there is that those drivers produce a valid performance measurement result with 3DMark03 build 340. Looking at it from our point of view, the most important thing for us to do is to enable our customers to get a comparable score.

http://www.dvhardware.net/article2116.html

Isn't NVIDIA still compressing textures in GT1? If so, how could FutureMark miss it?
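To make Futuremark's point about build 340 concrete: reordering independent instructions or renaming registers leaves the math untouched, but it breaks any driver trick that recognizes one specific benchmark shader. The toy C++ sketch below is entirely hypothetical; no vendor's driver is written like this, and the ps_2_0-style fragments are invented. It only shows how a naive fingerprint of the shader text misses a reordered-but-equivalent shader, so a hand-tuned replacement keyed to that fingerprint would stop firing, while a genuine compiler would be expected to handle both versions with roughly the same performance.

[code]
#include <cstdio>

// Hypothetical fingerprint: a simple FNV-1a hash over the shader text.
static unsigned long Fingerprint(const char* shaderText)
{
    unsigned long h = 2166136261UL;
    for (const char* p = shaderText; *p; ++p)
        h = (h ^ (unsigned char)*p) * 16777619UL;
    return h;
}

int main()
{
    // Build-330-style fragment (illustrative ps_2_0 assembly only).
    const char* oldShader =
        "dp3 r0, t0, c0\n"
        "mul r1, v0, c1\n"
        "add r2, r0, r1\n";

    // Build-340-style fragment: same math, but the two independent
    // instructions are swapped and r2 is renamed to r3.
    const char* newShader =
        "mul r1, v0, c1\n"
        "dp3 r0, t0, c0\n"
        "add r3, r0, r1\n";

    const unsigned long knownBenchmarkShader = Fingerprint(oldShader);

    // The detection-based path fires only for the byte-exact shader...
    std::printf("old shader detected: %s\n",
                Fingerprint(oldShader) == knownBenchmarkShader ? "yes" : "no");
    // ...and silently misses the equivalent one, so any app-specific
    // replacement shader keyed to the fingerprint no longer kicks in.
    std::printf("new shader detected: %s\n",
                Fingerprint(newShader) == knownBenchmarkShader ? "yes" : "no");
    return 0;
}
[/code]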