Futuremark Announces Patch for 3DMark03

Neeyik said:
Here are those test results I was on about in full:

Drops all round on the FX with the drivers.
Though I find it VERY strange that the 44.03 driver drops performance so much (since 330 already disables the optimizations this driver has for 3DMark03). But the FX 5900 Ultra loses ANOTHER 50% (in GT2 and GT3 at least) with the 44.03 driver going from 330 to 340???
 
I assume he's talking about 44.03 drops from the non-patched 3DMark to the 340 patch of 3DMark - not from the first patch to this patch. 52.17 doesn't drop ANOTHER 50% over and above the drops already witnessed with 44.03 ;)
 
PaulS said:
I assume he's talking about 44.03 drops from the non-patched 3DMark to the 340 patch of 3DMark.
Well, the diagram clearly says 330, which IS the patched version - unpatched would be 320 iirc.
 
cthellis42 said:
There was a point made regarding nVidia Unified Compiler on another forum that I figured I'd toss up here and let people investigate:

Applications have to 'allow' the compiler to work or else they can bypass it and force their own coding routes, effectively killing ALL optimizations, including legitimate ones.

http://www.nvidia.com/object/IO_9292.html - That will explain it.

From the linked document:

- Transparent benefit from NVIDIA enhancements and special features. Even if the application is written to an API in a manner to maximize portability among platforms, the NVIDIA compiler will take advantage of NVIDIA hardware that goes beyond the API. The three previous examples illustrate this.
- Forward and backward compatibility. Today's applications ensure that tomorrow's drivers will optimize the same code to best fit tomorrow's GPUs.

Those points alone tell us that the compiler cannot be bypassed. If it works on applications written before the introduction of the compiler, it obviously does not need to be "allowed" to do its job -- it simply will. Another thing: for the compiler to be disabled, there would have to be an interface exposing such functionality to the application. However, the DirectX spec has not changed and NVidia has not released a new OpenGL extension.

In any case, NVidia's Unified Compiler appears to be fully compliant with FutureMark's guidelines regarding optimizations. It does not change the mathematical result of a shader and can be applied automatically to any and all applications without any application-specific detection. As such, there is no rational reason to try to work against it.
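
To make that concrete, here is a minimal sketch (my own illustration, not 3DMark code) of how a Direct3D 9 application submits a pixel shader. The application only ever hands assembled bytecode to the API; there is no parameter or call in that path through which it could tell the driver to skip its own internal recompilation of the bytecode.

Code:
// Hypothetical illustration only - not taken from 3DMark03's source.
// A D3D9 application assembles shader code and hands the bytecode to the
// runtime, which forwards it to the driver. Nothing in this path lets the
// application switch a driver-internal shader compiler on or off.
#include <windows.h>
#include <d3d9.h>
#include <d3dx9.h>
#include <cstring>

IDirect3DPixelShader9* CreatePixelShaderFromAsm(IDirect3DDevice9* device,
                                                const char* asmSource)
{
    ID3DXBuffer* byteCode = NULL;
    ID3DXBuffer* errors   = NULL;

    // Assemble ps_2_0 assembly text into DirectX shader bytecode.
    if (FAILED(D3DXAssembleShader(asmSource, (UINT)strlen(asmSource),
                                  NULL, NULL, 0, &byteCode, &errors)))
    {
        if (errors) errors->Release();
        return NULL;
    }

    // The bytecode is all the driver ever sees from the application; how the
    // driver compiles or reschedules it internally is entirely up to the driver.
    IDirect3DPixelShader9* shader = NULL;
    device->CreatePixelShader((const DWORD*)byteCode->GetBufferPointer(), &shader);

    byteCode->Release();
    return shader;
}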
 
FUDie said:
Isn't NVIDIA still compressing textures in GT1? If so, how could FutureMark miss it?

-FUDie

I am not sure about this one. All I know about GT1 is that nvidia initially was not very happy about the single-textured sky. But there was the matter of the vertex shader detection in GT1 that FutureMark was unable to clear up with the 330 patch. I am really not familiar with the compressed-textures matter you are addressing. Has the 340 patch fixed it? I honestly don't know whether they were able to detect the cheat, but if they had, there would be a decrease in frame rate from the 43.51 DETs to the 52.16 Forceware. Instead, though, we see an increase. I am far from qualified to answer the question, but someone did note earlier that there was some sort of anomaly in GT1 with respect to the nose of the aircraft. That is, at best, an unqualified observation on my part. It is true, though, that the 43.51s, the 44.03s and all drivers after them improved GT1 performance over previous drivers.
 
Haven't seen the exact source of this, but evidently there is a statement from FM on Gainward's comments:

http://www.dvhardware.net/article2116.html

The accusation is totally wrong because what it suggests is not even technically feasible. 3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable the GPU compiler.

The only change in build 340 is the order of some instructions in the shaders or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

I assume this came from Patric. Patric?
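
To illustrate what "mathematically equivalent, just reordered" means in practice (a made-up example, not the actual 3DMark shaders): the two routines below compute exactly the same result but differ in instruction order and in which temporaries they use. A compiler that actually analyses the code treats them identically; a cheat that recognizes a specific shader by fingerprinting its code no longer matches.

Code:
// Hypothetical illustration - not the real 3DMark03 shader code.
#include <cstdio>
#include <cstring>

// "Old" shader: scale, then offset, using temporaries r0/r1.
float shaderOld(float t, float c0, float c1)
{
    float r0 = t * c0;
    float r1 = r0 + c1;
    return r1;
}

// "Build 340" style change: the same math, but the instruction order and
// the temporaries ("registers") are different.
float shaderNew(float t, float c0, float c1)
{
    float r2 = c1;       // constant fetched first this time
    float r3 = t * c0;
    return r3 + r2;
}

// An application-specific optimization has to recognize the shader it wants
// to replace, e.g. by comparing the submitted code against a stored copy.
// Reordering or renaming registers defeats that match; a generic optimizer
// that analyses the code does not care.
bool detectKnownShader(const char* submitted, const char* storedFingerprint)
{
    return strcmp(submitted, storedFingerprint) == 0;
}

int main()
{
    printf("old: %f, new: %f\n",
           shaderOld(2.0f, 3.0f, 1.0f), shaderNew(2.0f, 3.0f, 1.0f)); // identical
    printf("fingerprint match: %d\n",
           detectKnownShader("mov r2,c1; mul r3,t0,c0; add r3,r3,r2",
                             "mul r0,t0,c0; add r1,r0,c1"));          // 0: missed
    return 0;
}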
 
Sabastian said:
FUDie said:
Isn't NVIDIA still compressing textures in GT1? If so, how could FutureMark miss it?
I am not sure about this one. All I know about GT1 is that nvidia initially was not very happy about the single-textured sky. But there was the matter of the vertex shader detection in GT1 that FutureMark was unable to clear up with the 330 patch. I am really not familiar with the compressed-textures matter you are addressing. Has the 340 patch fixed it? I honestly don't know whether they were able to detect the cheat, but if they had, there would be a decrease in frame rate from the 43.51 DETs to the 52.16 Forceware. Instead, though, we see an increase.
I don't mean that FutureMark could have disabled the cheat (maybe it's not possible) but it should be obvious that textures are being compressed by just comparing results to the refrast or other vendor's results. I believe it was Xbitlabs that originally showed that the NVIDIA drivers were using texture compression on GT1.
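
For what it's worth, the comparison I mean is nothing exotic - something along these lines (a rough sketch with a made-up helper, not a FutureMark tool): grab the same frame from the card and from the reference rasterizer and look at the per-pixel error. Forced DXT-style texture compression shows up as a clearly non-zero error concentrated in the textured areas.

Code:
// Rough sketch, not an actual FutureMark or review tool: compare two
// screenshots of the same frame, given as raw RGB8 buffers of equal size
// (one captured from the card, one from the DirectX reference rasterizer).
#include <cstdint>
#include <cstdlib>
#include <vector>

double meanAbsError(const std::vector<uint8_t>& cardFrame,
                    const std::vector<uint8_t>& refFrame)
{
    if (cardFrame.empty() || cardFrame.size() != refFrame.size())
        return -1.0;                                  // frames must match

    uint64_t total = 0;
    for (size_t i = 0; i < cardFrame.size(); ++i)
        total += (uint64_t)std::abs((int)cardFrame[i] - (int)refFrame[i]);

    // Near 0 for an honest renderer (small rounding differences only);
    // noticeably higher where textures were silently recompressed.
    return (double)total / (double)cardFrame.size();
}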

-FUDie
 
FUDie said:
I don't mean that FutureMark could have disabled the cheat (maybe it's not possible) but it should be obvious that textures are being compressed by just comparing results to the refrast or other vendor's results. I believe it was Xbitlabs that originally showed that the NVIDIA drivers were using texture compression on GT1.

-FUDie

Could you point me to where you read that at Xbitlabs?
 
DaveBaumann said:
Haven't seen the exact source of this, but evidently there is a statement from FM on Gainward's comments:

http://www.dvhardware.net/article2116.html

The accusation is totally wrong because what it suggests is not even technically feasible. 3DMark03 does not talk to the graphics driver; it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable the GPU compiler.

The only change in build 340 is the order of some instructions in the shaders or the registers they use. This means that the new shaders are mathematically equivalent to the previous shaders. A GPU compiler should process the old and the new shader code with basically the same performance. Of course, if there are application-specific optimizations in the driver that depend on identifying a shader or parts of it, then you might see performance differences, because these optimizations will not work if the driver is not able to detect the shader.

I assume this came from Patric. Patric?

It was Tero Sarkkinen according to The Tech Report.

http://www.tech-report.com/onearticle.x/5874
 
Sabastian said:
FUDie said:
I don't mean that FutureMark could have disabled the cheat (maybe it's not possible) but it should be obvious that textures are being compressed by just comparing results to the refrast or other vendor's results. I believe it was Xbitlabs that originally showed that the NVIDIA drivers were using texture compression on GT1.
Could you point me to where you read that at Xbitlabs?
My mistake, it was from Digit-life.

-FUDie
 
PaulS said:
I assume he's talking about 44.03 drops from the non-patched 3DMark to the 340 patch of 3DMark - not from the first patch to this patch. 52.17 doesn't drop ANOTHER 50% over and above the drops already witnessed with 44.03 ;)
Nope. Both cards were first tested with build 330 across the various drivers and then tested again with the new patch. The drops you see are simply between the 330 and 340 patches.
 
I'd find it rather funny, but I fear that a lot of people will actually believe this:

http://www.xbitlabs.com/news/video/display/20031112181114.html

An official from NVIDIA Corporation confirmed Mr. Tismer’s accusation that “patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance.” “Yes, that is actually the case with the new patch 340 that Futuremark posted,” said an NVIDIA spokesperson on Wednesday.

“Few weeks ago we released our 52.16 driver that includes our brand new unified compiler technology. With the new patch the benchmark, our unified compiler gets not used by the app so it goes to CPU and we are definitely slower,” Luciano Alibrandi, NVIDIA’s European Product PR Manager, added.
 
madshi said:
I'd find it rather funny, but I fear that a lot of people will actually believe this:

http://www.xbitlabs.com/news/video/display/20031112181114.html

An official from NVIDIA Corporation confirmed Mr. Tismer’s accusation that “patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance.” “Yes, that is actually the case with the new patch 340 that Futuremark posted,” said an NVIDIA spokesperson on Wednesday.

“Few weeks ago we released our 52.16 driver that includes our brand new unified compiler technology. With the new patch the benchmark, our unified compiler gets not used by the app so it goes to CPU and we are definitely slower,” Luciano Alibrandi, NVIDIA’s European Product PR Manager, added.
Riiight. They expect us to believe that the CPU is doing the rendering. :rolleyes:

-FUDie
 
FUDie said:
madshi said:
I'd find it rather funny, but I fear that a lot of people will actually believe this:

http://www.xbitlabs.com/news/video/display/20031112181114.html

An official from NVIDIA Corporation confirmed Mr. Tismer’s accusation that “patch 340 disables the GPU compiler. The compiler has to run on the CPU instead resulting in code harder to digest and taking away 20% of the performance.” “Yes, that is actually the case with the new patch 340 that Futuremark posted,” said an NVIDIA spokesperson on Wednesday.

“Few weeks ago we released our 52.16 driver that includes our brand new unified compiler technology. With the new patch the benchmark, our unified compiler gets not used by the app so it goes to CPU and we are definitely slower,” Luciano Alibrandi, NVIDIA’s European Product PR Manager, added.
Riiight. They expect us to believe that the CPU is doing the rendering. :rolleyes:

-FUDie
Or, rather, that the "unified compiler" was supposed to run on the GPU but now is forced to run on the CPU. ;D

FUD or SNAFU?

93,
-Sascha.rb
 
FUDie said:
Sabastian said:
FUDie said:
I don't mean that FutureMark could have disabled the cheat (maybe it's not possible) but it should be obvious that textures are being compressed by just comparing results to the refrast or other vendor's results. I believe it was Xbitlabs that originally showed that the NVIDIA drivers were using texture compression on GT1.
Could you point me to where you read that at Xbitlabs?
My mistake, it was from Digit-life.

-FUDie

This would be worth checking, because if it can't be patched, the driver shouldn't be approved. I also remember hearing tell of "compressed textures" back around the time after build 330.
 
Nv PR monkey said:
“Few weeks ago we released our 52.16 driver that includes our brand new unified compiler technology. With the new patch the benchmark, our unified compiler gets not used by the app so it goes to CPU and we are definitely slower,” Luciano Alibrandi, NVIDIA’s European Product PR Manager, added.

OK, so, according to his logic, CPU activity would go up measurably while running the game tests that lost the most fps - and that could easily be monitored.
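
Something along these lines would do it (a hedged sketch of my own, Windows-only, not an official tool): sample overall CPU utilisation once a second while 3DMark runs. If shader work really fell back to the CPU with build 340, GT2 and GT3 should show a clear rise compared to build 330.

Code:
// Hypothetical monitoring sketch: print total CPU utilisation once per
// second while the benchmark runs in another window.
#include <windows.h>
#include <cstdio>

static ULONGLONG ToU64(const FILETIME& ft)
{
    ULARGE_INTEGER v;
    v.LowPart  = ft.dwLowDateTime;
    v.HighPart = ft.dwHighDateTime;
    return v.QuadPart;
}

int main()
{
    FILETIME idle0, kern0, user0;
    GetSystemTimes(&idle0, &kern0, &user0);

    for (;;)
    {
        Sleep(1000);

        FILETIME idle1, kern1, user1;
        GetSystemTimes(&idle1, &kern1, &user1);

        // Kernel time includes idle time, so total = kernel + user.
        ULONGLONG idleD  = ToU64(idle1) - ToU64(idle0);
        ULONGLONG totalD = (ToU64(kern1) + ToU64(user1))
                         - (ToU64(kern0) + ToU64(user0));
        double cpu = totalD ? 100.0 * (double)(totalD - idleD) / (double)totalD
                            : 0.0;
        printf("CPU load: %5.1f%%\n", cpu);

        idle0 = idle1; kern0 = kern1; user0 = user1;
    }
}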
 
Who are they kidding? Seriously, they are an FM beta member, so chances are they had ample access to the candidate builds of the patch. If this were a genuine concern, such as "the application blocking the unified compiler" (which is a no-go from a technical standpoint anyway), I'm pretty sure they would have brought it up with FM and worked out a solution together.

Which is more likely:
1) FM is, once again, out to "get Nvidia"
2) The new 340 patch breaks the Nvidia compiler, although the compiler works at a much lower level and does not need to have applications aware of its existence
3) Nvidia was caught once again with their hand in the cookie jar (and are trying to wiggle their way out with "but it's breaking our genuine technology" and "if the benchmark tries to emulate a game environment, then it's ok for us to optimize for it")
 
Katsa said:
GPU compiler == using idle CPU time during 3dmark to run some vertex shaders on the CPU?
I wonder. Previous incarnations of 3DMark03 (i.e. 320 and 330 patches) didn't scale all that much with CPU on my GFFX 5800, no matter whether I ran benchmarks on a 2.4GHz P4 or a 1GHz P3. 'twas in the 1-2% region, overall.

93,
-Sascha.rb
 
More bullshit coming from NVidia:

http://www.theinquirer.net/?article=12657

we advocate that when reviewers are using 3DMark as a game proxy, they must run with the unified compiler fully enabled. All games run this way. That means running with the previous version of 3DMark, or running with a version of our drivers that behave properly.
All games run this way? Most games were written well before the Det 5x.xx series, so the game developers didn't even have a chance to write their games in a way that pleases NVidia's "Unified Compiler technology". If they *nevertheless* *all* run this way - then how can 3DMark behave any differently, when all Futuremark did was reorder the shader instructions? That's nothing but pure bullshit, dear Derek Perez. Do you honestly think people will fall for that? Well, some probably will...
 