On ATI's "optimization"

galperi1

Newcomer
I'm not gonna pretend I know it all, but I will throw this out to all the experienced coders and developers here as a theory for GT4's score drop.

From my understanding, ATI's architecture can only run PS 2.0 shaders at FP24. Is it possible that GT4 is calling for FP32 shaders, and that since ATI's hardware is incapable of running them, it is detecting this and running its own FP24 ones (which are DX9 compliant and produce nearly the same picture)?

I wouldn't consider it cheating at all if this is the case.
 
galperi1 said:
Is it possible that GT4 is calling for FP32 shaders, and that since ATI's hardware is incapable of running them, it is detecting this and running its own FP24 ones (which are DX9 compliant and produce nearly the same picture)?

No, because there is no way for DX9 to explicitly call for "FP32" shaders. In layman's and highly generalized terms (someone correct me if I'm wrong), DX9 only has "floating point" shaders and "non-floating point" shaders. (You can also query the hardware to see if it supports "partial precision", a la nVidia's FP16.)

Floating point shaders in DX9 means (again, generalized) "at least FP24 precision."
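
To put some rough numbers on what those precision labels mean, here's a quick back-of-the-envelope sketch (plain Python, nothing to do with D3D itself) that emulates rounding a value to the mantissa widths usually quoted for FP16, FP24 and FP32 and shows the resulting error:

```python
# Rough illustration only (plain Python, not D3D): emulate rounding a value
# to the mantissa widths usually quoted for these formats and compare errors.
# FP16 ~ 10 explicit mantissa bits, R300-style FP24 ~ 16, IEEE FP32 ~ 23.
# Exponent range and denormals are ignored for simplicity.
import math

def round_to_mantissa(x, bits):
    """Round x to 'bits' explicit mantissa bits (crude emulation)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)            # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** (bits + 1)       # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0                        # a value with no exact representation
for name, bits in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
    y = round_to_mantissa(x, bits)
    print(f"{name}: {y!r}   abs error = {abs(y - x):.3e}")
```

The step from FP24 down to FP16 costs far more precision than the step from FP32 down to FP24, which is why FP24 output generally looks the same as FP32.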

I wouldn't consider it cheating at all if this is the case.

If you could call for FP32 shaders, and ATI did drop to FP24, I would call it cheating. ;)
 
That's the thing though, Joe. ATI DOESN'T drop to FP24. They can ONLY run FP24 because of their architectural design. As long as they are following the DX9 spec, should it be considered cheating?

I think that FM designed the PS 2.0 shaders with FP32 in mind, because it would not be fair to Nvidia in that they cannot run FP24.
 
galperi1 said:
That's the thing though, Joe. ATI DOESN'T drop to FP24. They can ONLY run FP24 because of their architectural design. As long as they are following the DX9 spec, should it be considered cheating?

Yes, of course. What a silly question, of course following the spec is cheating.
 
galperi1 said:
That's the thing though, Joe. ATI DOESN'T drop to FP24. They can ONLY run FP24 because of their architectural design. As long as they are following the DX9 spec, should it be considered cheating?

I think that FM designed the PS 2.0 shaders with FP32 in mind, because it would not be fair to Nvidia in that they cannot run FP24.

No, FM designed it to use DX9, which specifies "at least FP24".

And of course they're not cheating as long as they follow specifications. Seems pretty obvious to me at least.
 
galperi1 said:
That's the thing though, Joe. ATI DOESN'T drop to FP24.

Not to beat a dead horse, but ATI can't drop to FP24 because 3DMark can't specify or code for FP32.

I think that FM designed the PS 2.0 shaders with FP32 in mind....

Why stop there? I think FM designed the PS 2.0 shaders with FP64 in mind... that makes EVERYONE a bunch of cheaters... :oops:

This is just beyond silly...
 
I've contacted ATI and they made a preliminary statement to me, basically saying that my thoughts in the other thread were the case; however, I'm trying to get an official statement out of them at the moment. There was even mention of potentially providing 'before' and 'after' shader code to prove that they mathematically do the same thing, though I'm not sure they will actually do that.

FYI (although not explicitly related to this thread), I've contacted NVIDIA for their position, but all my contacts are in the far corners of Europe and the Eastern Bloc finishing up on all the launches, so they were not able to offer me any comments yet. I doubt they will be able to say anything that doesn't come from 'on high' anyway.
 
DaveBaumann said:
I've contacted ATI and they made a preliminary statement to me, basically saying that my thoughts in the other thread were the case; however, I'm trying to get an official statement out of them at the moment. There was even mention of potentially providing 'before' and 'after' shader code to prove that they mathematically do the same thing, though I'm not sure they will actually do that.

I still disagree with the replacing of any shader code, and FM has every right to prevent such things from occurring. However, if it produces the same mathematical result in all cases (meaning, no matter what the input is, whether it's the benchmark data or not, the output is exactly the same), then I can at least see a case for it being a legitimate optimization.
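
To be concrete about what I mean by "the same mathematical result": a rewrite like x*2 → x+x is exact for every input, while something like re-associating a multiply chain is not, even though it's equal in real-number math. A toy check in Python (just an illustration, obviously not ATI's actual shader change):

```python
# Toy illustration only - assumed example, not ATI's actual shader change.
# "Same mathematical result in all cases" means bit-identical output for any
# input, not just for the inputs the benchmark happens to feed the shader.
import random
import struct

def bits(x):
    """Exact bit pattern of a double, so outputs are compared bit-for-bit."""
    return struct.unpack("<Q", struct.pack("<d", x))[0]

random.seed(0)
samples = [random.uniform(-1000.0, 1000.0) for _ in range(100_000)]

# An exact rewrite: x * 2 and x + x always round to the same value.
exact = all(bits(x * 2.0) == bits(x + x) for x in samples)

# An inexact rewrite: re-associating a multiply chain is equal in real-number
# math, but the intermediate rounding differs for some inputs.
a, b = 0.1, 0.2
reassoc = all(bits((a * b) * x) == bits(a * (b * x)) for x in samples)

print("x*2 == x+x for every sample:        ", exact)    # True
print("(a*b)*x == a*(b*x) for every sample:", reassoc)  # almost certainly False
```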

Again, I'd still rather not see it done, and the practice should be eliminated in synthetic benchmarks. If Futuremark's policy is that it is not to be done with their benchmark (and it is their policy), then it shouldn't be done.

FYI (although not explicitly related to this thread), I've contacted NVIDIA for their position, but all my contacts are in the far corners of Europe and the Eastern Bloc finishing up on all the launches, so they were not able to offer me any comments yet. I doubt they will be able to say anything that doesn't come from 'on high' anyway.

Good luck getting info from them. ;)
 
Joe DeFuria said:
I still disagree with the replacing of any shader code, and FM has every right to prevent such things from occurring. However, if it produces the same mathematical result in all cases (meaning, no matter what the input is, whether it's the benchmark data or not, the output is exactly the same), then I can at least see a case for it being a legitimate optimization.
I agree, but at least ATI said that Dave was right, rather than just saying "it's probably a driver bug". ;)
 
...maybe Futuremark needs to start stamping ATI's and nVidia's drivers with something like WHQL - "Futuremark's Stamp of Approval!" :)
 
dksuiko said:
...maybe Futuremark needs to start stamping ATI's and nVidia's drivers with something like WHQL - "Futuremark's Stamp of Approval!" :)
That'd only remove the possibility of cheats in 3DMark. There are other benchmarks and games that could be cheated in, unless you're suggesting that Futuremark test those games too. Unfortunately, as the audit said, it is difficult to find driver cheats in games for which one does not have the source code.
 
DaveBaumann said:
I've contacted ATI and they made a preliminary statement to me, basically saying that my thoughts in the other thread were the case; however, I'm trying to get an official statement out of them at the moment. There was even mention of potentially providing 'before' and 'after' shader code to prove that they mathematically do the same thing, though I'm not sure they will actually do that.

If that's the case, then it can be questioned whether it's cheating or an optimisation. That it's hardcoded for that one shader leaves a bad taste in the mouth, though. I'd hope they would implement a general driver path that can do that optimisation at a more general level, so that it would be applied whenever it's applicable.
 
Humus said:
If that's the case, then it can be questioned whether it's cheating or an optimisation. That it's hardcoded for that one shader leaves a bad taste in the mouth, though. I'd hope they would implement a general driver path that can do that optimisation at a more general level, so that it would be applied whenever it's applicable.
I would agree, but I think it'd be better for ATI to actively advise game developers on good, high-performance shader code rather than forcibly changing the code with their drivers.
 
The only way something like that could not be cheating is if their driver had a generic shader optimizer that parsed shaders and "optimized on the fly" any non-optimal constructs.


Simply detecting a shader and replacing it with your own hardcoded optimized version is benchmark cheating. Why? Because the technique used will not translate into increased performance in any game, unless you think ATI is going to produce detection routines for every new DX9 game that comes out and the thousands of shaders that will exist.
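
To spell out the difference, here's a toy sketch in Python (hypothetical, obviously nothing like real driver code): a generic optimizer applies rules to whatever shader it's handed, while app detection just matches one known shader and swaps in a canned replacement that helps nothing else.

```python
# Hypothetical sketch - not either vendor's actual driver code. A generic
# optimizer applies pattern rules to whatever shader it is handed; app
# detection matches one known shader and swaps in a canned replacement,
# which does nothing for any other game.
import hashlib

def peephole(instructions):
    """Generic pass: drop redundant 'mov rX, rX' from ANY shader it sees."""
    out = []
    for op, *args in instructions:
        if op == "mov" and len(args) == 2 and args[0] == args[1]:
            continue                     # a no-op move; safe to remove anywhere
        out.append((op, *args))
    return out

# App detection: only fires on shaders the driver was hand-taught to recognize.
CANNED_REPLACEMENTS = {
    "hash-of-one-specific-benchmark-shader (placeholder)": "ps_2_0 ... hand-tuned version ...",
}

def detect_and_replace(shader_text):
    key = hashlib.md5(shader_text.encode()).hexdigest()
    return CANNED_REPLACEMENTS.get(key, shader_text)   # unknown shaders get no help

shader = [("mov", "r0", "r0"), ("mul", "r0", "r0", "c0")]
print(peephole(shader))                                # works on any shader
print(detect_and_replace("some new game's shader"))    # passes through unchanged
```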


If ATI thought the shader was non-optimal, and since they are a member of FM's beta program, why didn't they have FM patch 3DM to use the more optimal pixel shader in the first place? Perhaps ATI's optimizations might boost the scores of other cards as well.


I wish both NVidia and ATI would stop this specific detection stuff. I don't care if the optimization doesn't produce any visual artifacts. I only care about optimizations that are generic, dynamic, and will work on future games that haven't even been written yet. I do not want to rely on NVidia and ATI to produce bulky, buggy drivers that have all this wasted detection logic code designed specifically for each new game that comes out.
 
DemoCoder said:
The only way something like that could not be cheating is if their driver had a generic shader optimizer that parsed shaders and "optimized on the fly" any non-optimal constructs.

Agreed. (Tried to say the same thing to Russ in another thread.)
 
Humus said:
If that's the case, then it can be questioned whether it's cheating or an optimisation. That it's hardcoded for that one shader leaves a bad taste in the mouth, though. I'd hope they would implement a general driver path that can do that optimisation at a more general level, so that it would be applied whenever it's applicable.

Agreed. And if ATI knew what was going on with nVidia, I would call it extremely stupid of the company to go for those silly 8 percent higher scores in GT4 when they could have shown 'clean hands'. They already have the superior chipsets for FP performance, so why screw themselves like this?

While I'm very disappointed with nVidia, I have an ATI product and am more concerned about the quality of their drivers.
 
DemoCoder said:
I wish both NVidia and ATI would stop this specific detection stuff. I don't care if the optimization doesn't produce any visual artifacts. I only care about optimizations that are generic, dynamic, and will work on future games that haven't even been written yet. I do not want to rely on NVidia and ATI to produce bulky, buggy drivers that have all this wasted detection logic code designed specifically for each new game that comes out.

However, suppose there's a game developer out there who for some reason only cares how his game performs on Nvidia cards, and writes all his shader code to run best on CineFX hardware. ATI contacts the developer with an offer to help write alternate shader code that could be used as an option for players with ATI cards. The developer, for whatever reason, refuses. If the game is popular enough, would you still oppose ATI "detecting" these shaders and replacing them with more optimal code?

See, in games it's a much grayer area, but for synthetic benchmarks I definitely oppose any "hand-tweaking" like what both IHVs are doing.
 
Optimizing for a benchmark is cheating. However, it doesn't bother me so much if they 'cheat' to get higher Doom 3 scores, even if it involves specifically writing a few clunky, ad hoc lines of code, so long as quality doesn't suffer, of course.

Sometimes writing the general case can be

a) considerably longer and bulkier
b) impossible
c) slower

So it sounds good, but it's probably not realistic in practice.
 
I am all for application detection and optimization if:

1. It does not impact the intent of the application.

2. It does not affect the IQ in a negative way.

3. It increases the experience of the gamer with the best game-specific support available (again, without reducing quality or otherwise skimping out).
 