Doomtrooper
Veteran
Um... how about it's the other way around? Like, Futuremark could sue Nvidia for slander. After all, it's pretty clear that Nvidia is outright cheating. Nvidia are the ones that should be looking out after making statements like that.
There's a subtle but very important difference between saying that a widely accepted industry benchmark doesn't reflect how real games are made and flat-out saying that it is biased against the market leader and thus useless. In other words: Futuremark has just come under an attack that could be lethal for them.
I am actually reevaluating this position.
I have to agree. Replacing a shader routine imo is exactly the same thing as inserting clip planes.
RussSchultz said: I agree. But consider: suppose that whatever change you make breaks the optimizer (I'd presume due to poor testing). If you were changing the shader to try to expose 'detection' routines, did you just expose a detection routine? Or did you break the optimizer?
Now, let's assume it was a bug in the optimizer and next week a driver comes out that fixes it. From the outside, you can't tell -- it just looks like they've gone and developed a new detection routine that hasn't been exposed yet. I'm not saying this is what's going on, but it is a possibility.
And that, partially, is what I perceive to be NVIDIA's complaint:
Does that make the benchmark a valid means of comparing real-world performance?
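For what it's worth, here is a hedged sketch of the experiment being described. The two shaders, the ps_1_4-style text, and the register names are all invented for illustration; none of it is taken from 3DMark03 or from any driver.

// Hypothetical pair of pixel shaders that compute exactly the same result
// and differ only in the order of the two texture fetches.
const char* originalShader =
    "ps_1_4           \n"
    "texld r0, t0     \n"    // fetch base texture
    "texld r1, t1     \n"    // fetch light map
    "mul   r0, r0, r1 \n";   // modulate

const char* reorderedShader =
    "ps_1_4           \n"
    "texld r1, t1     \n"    // identical work, fetches swapped
    "texld r0, t0     \n"
    "mul   r0, r0, r1 \n";

If the original runs much faster than the reordered version even though the work is identical, that could mean the driver recognized the exact original shader and substituted a hand-tuned replacement (a detection routine), or it could mean a general optimizer simply handles one ordering well and the other badly. From the outside you only see the frame-rate difference, and if a later driver closes the gap you still can't tell whether a bug was fixed or a new detection pattern was added.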
Inserting clip planes into a racing game, or a flight simulator, or a tank simulator, or something like that doesn't seem like it's necessarily a bad thing.
Hellbinder[CE] said: However, inserting clip planes is another story...
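To make the clip-plane point concrete, here is a minimal, self-contained sketch; the plane, the scene, and the planeDist helper are all made up for illustration and are not from any driver or from 3DMark03.

#include <cstdio>

struct Vec3 { float x, y, z; };

// Signed distance from point p to the plane n.p + d = 0.
float planeDist(const Vec3& n, float d, const Vec3& p) {
    return n.x * p.x + n.y * p.y + n.z * p.z + d;
}

int main() {
    // A clip plane hard-coded for one known camera path: anything on the
    // wrong side is skipped entirely, so no vertex or pixel work is done.
    const Vec3 n = {0.0f, 0.0f, 1.0f};   // plane normal
    const float d = -5.0f;               // cuts the scene at z = 5

    const Vec3 objects[] = {{0, 0, 10}, {2, 1, 3}, {-4, 0, 7}};
    for (const Vec3& obj : objects) {
        if (planeDist(n, d, obj) < 0.0f) {
            std::printf("culled (%g, %g, %g)\n", obj.x, obj.y, obj.z);
            continue;   // work saved -- legitimately, *if* the camera can
                        // genuinely never see this region
        }
        std::printf("drawn  (%g, %g, %g)\n", obj.x, obj.y, obj.z);
    }
    return 0;
}

In a cockpit-locked racing or flight game the constraint is part of the game, so a plane like this can be a fair optimization. In 3DMark03 the camera is on rails, so planes can be tuned to the exact flight path; as soon as the camera leaves that path (for example in a free look mode), culled geometry visibly disappears, which is what made the trick detectable.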
RussSchultz said:
Xspringe said: All we know is that 3dmark scores are clean now.
I think all we know is Futuremark has made an attempt to prevent cheating. We can guess it went well, but we don't KNOW for sure the scores are clean.
Joe DeFuria said: What should 3DMark performance "reflect", in your opinion, to be representative of the "real world"? I contend that HONEST (non-cheating) scores in 3DMark in part DEFINE what "real world" performance is for these parts.
dan2097 said: IMO, for a benchmark like 3DMark03:
Replacing shaders is not OK.
Lowering image quality (i.e. reducing LOD, lowering precision) is not OK.
Doing app-specific optimizations is not OK, e.g. not clearing the screen (or whatever it was) and inserting clip planes.
Basically, the workload should be the same for ALL competing products.
RussSchultz said: To some degree, vendor-specific paths, if necessary.
If everybody knows that on hardware A you should group X, Y, then Z, and on hardware B you should group Y, Z, then X, then maybe you should have two paths, if both parties can benefit?
Or, conversely, if product A suffers when doing X, Y, Z and does much better when doing Y, X, Z -- and product B does OK either way -- wouldn't the real-world performance be best represented by choosing Y, X, Z?
Granted, not all games would do it that way, but NVIDIA (for example) does a pretty good job of evangelizing what best fits their hardware.
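As a rough sketch of that "two paths" idea -- the Gpu enum, the X/Y/Z grouping, and buildShader are hypothetical, not from any real engine or from the benchmark:

#include <string>

// Hypothetical: hardware A prefers the work grouped X, Y, then Z; hardware B
// prefers Y, Z, then X. A benchmark (or game) could ship one path per vendor,
// or pick a single ordering and accept that it suits one side better.
enum class Gpu { VendorA, VendorB };

std::string buildShader(Gpu gpu) {
    if (gpu == Gpu::VendorA)
        return "do_X(); do_Y(); do_Z();";   // the grouping A's hardware likes
    return "do_Y(); do_Z(); do_X();";       // the grouping B's hardware likes
}

The design question is whether the "real world" is better represented by giving each vendor its best-case path, or by one neutral ordering that keeps the workload identical for everyone even if it hurts one product more than the other.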
It would be desirable if the companies informed the developers when they could produce a more efficient version of the shader. But it's not fair to silently slot it in when run on their cards only - developers want their programs to run as well as possible on all supporting hardware. If only one company was doing it, users of other companies' cards would lose out, so the morally good step would be to let everyone enjoy the benefits.
Hellbinder[CE] said: I am actually reevaluating this position.
If it were a game, then it would be more than desirable for the shader work to be tweaked for the hardware. Thus, I wonder if it is really that big a deal if ATI, Nvidia or anyone else tweaks the shaders to the hardware on a per-application basis.
IMO it makes for a truer, real-world benchmark environment.
However, inserting clip planes is another story...
(PS: I am still rethinking this and forming new ideas on this whole subject, so I'm open to everyone's logical input before I personally come to a real conclusion.)
Joe DeFuria said: That doesn't mean FM or the game developer will agree.
But that doesn't make it more representative of what the real world is, now does it?
Bjorn said:
Tim said: No it has not. This quote only addresses the issue of replacing shader code and it does not contradict what I wrote. Replacing code is not rendering what the developer intended, even if it looks exactly the same, and I agree that would be cheating, especially in synthetic benchmarks.
But you said:
There is no image quality degradation with ATI's drivers and they have no problems with the free look mode. Other than the performance improvement, there seems to be no difference, which indicates that these are optimizations, not cheats. (ATI could of course be using some kind of free look mode detection.)
Joe DeFuria said: In your "real world", because the hardware is not popular, and therefore devs probably won't optimize for it much if at all, THAT hardware should get treated differently (should NOT get hand optimizations in synthetic benchmarks), just because it's not popular in the "real world"?
RussSchultz said: I'm of the opinion the game developer will agree -- since he wants his engine to run as fast as possible on as many cards as possible.
Joe DeFuria said: Yes, devs want their code to run as well as possible. But they don't have unlimited resources to do this for all hardware. And nVidia isn't going to care much unless it's a high-profile game.
RussSchultz said: That about sums it up. It's hard to translate that into rules, but I do think that some sort of consideration should be given to market leaders.