Wait for news from Unwinder (nvworld.ru)

Doomtrooper said:
since Futuremark never included PS 1.4 in scoring during the benchmark's 2 years of existence... a really big joke.
The point of 3dmark is to have a consistent test. You can't change the contents of the test without invalidating the scores already taken.

This is why they came up with 3dmark03, instead of simply updating 3dmark01.

Yes, I agree, the changes done by ATI probably make much sense on modern-day cards, however it's not what was being tested.
 
Forgive my lack of intelligence, but let me see if I'm following this correctly.
GT4 in 3DMark 2001SE is calling for PS1.0, but the Catalyst drivers are delivering another? Which shader are they delivering, exactly? 1.1? 1.4? Both?
 
micron said:
Forgive my lack of intelligence, but let me see if I'm following this correctly.
GT4 in 3DMark 2001SE is calling for PS1.0, but the Catalyst drivers are delivering another? Which shader are they delivering, exactly? 1.1? 1.4? Both?
Dunno the exact details, but that seems to be the gist of it.
 
RussSchultz said:
Doomtrooper said:
since Futuremark never included PS 1.4 in scoring during the benchmark's 2 years of existence... a really big joke.
The point of 3dmark is to have a consistent test. You can't change the contents of the test without invalidating the scores already taken.

This is why they came up with 3dmark03, instead of simply updating 3dmark01.

Yes, I agree, the changes done by ATI probably make much sense on modern-day cards, however it's not what was being tested.

I hope the same mentality is used with the FX12/FP16 _pp hint 'patch', as the same mentality can be applied.

In fact it's absolutely the same, except PS 1.4 would not lower IQ.
 
RussSchultz said:
micron said:
Forgive my lack of intelligence, but let me see if I'm following this correctly.
GT4 in 3DMark 2001SE is calling for PS1.0, but the Catalyst drivers are delivering another? Which shader are they delivering, exactly? 1.1? 1.4? Both?
Dunno the exact details, but that seems to be the gist of it.
Thank you for replying, Russ....
 
micron said:
Forgive my lack of intelligence, but let me see if I'm following this correctly.
GT4 in 3DMark 2001SE is calling for PS1.0, but the Catalyst drivers are delivering another? Which shader are they delivering, exactly? 1.1? 1.4? Both?


PS1.0 is used in 2K1 GT4. Changing to 1.1, 1.2, or 1.3 wouldn't be a performance improvement; PS1.4 &/or PS2.0 would be, tho'. *If* they use PS1.4 for the 8500/9100 & PS2.0 or PS1.4 for the 9500+, that *could* explain the results (IMO).

It *could* simply be that the drivers call for the highest PS supported. I find nothing wrong w/that, even in a benchmark, as it gives a true 'real world' optimization that would be applied 'across the board' & wouldn't be 'application specific'.

I don't know if that's what *is* happening, but it seems like it might be. ;)

HTH & it's just my .01 ;)
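As a rough picture of what "call for the highest PS supported" would mean from the application side, here's a minimal sketch (assumed D3D8 usage, not code from 3DMark; GT4 itself submits a fixed ps.1.0 shader regardless of what the caps report):

Code:
#include <d3d8.h>

// Minimal sketch: ask the driver which pixel shader version the hardware
// reports, and pick the highest one the app knows how to use.
DWORD HighestPixelShaderVersion(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);        // query hardware capabilities

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return D3DPS_VERSION(1, 4);      // Radeon 8500/9100 class and up
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return D3DPS_VERSION(1, 1);      // GeForce3 class
    return D3DPS_VERSION(1, 0);
}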
 
You can't "call for the highest level pixel shader". (well, not when 3dmark2001 was written)

Unless you're using a high level shading language (like Cg, or the HLSL in DirectX, or GSLang, etc), you code your shaders using 'assembly'.

If a driver is recognizing a particular shader PS1.0 assembly sequence and replacing it with an optimized PS1.4 assembly sequence, its "cheating".

Even though, in the real world, you'd likely find developers using that optimized PS1.4 sequence, it isn't what the benchmark tool is asking for.

Just like the partial precision hints that some people think that Futuremark should have implemented, it doesn't change that fact that they didn't. Going behind the benchmark's back and doing it there isn't the right answer.
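For illustration, the alleged mechanism might look something like this inside a driver. Everything here is hypothetical (the hash value, the names, the replacement bytecode); no vendor has published such code:

Code:
#include <cstddef>
#include <cstdint>

// FNV-1a hash over the shader's bytecode tokens -- one simple way a driver
// could fingerprint a known benchmark shader.
static uint32_t HashBytecode(const uint32_t* tokens, size_t count)
{
    uint32_t h = 2166136261u;
    for (size_t i = 0; i < count; ++i) {
        h ^= tokens[i];
        h *= 16777619u;
    }
    return h;
}

// Hypothetical hand-tuned replacement; 0xFFFF0104 is the ps.1.4 version token.
static const uint32_t kTunedPs14[] = { 0xFFFF0104 /* ...tuned code... */ };
static const uint32_t kGT4ShaderHash = 0xDEADBEEF;  // made-up fingerprint

// Hypothetical hook at shader-creation time: anything unrecognized passes
// through untouched, so the swap is invisible to the application.
const uint32_t* MaybeSubstitute(const uint32_t* tokens, size_t count)
{
    if (HashBytecode(tokens, count) == kGT4ShaderHash)
        return kTunedPs14;               // the submitted ps.1.0 never runs
    return tokens;
}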
 
RussSchultz said:
You can't "call for the highest level pixel shader". (well, not when 3dmark2001 was written)

Unless you're using a high level shading language (like Cg, or the HLSL in DirectX, or GSLang, etc), you code your shaders using 'assembly'.

If a driver is recognizing a particular shader PS1.0 assembly sequence and replacing it with an optimized PS1.4 assembly sequence, its "cheating".

Even though, in the real world, you'd likely find developers using that optimized PS1.4 sequence, it isn't what the benchmark tool is asking for.

Just like the partial precision hints that some people think that Futuremark should have implemented, it doesn't change that fact that they didn't. Going behind the benchmark's back and doing it there isn't the right answer.

'Call', 'code', 'assemble' ... whatever, you get the gist.

This is 2003. 3DM2K1 is old. It was written for DX8 with PS1.0 (max) for the score. Running anything above a GF3 'Classic' is a *cheat*. ;)

Allowing a driver to 'real world' optimize 2K1 for the newer cards (not application specific, but for everything) isn't wrong > it is SOO right, IMHO. What better way for a consumer to see the difference in the new hardware & its supported APIs? 8) The caveat being: as long as it doesn't lower IQ. 8)


PS: The topic is 3DMark 2001, NOT 3DM03. There is a HUGE difference, the main one being > MO never said optimizing is a *cheat*. ;)
 
Gah, I can't believe this.
It is still a CHEAT.
However, there are shades of grey.
IF indeed all ATI has done is rework some shaders so the output is mathematically identical but it executes faster on ATI hardware, then that is STILL A CHEAT in the context of 3dmark01.

Is it less wrong than a cheat that lowers IQ?
Hell yes.
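To make "mathematically identical but faster" concrete, here is an invented pair of listings (not GT4's actual shaders), written as the source strings a developer would hand to the D3DX assembler. Both compute the same modulate of two texture fetches; only the instruction set differs:

Code:
// Invented example for illustration -- not taken from 3DMark2001.
// What the application submits:
const char* kSubmittedPs10 =
    "ps.1.0\n"
    "tex t0\n"              // fetch base texture
    "tex t1\n"              // fetch second texture
    "mul r0, t0, t1\n";     // modulate the two fetches

// What a driver could swap in (ps.1.4 syntax, retuned for R200-class HW):
const char* kSwappedPs14 =
    "ps.1.4\n"
    "texld r0, t0\n"        // same fetches, ps.1.4 addressing
    "texld r1, t1\n"
    "mul r0, r0, r1\n";     // identical math, different instruction set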
 
Althornin said:
IF indeed all ATI has done is rework some shaders so the output is mathematically identical but it executes faster on ATI hardware, then that is STILL A CHEAT in the context of 3dmark01.
Did they not use a means of app detection to achieve the shader switcharoo? (Or test detection?)
 
micron said:
StealthHawk said:
Does anyone know if nvidia's optimizations in 3dmark2001 yielded a drop in IQ?
Replacing PS 1.0 with 1.4 or 2.0 should not drop the IQ.

Well, no it shouldn't.

But AFAIK there is no indication that is what nvidia has done. For R300 cards, ATI has replaced some PS1.0 shaders with PS2.0 shaders. I haven't read anywhere that nvidia was also converting PS1.0 shaders to PS2.0/PS1.4 shaders.

edit: corrections.
 
StealthHawk said:
micron said:
GT 4 is calling for PS 1.0.....

Ok, whatever. The point still stands.

Where was it proven that nvidia was converting PS1.0 programs into PS1.4 or PS2.0 programs?
Does anyone know exactly which pixel shaders Nvidia is using in GT4? (Other than 1.0?)
 