Shadermark

Tridam said:
Could you do the same tests with a GeForce FX ? :)

Sure. You're French, right?
Then the shipping costs for you to send me one shouldn't be too high since I'm Belgian ;)

More seriously though, I'm buying an NV35 once I can find a vanilla one at a reasonable price here in Europe. Would you happen to know any place offering that?

Madashi: 6x? Nah. What we need is unordered, GC 8x or 9x MSAA. Now THAT would be tasty! And trust me, the NV40 does have what it takes to make that run fine. Very fine indeed... It's just about nVidia not being too lazy to include it in the core.


Uttar
 
Uttar said:
6x? Nah. What we need is unordered, GC 8x or 9x MSAA. Now THAT would be tasty! And trust me, the NV40 does have what it takes to make that run fine.
Well, if it looks better than R300's 6x AA then give it to me, baby!
 
well this sucks

what is nvidia thinking here?

this is creating distrust. now when numbers are presented for these GPUs, people will question their true performance, generally thinking they're actually slower than advertised

that's basically what nvidia has created by cheating in all these benchmarks
 
Dunno how reliable this really is folks.

I really doubt that nVidia would spend time on optimizing something as relatively unknown/unused as Shadermark, and the Fixed Function - Gouraud shader scores seem way too low anyway. I always thought they ran those in FX12, since that was the only area where the FX5800 would initially beat the R300.

Maybe Thomas Bruckschlegel triggered a performance bug in the already twitchy FX drivers?
 
You can take the original Shadermark and fiddle with the instruction order (where it won't produce a difference in output) to create the same effect yourself. This may be all Thomas has done here.

However, has anyone actually shown any visual differences?
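To make the idea concrete, here's a toy sketch (not Shadermark's actual code, and the shader listings are invented): swapping two instructions that don't depend on each other can't change what gets rendered, but it does change the shader text/bytecode, so a driver that recognises the benchmark's shaders by hashing them no longer finds a match.

```cpp
// Minimal illustration of why reordering independent shader instructions
// defeats hash-based shader detection. The two ps_2_0-style listings are
// hypothetical but semantically equivalent: the mul and the add don't
// depend on each other, so swapping them can't change the output, yet any
// driver that keys a lookup on the shader text/bytecode no longer matches.
#include <functional>
#include <iostream>
#include <string>

int main() {
    const std::string original =
        "texld r0, t0, s0\n"
        "mul r1, r0, c0\n"   // scale the texel
        "add r2, r0, c1\n"   // bias the texel (independent of the mul)
        "mad r0, r1, r2, c2\n"
        "mov oC0, r0\n";

    const std::string shuffled =
        "texld r0, t0, s0\n"
        "add r2, r0, c1\n"   // same two instructions, order swapped
        "mul r1, r0, c0\n"
        "mad r0, r1, r2, c2\n"
        "mov oC0, r0\n";

    std::hash<std::string> h;
    std::cout << "original hash: " << h(original) << "\n"
              << "shuffled hash: " << h(shuffled) << "\n";
    // The hashes (almost certainly) differ, so a replacement table keyed on
    // the original shader misses and the card runs the shader as written.
    return 0;
}
```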
 
Brent said:
well this sucks

what is nvidia thinking here?

this is creating distrust. now when numbers are presented for these GPUs, people will question their true performance, generally thinking they're actually slower than advertised

that's basically what nvidia has created by cheating in all these benchmarks

Now you're starting to see the reasoning for getting upset with the 3DMark cheats. It doesn't matter if it's a synthetic benchmark or you think it's worthless or you think it doesn't represent games or that Futuremark is no longer calling it a cheat. Doing the kind of cheating/optimizing they did in that one application is enough to distrust any performance metrics in all applications, even games. NVIDIA should be ashamed. And anybody that based their buying decision on those inflated results should really consider a class-action lawsuit for fraud.

Tommy McClain
 
AzBat said:
Brent said:
well this sucks

what is nvidia thinking here?

this is creating distrust. now when numbers are presented for these GPUs, people will question their true performance, generally thinking they're actually slower than advertised

that's basically what nvidia has created by cheating in all these benchmarks

Now you're starting to see the reasoning for getting upset with the 3DMark cheats. It doesn't matter if it's a synthetic benchmark or you think it's worthless or you think it doesn't represent games or that Futuremark is no longer calling it a cheat. Doing the kind of cheating/optimizing they did in that one application is enough to distrust any performance metrics in all applications, even games. NVIDIA should be ashamed. And anybody that based their buying decision on those inflated results should really consider a class-action lawsuit for fraud.

Tommy McClain

Exactly. If we can't test them to see how they perform, then how are we supposed to know how they perform? You said it exactly right: it doesn't matter whether you think a benchmark is crap or not. Look at the shit reviewers are going to have to do to make sure IHVs are not cheating and to find out what the actual performance is for consumers. What a great legacy... thanks nvidia!
 
I really doubt that nVidia would spend time on optimizing something as relatively unknown/unused as Shadermark, and the Fixed Function - Gouraud shader scores seem way too low anyway. I always thought they ran those in FX12, since that was the only area where the FX5800 would initially beat the R300.

Maybe Thomas Bruckschlegel triggered a performance bug in the already twitchy FX drivers?

Yes, but then it wouldn't run that fast anyway; those speeds without the shuffling are simply insane considering what the NV3x is theoretically supposed to be capable of.

As for your first point - that nVidia wouldn't spend time optimizing something relatively unknown/unused...
That has actually crossed my mind too. The number of places where nVidia cheats is just too big for it all to be done by a human workforce, even more so considering how fast they go from "benchmark starts being used on review sites" to "benchmark is optimized".

So either they've got workers in another dimension, or they developed a way to do the cheating automatically - I mean, something as crazy as running an application once with special instrumented drivers that automatically output "optimization" information usable by the real drivers.

For shaders, that isn't too hard. With a LOT of performance metrics to draw on, and no need to determine the optimizations in real time at all, you could do an awful lot better than the default compiler, which has to run very fast.
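Something along these lines would be enough - purely hypothetical sketch, obviously not anything from nVidia's actual drivers, and every name in it is invented:

```cpp
// Hypothetical illustration of the offline "optimization" pipeline described
// above: an instrumented driver build profiles an application once, dumps a
// table keyed on a hash of every shader it saw, and the shipping driver
// consults that table whenever the application creates a shader.
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>

// Table produced offline; keys are hashes of the application's shaders,
// values are hand- or machine-tuned (e.g. lower-precision or reordered)
// replacements. The entries here are invented placeholders.
static const std::unordered_map<std::size_t, std::string> kReplacements = {
    { std::hash<std::string>{}("benchmark_shader_as_submitted"),
      "tuned_replacement_shader" },
};

// What the (imaginary) driver would do at shader-creation time.
std::string SelectShader(const std::string& appShader) {
    const std::size_t key = std::hash<std::string>{}(appShader);
    if (auto it = kReplacements.find(key); it != kReplacements.end())
        return it->second;   // recognised benchmark shader: swap it out
    return appShader;        // anything else gets compiled honestly
}

int main() {
    std::cout << SelectShader("benchmark_shader_as_submitted") << "\n"; // swapped
    std::cout << SelectShader("some_game_shader") << "\n";              // unchanged
}
```

And that's also exactly why shuffling the instruction order in Shadermark breaks it: the hash no longer matches, so the lookup misses.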

Of course, I'm getting into my conspiracy theories here. But it's either that, or nVidia's driver team is on proactive drugs. Eh...


Uttar
 
Of course, they could have automatic systems, and then add a little human work here and there to make them even better.

Maybe they're doing all the work manually. But if that were the case, then I'd have to say they've got the industry's most dedicated, and most untrustworthy, employees :devilish:


Uttar
 
Uttar said:
I really doubt that nVidia would spend time on optimizing something as relatively unknown/unused as Shadermark, and the Fixed Function - Gouraud shader scores seem way too low anyway. I always thought they ran those in FX12, since that was the only area where the FX5800 would initially beat the R300.

Maybe Thomas Bruckschlegel triggered a performance bug in the already twitchy FX drivers?

Yes, but then it wouldn't run that fast anyway; those speeds without the shuffling are simply insane considering what the NV3x is theoretically supposed to be capable of.

As for your first point - that nVidia wouldn't spend time optimizing something relatively unknown/unused...
That has actually crossed my mind too. The number of places where nVidia cheats is just too big for it all to be done by a human workforce, even more so considering how fast they go from "benchmark starts being used on review sites" to "benchmark is optimized".

So either they've got workers in another dimension, or they developed a way to do the cheating automatically - I mean, something as crazy as running an application once with special instrumented drivers that automatically output "optimization" information usable by the real drivers.

For shaders, that isn't too hard. With a LOT of performance metrics to draw on, and no need to determine the optimizations in real time at all, you could do an awful lot better than the default compiler, which has to run very fast.

Of course, I'm getting into my conspiracy theories here. But it's either that, or nVidia's driver team is on proactive drugs. Eh...


Uttar

I'm pretty sure I saw a quote somewhere where an Nvidia rep was asked why the NV30 performed so badly in Shadermark, and the guy responded that he had never heard of it. Then the next driver set tripled the performance.
Anyone know where that interview was?
 
Look at the shit reviewers are going to have to do to make sure IHVs are not cheating and to find out what the actual performance is for consumers. What a great legacy... thanks nvidia!

In a way, I actually think something good can come out of this (but I'm not going to thank Nvidia for it :)), at least for us consumers, since most reviewers will now be more paranoid about the results, which they should be. And hopefully more of them will do what Rev said he was going to do: run the benchmarks, but also play the game and check the FPS while playing to see whether the two match. Maybe 40 fps is playable on one card and not on the other :) And you don't need to do this for all resolutions; just pick 2-3 settings randomly and change them for the next review. If they don't seem to match, do some investigation.
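Something as simple as this would do - a rough illustrative sketch, not any reviewer's actual tooling, with made-up settings and numbers:

```cpp
// Sketch of the spot-check idea above: pick a few random setting combos per
// review, then compare the timedemo score against the FPS observed while
// actually playing; a large gap is a flag for further investigation.
#include <iostream>
#include <random>
#include <vector>

struct Setting { int width, height, aa, af; };

int main() {
    const std::vector<Setting> pool = {
        {1024, 768, 0, 0}, {1024, 768, 4, 8},
        {1280, 1024, 2, 4}, {1600, 1200, 4, 8},
    };

    // Pick 3 combos at random (different ones each review).
    std::mt19937 rng(std::random_device{}());
    std::uniform_int_distribution<std::size_t> pick(0, pool.size() - 1);

    for (int i = 0; i < 3; ++i) {
        const Setting& s = pool[pick(rng)];
        std::cout << "Test " << s.width << "x" << s.height
                  << " " << s.aa << "xAA/" << s.af << "xAF\n";
        // Placeholder numbers the reviewer would fill in by hand:
        double timedemoFps = 60.0;   // from the built-in timedemo
        double gameplayFps = 41.0;   // read off an FPS counter while playing
        if (timedemoFps > gameplayFps * 1.25)
            std::cout << "  -> timedemo much faster than real play, investigate\n";
    }
    return 0;
}
```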
 
I have a hard time getting any multisampling at all with >42.xx Dets in several applications on the NV25, so be careful with those AA scores (i.e. check whether there's any MSAA at work at all beforehand, in case you get extremely high scores with the latest drivers).
 
Uttar said:
Hmm, well, there doesn't seem to be any cheating in SS:SE "The Grand Cathedral" timedemo on a GF4 Ti4200.

Uttar

I wouldn't be surprised; nVidia didn't have a killer competitor product to worry about from the GF3 to the GF4.
 
Randell said:
Uttar said:
Hmm, well, there doesn't seem to be any cheating in SS:SE "The Grand Cathedral" timedemo on a GF4 Ti4200.

Uttar

I wouldn't be surprised; nVidia didn't have a killer competitor product to worry about from the GF3 to the GF4.

Yes, but wouldn't they want to cheat now to look more competitive against the Radeon 9500/9600?
I see two possible explanations:
1) They aren't cheating in SS:SE
2) They're only cheating with the GFFX series, because they want it to look better against the GF4 than it really should.


Uttar
 
Uttar said:
Randell said:
Uttar said:
Hmm, well, there doesn't seem to be any cheating in SS:SE "The Grand Cathedral" timedemo on a GF4 Ti4200.

Uttar

I wouldn't be surprised; nVidia didn't have a killer competitor product to worry about from the GF3 to the GF4.

Yes, but wouldn't they want to cheat now to look more competitive against the Radeon 9500/9600?
I see two possible explanations:
1) They aren't cheating in SS:SE
2) They're only cheating with the GFFX series, because they want it to look better against the GF4 than it really should.


Uttar

There really isn't much reason to cheat on the GF3/GF4 chips at all. It would diminish the perceived advantage of the FX cards, but just as importantly it would also open a big window for people like you with GF4 cards to figure out they are cheating. I mean, if you weren't the smart individual you are, you might have assumed that because the GF4 isn't cheating, nvidia's drivers aren't cheating in general. And even if you don't buy that and think they still might be cheating with the FX, then you'd need to go buy an FX to figure it out.

Nite_Hawk
 
I agree, and it's curious that the older GeForce4 driver releases perform slightly better than the current ones.
 
gkar1 said:
I agree, and it's curious that the older GeForce4 driver releases perform slightly better than the current ones.

Not really. I bet the best-performing TNT2 drivers are older than, say, the 2x.xx series. As newer cards get optimised for, older cards can suffer a bit.
 
indio said:
I'm pretty sure I saw a quote somewhere where an Nvidia rep was asked why the NV30 performed so badly in Shadermark, and the guy responded that he had never heard of it. Then the next driver set tripled the performance.
Anyone know where that interview was?
Do you mean the quote by Ante P in this thread http://www.beyond3d.com/forum/viewtopic.php?t=6019&postdays=0&postorder=asc&start=0 ?
However, these new Shadermark numbers just don't sound right IMHO. In that thread above, the GFFX 5800 Ultra (with a driver not optimized for Shadermark) got 60 fps on average. There's no reason it should be 2.5 times slower than that now on the 5900 Ultra, unless the GFFX does "DX8 Mip Filter Reflection" (whatever that is) in software (btw, could this change in the benchmark also be responsible for the slight score drop on the R9700 Pro?).
 