Detonator 50 benches here!

xGL

Here you go:

[Image: Deto_5157VS4543_3DMark3.gif]

[Image: Deto_5157VS4543_Gun2.gif]


Specs: Athlon 2500+ @ 10x200, 2x256 Corsair PC3200LL, A7N8X Deluxe (nForce2), GeForce FX 5900 Ultra.
 
Wow! The Det50s are 2.5X faster than the Det 4x's with 2X AA and pixel shaders!

:oops:

(Translation: when will people learn to properly construct a graph?)
 
Wow, quality online journalism there. Nice second graph that starts its scale at 24 and ends up showing a massive difference between 26 and 29 FPS.

And you know what, I'm thinking it may have been intentional. Have a close look at the X axis in the second graph. Note the two squares that are inverting the colours of the graph. Looks to me like the axis was selected because someone decided to modify its scale.
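
For anyone who wants to see how cheap that trick is, here's a minimal sketch (matplotlib, purely illustrative; the 26 and 29 FPS values are the ones visible in the second graph above, the labels and layout are made up):

```python
# Hypothetical sketch of how a truncated axis exaggerates a small FPS delta.
# The 26 vs 29 FPS figures come from the graph above; everything else is invented.
import matplotlib.pyplot as plt

drivers = ["Old Dets", "New Dets"]
fps = [26, 29]

fig, (honest, skewed) = plt.subplots(1, 2, figsize=(8, 3))

# Honest presentation: baseline at 0, the ~11% gain looks like what it is.
honest.bar(drivers, fps)
honest.set_ylim(0, 35)
honest.set_ylabel("FPS")
honest.set_title("Scale from 0")

# "Creative" presentation: baseline at 24, the same 3 FPS now fills the chart
# and the new driver's bar is roughly 2.5x the height of the old one.
skewed.bar(drivers, fps)
skewed.set_ylim(24, 30)
skewed.set_title("Scale from 24")

plt.tight_layout()
plt.show()
```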
 
The community is good :)

I was just getting ready to rip the unqualified person who constructed these graphs a new a-hole, but I have obviously been beaten to the punch.

I might also add that the next time DX9 is going to be tested, how about using several different benchmarks. Furthermore, I'm _hoping_ that the Shadermark author will also do a little bit of that "rearrange the shaders" action between the time those drivers surface and the time the online press puts out their respective articles.
 
If that's all nvidia have to offer from their amazing new drivers, I don't think that ATI have anything to worry about in HL2... and if HL2 speeds up vastly more, I think we can guess that they're up to no good again, replacing shaders or whatever.
 
Nvidia needs to put more cheats in. And maybe change the scale of the graphs a little more so it looks like a few more frames is actually quadrupling the score. :rolleyes:

For a company that supposedly has a good rep for its drivers, Nvidia sure have a lot of accidental "bugs" in their drivers. Luckily, they only seem to increase the scores on benchmarks... :rolleyes:
 
xGL said:
AMDMB has come up with a Detonator 51.75 driver comparison as well using UT2003, 3D Mark and X2 Bench
http://www.athlonmb.com/article-display.php?ArticleID=257&PageID=1

Wow! PS2 goes up by nearly 10 frames per sec! And a lot of the other scores go up by 2-6 fps! That kind of massive improvement will make all HL2-style games instantly playable on NV3x! All bow down to the greatness that is Nvidia! You came through again! I want to have your babies, Nvidia!

Alternatively: Big fat hairy *YAWN*.
 
People, PEOPLE! I think you're missing the big news here!

The new dets outperform the old dets by almost a full inch and a half!!!
 
digitalwanderer said:
People, PEOPLE! I think you're missing the big news here!

The new dets outperform the old dets by almost a full inch and a half!!!

No doubt there will be an Nvidia PR announcement advising that we set our screen sizes to 320 x 480 in order to see the full optimisation increase to nearly four inches. The new marketing by-line will be:

"320 x 480 - The Way It's Meant To Be Displayed"
 
I won't claim to know exactly how they achieved such an improvement, but my guess is that it has to do with Pixel Shader 1.1 and 1.3 (they change the code whenever possible so the card runs 1.1 or 1.3 instead of 2.0: they call it an "optimisation" since the end result looks almost the same, but you're not running 2.0, so yes, it is STILL a cheat).

nVidia did say, after all:
Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

Yay, let's go back to DX8 specs instead of DX9!
If DX8 shaders are so much better, why should we buy their super DX9 cards in the first place?

Also, I've heard of nVidia dropping all of its precision down to 12-bit in these new drivers. Only time will tell, but you can be sure to see sites like Beyond3D or ExtremeTech telling us more about this.
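
If you want a feel for what that precision drop actually means numerically, here's a tiny numpy sketch (this is obviously not nVidia's driver code, just the raw difference between 32-bit and 16-bit floats):

```python
# Rough numeric illustration of FP32 vs FP16 precision, using numpy.
# This is NOT driver code; it just shows how much resolution is lost when a
# shader quantity is silently computed in 16-bit instead of 32-bit float.
import numpy as np

print(np.finfo(np.float32).eps)   # ~1.19e-07 : FP32 keeps roughly 7 decimal digits
print(np.finfo(np.float16).eps)   # ~9.77e-04 : FP16 keeps roughly 3 decimal digits

value = 1.0 / 3.0                 # stand-in for some intermediate shader result
print(np.float32(value))          # 0.33333334
print(np.float16(value))          # 0.3333 (stored as 0.333251953125)

# A 12-bit fixed-point format, as rumoured above, would be coarser still:
# 12 bits only give 4096 distinct levels over the representable range.
```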
 
nelg said:
xGL said:
AMDMB has come up with a Detonator 51.75 driver comparison as well using UT2003, 3D Mark and X2 Bench
http://www.athlonmb.com/article-display.php?ArticleID=257&PageID=1

Q. I thought Game Test 4 was shader limited. If so, why would vertex and pixel shading performance be up and GT4 results down?
Best case scenario: shader performance has -genuinely- been improved, eliminating the need for the shader re-writing that was implemented for GT4 before.
 
I'm gonna speculate my way out on a limb again...
The results from X2: The Threat probably provide the best indication of what to expect from these drivers in HL2. I say this because X2 runs DX8, which is what FX cards will be doing in HL2.

What's the performance gain? About 5% :?
If HL2 performance while running a DX8 path is boosted by much more than that, there's probably something fishy going on...


edit: of course, this says nothing about DX9 performance. But nVidia has already talked about dropping precision and going back to DX8.1 "with no image quality loss." :rolleyes:
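
Just to make that sanity check concrete, here's a throwaway sketch (the X2 before/after numbers are placeholders, not the actual AMDMB results):

```python
# Toy version of the argument above: if the generic DX8-path gain from these
# drivers is ~5%, an HL2 DX8-path gain far beyond that would look suspicious.
# The FPS values below are hypothetical placeholders.
old_fps, new_fps = 60.0, 63.0
x2_gain = (new_fps - old_fps) / old_fps        # 0.05, i.e. about 5%

def looks_fishy(hl2_gain, baseline_gain=x2_gain, slack=2.0):
    """Flag an HL2 'improvement' well beyond the generic DX8 gain."""
    return hl2_gain > baseline_gain * slack

print(looks_fishy(0.06))   # False: roughly in line with the X2 gain
print(looks_fishy(0.30))   # True: 30% out of nowhere smells like shader replacement
```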
 
digitalwanderer said:
People, PEOPLE! I think you're missing the big news here!

The new dets outperform the old dets by almost a full inch and a half!!!

<laughs> Damn, that caught me while I was drinking. You bastard! :p
 
Typedef Enum said:
Furthermore, I'm _hoping_ that the Shadermark author will also do a little bit of that "rearrange the shaders" action between the time those drivers surface and the time the online press puts out their respective articles.
He's working on ShaderMark 2.0. See www.shadermark.com. Here's an extract:
The ANTI-DETECT-MODE provides an easy way for non-HLSL programmers to test if special "optimisations" are in the drivers.
Sounds nice, doesn't it?
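
For those wondering what an anti-detect mode buys you, the idea is roughly this (a purely hypothetical Python sketch; real detection presumably works on compiled shader bytecode, but the principle is the same):

```python
# Hypothetical sketch of why rearranging shaders defeats app-specific cheats:
# if a driver recognises a benchmark shader by fingerprint (modelled here as a
# hash of the source text) and swaps in a hand-tuned replacement, then any
# semantically harmless rewrite -- reordering instructions, renaming temps --
# breaks the match and the card has to run the shader as written.
import hashlib

# Fingerprint of the shader the hypothetical driver is looking for.
KNOWN_SHADER_HASHES = {
    hashlib.md5(b"mul r0, t0, c0\nadd r1, r0, c1\n").hexdigest(),
}

def driver_would_replace(shader_source: str) -> bool:
    return hashlib.md5(shader_source.encode()).hexdigest() in KNOWN_SHADER_HASHES

original   = "mul r0, t0, c0\nadd r1, r0, c1\n"
rearranged = "mul r7, t0, c0\nadd r1, r7, c1\n"   # same maths, different temp register

print(driver_would_replace(original))     # True  -> hand-tuned replacement kicks in
print(driver_would_replace(rearranged))   # False -> you measure real shader performance
```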
 
Jokes aside, I think these benchmarks are rather useless. Gunmetal, while DX9, doesn't have any PS 2.0 shaders (I think only PS 1.1, and they're probably minimal). And we all know that 3DMark2003 shader performance has been severely skewed by hand-tuned shader replacement, and in no way represents real shader performance anymore.

Eventually, someone will do benchmarks on lesser-known shaders, say something from Humus or other independent coders. Even ShaderMark has app-specific shader replacement, thanks to HardOCP.

The problem is that anyone doing real PS 2.0 development probably works on a 9800 :)
 