Shadermark

You're welcome, AJ, and thank you for the reply and for saying you'll look into it. 8)


Neeyik, I can relate. Summer is here now, and my kids will be out of school soon, which should alleviate my boredom. :LOL: Have to say you are an asset on FM too. :D

See you @ FM,
 
I was just getting ready to post this. Amazing results.

I mean, you cannot overstate the significance of these scores.

The Radeon's numbers also go down, but nowhere near as badly. In fact, if you exclude the first 2 tests, the others don't drop too much at all. But the nVidia drop? Holy crap!

They freaking went from nearly 600 FPS to 35 FPS! That's roughly a 94% drop, and across the board the old scores are several times the new ones!?

In fact, their numbers were so bad that not a single one exceeded 35 FPS.

Did anybody catch that other review that was posted within the last 2 days? I think it was a 5900 Ultra review, and the person who did the review used a different Quake3 timedemo. Guess who came out on top?
 
OMG! It's really sad to see this kind of performance. It looks like nothing has really changed much from the 5800 after all!
 
OK....Try this chart on for size :)

shadermark_5800.gif
 
Ouch.

But still lots of people (I'm not one of them) say: "Who cares about synthetic benchmarks!?". So I'm wondering: is anybody trying to find out whether game timedemos are cheated, too? I haven't heard of anyone putting time into such tests yet, which I find quite strange. I'd do it myself. However, I refuse to buy a GeForceFX card just to test it for cheating! :?
 
We're already seeing indications. FiringSquad just released their MSI 5900 Ultra review, and they used different timedemos for Serious Sam2 and Quake3. Guess which card came out on top with all the bells and whistles enabled? I'll give you a hint...Their HQ is north of the border :)

Your post actually made me think...At this point in time, doesn't the NV35 almost become a product whereby you want to take one home just to 'crack' the thing? It's almost like a hacker competition where you're trying to solve a puzzle...Ya know what I mean? To the informed person, you just know the thing doesn't come close to the potential that nVidia is hyping, so you just want to take the thing for a spin and reveal all the cheats/hacks.
 
I think the IQ problem with the mandelbrot demo has to do with driver settings. I've noticed that with the "maximum performance" setting, lower precision is used on the shaders and the artifacts become apparent, while with the highest quality setting you don't get the artifacts.
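
Just to illustrate what I mean (this is only a little toy sketch of my own in Python, not anything pulled from the drivers or from ShaderMark itself): a Mandelbrot shader iterates z = z*z + c over and over, so any rounding from reduced precision compounds with every iteration. Run the same point at FP16 and FP32 and the escape counts can come out different, which on screen is exactly the sort of artifacting you get when the driver quietly lowers shader precision.

```python
import numpy as np

def escape_iteration(cr, ci, max_iter, dtype):
    """Iterate z = z*z + c with everything stored at the given float precision;
    return the iteration at which |z|^2 exceeds 4, or max_iter if it never escapes."""
    zr, zi = dtype(0.0), dtype(0.0)
    cr, ci = dtype(cr), dtype(ci)
    two, four = dtype(2.0), dtype(4.0)
    for i in range(max_iter):
        zr, zi = zr * zr - zi * zi + cr, two * zr * zi + ci
        if zr * zr + zi * zi > four:
            return i
    return max_iter

# A point near the boundary of the set, picked arbitrarily for illustration.
# FP16 rounds the coordinates and every intermediate product so coarsely that
# the escape count can differ from the FP32 result - spread over a whole image,
# that is the banding/blockiness you see when precision is silently reduced.
point = (-0.745, 0.113)
print("float16:", escape_iteration(*point, 100, np.float16))
print("float32:", escape_iteration(*point, 100, np.float32))
```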
 
Typedef Enum said:
We're already seeing indications. FiringSquad just released their MSI 5900 Ultra review, and they used different timedemos for Serious Sam2 and Quake3. Guess which card came out on top with all the bells and whistles enabled?
This review is really a good start. But its intention is "only" to compare the cards fairly. They didn't try to find out whether NVidia is cheating.
Typedef Enum said:
Your post actually made me think...At this point in time, doesn't the NV35 almost become a product whereby you want to take one home just to 'crack' the thing? It's almost like a hacker competition where you're trying to solve a puzzle...Ya know what I mean?
Yeah, I know! :D But I simply don't feel like giving NVidia money right now. That would almost feel like rewarding them for cheating. I wish Dave (or Reverend or ...) would spend some time analyzing possible cheating in games. Well, perhaps they already do. Dave?
 
Yeah, I was slightly kidding. I wouldn't give them $400-500 just so I could sit there for a couple of hours/days trying to prove something I already know :)
 
And it's calling the hacker in me too ;) Actually, I *am* buying one.
Oh, and I also wanna say: "See, I even bought a 5900 from you - so now stop saying I'm biased against you, because otherwise, I wouldn't have bought one" ;) j/k
Another reason is that I refuse to have waited six months past my usual PC purchase to get a new GPU, only for it all to be for nothing. Oh, and really, since I upgrade quite often, I doubt the abysmal PS 2.0 performance is gonna hurt me much anyway.

Although I'd guess I'm still slightly biased for nVidia... And even more now that I've got some *new* NV40 info. I won't leak it just yet. Very reliable one, too. Be patient, my boy! Be patient! :devilish:

I'm gonna stop talking now and analyze some more game cheating on my Ti4200 ( not like I got a lot of games for it - Serious Sam, mostly, but eh... )


Uttar
 
Nice, Uttar, let us know what you find out, thanks!

Also, you'd better post what you've heard about the NV40, otherwise I will ... :devilish: hmmm ... I will ...








be sad... :cry:
 
Everybody has his preferences more or less, but I wouldn't call myself biased. While I've had a great time with the oc'ed Ti4400 for over a year now and was really looking forward to anything NV3x, I feel more disappointed with NV than ever before. Even if I ignore the "optimisations" completely, and the fact that the competition's products have dropped a lot in price in the meantime, being fed the same old tired AA algorithms for 3 rounds in a row is beyond my personal tolerance.

That said, if they still have OGMS in the NV40, they deserve to be shot, heh.

Oh and I believe I just ordered a R300 *hides under the desk* :oops:
 
Well, I've heard of better AA/AF IQ in the NV40. Nothing particularly reliable, but IMO, it makes a lot of sense considering the NV35 uses GDDR2 with a 256-bit memory bus - limiting yourself to 4X AA is *insane* there! :)

Oh, and don't worry about that NV40 piece of info I got. It isn't a lot, but it's still interesting. I won't release it just yet (maybe next week, we'll see) because I don't want anyone to suspect my source - I think he even more or less asked for that specifically. It would be too, let's say, "coincidental" ;)


Uttar
 
Hmm, well, there doesn't seem to be any cheating in SS:SE "The Grand Cathedral" timedemo on a GF4 Ti4200. The scores are even very slightly higher in the 27.30! ( margin of error I guess, otherwise might be higher GPU usage in the 44.03 )

44.03: The Grand Cathedral standard demo: 32.4
27.30: The Grand Cathedral standard demo: 32.6

FYI, the 27.30 are the original drivers for the GF4 reviews.

And just to make sure, I also did my own custom benchmark in the same level - it isn't recorded in exactly the same area, but the types of monsters are mostly the same, the walls are roughly identical, ... - nothing to justify a huge difference.

44.03: The Grand Cathedral custom demo: 30.8
27.30: The Grand Cathedral custom demo: 31.1

Again, a virtually identical difference can be spotted.
All scores are without AA/AF. I'll test it with 4x AA soon too.
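
By the way, here's the sanity check spelled out as a quick script (my own throwaway sketch - the demo names are just labels and the 5-point threshold is completely arbitrary): compare how much each demo gains going from the old driver to the new one, and only get suspicious if the well-known standard demo pulls ahead of the custom one by a lot.

```python
def speedup(new_fps, old_fps):
    """Relative change going from the old driver to the new one, in percent."""
    return (new_fps - old_fps) / old_fps * 100.0

results = {
    # demo name: (44.03 score, 27.30 score) in FPS, from the runs above
    "standard": (32.4, 32.6),
    "custom":   (30.8, 31.1),
}

gains = {name: speedup(new, old) for name, (new, old) in results.items()}
for name, gain in gains.items():
    print(f"{name}: {gain:+.1f}%")

# If the standard demo gains noticeably more than the custom one, be suspicious:
# a driver that only speeds up the demo everyone benchmarks smells like demo detection.
SUSPICION_THRESHOLD = 5.0  # percentage points, picked arbitrarily
if gains["standard"] - gains["custom"] > SUSPICION_THRESHOLD:
    print("Standard demo gains much more than the custom one -> possible demo detection")
else:
    print("No demo-specific gain -> looks clean (as with these Ti4200 numbers)")
```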


Uttar
 
Uttar said:
Well, I've heard of better AA/AF IQ in the NV40. Nothing particularly reliable, but IMO, it makes a lot of sense considering the NV35 uses GDDR2 with a 256-bit memory bus - limiting yourself to 4X AA is *insane* there! :)
I agree. A fast 6x non-ordered grid AA mode (preferably gamma corrected) is a must in the NV40. Otherwise I won't even consider buying one. After all, it would be embarrassing for NVidia if their next-generation NV40 didn't even match the AA quality of the quite old R300.
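
To spell out why I keep harping on the grid and not just the sample count, here's a tiny toy illustration (the sample positions below are made up by me for the example, not the real NV or R300 patterns): on a near-vertical or near-horizontal edge, the number of shades you can get per pixel depends on how many distinct x (or y) offsets the pattern has. An ordered 2x2 grid wastes its 4 samples on only 2 distinct offsets per axis, while a sparse/rotated pattern gets 4.

```python
# 4x ordered grid: a regular 2x2 pattern inside the pixel
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x sparse/rotated grid: every sample sits at a different x and a different y
sparse_4x = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def distinct_offsets(pattern):
    """Count distinct horizontal and vertical sample offsets in a pattern."""
    xs = {x for x, _ in pattern}
    ys = {y for _, y in pattern}
    return len(xs), len(ys)

for name, pattern in (("ordered 4x", ordered_4x), ("sparse 4x", sparse_4x)):
    nx, ny = distinct_offsets(pattern)
    print(f"{name}: {nx} distinct x offsets, {ny} distinct y offsets "
          f"-> up to {nx + 1} shades on a near-vertical edge, {ny + 1} on a near-horizontal one")
```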
 
Uttar said:
Hmm, well, there doesn't seem to be any cheating in SS:SE "The Grand Cathedral" timedemo on a GF4 Ti4200. The scores are even very slightly higher in the 27.30! ( margin of error I guess, otherwise might be higher GPU usage in the 44.03 )

44.03: The Grand Cathedral standard demo: 32.4
27.30: The Grand Cathedral standard demo: 32.6

FYI, the 27.30 are the original drivers for the GF4 reviews.

And just to make sure, I also did my own custom benchmark in the same level - it isn't recorded in exactly the same area, but the types of monsters are mostly the same, the walls are roughly identical, ... - nothing to justify a huge difference.

44.03: The Grand Cathedral custom demo: 30.8
27.30: The Grand Cathedral custom demo: 31.1

Again, a virtually identical difference can be spotted.
All scores are without AA/AF. I'll test it with 4x AA soon too.


Uttar

Could you do the same tests with a GeForce FX? :)
 