GF4 has inflated 3DMark scores, or so says the INQ.....

So is the score too high with the screens or too low without?
That nvnews thread has someone with over an 800-point difference between the two scores. More than 'a few percent' to me :-?
 
When you don't show the screens, the score drops. Again, not trying to turn this into a bashing thread. It will be interesting to see if other review sites that put ATI's Quake/Quack thing under such a microscope will do the same...or will they just shrug it off like HardOCP did?
 
If it turns out to be driver optimizations (or cheating as some call it), big whoop. We know ATI and nVidia custom-code their drivers, so I really hope we don't have to make a fuss every time someone finds one of these things.
 
I disagree about the big 'whoop'; we need to see about 200 major sites put it on their front page...a couple of articles done at FiringSquad and hundreds of MadOnion shots blown up 400% :-?

Let's not forget the hundreds of threads on forums :)
 
LittlePenny said:
If it turns out to be driver optimizations (or cheating as some call it), big whoop. We know ATI and nVidia custom-code their drivers, so I really hope we don't have to make a fuss every time someone finds one of these things.
I agree, but it does seem kinda unfair to have this attitude now, without printing some sort of retraction/apology to ATi about the whole quake/quack thing.
 
Haven't really looked into this issue (been too busy working on this particular article and not sounding like a fool. :p) - but you turn the intro screens off, your score goes lower in 3DMark 2001 SE, and people are calling that cheating? I thought the goal in the benchmark was to get higher scores?
 
Well, the "potential" issue, Matt, is that by default, the tests are run with the intro screens ON (the higher scores.) So the question is...which scores are "correct". The default scores (which are higher), or the scores when the screens are turned off (which are lower.)

Regardless of which ones are correct, I'm at a loss as to how it could make such a significant difference. It would be interesting to learn the cause of it in any case.
 
Althornin said:
I agree, but it does seem kinda unfair to have this attitude now, without printing some sort of retraction/apology to ATi about the whole quake/quack thing.

True, but I also think HardOCP owed ATi an apology long before this ever happened. Everyone knew the references to Quake were in their drivers before the 8500 came out. What Bennett did was journalistic trash.
 
Someone has gone around mailing a number of sites with an Excel sheet displaying the differences with the 'splash screen' on and off, with the conclusion that nVIDIA is using the splash screen to turn on its 3DMark 'driver optimisations'; we received the mail last night but I didn't actually look into it.
 
Pretty hard to doubt the article when the forum links prove it :-?

I think the concern here is why disabling the title screens would affect scores in the first place, and only on NVIDIA-based cards.
What is being done, or not being done, in the drivers? It's not a 3DMark issue when title screens are disabled...
 
Crusher,

Yes, the very same Inquirer. Despite popular belief, they are not always wrong. ;) (They couldn't always be wrong, because most of their stories contradict other stories they publish.) Anyway, from the MadOnion NVIDIA FAQ:

This question applies to: NVIDIA GeForce3, NVIDIA GeForce3 Ti 200, NVIDIA GeForce3 Ti 500, NVIDIA GeForce4 Ti 4200, NVIDIA GeForce4 Ti 4400, NVIDIA GeForce4 Ti 4600
Q: When I disable the loading screens between the benchmark tests, my score drops when run with an NVIDIA accelerator. Why?
A: This most likely is an NVIDIA driver issue. We are investigating this.

LittlePenny,
True, but I also think HardOCP owed ATi an apology long before this ever happened.

And that's the problem. Double standards. (Not you; HardOCP, for one.) Lots of people accused HardOCP of being nVidia-biased based on their "Quack expose." I believe HardOCP defended themselves by saying stuff like "we are only after the truth"...so what would you expect HardOCP's appropriate response to be to this issue, which clearly requires further "investigation" to learn the truth?
Here's what they said:

It is a hot topic via email to us today that there is something afoot with the NVIDIA GeForce4. Seems as though if you disable splash screens between benchmarks in MadOnion's 3DMark 2001 SE, your score will decrease by 500 to 800 points. Some seem to think that this is a sure sign of NVIDIA cheating while others see a 3DMark 2001 bug. I say let's all meet up tonight, get drunk, and argue about it.

Hell, if you are going to beat up on NVIDIA, why not ask them about their Ti GPUs having a green banding issue that we have seen across multiple GPUs and card builders, including our reference design. Another thread. And another.

We specifically questioned NVIDIA on this over a month ago, to never get a reply.

Where's the hard-nosed investigation? An e-mail that was ignored...and no "red alert" to the community about potential shenanigans? Why not rename 3DMark, or hack in a different splash screen, look at individual tests, compare FPS numbers with the reported 3DMark score....
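
The rename test alone would take five minutes. Here's a minimal sketch in C of what I mean - the install paths are made up, it's just to show the idea: if the driver keys on the executable's file name (quake3.exe -> quack.exe style), a copy under a neutral name should score differently:

Code:
/* Hypothetical rename test, same idea as quake3.exe -> quack.exe:
 * run the benchmark under a neutral file name so any driver check
 * keyed on the executable's name no longer matches, then compare
 * scores. The paths below are placeholders, not real locations. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* cmd.exe's 'copy' keeps the original intact, so both versions
     * can be run back to back under identical settings. */
    int rc = system("copy \"C:\\3DMark2001SE\\3DMark2001SE.exe\""
                    " \"C:\\3DMark2001SE\\3DQuack.exe\"");
    if (rc != 0) {
        fprintf(stderr, "copy failed with code %d\n", rc);
        return 1;
    }
    puts("Now run 3DQuack.exe with the same settings and compare scores.");
    return 0;
}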
 
Please don't turn this into anti-nVidia or pro-ATI stuff. Just interesting, no?

This is utterly unacceptable, imo. ATI people sit here and get thronged by people like Chalnoth (sp?), and many others. Yet suddenly, because it's nVidia, we are all supposed to say, "hmm, that's interesting". Somehow Quack's four reduced-LOD textures that resulted in 5 fps, and lasted for exactly ONE two-week driver release, is equal to a few months of 400%-inflated 3DMark scores?

I don't think so, but this is the only post on the subject I will make.
 
My take on it is that the default benchmark would be with title screens on, as that's enabled when you start the program up. I really wonder what the option is there for, though; why did MadOnion put it in there in the first place? Is there any purpose to turning off the intro screens?

This is a pretty interesting issue, but I fail to see the similarities between Quack and this event, as we've yet to see any code that resembles If.3DMark=true then run>optimizations or anything of that sort. If what the Inquirer said is true, though, then the situation could be the same (the whole "bug in the drivers" routine). I'd ask NVIDIA myself, but my contacts are unavailable for the day, it seems.
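
Just to make that pseudocode concrete, the sort of check people are imagining would look roughly like this. Purely illustrative and NOT actual NVIDIA driver code - GetModuleFileNameA is a real Win32 call, but enable_benchmark_path() is just a made-up stand-in for whatever special-case path a driver might flip on:

Code:
/* Illustrative only: the sort of executable-name check people suspect,
 * not anything taken from a real driver. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

static void enable_benchmark_path(void)
{
    puts("benchmark-specific code path enabled");  /* placeholder */
}

int main(void)
{
    char exe[MAX_PATH] = "";

    /* Full path of the running executable, e.g. C:\...\3DMark2001SE.exe */
    GetModuleFileNameA(NULL, exe, sizeof(exe));

    /* Case-insensitive substring match on the name; renaming the .exe
     * (quake3 -> quack style) would defeat exactly this kind of test. */
    if (strstr(_strlwr(exe), "3dmark") != NULL)
        enable_benchmark_path();

    return 0;
}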
 
There are some similarities:

Both are very popular benchmarking tools
Both showed a difference in FPS scores when run in a different manner (3DMark: disable splash screens; Q3: rename the exe)

Agreed that the case is not the same thing. With all of the data gathered, I believed that the Q3/ATI thing was a bug. At this point I believe the same is true of nVidia: not a driver optimization but a bug. Also, people like HardOCP blew that whole ATI thing right out of the water when they failed to realize that the Q3 reference had existed in their drivers way before the launch of the 8500. They over-analyzed and provided results based on a subset of the facts, not all of them. Maybe they learned their lesson and will wait until they get all the facts? Maybe they are indeed biased? Who knows, but I will bet you that we probably won't see as many articles, if any, investigating this vs. the 5 or so we saw with the Q3/ATI stuff...
 
LittlePenny said:
Althornin said:
I agree, but it does seem kinda unfair to have this attitude now, without printing some sort of retraction/apology to ATi about the whole quake/quack thing.

True, but I also think HardOCP owed ATi an apology long before this ever happened. Everyone knew the references to Quake were in their drivers before the 8500 came out. What Bennett did was journalistic trash.

Hmmm, ATI remembers the 8500 launch well...and I think and hope the review cards go like this...

1) www.beyond3d.com
2) www.anandtech.com
3) www.tomshardware.com
4) www.firingsquad.com
110) www.hardocp.com ETA Summer 2003 :LOL:
 