GF4 has inflated 3DMark scores, so says the INQ.....

Interesting to note: the Fraps 17a counter works with or without the splash screen in all game tests except Game 4. In Game 4, Nature, the Fraps counter is blanked out once the benchmark starts if the splash screen is enabled. I was still able to get an avg FPS out of Fraps using the Frapslog. Whatever the case, the numbers make it look like 3DMark is slowing down my GF3 when I disable the splash screen for Game 4, Nature. On the other hand, I really can't tell much difference in the frame rate visually, so I wonder if something else is going on.

I will try the updated 3DMark2001 SE later.
 
When 3DMark2001 SE is configured to show a splash screen for the Game 4 test, Nature, the Fraps counter is visible until the test begins, then it just disappears (weird) :eek:.

When 3DMark2001 SE is configured not to show the splash screen for the Game 4 test, Nature, the Fraps counter is visible throughout and is consistent with the 3DMark2001 frame rate counter.

In all other 3DMark2001 SE game tests, the Fraps counter is displayed properly with or without the splash screen.

Regardless, it appears that Fraps 17a was still counting frames even though the Fraps counter display was not visible.
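For anyone curious how the average comes out of the log rather than the on-screen counter, here's a minimal sketch in C. The FRAPSLOG.TXT filename and the one-timestamp-per-line format are my assumptions for illustration, not documented Fraps 17a behavior:

```c
/* Minimal sketch: average FPS from a Fraps-style frametime log.
 * Assumes one frame timestamp in milliseconds per line; the real
 * Fraps 17a log format may differ. */
#include <stdio.h>

int main(void)
{
    FILE *f = fopen("FRAPSLOG.TXT", "r");  /* assumed filename */
    if (!f) { perror("fopen"); return 1; }

    double first = 0.0, last = 0.0, t;
    long frames = 0;
    while (fscanf(f, "%lf", &t) == 1) {
        if (frames == 0) first = t;
        last = t;
        frames++;
    }
    fclose(f);

    /* N timestamps span N-1 frame intervals; timestamps are in ms. */
    if (frames > 1 && last > first)
        printf("avg FPS: %.1f\n", (frames - 1) * 1000.0 / (last - first));
    else
        printf("not enough samples\n");
    return 0;
}
```

The point is just that the average is computed from the logged timestamps, so it doesn't matter whether the on-screen counter is visible.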

I hope that clears it up for you :) .
 
I think it's time for some 'Mission Impossible' music...
 
LOL :LOL:

Hopefully MadOnion will be able to figure this out, since it is their benchmark that gives two different numbers when it shouldn't. I wonder what else is wrong with their benchmark???
 
Doomtrooper said:
So it's not a GeForce 4 specific issue..... :-?

No. It's an Nvidia driver-specific issue. Anyone using Nvidia drivers is affected. I've seen scores from GF3, GF3-Ti, and GF4 all affected the same way. Scores on the GF4-4600 seem to be affected by nearly 10%.

This is not a MadOnion issue, unless every other video card driver is working around this issue. The ATI-8500 is not affected at all.
 
For reference:

I score 8900-8950 with the new Catalyst drivers on my Gigabyte 8500LE at 285/275, with an AMD XP 2000 and 512MB DDR RAM.

This is with and without splash screens, and also with the D3D Quality setting at both low and high, so I think ATI disables low quality or enables a "default" regardless of settings when running 3DMark2001 SE.

Regardless, the 8500 is unaffected by this.
 
noko said:
LOL :LOL:

Hopefully MadOnion will be able to figure this out, since it is their benchmark that gives two different numbers when it shouldn't. I wonder what else is wrong with their benchmark???

Hmmm, hard to say it's a 3DMark issue, as it only affects Nvidia-based cards (all other cards don't exhibit this behavior). So if it is a 3DMark issue, what is it specifically looking for when running the Nature test that differentiates a Radeon 8500 from a GeForce 4, or any other non-Nvidia-based card for that matter?
Or why would the MadOnion 'engine' react differently to a specific brand of card? :oops:
 
HMMMMMMM

Nvidia went quack... someone should make sure they aren't "cheating" in all the other benchmarks. Who knows, they could have been doing this since the Riva 128 days.
It could explain how they get 30% increases on 6-month-old hardware. Oh where, oh where is HardOCP with an in-depth investigation on the matter... maybe they can do a Walters and get the Nvidia CEO to cry and confess to everything. BTW, I don't love ATI, I just hate Nvidia.
 
You hate NVidia for what reason? I mean, hate is a pretty strong word. Did they kill your father or something? Did your dad work at 3dfx?
 
Doomtrooper,

Are you absolutely, and I mean absolutely, sure? I'd suggest that more than one card and driver set should be tested. I just saw a user who checked it out on his V5 and had a roughly 13% drop: 1698 with splash screens on vs. 1482 with them off, i.e. (1698 - 1482) / 1698 ≈ 12.7%.

Ironically, the issue with 3DMark2001 was detected by a user named Egon Olsen back on 24 November 2001 at SlackerCentral, yet no one really paid attention to it. Suddenly someone spreads countless emails across websites about the "newfound" bug and everyone jumps on it.

In the meantime, everyone ignores that 3DMark is a synthetic benchmark as a whole and should get the interpretation it deserves and nothing beyond that. Otherwise, show me one game - parallel to the issue at hand - where you can toggle splash screens between maps and gain or lose performance.

Just because numerous people have misinterpreted the true purpose of 3DMark, it doesn't mean that their interpretation is correct or that it reflects absolute reality. I also find it strange when, more than once, 3DMark scores on a specific driver set decline yet gaming performance in D3D increases, or vice versa.

I know the usual reply..... 3DMark yadda yadda..... industry standard, etc. I take it for what it really is, and the last thing I base my purchase decision on is a 3DMark score.
 
I can never understand how someone can hate a company. A company comprises many individuals who, in NVIDIA's case, have gone to college, earned a degree, landed a good job, and work long hours every day. Usually they have a family - a wife, even kids. They pay the bills, put food on the table, and try to get by day after day by working at that company. So essentially, when you say you hate a company, you hate normal people who are getting by every day, doing something they may even love. If you ask me, someone who hates a company is severely disturbed, and I just can't fathom why anyone would go that far mentally.

There are plenty of companies I don't like, but I don't say I hate them; I just don't buy their products. I vote with my wallet, not by voicing hate. To each his own, I guess.

Anyways, on topic with this post: I don't think it's a 3DMark bug either; I think it's a bug or something in NVIDIA's drivers. It does sound fishy that scores come out higher with the intro screens on than without while benchmarking. That just doesn't sound right.

Ailuros:

I've noticed that someone at the Rage3D forums who owns an R8500 said he got lower scores when he ran without intro screens. I think more investigation is needed in this matter to really come to a conclusion.
 
It's easy.

I bought a GeForce 256 for $300, then 3 months later the DDR version came out for the same price. I bought a Ti 500, and 3 months later a faster GeForce 4 came out for the same money. I don't care that ATI, after what, 7 months, released a card with twice the RAM; it was 7 months and only more RAM. I would rather have just given Nvidia my money, had them kick me in the you-know-whats, and called it a day. It's like going out and buying a Sony Wega for a grand, and then 3 months later the same TV but with HDTV comes out and is cheaper. I also hate women... but that's a story for when I'm not drunk.
 
I also bought a GF 256 SDR for $320 (tax included), and a week later the DDR came out for the same price. Did I feel cheated? Not one bit, because the card really met my expectations and played all of my games perfectly fine. I remember when I first plugged the card in and played a game of Quake 2; it was so much faster than my TNT1, and then when Q3Test came out, it really kicked ass. Shortly after that, I joined the staff at 3DGPU and haven't really had to buy a card since, but I don't get cards for free; I invest a lot of time in my site, and as they say, time = money. Anyways, everyone has different criteria than I do, but back then, the GeForce 256 cards were really powerful for the games that were out in those days. Yes, the fact that T&L games didn't really come out until years later is a bit of a disappointment, but I didn't buy the card for the future, I bought it to play my games at the time.
 
Matt Burris said:
I've noticed that someone at the Rage3D forums who owns an R8500 said he got lower scores when he ran without intro screens. I think more investigation is needed in this matter to really come to a conclusion.

Matt,

I didn't notice any such post; can you provide a link? I am unable to reproduce such behavior on my Radeon 8500.
 
Anyone using Nvidia drivers is affected.

Some of us aren't :) The GeForce 2 can't run the Nature test, so my results aren't significantly different with the title screens off. But I suppose someone will claim NVIDIA found a different way of cheating to get my scores inflated to a whopping 2850 (woosh!).

I bought a GeForce 256 for $300, then 3 months later the DDR version came out for the same price. I bought a Ti 500, and 3 months later a faster GeForce 4 came out for the same money.

So, you hate NVIDIA because you're a terrible shopper and they have been good about sticking to their product schedules? Sounds like you would have been happier with a Savage 2000. ^_^
 
OpenGL guy:

I was wrong; the post was at SlackerCentral. I visit over 10 forums and sometimes get them mixed up. The post is by Plaster in this thread; he said he got a 2.5% decrease with an R8500. http://boards.slackercentral.com/showthread.php?s=&threadid=20151

--------------

Anyways, I did get an email tonight from one of my contacts who used to work at Real3D in the drivers department, and according to him, it looks like the "bug" may really be an optimization. Here's what he wrote to me:

Saw your comment on the nVidia "bug" that causes results to drop in 3DMark 2001 without splash screens. That's an optimization for sure; we used to do the same sort of tricks at Real3D in Winbench 2D and 3D. We would detect the splash screen (check for a texture of a certain size and format, etc.) and at that time we would flush all our buffers and stuff. At that point you know that your scores aren't being measured, so that's a REAL good time to do any critical "clean up and get ready" work or anything that takes more time than normal.
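To make the quoted trick concrete, here's a minimal sketch of that detect-and-flush pattern. Every identifier, texture size, and format code in it is invented for illustration; this is not actual driver code from anyone:

```c
/* Hypothetical sketch of the detect-and-flush trick described above.
 * All names, sizes, and format codes are invented for illustration;
 * this is NOT actual NVIDIA or Real3D driver code. */
#include <stdbool.h>
#include <stdint.h>

typedef struct {
    uint32_t width, height;
    uint32_t format;             /* hypothetical surface format code */
} TextureDesc;

/* Stubs standing in for driver-internal housekeeping. */
static void flush_all_buffers(void) { /* would flush queued GPU work */ }
static void warm_up_caches(void)    { /* would do "get ready" work   */ }

/* Splash screens are plain 2D images with a fixed, recognizable
 * size and format, so the driver can fingerprint them on upload. */
static bool looks_like_splash_screen(const TextureDesc *t)
{
    return t->width == 1024 && t->height == 512 && t->format == 21;
}

/* Hypothetically called on every texture upload: while the splash
 * screen is up, no frames are being timed, so it's a "free" moment
 * for expensive cleanup. */
void on_texture_upload(const TextureDesc *t)
{
    if (looks_like_splash_screen(t)) {
        flush_all_buffers();
        warm_up_caches();
    }
}
```

The design point is that texture uploads give the driver a cheap fingerprinting hook, and the splash screen is the one moment the benchmark guarantees no frames are being timed.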

Gonna post this up on 3DGPU tomorrow morning; I'm about to head to bed, but thought I'd throw this out there for you guys to talk over.
 
If this isn't a 3DM-specific cheat, I wonder if ATi just got a clue as to how nV optimizes its drivers?
 
BRiT said:
No. It's an Nvidia driver-specific issue. Anyone using Nvidia drivers is affected. I've seen scores from GF3, GF3-Ti, and GF4 all affected the same way. Scores on the GF4-4600 seem to be affected by nearly 10%.

We made similar comparisons over in the 3DCenter forums, and a couple of people, myself included, didn't get any significant drop in either "Nature" fps or 3DMark score when disabling the splash screens. In my case, we are talking 6238 / 6208 3DMarks and 24.0 / 23.6 fps in Nature (with / without splash screens). That's on a regular GF3 with a clean 29.42 installation on Win98SE.
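For scale, those two deltas work out to well under 2% each; a quick arithmetic check:

```c
/* Quick check of the splash-on vs. splash-off deltas quoted above;
 * nothing vendor-specific, just percentages. */
#include <stdio.h>

static double drop_pct(double with_splash, double without_splash)
{
    return 100.0 * (with_splash - without_splash) / with_splash;
}

int main(void)
{
    printf("3DMarks: %.2f%% drop\n", drop_pct(6238.0, 6208.0)); /* ~0.48% */
    printf("Nature:  %.2f%% drop\n", drop_pct(24.0, 23.6));     /* ~1.67% */
    return 0;
}
```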

On the other hand, Macci @ gamershq claims that when he upgraded from 3DMark2k1 to 3DMark2k1 SE, he experienced the same phenomenon with the Nature score on his R8500. Newer drivers had fixed the issue, he said.

Also, one of my friends sees a 20fps drop in the Dragothic low detail test when turning off the splash screens, but hardly any drop in Nature. The average performance drop in Nature seems to be about 5-10% for GF3/GF4 boards, with some people seeing no drop at all and others 30%.

*shrug*

ta,
.rb
 