Nvidia Against 3D Mark 2003

I don't know if anyone has been over to [H] and read the new editorial about the validity of 3DMark and other synthetic benchmarks as tools for comparing the performance of GFX cards. Kyle makes a lot of good points and what he says is completely valid.

However, I have this little voice in the back of my head saying, "Why now?" Why, all of a sudden, has this become an issue for some people when these concerns have been essentially the same for years? I really hate to lower myself to the level of some, but this really smells bad to me.

A new version of 3DMark is released that, for the first time, is not somewhat biased towards Nvidia-based cards and in fact is more geared toward rewarding manufacturers who build more advanced features into their GPUs. Nvidia doesn't like this. Nvidia complains publicly. Nvidia exclaims, "We don't want to divert our resources towards optimising for a benchmark!" Strangely, something they had no issue with previously. And all of a sudden webmasters start declaring this benchmark worthless.

Kinda makes you think, don't it?

We all knew 3DMark has always had issues concerning its validity as a benchmark. We all knew it rather favoured one manufacturer's hardware above another's. What we never heard was any other manufacturer pissing and moaning about it before.

Maybe I've watched too many episodes of 'The X-Files', but if more sites decide not to use 3DMark03 as a benchmarking tool, then I suspect that some of them might have had a phone call from a certain manufacturer suggesting they might not receive new hardware to review quite as quickly as they once did, should they decide to benchmark Futuremark's product.

Now where's Scully?? :)
 
Joe DeFuria said:
...

Damnit...

3DMark is not about current games. It's forward looking.

9500/9500 Pro might be on par with GeForce4 Ti in current games....but what happens when games start making heavier use of shaders? Forget for a moment the fact that the GeForce4 Ti won't even be able to run DX9 shaders...

Will be interesting to see how the 9500/9500 Pro stacks up to the GeForce4 Ti in Doom3.

In any case, in a DX9 test, Radeon 9500 is actually "infinitely faster" than a GeForce4 Ti.

Which brings me to my next point, and something that I think FutureMark does not stress enough:

...

The 3DMark score is not strictly about PERFORMANCE! I agree that FutureMark does not do enough to educate people on what the 3DMark score is supposed to mean. If it were just a "performance number", then ALL TESTS would run on all cards, and the 3DMark score would be given in FPS.

The 3D Mark score is supposed to represent the "Goodness" of a Card. So, a card that scores 2X that of another card is to be considered "Twice as good." Yes, that is an "abstract" way of looking at the score, but that is much closer to reality than looking at the score from a purely performance perspective.

Is the 9500 Pro "twice as fast" as a GeForce4 Ti4600 in Non AA situations? No. But because it supports DX9 shaders, it can be argued that it's "twice as good."

Thank you for making this post Joe!

For 3 pages of reading I'd been intending to say the exact same thing. Everyone is somehow hooked on the idea that the 3DMark score is entirely about the performance of a video card. There is one level that is about performance, but your way of putting it, that the overall 3DMark score reflects the "goodness of a card", is exactly right (and explained better than I would have managed).
 
cellarboy said:
However, I have this little voice in the back of my head saying, "Why now?" Why, all of a sudden, has this become an issue for some people when these concerns have been essentially the same for years? I really hate to lower myself to the level of some, but this really smells bad to me.

Why now? A new version just came out. That is usually when groups get a chance to evaluate and discuss it for the first time.

Nvidia and [H] gave their reasons why they thought this version was worse for benchmarking than previous versions. You can agree with those reasons or not. Your choice.

You seem to be saying Nvidia doesn't like the benchmark because their scores are low.
Nvidia says it doesn't like the benchmark because their scores are low due to how the tests operate.

No need to make a conspiracy out of it; you're saying the same thing...

As far as bias is concerned I don't think it is at all clear 3DMark was biased in earlier versions or with this version. 3DMark tests what it tests. If one card scores better in those tests it does not mean 3DMark is biased. It means that card is better in those particular tests. Whether those tests reflect a majority of real games or a particular DX version is another question entirely.
 
You seem to be saying Nvidia doesn't like the benchmark because their scores are low.
Nvidia says it doesn't like the benchmark because their scores are low due to how the tests operate.

Actually, nVidia didn't say anything about their scores being low at all. nVidia is trying to take the "moral high ground" by just saying the benchmark isn't "right".
 
Boy o Boy, Nvidia sure has been easy pickings of late. Maybe they stayed in the beta program just long enough to gain the inside info they are now using to discredit it.

Kyle's arguments are, and have been, valid for all versions of 3DMark, though I think it would be fair to say that '03 has enough new features that if at any time in its history it should be used, it is now. Perhaps not for its aggregate score, but for the info it can shed on a card's capability in a specific area. Has anyone tried the image quality test to compare FSAA and Aniso on the 9700 or FX?
 
Has anyone tried the image quality test to compare FSAA and Aniso on the 9700 or FX?

Unfortunately, there aren't any FX boards in circulation to expect that. (Only enough, apparently, for Hard OCP to post FX 3DMark benchmarks with super secret drivers...) ;)

I don't expect we'll get to see some of the comparisons we REALLY want to see (like those you mention) until either B3D gets an FX, or they get released to the public.

Maybe another month or two the way things are going...
 
Luminescent said:
Yes, Nelg, the single textured fillrate still lags the R300's despite the 125 MHz clock speed advantage. This fact, alongside the lower pixel shading performance, continues to point to a <8 pixel-per-clock architecture.

Has anyone inspected the "always writes to the framebuffer in 32-bit" theory I proposed? You'd get better 16-bit output, and it isn't as though many people benchmark at 16-bit, or that nvidia could push 16-bit performance as an advantage.

I also think of nVidia saying their color compression is "always on"; when I consider the possible reasons for this behavior, it would seem silly to implement that color compression flexibly enough to work with 16-bit.
As far as it being a driver issue, I assume someone has tried the fillrate tests in 16-bit with newer drivers?
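For what it's worth, here is a rough sketch of the pixels-per-clock arithmetic behind that inference. The clock speeds and fillrate figures are illustrative assumptions chosen only to match the quoted ~125 MHz clock gap; they are not measured results.

```python
# Back-of-the-envelope pixels-per-clock estimate from a measured
# single-texture fillrate. All numbers below are assumptions.

def implied_pixels_per_clock(fillrate_mpix_per_s, core_clock_mhz):
    """Measured fillrate divided by core clock gives the effective
    pixels written per clock cycle."""
    return fillrate_mpix_per_s / core_clock_mhz

fx_clock, r300_clock = 450.0, 325.0            # MHz (assumed, ~125 MHz apart)
fx_fillrate, r300_fillrate = 1800.0, 2500.0    # Mpixels/s (assumed)

print(f"FX:   ~{implied_pixels_per_clock(fx_fillrate, fx_clock):.1f} px/clock")     # ~4.0
print(f"R300: ~{implied_pixels_per_clock(r300_fillrate, r300_clock):.1f} px/clock") # ~7.7
```

If the FX's single-texture fillrate trails the R300's even with a sizeable clock advantage, the implied pixels per clock comes out well under 8, which is the argument being made above.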
 
Deflection said:
Why now? A new version just came out. That is usually when groups get a chance to evaluate and discuss it for the first time.

One could also say that the 'why now' factor also relates to the fact that Nvidia has new hardware hitting the shelves that doesn't perform as well as their competitors' in some of the newer tests. They could have made a comment at any time during the life span of 3DMark2001, saying, "Well, 3DMark is nice and all, but it really doesn't reflect our products' performance in real games." Of course, they didn't, probably because their cards beat everyone else's in these tests for the longest time.

And that Nature test in 3DMark2001 sold a whole hell of a lot of GF3s and GF4s...

If Nvidia REALLY had an issue with the testing methods, they could well have come out and made a comment when they left the 3DMark beta program. Doing it now stinks of damage control, trying to make consumers believe that it's the software that doesn't work 'correctly', not that their hardware doesn't perform as well as they claimed.
 
Joe DeFuria said:
Unfortunately, there aren't any FX boards in circulation to expect that. (Only enough, apparently, for Hard OCP to post FX 3DMark benchmarks with super secret drivers...) ;) ...

I thought Hans Blix was looking for those ;) or is he looking for a thermonuclear device aka the NV30. :oops:

Brent, did these new drivers have any effect on in-game performance, or are they tailored for 3DMark?
 
I think what Joe said about "goodness" is correct, but I think that most people here will replace their cards before DirectX 9 becomes a standard, and thus the goodness of being DirectX 9 compatible is not as significant as the goodness of running current games fast.

Measuring "goodness" is like measuring opinions, or IQ, and I am sorry but I don't agree with most people's vehemence on this issue either. I just bought a 9500 Pro because it seems great for the price, but I have never in the past been stressed by nvidia's IQ performance.

That is my two cents.
 
I have found the solution to Nvidia's benchmarking dilemma: the new 3DMAXISMARK, see how many SIMCITY points you can score (ATI patch must not be installed in order for it to work). :LOL:
 
Well, I have to say that I've been captivated by the quality of discussion over here, and that prompted me to register as well.

I especially like the "goodness" of the card argument by Joe DeFuria, and the goodhearted but intense PS1.4 arguments.

Having been responsible for making these difficult decisions in benchmarks from Final Reality to 3DMark2000, and partially 3DMark2001, it's somehow rewarding to see this kind of argumentation, one way or the other. (Note that I have not been involved with 3DMark03.)

Making forward-looking content is hard. Looking at the past benchmarks, in hindsight, I think I can be happy with how we succeeded. But I think it is only right that you guys continue to doubt and analyze the decisions.
 
cellarboy said:
If Nvidia REALLY had issue with the testing methods, they could well have come out and made a comment when they left the 3Dmark beta program.

Not if they signed an NDA (edit) that didn't expire until the product was released.

If they could have said anything, I imagine they would have started then. They obviously didn't need to wait until the benchmark was released to the public to see the numbers.
 
Not if they signed an NDA (edit) that didn't expire until the product was released.

True.

If they could have said anything, I imagine they would have started then. They obviously didn't need to wait until the benchmark was released to the public to see the numbers.

Not necessarily true. nVidia's argument has more "credibility" if they can show that their FX is competitive (3DMark score) with the Radeon 9700. So, it's possible that they needed to "wait" until they had the chance to tweak their drivers to increase the 3DMark performance. Imagine how nVidia would have looked if FX scores were only as fast as the old drivers show. :oops:
 
I guess actions speak louder than words. From what nvidia says about 3DM03 (I am too lazy to write 3DMark03 :) ) you'd think they'd turned their back on it, but I have just tested the latest 42.86 unofficial Det drivers and had an interesting result.

I have been using 40.52 for a long time now as they have been quickest in 3dmark2001 for Win 2000. Tonight I tried 42.86, because I saw what the later drivers did for the FX. My score went from 1850 to 2050. That's 10%+. I bet if I go back to 3dmark2001 they are the same or worse :).

Regards

Andy
 
Cyborg said:
http://xbitlabs.com/news/story.html?id=1045073804

Xbit falls to nvidia's smear campaign.

This is getting ridiculous.

How is posting the document for people to read equal to falling to NVIDIA's smear campaign? I thought that's what reporters were supposed to do, report things. If you'd bother to read their comments at the top, they even say they will use 3DMark03 in their reviews.

Anyway, I think what's much worse than NVIDIA's letter, are sites like Digit-Life that do complete video card reviews using absolutely nothing but 3DMark scores to compare the performance. I think we can all agree that omitting 3DMark03 scores in favor of real game benchmarks is a Good Thing (at least, that's what everyone around here has been asking for for years anyway). And I think we can all agree that using 3DMark scores, and nothing else, is a Bad Thing.

It looks bad for NVIDIA to start whining about it now, but the truth is 3DMark03 is a completely different beast than 3DMark01SE was. I don't think people will be flaunting their 3DMark scores quite as much with this version as they did with the last one. The scores just don't seem as relevant this time around. I've worked with shadow volumes enough to know that if FutureMark is doing them the way NVIDIA claims they are, then NVIDIA's claims that the benchmarks will never correlate to any game are pretty accurate. Generating geometry 3 times per frame is just ridiculous.
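To put some (made-up) numbers on that complaint, here is a toy accounting sketch comparing a stencil shadow volume renderer that re-processes the scene geometry for every pass against one that reuses it. The pass structure and vertex counts are assumptions for illustration only, not a description of how 3DMark03 actually renders.

```python
# Toy per-frame vertex accounting for stencil shadow volumes.
# All counts and the pass breakdown are made-up assumptions.

def vertices_per_frame(scene_verts, lights, resubmit_each_pass):
    # A common stencil-shadow structure: one ambient/depth pass, then
    # one shadow-volume pass and one lit pass per light.
    passes = 1 + 2 * lights
    if resubmit_each_pass:
        # Geometry is transformed/skinned again for every pass.
        return scene_verts * passes
    # Base geometry processed once, plus extruded volume geometry per light.
    return scene_verts + scene_verts * lights

scene_verts = 200_000  # assumed scene size
lights = 1

print(vertices_per_frame(scene_verts, lights, resubmit_each_pass=True))   # 600000
print(vertices_per_frame(scene_verts, lights, resubmit_each_pass=False))  # 400000
```

With one light and these assumptions, re-submitting per pass works the geometry three times per frame, which is the "3 times per frame" figure being objected to above.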
 
Relevant to what, Crusher?

EDIT: Hmm, and you go and edit and add a sentence that makes a start to answering my question.
 
demalion said:
Relevant to what, Crusher?

To the actual capabilities of the video cards as they relate to games. I'm actually quite surprised all the pro-PVR people aren't slapping NVIDIA on the back; they've been saying the same thing about 3DMark all along. Except this time, it's not just TBDRs that are being abused.

I edit a lot, I don't proofread very carefully before submitting posts, and usually think of a better way to say things after I hit the button :)
 