Nvidia Against 3D Mark 2003

IMO the fact that 3DMark isn't representative of real games is inconsequential. 3DMark should actually be as system-independent as possible and be bottlenecked only at the graphics card. Only a fool would assume that a couple of hundred 3DMarks (on 2001 anyway, dunno about the scale on 2003) show any real difference, but a few thousand would be quite conclusive as to which card is better, no matter if there is a small bias one way or the other. I wouldn't trust a review that only had a 3DMark score, but it is a nice and simple comparison between cards that are far apart (2x the 3DMarks would be roughly 2x the performance).

The thing about game benchmarks is that they tend to be very specific and mostly first-person shooters (i.e. UT2k3, Serious Sam, etc. are widely used), and they are just as inconsequential to a non-FPS player as the tests in 3DMark. No benchmark is ever 100% accurate; it's a rough indicator of performance, and 3DMark2003 looks like it does fine for that purpose. Nvidia are simply annoyed that it doesn't favour them, and heck, that's fair enough too; they're out to make a profit after all.

I love some of the game demos in 3DMark2003 though (haven't got it yet but I saw the screenshots). Mighty impressive, especially when you consider that a couple of years ago prerenders barely looked as good as this stuff. Shaders make a huge difference to the visual quality and realism possible in real time. I can't wait to see what the next few years will bring.
 
I've got a little inside bet going...

I will bet that at least a couple of threads will pop up, this one included, that will end up sporting employees from both ATI and nVidia :)

I said about 3 weeks ago that nVidia's PR was going to end up working some serious OT to make up for the disaster that would become the FX...and it looks like I was understating it.

We all know that the arguments that nVidia raise are _fairly_ legitimate...However, how ironic is it that, only now, are they raising these objections? I mean, you could pretty much take most of their beefs with 3DM2003 and apply them to any/all 3DM's in the past.

Funny how, all of a sudden, they don't like the results. If there was no such thing as the ATI 9700, and their competitors continued to push out crappy 3D solutions, we all know this would not be happening right now.

I wonder what their next PR move is going to be...
 
Typedef Enum said:
I wonder what their next PR move is going to be...

Well, they should take a hint from somebody who did it the wrong way, namely Nixon: It's not the lies, but those damn cover-ups that kill you! ;)
 
Evildeus said:
demalion said:
Heh, what constitutes "barely DX 9"? I'm suspecting it is using shader 2.0 functionality instead of "2.0+".
I don't think so. I think it's because game 4 is a mixture of DX9 and DX8.1, as pointed out in all the reviews of 3DMark03.

I thought such "reviews" were the ones echoing nvidia's statements, i.e., nvidia's "review"?

This is the only DirectX 9 game test in this benchmark, and only partially DirectX 9 at that. Mother Nature takes us through an outdoor environment reminiscent of the Nature test in 3DMark2001. This test does use Pixel and Vertex Shader version 2.0 which is DX9 spec. But it also uses a mixture of Vertex Shader 1.1 and Pixel Shader 1.4. Every leaf on the trees is individually animated using Vertex Shader 2.0; however the grass is animated using Vertex Shader 1.1. The water in this test is the best I have ever seen rendered in real time on my computer. It is rendered using Pixel Shader 2.0 and is truly remarkable looking. The sky is also rendered using Pixel Shader 2.0 utilizing a higher dynamic range of color DX9 cards support. The ground however is produced using Pixel Shader 1.4.
http://www.hardocp.com/article.html?art=NDI4LDM=

I read that as pure marketing speak. How do you "mix" DX 8 and DX 9 shaders? If you mean using lower precision, that sounds an awful lot like a GF FX-style optimization. How else is that phrase justified? I'm open to a correction of some gap in my understanding (since I currently don't see how it makes sense), but it would take more than comments such as those.

For one example, what if animating the grass only needed instructions that also exist in vs 1.1? Does it not count unless you use new vs 2.0 capabilities everywhere? :oops: What objective sense does that make?

To be fair, that ps 1.4 comment sounds like it could mean something significant...do they mean instructions that are not in ps 2.0 but are in ps 1.4, or perhaps can be done in ps 2.0 using more advanced instructions instead of instructions common to ps 2.0 and ps 1.4 (which ones would those be?)? We need some more info on the significance of this...
What I'd guess, however, is that what they are calling a "ps 1.4" shader is a shader using ps 2.0 that could also be done with ps 1.4 instructions due to relative simplicity (but maybe not done with ps 1.3, as I'm guessing they would have used that if they could have gotten away with it). This seems consistent with the way their vs 1.1 comments look to me.
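To make that guess a bit more concrete, here's a toy sketch of the "expressible in more than one shader version" idea: treat each shader model as a set of available instructions, and a shader as expressible in any model whose set covers everything it uses. The profile names are real DirectX profile names, but the instruction sets below are invented placeholders purely for illustration, not the actual ISA differences.

```python
# Toy model: a shader "counts" as a given pixel shader version if the version's
# instruction set covers everything the shader uses.  The op names are made up.
PROFILES = {
    "ps_1_1": {"op_a", "op_b"},
    "ps_1_4": {"op_a", "op_b", "op_c"},
    "ps_2_0": {"op_a", "op_b", "op_c", "op_d", "op_e"},
}

def expressible_in(shader_ops: set[str]) -> list[str]:
    # Return every profile whose instruction set is a superset of what the shader uses.
    return [name for name, ops in PROFILES.items() if shader_ops <= ops]

# A "simple" shader (say, the ground) that happens to use nothing beyond the
# 1.4-level ops shows up as expressible in both 1.4 and 2.0 -- which may be all
# the "mixture of shader versions" language really amounts to.
print(expressible_in({"op_a", "op_c"}))          # ['ps_1_4', 'ps_2_0']
print(expressible_in({"op_a", "op_d", "op_e"}))  # ['ps_2_0']
```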

If there is some alternate technical basis to this, I'm all ears, and, as I've said, I think the test should have used HLSL in any case (I even used bold letters! :p ). Currently, however, I have a strong impression of dump trucks full of bovine waste material queuing up for delivery.
 
Simon F said:
Humus said:
RussSchultz said:
I think (personally) that the tests should use HLSL and let the best man win. That would give each vendor the ability to use their card to the best of their abilities.

I'll second that.
Hee hee... I can just picture all the effort that would go into the HLSL compilers to make them recognise specific shaders and instantly choose some hand-optimised object code. ;-)

I'll second that.

I'm pretty sure this will happen with the assembly code too...
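For illustration, here's a hypothetical sketch of the sort of thing Simon is joking about: a driver-side "compiler" that fingerprints incoming shader source and swaps in hand-tuned code when it recognises a known benchmark shader. All the names here (HAND_TUNED, fingerprint, compile_shader, generic_compile) are made up, and this isn't a claim about what any real driver actually does.

```python
import hashlib

# Hypothetical table mapping fingerprints of well-known benchmark shaders
# to hand-optimised replacements shipped inside the driver.
HAND_TUNED: dict[str, str] = {
    # "<fingerprint of some benchmark shader>": "<pre-baked optimised code>",
}

def fingerprint(shader_source: str) -> str:
    # Normalise whitespace so trivial formatting changes don't defeat the match.
    canonical = " ".join(shader_source.split())
    return hashlib.sha1(canonical.encode()).hexdigest()

def generic_compile(shader_source: str) -> str:
    # Stand-in for the ordinary, general-purpose optimising compiler path.
    return f"compiled({len(shader_source)} chars)"

def compile_shader(shader_source: str) -> str:
    key = fingerprint(shader_source)
    if key in HAND_TUNED:
        # Recognised a specific shader: return the hand-written version instead.
        return HAND_TUNED[key]
    return generic_compile(shader_source)
```

The same trick works just as well on an assembly listing as on HLSL source, which is the point about it happening with the assembly code too.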
 
demalion said:
I thought such "reviews" were the ones echoing nvidia's statements, i.e., nvidia's "review"?
Oh yeah? So before, 3DMark was pro-Nvidia, and now the anti-3DMark sites are pro-Nvidia? :rolleyes:

Well, that's what I understand from their PR statement; take it or leave it.
 
From the results we're seeing, it looks like same-generation cards are matching up similarly to how they do in real games: the GF4 4200 seems to be just ahead of the Radeon 8500, and the GFfx Ultra is a little bit ahead of the 9700 Pro. So why all the whining from Nvidia?

Here's how things seem to stack up:

GF4MX 440 - ~250
Radeon 9000 - ~1000
Radeon 8500 - ~1100
GF4 Ti 4200 - ~1300
GF4 Ti 4600 - ~1600
Radeon 9500 - ~2800
Radeon 9500 Pro - ~3500
Radeon 9700 - ~4000
Radeon 9700 Pro - ~4800
GFfx Ultra - ~5100

Obviously, Nvidia's problem is that while their same generation cards all win, ATI has a higher generation part at every price point (until GFfx comes out). Let's see if we can trace Nvidia's complaints back to the actual reasons their GPUs are uncompetitive in 3dMark03...

The 128MB GF4MX 440 sells for about the same as a 9000; the only GF4MX 440 scores I can find in the ORB are 64MB, but it shouldn't affect anything since they can only run game1. How much is the 9000 getting from its ability to run game2 and game3? Well, according to this guy's report, about 535 marks. The 9000 is still about twice as fast as the GF4MX 440 in game1 alone.

The reason for this would appear to be...the fact that the 9000 is 4x1, the GF4MX is 2x2, and game1 is mostly single-textured, which is exactly what Nvidia complained about! Direct hit!
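To put rough numbers on that: here's a quick sketch of peak single-textured fillrate for the two layouts. The 4x1/2x2 configurations come from the point above; the core clocks are my approximate assumptions, so treat the exact ratio loosely.

```python
# Rough single-texture fillrate comparison; clock speeds are approximate
# assumed retail values, used only for illustration.
def single_texture_fillrate(pipes: int, clock_mhz: int) -> int:
    # With one texture per pixel, any extra TMUs per pipe sit idle,
    # so peak output is simply pipes * clock (in Mpixels/s).
    return pipes * clock_mhz

radeon_9000 = single_texture_fillrate(pipes=4, clock_mhz=250)  # ~1000 Mpix/s
gf4mx_440   = single_texture_fillrate(pipes=2, clock_mhz=270)  # ~540 Mpix/s

print(radeon_9000 / gf4mx_440)  # ~1.85x -- close to the ~2x gap seen in game1
```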

Now, the GF4 Ti4600 amazingly sells for more than the 9500 and 9500 Pro. Let's compare some subscores:

The GF4 Ti4600 gets around 750 marks from game1, 415 from game2, and 490 from game3.

The 9500 NP gets around 790 from game1, 600 from game2, 740 from game3, and 700 from game4.

The 9500 Pro gets around 925 from game1, 780 from game2, 960 from game3, and 890 from game4.

If we take game4 out of our scores, we have the 9500 NP around 2100 and the 9500 Pro around 2600. The 9500 NP gets the remaining 500 marks over the 4600 primarily via roughly 50% better performance on games 2 and 3. Nvidia's complaint about these games is that they both use PS1.4 with PS1.1 fallbacks; if the R300 can run PS1.4 shaders at roughly the throughput NV25 can do PS1.1 shaders, then it's very reasonable to guess that the extra passes are the only thing separating the 4600 and 9500 NP in this bench. Another on-target complaint from Nvidia. (OTOH, are the shaders really common enough that the 9500 NP's smaller multitexturing rate never comes into play?) Meanwhile, it's interesting to note that the 9500 NP also fails to take a penalty in game1 to the extent it is single-textured.

Meanwhile, the 9500 Pro's remaining 1000 marks over the 4600 come in the form of 175 marks on game1 (higher pixel fillrate, again with the single-texturing), and 350 on game2 and 475 on game3. 9500 Pro's margin over 4600 is nearly double that of 9500 NP in these tests; obviously this is pure pixel shading power coming to bear.
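For anyone who wants to redo the arithmetic, here's a quick sketch using the same rough per-test marks quoted above. The figures and the simple "sum of the game tests" treatment are my approximations for comparison purposes, not Futuremark's official score weighting.

```python
# Rough per-test marks quoted above (ORB figures, stock clocks); the Ti4600
# has no game4 entry since it cannot run the DX9 test at all.
scores = {
    "GF4 Ti4600": {"game1": 750, "game2": 415, "game3": 490},
    "9500 NP":    {"game1": 790, "game2": 600, "game3": 740, "game4": 700},
    "9500 Pro":   {"game1": 925, "game2": 780, "game3": 960, "game4": 890},
}

def total(card: str, skip_game4: bool = False) -> int:
    return sum(v for k, v in scores[card].items()
               if not (skip_game4 and k == "game4"))

# Totals with game4 excluded: ~1655, ~2130 and ~2665 -- the "around 2100"
# and "around 2600" figures above.
for card in scores:
    print(card, total(card, skip_game4=True))

# Per-test margins over the Ti4600: the 9500 NP is roughly 45-50% ahead in
# games 2 and 3, and the 9500 Pro adds about 175/365/470 marks in games 1/2/3.
for card in ("9500 NP", "9500 Pro"):
    for game in ("game1", "game2", "game3"):
        diff = scores[card][game] - scores["GF4 Ti4600"][game]
        ratio = scores[card][game] / scores["GF4 Ti4600"][game]
        print(f"{card} {game}: +{diff} marks ({ratio:.2f}x)")
```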

Finally, let's look at a 9700 NP score for kicks. Compared to 9500 Pro, we see gains of 16% in game1, 23% in game2, 16% in game3, and only 1% in game4. (These might be lower than actual due to the stronger CPU score on the 9500 Pro.) Clearly game4 is completely shader-limited (no surprise there), while the other three games benefit moderately from the extra bandwidth; game1 perhaps only because it raises single-texture fillrate, whereas games 2 and 3 are still somewhat bandwidth-bound.

Compared to today's games, then, it appears 3dMark03 is very much shader-limited, and much less bandwidth-bound than average. One reading of this conclusion is that it would benefit the GFfx, although as we've seen its higher clocks don't seem to translate to higher shader performance; still, it's worth noting that the 9700's extra bandwidth may be buying it less vs. the GFfx than in many games. OTOH a case can be made that this is a proper weighting on Futuremark's part: the argument being that a 2004 budget/mainstream card (what this suite should be aimed at, IMO) will probably have a considerably higher shader throughput-to-bandwidth ratio than, say, the 9700.

As for Nvidia's arguments, they're still pretty silly. Nonetheless it's nice to see they at least address what are likely the primary causes of the Nvidia lineup's disastrous showing compared to the ATI product in each card's price bracket. Before I started this post I thought the complaints were not only childish but also failed to speak to Nvidia's actual weaknesses.

Congrats Nvidia: this temper tantrum would appear to be the best-targeted piece of engineering you've released in the past 12 months! :LOL:

So them's my conclusions. Perhaps someone will bother searching the ORB for better examples than I picked (I didn't bother matching CPUmark scores, for example, although I was careful to at least stick to stock GPU clocks).

One final note: it's interesting to see that the 9000 loses just as many marks compared to the GF4 4600 on the game2 and 3 tests as on game1, despite its ability to use the PS1.4 path. Is the pixel shader throughput on R200/RV250 that much worse than NV25? (It sort of looks like the presumably FP32-related pixel shading penalty on NV30!) Or is there something else slowing them down in those tests?
 
Evildeus said:
demalion said:
I thought such "reviews" were the ones echoing nvidia's statements, i.e., nvidia's "review"?
Oh yeah? So before, 3DMark was pro-Nvidia, and now the anti-3DMark sites are pro-Nvidia? :rolleyes:

How did you equate my specific discussion of the text you provided with this rather empty statement?

What you are quoting refers to the nvidia comment I was talking about when you replied to me, and to how the excerpt you provided echoed exactly the same conclusions nvidia reached without ever answering their basis (as I discussed in the rest of that post). "Review" is in quotes because answers to the questions I posed would be required for those conclusions to be a review instead of an echo of nvidia's "published" comments.

Well, that's what I understand from their PR statement; take it or leave it.

:?: With the rolled eyes and "Oh yeah?" above, I'd assume this is sarcasm? Do you have the answers to the questions I posed in the rest of that post?
 
Well, considering the settings, it's not really representative of how the cards work in actual games, except for high-end parts (9700 and GFFX)...
 
@demalion

Well, I give you an answer and you talk of biased sites, biased reviews, pro-Nvidia, etc. So what am I supposed to think? So yeah, I'm using sarcasm.

I think that corresponds to Nvidia's point of view. Now if you don't agree with their point of view, that's OK. But if you think that because Nvidia says X, any site saying X is just echoing them, good for you, but it's not my way of thinking, sorry.

Concerning the DX9 test, why didn't they just use PS 2.0? For me, it's only partially DX9 because it uses some old techniques when it could have used the new ones introduced by DX9. It's supposed to be a DX9 bench, not DX9 when it suits them and anything else when it doesn't.

When you look at Dave H's ranking, perhaps something should catch your eye :)
 
Evildeus,
I addressed what you just asked me in the post you replied to with the rolled eyes. As I mentioned...read past the part you quoted, and you'll see what I'm thinking about those issues. Namely, the idea that you have to use features unique to vs 2.0 and ps 2.0 everywhere for it to count as DX 9. As I asked...if you have answers to those questions, feel free...
 
demalion said:
Evildeus,
I addressed what you just asked me in the post you replied to with the rolled eyes. As I mentioned...read past the part you quoted, and you'll see what I'm thinking about those issues. Namely, the idea that you have to use features unique to vs 2.0 and ps 2.0 everywhere for it to count as DX 9. As I asked...if you have answers to those questions, feel free...
Well, I don't understand why Futuremark didn't just use PS 2.0 and VS 2.0. I have no answer; perhaps they thought it was overkill?

I understand your point, demalion, and DX9 is backward compatible, so I really understand what you are saying and I tend to agree. I'm just saying that, from my point of view, Nvidia used the way of thinking I pointed out a few posts before.
 
Typedef Enum said:
We all know that the arguments that nVidia raise are _fairly_ legitimate...However, how ironic is it that, only now, are they raising these objections? I mean, you could pretty much take most of their beefs with 3DM2003 and apply them to any/all 3DM's in the past.

Bingo! Now that's what's got me miffed. Why the change of heart now by nV?


Matt,
it's no big deal whether or not 3DMark uses a real game engine. Look at 3DMark2001 and the K2/GF2. The stock GF2 was about 1000 marks ahead of a K2, but in Max Payne the performance of the two was neck and neck. So even with a real engine that is used in a real game, 3DMark2001 was still a poor predictor of results in real games.
 
nVidia has a point. Without FSAA, the Radeon 9500/9500Pro is a bit slower than GF4Ti4200/4600, in most current games you can find. Yet, if Joe Average looked up a videocard review and saw "GF4Ti4600 1600 3dmarks, Radeon9500Pro 3500 3dMarks", he'd think that the Radeon was nearly twice as fast.

Worse yet, the GeForce4MX's performance is disastrous. Sure, it's a crappy video card that I wouldn't recommend anyone buy, but is it really 1/4 the performance of a Radeon9000? I doubt it.

Looking at those scores, it's clear that nVidia has a point. Unless you are willing to say that a Radeon9500 (non-pro!) is actually 75% faster than a GeForce4Ti4600, 3dmark03 is clearly biased against the GeForce4 generation.

It will be interesting to see whether nVidia reverses their stance once NV31/NV34 come out, which will probably give the 3dmark dominance back to nVidia.
 
nVidia has a point. Without FSAA, the Radeon 9500/9500Pro is a bit slower than GF4Ti4200/4600, in most current games you can find.

Damnit...

3DMark is not about current games. It's forward looking.

9500/9500 Pro might be on par with GeForce4 Ti in current games....but what happens when games start making heavier use of shaders? Forget for a moment the fact that the GeForce4Ti won't even be able to run DX9 shaders...

Will be interesting to see how the 9500/9500 Pro stacks up to the GeForce4 Ti in Doom3.

In any case, in a DX9 test, Radeon 9500 is actually "infinitely faster" than a GeForce4 Ti.

Which brings me to my next point, and something that I think FutureMark does not stress enough:

Yet, if Joe Average looked up a videocard review and saw "GF4Ti4600 1600 3dmarks, Radeon9500Pro 3500 3dMarks", he'd think that the Radeon was nearly twice as fast.

The 3DMark score is not strictly about PERFORMANCE! I agree that FutureMark does not do enough to educate people on what the 3DMark score is supposed to mean. If it were just a "performance number", then ALL TESTS would run on all cards, and the 3DMark score would be given in FPS.

The 3D Mark score is supposed to represent the "Goodness" of a Card. So, a card that scores 2X that of another card is to be considered "Twice as good." Yes, that is an "abstract" way of looking at the score, but that is much closer to reality than looking at the score from a purely performance perspective.

Is the 9500 Pro "twice as fast" as a GeForce4 Ti4600 in Non AA situations? No. But because it supports DX9 shaders, it can be argued that it's "twice as good."
 
Cuz nothing makes your Sims(tm) play better than pixel shaders. :)


But in all honesty, hopefully it'll light a fire underneath their butts to get rid of their DX7 class stuff. boo hiss on that stuff.
 
I think the fire was already lit with the Radeon9000.

With any luck, NV31/NV34 will quickly replace all that DX7 crap. I feel sorry for any sucker who bought a GF4MX and thought it'd perform like a GF4.
 
BoddoZerg said:
nVidia has a point. Without FSAA, the Radeon 9500/9500Pro is a bit slower than GF4Ti4200/4600, in most current games you can find. Yet, if Joe Average looked up a videocard review and saw "GF4Ti4600 1600 3dmarks, Radeon9500Pro 3500 3dMarks", he'd think that the Radeon was nearly twice as fast.

How do you know it won't be twice as fast with the use of DX9 shaders? (Right now a 9700 is 3x faster than an FX in PS 2.0 tests.) I didn't see anyone listening to the original Radeon owners who were unable to run the Nature scene in 2001 and got hammered on the ORB in '3DMarks' vs. the GeForce 3.
The GeForce 3 was putting up 3x the scores of a Radeon; was a GeForce 3 really 3x faster than a Radeon back then?

[image: SS_1024.gif]


Nope :!:
 
Methinks this reaction might be more telling as to the timetable for the NV35. The current Nvidia line (NV2x) is getting spanked across the board, and the NV30 seems to have an uncertain future. 3DM2003, it seems, will not show any current Nvidia product in a positive light, and maybe they are worried that this will hurt sales far too much until they can release an improved NV30. By the way, does the PS 2.0 test shed any light on the GeForce FX 8x1 or 4x2 debate? http://www.beyond3d.com/forum/viewtopic.php?t=4252
 
Yes, Nelg, the single-textured fillrate still lags the R300's in light of the 125 MHz clock speed advantage. This fact, alongside the lower pixel shading performance, continues to point to a <8 pixel-per-clock architecture.
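For what it's worth, here's a back-of-the-envelope version of that argument. The core clocks in the sketch are assumptions on my part, used only for illustration; substitute whatever figures you prefer.

```python
# Back-of-the-envelope check on the 8x1 vs 4x2 question.
FX_CLOCK_MHZ   = 500   # assumed GeForce FX Ultra core clock (illustrative)
R300_CLOCK_MHZ = 325   # assumed Radeon 9700 Pro core clock (illustrative)

def st_fillrate(pixels_per_clock: int, clock_mhz: int) -> int:
    # Peak single-textured fillrate in Mpixels/s.
    return pixels_per_clock * clock_mhz

r300      = st_fillrate(8, R300_CLOCK_MHZ)  # ~2600 Mpix/s
fx_if_8x1 = st_fillrate(8, FX_CLOCK_MHZ)    # ~4000 Mpix/s -- would clearly beat the R300
fx_if_4x2 = st_fillrate(4, FX_CLOCK_MHZ)    # ~2000 Mpix/s -- would trail the R300

# A measured single-texture fillrate that trails the R300 despite the clock
# advantage is what points toward the "fewer than 8 pixels per clock" case.
print(r300, fx_if_8x1, fx_if_4x2)
```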
 