Nvidia Against 3D Mark 2003

RussSchultz said:
Sigh. The server ate my message.

To recap: I'll be painted an nvidiot for saying this, but I agree-ish.

From what I have heard, 1.4 isn't being supported in games.

I'm going to refer to "PS 1.4 functionality" rather than to the name "PS 1.4".

Well, Doom 3 does support that functionality, and it is reasonable to presume that similar shadow-creation approaches will also benefit from it. Current games without such approaches do not, but it is not reasonable to assume that future games also will not, unless they specifically ignore the benefits it can offer. It is a logical fallacy to make a statement like "that many of the pixel shaders use specific elements of DX8 that are promoted by ATI but aren't common in current games" while trying to promote the "CineFX" architecture, and I don't think they could make it while trying to promote the GF FX in the same breath (and notably absent from that quote is any mention of the GF FX and CineFX).

It is natural that games that utilize PS 2.0 functionality can benefit from PS 1.4 functionality, and this simple issue is what their statement completely sidesteps.
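To put rough numbers on that (purely a back-of-the-envelope sketch; the texture counts are my own assumptions, not anything nVidia or Futuremark has published): a Doom 3-style per-light "interaction" needs something like six texture lookups, PS 1.1 gives you four texture stages per pass, and PS 1.4 gives you six samples (across two phases), so:

[code]
// Back-of-the-envelope pass counting for a Doom 3-style per-light interaction.
// The texture counts below are illustrative assumptions, not measured data.
#include <cstdio>

// Passes needed when a shader model can sample at most maxTexturesPerPass
// textures in a single pass (simple ceiling division).
int passesNeeded(int textureLookups, int maxTexturesPerPass) {
    return (textureLookups + maxTexturesPerPass - 1) / maxTexturesPerPass;
}

int main() {
    const int lookupsPerLight = 6; // e.g. bump, diffuse, specular, normalization
                                   // cube map, light falloff, light projection
    const int ps11Textures = 4;    // PS 1.1: 4 texture stages per pass
    const int ps14Textures = 6;    // PS 1.4: 6 samplers (across two phases)

    std::printf("PS 1.1 path: %d passes per light\n",
                passesNeeded(lookupsPerLight, ps11Textures));
    std::printf("PS 1.4 path: %d pass per light\n",
                passesNeeded(lookupsPerLight, ps14Textures));
    return 0;
}
[/code]

Fewer passes means the geometry gets transformed and the framebuffer gets read and written fewer times per light, which is where the benefit would come from. Real shaders are also limited by arithmetic instruction counts and dependent-read rules, so treat this strictly as an illustration.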

Of course, I'm not writing games, so I don't know for sure. NVIDIA works with a lot of developers (but I'm sure they don't get too many 1.4 questions ;)), but supposedly Futuremark has been doing more than simply writing cool looking demos/benchmarks and actually determining what will be the future. Who to believe? I dunno.

About efficiency...well, is it efficiency in general, or efficiency on GF cards in particular? I think nVidia has that complaint about DX pixel shaders above 1.3 as a whole, not to mention the current ARB fragment shader functionality, even on the GF FX due to the design decisions they made, and they are making noise about it in regards to 3dmark because of its wide consumer recognition. But maybe there is some valid efficiency issue...will nVidia educate us about it, or is this just going to be talk without any backup?

About single-textured for the first game test...hmm...from what I read it is not single textured. Do they mean just one "color texture"?


Sounds a lot like the 8500 versus GF 4/GF 3...except the 8500 was clocked slower than the competition, and the GF FX is clocked faster. Also, I'm not sure the theoretical advantages of the GF FX would map especially well to games, whereas the 8500's advantages seem to have done so successfully. nVidia making their case for this with some example shader code to illustrate their statements would certainly help, and it is not like they would want to keep developers in the dark on how to optimize for the GF FX. They (and others) have much more shader experience under their belt than ATI did when trying to make the case for the 8500.

Hmm...well, it would be interesting to see benchmarks showing to what extent Cg optimization benefits in comparison to DX 9 HLSL compilation across a variety of cards (for example, DX 9 HLSL and Cg for the GF FX, then both for the 9700 Pro).

I think (personally) that the tests should use HLSL and let the best man win. That would give each vendor the ability to use their card to the best of its abilities. Of course, if the HLSL can't be reduced to work on 1.1, what to do...
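Just to illustrate what "use HLSL and let the best man win" could look like in practice, here is a minimal sketch using the DX9 utility library's D3DXCompileShader; the shader source and the profile list are placeholders I made up, not anything from 3dmark:

[code]
// Minimal sketch: compile one HLSL pixel shader against several profiles and
// report which targets it fits. Needs the DX9 SDK (link with d3dx9.lib).
#include <d3dx9.h>
#include <cstdio>
#include <cstring>

// Trivial placeholder shader; a real benchmark shader would be far more
// complex and might simply not fit the older profiles.
static const char* g_hlsl =
    "sampler baseMap : register(s0);\n"
    "float4 tint : register(c0);\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{\n"
    "    return tex2D(baseMap, uv) * tint;\n"
    "}\n";

int main() {
    const char* profiles[] = { "ps_2_0", "ps_1_4", "ps_1_1" };

    for (int i = 0; i < 3; ++i) {
        LPD3DXBUFFER code = NULL;
        LPD3DXBUFFER errors = NULL;
        HRESULT hr = D3DXCompileShader(g_hlsl, (UINT)std::strlen(g_hlsl),
                                       NULL, NULL,           // no macros/includes
                                       "main", profiles[i],
                                       0, &code, &errors, NULL);
        if (SUCCEEDED(hr)) {
            std::printf("%s: compiles (%lu bytes of shader code)\n",
                        profiles[i], (unsigned long)code->GetBufferSize());
        } else {
            std::printf("%s: failed: %s\n", profiles[i],
                        errors ? (const char*)errors->GetBufferPointer()
                               : "(no error text)");
        }
        if (code) code->Release();
        if (errors) errors->Release();
    }
    return 0;
}
[/code]

A trivial shader like this fits every profile; the interesting cases are the ones where compilation against ps_1_4 or ps_1_1 fails and the benchmark has to decide what to fall back to, which is exactly the "what to do" question above.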

MDolenc has pointed out that HLSL won't address the central issue nVidia has, which is that their architecture pre-GF FX is simply less capable with regards to some advanced techniques, and that this deficiency has increasing opportunities to be exposed going forward. As for PS 2.0 and beyond, I do agree that HLSL usage might offer opportunities in the future for improved performance.

Anyways, it is interesting. It's also interesting that HardOCP is following this line of thinking. Two minds think alike? Or a mindless drone?

I dunno. I don't follow how it was OK for 3dmark 2001 to be used when it had a test using features that didn't exist in games (and weren't foreseen to exist for a while yet), where it simply failed to run on other cards, and how it was still OK the last time one vendor's architecture's functionality wasn't fully exposed by the test, yet for some reason this time, when that happens, it disqualifies 3dmark as a benchmark.

It could be viewed as they are trying to correct past mistakes, but the case doesn't seem as strong now as it was then, so that seems a bit bass ackwards. But perhaps future information will change that outlook.

Of course, the most cynical outlook is that the site's motivation is to gain significant "nVidia brownie points" in a way that doesn't lose significant "ATI brownie points". With the whole "nVidia dropping the Quack dime" thing brought to light, this will likely be a popular view.

For myself, I tend more towards guessing they are "doing the right thing too late" with a healthy skepticism that it is indeed the right thing right now, if only to avoid getting mired in considering politics with such a scarcity of real information.

EDIT: for clarity
 
Crusher said:
So Joe, changed your mind about your "3DMark might as well be called nVidiaMark" comment yet? :)

OT: I love your sig. :)
Is it new or have I just never had a keen enough eye?
 
WTF ?? They backed out of the beta program ..


Great benchmark when you are leading the benchmark tests, but not good when you are not...bahhhh
 
Funny thing is, with the newer GFFX drivers they are leading in the total score at least.

Mucho weirdness going on here too, it seems to me.

Edit: perhaps NVIDIA wanted Futuremark to use Cg... gets slapped by almost everyone for such a poor joke .. hehe :LOL:
 
Heck, I still feel the advanced shader feature test in 3dmark:2001SE, and the decisions MadOnion made in creating it, were odd, but I also understand it's hard to please all parties and to be fair; and this may leave an odd impression on people who question things and motives a lot.

I don't see any oddness yet with Futuremark.

I have a problem with this from nVidia:

and that many of the pixel shaders use specific elements of DX8 that are promoted by ATI but aren't common in current games


If nVidia has a problem with 1.4 shaders in this FutureMark test......well, why didn't nVidia have a problem with the advanced shader test in 3dmark:2001?

Maybe because the FutureMark test counts toward the score and looks nice, while the Advanced Shader test in 3dmark:2001SE did not?
 
Humus said:
OT: I love your sig. :)
Is it new or have I just never had a keen enough eye?

I just added it last week; I thought it was pretty clever. Supposedly a quote from Steven Wright.
 
I find it absolutely hilarious that there are some websites actually including Nvidia propaganda in their 'reviews'. For those who don't know, there is a document that Nvidia is circulating that lists how it thinks 3D Mark 2003 should be discredited (even though they were in the BETA program as recently as December). Just think, when that actually comes to light, they will have to deal with everyone knowing that they sold their soul to Nvidia over this. A total lack of credibility from where I sit (in my comfy chair at home).

Now, as for this PS 1.4 nonsense, there are games out there that use it (Tiger Woods 2003 actually does). Nvidia doesn't like to acknowledge ATI's DX8.1 market size because that would defeat their argument. Plus, considering that any PS 2.0 part can run PS 1.4 shaders, I really don't see the point. It isn't the fault of MadOnion / Futuremark that Nvidia is actually BEHIND in the technology area. Now, has Nvidia et al. ever shown that those effects can be done significantly better with PS 1.3 than with PS 1.1? I haven't seen / heard anything, but am willing to listen. Funny, though, that I can use something Nvidia itself says: developers are not really using PS 1.3 since it really can't do much more than 1.1. So, in games, developers are likely to support PS 1.1, 1.4, and maybe 2.0, since a lot of stuff can be achieved via 1.4. Plus, developers have FAR more incentive since the install base of PS 1.4 capable parts is growing rapidly by the month. Unlike the proliferation of GF4 MXs, which Nvidia seems to think actually means something.

My 2 CAN cents worth.
 
Has ATI ever sent anything like that to Futuremark regarding 3Dmark2001? I doubt it eh. And why did Nvidia leave the beta program - maybe because all their PR can't compete with head-on number comparisons.

Course the funniest bit is the recent conspiracy posts against ATI cards and Worm's "dundedumdedum" post. heheheh

If it's only the DX8 scores that are very different, then it could be said that the reason is the use of PS 1.1 vs 1.4, so maybe the best test is with DX9 PS 2.0 - if the NV30 falls short by a big margin there, then there's no excuse. But really it's Nvidia's own fault: PS 1.4 may have been instigated by ATI, but it's not proprietary - it's part of DX8.1 and therefore they should support it - especially if it's so much superior to the earlier PS 1.1 or 1.3.

(Unless, of course, you don't subscribe to the idea that these tests are meant to find the best speed with the best effects + best image quality.)
heh.. that thought reminds me that I doubt Nvidia are overjoyed at having image quality tests within 3DMark2003 either!
 
SirPauly said:
If nVidia has a problem with 1.4 shaders in this test......well, why didn't nVidia have a problem with the advanced shader test in 3dmark:2001?

Easy, because at the time they didn't have any parts which supported 1.4.

As a result, GF3 and GF4 fell back to PS1.1 which they supported fully and correctly, so they ran fine.

GeForce FX, on the other hand, is ludicrously slow at pixel shader 1.4, judging from its speed on the advanced pixel shader test (~180 fps for the R9700 Pro, ~110 fps for the GFFX - see AnandTech).
 
Actually, NVidia probably didn't complain about 3dMark2001 because PS1.4 wasn't used in any tests that affected the overall score.
 
The GeForce 4 Ti series came AFTER the 8500 series. Nvidia had PLENTY of time to work DirectX 8.1 support (i.e. PS 1.4) into those cards. The only people they should blame are themselves for this.

Futuremark used the most up-to-date versions of the last three DirectX releases: DX7, DX8.1, and DX9.

Maybe with the upcoming release of DX9.1 there will be a newer build of 3dmark03 that will put their cards to even more shame. It seems like they are trying to get out now before it gets much worse.
 
/me evil mode on

When the nv35 comes out and beats the r350 in 3dmark2003, Nvidia will support Futuremark.

Two months later, the support is gone...

/me evil mode off
 
Maybe this is silly to some of you, but I have some trouble understanding the use of PS 1.4 in Doom3 and 3DMark03.
The way I understand it (grossly simplified), if you have a Doom3-like shader consisting of a number of "rendering tasks that have to be done", I see two approaches:

1) target the shader at PS 1.1 only, and it will consist of several passes on any hardware that is able to run it (PS 1.1, PS 1.4, PS 2.0, etc.), making it equally "slow" on all hardware
2) target the shader at PS 1.4, making it (most of the time) faster on hardware that is able to run it (PS 1.4, PS 2.0, etc.), but provide a (multipass) fallback for PS 1.1 hardware

So approach No. 2 would be beneficial for hardware supporting PS 1.4 and above, and has no disadvantage for PS 1.1 hardware, as it will run at the same speed as approach No. 1.

If my above assumptions are correct, and games would use PS 1.4 hardware in the same way, as a middle ground between "old" PS 1.1 compatibility and a speed boost on newer hardware, it seems only beneficial to me.

If somebody could point out the relative cost/difficulty of supporting a PS 1.4 path with a PS 1.1 fallback, as opposed to a single PS 1.1 path, it would be greatly appreciated.
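I'm not a game developer either, but in Direct3D terms the choice described above mostly comes down to a capability check plus maintaining a second shader path. Here is a minimal sketch of that selection logic, with made-up routine names and pass counts (nothing from an actual engine):

[code]
// Sketch: selecting a shader path from the device's reported pixel shader
// version. The RenderInteraction* routines and pass counts are hypothetical
// placeholders, not taken from any real engine.
#include <d3d9.h>
#include <cstdio>

// Hypothetical per-light routines an engine might provide (stubbed out here).
void RenderInteractionSinglePassPS14()        { std::printf("  PS 1.4: one pass\n"); }
void RenderInteractionMultipassPS11(int pass) { std::printf("  PS 1.1: pass %d\n", pass); }

// In a real engine 'caps' would come from IDirect3DDevice9::GetDeviceCaps().
void DrawLightInteraction(const D3DCAPS9& caps) {
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4)) {
        // PS 1.4/2.0 hardware: the whole interaction fits in a single pass.
        RenderInteractionSinglePassPS14();
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        // PS 1.1/1.3 hardware: the same math spread over several blended
        // passes. This is roughly what a PS 1.1-only shader would cost on
        // *every* card, i.e. approach No. 1 above.
        const int kFallbackPasses = 2; // assumed, varies per effect
        for (int pass = 0; pass < kFallbackPasses; ++pass)
            RenderInteractionMultipassPS11(pass);
    }
    // A fixed-function fallback for pre-shader hardware is omitted here.
}

int main() {
    D3DCAPS9 ps14Card = {}; ps14Card.PixelShaderVersion = D3DPS_VERSION(1, 4);
    D3DCAPS9 ps11Card = {}; ps11Card.PixelShaderVersion = D3DPS_VERSION(1, 1);

    std::printf("R200-class card:\n");    DrawLightInteraction(ps14Card);
    std::printf("GF3/GF4-class card:\n"); DrawLightInteraction(ps11Card);
    return 0;
}
[/code]

So the extra cost of approach No. 2 is essentially writing and testing that second path; the PS 1.1 cards still run the same passes they would have run under approach No. 1.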
 
It's a shame Futuremark didn't include details like what precision each test is run at. The 3Dmark2003 score seems to be very different between Nvidia's 42.63 and 42.67 drivers. I assume what they've done is force 12/16-bit int mode, but that's just guesswork - the test app should supply this information.
 
Actually, according to the HardOCP report alluded to earlier, Nvidia's GFFX got the highest overall score in 3DMark 2003 but lost most of the tests of speed at specific tasks. HardOCP said this was because the GFFX played the fake games faster than the 9700, and further stated that they thought it was kind of a silly program.

First let’s look at the 9700 Pro, it is a strong performer from the start, even the new Catalyst 3.1 drivers do not improve the score noticeably. There is a very small 0.004% increase in performance at 1280x1024 with the new Catalyst 3.1 drivers. But the GeForceFX, with a new set of drivers, has now beaten the 9700 Pro in the overall 3DMark result.

I am sure Nvidia said that they dislike it because it made the GFFX look bad, but it is obviously not representative of reality in any case, so the point is moot.

Nevertheless, I care little for synthetic benchmarks, and much more about how a card actually performs in the situations I use it for, since I don't use it to impress others with my 3DMark score.
 