Nvidia Against 3D Mark 2003

jvd, I thought the B3D article showed exactly how the final score was computed.
 
Of course, the most cynical outlook is that the justification of the site is gaining significant "nVidia brownie points" in such a way that significant "ATI brownie points" aren't lost. With the entire "nVidia dropping the Quack dime" thing brought to light, this will likely be a popular view.

Hey Brent....

Rename "3dmark03.exe" to "3dQuack03.exe" and run it on the new FX Drivers.... ;)
 
Personally, I don't think the problem Nvidia has with this benchmark is the use of PS 1.4. The results (from what I've seen) still seem to suggest the GeForce4 Ti 4200 is faster than the Radeon 8500, even though the GeForce4 gets "penalized" for not being able to execute 1.4 shaders. So, what's the problem?
But if you look at what graphics cards are available from ATI and Nvidia, Nvidia simply CAN'T recommend that benchmark. Their current value line (the 4MX) gets around 200 or so marks and can't really run any significant part of the benchmark, whereas ATI's value part (Radeon 9000) gets around 1000 (I hear it already, "the Radeon 9000 is 5 times faster!"). Likewise, in the mainstream/performance segment the GeForce4 Ti obviously can't hold a candle to the R9500 pro either. Only the GFFX seems to do pretty well against ATI's best, but even this chip is not yet available (and never might be - but that's a different story).
Nvidia is just late to the party, and I wouldn't be surprised if they change their opinion about Futuremark 2003 in the future (when their value/mainstream NV3x chips become available).

mczak
 
Entropy said:
This smacks strongly of corporate pressure, and since nVidia is one of the sources of revenue for FutureMark, I wonder about the financial consequences for our Finnish friends, and how much further pressure nVidia can bring to bear on them.
Entropy

Don't worry... we lived next to the Soviet Union the whole time it existed. So we kind of got used to living under "mighty power" pressure while still making decisions as we like, not as they like. ;)

(We actually fought two wars against them, but hardly anyone here knows about the 1939 Winter War, so it would be way too long a story and way off topic as well.)

Worm, if you read this, pass my greetings to Patrick and the whole team: "Perkele! Nobody has bossed us around here since 1917, and nobody is going to start now! The line is good, not to say brilliant. Keep it." :)
 
demalion said:
Sounds a lot like the 8500 versus GF 4/GF 3...except the 8500 was clocked slower than the competition, and the GF FX is clocked faster.
Last time I checked the 8500 was clocked at 275MHz, that's higher than the competition (except Ti4600 at 300MHz)
 
From VE3d...

The point HardOCP makes (and which I believe is a valid concern) is that they don't fall back to PS 1.3 (i.e. supported on GeForce 3 & 4 cards) but instead fall all the way back to PS 1.1. So of course the ATI cards have an advantage over most Nvidia cards in the tests that use the pixel shaders.

This is the problem: Nvidia has spread nothing but misinformation for years. And it continues. PS 1.2 and 1.3 do not increase performance at all. They have no effect on the number of passes it takes to render a shader. Never have, never will. Nvidia DAMN WELL KNEW that the DX 8.1 spec was going to be PS 1.4 and that it was a subset of the coming PS 2.0.

Whose fault is it??? That's right, THEIR OWN!! Now they are going to cry and whine like little babies? 3DMark 2001 used PS 1.1 and took no real advantage of PS 1.4. ATi said nothing, and Nvidia pushed it as the best benchmark on earth. They did not even peep about the lack of PS 1.2/1.3 in the SE release. Why? Like I said, PS 1.2/1.3 is for show and tell only. Makes no difference at all. Notice not one game has ever included PS 1.2/1.3, not even for a single small effect??? Now it's all different.

For the record....
Cards that support PS 1.1
GF3
GF4
GFFX (when it ships)
All exactly the same, only clocked differently across model variations.

Cards that support PS 1.4
R 8500
R 9000
R 9500/9500 pro
R 9700
R 9900 (in a couple more weeks)

Most of the ATi cards are also DX9, whereas Nvidia does not even have a SINGLE DX9 card out yet. No wonder their scores suck. The other question they raise is about vertex shaders. ATi cards have a QUAD vertex engine and Nvidia only supports an array of 3... whose fault is that??? They can't even win the DX9 shader tests with their PAST DX9 super card clocked 200MHz FASTER than anything else. Again, whose fault is that???

I really, honestly hope that most people and web sites see through this and do what's right: not pandering to Nvidia, they don't deserve it.

Edit: Doom-III and several other coming games are using PS 1.4
 
Actually, what NVIDIA has stated is pretty much spot on. The stinky part is that what they have stated is... and always has been... the case with 3DMark. It just gives the perception that it's not until the issue somehow affects their hardware that they feel the need to speak up and make such a statement.

I thought it was common knowledge (Doomtrooper, HellBinder, Sharkfood, et al) that FutureMark was biased against ATI and that Nvidia had effectively bought off FM. That 3DMark2001 SE's advanced pixel shader test was deliberately made to be biased against ATI and pro-NV.

Not sure if you put my name in that list incorrectly or what have you, but even the most basic level of reading comprehension would show that my standpoint has always been that *inconsistencies* in the benchmark have made it clearly unsuitable for comparing different IHVs' products. It's unfortunate that the inconsistencies are generally in favor of one particular IHV at any given time. I'm unsure where you arrived at the "bought off FM" comment, but it's definitely unfounded to say the least.

If anything, history has shown that IHVs were not truly active participants but instead were favored by nothing more than developer preference, or possibly some form of personal reasons on the part of the benchmark's developers. I've seen several 3D demos with fileid files that read like "NVIDIA #1! 3DFX SUXORZ! DOWN WITH THE ##@#! VIDEOCARDS!" and that do not run on anything but one IHV's hardware. It's silly to expect NVIDIA went around writing checks to all these sources; it's more logical to assume that personal preference or "fandom" is the end result.

Interestingly enough, if you DO dig deep enough into history, NVIDIA once before had some commentary on 3DMark2000, which was to the tune of soothing owners of GeForce256 cards who had built up expectations about how their hardware would perform given their 3DMark2000 results. After seeing benchmark scores 2x-4x those of any other IHV at the time, droves of new GeForce owners wanted to know why not a single available game showed the same kind of lead as 3DMark2000. PlanetGeforce used to mirror that official statement, and I don't know if anyone still has that document, as it was pretty telling. It was also pretty honest.

So I do not discount what NVIDIA has stated concerning 3DMark03 - I only question the timing of such a statement. Obviously, when designing a benchmark, many crossroads are going to be hit: should we single texture or multitexture, what kind of shader program lengths should we use, how many lights per scene, what shader revision to code for, etc., etc. At each juncture, if the benchmark is going to be single purpose, a decision has to be reached. That decision should *NEVER* be poised around what intended performance it will achieve under IHV A, B, or C, but instead the most logical use for the scene and the best all-around choice for all IHVs (whenever possible). If particular choices are made without any reasonable or logical basis, that's when bias starts to become suspect - and no, bias doesn't always come in the form of financial assistance... more often it comes in the form of self-instigated policy for a favorite.
 
Hellbinder[CE] said:
Now you tell me how the hell anyone or Nvidia can claim that there are more products out there that offer PS 1.1, or that PS 1.4 is not a valid offering. Not to mention IT IS THE DX 8.1 SPEC!!!
Um, because nVidia still has the higher marketshare? That and ATI's low-end DX8 parts just haven't been out for very long.

As for PS 1.4 being "the DX 8.1 spec," there has yet to be a single video card that has supported every feature in any current DirectX release. And DX 8.1 also added PS 1.3 (and 1.2? not sure), not just PS 1.4.
 
Xmas said:
demalion said:
Sounds a lot like the 8500 versus GF 4/GF 3...except the 8500 was clocked slower than the competition, and the GF FX is clocked faster.
Last time I checked the 8500 was clocked at 275MHz, that's higher than the competition (except Ti4600 at 300MHz)

You're right... while not 500 versus 325, 275 versus the 240 of the GF3 Ti 500 (the card it had trouble outperforming but eventually did) is still a noticeable clock speed advantage. While the magnitude of the disparity is different, another similarity is that the 275MHz clock speed (of the non-LE cards) came as sort of a "last minute surprise".

My comment should be "except the 8500 wasn't clocked as much higher than the competition as the GF FX".
 
galperi1 said:
The GeForce 4 Ti series came AFTER the 8500 series. Nvidia had PLENTY of time to work DirectX 8.1 support into those cards (i.e. PS1.4)
No they didn't. GF4 came out last spring, about one year after GF3. It was never supposed to be more than an optimized GF3 chip, just like what the GF2 was to GF1. I don't know when the specs for PS1.4 were finalized and available to NVidia, but I guess it was some months after the GF3 launch. So count for yourself the months NVidia had to completely redesign their pixel pipelines (PS1.1 and 1.4 have some big differences), test them, and bring the chip to market in time, and that with GFFX scheduled to come out half a year later. Simply put: not enough.
And NVidia doesn't have unlimited resources. NV40 development had to start, NV30 was a big project running, nForce2 too, so there wasn't much left to put into the design of NV25 and NV17.
 
depth_test said:
I thought it was common knowledge (Doomtrooper, HellBinder, Sharkfood, et al) that FutureMark was biased against ATI and that Nvidia had effectively bought off FM. That 3DMark2001 SE's advanced pixel shader test was deliberately made to be biased against ATI and pro-NV.
"The wheel turns, does it not, Ambassador?"

:LOL:, I never saw that... instead of defending the benchmark, why did Nvidia leave 'the beta program' if it's not a big deal? Did ATI leave when they were getting hammered on the ORB? Wave your pom-pom somewhere else :rolleyes:

My concern is to have an unbiased and fair benchmark, and this version does that, including all popular Pixel Shader versions up to 2.0.
I'd still rather see games used in benchmarks, always have.
 
I'm more interested in which sites will parrot Nvidia's stance and, after years and years of heavy 3DMark usage, suddenly stop using it. I'm sorry, and I don't mean offense to anyone, but [H]'s conclusion sounds like it was taken right off a memo/PDF from Nvidia.
 
Actually, what NVIDIA has stated is pretty much spot on. The stinky part is that what they have stated is... and always has been... the case with 3DMark. It just gives the perception that it's not until the issue somehow affects their hardware that they feel the need to speak up and make such a statement.

I completely disagree. I don't think Nvidia has even one single valid point. 3DMark 2003 is a completely legit, fair benchmark. There are some major games coming that use PS 1.4; Doom-III is just one.

They made their own bed. Now they don't want to sleep in it.
 
Sharkfood said:
Actually, what NVIDIA has stated is pretty much spot on. The stinky part is that what they have stated is... and always has been... the case with 3DMark. It just gives the perception that it's not until the issue somehow affects their hardware that they feel the need to speak up and make such a statement.

- * snip! * -

If particular choices are made without any reasonable or logical basis, that's when bias starts to become suspect- and no, bias doesn't always come in the form of financial assistance.. and more often comes in the form of self-instigated policy for a favorite.

Futuremark is a business. Most businesses have a single overriding goal - the bottom line. Futuremark is dependent on the manufacturers of the equipment they test for part of their revenue - unknown amounts from the various sources. It is logical to assume that they, as any business, know which way their bread is buttered.

However, they are in a position that is both tricky and good. They enjoy a fair amount of independence, and it is in the interest of the industry that they are healthy and get media attention, because they quantify what the PC industry has always used to spur upgrades - performance improvements. In their own way, Futuremark helps move kit. And in all probability they do it to a degree far beyond their size. With 23 or so employees, the money they need to operate is less than coffee money to Intel/Microsoft and of course the gfx providers. They don't necessarily have to sell out, because the function they perform benefits the whole industry, generally speaking. If they mean to operate long-term, they are probably better off trying to stay as clean as they can in this business, because that will help ensure their long-term viability and continued press/consumer interest.

Maybe.
Because review sites still use BAPCo Sysmark, and BAPCo has been a marketing front for Intel for several years. It's a crap benchmark to start with: it measures apps that are solved problems, by unknown means (until exposed by AMD), it produces a single figure of merit through secret weightings of results, the group of companies behind it all have an interest in promoting Intel processors, and Intel has controlled and funded the outfit. No one in their right mind should touch Sysmark with a ten-foot pole, even before their pants were pulled down. How big does the writing on the wall have to be before you ask yourself if maybe, just maybe, the controlling influence behind the product influences the product itself?
But review sites still use Sysmark.

So while I would be able to convince myself that having a strict "we walk the straight line" policy would be in the best interests of FutureMark, I have to recognize that reality paints a murkier picture. Any benchmark provider has to be critically scrutinized, because they might be bought, with several kinds of coin, and commercial outfits are particularly closed to the public eye.

And FutureMark is sensitive to pressure from its sources of revenue - any business is. To what extent nVidia has been "discussing issues" out of sight of the public eye, we can never know. But it would be very surprising if their open statement was the only way they indicated their annoyance.

As a community, 3D enthusiasts should be suspicious of Futuremark and call foul if they see cause. But as long as Futuremark maintains reasonable integrity, 3D enthusiasts should also be damn grateful that there is a company that supplies them with polished tools for free.
It seems that FutureMark is now providing greater incentive for people to pay for their product, both by restricting access to more functions and by providing new ones for paying customers. It can only be in our interest that they are less economically dependent on their business partners, and a bit more dependent on the community.

Therefore, it's your duty to buy 3Dmark03.
And then, as paying customers, hold Futuremark to a high standard.

Entropy
 
Well, I totally support not using 3DMark to draw conclusions about what videocard to buy, but to suggest that the results it gives are somehow meaningless is a bit much. 3DMark has done a pretty decent job of estimating DirectX performance in future games - certainly better than any other benchmark I can think of. 2k3 is a DX9 benchmark. Someone explain to me (I'm serious, I'm not pretending to know anything about this) how a DX9 game wouldn't take advantage of a card with ps1.4 over ps1.1? If I coded a game to take advantage of ps2.0, would I need a different codepath for ps1.1 and ps1.4 cards? Just curious. I guess the point is that (note the name change) Futuremark sees the relevance of its benchmark as its ability to help guide people who want to upgrade infrequently. The irony is most people here probably upgrade too frequently for it to have any meaning. If I want a card to run IL-2 Sturmovik today, I'm gonna want to know the IL-2 scores, not the 3DMark scores. DOH!!

<edit--forgot to mention, I understand the problems people have with its inclusion of dx7 and dx8.1 in the final score. After all, they've already got a benchmark to test dx8 performance. However, if I were Futuremark, I'd have a tough choice to make. Not including them splits up your product. The beauty of 3DMark, for all its flaws, is it gives you a single number. Why do you think so many reviewers, etc. use it? Simplicity.>
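For what it's worth, here's a rough sketch of how a ps2.0-era game might pick its shader path from the D3D9 caps at startup (the enum and path names are made up for illustration, not from any actual game or from 3DMark):

Code:
#include <d3d9.h>

// Hypothetical example: pick a pixel shader path based on what the card
// reports. A ps2.0 title would typically still ship ps1.4 and ps1.1
// fallbacks, chosen roughly like this at startup.
enum ShaderPath { PATH_PS20, PATH_PS14, PATH_PS11, PATH_FIXED };

ShaderPath ChooseShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS20;  // R300 / NV30 class
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;  // R200 class: some effects need fewer passes
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_PS11;  // GF3 / GF4 Ti class
    return PATH_FIXED;     // DX7-level fallback
}

So in practice you'd likely end up with separate 2.0, 1.4 and 1.1 shader sets anyway, which is exactly why the 1.4-or-1.1 fallback decision matters for the scores.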
 
Sections that HotHardware state are excerpts from the comments nVidia are circulating about 3dmark03:

nvidia said:
"3DMark03 combines custom artwork with a custom rendering engine that creates a set of demo scenes that, while pretty, have very little to do with actual games. It is much better termed a demo than a benchmark. The examples included in this report illustrate that 3DMark03 does not represent games, can never be used as a stand-in for games, and should not be used as a gamers’ benchmark."

Well, this just strikes me as hypocrisy, but I tend to still agree. What I think has changed compared to 3DMark 2001 is that all vendors are now playing the shader performance game (or intend to), and a benchmark limited by shader functionality is a more predictable criterion for indicating future performance, since the methods of exploiting it going forward will depend to a large degree on the same fundamental factor (how fast basic shader instructions can be executed). The issue I see as far as that goes is whether the techniques used are optimal, or at least reasonably optimal, for a wide variety of cards...

nvidia said:
"Unfortunately, Futuremark chose a flight simulation scene for this test (game 1). This genre of games is not only a small fraction of the game market (approximately 1%), but utilizes a simplistic rendering style common to this genre. Further, the specific scene chosen is a high altitude flight simulation, which is indicative of only a small fraction of that 1%."

Well, it seems similar to space sims as well as flight sims to me, and that might grow the 1% figure a bit (where are the percentages for the game types they would have preferred, to put it into perspective, and the recognition of the weighting of this test in the score?). And what is the "game market"? Games of the type sold altogether, or 3D games? Is it the percentage of game owners who play flight simulators (perhaps the number of actual importance for determining the applicability of the test), or the percentage of games sold that are flight sims? Is that 1% as "pulled out of a stinky place" as it sounds to me, or is it based on even one of the questionable, but real, determinations above?

Their argument strikes me as a red herring...if I were to attack its applicability as an indicator of game performance, I'd focus on whether the flight modelling was as demanding on the CPU as it would be in an actual flight simulator.
But as far as I'm concerned, 3dmarks have never been a good indication of game performance, and the blind adherence to it as such was not any more correct when nvidia profited from it. What I've hoped for it to be is a good indicator of graphics card power that the user could learn to more accurately use as part of an evaluation when comparing cards. The problem was, in my view, that the previous 3dmark (except synthetic tests) couldn't even serve that function well. In my view, due to shaders, the game tests have become more like "complex synthetic tests" than they could have been prior, and as such the results are more useful...but, then the actual fps values would serve too, or perhaps even better, using frame based rendering and determining time to completion. Hmm...I really like the idea of the last for the shader tests...keeps things in the proper proportion in my view (think back to the UT 2003 thread for my reasoning :p ). The added bonus of this is that the "mindless" 3dmark mentality might be circumvented a little bit.
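To make the "time to completion" idea concrete, here's a rough sketch (RenderFrame is just a stand-in for the benchmark's own scene code; nothing here is from Futuremark):

Code:
#include <windows.h>
#include <stdio.h>

// Stand-in for the benchmark's actual scene rendering.
static void RenderFrame(int /*frameIndex*/) { /* draw calls would go here */ }

// Render a fixed, predetermined sequence of frames and report wall-clock
// time, instead of averaging fps over a workload that varies with speed.
static double TimeToCompletion(int totalFrames)
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);

    for (int i = 0; i < totalFrames; ++i)
        RenderFrame(i);  // every card renders the exact same frames

    QueryPerformanceCounter(&end);
    return (double)(end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
}

int main()
{
    printf("Completed in %.2f seconds\n", TimeToCompletion(5000));
    return 0;
}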

nvidia said:
"For all intents and purposes game tests 2 and 3 are the same test. They use the same rendering paths and the same feature set. The sole difference in these tests appears to be the artwork. This fact alone raises some questions about breadth of game genres addressed by 3DMark03. --- These two tests attempt to duplicate the “Z-fi"rst†rendering style used in the upcoming first-person shooter game, “Doom 3â€. They have a “Doom-like†look, but use a bizarre rendering method that is far from Doom 3 or any other known game application."

I wish nvidia would provide the information behind this analysis. This "bizarre rendering method" could be a valid concern... or a complaint about PS 1.4 functionality and architectural failings of the GF FX with a heavy spin.

nvidia said:
"Finally, the choice of pixel shaders in game tests 2 and 3 is also odd. These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn’t support ps1.4. Conspicuously absent from these scenes, however, is any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4."

This strikes me as quite the tremendous smoke screen, with, again, a notable absence of logical consistency with the CineFX push. Also striking is the odd feeling of "shoe on the other foot"-itis in regards to ps 1.4, but I also still think it raises an important issue...

A simple question: can PS 1.3 reduce the number of passes for the techniques likely used in 3dmark, or in some other way enhance performance significantly over 1.1? I know Carmack's comments don't make me think so right now.

A not so simple question: could a HLSL compiler offer performance advantages for a "ps 1.1 compatible" shader compiled to a ps 1.3 target?

On this issue, I might end up being in complete agreement with nvidia, depending on the answers to these questions, especially the latter.
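To make the second question concrete, here's roughly the experiment I have in mind, using D3DX's HLSL compiler (the shader is a trivial made-up example, not anything from 3DMark):

Code:
#include <d3dx9.h>
#include <string.h>
#include <stdio.h>

// A trivial HLSL shader that fits comfortably in ps_1_1. The question is
// whether asking the compiler for a ps_1_3 target ever produces meaningfully
// better code than the ps_1_1 target for shaders of this kind.
static const char* g_Shader =
    "sampler Base : register(s0);\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{\n"
    "    return tex2D(Base, uv) * 0.5f;\n"
    "}\n";

static void CompileFor(const char* profile)
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(g_Shader, (UINT)strlen(g_Shader),
                                   NULL, NULL, "main", profile, 0,
                                   &code, &errors, NULL);
    printf("%s: %s, %lu bytes of shader code\n", profile,
           SUCCEEDED(hr) ? "compiled" : "failed",
           code ? (unsigned long)code->GetBufferSize() : 0UL);
    if (code)   code->Release();
    if (errors) errors->Release();
}

int main()
{
    CompileFor("ps_1_1");
    CompileFor("ps_1_3");  // does the extra target buy anything here?
    CompileFor("ps_1_4");
    return 0;
}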

nvidia said:
"This year’s 3DMark has a new nature scene (game 4). It is intended to represent the new DirectX 9.0 (DX9) applications targeted for release this year. The key issue with this game scene is that it is barely DX9.

Heh, what constitutes "barely DX 9"? I'm suspecting it is using shader 2.0 functionality instead of "2.0+". Despite all sorts of hypocrisy and spin alarm bells going off, I'd be inclined to agree if there are opportunities for "2.0+" to improve functionality significantly. I certainly had similar feelings about the prior nature scene and the various issues of scoring (which I don't think is as much of an issue this time around, as we're on the cusp of shader enabled games appearing) and failure to be optimized for higher capability (which I feel could very well still apply, as I strongly feel that this test at the very least should have been a HLSL showcase).
 
Hellbinder[CE] said:
Nvidia DAMN WELL KNEW that the dx 8.1 spec was going to be PS 1.4 and that it was a subset of the coming PS 2.0.

PS1.4 is not a subset of PS2.0. To be a subset, a legal PS1.4 program would have to assemble/compile via a 2.0 parser. C is a subset of C++: any C program can be compiled by a C++ compiler. PS1.4 cannot be compiled (assembled) by a compiler that only understands 2.0 syntax. The syntax is different.

Moreover, 1.4 can't even be said to be the "basis" for 2.0 any more than 1.1 could. All 1.4 did was slightly increase the program length by allowing two phases, and add some instructions for texture coordinate manipulation.

It was 2.0 that added the requirement for a true "general purpose" pipeline that allows arbitrary mixing of color ops and texture ops. 1.4 still kept the same old segmented shader architecture with separate texture and color op sections. It's sort of disingenuous to act like PS1.4 is almost 2.0 - as if, if you simply took a 2.0 program and shortened it, you'd end up with PS1.4 - or that 1.4 "laid the groundwork" for 2.0. To go from 1.4 to 2.0, you need to make really big changes to your pipeline. 1.4 is much closer to 1.1 than it is to 2.0.
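Just to make the syntax point concrete, here's a rough sketch of the same trivial two-texture modulate written against both targets and fed through D3DX's assembler (the shaders are made-up minimal examples):

Code:
#include <d3dx9.h>
#include <string.h>
#include <stdio.h>

// The same two-texture modulate for both targets, to show the syntax point:
// the 1.4 version binds textures implicitly by register number and leaves its
// result in r0, while 2.0 needs dcl/sampler declarations and an explicit
// write to oC0. A 2.0-only assembler rejects the 1.4 source outright.
static const char* g_ps14 =
    "ps_1_4\n"
    "texld r0, t0\n"
    "texld r1, t1\n"
    "mul r0, r0, r1\n";   // r0 is implicitly the output colour

static const char* g_ps20 =
    "ps_2_0\n"
    "dcl t0.xy\n"
    "dcl t1.xy\n"
    "dcl_2d s0\n"
    "dcl_2d s1\n"
    "texld r0, t0, s0\n"
    "texld r1, t1, s1\n"
    "mul r0, r0, r1\n"
    "mov oC0, r0\n";      // output must be written explicitly

static void Assemble(const char* name, const char* src)
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXAssembleShader(src, (UINT)strlen(src),
                                    NULL, NULL, 0, &code, &errors);
    printf("%s: %s\n", name, SUCCEEDED(hr) ? "assembles" : "rejected");
    if (code)   code->Release();
    if (errors) errors->Release();
}

int main()
{
    Assemble("ps_1_4 version", g_ps14);
    Assemble("ps_2_0 version", g_ps20);
    return 0;
}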
 