Nvidia Against 3D Mark 2003

The RISC approach of PS 1.4 by itself distinguishes it as being much closer to PS 2.0 than anything else. Granted, PS 1.4 is otherwise not much more than a PS 1.3 times two. Either way, it was more of a step in the right direction than PS 1.3...

To Demalion: PS 1.3 offers 2 additional arithmetic instructions and 4 additional texture addressing instructions over PS 1.1. That's the only difference. This could result in one or two fewer instructions being used per pass, but the number of passes required would remain unchanged.
 
I'm more interested in which sites will parrot Nvidia's stance and, after years and years of heavy 3DMark usage, suddenly stop using it. I'm sorry, and I don't mean to offend anyone, but [H]'s conclusion sounds like it was taken right off a memo/PDF from Nvidia.

Well, there's been quite a bit of discontent with 3dmark over the years, and with every release there's some controversy and a few big sites condemn it. Now, cutting Kyle a bit of slack, he was probably already discontented with 3dmark; top that with perhaps a phone call from a slickster at Nvidia detailing some 'biases', and the result isn't surprising.
The funny thing, though, is that 3dmark has a life of its own: anyone who doesn't use it inevitably gets barraged with emails asking for it, so they all cave sooner or later and include it. :D
Those 3dmark guys amaze me with how strongly they engage the community; they're talented programmers.
 
depth_test said:
Hellbinder[CE] said:
Nvidia DAMN WELL KNEW that the DX 8.1 spec was going to be PS 1.4 and that it was a subset of the coming PS 2.0.

PS1.4 is not a subset of PS2.0. To be a subset, a legal PS1.4 program would have to assemble/compile via a 2.0 parser. C is a subset of C++: any C program can be compiled by a C++ compiler. PS1.4 cannot be compiled (assembled) by a compiler that only understands 2.0 syntax. The syntax is different.
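To illustrate (a rough sketch from memory, so don't hold me to the exact assembler listing), the same "sample texture 0 at texcoord set 0" is written completely differently in the two versions, and a 2.0-only assembler would choke on the 1.4 form:

```
; ps.1.4 -- no sampler operand; the destination register number
; implicitly selects which texture stage gets sampled
ps.1.4
texld r0, t0

; ps_2_0 -- inputs and samplers must be declared up front, and the
; sampler is an explicit third operand
ps_2_0
dcl t0.xy
dcl_2d s0
texld r0, t0, s0
mov oC0, r0
```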

Hey, I'm not making Hellbinder's statements, remember that when you reply to me...but your reply gives me the opportunity to ask some questions that came to mind:

Would you agree that ps 1.4 functionality is a significant subset of ps 2.0 functionality? I'd think it would be more applicable to consider the basic functionality rather than draw a parallel to a high level language, wouldn't it?

Moreover, 1.4 can't even be said to be the "basis" for 2.0 any more than 1.1 could. All 1.4 did was slightly increase the program length by allowing two phases, and add some instructions for texture coordinate manipulation.

I get the impression that this has a significant impact on the functionality it offers, though...

It was 2.0 that added the requirement for a true "general purpose" pipeline that allows arbitrary mixing of color ops and texture ops. 1.4 still kept the same old segmented shader architecture, with separate texture and color op sections.

But, with two phases... I thought the second phase allowed some of the benefits of intermixing? If so, wouldn't this place it rather closer to ps 2.0? If this isn't so, what is the significance of the two phases?

It's sort of disingenuous to act like PS1.4 is almost 2.0, as if you could simply take a 2.0 program, shorten it, and end up with PS1.4, or that 1.4 "laid the groundwork" for 2.0.

Hmm, I do think implementing ps 1.4 functionality lays the groundwork for implementing ps 2.0 from a design standpoint... I'm sure you'll correct me if you think otherwise. :p

To go from 1.4 to 2.0, you need to make really big changes to your pipeline. 1.4 is much closer to 1.1 than it is to 2.0.

Which is more significant, the changes necessary to go from ps 1.1 -> ps 1.4, or from ps 1.4 -> ps 2.0? What constitutes "really big changes" when discussing this, and on what are you basing your evaluation?
 
To be pedantic: C is not a clean subset of C++ these days, especially with regard to conversions of pointer types from void * (the implicit conversion is always allowed in C, but requires a cast in C++) and the complex number types introduced in C99.

As for PS1.4 vs PS2.0: AFAIK, for every possible PS1.4 program it is possible to generate a functionally equivalent PS2.0 program, although obviously not vice versa. This would imply that PS2.0 functionality is a proper superset of PS1.4 functionality - make of that what you want...
 
arjan de lumens said:
As for PS1.4 vs PS2.0: AFAIK, for every possible PS1.4 program it is possible to generate a functionally equivalent PS2.0 program, although obviously not vice versa. This would imply that PS2.0 functionality is a proper superset of PS1.4 functionality - make of that what you want...

But by that argument, any PS1.0, or even DirectX 1.0 "shader", has a functionally equivalent 2.0 program. It's not really relevant. We know that many shaders can be expressed in a 1.1, 1.4, or 2.0 pipeline, with or without multipass, too.

All 1.4 did was allow two shader programs to effectively be concatenated together via a phase marker, so that you could perform one layer of dependent texturing. The other thing it did was add an instruction to separate texture address manipulation from loading; before 1.4, it was "implicit" in the instructions. 1.4 still feels as "hackish" and constrained as the 1.1 pipeline, with its arbitrary limitations on things. Much like programming 80x86 vs 68000.
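Roughly, a 1.4 shader with its one layer of dependent texturing looks like this (a sketch from memory, so the exact syntax may be slightly off):

```
ps.1.4
; phase 1: its own budget of texture addressing + arithmetic instructions
texld  r0, t0              ; register number implicitly picks the texture stage
texcrd r1.rgb, t1          ; read texcoord set 1 as plain data, no sampling
mad    r2.rgb, r0, c0, r1  ; compute new texture coordinates

phase                      ; the marker that "concatenates" the second program

; phase 2: a second instruction budget; to use the shader's full
; instruction limit you must cross this marker
texld  r2, r2              ; the one dependent read: sample at the coords
                           ; computed in phase 1
mul    r0, r0, r2          ; r0 is the output register in ps 1.x
```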


Now look at 2.0: no phase marker hacks. Completely general intermixing of texturing and color operations in any order (except a limitation that dependencies may only be 3rd order or below). 2.0 separates texture coordinate registers from texture sampler registers so that a register containing a texture address is not "implicitly bound" to a particular texture to be sampled.

Yes, de-coupling those tex* instructions into separate load and copy instructions was an improvement over 1.1, but I think it is very disingenuous to say that this is the "basis" for 2.0. 2.0 removes what was one of the most irritating things about 1.0-1.4, the severe limitations on the order of operations in the shader, and it also decouples registers from samplers, while extending the pipeline to floating point.
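For contrast, the same idea in 2.0 (again a from-memory sketch, register and sampler choices purely illustrative): no phase marker, texture and color ops freely interleaved, and samplers addressed explicitly rather than implicitly bound:

```
ps_2_0
dcl t0.xy               ; texture coordinates are just declared inputs
dcl t1.xy
dcl_2d s0               ; samplers are separate, explicitly declared objects
dcl_2d s1
texld r0, t0, s0
mad   r1.xy, r0, c0, t1 ; a color op between texture ops -- perfectly legal
texld r2, r1, s1        ; dependent read, no phase marker required
mul   r0, r0, r2        ; floating point precision throughout
mov   oC0, r0
```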
 
Don't know if this was already posted and I missed it... here is Nvidia's detailed response...
"3DMark03 combines custom artwork with a custom rendering engine that creates a set of demo scenes that, while pretty, have very little to do with actual games. It is much better termed a demo than a benchmark. The examples included in this report illustrate that 3DMark03 does not represent games, can never be used as a stand-in for games, and should not be used as a gamers’ benchmark."

NVIDIA:
"Unfortunately, Futuremark chose a flight simulation scene for this test (game 1). This genre of games is not only a small fraction of the game market (approximately 1%), but utilizes a simplistic rendering style common to this genre. Further, the specific scene chosen is a high altitude flight simulation, which is indicative of only a small fraction of that 1%."

"For all intents and purposes game tests 2 and 3 are the same test. They use the same rendering paths and the same feature set. The sole difference in these tests appears to be the artwork. This fact alone raises some questions about breadth of game genres addressed by 3DMark03. --- These two tests attempt to duplicate the “Z-first†rendering style used in the upcoming first-person shooter game, “Doom 3â€. They have a “Doom-like†look, but use a bizarre rendering method that is far from Doom 3 or any other known game application."

"Finally, the choice of pixel shaders in game tests 2 and 3 is also odd. These tests use ps1.4 for all the pixel shaders in the scenes. Fallback versions of the pixel shaders are provided in ps1.1 for hardware that doesn’t support ps1.4. Conspicuously absent from these scenes, however, is any ps1.3 pixel shaders. Current DirectX 8.0 (DX8) games, such as Tiger Woods and Unreal Tournament 2003, all use ps1.1 and ps1.3 pixel shaders. Few, if any, are using ps1.4."

"This year’s 3DMark has a new nature scene (game 4). It is intended to represent the new DirectX 9.0 (DX9) applications targeted for release this year. The key issue with this game scene is that it is barely DX9."

NVIDIA:
"So, where do you find a true gamers’ benchmark? How about running actual games? Most popular games include a benchmark mode for just this purpose. Doom3, Unreal Tournament 2003, and Serious Sam Second Encounter are all far better indicators of current and upcoming game performance."
 
I wouldn't mind seeing pictures from the new and old nvidia drivers with the FX, comparing both of them to the 9700 Pro to see if there are any big differences in quality. Oh, I forget who said that the GeForce 4 Ti was a refresh of the GeForce 3. It was not; the GeForce 3 Ti was the refresh. The GeForce 4 added hardware for better FSAA and was clocked even higher than the GeForce 3 Tis. While adding in the FSAA hardware they could have updated the pixel pipelines.
 
depth_test said:
All 1.4 did was allow two shader programs to effectively be concatenated together via a phase marker, so that you could perform one layer of dependent texturing. The other thing it did was add an instruction to separate texture address manipulation from loading; before 1.4, it was "implicit" in the instructions. 1.4 still feels as "hackish" and constrained as the 1.1 pipeline, with its arbitrary limitations on things. Much like programming 80x86 vs 68000.
You make this benefit of 1.4 shaders sound like it is of little consequence, even though it saves you from needing an extra rendering pass and processing twice as many polygons. Given the simplicity of the pixel shaders that most games are probably going to use in the near future for performance reasons (a couple of dozen instructions at most, I'd guess), this should have a major impact on relative performance vs. 1.1 shaders. In fact, this is exactly what the 3DMark03 tests seem to be showing so far.

Sure, 2.0 shaders give you a lot more flexibility when it comes to more complex techniques, but given what a 100+ instruction shader will do to the fill rate of any existing card, I can't see many game developers taking advantage of that flexibility until much faster graphics chips arrive on the market.
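To put rough numbers on that (purely illustrative figures, not any particular card): an 8-pipeline part at 500 MHz retiring about one shader instruction per pipe per clock would manage

$$\frac{500\ \text{MHz}\times 8\ \text{pipes}}{100\ \text{instructions}} = 40\ \text{Mpixels/s},$$

while 1024x768 at 60 fps already wants about 47 Mpixels/s before any overdraw.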
 
What depth_test appears to be saying is that PS 1.4 was nothing revolutionary in capabilities over previous versions -- especially when compared to the difference between PS 2.0 and 1.4. I have to agree. The ability to do a dependent texture read is really quite limited as compared to PS 2.0, where you can have several levels of dependency spread throughout the shader program. Also, in order to utilize the maximum number of instructions in a PS 1.4 program, the second phase -must- be used, even if one doesn't need a dependent texture read. Each phase is really just like a single pixel shader program from a previous version.
 
From my talks with NVIDIA recently, I was told they didn't feel they should have to pay to be a BETA partner, nor should anybody else. After all, ATI, NVIDIA, Matrox, S3, etc. are Futuremark's bread and butter when it comes to the 3DMark benchmarks. So with NVIDIA not paying, Futuremark possibly decided to go ahead without NVIDIA's input/advice.

I'll refrain from commenting on 3DMark03 personally until I get some more details and figure out what's up with the benchmark. So far, I don't like that it doesn't even use a real game engine like the previous one did. :?
 
No one's going to discount 3DMark03 just because Nvidia decides to "poo-poo" it.

But we "must" pay attention to the game benchies that matter, ie the ones where Nvidia gets its way with developers.

Perhaps they should spend more time on developing good products on time and less time whinging about software that doesn't bother with PS/VS 2.0+.

Oh, but that's being ignored, just like PS1.4, isn't it?

:)
 
I completely disagree. I don't think Nvidia has even one single valid point. 3DMark 2003 is a completely legit, fair benchmark.

I don't see how you can possibly disagree and still hope to maintain any shred of objectivity.

One example point from NVIDIA's original quote on this thread-
----"Nvidia contends that the first test is an unrealistically simple scene that's primarily single-textured"---

This is a good point. A benchmark that is single-textured may obviously showcase a 6x1 or similar architecture, whereas a competing 4x2 or similar product would excel at multitextured tests, IF AND ONLY IF texturing has some form of major influence on the overall benchmark results.
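The arithmetic behind that (idealized, equal clocks, one texture per TMU per clock): with single texturing a 6x1 design lays down 6 pixels per clock while a 4x2 manages only 4, half its TMUs sitting idle; with dual texturing the 4x2 still delivers 4 pixels per clock while the 6x1 has to loop back and drops to 3:

$$\text{1 texture: } 6{\times}1 \to 6\ \text{pix/clk},\ 4{\times}2 \to 4;\qquad \text{2 textures: } 6{\times}1 \to 3,\ 4{\times}2 \to 4.$$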

I think if you took all the hundreds of games released in the past year, along with the plethora of games due to be released in the future, you could probably count the single-textured ones on one hand.

As a synthetic, single-textured DX7 test, this is all fine and good... but if the benchmark is being touted as some form of indicator of true "game" performance, then it is obviously not very well suited to that goal.

---"the stencil shadows in the second and third tests are rendered using an inefficient method that's extremely bottlenecked at the vertex engine"---

This one has to be taken on the "honor" system; whether they indeed have a case is unknown unless the source code is released. NVIDIA is stating they feel the method used for rendering stencil shadows varies greatly from the methods used by any current or future games, and in such a way as to create bottlenecks that wouldn't otherwise be present in the normal game coding process. Can't make a determination here, but the performance numbers will definitely be hashed out in the weeks to come to help see whether this is a valid claim.

On the more detailed information from NVIDIA now posted on this thread: yeah, some of the points are utter nonsense. A flight-sim benchmark is a completely valid "game" test from the standpoint that IL-2, FS2002, Combat Flight Sim 3, and dozens of other flight sim games are available. From the standpoint of "genre" they are making an empty claim; it is foolish if NVIDIA wishes to dictate that owners of 3D cards shouldn't pick a particular genre of video games in order to be "appropriate" for their hardware purchase. Other more detailed claims are similarly foolish/PR-laden... but their original query was fairly well formed, albeit very late for them to finally go into such discussion, especially since the focus of their points has been the case since 3DMark99.
 
From my talks with NVIDIA recently, I was told they didn't feel they should have to pay to be a BETA partner

I would have to agree with them on this point. I got to thinking about this the other day after reading worm's interview that was linked on these forums. Reading that interview was the first time I realized you have to pay a "membership fee" to be in the beta program. Sounds quite a bit like extortion to me. So if brand-x video card company doesn't want to pay the membership fee, they are basically left out in the cold. Sounds like a very bad idea.

As for the rest of Nvidia's complaints, I feel they are crying over nothing. They certainly weren't complaining when they had the only video card that could run the Nature test in 3DMark2001. Didn't hear any complaints of "no games are using this yet" then.
 
That first test problem is Nvidia's fault...

Had they not released the GeForce 4 MX with its nice DirectX 7 capabilities, the situation might not be as bad.

Imagine this: if not for the GeForce 4 MX, I bet the DX7 test would have been dropped. Imagine all the non-technical computer users running 3dmark03 only to find out that a GeForce 3 runs it better.

It might have even been two DX8.1 tests and two DX9 tests rather than the current setup.

And if you look at Nvidia's current roadmap, they will continue selling DX7 GeForce 4 MX 8Xs throughout 2003. :(

They are holding back the train that ATi has taken control of.
 
jjayb said:
As for the rest of Nvidia's complaints, I feel they are crying over nothing. They certainly weren't complaining when they had the only video card that could run the Nature test in 3DMark2001. Didn't hear any complaints of "no games are using this yet" then.

If you are running a large company, would you dig your company's grave by putting down your own products? If it pays your bills, plus tens of thousands of other employees' bills, and keeps everyone fed, I don't think you'll complain that something isn't fair when it benefits your company. :p

Sad fact, but true.
 
Thanks Ostsol, you put it better than I could. I'm not saying 1.4 isn't useful. I'm quibbling with the idea that 1.4 and 2.0 share the same architecture: they don't. 2.0 is no more "based on" 1.4 than it is on 1.1. The most significant changes are the removal of most of the restrictions on operation order, the increase in shader length to respectable limits, and the addition of floating point precision. These are orthogonal, general purpose, architectural changes, whereas 1.1, 1.2, 1.3, and 1.4 merely "tacked on" new instructions piecemeal, each with their own bizarre restrictions on use and order.


There appears to be some attempt to treat 1.4 as "on par" with 2.0, as if 1.4 was a huge advancement over 1.1 and served as the foundation for 2.0. This makes 1.4-capable DX8 cards sound like they have very advanced shader hardware compared to their 1.1 cousins, and as though 2.0 is merely a subtle evolution of 1.4. In reality, 1.4 is a subtle evolution of 1.1, and 2.0 is a significant general purpose extension to 1.x shaders.

The only big limitation in 2.0, besides program length, is the fourth-order lookup cap. GFFX doesn't have this limitation, but in practice it's irrelevant, since there are very few cases where you need more than 4 chained indirect lookups.
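i.e. the cap is on chains like this (a sketch, as I understand the ps_2_0 dependency rule), where each read's coordinates come from the previous read:

```
ps_2_0
dcl t0.xy
dcl_2d s0
dcl_2d s1
dcl_2d s2
dcl_2d s3
texld r0, t0, s0   ; level 1
texld r1, r0, s1   ; level 2: coords depend on a previous read
texld r2, r1, s2   ; level 3
texld r3, r2, s3   ; level 4: the end of the allowed chain in ps_2_0
mov oC0, r3
```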
 
Here are a couple more. Especially notice the second one. I anxiously await what you all have to say about that...
"The reason that we're not all gung ho about it is that (3DMark'03) is not representative of (actual) games, nor is it a good benchmark," said Tony Tamasi, senior director of desktop product management at Nvidia. "That means Nvidia has to expend effort to make sure it runs well on our hardware. All that energy that we spend doesn't benefit the user. None. Zero. All that effort doesn't go to benefit any game, either. That's kind of depressing."
Oh, another thing, about the last version: how many games have you seen with a woman riding a dragon and torching a bunch of little guys??? Or how many games have you seen with a truck driving through a wasteland getting shot at by a giant robot with missiles???

This is complete two-faced hypocrisy, worse than from ANY company I have EVER seen before.
Specifically, Tamasi said he objected that Futuremark apparently chose to emphasize single-textured pixels in the benchmark's four tests, while previous versions had pushed multitexturing. Tamasi also criticized FutureMark's use of older version pixel and vertex shaders, and the benchmark's heavy emphasis on running and rerunning vertex shader operations -- 36 times, by his count.

Um, I thought the whole point of future games, as proposed by ATi and Nvidia, the DX9 spec, and just about everyone here, was single-textured games with lots and lots of shaders...

Further, 3dmark 2001 did not have every aspect support DX8.1 either; the limited vertex tests use 1.1 and that's it. Nearly the whole of the 3dmark03 benchmark is solid vertex and pixel shaders: THE EXACT direction game development is going. No one is going to use all-DX9 shaders over the next 2 years. It's going to be mixed, EXACTLY like you are seeing in the new 3dmark.

Complete, utter deception and hypocrisy...
 
depth_test said:
Thanks Ostsol, you put it better than I could. I'm not saying 1.4 isn't useful. I'm quibbling with the idea that 1.4 and 2.0 share the same architecture: they don't. 2.0 is no more "based on" 1.4 than it is on 1.1. The most significant changes are the removal of most of the restrictions on operation order, the increase in shader length to respectable limits, and the addition of floating point precision. These are orthogonal, general purpose, architectural changes, whereas 1.1, 1.2, 1.3, and 1.4 merely "tacked on" new instructions piecemeal, each with their own bizarre restrictions on use and order.

There appears to be some attempt to treat 1.4 as "on par" with 2.0, as if 1.4 was a huge advancement over 1.1 and served as the foundation for 2.0. This makes 1.4-capable DX8 cards sound like they have very advanced shader hardware compared to their 1.1 cousins, and as though 2.0 is merely a subtle evolution of 1.4. In reality, 1.4 is a subtle evolution of 1.1, and 2.0 is a significant general purpose extension to 1.x shaders.

No, what there seems to be from people like you is an attempt to OVERSTATE the importance and impact of PS 1.3 and UNDERSTATE the importance of PS 1.4. There is a sizable, night-and-day difference between what each does for you. Stop trying to spin this into a case of equals. They are not equals. There are no comparisons that can be made AT ALL between the results each one generates in your program.

To deny the fact that ps 1.4 was a stepping stone towards ps 2.0 is nothing but utter denial. Starting multi-paragraph dissections of minute terms to change the nature of the argument will not change the facts of the end result.

PS 1.4 IS A HUGE STEP OVER 1.1, and 1.4 cards ARE more advanced than their PS 1.1 cousins.

Otherwise you and your coworkers at Nvidia would not be bitching your asses off. You can't have it both ways. Either it matters or it doesn't. You people at Nvidia cannot have it both ways.
 
RussSchultz said:
I think (personally) that the tests should use HLSL and let the best man win. That would give each vendor the ability to use their card to the best of their abilities.

That would be the best way to do it, but...

RussSchultz said:
Of course, if the HLSL can't be reduced to work on 1.1, what to do...

... exactly. :arrow: We will have this 'problem' until we are talking DX9 and up.

Anyway: I obviously don't have the insight into the workings of the engine that nVidia has, so I can't comment on whether it is based on unrealistic game engine programming or not.

But I don't quite get why nVidia focuses so much on PS 1.4 vs PS 1.1. A GF4 is still faster than or on par with ATI's 8500/9000, so they should not lose market share because of that.

The major point is that both ATI's and nVidia's DX8-level cards will look a bit pale in this benchmark, while their DX9-level cards can show off their advantage. And isn't it the goal of both companies to convince consumers to upgrade, upgrade, upgrade?

I really didn't expect this move from nVidia.
 
1 DX7 (Even in a couple of years there will still probably be some DX7-type games hanging around, and it's important to at least verify that the card makes a reasonable effort to perform well with them).

Maybe I'm a bit late here, but anyway: I don't see any reason at all to include a DX7 game test. Sure, there will be some DX7-type games hanging around, but which new and upcoming graphics card do you think will have any problems running them?

DX7 is old news, and all new and upcoming graphics cards will have no problem at all running those types of games, so I would say: remove that test and add another pure DX9 test. I also agree with the people here saying they should use HLSL instead.
 