Nvidia/ATI Feature philosophy

And by the time R500 arrives, their standing with developers will still be much better than it was before R300.

I believe ATI is expecting the role reversal brought about by the Xbox change to be a fairly significant factor for the next few years.
 
I don't think it helped that much for NVidia. It turned out that most multiplatform games ended up being done for PC, XBox, PS2, and GameCube, meaning developers couldn't focus on GF3/4 alone, since they had to make sure everything would run on PS2/GC as well.

I expect role flipping to be a major factor only for developers that focus on console "exclusives".
 
I think it certainly helped in a number of cross-platform titles, with Halo and Splinter Cell being quite pertinent examples. However, not all of these things manifest overtly as a sole focus on one platform; often it's just a matter of compatibility - whatever the development platform is, what's good for that architecture tends to be what's implemented, and that can end up looking like driver "bugs" as far as the end user is concerned. This is where I would expect ATI to see a leg up.

It'll be curious to see how XNA affects development as well.
 
Just a nit, I wouldn't really call Halo "crossplatform". It wasn't simultaneously released for 2+ platforms like most "crossplatform" titles (what was it, 2+ years later?), and the differences between the PC and X-Box in terms of development are much smaller than for the other platforms, not to mention the port was done by Gearbox and significantly upgraded in its shaders over the XB version.
 
I thought MS held Halo for PC back that long to sell Xboxes...
but it doesn't seem that any of that time was actually spent on converting the game...
given all the horror stories I hear about how badly it runs for its visuals...
 
Bjorn said:
I agree and disagree :)

As an extreme example:

Ati releases R9700. Nvidia releases NV3X with only DX8 support, but 50% faster than the R9700 and with better quality FSAA.

Would it then have been wrong of fans and Ati to try and evangelize those features?

I don't think it is ever "wrong" for a company to evangelize the features of its products, regardless of how poorly those products compare with a competitor's, but with an important caveat: honesty is imperative. Dishonest evangelization is always wrong, and counterproductive, for a company, no matter how well, or how poorly, its products stack up against a competitor's.

Some example statements to illustrate:

Right: nV3x does a very good job with ps1.x, and we are writing our drivers to ensure that nVidia customers will see the benefits of our ps1.x performance in the games they prefer to play which support pixel shaders. We look forward to improving our > 1.x pixel shading capabilities in future products.

Wrong: The DX9-specific, pixel shader 2.0 specification is, we believe, in error and does not constitute the future of 3d-game development. We disagree so strongly with this direction for the API that we are resigning our membership in the 3dMk program because we think the benchmark is unrealistic in using ps2.0 as we believe there's no future in it, and that therefore the benchmark itself is a useless fiction and is of no value. We are therefore recompiling our drivers to substitute ps1.x for 2.0, as we feel this is the best option for the future of 3d gaming and our customers.

Comment: The first approach guarantees that you have nothing to apologize for later and no crow to eat down the line. It states your intentions clearly, rationally, and positively. The second approach is arrogant and presumptuous, and assumes rational people will see value in your statement simply because you make it, rather than because it can stand on its own factual merit. If you change your mind and decide to support ps2.0 later on, as nVidia is doing in earnest with nV40, you then have to counter months of your own intense anti-ps2.0, pro-1.x propaganda.

Right: The pixel pipeline organization of nV30 is 4x2. We recognize that ATi's R3x0 uses an 8x1 organization, which certainly gives them a per-clock advantage in single-texturing scenarios. However, we feel we have an advantage in multitexturing scenarios, where both nV30 and R300 effectively operate as 4x2 architectures on a per-clock basis. Our approach to nV30 manufacturing allows us to run nV30 at a higher clock than ATi can run R300, so in multitexturing scenarios we believe we have a clear performance advantage over R300, and we believe that multitexturing scenarios may well be more common than single-texturing scenarios in current 3d games.

Wrong: The pixel pipeline organization of nV30 is 8x1 (because it does 8 ops per clock.)

Comments: Had nVidia done this the "right" way as I've stated above, they could have avoided an almost inestimable amount of negative publicity over the last year, maintained their credibility as a company, and turned a liability into an asset, all without ever fibbing about "the future of 3d" or needlessly tearing down small, struggling 3d-benchmark companies in the process. They really missed it badly, there.
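
For a sense of the arithmetic behind the "right" statement, here is a rough back-of-the-envelope sketch. The 4x2 and 8x1 organizations come straight from the discussion above; the clock speeds are the shipping figures for the GeForce FX 5800 Ultra (500 MHz) and Radeon 9700 Pro (325 MHz), used purely as illustration, and the little fill_rates helper is my own:

    # Back-of-the-envelope fill-rate comparison for NV30 (4x2) vs. R300 (8x1),
    # assuming shipping core clocks: 5800 Ultra at 500 MHz, 9700 Pro at 325 MHz.
    def fill_rates(pipes, textures_per_pipe, clock_mhz):
        pixel_rate = pipes * clock_mhz                       # Mpixels/s, single-texturing
        texel_rate = pipes * textures_per_pipe * clock_mhz   # Mtexels/s, multitexturing
        return pixel_rate, texel_rate

    nv30 = fill_rates(pipes=4, textures_per_pipe=2, clock_mhz=500)
    r300 = fill_rates(pipes=8, textures_per_pipe=1, clock_mhz=325)
    print("NV30: %d Mpix/s single-texture, %d Mtex/s multitexture" % nv30)  # 2000, 4000
    print("R300: %d Mpix/s single-texture, %d Mtex/s multitexture" % r300)  # 2600, 2600

Per clock, both chips move the same eight texels when multitexturing, so the claimed advantage rests entirely on the clock difference and on how often games actually multitexture.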

Right: It is our intention to emphasize the partial-precision, fp16 mode of nV30 under DX9, and we have prevailed upon M$ to officially include support in the API for our fp16 mode. We believe that because of specifics relative to the nV30 architecture, greater precision than fp16 would be counterproductive to performance but at the same time offer no increased visual benefit to rendering in 3d games. While nV30 does offer an alternative, full-precision fp32 rendering mode, we are mindful of the fact that this mode is intended for uses other than 3d-gaming, such as semi-professional 3d rendering with programs like LightWave, because our nV30 fp32 mode just isn't designed to support 3d gaming. We do expect, however, that in future versions of nVidia gpus fp32 will be fully useful for 3d game play, and by that time we expect to see 3d games which will also support that level of precision.

Wrong: nV30 is 128-bits precise throughout the pipeline, and we can tell you right now that 96-bits is not enough.

Comment: Again, had nVidia told the truth from the start it could have avoided so very much unpleasantness brought about by its own reluctance to admit that if "96-bits is not enough," then certainly nVidia's fp16 "64-bits" would "not be enough," either...;) But as we all know, unfortunately, such was not the case.
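
For readers keeping score on the bit counts: those figures count all four color channels together, so "128-bit" is 4 x fp32, "96-bit" is 4 x fp24 (R300's pixel shader precision, and DX9's full-precision minimum), and nVidia's partial-precision "64-bit" mode is 4 x fp16. A minimal Python sketch of what dropping to fp16 costs (numpy has no fp24 type, so that middle format is omitted):

    import numpy as np

    # fp16 has a 10-bit mantissa, fp32 a 23-bit one; fp24 (R300) sits in
    # between with 16 bits. A small offset representable in fp32 simply
    # vanishes when rounded to fp16.
    x = np.float32(1.0) + np.float32(1e-4)
    print(np.float32(x))   # -> 1.0001  (offset preserved)
    print(np.float16(x))   # -> 1.0     (offset rounded away)

    # Machine epsilon per format: fp16 ~ 9.8e-4, fp32 ~ 1.2e-7
    print(np.finfo(np.float16).eps, np.finfo(np.float32).eps)

Whether that loss is visible on screen depends on the shader, but it is exactly the precision gap nVidia was arguing about from both sides of its mouth.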

As it was, nVidia was forced by unavoidable fact to own up to all of these things eventually anyway, and the mystery to me always has been how the company imagined it would suffer no penalty even if its customers, not to mention investors, perceived it as dishonest.

The thing I wanted to illustrate here is that it is possible to evangelize without lying...;) Indeed, that is the way it should be done, in my opinion, even if you think your product is short of the mark set by a competitor. When you tell the truth positively about your own products without tearing down anyone else's products or software by lying about them or your own products, you never have to apologize in the future for anything you said in the past, and you keep your most valuable asset--your credibility-- intact. Considering nVidia's 12-month propaganda war against things like ps2.0, getting decent yields out of nV40 won't be the only problem ahead for nVidia in the coming months--convincing people that *now* they are telling the truth for a change will be a hurdle of their own making they'll have to jump, and I daresay it is a high one.

The point to veracity is not just a moral one. It's very much a common-sense issue to its roots. Our financial institutions are built on a foundation of honesty and integrity, and will not function otherwise (as the examples of Enron and WorldCom illustrate so well.) If customers and investors lose their faith in the credibility of a company, then that company is not long for this world. The truth of that proposition is so evident that it still has me scratching my head in perplexity over the many statements nVidia has made within the last year. Lies are like boomerangs in that they return to their source eventually, and like chickens which come home to roost just to bite you on the butt...;) First-class PR never, ever lies, because it simply isn't necessary (and is almost inevitably counterproductive.) Lying in a sense is like modern technology itself: for every problem it solves it creates three new ones.
 
Sometimes the "right" one seems to be to long and to technical for marketing to endusers, maybe that's one reason....

Developers don't need marketing, nor will they listen to it....

Thomas
 
tb said:
Sometimes the "right" one seems to be to long and to technical for marketing to endusers, maybe that's one reason....

Developers don't need marketing, nor will they listen to it....

Thomas

Most of the examples above come out of relatively lengthy Internet interviews with nVidia personnel, in which they sometimes went to great lengths to promote some fabrication or other revolving around these points. (Remember Kirk's interview in which he explained his take on how numerology was important in 3d gpu design, and that good things only happen in pairs instead of triplets?) My position would be that if you feel explaining your products truthfully is too lengthy a process to engage in, you should say nothing at all rather than lie about them. Lying for any sake, including the sake of brevity, is counterproductive, imo.

As to developers, are they incapable of spotting bogus technical information a company promotes to consumers and investors? I would think that, if anything, they would be even quicker to catch it than the general public. To think that they would be unimpressed or unmoved by such tactics is the same as calling them sub-human, imo. I would think developers have no less disdain for technical fiction cloaked as fact than anyone else. The valuable lesson here is that when a company makes a public statement, that statement is seen by consumers, investors, and developers alike.
 