MS, ATI, NVidia, DX .....

RussSchultz said:
Simon F said:
CorwinB said:
3) They chose (from the reliable person who did the post we are discussing) not to pay MS money for DirectX 9, and as a result the specs from DX9 were skewed toward the R3xx (bad ATI !).
Money for DX9? :? To Microsoft? :? :rolleyes: I don't think so.
I think Corwin got confused. The money in question, according to the original poster, related to the squabble over how much the Xbox chips would cost. (IIRC, there was something about Microsoft wanting to renegotiate the deal to reduce the cost of the equipment.)

Presumably, according to the original poster, Microsoft punished NVIDIA for not dropping their price on the Xbox chips.

I think CorwinB was sarcastic ;)
 
Personally, I would suggest that this may have something to do with the relationship with MS.

NVIDIA Licenses Breakthrough 3D Technology to Microsoft

Strategic Agreement Enables Key Innovations and Advanced Features in Microsoft DirectX 8.0


SANTA CLARA, Calif.--(BUSINESS WIRE)--Nov. 28, 2000--In a continuing effort to drive the evolution of 3D graphics technology, NVIDIA® Corporation (Nasdaq: NVDA) today announced it developed and licensed enabling 3D features to Microsoft Corp. for their new DirectX 8.0 3D application program interface (API). NVIDIA's contributions were in the areas of programmable vertex shaders, programmable pixel shaders, and rect/tri-patch support for high-order surfaces. These features allow software content creators to use more elaborate artwork and flexible 3D rendering techniques in their applications and games.
 
DaveBaumann said:
Personally, I would suggest that this may have something to do with the relationship with MS.

So you're saying that Nvidia wanted licensing money from Microsoft for DX9, and because MS wouldn't bow to Nvidia's blackmail, Nvidia walked off to be left with useless, non-DX9 tech they had hoped to get into DX9, and MS went off to play ball with ATI. :oops:

See, Nvidia *IS* the devil! :LOL:
 
I have no idea about all this craziness, but I do know one thing: businesses like money. If MS could make money by charging huge fees for DX9 use, they would; OpenGL keeps this from happening, and I am sad more developers don't use OpenGL for this reason. There is also no doubt in my mind that MS loves the idea of punishing Nvidia, because NV was probably getting a little too big for their britches in MS's mind. But considering all the court issues MS has had, they most likely would not overtly decide to screw NV because of the risk it would pose. While NV has been stupid, I liked the fact that they were standing up to MS a bit, because we need some kind of faction that can stand up to MS and there are not many.

No, I don't think MS is the devil; I just like competing interests and no one dominating all the others.
 
Sxotty said:
No, I don't think MS is the devil; I just like competing interests and no one dominating all the others.

That kind of applies to Nvidia too, though. DX has advanced a long way over the last few years with MS in charge. If one IHV were in charge instead, especially one like Nvidia that has the *stated aim* of dominating all display devices on the planet and seems not to have much in the way of scruples, we wouldn't be seeing any other chip supplier in the picture at all.

So while I don't think MS is a wise and magnanimous company by a long shot, I think the DX API standard is better in their hands than being controlled by Nvidia. MS has the aim of controlling the PC graphics API and making sure that is what everyone uses. Nvidia also wants to control the PC graphics API, but in order to ensure that everyone uses Nvidia products.
 
What a luxury it must be for companies to be able to blame Microsoft for every single instance of incompetence and poor judgement they've displayed over the last several years...:) I wish I could get away with saying to people: "Yes, I know there's a flaw in our product that I didn't tell you about before you bought it. But you see, it's not our fault, it's Microsoft's fault, because there's *something in their code which is done wrong* (insert various technobabble explanation here) and until Microsoft fixes it, I'm afraid we can't fix it, either. I suppose that at some point in the future Microsoft will get around to patching the OS--did you know they've already patched Windows XP over 40 times this year alone? You didn't? Well, you can see the point I'm talking about, then. In the meantime, let me suggest this workaround for you which will have to suffice until Microsoft gets around to delivering a patch that will allow us to fix this bug in our software..." Must be nice...

I also find the "divide and conquer" sentiment amusing...;) What I have observed over the last decade is many companies doing everything within their power, up to and including lobbying on Capitol Hill to the tune of literally millions of dollars a year going to paid lobbyists, to divide and conquer Microsoft. The DOJ anti-trust trial was the culmination of years of this kind of political lobbying on the part of Sun and Netscape. I see greed-stricken lawyers salivating while they file the most irresponsible, frivolous "class-action" suits against Microsoft imaginable--the only class of citizen benefiting, of course, being the lawyers--who always act with the "public good" in mind, of course. I see state Attorneys General seeking to "cash in" on the dividing and conquering of Microsoft, mainly so that they can make a name for themselves locally, just as some of them did with the tobacco industry settlements--even though that money was prohibited from going to smokers and ex-smokers, the very people on whose behalf the suits were supposedly launched in the first place. Some states, it would appear, see Microsoft as a new, unofficial source of tax revenue. Clearly, the most obvious and visible "divide and conquer" target is Microsoft, as opposed to the other way around, as far as things look to me.

Even now, in silly and brain-dead conspiracy theories as promulgated at the start of this thread, the idea is not that nVidia simply designed an inferior chip and has been honestly beaten by a competitor--but rather that Microsoft made them do it, according to an arcane formula only understood by people who subscribe to the conspiracy. What these people are actually saying, apparently without realizing it, is that nVidia didn't design nV3x--Microsoft did. I think that's pretty funny...:)

Edit: It's about as funny as the idea that ATi did not design R3x0, but Microsoft did. Seems you'd have to believe one or the other in order to swallow the theory.
 
I think a lot of these conspiracy theories are born because people think R300 came out of nowhere, with performance way above what anyone expected. The thing is that it's not as unexpected as it might seem. The big problem is that one year after the GF3 was released, NVidia just slightly tweaked it and released the GF4, and then it took them a whole year more to get NV30 out there, whereas ATI put out a product that took the R200 technology to the next level in a bit less than a year. Heck, R300 was planned to be out even earlier, so R200 and R300 would be a "one-two punch", but I'm glad ATI took their time and got it right.

Whose pattern is unpredictable? ATI has put out a new offering every year for quite a while. Plot ATI's performance (let's say quake3) vs. time and you get a fairly steady exponential. NVidia is the one who screwed up and dropped the ball. They were the ones that didn't do what they should have. Once I saw what GF4 brought to the table I knew it was ATI's time to shine.

As for DX9, ATI's performance is not much of a mystery at all. R200 was a good foundation to design DX9 hardware from, although there were some performance bugs to work out. PS 2.0's instructions follow quite logically from PS 1.4, but not so much from PS 1.1, which is really just a better way to write multitexturing with a few extra features. The original Radeon had good dependent texturing performance (EMBM), close to the GF3 and the 8500 despite having only half the pixel pipes. R300 corrected this R200 flaw and doubled the pipelines to boot, so the 3-4x shader performance increase makes sense.
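
To make the dependent texturing point concrete, here's a rough CPU-side sketch (purely an illustration, not real shader or driver code; the fetch helper, texture sizes and contents are all invented) of what an EMBM-style dependent texture read does: the result of one fetch feeds the coordinates of the next, which is the operation whose throughput is being compared across chips above.

```cpp
// Minimal sketch of a dependent texture read (EMBM-style), modelled on the CPU.
// All names, sizes and texture contents are made up for illustration only.
#include <cstdio>
#include <vector>

struct Texel { float r, g, b; };

// Nearest-neighbour fetch from a square size x size texture, with wrap addressing.
static Texel fetch(const std::vector<Texel>& tex, int size, float u, float v) {
    int x = static_cast<int>(u * size) % size;
    int y = static_cast<int>(v * size) % size;
    if (x < 0) x += size;
    if (y < 0) y += size;
    return tex[y * size + x];
}

int main() {
    const int N = 4;
    std::vector<Texel> bumpMap(N * N, Texel{0.10f, -0.05f, 0.0f}); // du/dv perturbations
    std::vector<Texel> envMap(N * N, Texel{0.20f, 0.40f, 0.80f});  // environment colour

    float u = 0.5f, v = 0.5f;               // interpolated texture coordinate
    Texel offset = fetch(bumpMap, N, u, v); // first fetch
    // Dependent read: the second fetch's address depends on the first fetch's result.
    Texel colour = fetch(envMap, N, u + offset.r, v + offset.g);

    std::printf("EMBM-style result: %.2f %.2f %.2f\n", colour.r, colour.g, colour.b);
    return 0;
}
```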

In the end, you can't justify saying ATI had a pact with MS because of their performance, because there is nothing that spectacular about it when NVidia isn't in the picture. NVidia just plain made bad decisions all round. I don't see what is unpredictable about DX9 except maybe 24-bit precision, which is only a small part of NVidia's performance woes.
 
Looking back at the 3-5x quote...
I'm seriously beginning to think we all, me included, were idiots looking at it. Could be wrong, of course, but...

http://www.hardocp.com/image.html?image=MTA2MzI2MzQ4OHMwTVVnQlIybHZfM180X2wuZ2lm

The exact quote, thus, is:
5x as much time optimizing NV3X path as we've spent optimizing generic DX9 path

Let me requote that...

5x as much time optimizing NV3X path as we've spent optimizing generic DX9 path

I think what's important here is how this is put in context with the other things Valve said. About how there weren't ATI optimizations, for example.
So that means, if Gabe's true to his word, no optimizations were made with Vec3+Scalar in mind. Just general things.

And seriously... I don't see how 5x is so impressive then. Sure, it IS bad. But it was to be expected. The number is IMO vastly exaggerated by the fact that there were no specific optimizations for anyone but NV, so it's compared to generic optimizations, which are generally easier to implement IMO.

It's also questionable whether the time spent understanding the hardware is included in that 5x number. We all know how much time is required to figure out how in the world this crap works, even if we got it explained to us... It ain't easy stuff, for sure! (Okay, it's not amazingly hard either, but eh :p)


So, I think even though the number IS impressive and does show the gravity of the situation, it is leaned on far more than its real meaning warrants, and that meaning is exaggerated for at least a reason or two.

And no, I'm not trying to defend NV; these happenings are most ridiculous and they certainly ain't innocent. I'm just trying to see whether that quote was taken out of context - and I might be wrong, and Valve might have done a lot of generic optimizations, eh...


Uttar
 
Walt, you do a very poor job of understanding what was written, or a very good job of making straw man arguments.

What these people are actually saying, apparently without realizing it, is that nVidia didn't design nV3x--Microsoft did. I think that's pretty funny...

I have no idea where you pulled that one from, but it wasn't from anything the original poster wrote.
 
Uttar said:
Looking back at the 3-5x quote...
I'm seriously beginning to think we all, me included, were idiots looking at it. Could be wrong, of course, but...

http://www.hardocp.com/image.html?image=MTA2MzI2MzQ4OHMwTVVnQlIybHZfM180X2wuZ2lm

The exact quote, thus, is:
5x as much time optimizing NV3X path as we've spent optimizing generic DX9 path

Let me requote that...

5x as much time optimizing NV3X path as we've spent optimizing generic DX9 path

I think what's important here is how this is put in context with the other things Valve said. About how there weren't ATI optimizations, for example.
So that means, if Gabe's true to his word, no optimizations were made with Vec3+Scalar in mind. Just general things.

And seriously... I don't see how 5x is so impressive then. Sure, it IS bad. But it was to be expected. The number is IMO vastly exaggerated by the fact that there were no specific optimizations for anyone but NV, so it's compared to generic optimizations, which are generally easier to implement IMO.

It's also questionable whether the time spent understanding the hardware is included in that 5x number. We all know how much time is required to figure out how in the world this crap works, even if we got it explained to us... It ain't easy stuff, for sure! (Okay, it's not amazingly hard either, but eh :p)


So, I think even though the number IS impressive and does show the gravity of the situation, it is leaned on far more than its real meaning warrants, and that meaning is exaggerated for at least a reason or two.

And no, I'm not trying to defend NV; these happenings are most ridiculous and they certainly ain't innocent. I'm just trying to see whether that quote was taken out of context - and I might be wrong, and Valve might have done a lot of generic optimizations, eh...


Uttar

"Duh" is my response. You should have figured that out right from the start. It was pretty obvious, you know.
 
nonamer said:
"Duh" is my response. You should have figured that out right from the start. It was pretty obvious, you know.

Eh :p
Actually, I questioned the thing from the start. But I didn't follow it all much - heck, maybe I'm restating what has been said fifty times here, I don't know.
But looking at how a lot of people use the quote to bash NV, it does seem to me that even if it was obvious to you, it wasn't obvious to everyone.


Uttar
 
Uttar said:
Actually, I questioned the thing from the start. But I didn't follow it all much - heck, maybe I'm restating what has been said fifty times here, I don't know.
But looking at how a lot of people use the quote to bash NV, it does seem to me that even if it was obvious to you, it wasn't obvious to everyone.

I'm scratching my head a little on this Uttar.

It's quite obvious that the "context" in which the statement was used was Valve's displeasure at the level of optimizing effort required and the return that effort got them.

The number "5x" isn't particularly important, other than to illustrate Valve's overall message.
 
Isn't that the whole point, Uttar? Developers spend millions creating new games, and for them to spend 5X more time on optimization work for a specific IHV because of its maverick architecture is rather absurd, considering it need not have been that way in the first place.
 
Sabastian said:
Isn't that the whole point, Uttar? Developers spend millions creating new games, and for them to spend 5X more time on optimization work for a specific IHV because of its maverick architecture is rather absurd, considering it need not have been that way in the first place.

And of course Valve would not have mentioned this "5x more effort for Nvidia" were it not for the fact that even after all that extra attention, the NV35 is still far, far behind the competition, and (most likely) with some reduced image quality. Remember that it was only the 5900U that actually improved using partial precision. All the other, lesser Nvidia cards showed virtually no difference at all - their framerates stayed in the toilet.

The real kicker is that if Valve had put in all that extra effort and got some kind of parity with the DX9 path, they would have grumbled, but not been so pissed off and vocal. As it is, I'm sure Valve feel that they did a load of extra work, only to end up going backwards as far as Nvidia cards are concerned.
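
For what it's worth, the image-quality side of partial precision is easy to illustrate numerically: FP16 keeps roughly 10 mantissa bits versus 23 for FP32, with FP24 in between. Here's a rough sketch of my own (nothing from Valve or the drivers; the truncate_mantissa helper is invented and only models mantissa precision, not the smaller exponent range) showing how the error piles up when a small term is accumulated many times, as it might be in a long shader:

```cpp
// Hedged illustration only: truncate a float's mantissa to mimic storing a value
// at lower precision (FP16 ~10 mantissa bits, FP24 ~16, FP32 has 23). Models
// mantissa precision loss only, not the reduced exponent range of smaller formats.
#include <cstdint>
#include <cstdio>
#include <cstring>

static float truncate_mantissa(float x, int keep_bits) {
    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof bits);
    bits &= ~((1u << (23 - keep_bits)) - 1u);   // zero the low-order mantissa bits
    float out;
    std::memcpy(&out, &bits, sizeof out);
    return out;
}

int main() {
    // Accumulate a small per-pixel term many times, as a long shader might.
    float full = 0.0f, half = 0.0f;
    for (int i = 0; i < 1000; ++i) {
        full += 0.001f;
        half  = truncate_mantissa(half + 0.001f, 10);  // round-trip through "FP16"
    }
    std::printf("FP32-style sum: %f, FP16-style sum: %f\n", full, half);
    return 0;
}
```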
 
Joe DeFuria said:
I'm scratching my head a little on this Uttar.

It's quite obvious that the "context" in which the statement was used was Valve's displeasure at the level of optimizing effort required and the return that effort got them.

The number "5x" isn't particularly important, other than to illustrate Valve's overall message.

Well, to make myself clearer, I question the message too, then. Without knowing how many hours or weeks that 5x actually represents, it doesn't seem to mean much to me.

But don't worry about me today. I have a headache and am constantly thinking about 248594 things like "Damn, I never realized I was THAT dumb!"... Oh well.


Uttar
 
Uttar said:
Well, to make myself clearer, I question the message too, then. Without knowing how many hours or weeks that 5x actually represents, it doesn't seem to mean much to me.

OK, now I know where you're coming from.

My return question to you is... if 5X in the absolute sense was a total of 5 hours... do you think Valve would have made a big stink about it? In other words, had the number come from ATI, I could see your concern. But it was Valve (Gabe) who was apparently pretty vocal about the whole thing.

But don't worry about me today. I have a headache and am constantly thinking about 248594 things like "Damn, I never realized I was THAT dumb!"... Oh well.

How many licks does it take to get to the center of a Tootsie Pop?

(just thought I'd make it 248595. ;) )
 