MS, ATI, NVidia, DX .....

chavvdarrr

Veteran
Because ATI gave Valve 6 million reasons to align itself. And frankly I don't believe that Valve spent 3-5x the time to program the mixed mode. Given the tools out there, that would qualify them as technical idiots, which they are not. I mean, we have source code now. It's got ATI employee names in it. Oh! Wait! THAT'S how they did it. If you have ATI program 80% of your code, well, poof, there is your 3-5x more time to program the Nvidia code.

Microsoft wants to screw Nvidia for a variety of reasons. After the issues they had with Intel, they have no wish to let any one chip supplier give them "you'll do it our way" issues if they can possibly sabotage said company's growth earlier in the game. I'm sure the complete irony here is that MS is "promoting competition" in other markets when they are quite happy to have none in their own. We know the Xbox issue. We know how MS feels about its DX9 issue - frankly a good idea overall, but don't even pretend to think they don't plan to rule the industry with it.

It is painfully evident that MS and ATI agreed to some specs that, if built into a design, would give vast performance edges to ATI in their then-future tapeout of the R3xx. From the performance I've seen under MS games, I'd even say that ATI and MS had some planned code paths as well. Yes, Nvidia walked out of the game. Again, that was when they were asked to hand over a check to MS. Now considering what DX9 does for the MS product, and the sheer cash wealth MS has gained from it, you'd think that if they truly wanted to "promote competition" they wouldn't have charged any vendor a penny to participate in DX9. Ah, but that was not the case. You'd almost think they had a good guess at what the reaction would be if they simply sent Nvidia a bill. You can bet they did know after the Xbox issue. Their billing of each other I'm sure had a long history of familiarity with how each felt about the other. ATI, well, you can sure bet they didn't mind this rivalry one single bit. Heck, this was their chance to turn their entire fortune around in the 3d industry. And even better, they knew if they played their cards right they'd have MS to show them how to write a decent driver. Anyone catching on to the fact that after they got kissy-kissy with MS their drivers started getting a lot better? I'd bet more than a few phone calls we've never heard about have taken place over the past year and a half between ATI and MS.

Now to be honest, this corporate stuff happens all the time in many industries. The apparent illogic of the situation, and the key to understanding it, is in the facts. No company would deliberately make a half-billion-dollar product line (NV3x) incompatible with the world's #1 OS. However, if the world's #1 OS was a moving target, there is no "evidence" to go to court on that would say such an OS was not simply making "evolutionary changes" to "promote competition" in the marketplace. You have two companies that for 3 years have fought each other over money and a set of intellectual properties. Each one has no wish to lose either the money or said properties. Each one would rather not see the other calling the shots, as that contributes to who gets said properties. Party #3, ATI, who strangely had been trying for years to make a successful product and couldn't keep up, suddenly leaps to the forefront. MS knows exactly what it is doing, and given its court issues already isn't going to talk about this. Funny, they said that they've hated Nvidia for years - shock, that. Nvidia has really nothing to gain by pointing this out about MS. Why make a bad situation worse? ATI isn't going to say a word. You want the key to ATI, look to their product history. I even watched a thread where an ATI driver writer said something could be done in 1 cycle instead of the 3 currently, and another poster (an industry code writer) had to correct a basic oversight in his logic (which he had presented in his post). He then retracted his statement. ATI are not brilliant folks, don't kid yourself. MS helped them build a chip that matched the code that MS made them agree on. Then MS helped them make some seriously well-coded drivers. And until MS tells them what they are going to do for the 400 (strange that the 400 got delayed about the same time Longhorn got delayed), they will build to that spec as well. ATI goes with a .15 micron part. MS reuses old code and builds on that.
You know you really should see the patterns here.

Now this is a lot of speculation and if that's all you have to say about it, I do understand. But, I caution you to be careful delving into the tech of why "Nvidia screwed itself with the NV3x" when there is a lot more to this than just taping out a working GPU chip...
Any comments? Right now I'm unable to show the original this quote comes from. Such things were hinted at several times in the last year (MS "punishing" NV on purpose - a divide & conquer approach)
 
That's some pretty neat nonsense in this post you quote. Of course, stuff is happening behind the scenes, that's a given. But this is just a rehash of the current line of Nvidia PR, "everyone is out to get us", or "we don't know what they did, but it seems they want to make our product look bad".

The NV3x line, although late, is, on its own merits, no better or worse than previous Nvidia GPUs, i.e. an incremental upgrade, with faster execution of current functions (in this case DX8.1) when compared to the previous generation, with some additional features thrown in for OEM checkboxes (DX9 "support"). Just like Intel when they were completely dominating the field (before the Athlon), NV was perfectly OK with "milking the cow", which makes perfect business sense (no criticism of NV here, that's what all successful companies do). What makes the NV3x product look very bad, of course, is the extremely good R3xx line. Saying that MS and not ATI engineering is behind this success is, of course, ludicrous. Another key here is that Nvidia, as a company, seems to have lost the distinction between the means and the end, and its engineering strategy relies only on faster RAM and smaller die processes. That was bound to fail at one generation or another, and it did for the NV3x. Now ATI is in an extremely good position, because the company managed to stay on top performance-wise while not needing the bleeding-edge process, something 3dfx tried against NV with the V5 line, but failed at. Of course, ATI could make a similar mistake and refuse to make the jump to smaller processes when needed.

Regarding MS being out to kill Nvidia, that's something we are hearing a lot from desperate Nvidiots recently. The one you quote is just a bit more articulate than the others. The small kernel of truth in this large load of crap is that MS are indeed not happy about Nvidia trying to sabotage the whole gaming industry by trying to force developers to use CG and code "special paths". They are probably less than happy too with the Cheatonator crappola discrediting the WHQL process, especially in the light of Longhorn arriving (Longhorn will require rock-solid 3D drivers since DX will be part of the interface). But NV brought this on themselves...

IMHO, Nvidia's hubris grew to unheard-of levels when they finally managed to kill 3dfx (with a nice assist from 3dfx's clueless management). They thought ATI was not in the race for first place and the mindshare, that Matrox had all but retired from the competition, and that no new company would enter the D3D/OGL arena since the entry fee was becoming higher and higher with each generation. So they tried to lock the market into place by introducing proprietary standards, i.e. CG, and securing their mindshare with the TWIMTBP program. They just forgot that they had won this mindshare through a good respect of standards over the previous years...

And when you see "ATI are not brilliant folks, don't kid yourself", you just know something is fishy with the writer...

Nvidia fans had better start facing the truth that the NV3x architecture is underperforming when compared to the R3xx line. Occam's razor here: it's much simpler that NV engineering got this generation wrong and ATI did a much better job at the same time, than to go with wild conspiracies involving MS.
 
From what I've heard there is OGL code in the source and other hints of HL (one). That being said, the source code leak could be a publicity stunt of some sort. I'm not convinced it's completely official unless something new has happened.

Additionally, considering how finicky the NV3x have been, I wouldn't be surprised if it took them a LOT longer to get them to work. Nvidia's current line really is completely ass; if you think otherwise, I really have to wonder. How much evidence does one need until they're convinced that there are major deficiencies in their implementations?

Oh well, whatever.
 
You may want to consider that ATI's RenderMonkey has had a hand in development time (I presume people are using it), as it's supposed to render the stuff in realtime as you code it... but maybe that's just me.
 
From my prosumer (as opposed to professional) perspective, the only thing "painfully evident" is that ATi's R300 was much faster in DX9 games AND had better-looking AA than nV's NV30. All the conspiracy theories in the world can't hide the fact that the 9700P was, at the time, simply a better card in almost every way than the NV30.

I'm also not sure what the R300's process has to do with anything at all. What, did MS give them advice on how to engineer GPUs, too? :rolleyes:

I agree that MS probably swung to favor ATi rather than nV this round, but I don't buy into these complete conspiracy theories. nV contributed to the NV30's shortcomings as much as ATi did. Both companies aimed for a certain target, and, this generation, ATi hit theirs better and earlier than nV.
 
CorwinB said:
That's some pretty neat nonsense in this post you quote. Of course, stuff is happening behind the scenes, that's a given. But this is just a rehash of the current line of Nvidia PR, "everyone is out to get us", or "we don't know what they did, but it seems they want to make our product look bad".

~

Regarding MS being out to kill Nvidia, that's something we are hearing a lot from desperate Nvidiots recently.

Seeing these conspiracy theories popping up more and more. As if nVidia simply could not have designed a bad product.

The Valve assertion in particular makes no sense. Isn't Valve out to make money? How does aligning themselves with ATI prior to the HL2 auction benefit Valve? It's ludicrous to say ATI wrote 80% of the code and then had to bid on it. If that were the case, ATI should have saved themselves 6 million, since it was a done deal.

I guess nVidia should have paid more for Tomb Raider, because it screwed nVidia as well.
 
Any comments? Right now I'm unable to show the original this quote comes from. Such things were hinted at several times in the last year (MS "punishing" NV on purpose - a divide & conquer approach)

Well, my comment would be "what a load of crap". nVidia designed a chip that couldn't handle the DX9 specs... simple as that. Everything that runs DX9 has shown this, which just shows what a load of rubbish this "Valve is out to get us" nonsense is. nVidia didn't expect a good card from ATI, and they didn't expect DX9 to be used by a major game this soon... simple as that.

Uh... and what supposed hints would you be referring to? I've seen a lot of desperate nVidia fans (why don't people just buy the best card??), but nothing more.
 
In no way do I claim this is true - it's just an interesting POV noticed on the Aceshardware forum (no, I don't have the link :( - anyone interested can try & search).
I just hope someone with more "inside" info will comment.
One note - almost a year ago, just before or after the NV30 "launch", on a Russian forum, a man who has connections with NV (judging from other info he had given over time, which was correct) said that initially the DX9 full-precision specs required fp32 support, which was changed later.
That's what I know.
My IMHO - possibly MS did make some decisions in order to punish NV - they were becoming too strong, probably trying to dictate what DX's future should be.
Obviously ATi's R300 very closely follows the DX9 standards - which, given a development cycle of no less than 2 years, is a great achievement. Or DX9 & HLSL were written to match the R300's characteristics closely. (hint: sincos)
Obviously NVidia failed to make a chip which is superior; instead they made a chip which is more backward-looking.
Obviously NVidia was unable (and still is) to make a good optimising compiler for their NV3x family.
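For reference, the precision formats being argued over in this thread can be made concrete. A minimal Python sketch (the bit layouts are taken from commonly published descriptions of these parts, not from this thread) comparing the relative precision of the three shader float formats:

```python
# Rough comparison of the shader float formats at issue in the DX9
# precision debate. Bit layouts as commonly published (an assumption,
# not something stated in this thread):
#   FP16 (partial precision): 1 sign, 5 exponent, 10 mantissa bits
#   FP24 (R300 full precision): 1 sign, 7 exponent, 16 mantissa bits
#   FP32 (NV3x full precision): 1 sign, 8 exponent, 23 mantissa bits

def relative_precision(mantissa_bits: int) -> float:
    """Machine epsilon: the gap between 1.0 and the next representable value."""
    return 2.0 ** -mantissa_bits

formats = {"FP16": 10, "FP24": 16, "FP32": 23}
for name, bits in formats.items():
    print(f"{name}: eps = {relative_precision(bits):.2e}")
```

This shows why the spec wording mattered so much: FP24 carries 64x finer relative precision than FP16, so where the "full precision" line was drawn decided whether each vendor's fast path qualified.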
 
Well, that's right.
MS did make some last-minute changes to the DX9 specs when NV3x development was already under way.
Why? Well, it's pretty obvious.
 
No, they didn't. There was a typo in the spec that was corrected - and even before they fixed the typo, it was fairly clear that 16-bit would have been partial precision, and higher would be full.
 
chavvdarrr said:
I just hope someone with more "inside" info will comment.
Garbage like that is hardly deserving of a comment.

Please note that this isn't about your post, but about the original 'information'.
 
The long quote from Aceshardware is just a big load of conspiracy-based twaddle, as far as I can tell. Just about the only thing he doesn't accuse Microsoft of is paying off TSMC to sabotage their 0.13 micron process and make NV30 a failure!

Personally, I think that for most DX9 games, FP16 will probably provide 'good enough' image quality, so insisting on FP24 as the minimum spec could be seen as 'punishing' NVidia. On the other hand, with all the spin and hype about NV3X which NVidia were chucking out, perhaps they thought that FP32 on NV3X would provide good performance? The fact that WHQL drivers are now happily accepting the use of FP16 indicates that this hasn't been too great a problem.
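The 'good enough' question can be sketched numerically. A quick demonstration using numpy's IEEE half type as a stand-in for shader FP16 (an assumption - shader units of that era were not necessarily strictly IEEE): accumulating many small contributions, as a multi-pass shader might, drifts visibly in FP16 but not in FP32.

```python
import numpy as np

# Illustrative only: accumulate a small per-pass contribution in
# float16 vs float32 to show the kind of rounding drift partial
# precision introduces. Whether this is visible in a real game
# depends entirely on the shader, as noted above.
x16 = np.float16(0.0)
x32 = np.float32(0.0)
step = 0.001  # a small per-pass contribution; exact sum would be 1.0
for _ in range(1000):
    x16 = np.float16(x16 + np.float16(step))
    x32 = np.float32(x32 + np.float32(step))

print(f"float16 sum: {float(x16):.6f}")  # drifts away from 1.0
print(f"float32 sum: {float(x32):.6f}")  # stays very close to 1.0
```

For a single texture lookup or blend the FP16 error is invisible, which is consistent with the observation that WHQL drivers accepting FP16 "hasn't been too great a problem".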

Ultimately, the obvious problems that NVidia had in producing the NV30 on time and fully functional are nothing to do with Microsoft. Similarly, the poor drivers, which are still lacking support for many features the NV3X family is reputedly able to produce, are also not the fault of Microsoft or, indeed, anyone other than NVidia themselves.

Conspiracy theories are so amusing sometimes. :LOL:
 
Not to give credence to most of the typical smear tactics in the originally quoted material, but the divide and conquer theory (concerning microsoft) has been going around for quite a while, and I'll give it more than a passing nod.

They've been very mercenary in their business dealings, and not allowing one of your suppliers to become too powerful is classical "good business sense".

How much that played into the current situation is obviously open to speculation, but I don't think it's pure fantasy.
 
You know Russ, I somewhat agree with you in believing that M$ does not want any one IHV to wield too much power. In this round though, I would chalk up the decision to simply being the best one. No conspiracy.
 
Yes Nvidia walked out of the game. Again, that was when they were asked to hand over a check to MS.

One has to give credit to Nvidia for being a company run by honest, money-saving people:

1) They chose not to take part in the 3DMark beta program, as it costs too much money. As a result (from what they say), they don't know why 3DMark03 performs badly on NV hardware, except that it's probably something FutureMark did on purpose.

2) They chose not to enter the HL2 bidding (too expensive), and as a result the deal went to ATI. That's only because Valve are greedy bastards and chose not to take advantage of the vastly superior CineFX architecture.

3) They chose (from the reliable person who did the post we are discussing) not to pay MS money for DirectX 9, and as a result the specs from DX9 were skewed toward the R3xx (bad ATI !). It takes some incredible integrity to condemn your hardware to crawl in 97% of all published games because you didn't want to fork over some money...

While they are at it, they really should cut down on the R&D money (that's obviously a bad investment); the savings could be used to bring more developers into the TWIMTBP program and to invest in more lawyers in case more benchmarks appear...
 
CorwinB said:
3) They chose (from the reliable person who did the post we are discussing) not to pay MS money for DirectX 9, and as a result the specs from DX9 were skewed toward the R3xx (bad ATI !).
Money for DX9? :? To Microsoft? :? :rolleyes: I don't think so.
 
nelg said:
You know Russ, I somewhat agree with you in believing that M$ does not want any one IHV to wield too much power. In this round though, I would chalk up the decision to simply being the best one. No conspiracy.
Meh. I simply harken back to the past: "DOS ain't done 'till lotus won't run" was a popular slogan in the 80's. The OS/2 fiasco wasn't a paragon of cooperation between Microsoft and a supplier/partner.

I know I'll never invest in a company who depends on being in Microsoft's good graces to survive and prosper. Love is such a fickle thing.

It doesn't mean that there was an outright conspiracy, of course; or that NVIDIA didn't drop the ball, engineering wise.
 
Simon F said:
CorwinB said:
3) They chose (from the reliable person who did the post we are discussing) not to pay MS money for DirectX 9, and as a result the specs from DX9 were skewed toward the R3xx (bad ATI !).
Money for DX9? :? To Microsoft? :? :rolleyes: I don't think so.
I think Corwin got confused. The money in question, according to the original poster, was the squabble over how much the Xbox chips would cost. (IIRC, there was something about Microsoft wanting to renegotiate the deal to reduce the cost of the equipment).

Presumably, according to the original poster, Microsoft punished NVIDIA for not dropping their price on the xbox chips.
 