ATI engineering must be under a lot of strain. MS funding?

Joe DeFuria said:
Just some blurbs from the conference presentation above:

ATI stated:

ATI said:
"The company has entered into an agreement with Microsoft to participate in future products and services for the X-Box and future X-Box derivatives."

My guess is that Nintendo's next console will be an Xbox2 derivative. IIRC there were rumours that M$ was looking to buy Nintendo.
 
Uhm, that would be something. Nintendo is big. That would be one expensive deal for MS...
 
nelg said:
My guess is that Nintendo's next console will be an Xbox2 derivative. IIRC there were rumours that M$ was looking to buy Nintendo.

That would finally get some "decent" games on the xbox. My wife looked at the list and said there was nothing there that she found interesting. And she likes sports games.
 
Ailuros said:
I contend that a large part of the reason that the NV3x performs poorly is because of how the DX9 spec was defined.
So there isn't a chance in a million that the design was actually impressive, but several mistakes happened while transitioning it from paper to silicon?

I'd personally call the design on paper actually brilliant...
I would agree that that's probable, but it's not the only reason. The register usage issue is the one glaring problem, I think, of the NV3x architecture. That problem may have been there on paper to begin with; we just don't know. What is sure is that that part of nVidia's problem is independent of the DX9 specs.

It's the other major problem the NV3x is having that I'm talking about here: the necessity for the NV30-34 to use integer types to get reasonable shader performance. Microsoft totally shut this avenue down (for developers who use Direct3D). It seems to me that adding integer types would have been such a simple thing.
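(For illustration, here's a minimal C sketch of what a small fixed-point format like NV30's reported "FX12" would look like, assuming the commonly cited 12-bit layout with a roughly [-2, 2) range; the exact layout and name are my assumption, not a confirmed spec. The point is just that a narrow fixed-point value is far cheaper for hardware to push around than full fp32, not that this is nVidia's actual implementation.)

[code]
/* Hypothetical 12-bit fixed-point ("FX12"-style) colour value.
 * Assumed layout: 1 sign bit, 1 integer bit, 10 fractional bits,
 * i.e. a range of roughly [-2, 2) in steps of 1/1024, stored in an
 * int16_t here purely for convenience. This is only a sketch of why
 * integer/fixed-point math is cheap, not NVIDIA's actual register format. */
#include <stdio.h>
#include <stdint.h>

#define FX12_SCALE 1024.0f  /* 10 fractional bits */

static int16_t float_to_fx12(float x)
{
    /* clamp to the representable range before converting */
    if (x > 2047.0f / FX12_SCALE) x = 2047.0f / FX12_SCALE;
    if (x < -2.0f)                x = -2.0f;
    return (int16_t)(x * FX12_SCALE);
}

static float fx12_to_float(int16_t v)
{
    return (float)v / FX12_SCALE;
}

int main(void)
{
    float c = 0.7355f;                 /* e.g. one colour channel */
    int16_t packed = float_to_fx12(c);
    printf("fx12 = %d, back to float = %f\n", packed, fx12_to_float(packed));
    return 0;
}
[/code]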

Or they just knew while making those statements (I did listen to almost every live broadcast in the past months) that someone else had already won the XBox2 design. How long ago did rumours appear about ATI winning said deal? Those statements you mention were NOT made before those rumours appeared.
I think those rumors were based on what people saw in the Radeon 9700 Pro, not on anything close to real knowledge. The actual deal was, almost certainly, not settled until just before the announcement (1-2 weeks tops). If anything, nVidia representatives may have been doubtful about getting the deal due to their relationship with Microsoft at the time. I don't believe there was anything in the way of concrete information to be had.

Coming from official NV lips, it was stated a couple of months ago that if they don't win the XBox2 deal, they'll shift to alternative markets, the PDA/mobile market included.

Of course your sentence above makes perfect sense, but I don't see why resources then get shifted elsewhere, and more specifically (as it sounded) those resources that were dedicated to Xbox1.
Well, I don't know, either. It could also be related to estimates of how many resources each project would take. Regardless, we will never know for sure the real reasons, feelings, motivations, etc. The really important thing is results, as always.
 
nelg said:
My guess is that Nintendo's next console will be an Xbox2 derivative. IIRC there were rumours that M$ was looking to buy Nintendo.
Heh, that sounds more than just a little improbable.

Microsoft will never purchase Nintendo. Nintendo will remain Japanese. I don't believe for a moment that the development of the next Nintendo console and the next Microsoft console will include anything at all in terms of collaboration.

I would contend that even if Nintendo does go with ATI to produce their next chip, Nintendo will ask for different specifications than Microsoft will.
 
Microsoft will never purchase Nintendo. Nintendo will remain Japanese. I don't believe for a moment that the development of the next Nintendo console and the next Microsoft console will include anything at all in terms of collaboration.

Don't be so sure of that. MS tried to purchase Nintendo (for billions) before the start of the current generation of consoles, and the deal-breaker was Nintendo not wanting to throw away the GameCube hardware in exchange for the Xbox.

If Nintendo (which has an HQ just up the street from MS) and MS could decide to collaborate on the hardware (which is far more likely now that both are using ATI), then there's a chance something could be worked out. Realistically, it's the only way Sony can be beaten in this market. MS needs major help in Japan and with more console exclusives, and the Nintendo brand has slipped in popularity in other regions while not providing a steady flow of software that appeals to older gamers. A collaboration between these two companies would probably be just enough to compete with the PS3.

Also, both companies are on good terms, especially after the Rare deal.
 
Spong.com made up that story months back. They have made up TONS of stories that turned out to be BS. Obviously they had a 50/50 chance of getting something like this correct, since it was either Nvidia again or ATI (the two top video card companies in terms of popularity and technology).

However, all those Spong rumors about MS buying Sega, and Nintendo buying Capcom, Konami, and Sega (Megaton), were nothing more than nonsense.

Dave, even if you just guess at everything in life, you're bound to be correct sooner or later... I'm not going to talk about the comment regarding the "back slapping" at both companies whenever they take sales away from PlayStation, since that was pure made-up BS. They certainly weren't seen together at E3, or else some real reporters would have seen them. Nintendo and MS are on good terms, but not THAT good, and at this point they are still competitors. They certainly aren't buddies... yet.

BTW, why is it that Spong reported this buddy mentality between these companies and nobody else with "actual inside information" has said anything similar? Psh, "back slapping"... :rolleyes:
 
Regardless of where it was coming from, perhaps you should have been looking at who was posting it...
 
Regardless of where it was coming from, perhaps you should have been looking at who was posting it...

Erm, would you like a pat on the back? ;) No, seriously, what are you trying to say, since you're singling me out and saying "you"?

You were one of "many" that posted the same link to the Spong article that week, and I'm not the only one that didn't believe it. Anything they back up as truth is highly suspect, IMO.

BTW, the "guessing at everything" comment wasn't directed at you, it was directed at Spong. Spong is still full of it, but they are bound to be right sooner or later.
 
FUDie said:
Chalnoth said:
It's all a matter of perspective. I contend that a large part of the reason that the NV3x performs poorly is because of how the DX9 spec was defined.
What an odd way to look at it. I contend that a large part of the reason that the NV3x performs poorly is because of how the chip was designed.

There, that makes more sense.

-FUDie

Agreed. Although I wouldn't word it that way. I'd rather say that a large part of the reason that the NV3x performs poorly is because of how the chip's design was implemented.

The design isn't bad IMO. Quite a few good ideas. But they didn't think about some major problems, and didn't have enough workforce/budget in the beginning (I'm sure management must have given them way more men than they could use in the end though, eh - too bad they probably knew nothing about the project...).

Saying they aren't responsible or anything is kinda BS IMO. They took risks, and they failed in a lot of respects. What more is there to it, really?


Uttar
 
Chalnoth, from what has been said, the next Nintendo console GPU has been in R&D at ATI for well over a year. I would think perhaps even more like 2 years. Unless some unforeseen disaster happens, Nintendo will use the ATI chip now in development.
 
Hm, I'm not sure about this, but wasn't it Sensaura that designed the audio chip for the Xbox? I remember I got a flyer about the Xbox saying something along those lines.
 
Unit01 said:
Hm, I'm not sure about this, but wasn't it Sensaura that designed the audio chip for the Xbox? I remember I got a flyer about the Xbox saying something along those lines.
No. The "audio chip" was embedded within the south bridge of the Xbox chipset. It's very similar to the nForce audio subsystem. Both are based on a DSP core purchased from Parthus, Inc (a Motorola 56300 clone).
 
Qroach said:
Regardless of where it was coming from, perhaps you should have been looking at who was posting it...

Erm, would you like a pat on the back? ;) No, seriously, what are you trying to say, since you're singling me out and saying "you"?

Isn't it obvious?

Dave has certain inside information and contacts that most of us don't... and such information he undoubtedly isn't supposed to share.

But if SOMEONE ELSE publishes the same or similar info, Dave is, of course, free to make reference to it.

In other words, I don't see Dave making a habit out of posting Spong or any other source "for no reason." Usually, if Dave posts from a source, I've come to learn that it's for a good reason: that source tends to corroborate his own sources. Not that I know his sources or what they tell him... but when Dave does make a post with a reference, that tends to come true. The source may be completely guessing, or they may not. But the point is, Dave made a reference to it. Of course, he won't/can't say that the source agrees with his own... but he probably figured most people should be able to read between the lines and figure it out. Out of frustration... he just dropped you the most blatant hint. ;)
 
1. Possible friction between Microsoft and nVidia over DX9 specification.
2. The price mediation on the original X-Box chipset.

If contract award regulations work anywhere near the same in the US as they do in Europe, basing the award of the contract on either of those conditions (even in part) would be significant grounds for litigation.
 
Uttar said:
Agreed. Although I wouldn't word it that way. I'd rather say that a large part of the reason that the NV3x performs poorly is because of how the chip's design was implemented.

The design isn't bad IMO. Quite a few good ideas. But they didn't think about some major problems, and didn't have enough workforce/budget in the beginning (I'm sure management must have given them way more men than they could use in the end though, eh - too bad they probably knew nothing about the project...).

Saying they aren't responsible or anything is kinda BS IMO. They took risks, and they failed in a lot of respects. What more is there to it, really?


Uttar

If you look at nv30 and what's happened since, it merely follows the same pattern nVidia's taken consistently since 1999--the difference being that this time their aggressive position toward adopting a new fab process ahead of everybody simply blew up in their faces. The strategy they'd used successfully in that regard since 1999 backfired on them.

nVidia's always been weak in gpu core design, IMO, but very strong in FAB process implementation. nVidia got lucky moving from .25 microns to .18, and then from .18 to .15--pretty much ahead of everybody else. Being aggressive about it allowed them to ramp up MHz without paying much attention to performance increases relative to the core design architecture (not to say there weren't any--just nothing revolutionary.) Where the other guys were conservative in their approach to adopting new processes, nVidia has been consistently aggressive. At .13 microns their luck ran out and exposed the soft underbelly of the company--its lack of imagination in core architecture design.

I mean, this is something that to me is very clear. It matches the historical product record, and to back it up you have the public record of the nVidia CEO, JHH, saying well over a year ago that nv30 was impossible at .15 microns--hence they were going to .13 for it. These are statements he made long before nVidia even knew if it could do a viable, competitive nv30 at .13 microns--which proves the point conclusively, I think, of just how process-heavy nVidia's strategy had become.

By way of yet another concrete example--immediately after ATi had shipped R300 in August of last year with its 8-pixels-per-clock architecture, details on the architecture of nv30 remained a matter of gossip and conjecture until the nv30 reference design was officially unveiled at Comdex. nVidia officially released it as an 8-pixels-per-clock chip--it was in their marketing literature and could later be found even on their product boxes. Direct questions asked of nVidia in interviews, such as "Is nv30 an 8-pixel-per-clock architecture?", were answered with this kind of reply: "Yes, we do 8 ops per clock." For a long time nVidia would not even answer the question--but sought only to evade and dodge it at every opportunity. nVidia simply didn't want it known what enormous differences there were between R300 and nv30--nor that nVidia was unable to match R300 architecturally--with its only hope of doing that--the extremely high MHz afforded by a good-yielding .13 micron chip--now dashed.

So what was nv30, apart from a .13 nv25 with enhanced integer precision and a bolt-on fp capability? Very little else, I suspect. It wasn't really "new" or "revolutionary," despite nVidia's promotional efforts to the contrary. But most of all, relative to being weak in gpu architecture design but top-heavy in advanced process implementation, the gpu-design chickens finally came home to roost with the .13 nv3x gpus.

There are lots of supporting examples we could discuss--like nVidia's public show of blaming TSMC and moving to IBM for fabbing, only to back off of that substantially later on--to illustrate that their core strategy has always been fab-centered, but the real question is:

What can nVidia do to break out of the mold it's been in since 1999 relative to gpu design? Short of cleaning house and bringing in some fresh blood, I can't see a lot they can do. IMO, nV3x probably represents the very best effort the current gpu-design teams at nVidia are capable of at present. Yes, you can teach old dogs new tricks--but it's never easy...;)
 
WaltC said:
nVidia's always been weak in gpu core design, IMO, but very strong in FAB process implementation. nVidia got lucky moving from .25 microns to .18, and then from .18 to .15--pretty much ahead of everybody else. Being aggressive about it allowed them to ramp up MHz without paying much attention to performance increases relative to the core design architecture (not to say there weren't any--just nothing revolutionary.) Where the other guys were conservative in their approach to adopting new processes, nVidia has been consistently aggressive. At .13 microns their luck ran out and exposed the soft underbelly of the company--its lack of imagination in core architecture design.

Just what the hell is all this babble!? You don't run a company on luck. You look at foundry technology, goals, and actual progress to make these decisions. Not luck. TSMC had a very realistic goal of having .13 micron technology ready in time. They thought so and so did Nvidia. Just because it didn't pan out on time doesn't mean they were in any way relying on "luck." Also, I'm tired of this drivel about a "lack of imagination." I don't see how you can say that the NV3x architecture is in any way lacking in imagination of design. Perhaps it would be better worded to say that because of an overly imaginative design the product was less stellar than expected.
 
bdmosky said:
Just what the hell is all this babble!? You don't run a company on luck. You look at foundry technology, goals, and actual progress to make these decisions. Not luck. TSMC had a very realistic goal of having .13 micron technology ready in time. They thought so and so did Nvidia. Just because it didn't pan out on time doesn't mean they were in any way relying on "luck." Also, I'm tired of this drivel about a "lack of imagination." I don't see how you can say that the NV3x architecture is in any way lacking in imagination of design. Perhaps it would be better worded to say that because of an overly imaginative design the product was less stellar than expected.

According to the information I saw, TSMC was adamant in expressing its reservations to ATi and to nVidia about the state of its .13 micron process early last year--or, I should say, the immaturity of the process. Is there some reason you might suggest that as of early '02 TSMC would maintain to anybody that its .13 process was mature and ready to go? ATi has said for a long time this is why it did not commit to R300 at .13 at the outset. So far ATi has only done the RV350 at .13, and unlike nVidia's case this would suggest the basic RV350 architecture was simply a better fit for .13 than nV3x (which, according to nVidia, wasn't a fit at all at .15).

I really can't understand what your point is, as nVidia just a week or so ago was quoted widely as saying its .13 micron yield problems have been solved and that it looked forward to shipping nv35 in quantity, for the first time, next month (September). Of course nv30 was withdrawn and officially declared a failure by nVidia. As you know, there have been no problems with either ATi's .15 micron R3x0's or nVidia's .15 micron gpus for its 5200 series. In fact, the vast bulk of chips nVidia's sold in the last year have been .15.

Imaginative? Sorry--I see nothing remotely imaginative about overvolting your chips so that you can overclock them on 12-layer PCBs, with giant heatsinks and fans, along with double-wide backplanes, just in an attempt to compete by ramping up MHz in the higher-end market segments. I'm not sure what word I'd use--but it wouldn't be "imaginative." Now, I will admit they've employed quite a bit of imaginative PR in the last year... ;) Most definitely.

The "luck" I was referring to is the bad luck the company had in that it couldn't depend on the .13 micron process to pull its bacon out of the fire this time. I agree with you that aside from that, luck had little to do with it--it was just poor core gpu design--which was quite deliberate. I thought I made it clear that I believe nv3x is the best nVidia can do at the moment.
 