Official: ATI in XBox Next

Read the next post before this please:

Joe DeFuria said:
The point is (can't believe you refuse to acknowledge it), 65nm is NOT ready today. Hence: all else is not equal.

Joe, you design a 100M gate device today with 65nm in mind for production when the design is done.

If you just designed with what lithography is available today, then you're... well, ATI ;)

Or conversely, sensitivity to time to market can dictate which lithography process you shoot for.

True, but this has no bearing on the fact that you have a timeframe and plan accordingly. You're never going to get ahead playing by your rules; you're never going to be in a position where you have that IC that's 30% more complex (or some arbitrary number) than your competitor's while being smaller, cooler, and just kicking their ass. You have 5 years to do this - you get it done.

Again, the quote sums it up.

Right. And ATI likely felt that the lower time-to-market risks of 150nm outweighed any other potential advantages of 130nm. And in that case, they guessed right.

In the PC Sector - this is the Console Forum. The game is different, much different.

ATI is not designing in a 5 year umbrella, like I stated.

Of course they are. They're designing the center part for the next XBox. Whether they get 5 years or not to develop the IC is insignificant as people expect 5 years of work and advancement. Welcome to the world of consoles.

And if 65nm turns out to take a year longer to be able to get the product out in quantity than was anticipated 5 years ago...then what?

Then you're fucked. You play the game making sure this doesn't happen; you do this by forging alliances, you throw a shitload of money at it and get it done. This is the game, Joe - you win by taking chances, except you ensure by the above means that they work.

At the same time R&D is telling you what processes are "obtainable", someone else in management is telling you how flexible they think the ship window is, before you run the risk of getting into "trouble."

Which is why you need a more unified structure. It's not ATI's fault and I'm not saying that. But the less fragmentation the better - by a long shot. Which is a plus for the nVidia semiconductor way as seen in XBox1.

And it's irrelevant because....you say so? Or because we always know 5 years in advance the state of a particular fab process will be in?

Because of the reason I've outlined. Intel does this consistently, and they achieve it through their huge R&D budget and tight planning.

You don't plan your vacation for next summer around if your otherwise healthy Aunt may or may not decide to drop dead during that period. You put your ass on the line and go for it.

Um, if my Aunt DOES drop dead, then guess what? My vacation is screwed. The fact that I put my ass on the line doesn't make me any less screwed.

So let me get this straight: You plan your life around random eventualities that are based on absolutely random occurrences? What, do you live your life according to Schrödinger or something? Your wife stays with you how? ;)

Sure...and I really admire risk-takers. The key is recognizing and evaluating the risks, not just ignoring them. That's what gets you killed.

There are no points for second place. As the Iceman said, "The plaque for the alternates is down in the ladies room."
 
Joe DeFuria said:
That is exactly my point in the related side argument...look at the sheer size. Certainly more complex than the Intel and nVidia ICs, right?

Faf or Archie (among others) would be better at answering this, but the EE basically kicks the shit out of the XBox/Cube's CPU. The EE is an FP monster, especially compared with superscalars.

And yet, "how much more powerful" is the PS2 than the X-box?

Ahh, but the XBox's power (the NV2A) is built on much more advanced lithography - 150nm. If the PS2 was designed around 150nm rules instead of 250nm while maintaining the same size - it would... well, I'll leave the speculation to others ;)

Behold the power of lithography.

EDIT: Just imagine what you could do to the Graphics Synthesizer. At 250nm it's already an amazing achievement with its eDRAM. From the shrink alone you could have (just by basic physical scaling) 12MB of eDRAM and a 7.2GPixel/sec fillrate (48 pipes) @ 150MHz - which would probably be capable of a higher clock. And this is just simple scaling. DOT3 is a possibility, etc. Would be a monster to be reckoned with...

EDIT2: You could fit 3.3 EmotionEngines on the same size die @ 150nm. Although you'd probably want to tear that sucker apart, enhance the R5900 core, and then work on upping the VU count or something like that. Hmm, I dunno.
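
A quick sanity check on those numbers (a minimal sketch assuming ideal area scaling, where density grows with the square of the feature-size ratio; real shrinks rarely scale perfectly, which is roughly why the 48-pipe and 3.3-EE figures run slightly ahead of the raw math):

```python
# Back-of-envelope check of the 250nm -> 150nm scaling claims above.
# Assumes ideal area scaling - an approximation, not a process guarantee.

GS_PIPES_250NM = 16      # Graphics Synthesizer pixel pipes at 250nm
GS_EDRAM_MB_250NM = 4    # GS embedded DRAM at 250nm
CLOCK_MHZ = 150          # the hypothetical 150nm clock used above

density_gain = (250 / 150) ** 2   # ~2.78x more logic per unit area

print(f"Ideal density gain:   {density_gain:.2f}x")
print(f"Scaled eDRAM:         ~{GS_EDRAM_MB_250NM * density_gain:.1f} MB")
print(f"Scaled pixel pipes:   ~{GS_PIPES_250NM * density_gain:.0f}")

# Fillrate = pipes * clock (one pixel per pipe per cycle); using the
# rounded-up 48-pipe figure quoted above:
print(f"Fillrate (48 pipes):  {48 * CLOCK_MHZ / 1000:.1f} GPixel/s")

# The same shrink applied to the EE die yields ~2.8 EEs' worth of logic
# per original die area, so the "3.3 EEs" figure assumes extra density.
print(f"EEs per original die: ~{density_gain:.1f}")
```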

Now do you see what I'm talking about?

If PS3 is on 0.065, and if Xbox2 is on 0.09, that doesn't say much about overall performance / features. The architectures are just so vastly different....

True to an extent. But as the PC side evolves and gets Unified Shaders and such, it moves closer and closer to a Cell-type architecture.
 
Ahh, but the XBox's power (the NV2A) is built on much more advanced lithography - 150nm. If the PS2 was designed around 150nm rules instead of 250nm while maintaining the same size - it would... well, I'll leave the speculation to others

And the XBox GPU's initial specs were quoted with 130nm fabrication in mind - hence the subsequent climb-downs: to 250MHz upon the realisation that 130nm wouldn't be ready in time, and then to 233MHz when they understood what they were actually yielding at. Clearly this is a case where "advanced lithography" would have been better in terms of performance for the XBox, but it just wasn't ready, and MS were in no mood to hold up the design another year waiting for 130nm, so they shipped with a lower-clocked graphics core. That was a decision based on the numerous factors we have been talking about, and in this case the factor of "time to market" was clearly more important than that of process.

Of course, as we've stated earlier in this thread, all bets are off as to what processes will actually be utilised, since we only have a limited amount of information: we know who will design the graphics element, but not what it will be fabbed on or by whom, or even whether ATI is doing the layout.

I would suggest that if Vince really wishes to progress this argument in a "console framework", he stop bringing his PC-space issues into it, since I can count an inordinate number of times where he has done just that, only to tell other people it's not relevant to the discussion.
 
Vince said:
Joe, you design a 100M gate device today with 65nm in mind for production when the design is done.

And if 65nm "that youhave in mind for production" is not READY when you're done? (See Dave's post on X-box.)

If you just designed with what lithography is available today, then you're... well, ATI ;)

Wrong. If you ship on lithography that is capable of an actual product ramp of your product, then you're ATI.

True, but this has no bearing on the fact that you have a timeframe and plan accordingly.

Of course it does. All the planning in the world doesn't guarantee lithography will be ready.

You're never going to get ahead playing by your rules; you're never going to be in a position where you have that IC that's 30% more complex (or some arbitrary number) than your competitor's while being smaller, cooler, and just kicking their ass. You have 5 years to do this - you get it done.

Why didn't Sony "get it done" and ship PS2 on 65nm?

Right. And ATI likely felt that the lower time-to-market risks of 150nm outweighed any other potential advantages of 130nm. And in that case, they guessed right.

In the PC Sector - this is the Console Forum. The game is different, much different.

(shakes head.) Of course the game is different. That doesn't make the illustration irrelevant because "you say so."

Or are you implying that there's no time to market risks for consoles? Tell that to me again as we near 2005.

Of course they are. They're designing the center part for the next XBox. Whether they get 5 years or not to develop the IC is insignificant as people expect 5 years of work and advancement. Welcome to the world of consoles.

Welcome to the MS model. X-Box two will display 5 years of advancement over x-box 1.

Just because ATI isn't spending 5 years and building its own fabs and such doesn't make their part less than 5 years advanced. They've been doing other things over the past 3 years....Or haven't you noticed that parts like the R300 and NV30 are more advanced than NV2A?

And if 65nm turns out to take a year longer to be able to get the product out in quantity than was anticipated 5 years ago...then what?

Then you're fucked.

Congratulations. You admit it.

You play the game making sure this doesn't happen; you do this by forging alliances, you throw a shitload of money at it and get it done. This is the game, Joe - you win by taking chances, except you ensure by the above means that they work.

Ahhh...so then, your whole premise is "risk? What's that? Doesn't matter....just throw more and more money and resources at the issue, and that ensures things will work. Risk is just some illusion...it's not real. All it takes is will, some money, and a little elbow grease."

Vince, you certainly can win by taking chances. This doesn't mean you will. It can also mean a spectacular failure.

Which is why you need a more unified structure. It's not ATI's fault and I'm not saying that. But the less fragmentation the better - by a long shot. Which is a plus for the nVidia semiconductor way as seen in XBox1.

We'll see, won't we? That's all I have to say about that. MS tried the "nVidia way", and they didn't like it. That's all the facts we have to go on...

Because of the reason I've outlined. Intel does this consistently, and they achieve it through their huge R&D budget and tight planning.

And as the market leader, they are less sensitive to delays. It makes more sense to do it that way.

So let me get this straight: You plan your life around random eventualities that are based on absolutely random occurrences?

Actually, in some cases yes. And when random activities occur, that doesn't change the fact that you did or didn't plan for them.

I have life insurance, Vince, on the chance of a "random occurrence" of me dropping dead tomorrow and putting a financial strain on the rest of my family.

Do you have life insurance, auto insurance, medical insurance? Don't tell me you "plan your life around" random eventualities. :rolleyes:

Whether or not a process is ready is not a "random activity", Vince. It's a calculated risk.

What, do you live your life according to Schrödinger or something? Your wife stays with you how? ;)

Because we love one another, how about your wife?
 
Flipper was pretty much exactly what Nintendo wanted, and what they wanted was something simple and capable, not needing "the best."

Was it, or did Nintendo just not make any big deal when the chip speed dropped from 200MHz to....what was it, 145?
 
Vince said:
Faf or Archie (among others) would be better at answering this, but the EE basically kicks the shit out of the XBox/Cube's CPU. The EE is an FP monster, especially compared with superscalars.

Unfortunately, the CPU is not the entire console. It's the whole package...CPU, GPU, memory, etc. All we care about is how the thing plays games, and last I checked, x-box and PS2 were pretty equal. (Again, each system having its own pros and cons.)

Ahh, but the XBox's power (the NV2A) is built on much more advanced lithography - 150nm. If the PS2 was designed around 150nm rules instead of 250nm while maintaining the same size - it would... well, I'll leave the speculation to others ;)

Behold the power of lithography.

Oh, so then Sony was just "dogging it" then? I mean, they were IDIOTS for not targeting 150nm or even 130nm for console launch! Doomed for failure PS2 is...

EDIT2: You could fit 3.3 EmotionEngines on the same size die @ 150nm. Although you'd probably want to tear that sucker apart, enhance the R5900 core, and then work on upping the VU count or something like that. Hmm, I dunno.

Now do you see what I'm talking about?

Yes, I see that despite all of this "miracle of lithography", Sony did not create PS2 on 150nm, or even 180nm, for product launch. Sony should have had the chance to UTTERLY DISMISS any competition, for cryin' out loud! Why the hell did Sony take the obviously wrong road and go with something as pathetic as 250nm?

True to an extent. But as the PC side evolves and gets Unified Shaders and such, it moves closer and closer to a Cell-type architecture.

There's much more to an "architecture" than the level of programmability / flexibility. Memory structure is a huge differentiator, for example.
 
Joe DeFuria said:
Yes, I see that despite all of this "miracle of lithography", Sony did not create PS2 on 150nm, or even 180nm, for product launch. Sony should have had the chance to UTTERLY DISMISS any competition, for cryin' out loud! Why the hell did Sony take the obviously wrong road and go with something as pathetic as 250nm?



Well, it's quite obvious that in 1999 the problem of transistor count and gate size wasn't really the main concern of ANYONE.... I guess they had to make sure that a million other things worked before even thinking about risking a more advanced lithography process....

Still, the EE was designed and unveiled, THEN improved, unlike BOTH Nintendo and Microsoft, who had to downgrade their clockspeeds BY A LOT before shipping their consoles. Goes to show who had the best, or at least the most coherent, designs to begin with...

Of course we're gonna get thingy, what's-his-face, oh, Deadmeat, here telling us that Sony, or actually Kutaragi himself, decided to solder a second VU into the EE at the last moment, but I digress...
 
Riddlewire said:
Why does every thread in this forum eventually turn into a discussion about the Cell architecture?
Because no one is really that interested in what PC components MS will put in their next box?
 
rabidrabbit said:
Riddlewire said:
Why does every thread in this forum eventually turn into a discussion about the Cell architecture?
Because no one is really that interested in what PC components MS will put in their next box?




Fair enough, but that's not exactly the point..... if a thread is titled "ATI in the XBOX2", that means it will have to stay a boring thread about what PC parts MS will put in the next xbox....
 
Or haven't you noticed that parts like the R300 and NV30 are more advanced than NV2A?

Sure, they've got more features... but are they more advanced by at least an order of magnitude in perf (maybe with AA, AF, etc.)? Hmmm... raw perf wise...?

Oh, so then Sony was just "dogging it" then? I mean, they were IDIOTS for not targeting 150nm or even 130nm for console launch! Doomed for failure PS2 is...

They invested far less in R&D, and had far less involved in it; even if the people behind the PS2 had wanted to, they could not have justified such expenditures to their superiors... even after two successive wins, the expenses in current R&D are making some wary of Sony's future.

There's much more to an "architecture" than the level of programmability / flexibility. Memory structure is a huge differentiator for exaple.

Indeed, and it appears that, unlike other architectures, Cell was designed with removing most memory and bandwidth bottlenecks in mind.

Thus, with the smallest memory and fastest transistors in the world, the gap between 'theoretical' and 'real-world' performance will be further bridged.

Why does every thread in this forum eventually turn into a discussion about the Cell architecture?
Nintendo hasn't unveiled their stuff yet... (...and it's commonly thought MS will just throw in a PC-derivative solution...)
 
Vince said:
Joe, you design a 100M gate device today with 65nm in mind for production when the design is done.

If you just designed with what lithogrpahy is available today, then you're... well, ATI ;)

Did they or did they not put 0.13 to market at the same time as nVidia? Who has the better manufactured chip?

Basically you seem to base this whole thing on their track record before (in the PC world, where everything "doesn't matter"), NOT look at what they've done this generation, and not acknowledge that we CANNOT KNOW what either nVidia or ATI is really up to until we see their next major architecture duel.

Vince said:
In the PC Sector - this is the Console Forum. The game is different, much different.

And yet some of the same things that affect EVERY developer in basically EVERY end of the hardware industry affect them as well? How utterly curious...

Vince said:
ATI is not designing in a 5 year umbrella, like I stated.

Of course they are. They're designing the center part for the next XBox. Whether they get 5 years or not to develop the IC is insignificant as people expect 5 years of work and advancement. Welcome to the world of consoles.

That's not "in a 5 year umbrella" because Microsoft didn't work out the deal with them 5 years previous. MS also doesn't seem to driving design in the console world, but is merely happy with a specifically tasked PC with modified internals. As such it was tagging along on nVidia's PC development process and get some extras in NV2A (making concessions as they went), and seems likely to do the same with ATI. (Only with more time to spend, but more decisions to make on the rest of the internals and how they want to apply the IP they'll gain.)

Yes, the world of consoles. THIS world doesn't sit in a box either.

And if 65nm turns out to take a year longer to be able to get the product out in quantity than was anticipated 5 years ago...then what?

Then you're fucked. You play the game making sure this doesn't happen; you do this by forging alliances, you throw a shitload of money at it and get it done. This is the game, Joe - you win by taking chances, except you ensure by the above means that they work.

Again, who controls this? Do either nVidia or ATI possibly have the resources to force as severe an effort as S/I/T have in trying to bring Cell to light? Not even close. As we've seen, they're still very much dependent on what TSMC is capable of for their chips. Where does the motive force (and more precisely, the cashola) come from? Not them. MS wasn't willing to do it for NV2A, and design concessions had to be made. MS seems to be spending more on Xbox2, and will quite possibly outlay more cash and push the fabbing harder this time (since they're also in charge of how it progresses), but if not, ATI will be in the same boat. Reverse the situations and ATI would likely have had to make the same changes for an Xbox solution, and nVidia would still ultimately be under MS's thumb for X2.

Vince said:
At the same time R&D is telling you what processes are "obtainable", someone else in management is telling you how flexible they think the ship window is, before you run the risk of getting into "trouble."

Which is why you need a more unified structure. It's not ATI's fault and I'm not saying that. But the less fragmentation the better - by a long shot. Which is a plus for the nVidia semiconductor way as seen in XBox1.

Quite possibly, but it was Microsoft's decision to move away from that model, and Microsoft's decision to make a licensing deal. Perhaps THEY want to ensure as much of a unified structure for THEMSELVES with this? (Though how, we will just have to see.) Regardless, you can hardly blame ATI for Microsoft's change in venue.

Vince said:
Sure...and I really admire risk-takers. The key is recognizing and evaluating the risks, not just ignoring them. That's what gets you killed.

There are no points for second place. As the Iceman said, "The plaque for the alternates is down in the ladies room."

Indeed. And nVidia came in second with the NV30.

Who will come in OVERALL second? Who the hell knows? Few were expecting the "father of the 3D chip" to rise and fall so quickly, few were expecting Sony to enter a race and dominate it so quickly, few were expecting Sega to drop out while they still had a great machine on the market... (Those who did were paying a lot of attention and watching how things changed, rather than opining around historical precedent and acting on assumptions.)

Vince said:
Ahh, but the XBox's power (the NV2A) is built on much more advanced lithography - 150nm. If the PS2 was designed around 150nm rules instead of 250nm while maintaining the same size - it would...

...never have been out in March of 2000? :p Have let the Dreamcast continue to gain strength unchecked? Not have come out well before their competition to be a steamrolling juggernaut (while not being utterly trounced technically)?

Behold the power of just about everything else that is NOT lithography, which usually has more impact. Timing is key in comedy AND business! ;)

Vince said:
EDIT: Just imagine what you could do to the Graphics Synthesizer. At 250nm it's already an amazing achievement with its eDRAM. From the shrink alone you could have (just by basic physical scaling) 12MB of eDRAM and a 7.2GPixel/sec fillrate (48 pipes) @ 150MHz - which would probably be capable of a higher clock. And this is just simple scaling. DOT3 is a possibility, etc. Would be a monster to be reckoned with...

EDIT2: You could fit 3.3 EmotionEngines on the same size die @ 150nm. Although you'd probably want to tear that sucker apart, enhance the R5900 core, and then work on upping the VU count or something like that. Hmm, I dunno.

Now do you see what I'm talking about?

Yes, and if we took this attitude to the extreme and always delayed everything INFINITELY, why, the architecture would just keep getting better and better! :oops:

Fox5 said:
Was it, or did Nintendo just not make any big deal when the chip speed dropped from 200MHz to....what was it, 145?

I'm not sure of the specifics, but other people seem to be, so I'll agree with them offhand. :) Of course this would have been a known result well beforehand, and if Nintendo wanted to exert more effort to ensure the speed would be there, they would have had to pursue that. If they had wanted to delay the Gamecube's launch enough to make sure Flipper was at exactly the speed they wanted, they could have done that too. As it stands, the Gamecube delivers great-looking games, and coming out well after the Xbox would likely have proven fatal. NV2A suffered from similar decisions, as I'm sure NEITHER wanted to slip their launch date. (And MS didn't allocate the time or resources to help nVidia and their fabbing partner push as hard as they could.)

This is all--how shall we say--business as usual, ne?
 
Maybe all this arguing is moot. After reading the Chris Hook interview at Driverheaven, maybe the future of consoles is to have an upgradable one.
http://www.driverheaven.net/display.php?page=chris_interview
Chris Hook:
Soon and yes. In fact, ATI has paved the way for upgradability by providing the industry’s first and only family of pin-compatible discrete graphics processors. M7, M7GL, M9, M9+, M9GL, M10, M10GL and M11 (shh!) leverage broad-based pin compatibility, so manufacturers only have to design one graphics module, and then populate it with their chip of choice. By not having to redesign the graphics module every time, the OEM or ODM saves themselves considerable engineering costs, which are then passed on to the consumer, making upgradability economically viable to the consumer. And we’re the only discrete mobile graphics manufacturer to also make mobile integrated graphics chips, so a manufacturer can design a base model that uses one of our integrated solutions, then plug in an upgrade card without having to re-install drivers, so we’re the only manufacturer that can offer a truly seamless upgrade.

Right now, upgrading a notebook means removing screws and panels and sometimes even the keyboard. But with new technologies available in the future, it may soon be possible for a user to simply add and remove graphics cards in a few seconds using a slot in the side of their notebook.
Five years is a long time to sit still. Since the VPU is the heart of the console, the idea of a simple upgrade would give ATI a way to take advantage of newer and better process tech when it's available and feasible. Since the type of upgrade Chris mentions involves just the VPU, not memory or the PCB, it could be cost-feasible.
 
I'm rather expecting that to be kept in mind by them next gen--at least by Sony and MS. Sony's I'd expect from the inherent nature of Cell itself, as "scalability" reads "easy upgradability" to me, especially as they refine their process. Why simply reduce costs when they can offer faster speeds and market on that as well? (Especially if they can play up the ability to link with one's OTHER PS3 to make things even faster! :oops: ) Through programming depth and scalability, they'd be able to kick ahead of the competition even if they don't start ahead of them by one means or another.

MS, I think, is likely to embrace a PC-ish stance, as they ALSO like being "on top no matter what," so they may be more apt to want a GPU and CPU that will scale well, giving them an upgrade path to pursue if it's called for. (The CPU they kind of get by default, all things considered. Hehe...) But of course this may well keep the Xbox just as lossy or lossier...

Nintendo? <shrugs> A bit of an enigma right now. I can certainly see them getting more aggressive, since they've basically proclaimed a willingness to throw ALL their cash reserves into the ring (which is not insubstantial), but if they have any big plans for this upcoming generation, they're hiding them well. From Sony we've seen big movements and MASSIVE cash movements since 2001; from MS and Nintendo we've seen...? <shrugs again> We've seen roughly the same approach as with the current gen, so I'm not very excited overall. MS's desires are usually easier to read, though. Nintendo's are unknown--and they usually defy expectations anyway, so which WAY will they choose to defy expectations this time? Hehe...
 
I've been following this discussion since the beginning, and I think there are some fundamental things Joe & co are missing here:

The point that lithography is everything is, of course, made purely from a technical point of view and doesn't alone lead to success. However, I don't think anyone is arguing that lithography is the only key to success, but I think it most definitely is the most important factor when you assume that the given company is willing to push the envelope on a technical level. Obviously the lithography advantage is potentially a very big advantage, although it clearly carries more risk as well. This, I feel, will be seen with CELL, as it undoubtedly will and has to push lithography in order to work.

Better lithography could be the advantage, especially when your biggest competitor is going to push the process with CELL. Of course the risks are bigger, but then again, isn't that business? Especially when you want to outbid your biggest competitor, you can't hope to stand a chance if you're not willing to take risks.
 
zidane1strife said:
Or haven't you noticed that parts like the R300 and NV30 are more advanced than NV2A?

Sure, they've got more features... but are they more advanced by at least an order of magnitude in perf (maybe with AA, AF, etc.)? Hmmm... raw perf wise...?

Who said it was, or needed to be, an order of magnitude difference? R300/NV30 is not "5 years after" NV2x.

I don't expect R420 and NV40 to be an "order of magnitude" difference from R300. I don't expect R500 to be an order of magnitude difference from R420.

But, VERY generally speaking, each new core generation is about "twice the difference" from the last one.

That would make R500 generation roughly 8x that of NV2x/R200.
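
Spelled out as a toy calculation (the 2x-per-generation factor is a rule of thumb assumed for the sake of argument, not measured data):

```python
# Compounding the "roughly 2x per core generation" rule of thumb.
generations = ["NV2x/R200", "R300/NV30", "R420/NV40", "R500"]

perf = 1.0
for name in generations:
    print(f"{name}: ~{perf:g}x baseline")
    perf *= 2   # three doublings from NV2x/R200 to R500 -> 8x
```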

They invested far less in R&D, and had far less involved in it; even if the people behind the PS2 had wanted to, they could not have justified such expenditures to their superiors... even after two successive wins, the expenses in current R&D are making some wary of Sony's future.

Um, this is exactly my point.

Hey, you don't have to convince ME of that. Talk to Vince...he's the one saying that you've got to take extreme risks at all costs. That the ones who "succeed" are the ones that DON'T care about such things as "justification"...just go walk to the edge of the cliff.

Indeed, and it appears that, unlike other architectures, Cell was designed with removing most memory and bandwidth bottlenecks in mind.

As is every architecture before it, and as is every architecture afterward.

Thus, with the smallest memory and fastest transistors in the world, the gap between 'theoretical' and 'real-world' performance will be further bridged.

As happens every generation...
 
Phil said:
I've been following this discussion since the beginning, and I think there are some fundamental things Joe & co are missing here:

...However, I don't think anyone is arguing that lithography is the only key to success,

Um...this was repeated several times by me, and echoed by "& Co."

Joe and Co. said:
Lithography is not everything.

And this was repeated by Vince several times:

Lithography is everything.

Some fundamental things are missing alright; they're just not missing from "Joe & Co."

better lithography could be the advantage,

Absolutely, it can be. Time to market can also be "the" advantage. Developer tools can also be "the" advantage. Even marketing can be "the" advantage.

You don't have infinite resources to push every single aspect that could turn out to be "the" advantage. Also, "time to market" runs counter to "better lithography", so those two directly compete with one another.

How much "lithography" itself will play out to determine the consoles ultimate success or failure, depends highly on the individual circumsatances of the company itself, its market position, and the competitive landscape, and the effectiveness of "other key" factors like those I mentioned above.
 
This argument has descended back into rhetoric and bullshit quickly, huh?

Joe DeFuria said:
And if 65nm "that youhave in mind for production" is not READY when you're done? (See Dave's post on X-box.)

Then you wait it out or cut your losses (as XBox ultimately did). What did the NV2A lose by initially targeting 130nm and then being bumped back to 150nm, as opposed to being a 150nm design from the start? Precious little.

This is just common sense given that situation. Why are we arguing about something so simple?

Wrong. If you ship on lithography that is capable of an actual product ramp of your product, then you're ATI.

If it makes you feel like a better man - then sure, Joe. What's that - everyone must toe the same line as you? What was that again? Oh yeah... "I can't understand why anyone wouldn't like ATI"

Why didn't Sony "get it done" and ship PS2 on 65nm?

Causality. It was 1998. They were saying "Get it done by embedding 4MB of eDRAM on a 250nm die," which was an achievement for that time.

Joe DeFuria said:
Or are you implying that there's no time to market risks for consoles? Tell that to me again as we near 2005.

Joe, you're wasting my Goddamn time. You already stated this:

Joe DeFuria said (http://www.beyond3d.com/forum/viewtopic.php?t=7406&postdays=0&postorder=asc&start=240):
This is exactly why I've argued in the past (usually related to IMGTEC) that I feel the IP licensing model is FINE for the console market, but I do not like it for the PC market. Console designs flip over every 5 years or so, meaning you have more time to plan and sort things out. The PC race is break-neck, and having a more true fabless semi-con model is more beneficial.

As we 'near 2005', you're on the tail end of a 5-year break between releases. Go preach to some other choir.

Vince, you certainly can win by taking chances. This doesn't mean you will. It can also mean a spectacular failure.

This isn't an investment forum. We're here for the performance - if you want to talk about risks and assessments, then go elsewhere. This is Beyond3D, after all.

Actually, in some cases yes. And when random activities occur, that doesn't change the fact that you did or didn't plan for them.

I have life insurance, Vince, on the chance of a "random occurrence" of me dropping dead tomorrow and putting a financial strain on the rest of my family.

Do you have life insurance, auto insurance, medical insurance? Don't tell me you "plan your life around" random eventualities.

Whether or not a process is ready is not a "random activity", Vince. It's a calculated risk.

You've got to be fucking kidding me. You can't be serious.

You took a comparison that was pretty simple:

Vince said:
You don't plan your vacation for next summer around if your otherwise healthy Aunt may or may not decide to drop dead during that period. You put your ass on the line and go for it.

What the hell does insurance have to do with anything in this debate?

When are you going to just STFU with these useless comments? Yes, people and companies have insurance for random acts outside of their control - but that has nothing to do with our debate. No company is going to pay IBM if they fall 3 years behind Intel. Reread the parallel and get your head out of there.

You are wasting my time.
 
This argument has gotten unruly and, quite honestly, full of your stereotypical lawyer-esque bullshit, Joe. So, for everyone with the exception of Joe to answer:

This argument condenses very easily for someone not keeping up:

  • If you're designing a closed-box console (one with cutthroat competitors) for release in 3 years, do you target the most advanced lithography process your R&D tells you will be ready, or the process that came before?

That's it. No conditions, no ifs, ands, or buts. The R&D projected this; that's their job and their ass if they F* up. What do you choose?
 