Official: ATI in XBox Next

Vince said:
All things being equal, N Architecture will be more economical to produce on what process?

All things being equal, N Architecture can support more logic on what process?

All things being equal, N Architecture has more opportunity in the way of performance and its associated attributes on what process?

All things are never equal though, Vince. In a competitive environment you've also got factors such as cost and time to market to consider - no one is disputing that 130nm will ultimately be better than 150nm, but that doesn't necessarily mean it's the case the second it's available. Arguably, even now, a 130nm part is more expensive than a similarly performing one designed on 150nm, so is it actually currently better for the consumer?

You didn't answer my previous question though - was it not a good thing that ATI brought an excellent performing DX9 part to the market in 2002? Did that not advance gamers' experiences and move along the development of the industry? If you say no to that then you're living in cloud cuckoo land; if you say yes, then there is a clear-cut case where 150nm was better than 130nm.

It is a crapshoot, and all the special-case scenarios surrounding NV3x and R3x0 are utterly irrelevant from the standpoint of this discussion or any that is forward-looking...

... But what is a good indicator are general trends based on precedent and the fundamental and governing laws of design. You forgo all of these in your pursuit of a half-truth based around a single-case scenario - the R300.

Vince, you are ignoring that there are cases where similar things could have been exploited in the past. The case of NV20 is pertinent - it came to market later than NVIDIA had initially wanted, and a similar situation could have occurred, but none of the competition was in a position to exploit it. A large chunk of your precedent is set in a competitive vacuum.

The other issue is that this apparent precedent is set on relatively low-complexity parts, processes and shorter cycles. We are now moving into a different era of more complicated processes and longer cycles - can you say for certain that a new precedent hasn't been set? No, you can't, but then I'm not actually saying this is a precedent. Regardless of whether you think this is a one-off or not, the fact of the matter is that it has occurred and you can't ignore it.

You talk of the R200 -> R300 jump as if it's supporting your case. How much of the supposed increase is because of logic? How much is due to RAM advances (also lithography based)?

What, the move from integer to full float pipes? The threefold increase in vertex processing, the twofold increase in pixel shading (at higher float precisions), the larger on-chip HierZ, the move to MSAA, the higher Z compression ratio, colour compression, DX9 shader functionality... need I go on?

Nobody is debating this; it was their fault that anyone on an inferior process came close to them in a time when computational resources have such influence on overall performance - but it hints at problems further up the development cycle, not the fault of lithography.

It was the fault of the manufacturing processes of that lithography not being ready at a time NVIDIA pushed to use it. The fab warned them of this - the simple fact was that lithography was not ready at the time for that application; similar things are still happening with things such as low-k (and at other fabs as well).

If you can't see this then it's your own fault

Vince, you're the only one saying this - we all know what smaller process sizes bring, and frankly your constant pigheadedness and habit of putting words in other people's mouths is pissing me off. What you don't seem to be appreciating is that other factors also go in there as well - timing, need and familiarity (hence what can be achieved) are also factors that need to be tipped in and swilled about.

what's ATI going to do when nVidia launches a product based on a derivative 11S or 12S process out of IBM?

a.) When is that likely to be, b.) how do you know ATI won't be using a similar process at the same time, c.) will they have an architectural advantage to offset process differences?

I don't know the answer to those questions, and I'm not suggesting any of those factors will occur, but they are possibilities, just as there is a possibility that NVIDIA may hook themselves into one of these process choices again and the process is late to the market again.

Do you even read what you're writing bud?

Don't you dare...

What company in a performance-based market has ever succeeded long-term by not taking risks and pushing the envelope?

Again, it's not always about "pushing the envelope" just in terms of new processes - ATI pushed the envelope with 150nm, and they succeeded (up until now). As said, there are also the other factors that need to be considered as well, and they are equally important. You also then have to sell that product to people, and the PC space isn't like the console market - it has to be produced at prices that are palatable for people to buy.

ATI will fall if they stick with their current lithography policy.

Their lithography policy does not preclude their latest high-end designs utilising the current latest process if they have trialled that process elsewhere. Between now and R420, look to the products they release in that period to gain an understanding of what R420 is likely to utilise.
 
nelg said:
Is a mature .15 process that much different from a new .13 process?

There is that as well. It's not like a process comes out and stays that way for the duration of its life - the fabs are making constant revisions to the process while it's available. One of the major factors in R350's clock-speed increase over R300 is that R350 uses a newer 150nm process than R300 did.
 
Dave Baumann said:
In a competitive environment you've also got factors such as cost and time to market to consider - no one is disputing that 130nm will ultimately be better than 150nm

Excellent. Now, while avoiding the extra layers of factors that have no relevance to the design of a closed-box IC, answer me just one more question:

  • All things being equal, what lithography process is more advantageous to use when designing a closed-box console with a lifespan of 5 years - 150nm or 130nm?
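
(For what it's worth, the "all things being equal" arithmetic behind that question can be sketched roughly as below - a purely illustrative estimate that assumes transistor density scales with the square of the linear feature size, which real 150nm and 130nm processes only approximate.)

```python
# Rough, illustrative estimate only: assumes ideal square-law density scaling
# with feature size, which real processes only approximate.

def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate transistor-density gain when moving from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

if __name__ == "__main__":
    gain = density_gain(150, 130)
    # Prints roughly 1.33 -- i.e. about a third more logic in the same die area.
    print(f"150nm -> 130nm: ~{gain:.2f}x the transistors per unit area")
```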


------------

Here are other responses, although I don't feel they're relevant to a discussion in the console forum or to this discussion. But you took time to write them and they're good posts, so here are my responses.

Just so you know, I will not respond to these lines of argument again. At least not while we're talking about console-based graphics. And as you can tell by my presence (or lack thereof) in the PC Forum, I (personally) really have no interest in them.


You didn't answer my previous question though - was it not a good thing that ATI brought an excellent performing DX9 part to the market in 2002?

Of course it's a good thing, it's a great thing - it just has absolutely nothing (on any level) to do with this debate.

a.) When is that likely to be, b.) how do you know ATI won't be using a similar process at the same time, c.) will they have an architectural advantage to offset process differences?

  • Late 2004, 2005 is feasible for the 11S/65nm process; they have lower xS processes already available.
  • TSMC/UMC doesn't have SOI/SS to the same extent or with the same proven history as IBM. Right now it's basically IBM who has taken the lead, and as such has the advantage. AMD got it from IBM.
  • Well, you'd need to overcome a 15-70% performance increase (figure taken from IBM).
I don't know the answer to those questions, and I'm not suggesting any of those factors will occur, but they are possibilities, just as there is a possibility that NVIDIA may hook themselves into one of these process choices again and the process is late to the market again.

Dave, this means nothing. Whenever your argument turns from technical ground to 'it may be late to market' - there are problems in the argument.

Don't you dare...

You forgot the 4 snaps in a 'Z' formation. heh. ;)
 
You keep saying "push the envelope" and yet I keep seeing "read the market" and "make the appropriate decisions."

Not only do extremely intelligent people on the forum make comments that defy your stated "truths," Vince, but your major reply being that anything said in approval of ATI's decisions is "fanb0y"ish in nature does not help you actually make your points. Quite honestly, it just gets people more irritated with the way you argue.

ATI used to have a history of bad drivers--this is no longer the case. ATI has not had a history of pushing the same boundaries nVidia has (though of course NO company had the ability to keep up with nVidia's release schedule until ATI proved itself. And hey, nVidia didn't have a proven track record of anything until THEY proved themselves either, did they? OHMIGORSH!)--this may well also no longer be the case.

You have not at all disputed that nVidia screwed up the NV30 release royally, yet you also seem to blame ATI for not having done the same. Instead they brought an impressive 0.15 part (not to mention DX9) to bear well before nVidia could bring their new process to light, and still kept it performance-wise right in line (one could argue faster, one could argue not... <shrugs> ) with nVidia's current 0.13, only with less power consumption, higher image quality, superior shader performance, and it would seem fewer compromises in general. Not only that, but they brought out their own 0.13 part in pretty much the same timeframe as nVidia, only with--as I understand it--greater yields from the same foundry. Which also squarely beat nVidia's 0.13 offering in its price range.

All this from just being "very lucky."

And from which you extrapolate that there is NO POSSIBLE WAY that ATI will keep pace with nVidia through the processes because...? They hadn't done so quite as much until now? They opted out of royally fucking up a chip launch? They seem to be making appropriate business decisions?

Frankly, I'm left to wonder if an ATI executive officer ran over your dog, because you seem rather venomous towards anyone even remotely thinking "hey, they seem to be pretty good right now. And, I mean, you have to be pretty frickin' good to take on the world's leader in graphics technology, don't you?" You're basing a lot of personally held "obvious truths" on things they've shown you wrong about, and on things you CAN'T POSSIBLY KNOW about them, in the fastest-moving industry on earth, in one of the biggest (and frankly, most interesting :D ) duels IN that industry.

No, I honestly can't see why everyone on earth doesn't share your opinions on this matter.
 
cthellis42 said:
You keep saying "push the envelope" and yet I keep seeing "read the market" and "make the appropriate decisions." [Bold is mine]

What good does that do in a closed box that stays closed for 5 years?

There is nothing relevant to this argument (eg. lithography is the foundation and ultimate factor to succeed in designing a console) that has been proven wrong - I don't care about the nVidia-ATI rivalry, nor the PC marketplace. And it's not like I'm the only one who thinks this way; they're just smart and don't argue, knowing the PC people will all leave eventually.
 
Vince said:
TNT - 350nm
TNT2 - 250nm
NV10 - 220nm
NV15 - 180nm
NV20 - 150nm
NV25 - 150nm
NV30 - 130nm

Yup. Point?

If you can call the NV20 late after its inclusion in the XBox, then the R400 is late as well.

Sure, anything's possible. Including that the 0.15u process wasn't

1) Good enough to get NV20 out on time
2) Good enough to get NV2a anywhere near the original MHz target.

It's also possible that one of R400's issues is that 0.13 was not good enough to get it out at the desired time at the desired specs.

But, why think rationally about it when you can just spin it to your liking?

You mean, why not consider all possibilities rather than just pigeonhole one? It seems pretty rational to me that if nVidia couldn't get NV2a up to spec, AND they couldn't get NV20 out on time... they just might have been overly aggressive with their estimation of process readiness.

Unlike you, I do not consider my possibility the "only rational" one though.

I see aggressiveness above. nVidia surpassed all competitors with respect to lithography (perhaps the TNT was an exception) - I'd hardly call this a fluke.

But nVidia did NOT surpass all competitors' products, despite the lithography advantage. See R300 vs. NV30.

Lithography is not everything.

Again, what you see as nVidia's constant "over-reliance" - others see as "consistency".

I see it as both. Consistently over-reliant. ;)

I mean, how do you overrely on something as fundamental to chipmaking as lithography?

Easy... by believing that when you release a 500 MHz, 110 million transistor part on 0.13... it will be superior to any 0.15u design. Not arrive 6 months later, requiring a dust-buster just to have some semblance of being competitive.

Is this really not obvious to you?

ATI was taking a different risk...they were relying on their engineering ability to put 100+ million transistors on 0.15, clock it competitively, and not run into the same heat issues.

Does Intel overrely too?

In some cases, sure. In fact, I believe Prescott is not turning out to have the power consumption that Intel thought 0.09 would gain them.

As per NV30, I've already stated that nVidia fumbled and ATI picked up the ball and ran with it. But where was ATI when nVidia didn't stumble?

What kind of logic is this, really?

You admit that nVidia "fumbled." What was the cause of the fumble?

Where was ATI? Acquiring ArtX and reshaping their entire culture.

Playing second fiddle that's where....

I just don't get your point.

No one is saying that in any one given situation, a different set of risks won't or can't pay off. nVidia tends to rely on advanced processes, rather than trying to really push existing ones. ATI tends to rely on older, more mature processes to push their tech.

NO ONE is claiming that for all situations, one approach is superior to another. We're just trying to get you to understand that there are in fact, inherent risks in EACH approach.

The fact of the matter is, you "guess right" and you look like a star. You guess wrong, you look like a chump. Point is, you CAN guess wrong with either approach. There is nothing inherent in "being aggressive with lithography" that makes it a wiser approach.

If that were the case, nVidia could start pumping out 250 million transistor 0.09u NV50s as I type. That's REALLY aggressive, so it must be even SMARTER, right?

What's going to happen when nVidia gets on track (as they already are?), starts pumping out advanced SOI/low-K chips from IBM, and ATI is doing these same old things?

Vince,

You could have asked the same thing a year ago.

"What happens when nVidia starts pumping out the 0.13u low-k NV30's, and ATI is still stuck with 0.15 R300s"

What DID happen? Low-K turned out to be a bust. ATI surprised everyone by actually putting an 8-pipe DX9 chip on 0.15, and increased clocks relative to their previous 4-pipeline DX8 chip on the same process.

The same COULD happen in the spring.

Alternatively IBM might have vastly better yields than expected, no problems with SOI, nVidia pushes the specs through the roof, and ATI might not be as successful with a more "mature" and "less risky" 0.13 TSMC process.

I have ZERO problems envisioning either scenario. What about you?

You're spinning a deficiency into a plus - and nobody is going to buy it.

Nobody's buying your one-sided approach to this. I recognize both the inherent risks, and potential benefits from either approach.

In the same vein, ATI has shown that it does not need the same advanced lithography to compete with nVidia.

Oh what utter bullshit. Go heed your calling and go into politics already.

Huh?

Did, or did not, ATI have a "competitive" part in the R300, vs. the NV30? Was the R300 out 6 months earlier, or was it not? Did the R300 require a dustbuster? Despite "only" being on 0.15, is the R300 considered competitive at WORST with the NV30... and superior by any rational judgement?

How can you not see this?

Again... NO ONE is saying that as a steadfast rule, ATI's approach is always the superior one. I would be as much a fool for saying that as you would be (are?) for saying nVidia's approach is always superior.

So, because the R300 does well when nVidia stumbles -

Stop. WHY did nVidia stumble, and why DIDN'T ATI stumble?

you can declare them as able to compete with nVidia?

Um, who's "declaring?" Isn't it obvious? MS chose them. ATI's profits and margins are increasing; nVidia's are going in the opposite direction. ATI's market share is increasing; nVidia's is going the other way.

Yes, I'd say that ATI is "competitive" with nVidia.

What happened before nVidia stumbled - ATI got its ass kicked. Is ATI a one-hit wonder? Can't rule it out...

Agreed. You CAN'T rule it out. You also can't rule out that "nVidia continues to get its ass kicked" for the rest of eternity either.

Why such aggressive language, BTW? You'd think this was a boxing match...

I tend to think that ATI and nVidia will be fierce competitors for quite some time...with one "regaining" the lead over the other every so often.

It also shows that lithography is everything - enter low-K dielectrics.

Enter NV30.... oh wait.... didn't really pan out....

You are continually going with the assumption that ATI's choice not to be "aggressive" with lithography this past generation was for some reason other than that, in their estimation, it wouldn't be ready.

Now, perhaps nVidia doesn't have the engineering skill to do something like the NV30 on 0.15, and get Mhz and power consumption to reasonable levels. That would explain Jen H's assertion that 0.13 was "required." It's quite possible that at nVidia, it was.

At ATI, it wasn't.


Moral of this story: Joe can spin any situation into his biased vision.

I am continually perplexed at how I can show you "both sides" of the story, and yet be spinning it at the same time... :rolleyes:
 
Vince said:
cthellis42 said:
You keep saying "push the envelope" and yet I keep seeing "read the market" and "make the appropriate decisions."

What good does that do in a closed box that stays closed for 5 years?

It means that a company that properly reads the market and knows what CAN be accomplished at X time should design for that? Or are you suggesting Nintendo or Microsoft's next launch should involve the same bobbling the NV30 took?

Vince said:
There is nothing relevant to this argument (eg. lithography is the foundation and ultimate factor to succeed in designing a console) that has been proven wrong - I don't care about the nVidia-ATI rivalry, nor the PC marketplace. And it's not like I'm the only one who thinks this way; they're just smart and don't argue, knowing the PC people will all leave eventually.

Only you are riding on the assumption that ATI cannot do what nVidia can based on many irrelevant comparisons (such as when you brought the Gamecube into play earlier) and what is likely irrelevant history at this point. If you don't care about the PC marketplace, why are you using it (fairly poorly) to justify your assumptions?

Meanwhile, a comment like "lithography is the foundation and ultimate factor to succeed in designing a console" is a farce except from a purely technical standpoint. There are umpteen million other factors that actually make for the SUCCESS of a console, and quite probably what Nintendo and MS need is a chip that fits what they WANT and can be brought to fruition with fewer troubles (and perhaps greater profit margins), rather than eking out as much litho as possible. After all, what's the litho of the GS to Flipper to NV2A? Why does the PS2 continue to drastically outsell the others with more dispersion and being 18+ months older? Why did the Xbox and GC pace each other so long, and X is barely outselling it now? If the Xbox picks up drastically in comparison to the Gamecube, does it have ANYTHING to do with the chip as opposed to their games and overall strategy?

Meanwhile, just for kicks, lock R300 and NV30 in a box for 5 years. Which would you rather have? ;)
 
Joe, the argument has passed debating that. I have no need or desire to continue debating PC-based antics introduced by others - I just said this to Dave, and I'm going to stay consistent.

If you want to debate topics that are nVidia <--> ATI, then do it in the PC forum. So, I'll meet you there with the beer when the R420 and NV40 are released.

I am continually perplexed at how I can show you "both sides" of the story, and yet be spinning it at the same time...

  • You don't show both sides.
  • I was the first to post that the other was "spinning," he who gets there first ;)

"Consistently over-reliant." - I like that :)

cthellis42 said:
Vince said:
What good does that do in a closed box that stays closed for 5 years?

It means that a company that properly reads the market and knows what CAN be accomplished at X time should design for that? Or are you suggesting Nintendo or Microsoft's next launch should involve the same bobbling the NV30 took?

I think you know what I'm suggesting. There is no "reading of the market" when you design a console - you decide what the future will be and build it. You're thinking in a PC-centric manner, this doesn't happen here.

Take STI (Sony-Toshiba-IBM) for example. They decided in 2001 what PS3/Cell will entail. They then set about outlining the architecture (finished that year) and spent over $1.5 billion to define the lithography development (up to $8B total), as seen in my post here:

http://www.beyond3d.com/forum/viewtopic.php?t=7188

Your argument is sound to some extent in the PC market (but even that is debatable - as we've seen here). In the console market, it holds no weight at all.

PS. Ohh, and Joe - lithography is everything...
 
I think what is trying to be said in all of this is that Nvidia is trying to make advancements; now I'm not saying ATI doesn't either, but Nvidia seems more interested in pushing the boundaries than ATI.

More or less here's what happened: Nvidia had a very good core in the making, the NV30. Some will even argue it was architecturally greater than the R300. However, whilst pushing 130nm, Nvidia fucked up. Big time. It happens, and unfortunately in the GPU world, where we have new cards coming out every 6 months, one huge failure can set you back for a long time.

This allowed ATI to basically just walk their way over Nvidia and their messed up GPU.

Is ATI the performance leader now? Yes they are, do I like ATI more than Nvidia? You bet I do.

However, I can't help but think that maybe Nvidia is setting themselves up for the future with advanced lithography advancements, which ATI doesn't seem to be doing.
 
Well, SCE has shown with the PS2 that shrinks can occur all the way from 250nm to 90nm (GS&EE@90nm). So if you have the engineering resources of a vertical company like Sony, the process that you launch your console on is fairly irrelevant. Of course, as the Xbox is still on 150nm and unlikely to change, this does not hold true for all the players involved ;)
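
(A crude illustration of why those shrinks matter so much for console economics - purely back-of-the-envelope, assuming die area scales with the square of the linear feature size, which pads, analog blocks and re-layouts all complicate in practice.)

```python
# Crude, illustrative estimate only: assumes die area scales with the square
# of the linear feature size, ignoring pads, analog blocks and layout changes.

def shrink_ratio(launch_nm: float, shrink_nm: float) -> float:
    """Approximate fraction of the original die area after a full shrink."""
    return (shrink_nm / launch_nm) ** 2

if __name__ == "__main__":
    ratio = shrink_ratio(250, 90)
    # Prints roughly 13% -- a 250nm design re-laid-out at 90nm takes about an
    # eighth of the silicon, which is where the cost reduction comes from.
    print(f"250nm -> 90nm: ~{ratio:.0%} of the original die area")
```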

edit: and for all you people jumping at Vince, realize that he is arguing within the confines of a CONSOLE spectrum, so 'ATI SHADARS AER FASTAR ON R9800P' comments are a waste of bandwidth ;)
 
Vince said:
Excellent. Now, while avoiding the extra layers of factors that have no relevance to the design of a closed-box IC, answer me just one more question:

  • All things being equal, what lithography process is more advantageous to use when designing a closed-box console with a lifespan of 5 years - 150nm or 130nm?

This was already addressed. All things are never equal. A "closed box console" with a life-span of 5 years makes little difference.

The appropriate approach ("push an older more mature process" vs. "push a new, bleeding edge process") can depend highly on what factors the console maker gives more weight to.

For example: Is the manufacturer more tolerant of a delay in launch, or of a lower-spec'd part? The fact that the "specs are closed" and don't change over the life of the product can work AGAINST a bleeding-edge lithography, depending on the vendor's priorities.

Going with bleeding edge lithography has more potential for delays. Going with an older more mature process has more potential for not being as feature rich (due to transistor budget.)

Furthermore... what console these days doesn't "refresh" its chips during its lifecycle, for cost reasons? One way to hedge your bets is to push a "mature" process at launch (going with a large, expensive die has the risk of not being as feature-rich or hitting the same performance level as a more advanced process, but has a higher probability of being delivered on time), and then refresh it a year or two later on an advanced process to reduce costs.

An illustration.

What if x-box2 was due for a Fall '02 launch? R-300 and NV30 were the candidate chips? (Or R-30a and NV3a if you must).

If Fall '02 ship date was of utmost importance, then R-300 would be the better choice. An NV30 at that time would have vastly reduced specs to be shippable and the console would be stuck with those reduced specs for 5 years.


Just so you know, I will not respond to these lines of argument again. At least not while we're talking about console-based graphics.

Again, this doesn't make sense. PCs and consoles both use these things called graphics chips. IHVs are all concerned about price, performance, features, deliverability, power consumption, etc.

PC vendors refresh chips for more and more power, features, and cost reduction.

Console vendors refresh chips, but only for cost.

And as you can tell by my presence (or lack thereof) in the PC Forum, I (personally) really have no interest in them.

Which would explain your lack of seeing the relevancy of it.

Of course it's a good thing, it's a great thing - it just has absolutely nothing (on any level) to do with this debate.

If you really think that, then this is indeed hopeless. :(
 
Vince said:
I think you know what I'm suggesting. There is no "reading of the market" when you design a console - you decide what the future will be and build it. You're thinking in a PC-centric manner, this doesn't happen here.

There is the same exact reading as was required for the 0.15/0.13 process timing from TSMC, however--knowing what will be best available in the timeframe given them and designing the best possible chip to it.

Vince said:
Take STI (Sony-Toshiba-IBM) for example. They decided in 2001 what PS3/Cell will entail. They then set about outlining the architecture (finished that year) and spent over $1.5 billion to define the lithography development (up to $8B total), as seen in my post here:

Yes, and this is where Nintendo and Microsoft will probably get trounced yet again. Sony and Toshiba before, and with IBM now, invest INTENSE amounts into the Playstation brand and push boundaries almost anywhere they can find them. They started developing 4 years prior to their proposed launch and are feeding billions into its design and production, so they indeed DO get to define where they go. They are forging the ground as they go, and they are paying more than ATI's and nVidia's market value COMBINED to do it.

Thing is, neither ATI nor nVidia are the control factors for their respective consoles--Microsoft and Nintendo are. They choose their designs, they choose their targets, and THEY outlay the cash and forge whatever partnerships they desire. If a graphics vendor is only brought in ~2 years before proposed release (what was nVidia's for the Xbox? 18 months?) they don't get to DEFINE their processes, as they aren't on the timescale to--nor can THEY afford to force the process the way the combined weight of Sony/IBM/Toshiba can. This does not reflect poorly on EITHER of them, as the driving force must come from Nintendo or Microsoft. Certainly when companies like nVidia and ATI are intimately involved in projects they can throw their weight around, make demands and push their host farther, but ultimately the will to move comes from outside of themselves. NV2A ended up being not quite what they wanted (of course nVidia was also stretched designing many other areas of the Xbox than the graphics chip), but it's still a fine beast. Flipper was pretty much exactly what Nintendo wanted, and what they wanted was something simple and capable, not needing "the best." (And I believe the development process was a lot longer as well, yet it still met their target aim.)

S/I/T are creating not just processes but PARADIGMS, and neither Nintendo nor Microsoft seems to be willing to push themselves the same way. As it is, they will use ATI for their needs, and ATI seems quite capable of delivering. From Microsoft's end, much will depend on how much THEY want to force on the process and how much slack they're willing to give ATI, as well as how quickly they consolidate their other vendors since they are licensing ATI's tech. And if ATI says "push here-XX and here-YY and we can give you THIS!" and Microsoft likes it enough and THEY are willing to shove, then we may well see some nifty toys comin' out! :D

I guess I can't see why you're so determined to pooh-pooh ATI when so MANY factors are not in their demesne, and the impetus comes from vendors that were not willing to push hard enough the first time, and may not be willing to do so this time, either.
 
Paul said:
However, I can't help but think that maybe Nvidia is setting themselves up for the future with advanced lithography advancements, which ATI doesn't seem to be doing.

This is what I mean, though... We have NO IDEA what they are doing. They have been in pole position for basically one year now, and we have seen only the current cycle from either of them. We can only wait to see what happens when the next major chip designs roll out.

zurich said:
edit: and for all you people jumping at Vince, realize that he is arguing within the confines of a CONSOLE spectrum, so 'ATI SHADARS AER FASTAR ON R9800P' comments are a waste of bandwidth ;)

Well, actually, should Microsoft have wanted to push DX9 functionality and shader performance in their console, would they not look for the vendor that could best provide it to them?

Considering the Xbox CONSOLE uses what is in essence a PC graphics card in it, claiming no comparisons should be made to the PC realm seems rather asinine.

Theoretically, SHOULD there be? IMHO, no. I'd rather see consoles divorced from PC architecture entirely, pushing the boundaries THEY see as most optimal, and dealing with none of the limitations. (Which is why the PS2 and PS3 intrigue me so much. :) ) But we all know what route Microsoft took with the Xbox, and seems likely to take again with its follow-up. It would seem PC analogies still carry some weight.
 
Paul said:
I think what is trying to be said in all of this is that Nvidia is trying to make advancements; now I'm not saying ATI doesn't either, but Nvidia seems more interested in pushing the boundaries than ATI.

We're saying that ATI and nVidia are (currently at least) pushing different boundaries.

This allowed ATI to basically just walk their way over Nvidia and their messed up GPU.

No, that could not have happened if ATI did not push 0.15 as they did.
 
Joe DeFuria said:
This was already addressed. All things are never equal. A "closed box console" with a life-span of 5 years makes little difference.

The appropriate approach ("push an older more mature process" vs. "push a new, bleeding edge process") can depend highly on what factors the console maker gives more weight to.

For example: Is the manufacturer more tolerant of a delay in launch, or of a lower-spec'd part? The fact that the "specs are closed" and don't change over the life of the product can work AGAINST a bleeding-edge lithography, depending on the vendor's priorities.

Going with bleeding edge lithography has more potential for delays. Going with an older more mature process has more potential for not being as feature rich (due to transistor budget.)

Furthermore... what console these days doesn't "refresh" its chips during its lifecycle, for cost reasons? One way to hedge your bets is to push a "mature" process at launch (going with a large, expensive die has the risk of not being as feature-rich or hitting the same performance level as a more advanced process, but has a higher probability of being delivered on time), and then refresh it a year or two later on an advanced process to reduce costs.

An illustration.

What if x-box2 was due for a Fall '02 launch? R-300 and NV30 were the candidate chips? (Or R-30a and NV3a if you must).

If Fall '02 ship date was of utmost importance, then R-300 would be the better choice. An NV30 at that time would have vastly reduced specs to be shippable and the console would be stuck with those reduced specs for 5 years.

OMG! :oops: All that to answer what I'm hoping Dave will do with one word!

Your argument is beyond ridiculous in its attempt to avoid admitting that lithography is a huge factor in designing a set-piece IC for a closed box.

You're invoking arguments based on 'what if this in 2002' and 'what if that', all hinging on small development cycles and quick TTM as seen in the PC world. Which you, yourself, Joe DeFuria, made irrelevant with this comment:

Joe DeFuria said (http://www.beyond3d.com/forum/viewtopic.php?t=7406&postdays=0&postorder=asc&start=240):
This is exactly why I've argued in the past (usually related to IMGTEC) that I feel the IP licensing model is FINE for the console market, but I do not like it for the PC market. Console designs flip over every 5 years or so, meaning you have more time to plan and sort things out. The PC race is break-neck, and having a more true fabless semi-con model is more beneficial.

Alrighty then. What you're doing is backpedalling, my friend. Your argument is segmented and thus futile. I shall re-ask my question to you in light of these previous comments:

  • All things being equal, what lithography process is more advantageous to use when designing a closed-box console with a lifespan of 5 years during an extended development cycle, as Joe DeFuria stated - 150nm or 130nm?

Simple question. I'm expecting a one word answer.

Again, this doesn't make sense. PCs and consoles both use these things called graphics chips. IHVs are all concerned about price, performance, features, deliverability, power consumption, etc.

PC vendors refresh chips for more and more power, features, and cost reduction.

Console vendors refresh chips, but only for cost.

Console makers also design monstrous and dedicated ICs that push lithography far. Here's an example:

(image: http://pc.watch.impress.co.jp/docs/2003/0421/sony1_03.jpg)

Cost and initial manufacturability are less of a concern for a console IC as they'll normalize over the lifetime of the unit. Thus, you're very dependent upon pushing lithography and manufacturability. See here:

(image: http://pc.watch.impress.co.jp/docs/2003/0421/sony1_16.jpg)

Thus, as stated above: with a known development cycle of 5 years, you know what you need and push to get it. You are still thinking this is the PC market with 6-month refreshes on technology. You have one shot for 5 years - you kick ass to get there (eg. SCE's $8 billion and Nintendo's several).
 
Vince said:
You're invoking arguments based on 'what if this in 2002' and 'what if that'. Which you, yourself, Joe DeFuria, made irrelevant with this comment:

You forget he's invoking what CAN be done--probably what SHOULD be done (and what Sony/Toshiba/IBM are doing in massive amounts), but this does not mean that all console makers are DOING it. Nintendo spent the time, but did not want to push. Microsoft seems willing to invest more in hardware loss and marketing than console design--they certainly didn't do it for the Xbox. This does not reflect poorly on nVidia, as they pretty much did a bang-up job with all they were allotted.
 
Vince said:
Your argument is beyond ridiculous in its attempt to avoid admitting that lithography is a huge factor in designing a set-piece IC for a closed box.

Um....that's nice.

Now all you have to do is tell me how you reached the conclusion that I don't think or won't "admit" that lithography is a huge factor for ANY IC, including a closed box with a 5-year life cycle... then maybe the rest of your post is worth reading.

Please, read my post again.

Hint: Lithography is CERTAINLY a huge factor, but not the ONLY factor.
 
Joe, I don't care. This argument is over. I think it's apparent that lithography is one of the preeminent deciding factors in a closed console - which is what I wanted to state in the beginning. It's very reasonable that you want a company that pushes their designs hard (as evident by the 130nm vs. 150nm question and PS2 precedence) and that the advantages of advanced lithography in a closed-box IC far, very far outweigh the disadvantages.

Thus, unless someone can tell me how an IC developed without cost concerns for a development cycle of 5 years can be better off utilizing a less advanced process (eg. no SOI/SS/Low-K, etc) - this is over.

The rest of the argument is all IMHO and falls into line with historical precedence. That post by cthellis42 is great - and I agree very much with it. While the needs of MS might be met, I'm still disappointed (as are others), but it's only a console and it doesn't change my life.
 
Alrighty then. What you're doing is backpedalling, my friend. Your argument is segmented and thus futile. I shall re-ask my question to you in light of these previous comments:

  • All things being equal, what lithography process is more advantageous to use when designing a closed-box console with a lifespan of 5 years during an extended development cycle, as Joe DeFuria stated - 150nm or 130nm?

Simple question. I'm expecting a one word answer.

One word: Depends.

Correct answer: all else cannot be equal by definition, so your question is either irrelevant or ignorant.

Or is the 65nm process available as I type?

Console makers also design monstrous and dedicated ICs that push lithography far. Here's an example:

Now you show me how that applies to Microsoft's model. Last time I checked, Microsoft isn't designing "monstrous and dedicated ICs". They're more or less picking parts off the shelf. (Of course, X-Box2 might be different than X-Box 1, but given the lead time of the ATI deal... not likely to be that different.)

As cthellis42 has been trying to tell you, these are completely different models. Each of which has its own risks, advantages, and disadvantages. Sony's requires HUGE investments and lots of lead development time, but offers great flexibility and potential for "breakout". MS's has a lot less flexibility, a lot less risk, but less potential for some type of paradigm shift.

Thus, as stated above: with a known development cycle of 5 years, you know what you need and push to get it.

So, ATI has 5 years to get the X-Box 2 chip out the door? So did nVidia?

You are still thinking this is the PC market with 6-month refreshes on technology.

You are thinking of the MS and X-Box model like it's that of Sony....

You have one shot for 5 years - you kick ass to get there (eg. SCE's $8 billion and Nintendo's several)

Which is why the X-Box GPU is a miserable failure? It only had 18 months or so from development to production? I thought the X-Box console and GPU were holding their own quite well...
 
Vince said:
Joe, I don't care. This argument is over. I think it's apparent that lithography is one of the preeminent deciding factors in a closed console - which is what I wanted to state in the beginning.

I think it's one of the preeminent deciding factors in any IC. That one factor alone carries all kinds of implications for cost, time to market, transistor budget, power consumption, etc.

It's very reasonable that you want a company that pushes their designs hard (as evident by the 130nm vs. 150nm question and PS2 precedence) and that the advantages of advanced lithography in a closed-box IC far, very far outweigh the disadvantages.

It depends on the situation. A simple exercise to (again) illustrate the point.

* According to you 65nm has more advantages, by far outweighing 0.13u or 0.09u in a closed box IC.

* No shipping closed box that I'm aware of currently has 65nm tech.

How can you reconcile those two "truisms"?
 