Official: ATI in XBox Next

zidane1strife said:
No, consoles that come 5 yrs later... are at least two orders of magnitude above their predecessors... they'll be here... likely in about 18-20 months approx...

Zidane...

Have we EVER had anything other than "the latest console when launched, is at or slightly above the current top of the line PC" at the time?

PC and console graphics have evolved very similarly. The PC just takes more, smaller steps, vs. the consoles which take fewer, larger leaps.


But, VERY generally speaking, each new core generation is about "twice the difference" from the last one.

That would make R500 generation roughly 8x that of NV2x/R200.
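
As a quick back-of-the-envelope illustration of that "roughly 2x per generation" compounding (the three-step generation count from NV2x/R200 to an R500-class part is an assumption for illustration, not a spec):

```python
# Rough sketch of compounding the "roughly 2x per core generation" rule of thumb.
# Both the 2x factor and the generation count are illustrative assumptions.
def generational_gain(per_gen_factor: float, generations: int) -> float:
    """Compound a per-generation improvement factor over N generations."""
    return per_gen_factor ** generations

# NV2x/R200 -> R300-class -> R420-class -> R500-class: three generational steps
print(generational_gain(2.0, 3))  # 8.0, i.e. roughly 8x the NV2x/R200 generation
```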

Peak XGPU verts is about 125-150M (IIRC) approx; it has DirectX 8 functionality... is the R500 peak near a billion verts?

Who gives a crap about "peak" anything, other than fanbo*s and "my poly rate is bigger than yours" geeks? It's throughput when doing a "typical game workload" that counts when it comes to delivering game performance.
 
Joe said:
Vince said:
Joe DeFuria said:
If you can't see the blind facts that ATI shipped a 100 million transistor plus processor at 0.15, at the same time the "market leader and your personal saviour" nVidia couldn't get one out on 0.13....what do you think happened?

  • I don't give a shit about nVidia or what they said, get off it. You're becoming obsessive-compulsive.
  • And IBM shipped a 180nm SOI processor with 170M transistors that clocked at over 1.4GHz internally at T.J. Watson. Pixel Fusion was 75M transistors on a 220nm process!

And how is anything you just said relevant to the fact that someone turned out a better product, in an earlier timeframe on 0.15 than someone else in direct competition on 0.13?

Ok, how is an arbitrary number based on just the PC IHVs relevant in any way to a discussion on a Console Forum? In case you don't understand why I keep throwing in these "we don't care about ATI-nVidia" hints - we don't care.

Sony built a 300 million transistor chip on 180nm as an internal R&D project. This is comparable to ATI's so-called "achievement" that you keep talking about. I'm only concerned with lithography and have posted examples of how what ATI did is nothing but a small point, insignificant outside of an IHV pissing contest.

The GS would be a much better caliber chip if it were designed for 0.18. Lithography is everything remember!

My initial post cleared this up. Perhaps you should read better - I was talking open-ended with an arbitrary timeframe declared. These comments have no bearing on this.

Repeat after me: A longer development time does not NEGATE time to market pressures...

The longer Dev time with consoles lowers the risk for an IP licensing situation. It does not mean that time to market is not highly important or even critical, it means that you have more time to work the issues that IP licensing can bring...such that the RISK OF MISSING THE CRITICAL TIME TO MARKET WINDOWS is not as severe with consoles.


Obviously, and I never said this. But, as you did state, the process is elongated significantly and even allows for IP integration utilizing multiple parties involved in the design, synthesis and ultimate production of an IC in your given "5 year" timeframe.

Thus, you can't play it both ways. Through your IP comment, where IP is usable in a console and not the PC, we've realized several things:

  • The Console Dev cycle shares no relation to that of the stereotypical PC cycle
  • The Console Dev Cycle is much longer than the PC cycle

Thus we can only assume that if we're allowed to integrate IP, the cycle is long enough to allow for an analysis of the process technology available at time T when the IC will be released, and to design to that as a goal. Thus, just as it's critical - we've got the probabilities, and with the time of 5 years can even do the process development ourselves or finance it (e.g. Intel, AMD, STI).

Um, Vince....when you decide to produce a chip on a process that won't be up and running for several years, you are EXACTLY designing an IC based around what *might* happen. You'll do your damnedest of course to get it up and running, but that is the real risk.

Dude, I can get a 2nd grader to tell me that. But what they might not be familiar with is the idea of cost and reward based on probability and how the cutthroat marketplace works.

Look, this is very simple: why do you think the IT industry has such a massive R&D channel? Because being on the cutting-edge is what brings success and profit - all the time, no; but enough of the time that it's the de facto standard.



Your argument is grounded in this pseudo-business wannabe-analyst type ideology where you think you're right in balancing costs and risk against your perceived value. But, first of all, you're wrong with respect to R&D in the IT industry (which PS3 will show you first hand) and second, this isn't a financial forum. This is Beyond3D - we talk about technology.

Start a thread up on this in the Microsoft Forum at Fool.com and we can continue this discussion there.
 
Joe DeFuria said:
Have we EVER had anything other than "the latest console when launched, is at or slightly above the current top of the line PC" at the time?

Playstation in 1994 was such a revolutionary product. PS2 was for 2000, but lithography played a large part in the rapid advancement of the PC sector to overwhelm it. The coming of 60M transistor DX8 chips and the huge logic influx with DX9 show this.

Who gives a crap about "peak" anything, other than fanbo*s and "my poly rate is bigger than yours" geeks?

The same kind of "geeks" that post this:

Joe DeFuria said:
If you can't see the blind facts that ATI shipped a 100 million transistor plus processor at 0.15, at the same time the "market leader and your personal saviour" nVidia couldn't get one out on 0.13....what do you think happened?
 
This is Beyond3D - we talk about technology.

And we've already proven - and you've agreed - that great technology doesn't necessarily come with massive process changes.

Vince, I'm getting a number of complaints (both here and privately) about the use of language you are displaying here. Tone it down or the mods will start removing your offending posts. (that includes the rather sad constant references to fanboyism)
 
DaveBaumann said:
nonamer said:
One issue constantly brought up is that lithography isn't everything; we have to look at the market, take timing into account, etc. Ok, let's do that. 65nm will be ready by 2H2005 from all the major fabs, including IBM, TSMC, Intel, etc. (sources: 1 2 3), right at the same time as the next generation of consoles. 65nm is indeed possible and it should be ready at the right time from multiple places. 65nm should not affect timing unless something goes wrong at every fab, but even a 6-month delay to 1H2006 should not be too much trouble, since it is not an 18-month delay like the Xbox 1 was to the PS2.

As stated earlier, ATI prefer not to make major architectural changes at the same time as making a massive process change - they like to trial that process first, or trail the architecture. If (and that's a big if, since I still think there's a hell of a lot more to come from whoever does the back end) the part is spun off the R500 architecture then they will have already made that architectural change, hence they may then choose to target another process.

So... you're saying that ATI will be on 65nm for Xbox-2?

cthellis42: I advise you not to get in between Vince and Joe since they are some of the biggest noise makers on this board. Just sit back and let them duke it out. 8) And maybe let the mods handle them, not you.
 
Vince said:
This is Beyond3D - we talk about technology.

We certainly do. :)

Of course we both talk in theoreticals and in suppositions and in realities... This particular thread has been arguing over theory vs. reality for a while now, but I think the crux of what most people are hitting on is indeed the "realities"...

What good is it to argue "all things being equal" in environments where things are never equal? Especially when we are discussing the future of things that WILL BE REAL!

Vince, I agree with you on many points (and heck, it seemed like you were agreeing with me, too! :) ), but what ARE you insisting on so virulently? Or is this just going to be a Vince/Joe turf war, and I missed out on the earlier stages?
 
cthellis42 said:
Quite honestly, Vince, you just crimped your central argument with this. "...do you target the most advanced lithography process your R&D tells you will be ready..."

Not at all, because the R&D is a direct function of the development pipeline. Just as you're expected to succeed and "beat the industry" in your back-end work - so are they supposed to with the lithography.

Enter the whole argument of why fewer participants in the pipeline are better. Enter semiconductor companies as opposed to IP.

Will be ready! You are bringing timing into what you JUST condensed the argument into. 0.13 WASN'T READY in time for NV30 and the demands nVidia wanted to put on it. 0.13 WASN'T READY in time for NV2A and they had to backslide and change designs. Could nVidia have infused TSMC with more cash to ensure 0.13 was ready for their requirements? Perhaps.

You just answered this yourself. For a given time T, you need to target a process N that satisfies your design needs AND parallels to that of the competition.

It was nVidia's fault for not doing many things concerning lith. Which is why the deal with IBM is what, IMHO, promised to be such a potent force. Which was a basic tenet back in the beginning of this argument. It's entirely self-consistent.
 
nonamer said:
cthellis42: I advise you not to get in between Vince and Joe since they are some of the biggest noise makers on this board. Just sit back and let them duke it out. 8) And maybe let the mods handle them, not you.

Oh, er... Right then. :LOL:

A shame though, as there are a lot of good things to talk about, but my head is hurting from all the scrabbling about.
 
Vince said:
cthellis42 said:
Quite honestly, Vince, you just crimped your central argument with this. "...do you target the most advanced lithography process your R&D tells you will be ready..."

Not at all, because the R&D is a direct function of the development pipeline.

...that nVidia and ATI do not control...

Vince said:
Will be ready! You are bringing timing into what you JUST condensed the argument into. 0.13 WASN'T READY in time for NV30 and the demands nVidia wanted to put on it. 0.13 WASN'T READY in time for NV2A and they had to backslide and change designs. Could nVidia have infused TSMC with more cash to ensure 0.13 was ready for their requirements? Perhaps.

You just answered this yourself. For a given time T, you need to target a process N that satisfies your design needs AND parallels to that of the competition.

It was nVidia's fault for not doing many things concerning lith. Which is why the deal with IBM is what, IMHO, promised to be such a potent force. Which was a basic tenet back in the beginning of this argument. It's entirely self-consistent.

But you've never yet stated if you think nVidia by its lonesome HAS the resources to push litho to its needs. They seem rather small compared to the chip fabs and what is required, and neither ATI nor nVidia makes the profits to invest enough and be a serious driving force. Neither makes the business connections to push it to the monolithic levels S/I/T have, or that companies like Intel do all on their own. So either both are to be blamed, or both have to make the best use of the resources available to them. Either way, the motive force has to come from outside them.
 
cthellis42 said:
Vince, I agree with you on many points (and heck, it seemed like you were agreeing with me, too! :) ), but what ARE you insisting on so virulently? Or is this just going to be a Vince/Joe turf war, and I missed out on the earlier stages?

Nah, I have no problem with Joe. We agree on many things. His opinion just isn't valid IMHO to succeed in the console sector. He brings with him a lot of PC "baggage" and it interrupts the conversation. Furthermore he's had several of his points totally routed: my favorite being when he compared the performance of the XBox to PS2 and I showed how an equal lith plateau at 150nm would upset that drastically in the PS2's favor.

What this comes down to (and you rightly stated) is that a company like Sony is going the extra mile and ensuring that their lithography is on time and better than the industry's. Microsoft isn't, and it'll reflect on ATI (who I already contend is slower in adoption). As I showed with the sheer size of the GS and EE at 250nm, SCE will make the ICs freakin' huge if they have to. Just imagine a 300mm^2 IC at 65nmSOI.

Again, lithography is everything. An IC at 65nmSOI/SS will rape a same-sized die at bulk 90nm - just due to the sheer logic and performance increase. The problem is getting there, and that involves living on the edge and putting the resources behind it. Joe is too busy showing the difficulty in getting there (like a typical bean-counter) instead of focusing on the technology - which is why we're here. And shit, if STI will dump $8 billion to produce a superior product, then I don't care what they went through to get there. Because, as a consumer, I demand the superior product... at time T.

PS. You've had some excellent posts, very nice indeed. I'll get back to you one way or another with responses or dialogue.
 
cthellis42 said:
Oh, er... Right then. :LOL:

A shame though, as there are a lot of good things to talk about, but my head is hurting from all the scrabbling about.

That's BS. Chat about whatever you want, this isn't "Our" forum or "Our" thread. Besides, your conversation will almost certainly be better than ours. We can deal with it intermixing with our conversation, it's np. Don't ever yield your thinking for someone.
 
Ok, how is an arbitrary number based on just the PC IHVs relevant in any way to a discussion on a Console Forum?

What "arbitrary" number do you speak of? And these PC IHVs make console GPUs if you haven't noticed.

In case you don't understand why I keep throwing in these "we don't care about ATI-nVidia" hints - we don't care.

And yet you continually bring them up yourself. How quaint!

Sony built a 300 million transistor chip on 180nm as an internal R&D project. This is comparable to ATI's so-called "achievement" that you keep talking about.

You just have no concept of the relevancy behind that achievement. This is not to compare ATI's "abilities" vs. Sony's. That's irrelevant. It's certainly not to compare an actual shipping product with some R&D effort, which are bound by totally different constraints.

The concept is simple: grasp it: Lithography is not EVERYTHING.

I'm only concerned with lithography and have posted examples of how what ATI did is nothing but a small point, insignificant outside of an IHV pissing contest.

I'm concerned about the ultimate finished product. This includes choices made with respect to lithography, memory, time to market, cost, etc.

The GS would be a much better caliber chip if it were designed for 0.18. Lithography is everything remember!

My initial post cleared this up. Perhaps you should read better - I was talking open-ended with an arbitrary timeframe declared. These comments have no bearing on this.

Yes, just toss out arguments out of sheer embarrassment. Wise choice. ;) An "open-ended, arbitrary timeframe?" Yes, because that's how all of these manufacturers, console or PC, operate.

I hear they also print their own money, and *shhhh*, they know where Jimmy Hoffa is too!

Obviously, and I never said this.

You directly accused me of it.

But, as you did state, the process is elongated significantly and even allows for IP integration utilizing multiple parties involved in the design,

Correct. Just as Sony, IBM, and Toshiba can partner up. The long time frame lessens the RISK that roadblocks would be really damaging. You have MORE TIME to RECOVER from any issues, and still meet a deadline. Do you not get this simple concept?

It's not that time to market isn't or can't be critical!! It's that with a longer dev cycle, you have more flexibility in terms of how you get there. Longer time frames are more forgiving of mistakes along the way...NOT that longer time frames make missing the ultimate ship date itself any more forgivable.

Please, think on this some more before you re-quote that IP statement of mine for the 10th time, every time not furthering your cause.

  • The Console Dev cycle shares no relation to that of the stereotypical PC cycle


  • Wrong.

    There is a relationship...in particular with the MS / ATI / nVidia model. Of course overall they are clearly NOT the same at all. But this is not to say there is no relationship.

    Namely, the GPU vendor has roughly the same amount of time to take a design from the drawing board to production as they would a fresh-slate PC chip.

  • The Console Dev Cycle is much longer than the PC cycle

Yes, but not necessarily the GPU dev cycle. Depends on the model.

Um, Vince....when you decide to produce a chip on a process that won't be up and running for several years, you are EXACTLY designing an IC based around what *might* happen. You'll do your damnedest of course to get it up and running, but that is the real risk.

Dude, I can get a 2nd grader to tell me that.

Dude, it's getting you to understand and acknowledge it that's the trick.

Look, this is very simple: why do you think the IT industry has such a massive R&D channel? Because being on the cutting-edge is what brings success and profit - all the time, no; but enough of the time that it's the de facto standard.

Why do you continue to ignore the fact that there's more than one way to be on the cutting edge?

Your argument is grounded in this pseudo-business wannabe-analyst type ideology where you think you're right in balancing costs and risk against your perceived value.

It's not a matter of me thinking I'm right...every successful business does this. Your argument isn't based on anything at all.

You think Sony is just going after 65nm cell because "that's the risky thing to do?" Are you just that naive? They are doing it because they calculate that the rewards will be worth the risk involved (and they have the financials to be able to undertake that risk)....yes...using some pseudo-business-analyst-type ideology.

But, first of all, you're wrong with respect to R&D in the IT industry

What did I say about R&D in the IT industry? And how is it "wrong?"

This is Beyond3D - we talk about technology.

This is Vince, I talk with one foot in my mouth....
 
Vince said:
Furthermore he's had several of his points totally routed:

A legend in your own mind?

my favorite being when he compared the performance of the XBox to PS2 and I showed how an equal lith plateau at 150nm would upset that drastically in the PS2's favor.

That would be my favorite point of saying: if it would be an upset so drastic in PS2's favor...why didn't they target THAT process, and delay a little bit?

Again, lithography is everything.

Except when it isn't, as has been proven time and time again.
 
Joe DeFuria said:
The concept is simple: grasp it: Lithography is not EVERYTHING.

Thought experiment:

If at time T, Fab A is yielding xx% on both their 65nm and 90nm processes:

  • Which process would give Semiconductor Developer B better performance?
  • Which process would yield a superior product in terms of transistors? Can you even name rough estimates of what you're fighting over?
  • What process will yield, in the time of computation-limited computing, the better performing chip based on static designs aimed at a single standard?
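
For rough numbers behind that thought experiment: with die area and yield held equal, ideal geometric scaling alone gives about (90/65)^2 ≈ 1.9x the transistor budget at 65nm. A minimal sketch of that arithmetic, with every figure an illustrative assumption (real-world scaling falls short of the ideal):

```python
# Illustrative only: ideal (geometric) density scaling from 90nm to 65nm with
# equal die size and equal yield, per the thought experiment above.
# Real processes scale less than ideally (SRAM vs. logic, design rules, etc.).
old_node_nm = 90.0
new_node_nm = 65.0
die_area_mm2 = 200.0                     # assumed, same die size on both nodes
transistors_per_mm2_at_90nm = 600_000    # assumed round number, not a real spec

density_gain = (old_node_nm / new_node_nm) ** 2            # ~1.92x
budget_90nm = die_area_mm2 * transistors_per_mm2_at_90nm   # ~120M transistors
budget_65nm = budget_90nm * density_gain                    # ~230M transistors

print(f"Ideal density gain 90nm -> 65nm: {density_gain:.2f}x")
print(f"Transistor budget at 90nm: {budget_90nm / 1e6:.0f}M")
print(f"Transistor budget at 65nm: {budget_65nm / 1e6:.0f}M")
```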

Post-menopausal Joe said:
Vince said:
This is Beyond3D - we talk about technology.

This is Vince, I talk with one foot in my mouth....

Can I use this?
 
Joe DeFuria said:
That would be my favorite point of saying: if it would be an upset so drastic in PS2's favor...why didn't they target THAT process, and delay a little bit?

Vince said:
http://www.beyond3d.com/forum/viewtopic.php?t=7406&postdays=0&postorder=asc&start=340 - He is backpedaling, and for this reason. My initial comment indicated that you need to "push the process to the edge and beyond into the realm of poor yields with the understanding that future lithography will bring the yields and costs under control."

The comment is time-indifferent. You can push process N in year Y looking to stabilize yields at process N+1 in year Y+1.

Or you can push process N+13 in year Y+16, looking forward. It's insignificant, as the design team has a set launch window and can target process N for that period T - thus being open-ended, and just a fundamental rule that I'm speaking of.

Joe, on the other hand, can't comprehend this, nor is he willing to because of some of the comments he's been caught up on concerning the question posed: What process is better, all things equal?

Thus, this is just a charade in which he's adding context to support his case - not a general basis for study or understanding, which is what I've stated on multiple occasions I'm after.

Perhaps you missed this. It's time-indifferent, based around a given launch window, Y. This ideology of mine is based around a post from around 11 pages ago - way before you even started this line of argument.

As per the praxis of your question, Sony had a given launch window. Their Y was 2000 with a 2H 1999 ramp-up. This yielded an N of 250nm due to their eDRAM requirements. They then took this process, which was advanced for that period Y, and pushed it until it approached 300mm^2. Which is unbelievably massive for an IC. They are the quintessential company and case when it comes to this (well, perhaps the I-32 GS is).

Your repetitious comment is fallacious.
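
To put that ~300mm^2 die size in perspective, here is a rough dies-per-wafer estimate using the standard first-order approximation; the 200mm wafer diameter and the 50% yield are assumptions for illustration, and only the ~300mm^2 figure comes from the post above.

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard first-order approximation for gross die per wafer (ignores scribe lines)."""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# A ~300 mm^2 die on a 200mm wafer (the common wafer size of that era) --
# both the wafer size and the 50% yield below are illustrative assumptions.
gross = gross_dies_per_wafer(200, 300)
print(gross)       # ~79 gross candidates per wafer
print(gross // 2)  # ~39 good die at an assumed 50% yield
```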
 
Vince said:
Thought experiment:

If at time T, Fab A is yielding xx% on both their 65nm and 90nm processes:

That's an "interesting" premise to begin with, because it assumes that both processes are at the same state of maturity, rather than one being on the "bleeding edge" that you purport is "the only right way to go." So already, you are showing evidence of the "all things being equal in theory, but they're not in reality" syndrome.

  • Which process would give Semiconductor Developer B better performance?

The imaginary bleeding edge 65nm process that's yielding the same as the mature 90nm one.

  • Which process would yield a superior product in terms of transistors?

Same as above.

  • What process will yield, in the time of computation-limited computing, the better performing chip based on static designs aimed at a single standard?

Same as above.

Why do you insist on restating the same old "all things being equal" argument, when it has been debunked by everyone that's cared to comment on it in this thread?

This is Vince, I talk with one foot in my mouth....

Can I use this?

Sure...I thought it had a sort of nice ring to it myself. ;)
 
Vince said:
Nah, I have no problem with Joe. We agree on many things. His opinion just isn't valid IMHO to succeed in the console sector. He brings with him a lot of PC "baggage" and it interrupts the conversation.

Except that, as I've noted and you haven't done much denying, PC comparisons can certainly leak in. The scale is of course different, but many fundamentals still apply. We can see many of them in the Gamecube and Xbox, and these are the other two players in the console world.

I rather share your views that their approaches will not ultimately be successful in displacing Sony, but that does not mean they didn't HAPPEN, nor that comparisons won't apply again. Nor does it in any way define the whole strength of a platform. Nor do I think the whole situation cannot possibly turn on its head in a moment's notice... :)

Vince said:
Furthermore he's had several of his points totally routed: my favorite being when he compared the performance of the XBox to PS2 and I showed how an equal lith plateau at 150nm would upset that drastically in the PS2's favor.

Excepting that what I read of his stance--and agree with--is that the PS2 could not have accomplished their design on 0.15 in time for their desired launch, and that would have thrown off the entire balance of EVERYTHING--hardware, software, marketing, you name it.

Vince said:
What this comes down to (and you rightly stated) is that a company like Sony is going the extra mile and ensuring that their lithography is on time and better than the industry's. Microsoft isn't, and it'll reflect on ATI (who I already contend is slower in adoption). As I showed with the sheer size of the GS and EE at 250nm, SCE will make the ICs freakin' huge if they have to. Just imagine a 300mm^2 IC at 65nmSOI.

And quite probably all of us in this thread agree equally that this approach will most likely triumph. But this approach also doesn't reflect on ATI or nVidia so much as on Microtendo.

I also don't agree with your other contention (that ATI is slower in adoption), because you're reading a BROAD history that you have previously stated should not matter at all. We can go right back to the time nVidia couldn't have kept up with heavier weights, but it consolidated and planned, and pulled very much to the forefront. ATI has always been large, but in the past 1-2 years they seem to have done similar consolidation and serious planning that has enabled them to succeed with many lines and narrow the gap between themselves and nVidia. (Now dominating certain areas they concentrated on first.)

What you read as being "timid" I see as being smart, as R300 did not suffer as NV30 did and came out well in advance. Their 0.13 process was out on the heels of nVidia's, only it seems also with the same advantages R300 had: greater performance than nVidia's at its scale, and higher yields from the same fab. It still reads like you're blaming ATI for not doing the SAME EXACT fuck-up nVidia pulled with NV30 (or perhaps they would have only screwed up SLIGHTLY less, with better engineering), whereas I would say whatever company actually FUCKS UP is more the fool.

The ONLY WAY we are really going to judge how nVidia and ATI both handle themselves in the future is if we have knowledge NOW of their design parameters for the years to come (and last time I checked my pants, I don't), or by waiting on info of the next generation--when it slowly leaks out--and trying to measure the trend for OODLES more speculation! :)

Vince said:
Again, lithography is everything. An IC at 65nmSOI/SS will rape a same-sized die at 90nm - just due to the sheer logic increase. The problem is getting there, and that involves living on the edge and putting the resources behind it. Joe is too busy showing the difficulty in getting there (like a typical bean-counter) instead of focusing on the technology - which is why we're here. And shit, if STI will dump $8 billion to produce a superior product, then I don't care what they went through, because as a consumer I demand the superior product... at time T.

PS. You've had some excellent posts, very nice indeed. I'll get back to you one way or another with responses or dialogue.

It seems more like Joe, as with myself and many others, doesn't particularly blame ATI for something where they have weight--and can certainly give direction--but don't have at ALL the resources or the business partnerships to do what you propose. Nor nVidia.

Vince said:
That's BS. Chat about whatever you want, this isn't "Our" forum or "Our" thread. Besides, your conversation will almost certainly be better than ours. We can deal with it intermixing with our conversation, it's np. Don't ever yield your thinking for someone.

I know it's BS. You just weren't responding to anyone but Joe, so I had to get your attention SOMEhow... ;) Have no fear, I'm happier now. *chuckles*
 
Vince said:
As per the praxis of your question, Sony had a given launch window....

Stop there, Vince.

The point is Sony has a given launch window for a reason. I assume you know the obvious reason. The reason is...Sony - again, someone doing some pseudo-business-analytical guesstimation - predicts that no matter how much of a bang-up console they can make at a later date, the RISK of losing the market to a competitor is too great if that window is missed.

Sony would of course LOVE to milk the PS or PS2 for 10 years between generations and suck up royalties and simply decrease hardware cost in the interim. Take longer times between the massive R&D expenditures....but they know, as you and I do, the market would not reward that due to competition.

Vince, "all else being equal", more advanced lithogrpahy is better. No argument, dude. Never has been.

It's just that all else is not equal.

If you can just admit that, we can call it a day.

Whaddya think?
 
Joe DeFuria said:
That's an "interesting" premise to begin with, because it assumes that both processes are at the same state of maturity, rather than one being on the "bleeding edge." So already, you are showing evidence of the "all things being equal in theory, but they're not in reality" syndrome.

Yields generally normalize fairly quickly. This is a thought experiment; we are debating technology.

The imaginary bleeding edge 65nm process that's yielding the same as the 90nm one.

It's hardly "imaginary" - the lines are being assembled now. Much more real than, say, the R400.

Same as above.

Ahh, so unwilling to comment. Typical.

Same as above.

Yup, figured.

Why do you insist on restating the same old "all things being equal" argument, when it has been debunked by everyone that's cared to comment on it in this thread?

They're not debunked; you just refuse to give in to my original argument - which, if you'd care to go back and read, was concerning the theoretical approach to IC design and the importance of lithography. Which you refuse to comment on... and it's obvious why.
 