Official: ATI in XBox Next

Vince said:
Quite frankly, it's comments like this that caused this problem. I was never talking about a PC product. This was abundantly clear from the beginning and only echoed again and again.

No, again Vince, you were drawing the arguments that you view as supporting your point of view from the PC space as well – your references to the precedents set by the history of NVIDIA's PC parts, the "Where's R400" comments, and a myriad of other points – you were the one who drew those into the discussion, only to discard them 10 pages later. By your own rule (now) these elements that you brought in have no place in this discussion, so why did you mention them in the first place?

The only true instance is the NV2A, but that just shows the downfall of adapting a PC part for the console platform.

And there you go again – the fact that the part was brought in from the PC space has no bearing on whether the process was ready or not. It's clear that at that period very few companies had 130nm ready (inclusive of Sony).

Exactly, which is why I said this to you:

These aren't special cases – we've now illustrated several occurrences where process has not been the overriding factor, and yet you have subsequently continued with the same line of reasoning (??).

Again, where did he, or anyone else, ever disagree in the first place that process was important? You are the one that suggested other people didn't think that – as you did to me.

But no... people fight on stupid stuff that isn't even related.

Because quite a lot of the time you have drawn these elements in.
 
nonamer said:
Dave: You've just said some interesting things. Please elaborate.

WRT ATI's current thinking on process use: they do not like trying to combine a new process with the introduction of a massively new architecture, so they currently state they prefer to test one or the other first. For instance – R300 was based on the 150nm process, which they had good knowledge of from all their previous parts, and rather than being "Vince's bold" and going to a new process they were bold in taking that mature process they already knew and pushing it to the limits with their new architecture; so only one variable was changed. Subsequent to that they have gone ahead and utilised the 130nm process on RV350, which utilised the R300 architecture but obviously uses the newer process, and they implemented that successfully with fewer yield issues than their competition. Now, it would be a wise idea to look at whatever processes they are using now and in the near future to give an indication of what process R420 is based on.

In the case of the XBox part – we'll make the assumptions that R500 is 90nm, the XBox part is a revision of R500, and ATI are doing the layout. If this were the case then ATI would have already trialled the R500 architecture on 90nm, so that takes away one of their factors, which could then pave the way for producing subsequent revisions of the architecture on 65nm (should the process be available in a suitable timeframe).

Whether ATI's policy of "trial one factor first" will last is another question – it is what they are saying now, but it doesn't necessarily always have to be the case.

cthellis42 said:
I rather think MS, as they did before with nVidia, will have to exist primarily within ATI's PC gameplan.

My honest reaction is that it's too early to really say that this is the case… ;)
 
Vince said:
AzBat said:
With your reasoning it seems you believe it is required while Joe believes it's preferred, which is why he still seems to believe...

It will be required to compete against a competitor which does take the chance. For example, when the Graphics Synthesizer was first in silicon back in 1998, the best PC part was a TNT2 or Avenger.

XBox1 skewed the perspective since they [Microsoft] launched approaching two years after Sony first did. This time, for economic reasons (which I was hoping to avoid) they will launch in similar windows (as per MS comments) - thus process technology becomes vital. And when a competitor (e.g. STI) is working hard on an advanced process and has a history of fully utilizing it to make enormous ICs - you start to see why lithography is very important.

V3 said:
AzBat said:
Personally I would have to agree with Joe on this and I don't think I'm the only one either.

In consoles, designs are locked for five years. Improvements in lithography are the popular way to cut cost, thus I think it is everything in the console space.

I am sure, after the Xbox experience, MS is also aiming to reduce cost fast for Xbox2.

Vince/V3,

I agree that lithography is important, but I still do not agree it's required or everything, as you say. With that said, I doubt we will ever see eye to eye on the lithography issue. So this is my last post on this issue.

Tommy McClain
 
Vince said:
With your reasoning it seems you believe it is required while Joe believes it's preferred, which is why he still seems to believe...

It will be required to compete against a competitor which does take the chance. For example, when the Graphics Synthesizer was first in silicon back in 1998, the best PC part was a TNT2 or Avenger.

Why "for example" here? Don't you keep trying to deny PC parallels? Back in '98 the PS2 had no competitor other than the Dreamcast--I'm not even sure if MS had announced their intentions to enter the console arena at that time.

So they were "competing" only against their own expectations--how hard they wanted to push themselves and the tech, over their own proposed launch timetables. The EE/GS itself--ALL of it--was developed over the years prior to try to leverage themselves best against the industry that existed, the timeframe they wanted to be out, how they foresaw their future, and how to remain competitive with those to follow until their next generation was to come.

Vince said:
DaveBaumann said:
When your argument didn't hold much water in the PC space you then seemed to shift the tangent to "but we're talking about consoles here"

Quite frankly, it's comments like this that caused this problem. I was never talking about a PC product. This was abundantly clear from the beginning and only echoed again and again.

Except--again and again--you were drawing examples entirely from PC-space to state why ATI would be doing X while nVidia would do Y, and using nVidia's PC track record for supposition Z. If ALL we were to do is take ATI's and nVidia's experiences with the GC and Xbox, that would be fine, but you've repeatedly used PC experience to justify your beliefs and DENIED the parallels other people bring up as irrelevant. I think many of us just wish you'd make up your mind so we could stop sparring on a mudslide.

Vince said:
And then we all see two instances in the most recent rounds of consoles where lithography that was in reach wasn't the be all and end all and other market pressures played a bigger hand.

The only true instance is the NV2A, but that just shows the downfall of adapting a PC part for the console platform. Something I addressed, as did Mfa in his posts on the CPU/dedicated ICs.

I'm not sure how ArtX/ATI suddenly no longer developed Flipper... Or did I miss something and it either has A) no lithography, or B) was never really used in Gamecubes which secretly use the GS instead but are too embarrassed to say.

Flipper may well not have switched litho, but it certainly had other hardware compromises to work through.

Vince said:
The GS and EE are shining examples of this, given that their advanced process, eDRAM, and physical size are the best attributes to show.

I still don't think anyone is truly disagreeing with this, yet the Xbox and Gamecube DO seem to exist, they ARE consoles, and they ARE actively competing with the PS2 even as we speak. They both CAN outperform the PS2 in many ways, and they both ALSO HAVE their own distinct manufacturing models. Each company DOES make different amounts off them, and they DO have different strategies. As for the ultimate future, it IS nice to contemplate, yet we still have to deal with all these pesky things that HAVE happened before and probably WILL happen again, because the products they make are all REAL and UNIQUELY DISTINCT.

Vince said:
But apparently nobody saw it, as everyone started pulling specific one-time situations, that should have no place in a lithography discussion, out and using them as arguing points. Instead of people looking back on lithography and saying, "Yes, having the most advanced process that will allow for upwards of or beyond a billion transistors would be advantageous to an IC designer for a closed box" or even a, "With an increase like that seen in litho of logic gates, it's definitely the pre-eminent decider in computational performance - which happens to be gaining in precedence."

I think it's rather because you proclaim, as if with full knowledge, that ATI's situation was NOTHING but luck, that not a single scrap of what they did was more intelligent or had any market value, and that you know exactly what their gameplan will be for the rest of eternity since it's so utterly obvious.

Certainly that rather came out of the tone and your objections to just about everything mentioned... I can't see how anyone might be put off by that at all, no... :rolleyes:

In that whole situation you pretty much have been seeing what you want, and discounting everything that goes against your preconceived notions. The rest of us, including "Joe the fanb0y," have been watching their decisions on a broader scale, and basically stating "we'll need to see more" before we claim ANYTHING like you toss out with unwavering conviction.

Vince said:
But no... people fight on stupid stuff that isn't even related. They avoid questions that could have ended this long before and bring in topics not even related... makes sense to me. Like I said, this was just an exercise in IHV damage control for a situation that doesn't need it.

Because it seems to many you only say "it doesn't matter" when you haven't built up a cogent argument against something, and freely borrow from those same things "that aren't even related" whenever they support your opinions.

We shall see. :)

Exactly.
 
DaveBaumann said:
cthellis42 said:
I rather think MS, as they did before with nVidia, will have to exist primarily within ATI's PC gameplan.

My honest reaction is that it's too early to really say that this is the case… ;)

Of course it is. That's why I'm "guessing" as opposed to "declaring obvious truths." ;)
 
OK, I'm basically through with this thread then. By all the comments above and the past dozen or so pages (from everyone but Vince), it's obvious that my position is not only understood, but generally agreed with by, well, everyone but Vince....and perhaps Zinde.

So continuing on certainly isn't going to clarify my stance any more than can be expected. It's already crystal clear to most.

I must admit I've never quite seen the "This is a Console Forum" tactic employed by anyone, let alone with such ferocity.

So, Vince, you get the last word if you choose. Go on and rant about how we're all ATI "fanbo*s" against you (the lonely unbiased crusader for truth), how PC parallels destroy any semblance of credibility in a "Console Forum", and then top it off by "boldly" daring me to agree with something I've never disagreed with from the onset.

In semi-related news....just a few hours ago, my son just did his very first "poop" in the potty. How is that related? As you might imagine, it filled me with a range of emotions ranging from utmost pride at the accomplishment...to utter disgust at the practical end result....kinda like this thread. :)
 
Joe DeFuria said:
In semi-related news....just a few hours ago, my son just did his very first "poop" in the potty. How is that related? As you might imagine, it filled me with a range of emotions ranging from utmost pride at the accomplishment...to utter disgust at the practical end result....kinda like this thread. :)

:LOL: <laughs> :LOL:

That was some of the funniest sh*t I've seen on the internet in QUITE some time... Literally! :p :)
 
In a nutshell: You guys are trying to convince Vince of what everyone else realizes is obvious and finding out how insulting he is when backed into a corner. It's just deja vu for me...
 
ah deja vu you say..? :LOL:

Let's just wait and see 2-3 years down the road, see how far PC vs console technology goeth this time. No point prodding the points when none are factually revealed.

All I know is, for techie 3D satisfaction this gen, it has to be Xbox >> DC > PS2. PC vs console... what was seen in the past... whatever for now. Can't wait for CELL though!!! :oops:
 
chaphack said:
ah deja vu you say..? :LOL:

Let's just wait and see 2-3 years down the road, see how far PC vs console technology goeth this time. No point prodding the points when none are factually revealed.

All I know is, for techie 3D satisfaction this gen, it has to be Xbox >> DC > PS2. PC vs console... what was seen in the past... whatever for now. Can't wait for CELL though!!! :oops:


oh god..... chap, u're amazing. really. no words...
 
I'm just waiting to see the functional debating start up again. 'Twere pretty lively, and should at least involve less caterwauling now. ;)
 
Have we EVER had anything other than "the latest console when launched, is at or slightly above the current top of the line PC" at the time?

Yes, many a PC graphics card that came before the arrival of the current-gen consoles managed to push over an order of magnitude more than what the consoles of that time could; IOW they managed to push 10X more than the likes of the PSX/N64 (before the arrival of the PS2 and the like)... but alas, today that is not the case... the gap is shrinking, and the sloweth might cometh...

Who gives a crap about "peak" anything, other than fanbo*s and "my poly rate is bigger than yours" geeks? It's throughput when doing a "typical game workload" that counts when it comes to delivering game performance.

Well, peak is important; it often goes down drastically, but it's an indication of what sort of processing power you're dealing with... I rarely see a 200% improvement upon previous PC GPUs... usually it's 30-50%, and '2-3X as fast as our previous card' is PR 'BS'...

and it seems part of that so-called 2X jump that was taking place is due to some ahem **cough**BM cheating **cough**...

If you'd have told me that the N64/PSX would have had similar or superior b/w to a 2000 GPU, I would have told you blasphemy... today that is not the case...

Recent consoles have managed to sustain a perf. increase of over two orders of magnitude, even with added effects, increased resolution, improved physics, etc... and it appears the ones to cometh will do the same...

By the time 65nm is getting similar yields and volume as 90nm...there's ANOTHER bleeding edge process just starting to come on-line.

Indeed, the nigh-45nm design of Cell can't wait for true 45nm, for it'd arrive later than desired (especially if problems developed); thus the fabs are designed to quickly transition from 65nm to 45nm, to diminish overall transition time, and thus diminish the losses they'll take at 65nm...

0.045 for a while, doesn't that mean more power to the developer who works out how to use it BEST, rather than who gets there FIRST?


Not if the one who got there first gave the years, hundreds of engineers, and $$$ to develop some of the best, while the rest release every 6-12 months with, relatively speaking, poorly funded rush jobs...

Now Company A started developing to hit the cutting edge of it and drive to market before their competitors, but had to make some quick design decisions that impacted their long range outlook and yields and profit margins to get there, claiming perhaps a 6-12 month lead. Company B develops for said process keeping in mind the long term knowledge that they will be stuck on it for 5 years and design to make the best use of the process, and are not forced to make any concessions simply to be the first one there. Company A and B overlap by some 4-4.5 years in this cycle, but B takes better advantage of the technology, made fewer concessions, and garners more profit--having only yielded some time to market. In the intervening 4+ years, which stance would logically have more advantages?

Exactly... rushed, short-term, relatively speaking, h@cks done with some space left in mind for further improvements will be hard pressed to surpass a true multi-year, large R&D program which puts in the effort to fill the space with the top nougat from the start... if the sloweth does cometh... What will happen?
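To put some (entirely made-up) numbers on the Company A / Company B scenario quoted above, here is a toy sketch in Python. Every figure in it is hypothetical; it only illustrates the shape of the trade-off between a 6-12 month head start and five years of worse unit economics:

# Toy model of the Company A vs. Company B scenario above.
# ALL numbers are hypothetical, chosen only to show the shape of the trade-off.
years_in_cycle = 5
units_per_year = 10_000_000
selling_price = 70.0          # USD per chip, the same for both (hypothetical)

# Company A: ships ~9 months earlier, but the rushed design costs more to make.
a_head_start_years = 0.75
a_cost_per_unit = 60.0        # hypothetical

# Company B: ships later, but the cleaner layout yields better and costs less.
b_cost_per_unit = 50.0        # hypothetical

a_margin = (selling_price - a_cost_per_unit) * units_per_year * years_in_cycle
b_margin = (selling_price - b_cost_per_unit) * units_per_year * (years_in_cycle - a_head_start_years)

print(f"Company A margin over the cycle: ${a_margin / 1e9:.2f}B")  # ~$0.50B
print(f"Company B margin over the cycle: ${b_margin / 1e9:.2f}B")  # ~$0.85B

With these invented numbers the later, better-optimised part comes out ahead even after giving up nine months of sales, which is the gist of the argument; swap in different assumptions and the answer flips, which is also the point.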

Taiwan Supah cheese man. you better not fail us at these tougher xxnm times...
 
zidane1strife said:
Yes, many a PC graphics card that came before the arrival of the current-gen consoles managed to push over an order of magnitude more than what the consoles of that time could....

This is how it always is, isn't it?

1) At the time a console is released, it is typically viewed as a little better than the best PCs at the time.

2) In a short period of time, the PC catches up.

3) By the time that generation of console reaches its end of life, the PC is far surpassing it.

Of course, "past performance is not indicative of future events", but I don't see a reason for it to change at this time.

IOW they managed to push 10X more than the likes of the PSX/N64 (before the arrival of the PS2 and the like)... but alas, today that is not the case... the gap is shrinking, and the sloweth might cometh...

I'm not clear on what you're saying here, but when the N64 came out, it was better than any PC....and then came Voodoo Graphics, and the PC was better again...and then came the PS2 and XBox, and they were better, and now we have R300 and NV30...and that's only about mid-way through the console cycle.

You also seem to have a fixation on peak polygon performance, and neglect fill rate.

and it seems part of that so-called 2X jump that was taking place is due to some ahem **cough**BM cheating **cough**...

This is just patently false.

The R300 generation is at least a 2X jump over the R200 generation almost any way you measure it. This is just in terms of raw performance, without even taking into consideration a more advanced feature set.

Recent consoles have managed to sustain a perf. increase of over two orders of magnitude, even with added effects, increased resolution, improved physics, etc... and it appears the ones to cometh will do the same...

Same with PC hardware over the same time frame. I really don't know what you're trying to get at here. How are you measuring an "order of magnitude" anyway? Peak poly performance? Bandwidth? Memory Footprint?

The current crop of PC hardware is certainly superior to the current consoles in just about every respect. This is directly observable when comparing the X-Box to PCs, because they use in essence the same parts. This is not surprising. When the PS3 and X-Box2 are released, they will likely be, at worst, on par with the high-end gaming rigs and likely a bit better, and the cycle will continue.

Indeed, the nigh-45nm design of Cell can't wait for true 45nm, for it'd arrive later than desired (especially if problems developed); thus the fabs are designed to quickly transition from 65nm to 45nm, to diminish overall transition time, and thus diminish the losses they'll take at 65nm...

Right...the same as every generation. Just as with the PC, they transition to new fabs to save cost...but also to put out newer and better products on a more regular basis.
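For what it's worth, here's the rough back-of-envelope on why chasing the next node matters so much for cost. This is a minimal sketch assuming classical full-node scaling (roughly a 0.7x linear shrink per node, so about half the die area for the same design); the 90nm die size is purely hypothetical:

# Rough, illustrative scaling arithmetic. Assumes classical full-node scaling:
# ~0.7x linear shrink per node, i.e. ~0.5x die area for the same design.
linear_shrink_per_node = 0.7

def area_scale(nodes_shrunk):
    # Die area scales with the square of the linear dimension.
    return linear_shrink_per_node ** (2 * nodes_shrunk)

die_90nm = 200.0  # hypothetical die size in mm^2 at 90nm
print("90nm die:", die_90nm, "mm^2")
print("65nm die:", round(die_90nm * area_scale(1), 1), "mm^2")  # ~98 mm^2
print("45nm die:", round(die_90nm * area_scale(2), 1), "mm^2")  # ~48 mm^2

Smaller dies mean more candidates per wafer and better yield, which is exactly why per-chip cost drops so sharply with each shrink once the new process is mature.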

Again, I'm really not sure what overall point you are trying to make. Are you trying to make the case that consoles are all of a sudden now on a faster "evolutionary" path than PC hardware? If that's your stance, then I'll just disagree based on past history and the current situation. If that's not your stance, then could you clarify what it is you are trying to say?
 
zidane1strife said:
Yes, many a PC graphics card that came before the arrival of the current-gen consoles managed to push over an order of magnitude more than what the consoles of that time could; IOW they managed to push 10X more than the likes of the PSX/N64 (before the arrival of the PS2 and the like)... but alas, today that is not the case... the gap is shrinking, and the sloweth might cometh...

Well, peak is important; it often goes down drastically, but it's an indication of what sort of processing power you're dealing with... I rarely see a 200% improvement upon previous PC GPUs... usually it's 30-50%, and '2-3X as fast as our previous card' is PR 'BS'...

and it seems part of that so-called 2X jump that was taking place is due to some ahem **cough**BM cheating **cough**...

If you'd have told me that the N64/PSX would have had similar or superior b/w to a 2000 GPU, I would have told you blasphemy... today that is not the case...

Recent consoles have managed to sustain a perf. increase of over two orders of magnitude, even with added effects, increased resolution, improved physics, etc... and it appears the ones to cometh will do the same...

Indeed, the nigh-45nm design of Cell can't wait for true 45nm, for it'd arrive later than desired (especially if problems developed); thus the fabs are designed to quickly transition from 65nm to 45nm, to diminish overall transition time, and thus diminish the losses they'll take at 65nm...

Not if the one who got there first gave the years, hundreds of engineers, and $$$ to develop some of the best, while the rest release every 6-12 months with, relatively speaking, poorly funded rush jobs...


Exactly... rushed, short-term, relatively speaking, h@cks done with some space left in mind for further improvements will be hard pressed to surpass a true multi-year, large R&D program which puts in the effort to fill the space with the top nougat from the start... if the sloweth does cometh... What will happen?

Taiwan Supah cheese man. you better not fail us at these tougher xxnm times...



I thought the N64 was slightly better than PC graphics of the time, but I didn't have a grip on these things at the time so I could be wrong.... I thought it took the 3dfx Voodoo1 to get N64 kind of visuals, but I'm not sure when they came out, and much less the difference in performance between the two...

When the PS2 came out, it took PCs quite a long time to get the same kind of polygon throughput. Even now, I'm not exactly sure how many PC games push 15 million polygons in game, although PC graphics do have the advantage of more memory, which allows them better and bigger textures, and also display at higher resolutions with AA and AF depending on the performance... but certainly not at the time the PS2 came out, and not for a long time afterwards.

Also, I'm not sure about this since I read about it YEARS ago, but I remember reading that the PS1 had much more bandwidth than PCs at the time, or the bus from the CPU to "something else" was much wider/clocked faster than the fastest Pentium at the time... or something like that...
 
I'm not clear on what you're saying here, but when the N64 came out, it was better than any PC....and then came Voodoo Graphics, and the PC was better again...and then came the PS2 and XBox, and they were better, and now we have R300 and NV30

What I meaneth is that PC graphics cards that came years after the N64/PSX, but prior to the PS2/Xbox, managed a 10X perf jump... today that is not the case... the cards are not even equal in b/w...

...and that's only about mid-way through the console cycle.

Mid-way? We're approximately 18 months away from the likely launch of the PS3... you have 2004, or if you prefer this fiscal year, before the likely arrival of the next-gen consoles... it's hardly MID-way...

You also seem to have a fixation on peak polygon performance, and neglect fill rate

From what I've heard, increasing fillrate severalfold (aka several hundred Gpixels per second) isn't that beneficial for a normal display... and in any case some of the consoles of today do achieve more than an order of magnitude of pixel fillrate increase, IIRC...

measuring an "order of magnitude" anyway? Peak poly performance? Bandwidth? Memory Footprint?

Perform all the tasks of the previous h/w but with 100x increases in T&L, b/w, and the processing specs of the h/w involved, along with added features, resolution, and memory... IOW an increase in overall performance of over two orders of magnitude, with substantial but more down-to-earth increases in other areas...

If that's not your stance, then could you clarify what it is you are trying to say?

If, for example, the arch of, say, the processors in the PS3 is a good/excellent one... it will feature over 2X the transistors that will be viable for a single 45nm part... If the low-level foundries that everyone relies on sloweth (due to the significant increases in costs for shifting to new processes, etc.) and reliance on them continues...

PS

London-boy, past consoles pushed about 100-200K verts; for that to be an order of magnitude you just need 1-2M verts, something that was widely heard...
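The arithmetic on that claim is trivial, but for anyone keeping score, a quick sanity check using only the rough figures already quoted in this thread:

# Quick ratio check on the "order of magnitude" vertex claim above.
# The only input is the rough in-game figure quoted in this thread.
psx_n64_verts_per_sec = 150_000      # middle of the 100-200K range quoted above
order_of_magnitude = 10 * psx_n64_verts_per_sec

print(f"10x the PSX/N64 ballpark: {order_of_magnitude:,} verts/sec")  # 1,500,000
# So anything sustaining ~1-2M verts/sec in-game clears the "order of magnitude"
# bar, and two orders of magnitude would need ~10-20M verts/sec sustained.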
 
Further examples...

The past...

The PSX CPU was clocked at approximately 30ish MHz... My mid-'90s PC had 100+ MHz, and IIRC more transistors. It was above current PC graphics in part due to cheapsters like 3dfx being on top...

...surpassed in overall processing by more than an order of magnitude prior to the arrival of its next-gen successor...

The present...
Later on we saw the EE. It was released with similar speeds to processors released 18 months prior to it, and slightly slower than those released soon after; its transistor count was similar too, IIRC...

... not surpassed by over an order of magnitude prior to the arrival of its successor... some specs were slightly (2-3x) surpassed and some remained above those of successive h/w (VRAM b/w), while others were indeed significantly surpassed, as would be expected...

... and the future?...

Now we have the Cell; if the patent is correct it will have significantly more transistors and it'll be faster than processors released 18 months prior...

?!?
Companies are saying the SIGNIFICANTLY increasing multi-billion dollar investments in upgrades are too much... some are saying they're considering the sloweth... it is said that only a few, 'the crème de la crème', will manage to finance such ever-increasing advances at a decent speed...


A single slip, or trouble in either the STI development, or Intel and the like, or in the low-level fabs, could very well leave one or the other at a significant advantage for quite a few years... similar to the recent ATI/nVidia scenario...

The high-level fabs ain't cheapo, thus many rely on the low-level stuff... but as we've seen, the low-level cheese ain't that tasty ;)

Will the cheap cheese not smell funny again? Will supah cheese man and his friends manage not to slip again? I'd say the odds are against him, Frodo, wouldn't you? :D
 
Remember, when the PS2 was launching the GeForce was out and the GeForce 2 would be launched shortly after. Now we are just seeing games that tax these cards (Doom 3), and the game running on the GeForce 2 looks almost as good as the best PS2 games. Use a card that came out a year later, like the GeForce 3, and the game looks better than PS2 games.

I think the same will be true this gen. The problem has always been that while the 9800 Pros are out, the programmers need to make sure the game runs on a 64MB Radeon. It's almost like making a PS3 game that must run on a PS1. Wouldn't look all that great, would it? So hopefully, with the minimum spec of a DX9 chip for the new Windows, PC graphics will get a swift kick you-know-where.
 