Official: ATI in XBox Next

Status
Not open for further replies.
Well, Microsoft may like to choose them, but I'm sure they'd have to do some very fast talking to get Intel to agree, as this is outside of their realm. MS would have to put a very persuasive case together to get Intel to divert resources away from its core business for such a project. It would also mean diverting top-end fab resources away from the bleeding-edge CPUs as well.

While I think there are considerable hurdles to be cleared for such a thing to happen, I still believe the relationship between Intel and ATI hasn't been fully played out yet. I do think there is more to come there.
 
Why is it outside Intel's realm? Intel makes integrated chipsets with GPUs in them, so it would be easy for them to make chipsets with ATI graphics technology in them.
 
Because it would be a lot of custom work (high-end 3D layout and fabrication) with no guarantees of massive volume, and possibly lower margins than their CPU business. If they could get close to the kind of volumes the PS2 has had then I'm sure they might well be interested; however, the volumes the Xbox has shifted might give them concerns.
 
bbot said:
Why is it outside Intel's realm? Intel makes integrated chipsets with gpus in them...

Chipsets (northbridges, IGPs) don't use cutting-edge fabbing. For a product such as this, you'd want to use the best you've got, to the point of accepting bad initial yields in anticipation of improvements. In short, you'd cut straight into Intel's fab capacity for their top-end CPUs. We can only speculate as to what would be in Intel's best economic interest to produce, but the guess is likely to be a good one. :)

Entropy
 
bbot said:
If MS chooses Intel to make the Xbox2 gpu, then it wouldn't be possible to make a gpu based on 65nm process? Intel will have fabs using 65nm lithography by 2005.
Intel could have one, maybe two (unlikely) 65nm fabs in production sometime in H2/05, but why should they use that small amount of capacity (think: fab(s) still ramping up) for a chip that:

- is hard to produce due to the sheer number of transistors and a (compared to CPUs, from a manufacturing perspective) bad logic-to-cache ratio

- is likely to sell for a fraction of the margin (!) of a high-end CPU that could be made on the same wafer?

I think there's only one way for MS to get Intel as a foundry partner:

Ballmer to Barrett: "Dude, I suppose you don't want us to support those pesky 'Pentia' anymore, do you?" ;)

cu

incurable
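incurable's margin argument can be made concrete with a back-of-envelope wafer calculation. Everything below is purely illustrative: the die sizes, the defect density, and the $30 console-GPU margin are invented numbers; the $180 CPU margin figure is the one tossed around in this thread. It's a sketch of the economics, not real Intel data.

```python
import math

def dies_per_wafer(wafer_mm: float, die_mm2: float) -> int:
    """Classic approximation for whole dies fitting on a round wafer."""
    r = wafer_mm / 2.0
    return int(math.pi * r * r / die_mm2 - math.pi * wafer_mm / math.sqrt(2.0 * die_mm2))

def good_dies(wafer_mm: float, die_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: larger dies are hit disproportionately by defects."""
    return dies_per_wafer(wafer_mm, die_mm2) * math.exp(-die_mm2 * defects_per_mm2)

# Illustrative numbers only: a 130 mm^2 CPU at high margin vs. a
# 250 mm^2 console GPU at console margins, on the same 300 mm wafer.
wafer = 300.0   # mm
d0 = 0.002      # defects per mm^2, assumed

cpu_profit = good_dies(wafer, 130.0, d0) * 180.0   # $180/CPU margin (from the thread)
gpu_profit = good_dies(wafer, 250.0, d0) * 30.0    # $30/GPU margin, assumed

print(f"CPU wafer profit ~${cpu_profit:,.0f}, GPU wafer profit ~${gpu_profit:,.0f}")
```

Even with generous assumptions for the GPU, the per-wafer profit gap is enormous, which is the whole reason Intel would be reluctant to give up leading-edge capacity.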
 
incurable said:
bbot said:
If MS chooses Intel to make the Xbox2 gpu, then it wouldn't be possible to make a gpu based on 65nm process? Intel will have fabs using 65nm lithography by 2005.
Intel could have one, maybe two (unlikely) 65nm fabs in production sometime in H2/05, but why should they use that small amount of capacity (think: fab(s) still ramping up) for a chip that:

- is hard to produce due to the sheer number of transistors and a (compared to CPUs, from a manufacturing perspective) bad logic-to-cache ratio

- is likely to sell for a fraction of the margin (!) of a high-end CPU that could be made on the same wafer?

I think there's only one way for MS to get Intel as a foundry partner:

Ballmer to Barrett: "Dude, I suppose you don't want us to support those pesky 'Pentia' anymore, do you?" ;)

cu

incurable

Actually, Intel might be at 45nm (0.045µm) in 2005.

I had a brief conversation with an Intel employee a few weeks back that I ran into where I work, and he mentioned the latest for both the Itanium and Pentium lines of CPUs would be manufactured at 45nm in 2005. I thought he meant 65nm and he corrected me and repeated 45nm in 2005. I can't think of any reason for this guy to make things up, but the info he had was probably third-hand from other Intel workers.
 
bbot said:
Thanks for the correction, Dave. However Microsoft can still choose Intel to make the Xbox2 gpu.

That is, should Intel for some crazy reason decide to give up limited space on their 65nm lines (which would otherwise be pumping out $180-profit-margin Pentium 5s).
 
Brimstone said:
Actually, Intel might be at 45nm (0.045µm) in 2005.

I had a brief conversation with an Intel employee a few weeks back that I ran into where I work, and he mentioned the latest for both the Itanium and Pentium lines of CPUs would be manufactured at 45nm in 2005. I thought he meant 65nm and he corrected me and repeated 45nm in 2005. I can't think of any reason for this guy to make things up, but the info he had was probably third-hand from other Intel workers.

Intel's lithography roadmap (http://developer.intel.com/technolo...ue02/art06_lithographyroadmap/p03_roadmap.htm) shows 2007 as the deployment date for Intel's 45nm. Of course, that roadmap was published a while ago (May 2002).
 
2007 as deployment-date for Intel's 45nm. Of course, that roadmap was published a while ago (May 2002.)

Wow!!! Intel is amongst the best if their 45nm were to come that late...

If products that push 65nm significantly over the limit could be available as early as first quarter 2005...
 
Joe DeFuria said:
nonamer said:
What's with all this fanATIcism on this board? Not trying to sound offensive, but it seems that I can't make a single statement that sounds bad for ATI without getting criticized (not just you, Dave). The assumption I made was that ATI was simply going to modify an already existing chip, whereas nVidia was planning a more custom chip, better suited to the console. Thus, ATI would be cheaper but overall less powerful.

Problem is, you have zero basis for making that guess. If you had some rationale behind it, we could take it seriously. That's why the criticism. (That, plus all your previous talk about ATI being a "bluff" for what MS really wants...again with no basis.)

In other words...why would you make some assumption that ATI was going to simply "modify" an already existing chip? Why would nVidia do something more "custom?" If anything, I'd say it'd be the other way around...considering that it doesn't appear ATi is going to actually produce the chip.

The problem is that I did give a rationale for it. Nvidia said the XGPU2 was too expensive but ATI said it was not a problem. Since they both have roughly equal development capacity in terms of GPU development, these contradictory statements make no sense, and I made a guess as to why. Instead of debating my guess, you simply criticize me.

DaveBaumann said:
Nonamer – because your statements don't make much sense given the environment presently, and the deal announced.

You can criticize me on my speculation that this announcement is a ruse, but for the rest you should take me more seriously.

Let's have a look at what you've said and pick it apart:

nonamer said:
The 350 engineers nVidia had to use probably came mostly from the development of the audio chip, motherboard, and custom features in the NV2a not found in the NV2x series; hence they had to design the Xbox chip pretty much in parallel with their PC GPUs. It looks like ATI will simply design an R500 or R600 with the PC in mind and then upgrade/modify it for the Xbox2 a la R350 and/or R420 afterwards. This could save a lot of money, but it could also be weak on power. It will take time for the modifications to be done, whereas a mostly custom-designed chip could use that extra time to make something really impressive. That's what nVidia did for the Xbox 1, and the costs were so great that they're now reluctant to do it again.

So I suppose for Microsoft choosing ATI is choosing the cheaper one but less powerful, and choosing nVidia is the more expensive but more powerful. I suspect that MS wants the more powerful, thus I call the idea that ATI has won the GPU contract for Xbox"2" into question.

a.) What makes NV2A so "special" in comparison to the rest of the NV2x series? I really don't see it as being much beyond what they already had. The VS was already being beefed up in NV25, and the stencil rendering is a relatively easy tweak from MSAA.

Well since NV2A did drain Nvidia of 350 engineers something must have been "special" to require so much. I assumed that it is because of all the other stuff they did around the NV2A such as the audio chip and motherboard that required so much work.

b.) You say that ATI might take R500/600 and "simply upgrade or modify" for the XBox - how is this any different from the path NVIDIA took with NV2A?

The NV2A did come out before the NV25, but I suggest that ATI's X2-GPU will come out later than the R500/R600 it is derived from. Forcing someone to accelerate their product line by 6 months, as in Nvidia's case, should be more draining than doing it the other way around.

c.) Why do you assume that it will just be simple modifications? The majority of the time spent on NV2A was not designing the IP (since all of it was already in-house at NVIDIA), but doing the chip layout, fabrication and debugging. Being a licensing deal, this is probably not going to be ATI's concern, since Microsoft are the ones choosing and dealing with how that is done. It takes far less time to alter the IP than it does to fabricate a chip.

AFAIK, the IP is the software equivalent of the chip layout itself. Debugging looks like 6 months' worth of work, seeing how GPUs tape out 6 months before they are released. Fabrication is the simple manufacturing of the chip. Nothing I see here should be that expensive compared to the initial design of something as large and complicated as a GPU, unless you can show otherwise. There isn't any real difference between ATI licensing a chip to Microsoft and letting the licensee manufacture it, versus ATI manufacturing the chip and then selling it to Microsoft, when the GPU is outsourced to a foundry for production either way. It does remove the possibility of a future conflict like what happened with MS and NV, but the cost savings are minimal.

Regardless of what ATI does, small or large, Nvidia could do the same and MS wouldn't save any money. However, this is not the case, as Nvidia has stated that it would be too expensive whereas ATI said that it wasn't very expensive at all. I surmised that this is because Nvidia and ATI were planning totally different design practices for the development of the next Xbox GPU.

On a side note, this licensing idea leads to an interesting thought: Microsoft is under no obligation to use what ATI licensed to them if the announcement is just a licensing deal, contrary to what some have said in the other thread about this. In fact, MS is still totally free to choose whichever GPU they want for XB2. A whole host of possibilities if this is true.

d.) Even if this was "only" an R500/600, who's to say that it wasn't already sufficiently in line with MS's needs in the first place?

This I can't really say for sure. I assumed that MS wants a GPU capable of going against Sony's PS3 and its Cell computing. If the R500 is 90nm, then it should be inadequate. No clue about the R600, but if the Xbox2 is to be out by 2005/2006 and match the PS3, there would have to be a furious ramping up of GPU technology and power. I mean, you would need to go from the R500 in maybe late 2004/early 2005 (assuming it's a 90nm part) to the R600 (assuming 65nm and twice as big) basically in the same year, and then you have to consider ATI's Nintendo development requirements as well. If you are just going to get a derivative of a PC GPU, no matter how minor, it would still come up short or late, and I think MS wants neither. Anyhow, I can't "know" about this, but this is my guess.

e.) Why assume that ATI was "cheaper but less powerful"? Perhaps it was more powerful but at a higher cost, and they were negotiating with NVIDIA for a cheaper solution.

From the speculation I made above, I'm assuming that Nvidia was thinking of custom-designing a 65nm GPU from the start, ready by 2005/2006. This would definitely be powerful enough to match the PS3, and it would also be very expensive and draining, and this is what I imagine is Nvidia's plan. They may be exaggerating the drain from Xbox One, but they're definitely not exaggerating how draining this would be, and it could explain why Nvidia is so reluctant. ATI's plan seems to be a derivative of a PC GPU like the R500 or R600, like I mentioned before. That would be much cheaper, at least compared to Nvidia's plan, but I don't think it would be as powerful as a fully fleshed-out GPU on 65nm.

f.) You mention that MS played other companies off each other for the XBox, which they did, and it's very probable that they did the same for the XBox2 - however, with XB1 they didn't announce until they had struck the deal. NVIDIA said recently that minds were switching right up until the last day - it sounds as though the same thing has happened here, given JHH's fresh comments. MS were playing ATI and NVIDIA off against each other until the last days, but then went with ATI.

I believe that MS is still playing them off one another. Look, Nvidia's goal is to "power every pixel." They should be jumping on XBox2, not being so stoic about it. My opinion is that MS wants Nvidia to make something that can match the PS3, but on hearing the news JHH nearly had a heart attack, seeing how many resources that would require. Nvidia probably said no unless MS gave them a huge sum of money, big even for MS, and now MS is doing every trick in the book to try to get Nvidia to lower it, even going (maybe) to the competition to scare Nvidia into doing so. Pretty sinister in my book.

EDIT: The part where I mention that MS is not required to use the ATI GPU is something I find so interesting that it deserves an additional topic. It puts a whole new spin on my speculations. Basically, MS wants a fully maxed-out (transistor-wise) GPU on the 65nm process, ready by 2005. Considering that the current front-runners in GPU development are only beginning to reach 90nm by 2005, this plan by MS may simply be too overwhelming for either ATI or Nvidia unless they pretty much dedicate the bulk of their resources to it. Nvidia was willing, but not unless MS gave them a huge amount of cash, too much even for MS. Instead of paying it or finding a way for Nvidia to reduce the cost, MS may be trying something totally different: get IP from both ATI and Nvidia and create a GPU using both of their technologies. I don't know if this is possible, but if it is then we may see an Xbox2 with an ATI+Nvidia co-developed (w/MS) uber-GPU. :oops: Now that would be something.
 
nonamer said:
The problem is that I did give a rationale for it. Nvidia said the XGPU2 was too expensive but ATI said it was not a problem.

That's not a rationale. That's just more guess work. Unless you have a source that says nVidia claims the XGPU2 was "too expensive."

Since they both have roughly equal development capacity in terms of GPU development, these contradictory statements make no sense, and I made a guess as to why.

Even if it is true, why wouldn't it make sense? Every company has their own risk tolerances.

Instead of debating my guess, you simply criticize me.

Again, give some credible basis, and it can be debated.

nonamer said:
The 350 engineers nVidia had to use probably came mostly from the development of the audio chip, motherboard, and custom features in the NV2a not found in the NV2x series; hence they had to design the Xbox chip pretty much in parallel with their PC GPUs. It looks like ATI will simply design an R500 or R600 with the PC in mind and then upgrade/modify it for the Xbox2 a la R350 and/or R420 afterwards. This could save a lot of money, but it could also be weak on power.

This just makes no credible sense. You say that nVidia was "tied up" spending money on all sorts of non-graphics stuff...and then say that ATI might not be doing those things (hence cheaper)...but only doing graphics. And from that you conclude the GPU might be short on power?

Forget for the moment that we have no idea what exactly ATI is designing, or that the x-box1 GPU is essentially the NV2x core (PC part) tweaked for the x-box.

And you do this regardless of what we already know to be a fact: that ATI's model (royalties) is vastly different than nVidia's current "part supplier", and can directly account for reduced costs.

It will take time for the modifications to be done, whereas a mostly custom-designed chip could use that extra time to make something really impressive. That's what nVidia did for the Xbox 1, and the costs were so great that they're now reluctant to do it again.

Wrong. The NV2a (x-box GPU) has so much in common with the NV2x cores, it's not even a question....the x-box GPU is no more a "custom" GPU than what we're speculating might be in x-box 2.

So I suppose for Microsoft choosing ATI is choosing the cheaper one but less powerful, and choosing nVidia is the more expensive but more powerful. I suspect that MS wants the more powerful, thus I call the idea that ATI has won the GPU contract for Xbox"2" into question.

1) Your supposition that nVidia's x-box 2 "design" is more "powerful" than ATI's x-box2 design, is baseless. See above. (Not that the X-box 1 GPU has anything to do with the X-box2 GPU in the first place.)

2) Any assumption that MS wants more power over lower cost is also baseless. I'm sure what MS wants is the most power, for a given price point.

3) Your calling into question this relationship as a "ruse" is 100% baseless due to 1 and 2 above.

Well since NV2A did drain Nvidia of 350 engineers something must have been "special" to require so much.

The graphics core of NV2a did not require all those engineers.
You outlined a bunch of things above...motherboard design, northbridge, sound, etc.

I assumed that it is because of all the other stuff they did around the NV2A such as the audio chip and motherboard that required so much work.

Um....this is one of your problems. You claim nVidia spent all this engineering power on the NV2a....and then claim they spent their engineering power on the other stuff...

The NV2A did come out before the NV25, but I suggest that ATI's X2-GPU will come out later than the R500/R600 it is derived from.

Why?...again, no basis for this.

Forcing someone to accelerate their product line 6 months faster like in Nvidia's case should be more draining than doing it the other way around.

??

AFAIK, the IP is the software equivalent of the chip layout itself.

Doesn't have to be, at all. It can be all the pre-chip-layout route-and-trace stuff. That is, basically a software description of the chip's functions...not the chip design itself. Microsoft would choose the partner to take that software design and turn it into a chip....doing all the place and route, and handling the majority of the chip debugging.

There isn't any real difference between ATI licensing a chip to Microsoft and letting the licensee manufacture it, versus ATI manufacturing the chip and then selling it to Microsoft, when the GPU is outsourced to a foundry for production.

Wrong...there can be a huge difference. The thing is, we don't know the exact nature of the licensing agreement. It's possible it might be along your lines (ATI essentially designs the chip), and it's just as possible that ATI goes the other way. (Which is along the lines of what ArtX did.)

However, this is not the case as Nvidia has stated that it would be too expensive whereas ATI said that it wasn't very expensive at all.

Again...sources?

Did nVidia say it was too expensive? Or too risky?

I surmised that this is because Nvidia and ATI were planning totally different design practices for the development of the next Xbox GPU.

Unless you can come up with the source where nVidia said it was too expensive, then I guess your whole line of reasoning is flawed from the very beginning. (Vs. just being flawed from that point onward.)

On a side note, this licensing idea leads to an interesting thought: Microsoft is under no obligation to use what ATI licensed to them if the announcement is just a licensing deal, contrary to what some have said in the other thread about this.

Sigh....

No, I'm sure MS just loves the idea of funding the development of ATI's technology just for the hell of it. :rolleyes:

In fact, MS is still totally free to choose whichever GPU they want for XB2. A whole host of possibilities if this is true.

Sure...as I said...as long as they're willing to kiss tens if not hundreds of millions of dollars goodbye...they can do whatever they want.

In fact....why...they could stop buying x-box GPUs from nVidia right now! I just thought of something...this deal is not only for the future X-Box...but for X-Box 1!! nVidia not only lost the x-box 2 contract, but the x-box 1 as well. This is worse than I thought for nVidia....

This I can't really say for sure. I assumed that MS wants a GPU capable of going against Sony's PS3 and its Cell computing.

I assume MS wants the best performance possible for a given price, deliverable at a certain time.

If the R500 is 90nm, then it should be inadequate.

Again..more completely baseless bunk. It's no different than the x-box1 vs. PS2. Both systems have their strengths and weaknesses, and I suspect it will be more of the same with the next generation.

If you are just going to get a derivative of a PC GPU, no matter how minor, it would still come up short or late,

Again, no basis...

I can't "know" about this, but this is my guess.

And we're telling you your guess is indeed baseless.

I can guess that cell will be so hard to program for effectively, that any DX10 level PC chip will be running circles around it from day 1.

From the speculation I made above and assuming that Nvidia was thinking of totally custom designing a 65nm GPU right from the start and ready by 2005/2006.

You keep pulling these "assumptions" out of your ass! That's the problem.

Here's one: I assume that nVidia was thinking of a totally non-custom GPU on 0.13, considering how costly their last part was.

...this is what I'm imagining is Nvidia's plan.... ATI plan seems to be....

Look, in all seriousness...you know nothing of EITHER ATI's or nVidia's plans. It's that simple. And making speculation BASED on assumptions that are purely guesswork....to support a "conspiracy theory"....is just ludicrous.

I believe that MS is still playing off one another.

And I believe you're trying to play us for fools.

Look, Nvidia's goal is to "power every pixel."

And so is ATIs...or hadn't you noticed their product portfolio?

They should be jumping on XBox2, not being so stoic about it.

Geezus....I agree that nVidia should be trying to get it...the thing is only one of them can actually win the contract. And *duh*, the loser is going to downplay the significance, while the winner is going to trumpet it. Did you expect anything different?

My opinion is that MS wants Nvidia to make something that can match PS3,

WHY NVIDIA? My opinion is that MS wants SOMEONE to make whatever it is that is MS's vision for the next x-box. This includes a certain level of power, a certain price point, and perhaps some other abilities beyond "pure games", for all we know.

but on hearing the news JHH nearly had a heart attack, seeing how many resources that would require.

Probably not as large as the heart attack he got after losing the contract. Perhaps JHH shouldn't be so inefficient with his engineering.

Nvidia probably said no unless MS gave them a huge sum of money, big even for MS, and now MS is doing every trick in the book to try to get Nvidia to lower it, even going (maybe) to the competition to scare Nvidia to do. Pretty sinister in my book.

Pretty far-fetched and idiotic in my book. This just doesn't happen once the contract is signed. Before the partnership? Sure...playing one competitor off against another is typical.

EDIT: The part where I mention that MS is not required to use ATI GPU is something I find so interesting that it deserves an additional topic.

Please...don't.

It puts a whole new spin on my speculations.

Decidedly, for the worse.

Basically, MS wants a
 
I really doubt we'll see a 65nm GPU in 2005, unless you imagine that both ATI and Nvidia skip the 90nm process, and that's very unlikely.
 
That's why I think Xbox 2 will ultimately be out in 2006. There are many reasons for 2006 instead of 2005.

One reason would be to get a GPU on 65nm.
 
zidane1strife said:
2007 as deployment-date for Intel's 45nm. Of course, that roadmap was published a while ago (May 2002.)

Wow!!! Intel is amongst the best if their 45nm were to come that late...

If products that push 65nm significantly over the limit could be available as early as first quarter 2005...
I'm not entirely sure what you're saying, but expecting _any_ 65nm product to start shipping in H1/05 is rather optimistic, if not illusory.

cu

incurable

PS: This thread (and its continued mention of PS3 as the target to beat) made me interested in more information about that Cell thingy. Could anyone provide me with a link or two explaining its basic concepts/implementation details? Thanks.
 
Could anyone provide me with a link or two explaining its basic concepts/implementation details? Thanks.

Check back on previous pages of this forum, there is EXTENSIVE knowledge on PS3/Cell.
 
The problem is that I did give a rationale for it. Nvidia said the XGPU2 was too expensive but ATI said it was not a problem. Since they both have roughly equal development capacity in terms of GPU development, these contradictory statements make no sense, and I made a guess as to why.

Well, how sure are you that they have roughly the same capacity? ATI have been diversifying for many years now (consumer, mobile, DTV/set-top). It may also be the case that NVIDIA has no capacity in some of the areas that MS may be looking at for the Xbox (i.e. set-top/DTV).

However, are you sure NVIDIA said it was too expensive, or too expensive for NVIDIA in comparison to what MS were offering in return?

Well since NV2A did drain Nvidia of 350 engineers, something must have been "special" to require so much. I assumed that it is because of all the other stuff they did around the NV2A, such as the audio chip and motherboard, that required so much work.

Did it? Where'd you get that? I'd actually be surprised if they even had 350 hardware engineers to spare at that point in time. However, the rest of the elements aren't even graphics-related, so this doesn't relate to NV2A. If the deal that ATI and MS have struck also pertains to the chipset, ATI have already worked on this in-house.

The NV2A did come out before the NV25, but I suggest that ATI's X2-GPU will come out later than the R500/R600 it is derived from. Forcing someone to accelerate their product line by 6 months, as in Nvidia's case, should be more draining than doing it the other way around.

Errr, both NV25 and NV2A were derived from NV20. NV2A came later than NV20, so I don't know how this is any different from the scenario you paint with ATI. ATI have already moved to a similar model as NVIDIA, as witnessed by the R300 -> R350 cycle.

AFAIK, the IP is the software equivalent of the chip layout itself.

Nope. At least, not necessarily.

Chip layout is actually the arrangement of transistors on the chip - which means you've chosen a process and fab etc. IP licensing can be as little as selling the core logic.

Look at MBX as an example: it's been here for 2 years and yet we've not seen any available yet; why? Because those that have licensed it have to do all the chip-level implementation, inclusive of layout and fabrication. If PowerVR sold a chip layout, that would tie those buying it into the process and fab that the chip had been laid out for, which clearly wouldn't work - the licensees need to choose the process size and fab (since some of the licensees - Intel, ST, TI for example - have their own fabs).

However, this is not the case, as Nvidia has stated that it would be too expensive whereas ATI said that it wasn't very expensive at all. I surmised that this is because Nvidia and ATI were planning totally different design practices for the development of the next Xbox GPU.

This is where I think you've got it fundamentally wrong. It's clear that MS wanted a licensing deal, because that puts the pricing of the chip in their hands and they are not tied to the semiconductor supplier as they were with NVIDIA - this is why NVIDIA said the returns wouldn't be good (for them), since it is a royalty deal. NVIDIA fundamentally does not want to be an IP company; they want to be a semiconductor company, so that they make the chips and control the pricing of them. ATI have already shown that they are willing to be an IP company, as evidenced by their Nintendo contract, and were more than happy to operate on that basis with MS. Now, royalty-based deals offer far less revenue, and possibly a little less profit per chip (depending on various factors), but give close to a 100% margin on each part sold and are far less risk to the licensor.

Since the development of the parts will be paid for by MS, its not a case that ATI and NVIDIA were offering radically different things, but that NVIDIA didn't like the terms that MS wanted to operate on (royalty) whereas ATI did.
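The revenue-versus-margin trade-off described above can be sketched with a toy comparison of the two business models. Every dollar figure and the volume below is invented purely for illustration; nothing here is real ATI or NVIDIA data.

```python
# Toy comparison: selling finished chips vs. licensing IP for a royalty.

def chip_supplier(units: int, sale_price: float, unit_cost: float):
    """Semiconductor model: big revenue, but margin is price minus cost,
    and the supplier carries the manufacturing and inventory risk."""
    revenue = units * sale_price
    profit = units * (sale_price - unit_cost)
    return revenue, profit

def ip_licensor(units: int, royalty: float):
    """Royalty model: far less revenue, but each royalty dollar is
    essentially pure margin with no manufacturing risk."""
    revenue = units * royalty
    return revenue, revenue

units = 20_000_000                                        # assumed lifetime volume
chip_rev, chip_profit = chip_supplier(units, 45.0, 38.0)  # assumed $/chip figures
ip_rev, ip_profit = ip_licensor(units, 5.0)               # assumed $/unit royalty

print(f"chip supplier: ${chip_rev:,.0f} revenue, ${chip_profit:,.0f} profit")
print(f"IP licensor:   ${ip_rev:,.0f} revenue, ${ip_profit:,.0f} profit")
```

With these made-up numbers the chip supplier books nine times the revenue, but the licensor's profit is comparable at zero risk, which is the shape of the argument above.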

This I can't really say for sure. I assumed that MS wants a GPU capable of going against Sony's PS3 and its Cell computing. If the R500 is 90nm, then it should be inadequate.

Well, it's guesswork that it would be inadequate. However, it will be up to MS to decide the process.

If you are just going to get a derivative of a PC GPU, no matter how minor, it would still come up short or late, and I think MS wants neither.

Well, that's what happened with XBox 1.

This would definitely be powerful enough to match PS3 and it would also be very expensive and draining, and this is what I'm imagining is Nvidia's plan.

NVIDIA have got some architectural issues to work through first. I don't think you'll see either company taking a quantum leap beyond the other.

Look, Nvidia's goal is to "power every pixel." They should be jumping on XBox2, not being so stoic about it.

So, they have a company mission - you don't think ATI has something similar? (I'd assume they have, since they have a far more diverse model already in place to meet the requirement of far more pixels.) And obviously they were keen to get the XB2 - from JHH's comment a few days ago, to the quote a little above - it seems they were just beaten to the punch this time.

I believe that MS is still playing them off one another.

The time for that has been and gone - quite evidently they were playing one against the other already, but they can't do it once the deal is in place.

It puts a whole new spin on my speculations. Basically, MS wants a fully maxed-out (transistor-wise) GPU on the 65nm process, ready by 2005.

MS want the best they can get whilst being in a position to control costs far better than they could with XB1.
 
I think the glaringly important issues right now are: who will do the chip layout, who will fab it, and can MS meet PS3's street date, as they've promised ad nauseam?

I'm rooting for Intel & Intel, but I'm also definitely not holding my breath :?
 
I have noticed that some of you are wanting or assuming the GPU for Xbox2 will need to be fabbed at 0.065µ to be able to compete with CELL :? :?:

Can somone please explain that logic to me?

CELL is going to have a lot of eDRAM, so yes, it would benefit greatly from 0.065µ, as a large chunk of die space will be needed for all that eDRAM. However, assuming the next XGPU does not contain any eDRAM, it will have comparable logic area using only 0.09µ, because there's no eDRAM to use up die space.
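The die-area argument above can be sketched numerically. All the die sizes here are made-up round numbers, not real chip data; the only real input is the ideal-scaling rule of thumb that area shrinks with the square of the feature size.

```python
# Back-of-envelope: area scales roughly with (new node / old node)^2,
# and eDRAM eats a fixed chunk of the die on top of the logic.

def scaled_area(area_mm2: float, node_from_nm: float, node_to_nm: float) -> float:
    """Ideal shrink: area goes with the square of the feature-size ratio."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

logic_90 = 200.0   # mm^2 of GPU logic at 90 nm, assumed
edram_90 = 150.0   # mm^2 of eDRAM at 90 nm, assumed

# An eDRAM-heavy chip is huge at 90 nm; shrinking to 65 nm brings it
# back down to roughly the size of a logic-only 90 nm GPU.
with_edram_90 = logic_90 + edram_90
with_edram_65 = scaled_area(with_edram_90, 90, 65)

print(f"no eDRAM @90nm:  {logic_90:.0f} mm^2")
print(f"eDRAM @90nm:     {with_edram_90:.0f} mm^2")
print(f"eDRAM @65nm:     {with_edram_65:.0f} mm^2")
```

With these assumed sizes, the eDRAM design needs 65nm just to get back to the die area a no-eDRAM GPU already has at 90nm, which is the point being made.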
 
I see that it's pointless to debate this issue against DeFuria and Baumann, so I should stop now. Just to let you know that this is a total waste of time and I'm stopping right now. You can claim whatever you want from this post but I don't care. Enough is enough.

EDIT:
Joe DeFuria said:
And I believe you're trying to play us for fools.

Strangely enough I do, at least in this case. ;)
 