Official: ATI in XBox Next

Joe DeFuria said:
I would argue it [e.g. XBox - Vince] was a significant factor (among other significant factors) that put them into the quasi-leadership position they are in today.

And I'd disagree, unless you're talking about their internal engineering problems related to XBox. nVidia was already ascending to its current point before XBox, having toppled 3dfx through superior technology and execution for two consecutive cycles, while humiliating ATI yet again. What you just did is revisionist history.

Joe said:
Vince said:
Let's see how ATI fares after this whole ordeal; my first thought is that this is a good day for Sony and nVidia (independently, of course).

And you tell me I have fanperson mentality? Lol... :p

With all due respect Joe, I proceeded to explain in pretty basic economic terms, with historical precedent, why I took this stance and proposed two fairly simple scenarios that demonstrate it (keeping it simple and to the point the entire way). Even the legal issues were brought up in a realistic fashion, as I'm curious how it'll be done.

And you respond with this comment? Seriously now.

And the key to success is successful management of those resources.

You can only do so much; labor is a finite resource. I already addressed this and didn't imagine that it would be resurrected.

Where's the R400 Joe?

Come off it Vince. Please. This deal is nothing but good news for ATI. That doesn't mean they will ultimately be successful, but it just gives them that much more opportunity.

And be sure to let me know when nVidia "resurges"....

Well, I'll talk to you in 2005 bud. We'll see what's up.... as I already said, time's on my side.

PS. Mfa is right, ditch your stock before PlayStation3.
 
Vince said:
If "failing" is kicking your nearest competitors ass in marketshare by almost a multiple of 2 with over 60% of the DX9 market, then yeah... they're failing. heh.

By flogging lots of cheap chips, killing their own margins in the process - this is not something to be proud of, as evidenced by the street's reaction to their last CC.

As for the revenue, it's just that. Let's look at long-term profitability in emerging sectors instead, ok? XBox is not one of these.

It may not be emerging, but it's still a potentially huge revenue stream - and no sane person would say that having 2 of the 3 console manufacturers using your tech is not an enviable position to be in.

Of course, ATI already have the leg up in these emerging markets so NVIDIA are already slow off the mark there.

As opposed to... "Dave Rolaston publicly stated that it will divert resources to a large degree from their PC work." "In related news, shares of ATI (NAS: ATYT) fell by 10% today on the news from Dave..."

And it's quite probable it's correct - the same tech that's used in the PC space is likely to be leveraged for this contract as well (also linking DX to ATI for the foreseeable future).

Not diverting a large amount of resources to the project implies that it's fairly static in relation to the PC parts coming online. I don't have much faith.

You not having faith doesn't make it any more or less possible.

However, those resources are bound to increase - many engineers have already gone from one company to the other, and as NVIDIA engineers see their options packages becoming worth increasingly less, inevitably some will go a mile down the road to ATI.

:rolleyes:

:rolleyes: all you want, but if you think this hasn't happened in the past and won't happen again then you are delusional. Engineers go where the opportunities are just as much as anyone else, and currently ATI is on the uptick and NVIDIA on the down.

Resurgent? Based on?

You'll see. Time is on my side...

Well, the technology is currently on ATI's side, and quite evidently this deal is an endorsement that it's on their side for the foreseeable future.
 
Attack of the ATI people... good times.

whql said:
By flogging lots of cheap chips, killing their own margins in the process - this is not something to be proud of, as evidenced by the street's reaction to their last CC.

Can you say... sales. Someone's got them, someone doesn't. In the memorable words of the Iceman, "The plaque for the alternates is down in the ladies' room."

It may not be emerging, but it's still a potentially huge revenue stream - and no sane person would say that having 2 of the 3 console manufacturers using your tech is not an enviable position to be in.

Sony must not be sane; neither is Nintendo, who's actually turning a profit. What's the profit margin on the NV2A up to now? You seem to be confusing the good position of having an established revenue stream (which is good) with getting there by competing against foes who are outspending you by an order of magnitude (true story).


And it's quite probable it's correct - the same tech that's used in the PC space is likely to be leveraged for this contract as well (also linking DX to ATI for the foreseeable future).

XBox doesn't use an abstraction via DirectX to any reasonable level in current games. To further comment on something with no true basis in the relevant argument, but just to show how fallacious your thinking is: how did XBox1 "link" nVidia to DirectX evolution?

This isn't the PC world...

Well, the technology is currently on ATI's side, and quite evidently this deal is an endorsement that it's on their side for the foreseeable future.

Alrighty then. ATI got lucky with R300 by picking up when nVidia stumbled with the original NV30 design (which is a more advanced design regardless) - this is hardly an achievement. It's like AMD being given room to breathe when Intel chopped the proverbial balls off the initial Pentium4 core. Let's see ATI keep up...
 
whql said:
Well, the technology is currently on ATI's side, and quite evidently this deal is an endorsement that it's on their side for the foreseeable future.

Just for argument's sake, how do you know nVidia didn't turn down Microsoft's proposal? What if nVidia (as they previously stated was necessary) wouldn't receive the terms and funding they wanted? Or they pushed as they once did against GigaPixel, but this time MS ran?

Because all I see is a one-hit wonder that is ATI and R300 and I'm left asking, Where's that R400 they were hyping last year?

We don't know and you're filling in the blanks the way you want to. I've raised open-ended rational points and shown two scenarios that no one has adequately answered. I've raised questions about ATI's fundamentals and doubt I'll get a straight answer... I digress...
 
Vince said:
Can you say... sales. Someone's got them, someone doesn't. In the memorable words of the Iceman, "The plaque for the alternates is down in the ladies' room."

Yes, one has them in the low-revenue, low-margin segment and the other has them in the high-margin, lower-volume stream - which would you rather have?

It may not be emerging, but it's still a potentially huge revenue stream - and no sane person would say that having 2 of the 3 console manufacturers using your tech is not an enviable position to be in.

Sony must not be sane; neither is Nintendo, who's actually turning a profit.

And the relation of this statement to the designer of the graphics, or how it affects them, is what?

What's the profit margin on the NV2A up to now?

On NV2A? Very high - that's why MS and NVIDIA went into arbitration.

You seem to be confusing the good position of having an established revenue stream (which is good) with getting there by competing against foes who are outspending you by an order of magnitude (true story).

You seem to be confusing what the console vendor is doing to get to that position with the relation it has to the vendor supplying the parts. All that matters to ATI is the volume of the parts sold, not whether MS is making or losing money on it.

XBox doesn't use an abstraction via DirectX to any reasonable level in current games. To further comment on something with no true basis in the relevant argument, but just to show how fallacious your thinking is: how did XBox1 "link" nVidia to DirectX evolution?

Are you blind? DX8 was fashioned around NVIDIA's parts and what they were doing in relation to the XBox. If it wasn't, MS wouldn't have had to go with DX8.1.

Alrighty then. ATI got lucky with R300 by picking up when nVidia stumbled with the original NV30 design (which is a more advanced design regardless) - this is hardly an achievement. It's like AMD being given room to breathe when Intel chopped the proverbial balls off the initial Pentium4 core. Let's see ATI keep up...

Of course, it's all "luck", there's absolutely no engineering excellence in there at all :rolleyes: As for NV30 being superior, it had some nice marketing features like "long shaders", however these are absolutely not practical to use - R300's design was engineered far better for the market it was addressing.

Your analogy with P4 doesn't fit either - P4 was designed because they knew it had a long future and would scale up very well into .13u. NV30's design is at a dead end as it doesn't suit the directions the major APIs that it relies on have taken - they need to scrap the NV30 design for something completely different to get back on track, something that Intel didn't need to do with P4.

Just for argument's sake, how do you know nVidia didn't turn down Microsoft's proposal?

Just a day or so ago NVIDIA said they were still in with a chance.

Because all I see is a one-hit wonder that is ATI and R300 and I'm left asking, Where's that R400 they were hyping last year?

Shrug, the roadmaps got moved around; it's not as though that doesn't happen - let's see what they do offer next. And as has been speculated already, how do you know these weren't pre-emptive shifts?

We don't know and you're filling in the blanks the way you want to.

Well, that's what you've done - cast all kinds of aspersions and FUD with an absence of facts. At present we have a deal on the table, one that NVIDIA obviously did go for, and MS aren't going to go for a technically inferior solution.
 
whql said:
You not having faith doesn't make it any more or less possible.

Here, I'll even go out on a limb and explain what I'm thinking. Microsoft views Sony as a disruptive force in the forthcoming connected home, thus giving birth to XBox. We know from comments by people like Andy Grove and Gabe Newell that the above is true, that Microsoft fears the living room being connected by electronics companies utilizing competing OSes - such as the Panasonic/Sony Linux derivative or whatever. Microsoft thinks XBox will enter them into this marketplace using the same Trojan horse as Sony - the game console. So, on some level, XBox is pretty important and it's obvious that Microsoft wants to kill off PlayStation. To do this next generation you need to beat two things:

  • PlayStations/Sony's Hype.
  • Cell Architecture.
You won't outright beat Sony's hype; this is a joke to even ponder. So, you need to beat Cell - which is basically a silicon embodiment of the above pervasive computing paradigm. I'm a big supporter of Cell, this is true, but then learn from me (my perception, understanding, thoughts) and apply it to this situation with your ideologies as a counter-balance. And as a "Cell-believer" I was 'concerned' about only one obstacle this entire time - nVidia. Which I'll get into now:

Regardless of how Cell turns out, one thing can be inferred about the architecture and more specifically the IC used in PS3. STI is a venerable lithography and process powerhouse, perhaps second to none, even surpassing Intel. To beat Cell, you need to capitalize on lithography and push it to its very limits - for I don't believe simple architectural differences/routine optimizations will cut it.

We're nearing a point, as outlined in Suzuoki's Cell patent for SCE (and as can be seen in the NV3x architecture), where graphics processing is becoming computationally limited and its performance is bounded by logic, thus pushing the advancement burden back to Moore's Law rather than bandwidth or other such barriers. The future is an advancement following NV3x's direction, or more like Cell - where you have almost full computational flexibility through the pipeline except where the task is excessively iterative and dedicated logic is the way to go.

Causing a 'borderline performance revolution' with this type of architecture relies on bleeding-edge lithography, and it requires the design team to push the process to the edge and beyond, into the realm of poor yields, with the understanding that future lithography will bring the yields and costs under control. It requires massive investment like STI is making ($8 billion in total) and it requires technologies like SOI/SS, Low-K, 65/45nm and lower lithography and other such advancements that are pushed hard.

When I saw IBM and nVidia team up and basically gain access to STI's advancements, combined with some comments I heard a while back from a little bird - I thought it was over. nVidia has the balls to push and stick with it. When 3dfx was in the corner touching itself at .25um, nVidia was on Cu at 180nm and utterly destroying 3dfx in every way: performance, features, per-IC cost, yields... it goes on and on.

And I see the same now. While nVidia is testing with 130nm Low-K dielectrics, ATI is off pissing in the wind on a 150nm process. Sure, nVidia had problems this time, but it's the exception. What's ATI going to do when nVidia is utilizing a derivative of STI/AMD's 10S or 11S (11S is Cell's 65nm process slated for 2H 2004/1H 2005 production) process at IBM? Have you fanpeople tell us that SOI isn't necessary? That the thermal or 30% performance increase seen on Power4 isn't that big of a deal? That TSMC's sub-par roadmap and execution of <100nm is adequate? Don't even get me started on UMC; are they serious in going it alone for 90nm and below when everyone else is concentrating their R&D? HA! Give me a break.

Today is the first day I can say that Sony will be alright, that if I was Okamoto or Ken, I'd be happy as a pig in shit.

PS. Check out my post here: http://www.beyond3d.com/forum/viewtopic.php?t=7188
Please note the timeline, IBM's involvement, and that SCE/Toshiba are currently in production of the EE+GS@nm and have been producing the GS (eDRAM+logic) at 130nm since 2001.
 
Most of your post is BS; I don't need or want to use up valuable room with a semantic debate.

whql said:
You seem to be confusing what the console vendor is doing to get to that position with the relation it has to the vendor supplying the parts. All that matters to ATI is the volume of the parts sold, not whether MS is making or losing money on it.

See what the always present and sexy as hell GeeForcer posted about vendors' "volume," "projections," "losses" and "XBox."

Are you blind? DX8 was fashioned around NVIDIA's parts and what they were doing in relation to the XBox. If it wasn't, MS wouldn't have had to go with DX8.1.

If Microsoft were doing what you said - "linking DX to ATI for the foreseeable future as well" - then Microsoft wouldn't have released a DX8.1. As Mfa stated, Microsoft doesn't play favorites. This is known, this has historical precedent - let's move on.

Of course, it's all "luck", there's absolutely no engineering excellence in there at all :rolleyes: As for NV30 being superior, it had some nice marketing features like "long shaders", however these are absolutely not practical to use - R300's design was engineered far better for the market it was addressing.

I was referring to the actual low-level silicon design of the NV3x (NV35 is a good example) and comparing that to the R300. "Long shaders" are just an [insignificant] byproduct.

Your analogy with P4 doesn't fit either - P4 was designed because they knew it had a long future and would scale up very well into .13u. NV30's design is at a dead end as it doesn't suit the directions the major APIs that it relies on have taken - they need to scrap the NV30 design for something completely different to get back on track, something that Intel didn't need to do with P4.

Wow, the cow must have stayed in this spot a whole while longer....

  • First, the Pentium4's original design, which entailed, IIRC, stuff like added FPUs, level-3 cache, etc., was cut due to cost barriers. What you stated is insignificant and unrelated.
  • The NV3x architecture is hardly a dead end. It's more like a precursor to the day when we have unified shaders running on an architecture that's based around a plurality of processing elements.
  • They [nVidia] need to advance the basic NV35 ideology with lithography.
This has been a reading from the Book of Vince... </soap box > j/k ;)

So, where is the R400?
 
Haven't read through the whole topic yet... but let's see, does this mean we can give the IQ crown to MS again? :oops:

Anyone for a few spec guesstimates? :LOL:

AMD 64-bit
256MB DDR3
2x R5X VPU with 32MB eDRAM each
64MB DDR for XB1 emulation/audio/cache

I wonder, does this push us closer to the possibility of an MS + Nintendo killer combo!!!! :oops:
 
chaphack said:
I wonder, does this push us closer to the possibility of an MS + Nintendo killer combo!!!! :oops:

Probably not; both firms are well known for having, shall we say, control issues. Won't happen, end of story.
 
Vince said:
See what the always present and sexy as hell GeeForcer posted about vendors' "volume," "projections," "losses" and "XBox."

Hey, perhaps you might like to look up the terms of the deal - this is a licensing deal, ergo none of the inventory issues that NVIDIA had, or the fixed chip-cost issues that MS had, apply here.

If Microsoft were doing what you said - "linking DX to ATI for the foreseeable future as well" - then Microsoft wouldn't have released a DX8.1. As Mfa stated, Microsoft doesn't play favorites. This is known, this has historical precedent - let's move on.

DX9 favours ATI, and now DX10 is most likely to as well. And, yes, in terms of API compatibility MS does play favourites - all you have to do is look at the development of DX since DX5.

I was referring to the actual low-level silicon design of the NV3x (NV35 is a good example) and comparing that to the R300. "Long shaders" are just an [insignificant] byproduct.

Then that makes your statement even more nonsensical - the low-level silicon design was clearly a horrible mess, otherwise it wouldn't have spent an extra year in silicon development and been canned as soon as it was released. What ATI achieved with .15u is probably a far better feat - first to 100M transistors, and with speeds that exceed the limits of what NVIDIA told people .15u could do!

First, the Pentium4's original design, which entailed, IIRC, stuff like added FPUs, level-3 cache, etc., was cut due to cost barriers. What you stated is insignificant and unrelated.

It was also designed to scale up in clock speeds, which was clearly a long-term decision - it sucked in comparison to AMD on 180nm, but as soon as they moved to Northwood the balance began to shift - that was the long-term approach.

The NV3x architecture is hardly a dead end. It's more like a precursor to the day when we have unified shaders running on an architecture that's based around a plurality of processing elements.

There is no precursor to a unified shader architecture in NV30 at all. It is also limited by integer shader units which do not match the major API that it's operating on. Its 4x2 configuration is also probably legacy as well - if NV40 turns out to be an 8x1 design and fully float then, yes, NV30 was a dead end.
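Just to put rough numbers on why the 4x2 vs 8x1 distinction matters, here's a back-of-the-envelope sketch - the 500MHz clock is a made-up, illustrative figure, not any real part's spec:

Code:
# Back-of-the-envelope comparison of a 4x2 vs an 8x1 pipeline layout.
# The 500 MHz clock is a made-up, illustrative number, not a real part's spec.

CLOCK_MHZ = 500

def rates(pipes, tmus_per_pipe, clock_mhz=CLOCK_MHZ):
    """Return (single-texture pixel rate, peak texel rate) in Mpixels/s and Mtexels/s."""
    pixel_rate = pipes * clock_mhz                      # one pixel per pipe per clock
    texel_rate = pipes * tmus_per_pipe * clock_mhz      # all texture units busy
    return pixel_rate, texel_rate

for label, pipes, tmus in [("4x2", 4, 2), ("8x1", 8, 1)]:
    px, tx = rates(pipes, tmus)
    print(f"{label}: {px} Mpixels/s single-textured, {tx} Mtexels/s peak")

# Both layouts have the same peak texel rate (4000 Mtexels/s here), but the 8x1
# layout pushes twice the pixels per clock when each pixel needs only one texture
# or is dominated by shader math rather than multitexturing - which is the point
# about the "shader age" we're moving into.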

As for your previous post, your own fanboy rant on the apparent world domination of Cell, well, we'll see how that plays out - the link you posted points to the same limitation that all these things will have, being process issues. This is what any computing device will ultimately be limited by, but MS and Sony are taking different approaches as to how that works with graphics, and that will be the biggest unknown.

While nVidia is testing with 130nm Low-K dielectrics, ATI is off pissing in the wind on a 150nm process.

Where is it known that NVIDIA are working on low-k? Last it was said, IBM are still having difficulties with it, and reports suggest that TSMC are further along the line. And, as you probably know, ATI are using .13u, but how do you know they are not using low-k already?

Sure, nVidia had problems this time, but it's the exception

It's no given that it's an exception. You hope it will be an exception, but that's all you can really state on the matter.

What's ATI going to do when nVidia is utilizing a derivative of STI/AMD's 10S or 11S (11S is Cell's 65nm process slated for 2H 2004/1H 2005 production) process at IBM?

IBM's customer fabs are well behind their own, got any proof that this is ready for customers in these timescales?

That TSMC's sub-par roadmap and execution of <100nm is adequate?

Just because NVIDIA keeps making noise about their issues doesn't necessarily mean they are TSMC's issues. Again, reports suggest that TSMC's roadmap is going better than IBM's customer roadmap at the moment.

Again, though, this is all bunk - ATI's deal with MS is a technology licensing deal; MS still have all the options open as to how they fab this. And a deal with Intel may not be out of the question.
 
whql said:
Vince said:
See what the always present and sexy as hell GeeForcer posted about vendors' "volume," "projections," "losses" and "XBox."

Hey, perhaps you might like to look up the terms of the deal - this is a licensing deal, ergo none of the inventory issues that NVIDIA had, or the fixed chip-cost issues that MS had, apply here.

Er, didn't you imply that the Xbox deal was great for Nvidia and their margins were sky-high, while in actuality the Xbox business is very low-margin (as stated in their earnings calls multiple times) and could have very well cost Nvidia money if it wasn't for arbitration (as quoted above)?
 
Well, I can make no sense of this decision.
This would seem to benefit only Sony and ATI.
I think we should wait until the picture clears up considerably before making any guesses about the technology that will be appearing in the next GC and Xbox consoles.
A bit of a downer, though. I was really pulling for a resurrection of Trident...
 
Er, didn't you imply that the Xbox deal was great for Nvidia and their margins were sky-high, while in actuality the Xbox business is very low-margin (as stated in their earnings calls multiple times) and could have very well cost Nvidia money if it wasn't for arbitration (as quoted above)?

The profit margin is high, after the arbitration. It was cited that NVIDIA were receiving $50 for the components they sell to MS for the XBox, and yet they sell chips such as NV30 to board vendors for only $18, so the margins are much higher. The arbitration went on because NVIDIA had issues with inventories and MS had issues with the price they were paying for the chips.

However, Vince originally said "what's the profit margin on NV2A", but that doesn't apply to the Xbox2 graphics - this, being a licensing deal, is 100% profit (it's just that overall revenues will be lower). Given that MS are paying for the development of the part, the only downside could be the potential impact it may have on other parts of their business, assuming that XBox2 isn't successful enough to offset those potential damages.

And this is what ATI are quoted as saying about this already:

Q: Comment on resources to commit to XBox?

A: Key is to leverage technology; been working on it for a long time; have already hired a certain number of people for this contract; no requirement for large addition of resources. No concern about impact on PC roadmap
 
whql said:
The profit margin is high, after the arbitration. It was cited that NVIDIA were receiving $50 for the components they sell to MS for the XBox, and yet they sell chips such as NV30 to board vendors for only $18, so the margins are much higher. The arbitration went on because NVIDIA had issues with inventories and MS had issues with the price they were paying for the chips.

Would you have a source for these numbers (NV2A - $50, NV30 - $18)? I think the whole point of MS going for a licensing deal was to control costs better (read: among other things, pay less) to the graphics supplier. The fact that Nvidia had to go to arbitration in order to avoid losing money should be a point worth noting to anyone dealing with them in the future.

BTW, what revenues/profits does ATi garner from the GC deal, considering that this would presumably be a model for the Xbox deal?
 
Not that I'd care to dig around for them. The NV2A figure was an old number from Huang early on when they got the deal, and the $18 thing was from a relatively recent report saying that NVIDIA and ATI get about $18 for NV30 / R300 respectively, whereas Intel makes about $180 per P4.
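To make the volume-versus-margin argument concrete, here's a rough back-of-the-envelope calculation using the per-chip figures quoted in this thread; the unit volumes below are invented purely for illustration, not real data:

Code:
# Rough illustration of the per-chip revenue figures being thrown around in the
# thread ($50 per NV2A, ~$18 per NV30/R300 to board vendors, ~$180 per P4).
# The unit volumes are made up purely to show the arithmetic.

chips = {
    "NV2A (Xbox, per thread figure)": {"price": 50,  "units": 5_000_000},
    "NV30 (to board vendors)":        {"price": 18,  "units": 10_000_000},
    "P4 (per thread figure)":         {"price": 180, "units": 30_000_000},
}

for name, c in chips.items():
    revenue = c["price"] * c["units"]
    print(f"{name}: ${c['price']}/chip x {c['units']:,} units = ${revenue:,}")

# The point of the "which would you rather have" argument: a high per-chip price
# only matters in combination with volume and with what the chip costs to make;
# a royalty/licensing deal trades per-chip revenue away for near-zero cost of
# goods, which is why the two sides keep talking past each other on "margins".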
 
whql said:
DX9 favours ATI, and now DX10 is most likely to as well. And, yes, in terms of API compatibility MS does play favourites - all you have to do is look at the development of DX since DX5.

Maybe you can get Chappers to agree with this Microsoft conspiracy that borders on the illegal. But it would seem most people don't agree.

Then that makes your statement even more nonsensical - the low-level silicon design was clearly a horrible mess, otherwise it wouldn't have spent an extra year in silicon development and been canned as soon as it was released. What ATI achieved with .15u is probably a far better feat - first to 100M transistors, and with speeds that exceed the limits of what NVIDIA told people .15u could do!

Wait, what I stated is nonsensical but you can make such blatant and wrong statements? First IC to 100M transistors? In what world, dude?

Exceeds what 150nm can do? Off the top of my head, Power4 has 170M transistors on a 180nm SOI process and runs at >1GHz commercially and over 1.4GHz at TJ Watson AFAIK. So, how is that significant?

ATI fanaticism cum stupidity bores me.

First, the Pentium4's original design, which entailed, IIRC, stuff like added FPUs, level-3 cache, etc., was cut due to cost barriers. What you stated is insignificant and unrelated.

It was also designed to scale up in clock speeds, which was clearly a long-term decision - it sucked in comparison to AMD on 180nm, but as soon as they moved to Northwood the balance began to shift - that was the long-term approach.

Ok, what part of "insignificant" and "unrelated" did you not comprehend? The initial Pentium4 core was stripped down significantly due to economic and manufacturing viability reasons, just as the NV3x was, according to many people here. What you stated has absolutely no bearing on this line of reasoning.

http://www.eetimes.com/story/OEG20001213S0045

There is no precursor to a unified shader architecture in NV30 at all. It is also limited by integer shader units which do not match the major API that it's operating on. Its 4x2 configuration is also probably legacy as well - if NV40 turns out to be an 8x1 design and fully float then, yes, NV30 was a dead end.

OK, this discussion is going nowhere fast. Please go read the discussions concerning the NV3x in the 3D Architecture forum and then come back; take note of the TCL front-end and what it's composed of. Then look at where 3D processing is going [e.g. unified shaders] and think about the R300. Now tell me which is closer...

As for your previous post, your own <bleep> rant on the apparent world domination of Cell, well, we'll see how that plays out - the link you posted points to the same limitation that all these things will have, being process issues. This is what any computing device will ultimately be limited by, but MS and Sony are taking different approaches as to how that works with graphics, and that will be the biggest unknown.

Alrighty then, wasted my time.

IBM's customer fabs are well behind their own, got any proof that this is ready for customers in these timescales?

Never read one of Panajev's posts huh?

Again, though, this is all bunk - ATI's deal with MS is a technology licensing deal; MS still have all the options open as to how they fab this. And a deal with Intel may not be out of the question.

So, my comments, which have a factual basis that I cite, are "bunk", but your horseshit/blatantly false comments and other such stab-in-the-dark guesses (Intel?) are acceptable? Excuse me while I :rolleyes:
 
Vince said:
Maybe you can get Chappers to agree with this Microsoft conspiracy that borders on the illegal. But it would seem most people don't agree.

Surely you aren't so blind as not to notice that each revision of DX has mapped more closely to one vendor than another. I seem to remember some old comments from Brian Hook that stated exactly this - they'd play their "vendor of the year".

Wait, what I stated is nonsensical but you can make such blatant and wrong statements? First IC to 100M transistors? In what world, dude?

Did I say that? Were they not first to 100M in the consumer 3D space? Did they not reach that on .15u?

Exceeds what 150nm can do?

These were in NVIDIA's presentations.

ATI fanaticism cum stupidity bores me.

You've proven to be a fine one to talk.

Ok, what part of "insignificant" and "unrelated" did you not comprehend? The initial Pentium4 core was stripped down significantly due to economic and manufacturing viability reasons, just as the NV3x was, according to many people here. What you stated has absolutely no bearing on this line of reasoning.

Regardless of its start, Pentium4 had a (relatively) long-term future, and one that arguably outshone its competitors after a rough start. The NV30 architecture is already being talked into obsolescence by NV40, less than a year after NV30's actual release - again, if NV40 is fully float and an 8x1 design then you'll see that NV30 was actually an architectural dead end, whereas the P4 wasn't.

OK, this discussion is going nowhere fast. Please go read the discussions concerning the NV3x architectural diagrams in the 3D Architecture forum and then come back. Just so you know, we're getting to the point where "4*2" has less meaning...

4x2 has less meaning because multitexturing has less meaning in the shader age we're moving into - this is a dead end as time goes by.

So, my comments, which have a factual basis that I cite, are "bunk", but your horseshit/blatantly false comments and other such stab-in-the-dark guesses (Intel?) are acceptable? Excuse me while I :rolleyes:

Well, at least I cared to look at the terms of the deal before spouting off on rants that you have no clue whether they apply. And, yes, Intel was just a suggestion; however, there are still many things to be answered here, and you have to assume that MS are going to go for the best solutions. With this being a royalty-based IP deal, there are massive questions to be answered as to who will do the chip layout (I suspect ATI will still have a huge hand in this) and who it will be fabbed by - these questions will only get answered once we learn more about who and what is going to be in the XBox. Basically, all bets are off for this being a TSMC-fabbed thing, and the royalty deal blows the doors wide open as to how the technology is going to be implemented and made.
 
whql said:
Wait, what I stated is nonsensical but you can make such blatant and wrong statements? First IC to 100M transistors? In what world, dude?

Did I say that?

Um, let's see what you said:

whql said:
What ATI achieved with .15u is probably a far better feat - first to 100M transistors, and with speeds that exceed the limits of what NVIDIA told people .15u could do!

Alrighty then, on to the next thing.

These were in NVIDIA's presentations.

So, that makes it a feat how? Well, I mean, other than fanb0y arguments based on linguistic and semantic games.

ATI fanaticism cum stupidity bores me.

You've proven to be a fine one to talk.

Regardless of its start, Pentium4 had a (relatively) long-term future, and one that arguably outshone its competitors after a rough start.

The sky is blue, elephants are big. Any other off-topic, useless things you want to throw in here?

The NV30 architecture is already being talked into obsolescence by NV40, less than a year after NV30's actual release - again, if NV40 is fully float and an 8x1 design then you'll see that NV30 was actually an architectural dead end, whereas the P4 wasn't

This is completely beside the point. First off, this doesn't address the TCL front-end design. It also caters to the legacy ideal of "X pipes * X TCUs", which is useless - it serves us no good.

Besides, the NV3x will continue on; of course it'll be all FP (isn't the NV35?), and the actual architecture may very well eventually become a hybrid architecture, a direction that was begun in the NV3x vertex shader architecture. This is fact; the R300 doesn't have this. We're arguing over a non-issue.

4x2 has less meaning because multitexturing has less meaning in the shader age we're moving into - this is a dead end as time goes by.

It has less meaning because there's no reason to have such a static architecture consisting of TCUs in this fixed manner. Eventually architectures will have something like a pseudo-NV30 front-end that allows resources to be shared between fragment and vertex tasks. This is remarkably similar to the design of Cell and what we anticipate it'll be.
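To illustrate what I mean by resource sharing, here's a toy sketch of a shared pool of shader units being split between vertex and fragment work each cycle, versus a fixed split - the unit counts and workload numbers are made up, it's only the scheduling idea:

Code:
# A toy sketch of the "resource sharing" idea: one pool of shader units that gets
# split between vertex and fragment work each cycle, versus a fixed 4/4 split.
# Workload numbers are invented; this only illustrates the scheduling concept.

POOL = 8  # shared shader units

def unified(vertex_work, fragment_work):
    """Give each cycle's units to whatever work is actually queued."""
    cycles = 0
    while vertex_work > 0 or fragment_work > 0:
        total = vertex_work + fragment_work
        v_units = min(vertex_work, round(POOL * vertex_work / total))
        f_units = POOL - v_units
        vertex_work = max(0, vertex_work - v_units)
        fragment_work = max(0, fragment_work - f_units)
        cycles += 1
    return cycles

def fixed(vertex_work, fragment_work, v_units=4, f_units=4):
    """Fixed split: idle vertex units can't help with fragment work, and vice versa."""
    cycles = 0
    while vertex_work > 0 or fragment_work > 0:
        vertex_work = max(0, vertex_work - v_units)
        fragment_work = max(0, fragment_work - f_units)
        cycles += 1
    return cycles

# A fragment-heavy frame: 10 units of vertex work, 90 of fragment work.
print("unified pool:", unified(10, 90), "cycles")
print("fixed 4/4:   ", fixed(10, 90), "cycles")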

Well, at least I cared to look at the terms of the deal before spouting off on rants that you have no clue whether they apply. With this being a royalty-based IP deal, there are massive questions to be answered as to who will do the chip layout (I suspect ATI will still have a huge hand in this) and who it will be fabbed by - these questions will only get answered once we learn more about who and what is going to be in the XBox. Basically, all bets are off for this being a TSMC-fabbed thing, and the royalty deal blows the doors wide open as to how the technology is going to be implemented and made.

Actually, you interpreted it like this. There is no way that the problems of the nVidia/Xbox deal didn't influence this one's licensing conditions. But this doesn't mean that ATI can just design an open-ended processor that can be used at XXX Fab or YYY Fab at a moment's notice... The processor will be designed for a specific fab/line with specific libraries utilized... what are you thinking? I'm confused. ATI is intimately bound to the eventual fab during the design, unless Microsoft is doing the synthesis and back-end stuff themselves, but that's highly doubtful. Thus, this is relevant, and very much so. Is it up for debate? Of course, but ATI is definitely not in the same position as nVidia.
 