What will the AMD-ATI acquisition mean for future chipsets?

Jakob said:
Is AMD's process technology so much better than TSMC's? There's still no word on 65nm from AMD, while TSMC has been in volume 65nm production since May.

http://biz.yahoo.com/bw/060517/20060517005370.html?.v=1

They're different.

TSMC tailored theirs to do certain things really well, which means it's not as good at other things, and the same is true for AMD, or for that matter Intel or IBM.

There are various pics floating around the web of AMD's 65nm sample chips, BTW, and back in May/June they stated multiple times that they'd started bringing the equipment online, so I don't know what you mean when you say there's been no word on it at all. Looks to me like they're on schedule for a low-volume release at the end of the year, which is what they said they would do at their Tech Analyst Day in June.
 
SiliconAbyss said:
I find it very interesting how many people decide to assume AMD will kill off at least some of ATI's products. I also find it very surprising that anyone would suggest that somehow ATI and AMD have to compete for resources, and the net effect will be less ability to compete. This is the exact opposite of reality.

And you know this how?

One thing that IMO cannot be ignored is what happens if AMD/ATI start producing discrete cards in their own foundries. This will be huge. It will give them a manufacturing edge that the bulk processes of the pay-as-you-go foundries will not be able to match. More importantly, it will reduce costs significantly. NVIDIA should be very worried about this. NVIDIA should also be worried that they don't have a path to producing a CPU/GPU hybrid, which I think is inevitable. Not to say the discrete card will go away. It won't. For the foreseeable future, the high end is going to be a discrete part. But IMO we are going to see more and more computers, devices, etc. with hardware-accelerated pixel capabilities, and the ideal device for this is a CPU that can push out pixels.

A CPU/GPU hybrid will be a low-end solution, relevant only for laptops, workstations, etc. It means nothing for the brand as such and won't win any benchmarks. As we all know, it's the high-end lead that sells the low-end stuff, so I'm curious how that will be reflected in sales. That, together with the costs, might mean something though.

I still don't see how AMD/ATI should make either Intel or NV scratch their heads. I still think they'll diverge from the current compatibility path and try to lock users into proprietary interfaces, etc., and that strategy will only make Intel happier. I hope I'm wrong, but I see no other way for it to work to any extent. Only if they're able to offer a faster platform at a competitive price will they survive.
 
Xbit has an AMD/ATI interview up.

They are clearly aiming at some kind of co-marketing starting early next year. How much technical difference there will be initially is an open question. Maybe a combined driver stack; OEMs like that kind of thing.

Tom McCoy: Each company currently has roadmaps, and those roadmaps are promises to customers, and we are going to deliver on those promises. We think that in the beginning of next year we will be able to come to market with more optimized platforms delivering on our existing roadmaps, to the delight of customers and, we think, to the delight of shareholders as well. The longer-term strategic value of the transaction comes in the roadmap that we will now innovate together, reflecting products that will be coming to market in 2008 and beyond.

http://www.xbitlabs.com/articles/editorial/display/amd_atyt_interview_5.html
 
A few things not mentioned in this thread: licensing fees now only have to be paid once (well, pending the close), since it's one company; I think one of the original articles on the merger put that at $70M. Another would be testing turnaround. I can imagine that, if AMD wishes to, they can do a one-off run of graphics chips for testing purposes and then, once it's ready for production, use the outside foundries. AMD will be switching to new processes every year or so, so there's the question of what to do with the older tech in the meantime; they can use it for GPUs if they choose to make that work. In other words, they have the ability to prioritize.

I think the advantages are similar to two people dating but each keeping their own apartment vs. moving in together: only one rent bill to pay, but you have to cooperate more. :)
 
One of the more interesting features of this deal, to me, is how thoroughly, pre-announcement, we all had the fab angle bass-ackwards from how AMD actually sees it. And from AnandTech comes another example showing how true this is.

A secondary part of that requirement is that you need to have something to manufacture at older fabs before you upgrade them to help extend the value of your investment. By acquiring ATI, chipsets and even some GPUs can be manufactured at older fabs before they need to be transitioned to newer technologies (e.g. making chipsets at Fab 30 on 90nm while CPUs are made at Fab 36 at 65nm).

Once the New York fab is operational, AMD could have two state of the art fabs running the smallest manufacturing processes, with one lagging behind to handle chipset and GPU production. The lagging fab would change between all three fabs, as they would each be on a staggered upgrade timeline - much like how Intel manages to keep its fabs full. For example, Intel's Fab 11X in New Mexico is a 90nm 300mm fab that used to make Intel's flagship Pentium 4/D processors, but now it's being transitioned to make chipsets alongside older 90nm CPUs while newer 65nm CPUs are being made at newly upgraded fabs.

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2807&p=2
 
Well, I don't think this will happen on this process shift, because while ATI spends time reworking their chipsets and GPUs, AMD will be revamping their fabs. I suppose they might do it if ATI does some rush hack job. However, I just don't see that happening, not with the comments AMD/ATI have been making about not using AMD fabs until 2008.
 
I thought a bit of necromancy might be fun here.

So, it's been two years. I'm curious as to what people think now about the future direction of AMD/ATI.
 
I'd have to say it's looking a bit better after the last week. :yep2:
While the 4800 series is indeed impressive, it's mostly impressive because of its price and because of its size, not because of its performance. It makes me wonder if AMD/ATI has, indeed, conceded the high-end.
 
Well, let's see if they get a 4870 X2 out the door significantly sooner than NV can shrink GTX 280, or vice versa.
 
While the 4800 series is indeed impressive, it's mostly impressive because of its price and because of its size, not because of its performance. It makes me wonder if AMD/ATI has, indeed, conceded the high-end.
I think it's just reflective of their new strategy, but we will see. :)

It's all gonna come down to Crossfire and how well it works out I think.
 
While the 4800 series is indeed impressive, it's mostly impressive because of its price and because of its size, not because of its performance. It makes me wonder if AMD/ATI has, indeed, conceded the high-end.

The 4800 series is impressive because it has price-to-performance leadership.
 
While the 4800 series is indeed impressive, it's mostly impressive because of its price and because of its size, not because of its performance. It makes me wonder if AMD/ATI has, indeed, conceded the high-end.


I think it was Jawed who said that ATI have not conceded the high end - they've just decided to address it with multi-GPU/Crossfire rather than a single monolithic chip.

We'll have to wait and see if X2 works and beats GTX280, where Nvidia can't build an X2 of their own to compete.

Really, it's no surprise, given what we know of AMD's Fusion plans, that they've decided to take this approach. I think all the GPU companies will do so eventually, much as the CPU companies have already gone multi-CPU.
 
I think it was Jawed who said that ATI have not conceded the high end - they've just decided to address it with multi-GPU/Crossfire rather than a single monolithic chip.

We'll have to wait and see if X2 works and beats GTX280, where Nvidia can't build an X2 of their own to compete.

Really, it's no surprise, given what we know of AMD's Fusion plans, that they've decided to take this approach. I think all the GPU companies will do so eventually, much as the CPU companies have already gone multi-CPU.

So you're mixing "multiple-CPU" with "multiple-dies" with "multiple-cores"?
Even "Nehalem" dropped the dual-die design for the more affordable quad-core version. That should have told us something right there, no ?

Also, I don't see how a modern GPU isn't multi-core already, or how a dual-die approach is any better than having both sets of resources in one chip.

Surely the power/performance ratio would be worse (anyone thinking that two 55nm RV770s would have lower power consumption than a single, monolithic 1600 SP chip is pipe-dreaming, IMHO), and inter-chip communication through PCIe bridges would be worse than intra-chip communication, etc.

The "Larrabee" project is the sum of it all. Many basic x86 cores... built on a single die.
 
Yeah, as INKster mentions, there's no improvement to be had by going multi-core GPU. Multi-processor CPUs are mostly that way because the programming interfaces for CPUs are focused on single-threaded execution. Multi-threading is usually sort of "tacked on", while parallelism in a GPU is built in.

Basically, the only reason to go for a multi-GPU setup is if the added cost/yield loss of going for one large GPU ends up being greater than the efficiency loss from linking two smaller GPUs.
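
As a back-of-envelope illustration of that tradeoff, here is a minimal sketch using a simple Poisson yield model. The wafer cost, defect density and die areas are assumed, illustrative numbers, not real foundry data:

```python
import math

WAFER_COST = 5000.0                    # assumed cost of one processed 300 mm wafer ($)
WAFER_AREA = math.pi * (300 / 2) ** 2  # ~70,700 mm^2, ignoring edge loss
DEFECT_DENSITY = 0.005                 # assumed defects per mm^2

def cost_per_good_die(die_area_mm2: float) -> float:
    """Wafer cost spread over the dies that actually yield (Poisson model)."""
    dies_per_wafer = WAFER_AREA / die_area_mm2
    yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
    return WAFER_COST / (dies_per_wafer * yield_fraction)

one_big = cost_per_good_die(576.0)         # one GT200-class die
two_small = 2 * cost_per_good_die(260.0)   # two RV770-class dies

print(f"one ~576 mm^2 die:  ${one_big:.0f}")
print(f"two ~260 mm^2 dies: ${two_small:.0f}")
# The dual-GPU board only wins if its efficiency loss (CrossFire scaling,
# duplicated memory, bridge chip) costs less than this silicon difference.
```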
 
So you're mixing "multiple-CPU" with "multiple-dies" with "multiple-cores"?

Yeah I am. ;) Shorthand for "multi-anything", depending on what they decide to do. Crossfire/SLI, X2, and Fusion/Larrabee are all different endpoints for "more than one GPU core", so I kind of mixed up all the different nomenclature.

Look at the cost/performance of the 4870 X2. It's the difference between a financially and technically viable product and one that won't see the light of day.

GTX 2xx may be the last of the monolithic GPUs, but do you think Nvidia would still have made it if they had known they were going to have to sell it at a loss (or at the very least well below their margin requirements)? Do you think Nvidia is going to keep making bigger and bigger monolithic chips? Imagine what the next generation of monolithic chip would have to look like in terms of size, power, heat, yields, price, etc. I just can't see it being feasible for the industry to try to live off where that road goes.
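
To make the "next generation of monolithic chip" worry concrete, here's a quick, hypothetical extrapolation; the defect density and the idea of doubling the design on an ideally scaled 55nm process are assumptions for illustration only:

```python
import math

DEFECT_DENSITY = 0.005  # assumed defects per mm^2 (illustrative)
RETICLE_LIMIT = 858.0   # ~26 mm x 33 mm exposure field, the practical single-die ceiling

def poisson_yield(area_mm2: float) -> float:
    """Fraction of dies with zero defects under a simple Poisson model."""
    return math.exp(-DEFECT_DENSITY * area_mm2)

current_area = 576.0                              # roughly GT200-class on 65 nm
linear_shrink = 55.0 / 65.0
scaled_area = current_area * linear_shrink ** 2   # ideal (optimistic) 65 nm -> 55 nm shrink
doubled_area = 2 * scaled_area                    # doubling the unit count on the new node

print(f"ideal shrink of today's die: {scaled_area:.0f} mm^2, "
      f"yield ~{poisson_yield(scaled_area):.0%}")
print(f"doubled design on 55 nm:     {doubled_area:.0f} mm^2, "
      f"yield ~{poisson_yield(doubled_area):.0%} "
      f"(reticle limit ~{RETICLE_LIMIT:.0f} mm^2)")
```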
 
Basically, the only reason to go for multi-GPU setups is if the added cost/yield loss for going for one large GPU ends up being more than the efficiency loss from linking two smaller GPU's.

Obviously?

Even Nehalem's current size would be suboptimal (though not by much, as Intel's 45nm process gives them quite some leeway on cost/yields).

Now look at Barcelona. Not really the same case, eh? Even with Deneb, they just barely got it on par with Nehalem (10% less?), and AMD's 45nm... oh.

Propus is really the only monolithic quad-core chip poised to be commoditized without being overly dependent on external conditions. If the cuts are right, it's about half a Barcelona.

Now look back at GPUs. GT200... G80 (a fluke) and R600 (seven months of bad luck). G92 made it back to the ideal size for a high-end part for half a year.

The problem mainly revolves around perf/mm2 stagnating quite badly (at least in nVidia's case), mostly due to overly conservative design choices.


If ATI had had a different vision with the same development steps, an equivalent chip of theirs (R7xx generation; >500mm^2 not needed, they could do it within 350-450mm^2) would have given the GT200 quite a dose of enlightenment.
 