Anandtech: AMD-ATI Merger in the Works?

heh Chalnoth is part of the "Inq ruin IT World!!1!" brigade so will see the negatives but not the many times when they are factually correct or close to it. The reason I read it is because it's funny and they do report rumours that turn out correct. Their coverage of HP-Compaq was excellent, amongst many other topics. In this case it's not so much the Inq as a single entity taking both positions; the individual reporters at the site are reporting on two different rumour sets.

Charlie is convinced that it will happen and you can read it here. Well, sorry to disappoint him but at this time there are no serious talks between the two.
Try reading who the piece is by.

Maybe there's a power struggle internal to the Inq ;)

Imagine a world without all this rumour and innuendo - how boring.
 
Considering all the complaints people make about the Inq's reporting, there are an awful lot of links to their site posted here. :)
 
IgnorancePersonified said:
heh Chalnoth is part of the "Inq ruin IT World!!1!" brigade so will see the negatives but not the many times when they are factually correct or close to it. The reason I read it is because it's funny and they do report rumours that turn out correct.
Well, of course they do, because they take a shotgun approach and report everything. Since they're wrong at least as often as they're right, there's really not much point in reading them.
 
nutball said:
Right, which is why your bold statement that "oh well if this doesn't work then you trivially change to doing it another way" isn't founded on reality. The current parallelisation strategies employed for commonly-used algorithms in high-performance computing are the result of hundreds (probably thousands) of man-years of research and development over decades. If they stop working, it's really not just a case of sitting down with a pencil for twenty minutes and coming up with something better. It's very expensive and time-consuming to change your underlying parallelisation strategy. More often than not it entails a total re-write of your code which, if it's a couple of million lines of FORTRAN or whatever, is not a task you take on before breakfast.
Well, for games in particular, you have a number of things that help you over some of the applications you're talking about. First of all, modern game development isn't done in Fortran, which is bonus number one. Programming in C (or better yet, C++) makes the code more easily extensible. I also expect that as multicore becomes more mainstream, we'll see better compiler tools and language extensions for dealing with multithreaded scenarios.

The second big help is game engine licensing: a big game engine developer can put in the large number of man-hours required to make good use of multicore. In order for the games market to remain healthy, game engine licensing is basically going to have to become much more common.

The final thing is that you don't need to make the entire game engine run in parallel: only a few performance-intensive sections do. That is a rather dramatic reduction in the amount of work required to make good use of multicore.
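To make that last point concrete, here is a minimal sketch of what parallelising a single hot section might look like, assuming a hypothetical particle-integration loop and OpenMP (one of the compiler-driven tools already available for C++); everything outside this loop stays single-threaded:

#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Hypothetical performance-intensive section: advance all particles one step.
// The pragma asks the compiler (built with -fopenmp) to split the iterations
// across cores; each particle is touched by exactly one thread, so no locks
// are needed. OpenMP requires a signed loop index here.
void integrate(std::vector<Particle>& particles, float dt)
{
    #pragma omp parallel for
    for (long i = 0; i < (long)particles.size(); ++i) {
        Particle& p = particles[i];
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

int main()
{
    std::vector<Particle> particles(100000);
    integrate(particles, 0.016f); // one 60 Hz frame step
    std::printf("advanced %u particles\n", (unsigned)particles.size());
}

One annotation on one loop, and the rest of the engine is untouched; that is the scale of work being argued for here.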
 
Back on topic: I don't think a merger would be completely senseless.

For many quarters now, the biggest GPU manufacturer (by unit volume) has been Intel, and they have been growing their market share despite the fact that their integrated graphics cores look more and more feeble against even low-end standalone GPUs, as well as against IGPs from ATI and NV.

The primary reason for this is that Intel is no longer branding their processors but rather their platform: first the hugely successful Centrino, and now they are trying the same with Viiv. The reason is clear: as much of the innards of a PC as possible should be made by Intel, thereby increasing revenue.

AMD needs to respond to this. One way would be to buy (merge with) ATI.

To predict the future one must look back at history. In the past we had:
Northbridges with DRAM controllers
Southbridges with peripheral controllers (IDE/ATA, PCI, ISA etc)
Discrete sound cards
Discrete graphics cards
Discrete NICs
etc.

Today most PCs just have a northbridge (or a northbridge and a very slim southbridge).

The trend of integrating ever more stuff into ever fewer chips is clear. Merging the MPU and the northbridge is not a question of if but of when, IMO.

Looking at AMD's just-held analyst meeting, it looks like they are building an on-chip infrastructure to (relatively) easily plug in different modules/application-specific accelerators; this could be used to integrate chipset functionality.

There are of course pros and cons to merging with ATI; cons first:
Total integration
1. Integrating chipset functionality into the MPU -> more MPU SKUs for AMD to produce, with associated costs (masks, stock etc.).
2. More complexity -> longer time-to-market.
3. Shared bandwidth between MPU and primarily GPU -> potential lower performance (see below though).
Standalone GPUs:
4. Existing (and planned) designs will have to be reworked to fit AMD's process.
5. Merging two different business structures/cultures takes time and money -> product slips.

Pros:
Total integration:
1. Cost: One package with everything is cheaper to produce than multiple packages -> lower total SKU cost or higher revenue.
2. Lower pinout: only some PCI-Express lanes are needed, and all HT links can be ditched -> fewer pins -> lower cost.
3. Fixed memory system. Motherboards essentially become a bunch of DRAM chips, some ports, and a few PCI-E slots. Solder various fixed amounts of DRAM onto the boards and sell those as different SKUs. The point is that by soldering the DRAM on and making it single-rank, you get a DRAM topology similar to current GPUs, with a vast jump in memory interface speed (see the rough numbers after this list). This leads to a big increase in bandwidth and may completely negate the effect of con 3 above.

Standalone GPUs:
4. ATI gets access to a process with world-class performance. Trade some of that performance for power savings, and the products should be very competitive with foundry-produced GPUs.
5. AMD can amortize cost by using older-process fabs to produce GPUs. This is what Intel does today, building chipsets (and Itaniums) in their n-1 fabs (n being the bleeding-edge node). In fact AMD needs this, since they ditched their flash business, which used to soak up the capacity of old fabs.
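
To put rough numbers on pro 3, here is a back-of-the-envelope bandwidth comparison. The memory configurations are illustrative assumptions only, not anything AMD or ATI have announced:

#include <cstdio>

// Peak bandwidth in GB/s = bus width in bytes * effective transfer rate.
double peak_gb_per_s(int bus_bits, int megatransfers_per_s)
{
    return (bus_bits / 8.0) * megatransfers_per_s / 1000.0;
}

int main()
{
    // Socketed dual-channel DDR2-800: two 64-bit channels at 800 MT/s.
    std::printf("Dual-channel DDR2-800 : %5.1f GB/s\n", peak_gb_per_s(128, 800));
    // Soldered, single-rank, GPU-style 256-bit interface at 1600 MT/s.
    std::printf("256-bit soldered DRAM : %5.1f GB/s\n", peak_gb_per_s(256, 1600));
}

That is 12.8 GB/s versus 51.2 GB/s, roughly a factor of four, which is why the soldered topology could plausibly swallow the shared MPU+GPU load of con 3.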

Whether the pros outweigh the cons is down to execution.

Cheers
 
More wood for the fire.

Tweaktown report said:
[...] From a trusted and reliable anonymous source, today we learned that Intel are talking behind closed doors about the real possibility of an AMD and ATI merger, with some seriousness, and discussing what it could mean for their future business and bottom line. [...]
Props to Tech Report for the link.

Given AMD's seemingly stronger relationship with nVIDIA in recent years, is there any particular attribute that makes ATI the more favourable potential merger/acquisition target, apart from ATI's somewhat undervalued stock?
 
ATi's rather sizable chipset market share is one thing. ATi is reasonably cheap for what they are.

While interesting to consider, I don't think it was ever a serious talk. If AMD really wanted a company to be able to make custom chipsets, they'd go for much smaller fish than ATi.
 
BrynS said:
More wood for the fire.

Props to Tech Report for the link.

Given AMD's seemingly stronger relationship with nVIDIA in recent years, is there any particular attribute that makes ATI the more favourable potential merger/acquisition target, apart from ATI's somewhat undervalued stock?

Okay, this is getting a bit freaky. Somebody is pushing this with an agenda beyond the usual rumor-mongering.
 
geo said:
Okay, this is getting a bit freaky. Somebody is pushing this with an agenda beyond the usual rumor-mongering.
Which only makes me more disinclined to believe it, but who would gain from a rumor of an AMD/ATi merger right now?

I'm asking, I really have no clue...I have trouble keeping track of the financial stuff. :oops:
 
Gubbi said:
Standalone GPUs:
4. ATI gets access to a process with world-class performance. Trade some of that performance for power savings, and the products should be very competitive with foundry-produced GPUs.

Exploiting the process envelope requires a lot of manual layout work and precise process characterization -- this has been the norm in the CPU business for quite some years now, but not in the GPU business. Other than a few press reports stating that ATI used "CPU design strategies" (in its 0.15u R300), I don't see any direct evidence they've continued with that approach.

I really don't see this happening, because GPU product cycles are much shorter and design strategies for CPUs and GPUs differ a lot. Doing a custom layout for a GPU may take six or more months -- and if that SKU has a market lifespan of nine months, the extra oomph may not justify the increased development cost. Conversely, GPUs contain a lot more ALUs per unit of die area than CPUs do, so in theory a GPU stands to benefit from custom layout.

5. AMD can amortize cost by using older-process fabs to produce GPUs. This is what Intel does today, building chipsets (and Itaniums) in their n-1 fabs (n being the bleeding-edge node). In fact AMD needs this, since they ditched their flash business, which used to soak up the capacity of old fabs.

Yes, but Intel has more fabs and sells a lot more product than AMD currently does. Keeping a fab operational, even at the n-1 node, costs a lot of money. Total production volume has to exceed a certain threshold for an in-house fab to outperform an external foundry; Intel is past that threshold, AMD is not (yet).
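
To illustrate that threshold, a back-of-the-envelope break-even calculation; all figures are made up for illustration and are not actual Intel or AMD numbers:

#include <cstdio>

// In-house beats the foundry once: fixed + marginal*V < foundry_price*V,
// i.e. V > fixed / (foundry_price - marginal).
double break_even_wafers(double fixed_cost_per_year,
                         double marginal_cost_per_wafer,
                         double foundry_price_per_wafer)
{
    return fixed_cost_per_year /
           (foundry_price_per_wafer - marginal_cost_per_wafer);
}

int main()
{
    // Illustrative figures (USD): $600M/year to keep an older fab running,
    // $1500 marginal cost per wafer in-house, $4000 per wafer at a foundry.
    std::printf("Break-even: %.0f wafers/year\n",
                break_even_wafers(600e6, 1500.0, 4000.0)); // ~240,000
}

Below that volume the fab is a money pit; above it, every extra wafer is cheaper than buying it from a merchant foundry.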

In the meantime, many IDMs (integrated device manufacturers), such as Motorola, LSI, and Texas Instruments, have moved to outsourcing some wafer production to merchant foundries (TSMC, SMIC, Chartered, UMC, etc.) precisely because their internal fabs weren't cost-competitive with a third-party foundry. Only the really exotic (or closely guarded) designs are barred from outside contractors; pretty much everything else is fair game.
 
Does anyone think AMD might be interested in ATI for its IP, with a view to using it within a CPU? Think AltiVec, Cell, etc.
 
Well, if it is serious, it seems to me much more likely that AMD is purchasing a portion of ATI's business, not the whole company.
 