AMD is an ideal takeover target for any well-off company looking to make it big in the CPU industry. Possible, but still speculation.
IBM themselves are big in the processor market, or at least as big as you can get without being AMD or Intel. It would be crazy interesting, however, as it would allow IBM to enter the CPU markets where they really have no leverage right now: the consumer segment and the server segments where they don't have popular parts.
EDIT: Also of note is that the Xbox 360, PS3, and Wii all have IBM-designed CPUs.
The money isn't as good. I'll be quite shocked if IBM is actually behind this. They left the hard drive market, the notebook/PC market, and at least one other hardware business (memory escapes me). They are not going to purchase AMD.

Not in the x86 market they're not, which frankly, by now, is the only ISA that matters in the PC market. If they can enter the console CPU market with such force, what's the problem with entering the PC CPU market?
A sale now, if it happens, would be consistent with the strategy pursued by Samuel J. Palmisano, who became I.B.M.'s chief executive early in 2002. He has sold hardware businesses where profits were slender and growth prospects were limited, like its hard disk drive business, which was sold to Hitachi.
Instead, Mr. Palmisano has bet on expanding the company's services business, automating a full array of operations - from product design to sales-order processing - for corporate customers. I.B.M. now casts itself as a company that does not simply sell technology but serves as a consulting partner to help its customers use technology to increase the efficiency and competitiveness of their businesses. As part of that strategy, he bought PricewaterhouseCoopers Consulting for $3.5 billion, in a deal that closed in October 2002.
"Palmisano's getting out of businesses that aren't growth opportunities and concentrating on what I.B.M. does best," said Mark Stahlman, an analyst at Carris & Company. "PC's are not where the growth is."
To trim costs, I.B.M. has steadily retreated from the manufacture of its PC's. In January 2002, it sold its desktop PC manufacturing operations in the United States and Europe to Sanmina-SCI, based in San Jose, Calif. I.B.M. now confines its role in PC's to design and product development out of its offices in Raleigh, N.C., with all the I.B.M.-brand desktop or notebook computers made by contract manufacturers around the world.
Well, I was correcting nonamer. The full list would be:
X800XTPE > 6800Ultra > X800XT > 6800GT > X800Pro > X800 = 6800 > 6600 GT > X600 > 6200 = X300
But I do agree that if there is a part that screams bang for buck (9500P, 6800GT, 7900GT), it's a sure-shot success regardless of how the flagship performs.
Don't forget about 6800 Ultra Extreme, lol. Nice paper launch BS there, to semi-stomp X800 XT PE. You can still find X800 XTPE around. Good luck finding 6800UE.
IMO the X800s were more than a match for the 6800. SM3 was useless at the time, and SM2.0b is still good enough 90% of the time today, with rare exceptions like RS Vegas (the only one off the top of my head). The 6800 was more efficient per clock with texturing/shading, but the X800 could run its clock a lot higher. And it was a somewhat smaller chip with lesser cooling needs; the X800 XT PE had a single-slot cooler.
ATI had better filtering and AA too. Significantly better, IMO. I've run a 6800 Go in my notebook and an X850 in my desktop for over a year now and have compared them on many different games.
One major oddity was Oblivion. 6800 was murdered by that game. X850 XT can stand with the X1800s and alongside a few SLI setups. 6800 Ultra isn't even close. The 7800/7900s fixed that up a bit. 6800's performance was quite odd. I've been of the opinion that it came down to the low clock speed and resulting lesser geometry performance. But that doesn't explain the gap on that page.
I don't think the 6800 Ultra ever "cleaned up" the X800 XT PE. They were basically on par for almost everything, with NV falling far behind in a few titles. The fact that NV went out of its way to make an Ultra Extreme says it all. ATI only launched with the XT PE and the PRO. Just look through the VGA Charts link.
Doom3 had issues with speed until Catalyst 4.9, thanks to a texture lookup performance issue. I had a 9700 way back then though. NV was very competitive with OpenGL though, for sure.
http://www.techreport.com/reviews/2004q4/radeon-x850xt/index.x?pg=4
http://www23.tomshardware.com/graphics.html?modelx=33&model1=536&model2=574&chart=214
There's definitely something wrong with the 6800 in OB. I spent a lot of time with OB in 2006 and can tell you with total certainty that the X850 XT is in a separate class from the 6800 series in that game. I played the same saved game on both the 6800 Go @ 385/770 at 1440x900 and the X850 XT @ 1920x1200. Even with the res difference, the X850 XT was a lot faster. And this is without HDR; I only used bloom. I was doing crazy stuff like forcing High Performance mode in the NV CP and turning off grass completely, and it still ran terribly. My friend has a 6800 GT, and performance was just as obviously not right on that board, too.
Far Cry was ATI's purview as well, even with the beta SM3 additions, which didn't do much for speed (and some of which ATI supported too). The X8x0 was the card to have for some games, for sure.
Well, for the games I enjoyed back then, my 6800 GT (and later Ultra Extreme) ran better than the X800. Also, the X800 XT PE was not first; the X800 XT launched along with the Pro, and the XT PE followed later, same as with the 6800 Ultra Extreme. As for Oblivion, I consider the game to have an ATI bias, much the same way people say Doom has an Nvidia bias. As for X800 vs 6800 in the Doom3 engine, ATI caught up, but they never actually ran any faster. I remember testing a 5600XT w/ 78.01 drivers vs a 9600 Pro w/ 6.2 drivers, and the 5600XT was a lot faster, and the 9600 Pro is just a quarter of an X800 XT, seeing how very little actually changed from the 9700 to the X800 XT. Also, as for Far Cry, sure the X800 might bench higher, but I prefer how it looks with HDR on the 6800 vs bland and old-looking on the R420 cores.
No, the X800 XT PE was launched alongside the PRO. There was no plain XT, at least not initially.
True, the HDR in Far Cry can be seen as an advantage, but my impression was that it was a rather ugly implementation, and that it cost a ton of speed.
FTN Midwest analyst Joanne Feeney said it is likely people will think private equity firms would be interested in Advanced Micro Devices Inc. (AMD) because its balance sheet is under pressure.
(An item at 12:12 p.m. EST Monday incorrectly said Feeney believes it is likely AMD will see interest from private equity firms.) > Dow Jones Newswires
02-26-07 1617ET
Copyright (c) 2007 Dow Jones & Company, Inc
"lol"http://www.bloomberg.com/apps/news?pid=20601103&sid=aKJuc0rBcirA&refer=news
This is totally expected for us in the know. AMD hasn't made any compelling products for a while, and the delay of the R600 hurts them further in the GPU market (who seriously thinks they intentionally delayed the R600 now?).