AMD needs money?

Status
Not open for further replies.
There are some message boards claiming IBM is looking at buying AMD for between $22 and $25 per share.

In terms of fab/process, IBM doesn't seem too far-fetched....

Who knows.....lol
 
If it's not in Chinese, then "Farid's First Law" does not apply*.

*Farid's First Law: "Rumours about computer hardware companies in Chinese on the internet must be true."
 
AMD is an ideal takeover target for any well-off company looking to make it big in the CPU industry. Possible, but still speculation.
 
AMD is an ideal takeover target for any well-off company looking to make it big in the CPU industry. Possible, but still speculation.

IBM themselves are big in the processor market, or at least as big as you can get without being AMD or Intel. It would be crazy interesting, however, as it would allow IBM to enter the CPU markets where they really have no leverage right now: the consumer and server segments where they don't have popular parts.

EDIT: Also of note is that the Xbox 360, PS3, and Wii all have IBM designed CPUs.
 
At this point, I'm more likely to think it's some hedge fund. IBM stated a while back that they wanted to be a services company and were slowly departing from certain hardware areas.

epic
 
IBM would be a disaster I think. I don't think they understand low-margin/commodity markets (or they understand them and don't want to be part of them). Though that does depend on which IBM you're talking about of course. Which is also part of the problem, IBM is akin to a bunch of multi-national corporations flying in close formation.
 
IBM themselves are big in the processor market, or at least as big as you can get without being AMD or Intel. It would be crazy interesting, however, as it would allow IBM to enter the CPU markets where they really have no leverage right now: the consumer and server segments where they don't have popular parts.

EDIT: Also of note is that the Xbox 360, PS3, and Wii all have IBM designed CPUs.

Not in the x86 market they're not, which frankly, by now, is the only ISA that matters in the PC market. If they can enter the console CPU market with such force, what's the problem with entering the PC CPU market?
 
Not in the x86 market they're not, which frankly, by now, is the only ISA that matters in the PC market. If they can enter the console CPU market with such force, what's the problem with entering the PC CPU market?
The money isn't as good. I'll be quite shocked if IBM is actually behind this. They left the hard drive market, the notebook/PC market, and at least one other hardware segment (memory escapes me). They are not going to purchase AMD.

epic
 
here you go:
http://www.nytimes.com/2004/12/03/t...=c60a66b7afa86173&ei=5090&partner=rssuserland
A sale now, if it happens, would be consistent with the strategy pursued by Samuel J. Palmisano, who became I.B.M.'s chief executive early in 2002. He has sold hardware businesses where profits were slender and growth prospects were limited, like its hard disk drive business, which was sold to Hitachi.

Instead, Mr. Palmisano has bet on expanding the company's services business, automating a full array of operations - from product design to sales-order processing - for corporate customers. I.B.M. now casts itself as a company that does not simply sell technology but serves as a consulting partner to help its customers use technology to increase the efficiency and competitiveness of their businesses. As part of that strategy, he bought PricewaterhouseCoopers Consulting for $3.5 billion, in a deal that closed in October 2002.

"Palmisano's getting out of businesses that aren't growth opportunities and concentrating on what I.B.M. does best," said Mark Stahlman, an analyst at Carris & Company. "PC's are not where the growth is."

To trim costs, I.B.M. has steadily retreated from the manufacture of its PC's. In January 2002, it sold its desktop PC manufacturing operations in the United States and Europe to Sanmina-SCI, based in San Jose, Calif. I.B.M. now confines its role in PC's to design and product development out of its offices in Raleigh, N.C., with all the I.B.M.-brand desktop or notebook computers made by contract manufacturers around the world.
 
Well I was correcting nonamer. ;) The full list would be

X800XTPE > 6800Ultra > X800XT > 6800GT > X800Pro > X800 = 6800 > 6600 GT > X600 > 6200 = X300

But I do agree that if there is a part that screams bang for buck (9500P, 6800GT, 7900GT), it's a sure-shot success regardless of how the flagship performs.

Don't forget about 6800 Ultra Extreme, lol. Nice paper launch BS there, to semi-stomp X800 XT PE. You can still find X800 XTPE around. Good luck finding 6800UE.

IMO the X800s were more than a match for 6800. SM3 was useless at the time, and SM2B is still good enough 90% of the time today, with rare exceptions like RS Vegas (only one OTOH). 6800 was more efficient per-clock with texturing/shading, but X800 could run its clock a lot higher. And it was a somewhat smaller chip with less cooling needs. X800XTPE had a single slot cooler.

ATI had better filtering and AA too. Significantly better, IMO. I've run a 6800 Go in my notebook and a X850 in my desktop for over a year now and have compared on many different games.

One major oddity was Oblivion. 6800 was murdered by that game. X850 XT can stand with the X1800s and alongside a few SLI setups. 6800 Ultra isn't even close. The 7800/7900s fixed that up a bit. 6800's performance was quite odd. I've been of the opinion that it came down to the low clock speed and resulting lesser geometry performance. But that doesn't explain the gap on that page.
 
Don't forget about 6800 Ultra Extreme, lol. Nice paper launch BS there, to semi-stomp X800 XT PE. You can still find X800 XTPE around. Good luck finding 6800UE.

IMO the X800s were more than a match for 6800. SM3 was useless at the time, and SM2B is still good enough 90% of the time today, with rare exceptions like RS Vegas (only one OTOH). 6800 was more efficient per-clock with texturing/shading, but X800 could run its clock a lot higher. And it was a somewhat smaller chip with less cooling needs. X800XTPE had a single slot cooler.

ATI had better filtering and AA too. Significantly better, IMO. I've run a 6800 Go in my notebook and a X850 in my desktop for over a year now and have compared on many different games.

One major oddity was Oblivion. 6800 was murdered by that game. X850 XT can stand with the X1800s and alongside a few SLI setups. 6800 Ultra isn't even close. The 7800/7900s fixed that up a bit. 6800's performance was quite odd. I've been of the opinion that it came down to the low clock speed and resulting lesser geometry performance. But that doesn't explain the gap on that page.

You're forgetting one element though: OpenGL was still rather popular in 2004. Doom 3, Quake 4, Call of Duty, etc. were all OpenGL and all faster on the 6800. The slowdown problem in Oblivion with the NV4x is that the pixel shaders are clocked lower than ATI's. The cards had less shader power than their ATI counterparts, but where shaders didn't matter too much, or where the code was optimized for SM3, the NV4x cleaned up. The 6800 Ultra and the X800 XT were equals. As for finding a 6800 Ultra Extreme, that's not too hard; I found one a few months ago that was missing a cooler and paid only 50 bucks for it, threw a Zalman on it, and resold it for 300.
 
Hmm, no, it probably has more to do with the type of HDR used. If I remember correctly, that benchmark had HDR activated.
 
The 6800 Ultra never "cleaned up" the X800 XT PE. They were basically on par for almost everything, with NV falling far behind in a few titles. The fact that NV went out of their way to make an Ultra Extreme says it all. ATI only launched with the XT PE and PRO. Just look through the VGA Charts link.

Doom3 had issues with speed until Catalyst 4.9, thanks to a texture lookup performance issue. I had a 9700 way back then though. NV was very competitive with OpenGL though, for sure.
http://www.techreport.com/reviews/2004q4/radeon-x850xt/index.x?pg=4
http://www23.tomshardware.com/graphics.html?modelx=33&model1=536&model2=574&chart=214

There's definitely something wrong with the 6800 in OB. I spent a lot of time with OB in 2006 and can tell you with total certainty that the X850 XT is in a separate class from the 6800 series in that game. I played the same saved game on both a 6800 Go @ 385/770 at 1440x900 and an X850 XT @ 1920x1200. Even with the res difference, the X850 XT was a lot faster. And this is without HDR; I only used bloom. I was doing crazy stuff like forcing High Performance mode in the NV CP and turning off grass completely. It just ran terribly. My friend has a 6800 GT and it was also obviously not quite right for performance on that board.

Far Cry was ATI's purview as well, even with the beta SM3 additions, which didn't do much for speed (and some of which ATI supported too). The X8x0 was the card to have for some games, for sure.
 
The 6800 Ultra never "cleaned up" the X800 XT PE. They were basically on par for almost everything, with NV falling far behind in a few titles. The fact that NV went out of their way to make an Ultra Extreme says it all. ATI only launched with the XT PE and PRO. Just look through the VGA Charts link.

Doom3 had issues with speed until Catalyst 4.9, thanks to a texture lookup performance issue. I had a 9700 way back then though. NV was very competitive with OpenGL though, for sure.
http://www.techreport.com/reviews/2004q4/radeon-x850xt/index.x?pg=4
http://www23.tomshardware.com/graphics.html?modelx=33&model1=536&model2=574&chart=214

There's definitely something wrong with the 6800 in OB. I spent a lot of time with OB in 2006 and can tell you with total certainty that the X850 XT is in a separate class from the 6800 series in that game. I played the same saved game on both a 6800 Go @ 385/770 at 1440x900 and an X850 XT @ 1920x1200. Even with the res difference, the X850 XT was a lot faster. And this is without HDR; I only used bloom. I was doing crazy stuff like forcing High Performance mode in the NV CP and turning off grass completely. It just ran terribly. My friend has a 6800 GT and it was also obviously not quite right for performance on that board.

Far Cry was ATI's purview as well, even with the beta SM3 additions, which didn't do much for speed (and some of which ATI supported too). The X8x0 was the card to have for some games, for sure.

Well, for the games I enjoyed back then, my 6800 GT and later Ultra Extreme ran better than the X800. Also, the X800 XT PE was not first; the X800 XT came along with the Pro, and the XT PE followed later, same as the 6800 Ultra Extreme. As for Oblivion, I consider the game to have an ATI bias, much the same way people say Doom has an NVIDIA bias. As for X800 vs. 6800 in the Doom 3 engine, ATI caught up, but they never actually ran any faster. I remember testing a 5600 XT with the 78.01 drivers vs. a 9600 Pro with the 6.2 drivers, and the 5600 XT was a lot faster, and the 9600 Pro is just a fourth of the X800 XT, seeing how very little actually changed from the 9700 to the X800 XT. Also, as for Far Cry, sure, the X800 might bench higher, but I prefer it looking better with HDR on the 6800 to bland and old-looking on the R420 cores.
 
Well, for the games I enjoyed back then, my 6800 GT and later Ultra Extreme ran better than the X800. Also, the X800 XT PE was not first; the X800 XT came along with the Pro, and the XT PE followed later, same as the 6800 Ultra Extreme. As for Oblivion, I consider the game to have an ATI bias, much the same way people say Doom has an NVIDIA bias. As for X800 vs. 6800 in the Doom 3 engine, ATI caught up, but they never actually ran any faster. I remember testing a 5600 XT with the 78.01 drivers vs. a 9600 Pro with the 6.2 drivers, and the 5600 XT was a lot faster, and the 9600 Pro is just a fourth of the X800 XT, seeing how very little actually changed from the 9700 to the X800 XT. Also, as for Far Cry, sure, the X800 might bench higher, but I prefer it looking better with HDR on the 6800 to bland and old-looking on the R420 cores.

No, the X800 XT PE was launched alongside PRO. There was no plain XT. At least not initially. Ultra Extreme was NV's "emergency" edition, kinda like P4 EE vs. A64.
http://www.techreport.com/reviews/2004q2/radeon-x800/index.x?pg=1
http://www.beyond3d.com/reviews/ati/r420_x800/

I really rather doubt that the 9600 Pro would run Doom 3 significantly slower than the FX 5600. They weren't far apart when Doom 3 arrived, let alone after ATI got the drivers tweaked post-4.9. I actually played Doom 3 through on a laptop with an overclocked 64MB 9600NP.
http://www.tomshardware.com/2005/07/05/vga_charts_vii/page9.html
(It's actually beating the 5700 here, and the 5700 has numerous advantages over the 5600.)

The 5700 series and higher NV cards have that wicked-fast stencil shadowing due to the double-Z mode. ATI could never quite match that advantage. But it usually only shows up in Doom 3 engine games. Oh, and FEAR.

True that the HDR in Far Cry can be seen as an advantage. But my impression was that it was a rather ugly implementation. And that it cost a ton of speed.
 
No, the X800 XT PE was launched alongside PRO. There was no plain XT. At least not initially.

There was an XT, I bought one....

Was early on too, I bought the XT because Novatech didn't have any XT PE's in stock with PCI-E.....
 
True that the HDR in Far Cry can be seen as an advantage. But my impression was that it was a rather ugly implementation. And that it cost a ton of speed.

Actually, Far Cry's HDR on the number 7 setting was the best HDR I've seen yet. Source's HDR sucks, and so does FEAR's, and I haven't played Oblivion. My other two games, Most Wanted and BiA:EiB, use bloom, not HDR.
 
The buyout rumors were unsubstantiated.

FTN Midwest analyst Joanne Feeney said it is likely people will think private equity firms would be interested in Advanced Micro Devices Inc. (AMD) because its balance sheet is under pressure.
(An item at 12:12 p.m. EST Monday incorrectly said Feeney believes it is likely AMD will see interest from private equity firms.) > Dow Jones Newswires
02-26-07 1617ET
Copyright (c) 2007 Dow Jones & Company, Inc
 