Dave Orton to leave AMD at the end of July

What kind of savings would AMD see with Orton leaving (after the severance package)?
Drop in the bucket sounds like an apt phrase when looking at AMD's expenses.

I think it's more important that Orton was the head of an independent ATI.
AMD has certain ends, to which ATI is a means.

What Orton would have liked in his quest to make ATI a better graphics company would not likely match AMD's desire for a well-utilized and subordinate adjunct to its core business.
 
Or an exec who would resist further butchering of ATI needed to be removed...
I think it's on record that part of the reason the deal was sold to ATI staff is that said butchering wouldn't happen anywhere other than the outlying cross-company cruft. If that's the case, I can see Dave first objecting vehemently and then deciding that he couldn't stand to let the troops down, so he'd be the first out of the door if that had to happen.
 
Perhaps new direction is necessary to get ATI executing on schedule again.
I think this is it exactly. Orton was responsible for R300 and R350, and since then it's been a string of delays and the occasional complete misfire. I'd go so far as to say that he would have been gone sooner if ATI hadn't been bought by AMD.
 
I agree, and said last summer when AMD signed on the dotted line that they should have handed walking papers to the executive management at ATI. ATI has been plagued with delays since their lone miracle, the R3xx. This is a sign of poor management.
 
They aren't exactly replacing him with someone with a history of good execution.
They're getting Hector Ruiz. Again.

If it weren't for Orton, we might have seen Ruiz getting sacked by now.
 
Another way to look at it is that Orton was just ready to hit the golf links. He got ATI sold at a good price (no doubt he has a pretty penny in bonus shares).

Many an executive does not wish to keep slogging away forever.

That said, how integral he was to the fleeting financial success that ATI enjoyed during the R300/NV30 era is tough to say. It's pretty clear that operationally (not even considering product performance) ATI was never even close to NVIDIA. And as far as strategies and resource allocation go, I'm not sure those were so great either. NVIDIA's focus on the GPU (rather than things like Xilleon) has enabled them to invest more in areas like Quadro, which were previously considered niche markets. Just look at the annual revenues of 3D Labs when they were in their prime.

Or perhaps as the market leader (despite the NV30 debacle), NVIDIA is always one step ahead and Orton did as well as anyone under the circumstances. He certainly delivered an impressive return for shareholders at the end of the day.
 
It should also be pointed out that for a good while (years now) Orton has worked away from his missus and kids, and maybe he's just sick and tired of it more than anything else.
 
I think it is Rys 1 Tim Murray 0 in reading those thoughts, or maybe it should be 40-love considering Tim's name combines two "great" British tennis players.

As a lifelong nvidia fan I have a lot to be thankful to Dave for, with the 9700, 9800 and the super duper 9500 Pro of course (probably the best mid-range card of all time). It made nvidia change course dramatically, and they owe him a lot even if they would not care to admit it :D.

I still think Ati made mistakes with the 3:1 ratio and other things, but that is only because it was ahead of its time, and nvidia were also on the bounce and bringing timescales down again.

However, I do not think manufacturing snafus should be put on his shoulders; that is grossly unfair.

I'd like him to go to nvidia to combat Intel, along with the rest of the best Ati lot at AMD. Whether two Davids is too much is another matter; however, I think if it could be made to work, nvidia would feel the benefit.


Two Davids v Goliath ?

You know how the prequel ended .....
 
I wouldn't exactly say Rys 1 Tim 0.

One can imagine what Ati's results would be looking like right now as an independent company, and it would not be pretty. And with the widening margin disparity between NVIDIA and ATI, ATI shareholders and presumably the board would definitely be questioning things.

Moreover, NVIDIA stumbled with NV30, but has gradually been gaining momentum since NV40. Look at notebook market share. The proof is in the pudding. The G80 generation is outperforming R6xx on larger process nodes. The strong odds are that the performance per watt advantage is really going to increase when NVIDIA shrinks down. So the question is how long ATI will be at this disadvantage. It's perhaps a separate thread, but an interesting topic. NVIDIA rolled out the NV40 in short order, but I would be surprised if ATI had an answer for G80 and its successors anytime soon.
 
But the question arises whether, without David Orton, they would even have reached the zenith of the 9700 from which to have that gradual decline.

And the gradual decline was probably better with Dave at the helm than it would have been with AMD at the helm.

How are AMD at refreshing their stuff? They have hardly got an nvidia-style track record of pulling themselves up by their bootlaces. How fast did nvidia get from the massive flop of nv30 to the competitive nv40, and how are AMD doing with K8 to K10?

Actually K8 when it came out was not very good.

And K7 derivatives were not very good either at the first attempt.

So saying "well, AMD will steady the reins" is wishful thinking if you base it on their CPU releases. If it had been AMD all those years ago, you'd only now be looking at the X1900 coming out the doors.

At 200MHz.

They are PANTS when it comes to new releases and then have to brush it up massively afterwards.
 
Heh, what exactly are you smoking Dizi?

I would say that AMD has shown the same amount of success that Intel has when it comes to refreshes... in fact maybe a bit better? How long was the P4 out before it was replaced by the C2D? 180 nm Willamette was "ok" but certainly didn't outpace the original Athlon which was at 1 GHz and the P4 was at 1.5 GHz... and the Athlon won most of those benchmarks. Athlon XP was able to keep up with the early stages of Northwood. Then we had several years of Prescott to compete against the Athlon 64, which was undoubtedly a faster and more efficient processor and architecture.

So while the original K7 had the weakness of poor chipset and motherboard support, I would say that it and its derivatives were VERY successful. Athlon->integrated L2 Athlon->Athlon XP all did very well. Same could be said about the Athlon 64, though the jump to DDR2 was more about memory changes vs. any kind of radical performance increase (not to mention the addition of virtualization hardware).

Don't get me wrong, I am not a huge fan of Hector, but I think that AMD has performed very well through the years given their competition. Consider that NVIDIA and ATI have had revenues that were pretty comparable to each other. Now consider that AMD typically had 1/10 to 1/8 the revenue that Intel had to work with.

I do admit though, I am very curious why Orton is bailing. Then again, he was struggling with running ArtX in a competitive environment, and then doing the same again when running ATI. That takes a toll on anyone, especially if you are very hands-on and very cognizant of the operation. He's probably going to take his money and take a long breather... until the urge to get back into the industry surfaces again.

Edit: heh, after reading your other post about buying things when drunk, I guess that you were not smoking anything during the previous post. ;P
 
When I look at ATI, I'm concerned about two things. One, their notebook market share, which has traditionally been their stronghold, is rapidly shrinking. Two, they haven't done anything in the workstation or GPGPU markets.

I mean, come on--NVIDIA is selling external visualization boxes for $17,000 and a rackmount setup that is basically 4 G80s, a power supply, and a controller card for $12,000. Where do you think their huge margins come from?

ATI had a huge lead in the GPGPU market with CTM. It should have been almost insurmountable. Instead, what do they do? Give people an assembly-level interface to the card and expect others to write their own high-level languages. They could have hired a compiler team, built something similar to Brook, and dominated--but instead that's exactly what NVIDIA did.

Maybe I'm harping too much on GPGPU--that's certainly possible. It's early, and it might never really take off. But if it does, ATI/AMD could have owned that market completely. By all reports, R580 performs as well as or better than G80 in GPGPU applications (and hell, Mike Houston has commented that you can achieve ~R580 performance with an RV630). The problem is that R580 is a terrible development platform, and the difference between R580 and G80 is negligible compared to the difference between running your code on a CPU and running it on a GPU. They just missed the whole point. Considering how much it could be worth in two or three years, I really think that they screwed up immensely.
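Just to make the assembly-level vs. high-level distinction concrete, here's a toy SAXPY kernel in CUDA-style C (purely my own sketch for illustration, not CTM code or anything ATI shipped). The whole pitch of a compiled, C-level interface is that this is more or less everything a developer has to write, instead of hand-writing and scheduling shader ISA themselves:

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// y[i] = a * x[i] + y[i], the canonical "hello world" of GPGPU.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side buffers.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // One thread per element; the compiler and runtime deal with the hardware.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f (expect 4.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

That level of abstraction is what a compiler team buys you, and it's exactly what CTM never offered out of the box.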

That FireGL has done nothing is also disheartening. NVIDIA now has a stranglehold on that market, but that wasn't true when Orton took over, and ATI still didn't get anywhere. I don't know why that is, whether it's ATI's fault or NVIDIA simply outmaneuvered them. Still, it's troublesome.

I think that ATI's nonexistence in the professional market is Orton's big failing. I also think that manufacturing problems have to be blamed on someone: the policy-maker who repeatedly said, "Okay, we can be aggressive with our fabrication and trust TSMC to deliver this time even though they weren't able to last time." I can't imagine that it would be anyone other than Orton there as well. Maybe he just wants to get out--that's fine, the guy certainly deserves it. And sure, a lot of the problems weren't his fault. At the end of the day, though, he's the guy who's overseeing the whole operation, and he's the one that has to take responsibility for it.
 
Heh, what exactly are you smoking Dizi?

I would say that AMD has shown the same amount of success that Intel has when it comes to refreshes... in fact maybe a bit better? How long was the P4 out before it was replaced by the C2D? 180 nm Willamette was "ok" but certainly didn't outpace the original Athlon which was at 1 GHz and the P4 was at 1.5 GHz... and the Athlon won most of those benchmarks. Athlon XP was able to keep up with the early stages of Northwood. Then we had several years of Prescott to compete against the Athlon 64, which was undoubtedly a faster and more efficient processor and architecture.

So while the original K7 had the weakness of poor chipset and motherboard support, I would say that it and its derivatives were VERY successful. Athlon->integrated L2 Athlon->Athlon XP all did very well. Same could be said about the Athlon 64, though the jump to DDR2 was more about memory changes vs. any kind of radical performance increase (not to mention the addition of virtualization hardware).


Edit: heh, after reading your other post about buying things when drunk, I guess that you were not smoking anything during the previous post. ;P

Cheeky monkey :D

As a chronic overclocker, the only thing I smoke is CPUs. And I have smoked a lot of the above, so your comment

"Athlon XP was able to keep up with the early stages of Northwood"

made me fall over laughing.

No it didn't. Within a few weeks of it being available, Northwood had completely wiped the Athlon XP off the face of 3DMark2001. Mine overclocked from 3GHz to 3.8GHz, compared to my first Athlon Thoroughbred XP 1700+ at 1.47GHz, which overclocked to a magnificent 1.45GHz, or minus 20MHz, and which still rates as my worst ever overclock. I sent it back and got a replacement which overclocked to a mind-boggling 1.485GHz, or 15MHz over stock. Woopeyfookingdoo.

Then they brought out the Thoroughbred B with the extra metal layer and things were OK. You are mixing up how they finally ended up with what was initially offered, probably because you did not buy the first attempt and try to overclock it.

Initial K8 Opteron... not speedy.

I bought an FX-55 when the process really got going and it did 3GHz OK; not bad at all, that chip is still going. But I have bought a lot of chips, and AMD never go that well straight off the bat.

For Intel:

300A Celeron at 450MHz
650MHz P3 at 966MHz
3GHz Northwood at 3.8GHz
2.93GHz X6800 at 3666MHz (4200+ when benching) on air

It takes AMD years to get up to those levels of overclock, while Intel can get them fairly soon out of the blocks, because AMD always takes time to fine-tune.

I've bought lots of things while drunk but never an AMD, I think that tells you something ;)
 
Well, ignoring subjective, random overclocking results, I don't think that AMD's successes post-K6 can be just laughed at. A couple of years ago no one would have ever tried ripping apart K8. How the wheel turns.

We will see if they can deliver with Barcelona. I do have to say that I am worried, but who really knows.

I am more astounded by ATI's total failure with their "2nd generation unified shader architecture". Jeez.

BTW, I kinda liked my Athlon XP-M 2500+ overclocked to 2.5 GHz. It was quite nice.
 
Seems like only yesterday we were saying how good AMD was and how bad Intel was with its NetBurst architecture...
 