AMD has issued a press release announcing that Dave Orton, former CEO of ATI, is leaving the company as one of its executive VPs at the end of this month.
Read the full news item
I think it's on record that part of the reason the deal was sold to ATI staff is that said butchering wouldn't happen anywhere beyond the outlying cross-company cruft. If that's the case, I can see Dave vehemently objecting first, then deciding he couldn't stand to let the troops down, so he'd be the first out of the door if it had to happen. Or an exec who would resist further butchering of ATI needed to be removed...
I think this is it exactly. Orton was responsible for R300 and R350, and since then it's been a string of delays and the occasional complete misfire. I'd go so far as to say that he would have been gone faster if ATI hadn't been bought by AMD. Perhaps a new direction is necessary to get ATI executing on schedule again.
I wouldn't exactly say Rys 1 Tim 0.
One can imagine what ATI's results would look like right now as an independent company, and it would not be pretty. And with the widening margin disparity between NVIDIA and ATI, ATI shareholders, and presumably the board, would definitely be questioning things.
Moreover, NVIDIA stumbled with NV30, but has gradually been gaining momentum since NV40. Look at notebook market share. The proof is in the pudding. The G80 generation is outperforming R6xx on larger process nodes, and the strong odds are that the performance-per-watt advantage is really going to increase when NVIDIA shrinks down. So the question is how long ATI will be at this disadvantage. It's perhaps a separate thread but an interesting topic. NVIDIA rolled out NV40 in short order, but I would be surprised if ATI had an answer for G80 and its successors anytime soon.
When I look at ATI, I'm concerned about two things. One, their notebook market share, which has traditionally been their stronghold, is rapidly shrinking. Two, they haven't done anything in the workstation or GPGPU markets.

I mean, come on--NVIDIA is selling external visualization boxes for $17,000 and a rackmount setup that is basically 4 G80s, a power supply, and a controller card for $12,000. Where do you think their huge margins come from? ATI had a huge lead in the GPGPU market with CTM. It should have been almost insurmountable. Instead, what did they do? Give people an assembly-level interface to the card and expect others to write their own high-level languages. They could have hired a compiler team, built something similar to Brook, and dominated--instead, that's exactly what NVIDIA did.

Maybe I'm harping too much on GPGPU--that's certainly possible. It's early, and it might never really take off. But if it does, ATI/AMD could have owned that market completely. By all reports, R580 performs as well as or better than G80 in GPGPU applications (and hell, Mike Houston has commented that you can achieve ~R580 performance with an RV630). The problem is, R580 is a terrible development platform, and the difference between R580 and G80 is negligible compared to the difference between running your code on a CPU and running it on a GPU. They just missed the whole point. Considering how much it could be worth in two or three years, I really think they screwed up immensely.
Heh, what exactly are you smoking Dizi?
I would say that AMD has shown the same amount of success that Intel has when it comes to refreshes... in fact, maybe a bit better? How long was the P4 out before it was replaced by the C2D? The 180 nm Willamette was "ok" but certainly didn't outpace the original Athlon: the Athlon was at 1 GHz while the P4 was at 1.5 GHz, and the Athlon still won most of those benchmarks. The Athlon XP was able to keep up with the early stages of Northwood. Then we had several years of Prescott competing against the Athlon 64, which was undoubtedly a faster and more efficient processor and architecture.
So while the original K7 had the weakness of poor chipset and motherboard support, I would say that it and its derivatives were VERY successful. Athlon->integrated-L2 Athlon->Athlon XP all did very well. The same could be said about the Athlon 64, though the jump to DDR2 was more about memory changes than any kind of radical performance increase (not to mention the addition of virtualization hardware).
Edit: heh, after reading your other post about buying things when drunk, I guess that you were not smoking anything during the previous post. ;P