The AMD Execution Thread [2007 - 2017]

Status
Not open for further replies.
AMD should really be looking at tablets. The current Bobcat parts would be amazing in a 7- or 10-inch tablet. Right now the dual-core version is at 18 W, but dropping it to 32nm should allow them to get down under 10 W.

It would surely blow away anything else on the market, and while battery life wouldn't be the greatest, you only need 4 hours or so on a tablet, especially if you include a removable battery.
 
They supposedly have a tablet-optimized C-50 derivative shaving an additional couple of watts off the original 9W spec by limiting memory flexibility and disabling I/O deemed unnecessary for a tablet. That might be in the realm of usability without requiring too much battery bulk.

There's supposed to be an upcoming Win7-based Acer model utilizing it at least.
 

Yep, it's a 5W version, discussed here: http://forum.beyond3d.com/showthread.php?p=1520288
 
Ain't 10 W way too much for a tablet? Battery life would suck pretty bad, no?

Considering I'm currently awaiting a slate with a Core i5, battery life is only one consideration, depending on the user's requirements for a slate. Important, certainly, but not always the number-one issue.

Similar to how some people can get by with a netbook, while for others netbooks are absolutely useless even if they get twice the battery life of a more powerful notebook.

For a media-consumption-only device like an iPad, yes, battery life will be key. There are plenty of other use cases, however, where 4-6 hours of battery life will be perfectly fine if the device can do what the user requires. And in some cases (graphic artists, for example), battery life is entirely irrelevant.

I really wish people would stop thinking the iPad is the be-all, end-all shining example of a slate. It is an extraordinarily well executed example of one type of slate addressing one segment of potential slate users, but doesn't represent a good product for all potential slate users.

Regards,
SB
 
Yup, Silent. I really don't know in which situation I'd find myself where I'd need more than 6 hours with a tablet and no place to charge. Maybe on a really, really long flight, but then again, if a flight goes over 6 hours I will most likely be sleeping for some of it, and planes have outlets now.
 
On the second point: the only really bad execution AMD has had is Agena (what a fuck up :oops:).
LOL

What about paying 6mlrd for R600? The inability to penetrate the market with a better product (K8) for years? K9? Aiming too low with K10?
The total brand mess: we keep ATI, we bury ATI.
Enter FX chips; no more FX; FX comes again.
PR for the 4x4x4 platform, which, guess what, was never released (or no one bothers to remember).
Or selling their mobile chipsets, so now Intel ships SoCs with AMD-designed chips inside, only they're no longer AMD.
Or Fusion: they managed to make it after Intel, and on top of that, NVidia is far ahead in GPGPU.
Both Ruiz and Meyer were unable to see the future trends.

All these things happened after JSIII left.

Bah. For the first time in my life I may buy an Intel CPU; just tired of waiting and hoping.
 
I don't know what 6mird means, but AMD didn't buy R600. They bought the ATI team to develop Fusion.

Intel doesn't ship SoCs with AMD/ATI (Qualcomm graphics) designs inside.

Some obviously disagree with the decision to kill the ATI brand, but it's hardly a huge failure and it's common that old company names eventually die off after some time so it shouldn't have been a huge surprise. The average consumer never even recognized the ATI brand. Though the same can be said about the AMD brand.
 
Weeks ago, somewhere on the net, I saw a picture of a small board with an Intel chip and a claim that the graphics chip on board was developed by AMD yet now sells as Qualcomm. Mind you, I never checked nor saved the link.

My point is that paying 6mlrd dollars was way too much.
That money f***ed up the balance sheet for years and still does. After R600 came out, ATI's stock fell 2-3x. That was the "fair" price. I even wonder if someone from AMD put money in his own pocket from this deal.

PS: Notice, milliard is written with an l as in light, not an i as in irony.
 
Though didn't AMD receive a metric ton of tax breaks from the continuous quarters of write-offs from the ATI purchase, as well as increased revenue to offset the purchase, by now?
 
Yeah, that isn't used much if at all anymore in English. People are more likely to understand if you use the word billion.
I'm a native English speaker from the US and I've never heard the word milliard before. This is a nitpick, but if my memory is correct, the real purchase price for ATI was < $5 billion, since ATI had cash on hand. Still a significant amount of money.

People complain about how much AMD paid for ATI, but at the time no one knew R600 would flop and ATI's value would drop so much. Plus, I'm of the opinion that if AMD hadn't acted when they did, they wouldn't have been able to garner the support for such a large purchase post-Barcelona. No one really knows, though, which is the beauty of speculation. :smile:
 
LOL

What about paying 6mlrd for R600? The inability to penetrate the market with a better product (K8) for years? K9? Aiming too low with K10?
Sigh. Two things you need to do:
1. Read all of a post before replying.
2. Comprehend all of a post before replying.

The total brand mess: we keep ATI, we bury ATI.
Enter FX chips; no more FX; FX comes again.
There are about three people in the entire world who see the removal of the ATI name as an issue; Radeon is just as powerful a brand name. It also has nothing to do with execution. And if FX is your extreme high-performance line, it's kind of a joke to sell it when you can't compete in raw performance.

PR for the 4x4x4 platform, which, guess what, was never released (or no one bothers to remember).
Again, this has nothing to do with execution.

Or selling their mobile chipsets, so now Intel ships SoCs with AMD-designed chips inside, only they're no longer AMD.
And? When you need to raise cash you have to make choices. What would you have done instead?

Or Fusion: they managed to make it after Intel, and on top of that, NVidia is far ahead in GPGPU.
Both Ruiz and Meyer were unable to see the future trends.
GPGPU is still far more bark than bite. I would agree Fusion has been slow, but that said, it's going to slap Intel's solution sideways.

All these things happened after JSIII left.

Bah. For the first time in my life I may buy an Intel CPU; just tired of waiting and hoping.

You need to learn how to separate different issues and analyze them separately. Execution has been good. Design and direction failed during the K8 years, and Bulldozer V1 failed, but that is separate from execution.

The Bobcat-based APUs look good; if you listen to the Hot Chips presentations, they talk about planning for core improvements, and at 28nm we are seeing quad cores.
To me, AMD has learned the painful lesson it should have learned from Intel: you can't focus on just one metric (Intel: clocks, AMD: cores); you need to make improvements in many directions with each iteration.

Llano is a rather known quantity: you're going to be trading CPU perf for GPU perf vs. SB, but to me Llano is a far more balanced APU.

The only unknown is Bulldozer, but we will know about that one soon enough. Some people here need to take off their princess shoes.
 
I'm a native English speaker from the US and I've never heard the word milliard before. This is a nitpick, but if my memory is correct, the real purchase price for ATI was < $5 billion, since ATI had cash on hand. Still a significant amount of money.

Being a 'native' English speaker from the USA, you wouldn't use milliard, as the American word for milliard is billion. The English word for it is milliard. And I might add, the English version is more in line with international standards (who would have thought!).

This I did to nitpick. ;)
 
kyniskos said:
The English version is more in line with international standards.
Actually, the UK switched away from milliard to billion in 1974. So it's now incorrect to call it the British system, since Britain no longer uses it. And there's no international standard about what word to use for 10^9. Here's some history and background:
http://en.wikipedia.org/wiki/Long_and_short_scales
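To make the difference concrete, here's a minimal sketch of the two naming systems described on that page. The dictionaries and the `name_of` helper are purely illustrative (not any standard library), and only cover the names relevant to this thread:

```python
# Names for a few powers of ten under the two scales.
# Short scale (modern US/UK usage): a new name every factor of 1,000.
# Long scale (older British usage and much of continental Europe):
# a new "-llion" every factor of 1,000,000, with "-iard" forms between.
SHORT_SCALE = {10**6: "million", 10**9: "billion", 10**12: "trillion"}
LONG_SCALE = {10**6: "million", 10**9: "milliard", 10**12: "billion"}

def name_of(value, scale):
    """Return the scale's name for an exact power of ten, if known."""
    return scale.get(value, "(no single-word name)")

print(name_of(10**9, SHORT_SCALE))   # billion
print(name_of(10**9, LONG_SCALE))    # milliard
# The same word "billion" means 10^12 in the long scale:
print(name_of(10**12, LONG_SCALE))   # billion
```

So "6 milliard" and "6 billion (short scale)" name the same number, which is exactly the confusion in the posts above.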
 
1 milliard dollars!

/exit stage left

Milliard is Euro-speak for billion.

Get over it; no English speaker knows anything about the metric system, because 1 milliard equates to 6 pounds, 5 pence and 7 stones to you imperial lot.

j/k
 