AMD: Sea Islands R1100 (8*** series) Speculation/Rumour Thread

What did you expect? 20nm will be viable for larger dies in mid-2014 at the earliest. 28nm was/is expensive, 20nm will be even more so. This means new generations will have to hold out longer and longer.
 
I wasn't talking APUs. Neither was caveman-jim, unless there exist such things as APU cards. Let's face it: nobody really cares passionately about APUs. ;)

Well Jim mentioned the question: 'what happens if I buy a new card now and in 6-9 [months] there's no AMD driver team?'

Which can only happen if AMD goes under, since most of their business is APUs + discrete graphics. And I think people worry about AMD dying more than usual with the rumors about Kaveri being delayed/canceled.
 
Hmm. :???: Worse than one could have even thought. Does it mean HD9000 in very late 2014? :oops:



:oops: Where are they sending the AMD driver team? I honestly don't understand what's going on here. Are they close to bankruptcy or...?

Because you really believe what Fudzilla says (well, I should say Fuad; when it's not written by him it's OK)? When I read it, I thought it was a copy-paste of the end-of-November article they've run for each of the last four years.

Maybe they should tell us what happened with the GeForce 700; what worries me more is that he doesn't point to anything about Nvidia in this article.

We all know we can't expect HD 8000 in December; we're already in December, and launching a product in two weeks would be a paper launch anyway, since you can't get products into shops after that. Whether the cards appear in January (or February, or March) is another story. Like every year, I expect Fudzilla to bring us an article each week about delays or something else.
 
Well Jim mentioned the question: 'what happens if I buy a new card now and in 6-9 [months] there's no AMD driver team?'

Which can only happen if AMD goes under, since most of their business is APUs + discrete graphics. And I think people worry about AMD dying more than usual with the rumors about Kaveri being delayed/canceled.
I looked at it in the context of AMD earlier laying off lots of people, most of them supposedly in the GPU group. They could decide to focus on one product, APUs, and drop discrete GPUs and stop supporting the discrete GPU drivers. Not that this would make sense, mind you. But it's more likely than them going out of business completely.
 
I looked at it in the context of AMD earlier laying off lots of people, most of them supposedly in the GPU group. They could decide to focus on one product, APUs, and drop discrete GPUs and stop supporting the discrete GPU drivers. Not that this would make sense, mind you. But it's more likely than them going out of business completely.

I doubt it. They've cut executive costs across all sectors, that's right, but why would you keep two offices in Dresden, Germany (one of them the Linux department), one in Canada (the old ATI offices), etc., when you can pull all of that into one office somewhere in the USA? Reduce cost and improve efficiency.

One of the main problems with AMD is that they had so many offices, labs and departments scattered around the world; it was really throwing money out the window. They had sold the Dresden fabs, and they still kept two offices/labs there? For what? To do Linux drivers, 600 km from the main driver department? I don't even understand why they hadn't looked at all this nonsense before.
 
I looked at it in the context of AMD earlier laying off lots of people, most of them supposedly in the GPU group. They could decide to focus on one product, APUs, and drop discrete GPUs and stop supporting the discrete GPU drivers. Not that this would make sense, mind you. But it's more likely than them going out of business completely.

Would the drivers for APUs and discrete GPUs really be all that different when they share the same architecture? Albeit with a ~1-year delay on APUs.
 
Would the drivers for APUs and discrete GPUs really be all that different when they share the same architecture? Albeit with a ~1-year delay on APUs.
When you sell an APU, you sell a CPU that also happens to be somewhat decent at graphics (as opposed to Intel's horrible ones). You enter the realm of good enough pretty quickly. With a discrete GPU, you don't have this luxury.

Whether or not this makes a big difference in terms of staffing and resources, I don't know. But you don't need engineers looking for the last 10% of performance, you don't need an AMD Get In The Game program (or whatever it's called these days), you don't need labs with regression setups, you need much less marketing chasing the hot tech blog editors of the moment, etc.

And then you need to count this double, because the money spent on this could maybe be better spent somewhere else. AMD is in such a situation that they have to make very specific choices about where they see themselves in the future. They can't just spend freely and hope something sticks, the way profitable companies can. If they conclude that the discrete GPU market is shrinking, then what's the point of sustaining a product that's even now barely profitable?
 
You enter the realm of good enough pretty quickly. With a discrete GPU, you don't have this luxury.

This is terrible. I can't believe an enthusiast would write that. It is never enough; that's what actually drives progress.
I would partially agree if they began to manufacture the awful APUs at TSMC, or Intel, or wherever the best processes are available. But even then, that would be a huge loss of precious performance.

And then you need to count this double, because the money spent on this could maybe be better spent somewhere else.

Like where? You know that AMD's graphics department is still the profitable division, or at least the better one in this regard.

AMD is in such a situation that they have to make very specific choices about where they see themselves in the future. They can't just spend freely and hope something sticks, the way profitable companies can. If they conclude that the discrete GPU market is shrinking, then what's the point of sustaining a product that's even now barely profitable?

The APU and CPU markets should be shrinking and unprofitable too.
 
I looked at it in the context of AMD earlier laying off lots of people, most of them supposedly in the GPU group.
Ummm, eh?

Whether or not this makes a big difference in terms of staffing and resources, I don't know. But you don't need engineers looking for the last 10% of performance, you don't need an AMD Get In The Game program (or whatever it's called these days), you don't need labs with regression setups, you need much less marketing chasing the hot tech blog editors of the moment, etc.
Intel has pretty deep pockets where ISV co-marketing goes as well and when graphics is the primary differentiator then you still need an ISV program to highlight them.

If they conclude that the discrete GPU market is shrinking, then what's the point of sustaining a product that's even now barely profitable?
The real question is whether the IP set is fundamental to the future of the company and how you plan to fund the development of the IP.
 
Ummm, eh?
Supposedly. Another case where the village idiot has been proven wrong? Glad to see this corrected.

Intel has pretty deep pockets where ISV co-marketing goes as well and when graphics is the primary differentiator then you still need an ISV program to highlight them.

The real question is whether the IP set is fundamental to the future of the company and how you plan to fund the development of the IP.
Sure. But I was pointing out that there are costs that are specific to discrete that don't necessarily benefit APU a whole lot.

To be perfectly clear: I don't think AMD will or should make cuts in its discrete GPU division. But that doesn't mean there can't be arguments in favor.
 
This is terrible. I can't believe an enthusiast would write that. It is never enough; that's what actually drives progress.
Not only do you get excited about APU graphics, but you think they drive graphics progress?

I see APUs as utterly boring pieces of mediocrity. Subpar CPU perf, subpar GPU perf. I like Intel's model much better: stellar CPU, forgettable GPU.
 
To be perfectly clear: I don't think AMD will or should make cuts in its discrete GPU division. But that doesn't mean there can't be arguments in favor.

I think it would be pretty difficult for anyone to argue that they should cut the GPU division that makes good products and brings in money, in favour of concentrating on the CPU division that operates well below par.
 
I think it would be pretty difficult for anyone to argue that they should cut the GPU division that makes good products and brings in money, in favour of concentrating on the CPU division that operates well below par.
Say you're a CEO who believes discrete is going away slowly in the next 5 years. And you have a competitor who somehow is able to sell its products better.

Would it be such a bad decision to stop developing new GPU architectures and simply shrink the existing one to smaller process nodes? Take one more step to make geometry processing parallel, like Nvidia (my guess: this work has been completed anyway), but keep the GCN shader cores and don't touch them anymore.

I think this would work out pretty well in the short term. Think Mark Hurd at HP: cut R&D expenses to the bone, reap the short term financial rewards. Except that you free up cash for other endeavors where you think there's a long term future.
 
I think this would work out pretty well in the short term. Think Mark Hurd at HP: cut R&D expenses to the bone, reap the short term financial rewards. Except that you free up cash for other endeavors where you think there's a long term future.

The question should be: if your competitors are making money from that market, is the best solution to stop competing and run away? Should you take the business that makes you money and trash it, or sell it off, in order to put the money into a business that doesn't work or may not even exist yet?
 
Hmm. :???: Worse than one could have even thought. Does it mean HD9000 in very late 2014? :oops:

A March/April time frame isn't a new rumor; it's kind of expected. I notice that there has been absolutely no word on GPU tape-outs for either side... did both companies get that much better at keeping that quiet? We should have heard about it before now. Maybe nobody's looking?

The next gen after SI-the-second is going to be on the new process node, and for that I keep thinking back to Rory Read's statement, "we've got to stop leading with our chin". I interpret that to mean: stop rushing to be first on a process node and then having to play catch-up on design problems, drivers, etc. Tahiti launched early to meet public statements; it wasn't ready until February but launched in December.

There are benefits to jumping on process nodes early, like tying up wafer allocation ahead of other people wanting the same capacity...

:oops: Where are they sending the AMD driver team? I honestly don't understand what's going on here. Are they close to bankruptcy or...?

After the first round of operations optimizations, which very publicly shook up the PR teams and took out a lot of big names, AMD has another three rounds of optimizations ongoing this year, announced before they announced the Austin campus sale-and-leaseback option. The second round is happening right now; the third goes into effect Jan/Feb. It's targeted at consolidating sites and reducing operating costs. Look at the cash they're trying to get for Austin: ~$300M (max), which is what, half of the R&D budget these days? Under $2Bn in the bank, operating costs are ~$1.3Bn yearly, and revenue is under that.

We've heard how the Dresden Linux team is done and Markham's got a target on it; what about Boston, Orlando? AMD recruitment is bringing in grads as much as possible while losing senior fellows, engineers, directors... lots of change going on.

The S|A article says AMD confirms Kaveri, and the big cores are on track. On the face of it that refutes the article entirely, Charlie is 100% wrong...

But it doesn't spell out that the published roadmaps are still accurate, or that the design is unchanged from what's been shown so far. The Bulldozer-codenamed project died and came back again before we saw a product. On track doesn't mean they didn't change the track, the scope and the implementation. Maybe I've been listening to Alex too much and got overly paranoid.




In response to somebody in this thread: yes, I'm excited about APUs. They're awesome concepts and I can't wait for HSA to really take off. And if you really look at it, HSA/APUs have been driving the design of GPUs for about three years now, since Evergreen.


- Carsten: no :p, no more paper or limited or preview launches, please :D
 
Tahiti launched early to meet public statements; it wasn't ready until February but launched in December.
Actually HD 7970 was available pretty much right at the start of Jan (hence the early launch). 7950 launched in Feb.

We've heard how the Dresden Linux team is done
You understand that this has zero to do with graphics in the first place? Nothing.

But it doesn't spell out that the published roadmaps are still accurate, or that the design is unchanged from what's been shown so far.
The business units are in control of their roadmaps and roadmaps will shift for any number of factors.
 
The part that makes people think Kaveri is still going to launch?

If you look at what the x264 guy got, it's quite clear that "Kaveri" is going to launch. Just that it's actually Richland, which is actually PD, which isn't PD but might be PD. Having spent the past few months working with a Trinity-powered laptop (which thankfully also has some GCN discrete in it), I can't say it's an update I'd mind, if it includes GCN cores (if it's just a tuned Trinity, meh).
 
Not only do you get excited about APU graphics, but you think they drive graphics progress?

I see APUs as utterly boring pieces of mediocrity. Subpar CPU perf, subpar GPU perf. I like Intel's model much better: stellar CPU, forgettable GPU.

Compared to the IGPs of old, they've considerably improved the lowest common denominator. So yes, in a sense, they do drive graphics progress.
Besides, the PC market is something like 60% laptops now, and ever smaller ones at that, where discrete graphics often just isn't an option.

As for the subpar CPU performance (for AMD) that's because Bulldozer sucks, but it doesn't really have anything to do with APUs as a concept.
 
Actually HD 7970 was available pretty much right at the start of Jan (hence the early launch). 7950 launched in Feb.

Availability such that anyone who wanted one could get one didn't happen until after the 7950 launched, from what I could tell at the time; obviously you've got better info on the numbers, etc.

You understand that this has zero to do with graphics in the first place? Nothing.

Yes, and I don't think I was trying to conflate the two, but apologies if it appeared that way. It was more a point about the whole picture I'm thinking of than a causal 'a therefore b'.

The business units are in control of their roadmaps and roadmaps will shift for any number of factors.

Sure, like the people in charge of the global business units changing resource allocations and priorities and mandating cutbacks. Those changes might be driven by internal forces, like resource optimization strategies - layoffs and site consolidation - but also in light of new guidance on target market size, sales and growth rates.

Plus key staff walking out the door and saying "hello Samsung/Qualcomm/Intel/NVIDIA" - that's going to have an effect as well.
 