ATI Moving Strictly to Fusion?

Where will AMD's focus be next year?

  • FUSION - no enthusiast-class GPUs but "some" discrete cards

    Votes: 3 7.1%
  • Same pyramid scheme we have now (enthusiast through budget), plus FUSION

    Votes: 34 81.0%
  • Only FUSION

    Votes: 2 4.8%
  • Next Year? I just want R600 already! ;)

    Votes: 6 14.3%

  • Total voters
    42

pelly

Newcomer
Ok...before anyone goes off the deep end with conspiracy theories, let me just run with the disclaimer that this thread is created purely on my own with no particular knowledge or input from AMD, ATI, or anyone else... :)

I'm just curious to see what people think about the idea of AMD pulling out of the discrete enthusiast graphics card business.

What if AMD decided to instead focus purely on developing mainstream GPU technology? This could easily be incorporated into their next-generation FUSION processors with some discrete desktop and/or mobile business possible if margins were favorable. Designs could be modified to adapt to handhelds, consoles, cell phones, etc...

Thoughts???

Note: Again, this thread is strictly created to see what you people think about the idea, nothing more...
 
I think Fusion is planned for 2009, and IMO it will be dual-use: low-end graphics and vector coprocessor.
The midrange and high-end GPUs will still be on cards; they just won't find the transistor budget, electrical power, cooling, and bandwidth on a CPU socket.
But I can see an AMD Fusion in a Wii 2.
 
IMO Fusion will range from low-end gfx to mid-range, i.e. X1650 Pro/XT-class GPUs. However, I might take my 'mid-range' claims back once I finish researching GPU TDP. Anyway, even a low-end GPU clocked 6X faster than any other GPU on the market won't be that bad.
 
Is the question that AMD pulls out of discrete solutions entirely?

I'm not sure there's enough money available in the low-end market Fusion targets to justify that, especially not with the price tag they paid for ATI.

If the low end Fusion targets were all there were, I wonder if AMD would have tried acquiring some lesser graphics brand instead. Considering the poor showing Fusion will make against any competent discrete solution, AMD must have an eye on the higher-margin parts of the graphics market.

On the other hand, I can see some reasons why it would be tempting.
Any Fusion derivative graphics core would need to be treated separately from any discrete foundry design. There would be a significant difference at the circuit level between say an RV8xx and a contemporary Fusion, even if the higher-level design were the same.

There's enough of a process difference to restrict how much work on any graphics silicon on the CPU side would be transferable to ATI's traditional products, and vice versa.

AMD would need to handle the burden of a bifurcation in its circuit engineering resources.

If AMD pulled out of competition with Nvidia, it would be able to focus on holding off Intel. Right now, it doesn't look comfortable suffering through a two-front war with two market leaders.

Either way, I do fear that with current market pressures, the ATI portion of AMD will be the first to be marginalized.
We'll have a better idea if we start seeing layoffs or transfers out of those ATI engineering groups that would have reduced utility for the Fusion and CPU departments.

If we start seeing guys being shuffled to AMD's IC group, like the design experts for ATI's memory controllers and software development, then that would indicate AMD is cannibalizing its acquisition.

If we start seeing guys who would be designing the core logic for the discrete group, as well as engineers who handled implementation of circuit design for discrete chips leaving, then that's at least a sign that AMD is backing down, if not out entirely.
 
Is the question that AMD pulls out of discrete solutions entirely?

I wouldn't think we'd see AMD pull out of discrete solutions entirely....though they'd focus strictly on mainstream and performance level products and not produce any "enthusiast" class products. Here, you'd see discrete cards offered from $199 and under...Yields are trivial here, transistor counts low, margins solid, and volume phenomenal. However, the biggest plus to this approach is the lack of risk...

If AMD focused on "performance" level (ie: ~$100-199) discrete solutions for their "high-end" and used IGP/FUSION for the lower end....they could scale solutions across a ton of different markets, have low risk, keep costs down (mfg - process, transistor count, board layers, etc), and really make a solid push against Intel...

Who knows.....should be interesting to see what happens...
 
I wouldn't think we'd see AMD pull out of discrete solutions entirely....though they'd focus strictly on mainstream and performance level products and not produce any "enthusiast" class products. Here, you'd see discrete cards offered from $199 and under...Yields are trivial here, transistor counts low, margins solid, and volume phenomenal. However, the biggest plus to this approach is the lack of risk...
Which is why Xabre, DeltaChrome, etc. have done so well, or why AMD just sat around making K6 chips.

I suppose AMD wants to give up on the high-margin workstation market, where performance can still be significant. Granted, Nvidia is doing very well there as well.

Losing the enthusiast parts alone is hardly a deal-breaker. The cost of not having a leading architecture that can trickle down is what's painful.

Unless, that is, the large overhead of maintaining a design department for discrete GPUs is scaled down significantly. Then, after a generation or two, Fusion's performance and the low-end parts will be noticeably worse than the dirt-cheap products of Nvidia and Intel, and the safe route leaves AMD with the results that short-term thinking eventually produces.

If AMD focused on "performance" level (ie: ~$100-199) discrete solutions for their "high-end" and used IGP/FUSION for the lower end....they could scale solutions across a ton of different markets, have low risk, keep costs down (mfg - process, transistor count, board layers, etc), and really make a solid push against Intel...
An equivalent amount of sharing can come down from the high-end just as easily as a jump from performance to low-end.

Very little sharing will happen at all, however, if Fusion and the discrete solutions are on a different process, and possibly not much even if they were.

There's that much of a difference between foundry processes and AMD's in-house technology, and the CPU+GPU+socket situation is very different.
 
I think they'll keep their hand in the enthusiasts market for as long as they provide viddy solutions, there's just too much trickle-down business from having the bragging rights to the fastest card.

(At least IMHO, else why else would they be competing for it so hard? :p )
 
I'm not sure I see how a Fusion aimed solely at low-end graphics can work for AMD in the GPGPU market.

What's it going to be competing with? What would be its competitive advantages? We're already aware of two Intel research projects which smack of being potential monsters in that market. NVIDIA look like they're taking this market very seriously, and pushing their high-end GPUs in that direction.

To be competitive in that market surely Fusion is going to have to have the raw floppage, massive bandwidth and immense latency-hiding capabilities of a high-end GPU? So why only use it for low-end graphics? My understanding is that Fusion is supposed to be a scalable architecture, and to exploit the synergies (I hate that word!) between graphics and high-end compute. So I can see why it would work for low-end graphics, but if it can't scale to high-end graphics I'm not sure I see why it can scale to high-end compute (cf. what their competitors will have in 2009+).

I know that GPGPU is a niche market currently, but the technical & scientific server market is huge and high-margin, and it's a market AMD has spent the past three years trying to break into with Opteron.

Something doesn't stack up. Maybe I'm missing something :/
 
Not low-end, mid-end...or mebbe even redefining the low-end a bit. I'm not quite sure.

I don't think they're just aiming for basic graphics with it though, they're aiming for something you could actually game on. I'm personally hoping it raises the minimum system specs for games a bit over time.
 
I'm not sure I see how a Fusion aimed solely at low-end graphics can work for AMD in the GPGPU market.
That's the target market disclosed so far for AMD's CPU+GPU chips.

The strongest hint that AMD isn't ditching the discrete market for GPGPU is that hideously overpriced Radeon card they put out as a streaming processing board.

The CPU socket is not much of a win outside of "modestly better performance per watt", to paraphrase AMD's latest description of Fusion's advantages.

To be competitive in that market surely Fusion is going to have to have the raw floppage, massive bandwidth and immense latency-hiding capabilities of a high-end GPU?
As a socketed CPU, physical realities cut that massive bandwidth by a factor of 10. The latencies will stretch even the best GPU's abilities to hide them, and the CPU and its caches will probably cut the raw floppage by several times what a discrete solution would have.
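To put a rough number on that factor-of-ten claim, here's a back-of-envelope peak-bandwidth comparison. The specific parts and transfer rates are my own illustrative assumptions (period-typical dual-channel DDR2-800 on the socket side vs. a 512-bit GDDR3 board on the discrete side), not figures from the thread:

```python
# Back-of-envelope peak memory bandwidth: CPU socket vs. discrete GPU board.
# All part choices below are illustrative assumptions, not quoted specs.

def peak_bandwidth_gb_s(bus_width_bits: int, mega_transfers_per_s: int) -> float:
    """Peak bandwidth = bus width in bytes * transfer rate."""
    return bus_width_bits / 8 * mega_transfers_per_s * 1e6 / 1e9

# CPU socket: dual-channel DDR2-800 (2 x 64-bit channels @ 800 MT/s)
socket = peak_bandwidth_gb_s(128, 800)

# High-end discrete board: assumed 512-bit GDDR3 @ 1600 MT/s
discrete = peak_bandwidth_gb_s(512, 1600)

print(f"socket:   {socket:.1f} GB/s")    # 12.8 GB/s
print(f"discrete: {discrete:.1f} GB/s")  # 102.4 GB/s
print(f"ratio:    {discrete / socket:.1f}x")  # 8.0x
```

Even with generous assumptions on the socket side, the gap lands at roughly an order of magnitude, which is the point being made above.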

So why only use it for low-end graphics? My understanding is that Fusion is supposed to be a scalable architecture, and to exploit the synergies (I hate that word!) between graphics and high-end compute.
There are apparently fewer easily captured synergies than AMD hoped. Their lack of vision when it comes to outlining a plan to get worthwhile benefits from the two markets is telling.

So I can see why it would work for low-end graphics, but if it can't scale to high-end graphics I'm not sure I see why it can scale to high-end compute (cf. what their competitors will have in 2009+).
You're in good company, AMD doesn't know either.

edit:
This is not to sound anti-AMD. It's just that the litany of stupidly optimistic early promises, then brutal reality checks, and the complete inability to set down a definitive plan infuriates me.
How can they hope to bumble their way to anything?
It's like Gilligan's in charge of their dev strategy.
 
What, we don't get an "Other"? :cry:

Here's some tea leaves to stir: http://www.hpcwire.com/hpc/1282253.html from Hester and Drebin.

I don't like any of the choices offered, primarily because I don't think the current pyramid will survive at the bottom. I think Fusion swallows IGP for sure, lowest-end discrete most probably, and maybe even low-middle discrete. Middle-middle and up I think is likely safe for discrete.

And "next year" is a little quick, unless you mean engineering-wise.
 
How about an Other Other choice? ;)

I'd bet that IGPs would most likely get subsumed within the AMD market (all 23% of the total that it is).
The low-end discrete I can see being encroached from below.

The question I have is where exactly the mixed-signal hardware and non-processing elements go.
x86 chips and their processes basically omit that portion of the hardware world, but I can see an interesting new market opening up at the low end.

The GPU might leave the video card, but the low-end card market would likely remain.
Something needs to interface with the outside world and do all those nifty functions that can't fit on a Fusion die.

There would be a need for a peripheral card at the higher reaches of the low end, where the inherent inflexibility and constraints on features limit the differentiation of a GPU married to a CPU line.
So perhaps it's Other Other.
The low-end GPU market starts to fade, but the low-end card market remains.
If AMD wants to keep a broad market appeal, it better keep making cards for this market as well, or find some trusted board partners.

I still can't wait for the Nvidia card that interfaces with an ATI GPU (this assumes PCI-E would be used for it, and it probably would). Former IGP motherboards would keep the peripheral hardware built in.

note:
A big assumption here is that AMD is able to make the push for Fusion's acceptance. There will be strong forces arrayed against it, and it's not known if AMD can offer the necessary volumes to supply demand if Fusion takes off.
 
Ok...before anyone goes off the deep end with conspiracy theories, let me just run with the disclaimer that this thread is created purely on my own with no particular knowledge or input from AMD, ATI, or anyone else... :)

I'm just curious to see what people think about the idea of AMD pulling out of the discrete enthusiast graphics card business.

What if AMD decided to instead focus purely on developing mainstream GPU technology? This could easily be incorporated into their next-generation FUSION processors with some discrete desktop and/or mobile business possible if margins were favorable. Designs could be modified to adapt to handhelds, consoles, cell phones, etc...

Thoughts???

Note: Again, this thread is strictly created to see what you people think about the idea, nothing more...


The obsession continues..
 
I gotta ask, was the original question your own Pelly or did someone/some corporation suggest it to you?

LOL!!!

I assure everyone that this was purely a question I posted based on my own curiosity. Since the acquisition, there have been rumors and speculation that ATI's standard way of doing things could possibly change dramatically. Based on information we've all heard since then and some ramblings at CES, I was curious to see what members here had to say on the topic. I thought my excessive disclaimers would prevent questions like this.....though I'm more than happy to field a question or two from you dig.... :)
 
I think Fusion swallows IGP for sure, lowest-end discrete most probably, and maybe even low-middle discrete.
The IGP business, maybe, if all future AMD processors will be part of the Fusion project and have integrated hardware acceleration. But I don't see them dropping their low-end and mid-range discrete part production.
Unless they absolutely do not want to sell parts to OEMs that choose Intel CPUs, because they started making money all of a sudden or something along those lines.
 
The IGP business, maybe, if all future AMD processors will be part of the Fusion project and have integrated hardware acceleration. But I don't see them dropping their low-end and mid-range discrete part production.

Unless they absolutely do not want to sell parts to OEMs that choose Intel CPUs, because they started making money all of a sudden or something along those lines.

Well, the unspoken part of my analysis was Larrabee swallowing the same on the Intel side. :smile: Tho midrange, even low midrange, might be a bridge too far. We'll see. It's likely neither AMD nor Intel knows for sure yet how far up the foodchain they can drive these.

Remember Jen-Hsun's little sneer, "Maybe they'll save a buck or two". That would be, I'd guess, a reference to swallowing the IGP stuff. But I'd also guess that's the minimum they're after. That's their risk limiter --they make that happen and it wasn't a total waste of time and money. The actual goals are probably more ambitious than that. . .
 
I think it would be interesting to see AMD segment the whole GPU business into two segments: one mainstream and the other enthusiast. Having two different segments would allow them to easily manage different lifecycles and competition.

On the mainstream front, they could focus on Intel and making FUSION/IGP a success.....On the enthusiast front, they could focus on NVIDIA and segues into high-end consoles. The biggest advantage here would be being able to cope with 6mo product wars and refreshes, whereas the battle with Intel is more of an annual ordeal.

Make no mistake, I want AMD/ATI to stay in the discrete GPU industry...Not only have they pushed the industry for higher image quality, but they've helped keep GPU prices (somewhat) sane thanks to competition. Not much good could come from having NVIDIA largely running the show on their own...

Times like this, you wish you could push a fast-forward button to see how it will all work out... :)
 
I think that's what you'll see too. From what I've gathered the IGP will be the new midstream and the enthusiast will be the high and super high-end stuff. ;)
 