The AMD/ATI "Fusion" CPU/GPU Discussion

The new AMD-ATI venture is something that will evolve gradually.
The most important part right now is that key brains (HW & SW architects and designers) on both sides are probably working together and thinking about how they can make something that:
- Makes market sense
- Leverages the combined capabilities, know-how, IP, etc.

They will learn from each other, and hopefully good, marketable things will come of it.

This will take time, but within another four CPU process generations (16nm by ~2014) the full integration of CPU and GPU in a single chip is almost inevitable (ISA, memory hierarchy, data management, etc.).
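Just to sanity-check the timeline implied there (assuming the usual ~0.7x linear shrink per node and a roughly two-year cadence, both rules of thumb rather than any actual vendor roadmap), a quick sketch:

```python
# Rule-of-thumb node scaling: ~0.7x linear shrink per generation,
# roughly every two years, starting from 65nm in 2006.
node_nm, year = 65.0, 2006
for _ in range(4):
    node_nm *= 0.7
    year += 2
    print(f"~{node_nm:.1f} nm around {year}")
# -> ~45.5 nm (2008), ~31.8 nm (2010), ~22.3 nm (2012), ~15.6 nm (2014)
```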

For 45nm they don't have time to do something completely new, just target some key market using the old cookbook. Maybe target the nascent low-cost, pervasive ultraportable market (probably the key market worldwide for the future): something between a notebook and a PDA with strong graphics capabilities.
 
It seems to me that the biggest advantage for "ATI"/GPU may be that AMD might spend more time doing custom-XX logic instead of using standard libraries for everything. On the other hand, customization seems to lengthen product development cycles enormously and is not necessarily a panacea, any more than writing an entire application in assembly language is better than writing it in C++. Sometimes it makes sense, in the same way that breaking out the most expensive hotspot code into assembly makes sense, but I'm not sure AMD holds any inherent advantage over Nvidia here, since the units most likely to be "optimized" (the ALUs) seem to be the ones Nvidia has already done custom R&D designs for.

I think Intel will have problems producing a competitive GPU core regardless of how many engineers they hire, just like NVidia isn't likely to whip out a Core 2/K8L competitor just by hiring a few hundred engineers to work on it. There are lessons learned and skills built from working on the same solution/product over several cycles that you can't get by reading patents, textbooks, and journal papers alone; it's not just individual experience, but organizational experience as well. The industry is littered with companies that "staffed up", tried to crack the CPU and GPU markets, and fell off the charts. Most of them are gone now, except in the mobile and server spaces, and even there, commodity hardware from major providers is killing off the little guys and big guns alike. Will Niagara, PowerPC, et al. continue to survive? MIPS, PA-RISC, non-Niagara SPARC, et al. are all dead. Except in niche cases (like BlueGene/L or cheap datacenter hosting providers), x86 commodity hardware still seems to be eroding their market. The mobile market seems to be converging as well, onto XScale and OMAP. Everyone except Intel and TI has been losing market share in recent years.

While I think it would be interesting to see an NVidia x86 main CPU, I think it would be a HUGE distraction from their core business. In order to even have a hope of being competitive, they'd have to devote enormous resources and effort to it, with limited possible results and a very long ROI timescale. If they fail (the most probable outcome), they'll end up like Cyrix, Transmeta, and umpteen other failed x86 clones. If they succeed, maybe they'd be another AMD, but AMD's business position isn't exactly spectacular compared to NV's position.

If I were in NV's position, investing in the graphics market would look like it brings higher ROI/ROA/ROE and better margins, with less entrenched players (no Intel to fight). Investing in the CPU market looks like a sinkhole to throw money down, since even in the best case the margins and returns are worse than GPU, mobile, logic, etc. investments, and then you have a huge uphill battle to fight against Intel and AMD.

No, if I were NV and looking at making any kind of CPU, I'd go after the mobile market, where things are a lot more fluid, and where TI, STM, Motorola, Renesas, Qualcomm, NEC, et al. are far less entrenched and less assured of their positions, thanks to weaker technological path-dependency in those markets. NV could either partner with one of those, or design their own mobile DSP/CPU with an integrated graphics core for handhelds, mobile phones, iPods, etc.

Or, alternatively, they could buy Transmeta, produce a low-power x86-capable chip with integrated MCU and MGPU capabilities, and sell it for the ultra-lightweight (i.e. not bulky desktop replacement) notebook market and Origami devices. It could also be embedded in living-room media center devices (not computers, but stuff like iTV), as well as in automotive/telematics applications, especially since most GPS navigation systems I've demoed have TERRIBLE framerates.

But trying to go up against Intel/AMD in the mainstream x86 market seems like a terrible business proposition, and if I learned that they are actually doing that, I'd sell off my NVDA shares and switch to shorting them.
 
But trying to go up against Intel/AMD in the mainstream x86 market seems like a terrible business proposition, and if I learned that they are actually doing that, I'd sell off my NVDA shares and switch to shorting them.

I think I agree with this; I can't see how it can happen if you look at the birthing pains of AMD in the mainstream x86 market. There are many other avenues open to NV of course; it's a question of guessing which they'll go down. Mobile? HPC? Media/STB stuff? Maybe NVIDIA are looking ahead to Xbox 720 and PS4?
 
But trying to go up against Intel/AMD in the mainstream x86 market seems like a terrible business proposition, and if I learned that they are actually doing that, I'd sell off my NVDA shares and switch to shorting them.

Yes. Given how we've previously talked about the drain on resources and subsequent delays on other products when companies like ATI and Nvidia design a graphics chip for a console, how on earth will they deal with a mainstream CPU?

Nvidia and (the company formerly known as) ATI are not huge companies, and designing a CPU and going up against two massive entrenched incumbents like Intel and AMD is an Everest to climb.

However, it seems an obvious, if difficult direction for Nvidia to go in now that the writing is on the wall for them to potentially be frozen out of a large portion of the graphics and chipset market in future years by both Intel and AMD/ATI. The CPU manufacturers have seen the markets that ATI and Nvidia have in chipsets and graphics, and they'd rather have it for themselves. Nvidia have got to find new areas to move into before Intel and AMD grind their market away.
 
Sweeney talks to FiringSquad about D3D10:

http://www.firingsquad.com/hardware/directx_10_graphics_preview/page8.asp

Talk of "adding physics features to GPUs" and so on misses the larger trend, that the past 12 years of dedicated GPU hardware will end abruptly at some point, and thereafter all interesting features -- graphics, physics, sound, AI -- will be software problems exclusively.

The big thing that CPU and GPU makers should be worrying about is this convergence, and how to go about developing, shipping, marketing, and evolving a single architecture for computing and graphics. This upcoming step is going to change the nature of both computing and graphics in a fundamental way, creating great opportunities for the PC market, console markets, and almost all areas of computing.

Jawed
 
He's been selling that message for several years now. I don't know that it really would be in the cards if the CPU makers hadn't hit a clock speed wall, forcing them to go wider rather than faster. . . and that wasn't readily apparent (at least I don't remember it being so) at the time he first predicted it.

If it's going to happen, at least thank goodness there will be some graphics "adults" in the room to provide reality checks to the CPU zealots, at least at AMD.
 
If it's going to happen, at least thank goodness there will be some graphics "adults" in the room to provide reality checks to the CPU zealots, at least at AMD.
My guess is that some of these graphics "adults" are jumping from one side to the other out of pure happiness, thinking about the early access to advanced fabrication processes, much greater resources, and longer development cycles.
 
However, it seems an obvious, if difficult direction for Nvidia to go in now that the writing is on the wall for them to potentially be frozen out of a large portion of the graphics and chipset market in future years by both Intel and AMD/ATI. The CPU manufacturers have seen the markets that ATI and Nvidia have in chipsets and graphics, and they'd rather have it for themselves. Nvidia have got to find new areas to move into before Intel and AMD grind their market away.

Discrete solutions ain't going away, and right now, integrated solutions are only 17% of NVidia's revenues. Should they spend massive amounts of R&D to bet the whole company on making a CPU to guard against losing 17% of their business? (Not considering how this figure will be diluted by PS3 and PSP2 revenues.)

What we're seeing is a repeat of consolidation arguments like those from the 3dfx days, or the "you can't be a competitive chip maker without investing in and building fabs" argument during the early days of Intel competition. Anyone who didn't merge with a fab owner or build their own fab was going to lose out, and discrete vendors which did not "own" retail card production capacity were going to lose out as well. I mean, if you own your own fabs, you aren't subject to your competitors competing for capacity on a shared fab and driving up prices, and of course, if you make your own cards, you get a cut of retail revenue. How efficient, cutting out the middle men.

At best, Nvidia might risk being locked out of the core logic market, but the company was doing outstandingly well before it even entered that market, and realistically, it's a distraction from their core competency, which is graphics (even if they seem to be quite good at making core logic). In my opinion, the core logic chipset market is headed for commodification, in contrast to discrete, which has a lot more legroom before it is "commodified". That means Intel and AMD will eventually be competing on price alone, and that's a recipe for shitty future margins.
 
OK, what about new markets? Such as handheld. If AMD produces a combined single-core x86 CPU/GPU handheld processor and Intel does the same, how is NVidia supposed to compete?

Jawed
 
OK, what about new markets? Such as handheld. If AMD produces a combined single-core x86 CPU/GPU handheld processor and Intel does the same, how is NVidia supposed to compete?

What's the compelling reason for using x86 in a handheld over, e.g., ARM?
 
At best, Nvidia might risk being locked out of the core logic market, but the company was doing outstandingly well before it even entered that market, and realistically, it's a distraction from their core competency, which is graphics (even if they seem to be quite good at making core logic). In my opinion, the core logic chipset market is headed for commodification, in contrast to discrete, which has a lot more legroom before it is "commodified". That means Intel and AMD will eventually be competing on price alone, and that's a recipe for shitty future margins.

The difference is that commodification is also going to take place in the graphics segment, which is Nvidia's core business. In a couple of years, we could be seeing CPU/GPU hybrids that give you "enough" graphics power, i.e. G80/R600 levels of power (even though those will have been superseded by then in the discrete graphics segment). For a lot of people, except the high-end gamers, that's more than adequate.

It's quite possible that the mid to mid-high segment of discrete cards goes away, to be replaced by Fusion-type products from both AMD and Intel. The people that are today buying low, mid, and mid-high cards will simply buy them as part of their CPUs. All but the highest-end products simply get moved to the multi-core package and take the place of a couple of (mostly idle) cores that sit in the CPU socket.

Can Nvidia survive on just the discrete high end, especially when ATI/AMD will more easily be able to continue to offer discrete as part of their overall graphics business, while taking away the cash cow of Nvidia's core business?

The danger is obviously that without a Fusion-type product Nvidia could lose out if AMD and Intel are shipping products that don't need a discrete graphics card, and if the stories are true, Nvidia believes this is the case to the extent that they are doing something about it by designing their own x86 core.

I like discrete cards, and I'd never buy an integrated solution because of the poor performance for gaming, but what happens the day when all the transistors currently in a discrete chip are part of a multicore chip? Am I going to need a discrete part when my "processing package" has quad CPU cores and quad R600 cores all in the one package? Whether it all sits in one socket or half of it is on a card in a PCIe slot really makes no difference as far as the end-user experience goes.

The AMD/ATI merger has taken one step closer to that, and if Intel follows suit (god knows both Intel and AMD are looking for extra things that can make all these multicores useful), then Nvidia might not have any choice but to follow on down that road or be left behind.

Despite all that chest-thumping when ATI announced they were selling out to AMD, I think that Nvidia are worried. The industry landscape of their core business is being changed around them, and if they are not part of it, they are going to be left out in the cold when it all washes out in five years' time. Intel and AMD used to need Nvidia and ATI. They needed their chipsets, they needed their innovation, they needed the graphics cards to showcase all that CPU power and to give multiple cores something to do. Now both AMD and Intel are working on making ATI and Nvidia unnecessary. ATI gets subsumed for AMD to make a technology leap on Intel, Intel does its own thing for chipsets and makes deals with Imagination for graphics, and Nvidia are left out in the cold.
 
My guess is that some of these graphics "adults" are jumping from one side to the other out of pure happiness, thinking about the early access to advanced fabrication processes, much greater resources, and longer development cycles.

I'm not sure the development cycles really are longer anymore, at least for major architectural changes. The evidence suggests that G80 was under development for nearly four years, and if you go back to the R400 roots of R600, possibly as long for R600.

But the major caveat/wildcard there might be MS continually punting Vista backwards.
 
The danger is obviously that without a Fusion-type product Nvidia could lose out if AMD and Intel are shipping products that don't need a discrete graphics card, and if the stories are true, Nvidia belives this is the case to the extent that they are doing something about it by designing their own X86 core.
I have serious doubts about Nvidia's chances with their own x86 design, at least in the time frame they have.

Did they get the license for it? I know VIA got theirs when it bought Cyrix, and AMD has a court-mandated cross-licensing deal. Transmeta went through software emulation to do theirs.

Where did Nvidia get the license?

I'm not aware of the licensing around Intel's CPU bus, but it's not likely Intel will let Nvidia use that as a platform.
Nvidia could possibly use coherent hypertransport, since AMD has allowed other vendors to use it, but that's still up to AMD/ATI.
Even if Nvidia could go hypertransport for its CPU, it would be going against a very established AMD by that point.

Nvidia would have to design not only a worthwhile x86 CPU, but also the system platform to go with it. Then it would have to get it accepted by the rest of the market, and that's very hard to get going.

Remember how painful things were when AMD was forced to go to its own Socket A? And that was with just Intel pressuring motherboard manufacturers and chipset makers.

What's Nvidia going to do when both Intel and AMD/ATI are trying to slow it down?
No motherboard manufacturer is going to have the spine to face the two manufacturers that make up pretty much all the market in x86.

I guess Nvidia could hope for less than VIA C3 volumes for its low-end integrated chips on its pricey new platform that nobody will dare support publicly.

Am I going to need a discrete part when my "processing package" has quad CPU cores and quad R600 cores all in the one package? Whether it all sits in one socket or half of it is on a card in a PCIe slot really makes no difference as far as the end-user experience goes.

Considering that a single R600 core might need as much bandwidth as all four CPU cores, how are they going to fit the pins for four R600s and four CPUs onto a die that will not be much larger than what we have now?
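To put rough numbers on that bandwidth gap (the bus widths and data rates below are era-typical assumptions, not the specs of any particular part), a back-of-envelope sketch:

```python
# Peak theoretical bandwidth = bus width (bytes) * data rate (MT/s).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_mts: int) -> float:
    """Peak bandwidth in GB/s for a memory interface."""
    return bus_width_bits / 8 * data_rate_mts / 1000

# One high-end discrete GPU of this era: 256-bit GDDR at ~2000 MT/s effective.
gpu_core = peak_bandwidth_gbs(256, 2000)   # ~64 GB/s

# A whole quad-core CPU socket: dual-channel DDR2-800 (2 x 64-bit at 800 MT/s).
cpu_socket = peak_bandwidth_gbs(128, 800)  # ~12.8 GB/s

print(f"single GPU core : {gpu_core:.1f} GB/s")
print(f"CPU socket total: {cpu_socket:.1f} GB/s ({gpu_core / cpu_socket:.0f}x less)")
```

So even one R600-class core would want several times the bandwidth the whole socket gets today, before you add the CPU cores' own traffic.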

The AMD/ATI merger has taken one step closer to that, and if Intel follows suit (god knows both Intel and AMD are looking for extra things that can make all these multicores useful), then Nvidia might not have any choice but to follow on down that road or be left behind.

Nvidia's best hope is to hammer ATI's offerings now and hope Intel continues to hammer AMD's lineup.
AMD isn't on the firmest of footing, and ATI isn't now, either. It hasn't been determined if this is a match that will stand the test of time.

With enough strain, AMD might be forced to spin off its assets again.

Failing that, Nvidia could enter into some limited partnership deal with Intel, with Intel being the much more senior partner.
 
I have serious doubts about Nvidia's chances with their own x86 design, at least in the time frame they have.

<snip>

What's Nvidia going to do when both Intel and AMD/ATI are trying to slow it down?
No motherboard manufacturer is going to have the spine to face the two manufacturers that make up pretty much all the market in x86.

Yes indeed, I actually agree with most of your points, and the licence issue is most pertinent, but what else can Nvidia do if they foresee a future where all that's left for them in the PC space is the very high end discrete market that they have to fight for against Intel and AMD/ATI?

Considering that a single R600 core might need as much bandwidth as all four CPU cores, how are they going to fit the pins for four R600s and four CPUs onto a die that will not be much larger than what we have now?

You won't need pins as such; the transistors that make up a graphics chip will be incorporated into the die, just like a second CPU core is now. They didn't need to add more pins when they added a second core. HyperTransport seems to be getting regularly upgraded in speed and bandwidth, so maybe that's how they'll handle the issue, along with faster and wider memory. I'm sure that AMD didn't just announce Fusion without some idea of how they will handle the memory issues that come with it.


Nvidia's best hope is to hammer ATI's offerings now and hope Intel continues to hammer AMD's lineup.
AMD isn't on the firmest of footing, and ATI isn't now, either. It hasn't been determined if this is a match that will stand the test of time.

With enough strain, AMD might be forced to spin off its assets again.

I doubt Nvidia has the time or money to get into a price war with AMD, with whom they still have a lot of lucrative deals. Not only is "hoping the other guy breaks first" a poor strategy, it's just as likely to damage Nvidia.

Failing that, Nvidia could enter into some limited partnership deal with Intel, with Intel being the much more senior partner.

That's been mooted, but it seems that the personalities at the top won't let that happen.
 
Motherboard-integrated graphics were once seen as a sure death knell for discrete ones. While the conversation has now shifted to CPU integration, the equation has not fundamentally changed: stand-alone GPUs will still offer better performance and greater flexibility than integrated ones.
 
Motherboard-integrated graphics were once seen as a sure death knell for discrete ones. While the conversation has now shifted to CPU integration, the equation has not fundamentally changed: stand-alone GPUs will still offer better performance and greater flexibility than integrated ones.


Dunno, the leading graphics producer by volume is Intel. The question I ask myself is how far up the discrete graphics food chain integrated has to bite before, *even if it can't compete with the very top-end discrete graphics on performance alone*, it makes providing those top-end discrete solutions uneconomical? We should not forget that a lot of what makes the R&D investment at the top end possible is the ability to amortize it over the much higher-volume/lower-margin business in the midrange and low end. If what is today "X1300/X1600" territory gets swallowed by integrated, then there is much less volume to amortize that high-end R&D investment over for discrete standalone parts.
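A toy illustration of that amortization point (the figures are invented purely to show the shape of the argument, not anyone's actual financials):

```python
# Fixed architecture R&D spread over every chip that shares it.
def rnd_per_unit(rnd_cost: float, units_sold: float) -> float:
    return rnd_cost / units_sold

RND = 400e6  # hypothetical cost of one top-to-bottom GPU architecture

# Today: amortized over the whole low/mid/high lineup.
print(rnd_per_unit(RND, 40e6))  # ~$10 of R&D baked into every chip sold

# If integrated swallows the low and mid range, the high end alone
# has to carry the same fixed cost.
print(rnd_per_unit(RND, 4e6))   # ~$100 of R&D per chip
```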

The other point is that I don't know that today's "integrated" is a fair comparison to what is being discussed here. On-die integration is going to provide wildly greater interconnect speeds than northbridge-CPU, and that will have consequences. I think, upon reflection, that's what Matt Pharr's paper in another thread was saying, rather than just talking about Cell per se.

So, sure, speed to graphics memory is still an issue, but there *are* some countervailing advantages that come to the table to help balance that out.

Like I said, I'm still from Missouri on whether it could really take on the high end toe-to-toe on performance, all else being equal. . . but if it kills off the high end by making it uneconomical, then it still "wins" (whether I like it or not).
 
Dunno, the leading graphics producer by volume is Intel.

Yeah… I thought about that. However, in the span that Intel's market share went from 2% to the 35-40% it is now, Nvidia and ATI quadrupled their profits and revenue. I am sure that Intel's integrated graphics has cost both of them sales, but it did not impede a very substantial level of growth.

The question I ask myself is how far up the discrete graphics food chain integrated has to bite before, *even if it can't compete with the very top-end discrete graphics on performance alone*, it makes providing those top-end discrete solutions uneconomical? We should not forget that a lot of what makes the R&D investment at the top end possible is the ability to amortize it over the much higher-volume/lower-margin business in the midrange and low end. If what is today "X1300/X1600" territory gets swallowed by integrated, then there is much less volume to amortize that high-end R&D investment over for discrete standalone parts.

It's a fair question. I am sure a rule of thumb could be derived along the lines of: as long as low and mid-low range chips outperform integrated solutions by X% (let's say 60-100%), they will remain viable.
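Something like this, perhaps (a minimal sketch of that rule of thumb; the 60% threshold is just the hypothetical figure from the paragraph above, not a measured one):

```python
def discrete_stays_viable(perf_discrete: float, perf_integrated: float,
                          required_advantage: float = 0.60) -> bool:
    """True if the discrete part beats integrated by at least the required margin."""
    return perf_discrete / perf_integrated - 1 >= required_advantage

print(discrete_stays_viable(170, 100))  # True: 70% faster, still worth a slot
print(discrete_stays_viable(125, 100))  # False: 25% faster, probably not
```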

The other point is that I don't know that today's "integrated" is a fair comparison to what is being discussed here. On-die integration is going to provide wildly greater interconnect speeds than northbridge-CPU, and that will have consequences. I think, upon reflection, that's what Matt Pharr's paper in another thread was saying, rather than just talking about Cell per se.

True, the performance will be greater, but the fundamental issues of transistor budget limitations compared to non-integrated solutions, and lack of flexibility, are still there. Plus, we should also note the emergence of a surprising trend in graphics: multi-GPU solutions. How do the "consumers want everything on a single die" and "consumers want multiple GPUs if that means better performance" market assessments reconcile?

EDIT: Yeah, I know that integrated and multi-GPU are two completely different market segments. However, their success does say something about the current market mentality.
 