Fusion die-shot - 2009 Analyst Day

I was just wondering... which market is Llano aimed at? I feel that nowadays there are only netbooks/nettops and performance desktops... so if you aim for netbooks or notebooks, wouldn't it make more sense to scrap Llano and go straight for Ontario (or a quad-core Bobcat instead of an old, power-"inefficient" K10.5)? It would also probably save some engineering resources to accelerate the arrival of Bulldozer, which would be crucial at that point in time. I really don't see the point of creating a new platform that would certainly end in a dead end.
 

Depends on sizes and power usage, no?

Netbooks require a really power-efficient chip; that is why Atoms are really big in that field. Not just power efficient, but also very cool running.

Also, lower-end laptops with turbo mode would go well against Intel's CULV chips.

Bobcat would be good, but it's most likely not coming until after Llano can launch, and maybe it will be more power- and heat-demanding than Llano and end up in low-end laptops and higher. Maybe even cheap towers.
 

Any system with a basic desktop processor and IGP would be the target for Llano. That means inexpensive desktops, some laptops, or office machines.

And according to AMD, it wouldn't make sense to scrap Llano at this point. It would mitigate risk if a well-understood core is used to transition to Fusion, which is already a GPU integration+new process product introduction. Given that AMD is trying to rebuild some battered credibility with product releases coming in on time, it doesn't want a black eye from a Bulldozer core delaying Fusion (again (again (again))).
 
12" netbooks still have quite a bit of juice in them... too much for Ontario to make sense.
 

15" laptops don't make themselves heard, as they are still the same as usual and thus a bit boring, but I'm sure the market is still there, and huge. As in, a huge percentage of the population has one as their main and only machine. (Some people have netbooks as their only machine too.)

It also makes for a high-performance desktop. As in, it has about 10x the CPU power of the Pentium 4 or similar PC it would be replacing, and a GPU similar to a 9500GT or so, much faster than the FX 5200 or such it would be replacing.

As for your concern about it being a dead end: maybe from an architecture point of view, but Bulldozer-based Fusions will probably plug into the same mobos. So I can't exactly see it as a dead-end platform.
 
I see your point, but still, most CPUs have enough punch to accomplish what the main market wants, which is office work, internet, and watching (not encoding) videos. It explains why a subpar CPU like Atom managed to completely rock the boat. From what I understand, Ontario would be more powerful than a dual-core Atom (maybe on par with current CULV?) with an adequate graphics part for this market. Even I, who have been an enthusiast since I overclocked/modded my 286(!!), bought a CULV instead of a better-performing laptop. I feel that at the level of performance we get now, power efficiency has become more important than performance. Let's make things clear: more than 95 percent of buyers don't care about 3DMark and SuperPi. Creating a dying platform/socket would just confuse the consumer. I still think it would be better to have only two lines of products: Ontario derivatives for netbooks/laptops/nettops/low-end desktops (you can still increase the frequency if you want to ramp up performance), and Bulldozer for desktops/high end. Llano simply doesn't fit the requirements of netbooks because it would eat at least 20 W alone (5 W minimum per core). It would be dead before arrival because of the new Core i3 and i5 (which would probably have their CULV counterparts by 2011...).
 
What dying platform?

Llano serves notebooks and desktops, which when combined are the vast majority of the market for both personal computing and corporate PCs.

The sockets in use will probably be used by successors, for which the implementation details of Llano are unimportant for the platform's longevity.
 
OK, I did not get that last point. I did not know that you could keep the same platform/sockets when you change the core structure (replacing the K10.5 core with a Bobcat or successor). In my view you could not replace the cores without changing sockets. It would be most regrettable if Llano creates a platform like Socket 939, which only lasted a year and a half. For OEMs, having a stable platform would be more practical (just fitting higher-frequency CPUs as they come).
 
Don't forget business laptops and desktops. I haven't heard of netbooks being popular in this market. Plus, as others have said traditional notebooks are still very popular.
 
If you look back at all AMD sockets since 754, you can see that sockets are tied to memory support (single-channel DDR, dual-channel DDR, registered DDR, DDR2, registered DDR2 and so on). Doing away with the FSB and using HyperTransport makes your socket platform very stable.

Intel has only done the same recently, and is about to launch a new high-end socket with quite a twist: it will be used for Nehalem-EX and Itanium processors, which don't even use the same instruction set.

Regarding AMD Fusion, the Bobcat Fusion might be intended for a smaller, lower-footprint socket, while Llano and Bulldozer Fusion will use a socket similar to AM3.
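The socket-follows-memory pattern described above can be laid out concretely. A rough sketch (the socket-to-memory pairings are from memory and meant as illustration, not an exhaustive list):

```python
# Rough mapping of AMD sockets (754 onward) to the memory support
# each one introduced -- illustrating the claim that AMD socket
# changes track memory-controller changes, not CPU-core changes.
AMD_SOCKET_MEMORY = {
    "754": "single-channel DDR",
    "939": "dual-channel DDR",
    "940": "registered DDR",
    "AM2": "dual-channel DDR2",
    "F":   "registered DDR2",
    "AM3": "dual-channel DDR3",
}

for socket, memory in AMD_SOCKET_MEMORY.items():
    print(f"Socket {socket}: {memory}")
```

Since the integrated memory controller and HyperTransport links define the pinout, a new core that keeps the same memory type can drop into the same socket.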
 
Thanks for the enlightenment. I was also thinking along my line of Bobcat being better than K10.5. I thought the last few years have been going toward more threads and cores. Also, I was thinking that Bobcat is supposed to be "50% of the size and 80% of today's performance". I suppose this applies per core... therefore a quad-core Bobcat would have provided 4 threads in a size similar to a dual-core K10.5. With the current tendency, wouldn't it provide a better laptop/desktop experience? If you look at the whole boring laptop market, wouldn't a corporate user want more battery life to do his work on a transatlantic flight? Wouldn't anyone choose better battery life rather than "performance", considering that the main usage is office work?
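Taking the quoted "50% of the size and 80% of the performance" figure at face value and applying it per core, the area/throughput trade works out like this (illustrative normalized numbers only, not measured data):

```python
# Back-of-envelope math, assuming the quoted per-core figures:
# a Bobcat core is ~50% the area and ~80% the per-thread performance
# of a K10.5 core (both normalized so K10.5 = 1.0).
K10_AREA, K10_PERF = 1.0, 1.0
BOBCAT_AREA, BOBCAT_PERF = 0.5, 0.8

dual_k10_area = 2 * K10_AREA              # 2.0 area units
quad_bobcat_area = 4 * BOBCAT_AREA        # 2.0 area units -- same silicon budget

dual_k10_throughput = 2 * K10_PERF        # 2.0, spread over 2 threads
quad_bobcat_throughput = 4 * BOBCAT_PERF  # 3.2, spread over 4 threads

ratio = quad_bobcat_throughput / dual_k10_throughput
print(f"Same area, {ratio:.0%} aggregate throughput, "
      f"{BOBCAT_PERF:.0%} per-thread performance")
```

So for the same die area you would get about 1.6x the multithreaded throughput, at the cost of each individual thread running at 80% speed, which is the trade the post is asking about.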
 

It's better for the first Fusion product to be based around an existing core rather than one that is still in development.

Once you have the concept implemented and the design hurdles overcome, or at least understood, you can then look at implementing more advanced CPU cores. But even then, don't expect Fusion variants to launch day-and-date with pure CPU cores.

Especially when, in most cases, your initial new-generation CPU core will be targeted toward the server/workstation market, where an integrated GPU doesn't currently make sense (and would result in wasted transistors and greater power consumption).

In the future, if OpenCL/DirectCompute takes off, it may become a key feature in that segment. But until then, Fusion will always lag behind a "pure" CPU core.

Regards,
SB
 
AMD went for 1st gen Z-RAM, then 2nd gen Z-RAM.
Now it's T-RAM.
Maybe the third pie-in-the-sky memory type is the charm?

T-RAM's operating principles are based on some rather novel ideas, though it sounds very new and not field-tested.
Z-RAM initially relied on what was otherwise a sometimes-problematic side effect of PD-SOI, one that is going away eventually. The 2nd gen had some other idea.
It sounds like Z-RAM "worked", in that there were functioning devices. It did not work in the sense of reaching the required performance and reliability.

On-die eDRAM at least has one high-performance CPU using it. Sure, it's a chip that goes into systems that can cost more than a house (or neighborhood), but it's at least something real, and something that has had to face far more stringent requirements than AMD's flavor-of-the-year memory tech.

At least as far as the first instantiations of Bulldozer, the cache capacities are not out of line with what we'd expect of SRAM.

Thyristor-based memory is actually a lot more reasonable than the floating-body effect that the Z-RAM guys were/are working on. The problem with the floating-body effect is that there doesn't seem to be a whole lot of charge to work with.

T-RAM has been doing test chips since 1999, initially at Stanford, then at T-RAM itself.

David
 
I don't know enough about the timeline to say, though I feel it may be a bit of overreach to have that level of certainty, as it does presume everything goes swimmingly with Llano and Intel's equivalents during that time period (it seems unlikely that there will be massive problems, though).

The arguments for the ending of the low-end discrete segment are strong.
Llano's on-chip GPU will end the need for motherboard graphics chips for the AMD platform.
Cheap graphics boards do not have stratospheric memory bandwidth numbers, due to cost and limited need.

Without the major lead that higher-end boards have in memory bandwidth, Llano's GPU won't compete in the higher segments. The value end has bus widths comparable to a CPU, and the cheapo RAM they do use is sometimes the same grade as what is used for a PC.

Outside possibly some kind of niche multimedia features or the need to update old systems, Llano's GPU will be sufficient, and potentially superior to the low-end discrete cards.

As far as costs go, system builders for the low to possibly lower-mid range could skip the GPU component completely. Motherboard complexity will likely go up somewhat with the larger socket and extra output needed, but that can achieve much better economies of scale than multiple tiers or slightly different cards.
This might be made up for by possibly reducing the high lane-count graphics slots or removing the extra space mobile products set aside for a separate package and DRAM.

Even if Via and Nvidia teamed up, Via's minimal CPU presence is not going to bring the volumes of AMD or Intel's similar efforts, hence Nvidia's hopes for opening up any kind of market outside of the traditional fields.

edit:

Later on he added some posts about the effect the lower-end Evergreen variants are having on Nvidia, a factor that is already having immediate effects.
 
Without the volumes of <$100 parts, will the R&D for the big parts still be feasible?

My guess is that >80% of the unit volumes are in the <$100 market.
 
I'm not convinced that Llano will kill low-end cards. It might get the $20–$50 parts, but I don't see how a GPU sharing the CPU's memory bandwidth is going to compete against anything higher.

The 5670 at $100 has 64 GB/s of bandwidth. The Core i7 needs three memory channels of DDR3-2000 to get 48 GB/s.

I don't know any company that will ship a low-end computer with three channels of RAM running at 2000 MT/s each. That's aside from the point that there is still a CPU sharing that same bandwidth.
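The gap cited above falls out of simple peak-bandwidth arithmetic (theoretical peaks only; sustained bandwidth is lower, and on the CPU side the cores contend for the same channels):

```python
# Peak memory bandwidth sketch for the numbers cited above.
# A DDR channel moves (bus width in bytes) per transfer, at the
# effective transfer rate in MT/s.

def peak_bandwidth_gb_s(channels, mt_per_s, bus_bits=64):
    """Peak bandwidth in GB/s: channels * rate * bytes per transfer."""
    return channels * mt_per_s * (bus_bits // 8) / 1000

# Core i7 with three 64-bit channels of DDR3-2000:
cpu_bw = peak_bandwidth_gb_s(channels=3, mt_per_s=2000)               # 48 GB/s

# Radeon HD 5670: one 128-bit GDDR5 bus at 4000 MT/s effective:
gpu_bw = peak_bandwidth_gb_s(channels=1, mt_per_s=4000, bus_bits=128)  # 64 GB/s

print(cpu_bw, gpu_bw)  # 48.0 64.0
```

Even an exotic triple-channel setup lands a third short of a $100 card's dedicated memory, before the CPU takes its share.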

Then you have the fact that anyone who has a Llano might at some point need a better GPU. You might be able to put in a faster CPU with a faster GPU, but most likely it will still be the same generation and tied to the same bandwidth. Meanwhile, two years down the road, GPUs would have gained access to even faster RAM and newer generations of GPU design.

So I really think we will see low-end GPUs still exist and have a healthy life, because they will still be viable as cheap upgrades for games and other things. You'll also still get better OpenCL, Java, Flash, and whatever-else performance. Now, maybe in laptops the cheap add-in GPUs at the low end of the spectrum are gone. But people will still want dual GPUs in them, and still want more than what the CPU/GPU can provide.
 
I am concerned about the bandwidth as well. But the prize of the $100 GPU market is large enough to warrant putting in some dedicated GDDR. Not to mention that this would encourage modern-day GPUs to become more bandwidth efficient.
 

But where will the GDDR go? They won't put it on-die; it would increase the chip size. I doubt they will have it on the motherboard either.

I think the $100 GPU market will stay a discrete market. I think the IGP market will go away, though, and maybe the sub-$100 market. But even then I have my doubts.

The 5670 has 64 GB/s now. The 6570 will have 64 GB/s and be $80 and under. The 7570 will have 64 GB/s and cost $50.

How much will RAM bandwidth go up in the desktop market? Generally, new RAM is tied to new sockets.
 

They could make an MCM for the 5670 equivalents. Then it isn't tied to sockets any more and can provide more bandwidth per mW.

But yeah, $100 GPUs will resist cannibalization much better.
 