Trinity vs Ivy Bridge

Why only one module? There is a 35W 8-core Bulldozer at 1.6GHz with turbo to 2.8GHz.

Yes, but Bulldozer is the CPU only. Trinity must also integrate the GPU, and the GPU is a considerable power consumer within the package. If you 'only' cut the CPU in half, you'd still be over the 17W power budget. The core count will necessarily drop to roughly a quarter of Bulldozer's 8-core design in order to leave enough power envelope for the GPU.
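
For illustration, a rough back-of-envelope of that budget argument (every number below is my own guess, not an AMD spec):

    # Back-of-envelope sketch of the 17W budget argument.
    # Every figure here is an illustrative guess, not an AMD spec.
    package_tdp = 17.0   # W, target for the ultrathin part
    gpu_share = 6.0      # W, guessed GPU allocation under combined load
    uncore = 3.0         # W, guessed memory controller / display / I/O
    cpu_budget = package_tdp - gpu_share - uncore   # 8 W left for the cores
    per_module = 35.0 / 4   # W, guessed CPU-side draw per module in a 35W part
    print(f"{cpu_budget:.1f} W CPU budget -> ~{cpu_budget / per_module:.1f} modules")
    # -> roughly one module's worth of 35W-class power, hence the question above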
 
The 17W APU shown at CES is 2-module/4-core. It's been confirmed in several places, and the pictures taken by the press show the 17W BGA chip is exactly the same size as the 35W laptop socket version.

AFAIK, what AMD said is that they will only be targeting the sub-$500 notebook market, as they'd have a very hard time competing with Intel in the high-end/high-margin market. To me, this means the 2-module part is going into sub-$500 machines, as AMD simply isn't going to compete in the market for higher-priced models.


What is still unknown is how much power the rest of the system will consume. IIRC Trinity has a 2-channel DDR3 interface, which should consume a bit more than Brazos' memory subsystem, and we've yet to learn how much Trinity's "southbridge" consumes. This means a 17W Trinity laptop may have substantially lower battery life than an 18W Brazos one.
 
Yes, but Bulldozer is the CPU only. Trinity must also integrate the GPU, and the GPU is a considerable power consumer within the package. If you 'only' cut the CPU in half, you'd still be over the 17W power budget. The core count will necessarily drop to roughly a quarter of Bulldozer's 8-core design in order to leave enough power envelope for the GPU.

That assumes:
no advancement in process,
no advancement in power consumption from Bulldozer to Piledriver,
no advancement/optimization in the floor plan,
ignoring the power consumption of the L3 (which Trinity drops),
and no advancements in power sharing between CPU and GPU.

I wouldn't be writing off 4 cores just yet...
 
I wonder if they ever considered doing a version of Trinity on TSMC's 28nm process. The excellent yields for 28nm (and the high clock headroom for the 7970) make me think it might be worth a shot to branch out for the subsequent high-end design, for the capacity alone, despite the different design rules at the two fabs.

In any case, the excellent 28nm process at TSMC should at least make the next-gen Jaguar-core-based Fusion quite interesting. I'm thinking they could get a massive IPC boost by running the cache at full speed but powering it up more judiciously when needed. I also wish more of the lower-end Fusion netbooks could get nicer IPS panels as options, at least.
 
That assumes:
no advancement in process,
no advancement in power consumption from Bulldozer to Piledriver,
no advancement/optimization in the floor plan,
ignoring the power consumption of the L3 (which Trinity drops),
and no advancements in power sharing between CPU and GPU.

I wouldn't be writing off 4 cores just yet...

Based on all currently available data regarding AMD's lithography processes, voltage needs and power draw, I see no reason to amend my statement yet.

If you can show me anything that AMD is currently producing that can change my mind regarding the power they're chewing through and the performance they're not getting from it, then please feel free to do so.

The 17W APU shown at CES is 2-module/4-core. It's been confirmed in several places, and the pictures taken by the press show the 17W BGA chip is exactly the same size as the 35W laptop socket version.
I saw one place claim it was 17W, and a bunch of others parrot that claim. I've seen no confirmation of that anywhere (i.e., a measurement of power draw from the unit in some definable way).

I can take a picture of a 17W Sandy Bridge package and a 45W Sandy Bridge package and you will never be able to tell the difference. You know why they will look the same? Because they are the same. A picture of a CPU die tells you nothing about power consumption, so when you come back with your double-confirmed proof, please be aware that a picture of a BGA package isn't part of that proof.

Nowhere am I saying that AMD cannot accomplish a 17W Trinity, but I am saying that nothing we have to go on today indicates that it will be a 'fully featured' part.
 
Looks like Trinity has been delayed by at least a quarter. AFAIK, back in Q3'11 AMD stated that it would be out in "early" 2012. The revised mid-year launch date suggests that they may have needed another spin.
 
I saw one place claim it was 17W, and a bunch of others parrot that claim. I've seen no confirmation of that anywhere (i.e., a measurement of power draw from the unit in some definable way).
(...)
I am saying that nothing we have to go on today indicates that it will be a 'fully featured' part.

Then watch the laptop-inside-a-desktop-case video, where the AMD rep clearly states it is the 17W model running, and that it is a quad-core.


I can take a picture of a 17W Sandy Bridge package and a 45W Sandy Bridge package and you will never be able to tell the difference. You know why they will look the same? Because they are the same.

WTF?
Trinity, top to bottom: desktop socket, laptop socket, subnotebook BGA [photo]


Ivy Bridge [photo]

A picture of a CPU die tells you nothing about power consumption, so when you come back with your double-confirmed proof, please be aware that a picture of a BGA package isn't part of that proof.

Oooooh, I get it now... this is a conspiracy drama after all.
So AMD openly tells the world+dog they have a 2-module, 17W Trinity up and running but you think they may be blatantly lying to everyone, hence the "there's no proof" argument.

And when was the last time AMD lied about a CPU's power consumption?


Nowhere am I saying that AMD cannot accomplish a 17W Trinity
And this is the safeguard, just in case you're wrong.
Okay.
 
Here is the video:

http://www.youtube.com/watch?v=lsmTDb-Mlws

Nowhere does he say 17W. Interestingly, he specifically mentions "almost half the power" -- if the processor was truly 17W, he wouldn't say "almost half", he'd say "less than half!" as any good PR parrot should. His choice of wording insinuates that it's actually above half the power, and the lowest powered offering AMD currently has is the 35W Llano. Does that mean 18W, or does it mean 20W, or does it mean even more and the origin point for "half" was a higher model?

You're welcome to call my bluff, watch the video, and point out the timestamp where the AMD rep calls out the 17W rating.

He mentions quad core, which means two modules. He mentions that it can play a video game, play back a video, and encode video simultaneously. We have no performance data on the game (what rez? what settings? what features are enabled?), no performance data on the encode (what video rez? what bitrate? audio format? what source media? how fast is it actually encoding? I can encode video on one core of my Q9550 and continue to play games without issues just as easily as he could; it will just go slowly...), and no performance data on the video that is actually playing (video rez? bitrate? audio stream?).

You have shown me nothing that I didn't already know, and further have shown no proof of 17W.

Rather than attacking ME, why don't you focus your effort on supporting your argument with details and facts that would answer some of my (very sorely obvious) misgivings and questions about your statements, or else stop making blatant statements without any factual support.

If you theorize that Trinity 17W is fully capable of doing all of this, it is your prerogative. Keep in mind that it is not my prerogative to see things in the same way you do.
 
What's your point? That a PR manager didn't mention the exact value during a 120-second shot? There are official AMD slides stating 17W TDP for the FP2 BGA Trinity. AMD can achieve it quite easily: they can adjust the GPU's TDP via PowerTune to whatever value they need. I think they'll do some kind of CPU/GPU TDP balancing, too (like Intel does).
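
A minimal sketch of what such CPU/GPU TDP balancing could look like (hypothetical numbers and interface; this is not AMD's actual PowerTune algorithm):

    # Sketch of budget balancing between CPU and GPU under a shared package TDP.
    PACKAGE_TDP = 17.0  # W, shared budget for the whole APU (illustrative)

    def rebalance(cpu_demand_w: float, gpu_demand_w: float):
        """Clamp the combined draw so it never exceeds the package TDP."""
        total = cpu_demand_w + gpu_demand_w
        if total <= PACKAGE_TDP:
            return cpu_demand_w, gpu_demand_w
        scale = PACKAGE_TDP / total  # throttle both domains proportionally
        return cpu_demand_w * scale, gpu_demand_w * scale

    print(rebalance(10.0, 12.0))  # -> (~7.7, ~9.3): both trimmed to fit 17W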

I believe you missed the point of the demonstration. Movie playback is performed by the UVD processor, media encoding is performed by the VCE processor, and the game runs on the 3D core (racing games aren't CPU-demanding; there's very simple AI and very simple physics). The low CPU utilization is fairly good proof of that. You don't need a fast CPU for such a demonstration, because all these tasks are performed by dedicated hardware.
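
That "low CPU utilization" point is easy to sanity-check on any machine, by the way; a minimal sketch, assuming the third-party psutil package is installed:

    import psutil

    # Sample overall CPU utilization once a second while the demo workloads
    # run; hardware-offloaded decode/encode should keep these numbers low.
    samples = [psutil.cpu_percent(interval=1) for _ in range(5)]
    print(f"average CPU utilization: {sum(samples) / len(samples):.1f}%")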
 
What's your point? That a PR manager didn't mention the exact value during a 120-second shot? There are official AMD slides stating 17W TDP for the FP2 BGA Trinity. AMD can achieve it quite easily: they can adjust the GPU's TDP via PowerTune to whatever value they need. I think they'll do some kind of CPU/GPU TDP balancing, too (like Intel does).
My point is twofold: One, while there is certainly a 17W Trinity, I do not believe that specific part was the demo model shown in that video. Two, claims of "I TOLD YOU IT WAS SEVENTEEN WATTS, DIDN'T YOU SEE THE VIDEO AND ALL THE PEOPLE CONFIRMING IT VIA THE VIDEO?!?!?!?" are obviously bunk, as no such claim was made in the video. On a tangent to that last point, PR guy or not, someone showing off that part during CES will absolutely know the sales pitch exactly. And the sales pitch absolutely included power consumption, as it was a large portion of the sell: an ultrathin laptop doing ALL of this awesomesauce... He was quite purposefully saying "almost" half power, so that he's not lying when it isn't half power.

I believe you missed the point of the demonstration. Movie playback is performed by the UVD processor, media encoding is performed by the VCE processor, and the game runs on the 3D core (racing games aren't CPU-demanding;
No, I got all of that. Nowhere did I claim tomfoolery or shenanigans on the part of that demonstration; I am reasonably convinced all three of those items were indeed happening in parallel and on the laptop device.

What I'm not convinced of is whether this is a great way to demonstrate the "power of Trinity." Given what you described (dedicated hardware for basically all of it), you might reasonably expect Ivy Bridge to pull off the same capabilities. Hell, I would reasonably expect Llano to be able to pull off that same stunt, given a bit of 'tweaking' to the various data streams going in and coming out of that box. Of course, Llano would be doing it at 35W or more...

Again, we have NO data on:
  • Power consumption of that box
  • Performance or quality data on the video encoding
  • Performance or quality data on the video playback
  • Performance or quality data on the game being played

If I'm encoding 640x480 video from my smartphone, while playing a DX11 (on paper? what makes it a full DX11 implementation?) racing game at 800x600 with no AA and limited AF, while playing back an NTSC DVD that was ripped to the local drive, I would expect a Llano (or gasp, even a Sandy Bridge) to get away with that pretty easily. I might be able to get my i5-520m to almost get away with it, depending on how 'truly' DX11 that video game is.

None of that ultimately matters. My initial point still stands: given what we know about AMD's current CPU and GPU architectures and what they've told us is going into Trinity, I have no reason to expect the 17W version of Trinity to be doing ALL of that work at quality levels that are meaningful. It's surely a 'cool' demo, but not really a meaningful one.
 
It was a mainstream Trinity used in the demo system, which means a 35W or 45W APU according to AMD.

http://www.youtube.com/watch?v=agJxehoSBmY

See, now that is entirely believable given AMD's current architectural and lithography capabilities. Modest gains in CPU processing power, even better gains in GPU processing power, but on a smaller lithography node so they're able to keep the power profile flat. No real imagination-stretching required to get to that conceptual point...

Edit: and at least we have slightly more detail on the video transcode and playback -- they're "high def" :D (whatever that means, hehe)
 
See, now that is entirely believable given AMD's current architectural and lithography capabilities. Modest gains in CPU processing power, even better gains in GPU processing power, but on a smaller lithography node so they're able to keep the power profile flat. No real imagination-stretching required to get to that conceptual point...

Edit: and at least we have slightly more detail on the video transcode and playback -- they're "high def" :D (whatever that means, hehe)

Trinity is 32nm. So you're getting better performance on the same process. AMD doesn't have the resources to migrate Bulldozer to 28nm bulk, and there's probably not a lot of motivation if GF's pricing is good.

I suspect that Trinity's encode/decode will be good enough. They should be able to match SNB if they choose.

David
 
Trinity is 32nm. So you're getting better performance on the same process. AMD doesn't have the resources to migrate Bulldozer to 28nm bulk, and there's probably not a lot of motivation if GF's pricing is good.

I suspect that Trinity's encode/decode will be good enough. They should be able to match SNB if they choose.

David

23.976 fps playback???:LOL::LOL:
 
Interesting, for some reason I had it in my head that Trinity was a shrink. Given this, I'm a bit skeptical again about the total performance being brought to the table alongside the fat power reduction, but I see no reason to doubt a 35W or 45W package performing that CES demo.

It was the 17W argument that I couldn't agree with.
 
Nowhere does he say 17W. Interestingly, he specifically mentions "almost half the power" -- if the processor was truly 17W, he wouldn't say "almost half", he'd say "less than half!" as any good PR parrot should. His choice of wording insinuates that it's actually above half the power, and the lowest powered offering AMD currently has is the 35W Llano. Does that mean 18W, or does it mean 20W, or does it mean even more and the origin point for "half" was a higher model?

There is an apparent flaw in your argument. TDP is not actual power consumption, it's a classification. It indicates a part's power consumption is somewhere between the top of the given TDP class (35W in this case) and the top of the class one level below (18W?). So, the "almost half" of the power consumption of a part belonging in the 35W TDP class can indeed be 17W.
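
For concreteness, the arithmetic being argued here (the 18W class boundary above is this post's assumption, not an AMD figure):

    # "Almost half" of the 35W class top, per the classification argument above.
    class_top = 35.0      # W, top of the mainstream mobile TDP class
    print(class_top / 2)  # 17.5 -> a 17W part sits within half a watt of "half",
                          # consistent with the classification argument above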
 
There is an apparent flaw in your argument. TDP is not actual power consumption, it's a classification. It indicates a part's power consumption is somewhere between the top of the given TDP class (35W in this case) and the top of the class one level below (18W?). So, the "almost half" of the power consumption of a part belonging in the 35W TDP class can indeed be 17W.

Sigh.

It doesn't matter. Each company has a different way of measuring it; an Intel 95W TDP chip will consume less than 95W, while an AMD chip typically consumes more. No matter what, AMD's own people rate the chip in that demo unit as one of their "mainstream" parts, which is not in the 17W class.

Thus, it doesn't matter how you personally want to spin it, the only people perpetuating the 17W myth are misinformed at best. Now that you know, you can help stop the myth :)
 
Arctic Cooling is suing AMD for using the name Fusion, which AC has used on its PSUs before. Apparently it's completely irrelevant that there are countless products, many older than any AC PSU, called "Fusion".

http://translate.google.com/transla...-streiten-ueber-die-Marke-Fusion-1418534.html

Shortly after, AMD announced that they're ditching the Fusion naming; FSA (Fusion System Architecture) will be called HSA (Heterogeneous Systems Architecture) instead.
http://www.bit-tech.net/news/hardware/2012/01/19/amd-ditches-fusion-branding/
 