Trinity vs Ivy Bridge

There is an apparent flaw in your argument. TDP is not actual power consumption, it's a classification. It indicates a part's power consumption is somewhere between the top of the given TDP class (35W in this case) and the top of the class one level below (18W?). So, the "almost half" of the power consumption of a part belonging to the 35W TDP class can indeed be 17W.

Um...not even close.

The TDP is the highest allowable average power consumption over a thermally significant period of time. It's how you design your heat sink.

So a 35W chip is guaranteed to have an average power of 35W or less over a thermally significant time frame. The average power might be 10W, it might be 5W or it might be 34.5W.

However, it's quite common for the power consumption (and dissipation) to be substantially higher over short periods of time, due to variations in current and voltage.

The real world power draw depends on a lot of factors such as the CPU design, any DVFS, temperature, cooling, etc.
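
To picture the distinction, here's a toy illustration (all numbers are made up; this is not any vendor's actual power-management algorithm):

Code:
# Hypothetical 35W part: instantaneous samples may spike past TDP,
# but the average over a thermally significant window must not.
TDP_W = 35.0
samples_w = [20, 21, 45, 19, 20, 44, 18, 20, 21, 19]  # made-up per-second samples

avg_w = sum(samples_w) / len(samples_w)
print(f"peak sample: {max(samples_w)}W (briefly above TDP)")       # 45W
print(f"10s average: {avg_w:.1f}W, within TDP: {avg_w <= TDP_W}")  # 24.7W, True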

David
 
an Intel 95W TDP chip will consume less than 95W; an AMD chip typically consumes more.

NO, it's absolutely not true, it's indeed the other way around! Intel bases its classification on weighted average consumption, so you will find Intel CPUs with, e.g., a 130W TDP that really consume 150W at high loads. AMD, on the other hand, bases its classification on absolute maximal [averaged over short time frames] consumption [at official clock rates], so you won't find any AMD CPU that consumes more than its given TDP at any time [for longer than a split second, except with overclocking, of course]! Just look around, it's been common knowledge for a decade. "Sigh."

Thus, it doesn't matter how you personally want to spin it
Maybe it's you who spins things, according to your allegiance to the maker of the CPU you own...

Um...not even close.
What is not even close to what? Are you aware of the antecedents? An AMD representative said the new super-low-consumption 4-core Trinity consumes "almost half" of what a certain part with a 35W TDP does, and some say it then cannot be 17W (because 35/2 > 17), which is also said to be its (maximal, etc.) consumption. And I've said there is no contradiction if one considers that TDP is a classification.

The TDP is the highest allowable average power consumption over a thermally significant period of time. It's how you design your heat sink. (...)

I know all that, but how does it invalidate what I wrote? I was obviously speaking about averaged maximal consumption under normal circumstances, if you're just splitting hairs. Or do you perhaps deny there are TDP classes (at least at AMD)? Then how would you explain that there are ranges of CPUs with different clock rates, and indeed different averaged-maximal-consumption-under-normal-circumstances (by measurement), still with the same given TDP? There are TDPs of 45W, 65W, 89W, and so on, but not, e.g., 55.6W, because that would be classified as 65W... (To ease logistics or whatever.)

So, I stand by what I wrote: the "almost half" of the consumption of an AMD part classified as "35W TDP" (so with an actual averaged-maximal-consumption-under-normal-circumstances somewhere between 18W and 35W!) can indeed be 17W. Simple as that.
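
A sketch of this classification reading (the class ceilings below are assumed for illustration; they are not an official AMD list):

Code:
import bisect

TDP_CLASSES_W = [18, 25, 35, 45, 65]  # assumed class ceilings, illustration only

def tdp_class(measured_w):
    """Round a measured sustained draw up to the next class ceiling.
    (Assumes the draw fits within the listed classes.)"""
    return TDP_CLASSES_W[bisect.bisect_left(TDP_CLASSES_W, measured_w)]

print(tdp_class(34.0))  # 35 -> a part drawing 34W is labelled "35W TDP"
print(tdp_class(17.0))  # 18 -> and 17W is indeed "almost half" of 34W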
 
NO, it's absolutely not true, it's indeed the other way around! (...) you won't find any AMD CPU that consumes more than its given TDP at any time! (...)

So, I stand by what I wrote: the "almost half" of the consumption of an AMD part classified as "35W TDP" can indeed be 17W. Simple as that.

You need to read AMD's definition of TDP. What are PVT? What is max current?

What I'm telling you is that you cannot assume that just because TDP dropped by 2X that the average power dropped by 2X. That's just silly.

David
 
What I'm telling you is that you cannot assume that just because TDP dropped by 2X that the average power dropped by 2X. That's just silly.
Yes, that would be silly, but I didn't even say that! Please read more carefully! The 35W was a TDP class number, but the 17W wasn't (I assume). But even if it were, it's still possible that its actual averaged-maximal-consumption-under-normal-circumstances is more than half that of a 35W TDP part. So, what's the problem?

PS: why don't you snip the big parts of quoted text that aren't related to your answer (and weren't even addressed to you)?
 
@french toast: I don't think that was me. I don't even know what "hm1" is.

@Albuquerque: Regarding AMD TDP vs. Intel TDP (/Max power/Sustained power):
http://www.semiaccurate.com/forums/showpost.php?p=7949&postcount=86

-----

Well, there is an official slide from AMD:
[AMD slide: trinity17w.jpg]
 
Well, there is an official slide from AMD:

And here comes Albuquerque saying it's still not really explicit that the 17W part is a quad-core and the world+dog is just making up facts based on a vague slide.
Or something like that.



Um...not even close.

The TDP is the highest allowable average power consumption over a thermally significant period of time. It's how you design your heat sink. (...)

David

As dess pointed out, that's Intel's TDP measurement, not AMD's:

http://www.anandtech.com/show/2807/2

AMD measures TDP by multiplying voltage and current at the part's electrical maximum (I don't really know how they achieve this, but it's probably by using unrealistically high loads generated by a power virus of some sort).
An AMD 17W TDP CPU/APU will never consume beyond 17W (either average or peak), unless overclocked.
TBH, it's a bit silly to call it Thermal Design Power, since the term originates from sizing cooling systems, but it seems AMD just lazily uses this widely known term for an actually more demanding characteristic.
By the way, this could also be an explanation as to why AMD's Turbo function is so modest.
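
A minimal sketch of the method as described (the voltage and current figures are invented placeholders, not AMD specifications):

Code:
V_MAX = 1.2    # assumed worst-case core voltage (V), placeholder
I_MAX = 14.2   # assumed electrical-maximum current (A), placeholder

# AMD-style TDP as described above: an electrical ceiling, not an average.
tdp_w = V_MAX * I_MAX
print(f"{tdp_w:.1f}W")  # ~17.0W -- a hard cap, leaving little turbo headroom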


An Intel 18W TDP CPU may consume a lot more during an instant.
Intel's TDP is more realistic, though.
Nonetheless, Albuquerque's statement about AMD consuming more than the announced TDP is flat out wrong.



Of course, the actual difference between AMD's and Intel's TDP is probably negligible for practical usage regarding battery life and cooling systems.




@ dess,
Don't worry, newbie "friendly bullying" isn't that uncommon in these parts, but the forum is still worth it!
Just don't let the frightening number of posts make you doubt your knowledge ;)
 
@ToTTenTranz: Thanks! :) I've also edited some bits about it into my last post. (In some cases there can be a huge difference! That's why AMD introduced the ACP metric, as the one comparable to Intel's TDP. They don't use it on desktop/mobile parts, though.)
 
It's very difficult to guarantee there won't be some transient spike over TDP. That's why TDP is averaged over a longer period of time.

Looking at the explanations I've seen, AMD's TDP is not an average but an absolute maximum value.


I thought Bulldozer's Turbo Core allowed it to ramp higher than TDP for a short period, but I cannot source that.

Llano, however, apparently does.
http://www.anandtech.com/show/4444/amd-llano-notebook-review-a-series-fusion-apu-a8-3500m/4

I've seen the following statement in that article:
Like Sandy Bridge, Llano is able to temporarily exceed the APU's maximum TDP if it determines that the recent history of power consumption has been low enough that it'll take a while for the APU to ramp up to any thermal limits.

But that doesn't seem to match the description of AMD's TDP, and I haven't seen a single official slide/document supporting this claim.
Reading this and other articles also shows how modest the turbo overclock values are, which further implies that there's a very strict AMD-TDP limitation, not an AMD-ACP/Intel-TDP limitation as suggested by that sentence.
 
At the Abu Dhabi Techday, there actually was a slide in the "morning session" that could be interpreted as Llano being able to exceed its maximum TDP. A case was given where both the GPU and CPU power budgets have high priority, as in a load-balancing OpenCL app. Both TDP budgets together would be larger than the chip-level TDP.

On that slide, there is a remark that CPU power reduction in such a case takes place on a temperature basis, while at the same time an embedded system controller could limit the CPU cores to less than P0.
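
A toy model of what that slide seems to describe (the budgets, limits, and the policy itself are guesses for illustration, not the actual Llano controller):

Code:
CHIP_TDP_W = 35.0     # assumed chip-level TDP
CPU_BUDGET_W = 25.0   # assumed CPU budget; deliberately sums with the GPU
GPU_BUDGET_W = 18.0   # assumed GPU budget; budget to more than the chip TDP
TEMP_LIMIT_C = 95.0   # assumed thermal trip point

def cpu_allowance_w(die_temp_c):
    # Cool die: both budgets run in full, even though they over-commit the TDP.
    if die_temp_c < TEMP_LIMIT_C:
        return CPU_BUDGET_W
    # Hot die: claw the over-commitment back from the CPU cores (< P0).
    return CPU_BUDGET_W - (CPU_BUDGET_W + GPU_BUDGET_W - CHIP_TDP_W)

print(cpu_allowance_w(80.0))  # 25.0 -- full CPU budget while cool
print(cpu_allowance_w(96.0))  # 17.0 -- throttled once the limit is reached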

Maybe one of the AMD guys reading here can clarify.
 
And here comes Albuquerque saying it's still not really explicit that the 17W part is a quad-core and the world+dog is just making up facts based on a vague slide.
I'm sorry, did you provide proof that this is the chip in the CES demo?

If not, then please carry on with your unfounded personal attacks while utterly proving my point. You're far too interested in attacking me and not interested enough in proving your statement that the CES demo model was explicitly claimed to be a 17W part by AMD.

AMD never claimed that their CES demo ran on a 17W chip, which is why you cannot prove it. The fact remains, contrary to the opinion that you continue to espouse as fact: AMD's own rep explicitly stated the laptop-in-a-desktop-case was using a 'mainstream' (their word, not mine) part, which (per AMD's own definition) means it wasn't a 17W processor, which means it has no bearing on or relation to Dess's ULV slide.

Do you wish to continue avoiding this topic, or are you more interested in continuing to take little pot-shots at me?

Because so far, I'm 2-0 on proving you wrong, and you're 0-2 for proving yourself right. Want to make it best of five?
 
NO, it's absolutely not true, it's indeed the other way around! (...) Just look around, it's been common knowledge for a decade. "Sigh."

Really? And your only backup for this claim is a Semi-Accurate forum post? This sounds like a job for science. Or maybe for a source with a proper way to measure the CPUs' power consumption, say, for example, AnandTech?

Total power consumption of example systems at idle: http://www.anandtech.com/bench/CPU/63
Total power consumption of example systems, at full load (H264 encode): http://www.anandtech.com/bench/CPU/64

By diff'ing these two tables, we can loosely extrapolate CPU power consumption under load. Certainly RAM will be a consumer under load, and part of it will also be related to power losses in the VRM circuitry on the boards. Thus, our comparo will not be exact, but should let us identify if there are any common patterns...

Intel Core i7 2600K: 74 (idle) vs 127 (load) = 53W draw, rated at 95W TDP (actual vs TDP: 55.8%)
AMD Phenom II X6 1090T: 88.5 (idle) vs 201 (load) = 112.5W draw, rated at 125W TDP (actual vs TDP: 90%)

Intel Core i7 980X: 79.5 (idle) vs 185.5 (load) = 106W draw, rated at 130W TDP (actual vs TDP: 81.5%)
AMD Athlon II X4 635: 79.5 (idle) vs 179.3 (load) = 99.8W draw, rated at 95W TDP (actual vs TDP: 105%)

I picked the hottest Intel chips across two generations for the above, just to make it as bad as possible for my comparo. Unfortunately Anand doesn't have a decade of idle versus load power draw figures, but given the several data points available on that page, scientific measurement of draw versus TDP seems to be supporting my claims rather than yours.
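
The same arithmetic, spelled out (wall-power figures as quoted above from the AnandTech tables; all the method's caveats still apply):

Code:
systems = {  # name: (idle wall W, H264-load wall W, rated TDP W)
    "Core i7 2600K":      (74.0, 127.0, 95),
    "Phenom II X6 1090T": (88.5, 201.0, 125),
    "Core i7 980X":       (79.5, 185.5, 130),
    "Athlon II X4 635":   (79.5, 179.3, 95),
}

for name, (idle_w, load_w, tdp_w) in systems.items():
    draw_w = load_w - idle_w  # crude CPU estimate: load minus idle
    print(f"{name}: ~{draw_w:.1f}W of {tdp_w}W TDP ({draw_w / tdp_w:.1%})")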

Care to rebut?
 
Really? And your only backup for this claim is a Semi-Accurate forum post? This sounds like a job for science. (...) Scientific measurement of draw versus TDP seems to be supporting my claims rather than yours.

Care to rebut?
Your examples are desktop parts.

http://www.tomshardware.com/reviews/a8-3500m-llano-apu,2959-22.html

[Chart: Power - 3DMark]



The power use in the above graph is a result of a controlled test on an external monitor, so we repeated this metric again, this time using the laptop's own display. The A8-3500M laptop lasted two hours and 12 minutes. Assuming the Intel laptop used the exact same battery, it would run for one hour and 22 minutes.
This is very impressive. Not only does the A8-3500M get about twice as much time out of its battery, it does so while delivering far better graphics performance. The implications of this are profound: a Llano laptop user might be able to play a mainstream 3D game for an entire two-hour flight with decent frame rates, while the Intel Core i5-based platform would only last for half of the flight with choppy performance. There does, in fact, seem to be validity in AMD's excitement over its improved power story, and of course this is a real advantage when it comes to mobile devices.
 
I'm sorry, did you provide proof that this is the chip in the CES demo?

So first you say every tech news site is going through some sort of mass hysteria because there is no "hard info" about a 4-core 17W Trinity, even with dailynews and others claiming such a chip was mentioned during an interview with AMD.

Then you see how horribly wrong you were so you just decide to focus on the CES demo, where we have one guy stating it's a 35W chip and another stating it's a 17W chip.

Oh... sorry, it's not 17W, as that's an utter lie. He said "half the TDP", so it's 17.5W.
And gawd have mercy on people calling 17.5W on a 17W TDP chip!

But suuure, you get all the points you want.
2 - 0? Make that 17.5 - 0, if you will.



(...)
Care to rebut?

You think subtracting the total system's idle power from the total system's load power equals total CPU power usage.
I'm actually trying to count how many ways this can be rebutted...


What's the PSU's efficiency?
Let's say we're talking about a very good PSU with a high-end rectifying circuit (you know, those thingies that turn AC into DC, but not the rock band) that does some 87%. It even changes according to the power load, but let's not even go that way.
Then you can multiply all those values by 0.87 and you may get a number that's a bit closer to reality.
And only then can you subtract the RAM usage, the motherboard's voltage regulators, HDD/SSD consumption, and some other stuff that kicks in when going from idle to load.

I'm pretty sure you'll reach the happy conclusion that the Athlon II 635 is pretty far away from its announced TDP of 95W.
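
A sketch of that correction (the 87% efficiency is the figure assumed above; the component overhead is a made-up placeholder):

Code:
PSU_EFF = 0.87                 # assumed wall-to-DC efficiency
wall_delta_w = 179.3 - 79.5    # Athlon II X4 635, load minus idle at the wall

dc_delta_w = wall_delta_w * PSU_EFF  # ~86.8W actually delivered as DC
other_w = 8.0                        # made-up: RAM, VRM losses, drives under load
cpu_w = dc_delta_w - other_w

print(f"~{cpu_w:.1f}W")  # ~78.8W -- comfortably under the 95W TDP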


Besides, I'm not sure why you put some Intel values in there.
Just in case you didn't read properly: no one said Intel's CPUs often surpass their TDP in averaged results.
What was said (and you can consult Intel's documentation to confirm it) is that Intel CPUs often surpass their TDP for brief instants, when the system considers it unimportant for battery life and heat output.


BTW, it's a bit funny how you called out dess's SemiAccurate reference while ignoring my AnandTech reference in the following post, claiming the exact same thing. And then you made a post using AnandTech results.
 
@Albuquerque: Your calculations are wrong. The CPU's full-load consumption does not equal full system load minus idle system (that's only the excess over idle [plus the loaded-system overhead]); it equals full system load minus idle system plus idle CPU consumption minus a few watts (chipset, RAM, etc. under load). The idle CPU consumption is not known here. (Nor is that last term.) So we can't tell the CPU's full-load consumption.
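
The correction, written out (the idle-CPU and overhead terms are unknowns; the values here are placeholders just to show how the estimate moves):

Code:
system_load_w = 179.3   # full-system load at the wall (from the tables above)
system_idle_w = 79.5    # full-system idle
cpu_idle_w = 15.0       # placeholder -- not published in the article
overhead_w = 5.0        # placeholder -- chipset/RAM/etc. extra draw under load

# Load minus idle only gives the EXCESS over idle; the CPU's own idle draw
# must be added back, and the non-CPU load overhead taken out:
cpu_load_w = (system_load_w - system_idle_w) + cpu_idle_w - overhead_w
print(f"~{cpu_load_w:.1f}W")  # ~109.8W with these placeholder terms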

If you want direct CPU power consumption measurements, take a look at lostcircuits.com. I don't remember a single case there where an AMD CPU consumed more than its TDP. Unlike with Intel. Although, yes, it's usually not the case here either (turbo switched off, or inconsistency in following their own rules?).

Also, those Maximal and Sustained power numbers in the cpu-world.com table were official. Sad that they later got out of the habit of publishing those along with TDP.

He said "half the TDP" so it's 17,5W.
And gawd have mercy on people calling 17,5W on a 17W TDP chip!
Well, it's 17W TDP, so it's 17W or even less. (See what I've wrote about TDP classes.)
 
So first you say every tech news site is going through some sort of mass hysteria because there is no "hard info" about a 4-core 17W Trinity (...)
The link you provided had the AMD rep say something about "it can use as little as half power", but never said what he was comparing it to. A later poster, here in this very thread, linked you to the same AMD booth at CES where the AMD rep clearly states it's a mainstream part, meaning a >=35W design.

Let me help you remember:
It was a mainstream Trinity used in the demo system, which means a 35W or 45W APU according to AMD.

http://www.youtube.com/watch?v=agJxehoSBmY

You've provided no links; in fact, you've provided no evidence ANYWHERE except for your continued blather in this thread. Your very next reply should contain links to support your argument, or else I will simply assume that you have relented and agree that your viewpoint is unrealistic and unsupportable.

You think subtracting the total system's idle power from the total system's load power equals total CPU power usage.
Just like the rest of your reading comprehension, apparently, you skip any of the important bits and just go wildly assuming. Let me help you, yet again:
Certainly RAM will be a consumer under load, and part of it will also be related to power losses in the VRM circuitry on the boards. Thus, our comparo will not be exact, but should let us identify if there are any common patterns...
Hmm, that looks like me suggesting that it will not be a perfect result. But here's the kicker: Anand built this table using the same 'common' hardware -- same RAM, same video card, same PSU, same drives, et al. The only specific points of variation were ones of necessity -- can't test a Phenom II in a Socket 1155 board, can we?

So, while I've already specifically mentioned we cannot use these values as any sort of direct consumption measurement, we can use them to look for trends when evaluating a load that is CPU-specific (for example, H264 encoding using a software encoder -- which is exactly the benchmark I selected).

Dess wanted to tell me that AMD consumes less than TDP across the board, and does so much better than Intel. Given the data points provided, regardless of the CPU consumption being a "pure" number or not, the pattern is that AMD is not doing as well as Intel on that front.

If you would like to present any additional factual data that can refute what I've found, please go ahead!
 
Dess wanted to tell me that AMD consumes less than TDP across the board
Yes, it does. We gave you several references already.

and does so much better than Intel.
No, I didn't say such a thing. It's all relative to their own TDP.

Yes, lately Intel overrates many of their chips' TDPs, but it wasn't this way all along (certainly not in the P4 era, e.g.), and there are still cases where the official TDP is significantly lower than the actual maximal consumption (look at SB-E, e.g.). It seems Intel is quite inconsistent in this matter, playing with the TDP the way they like.

Given the data points provided, regardless of the CPU consumption being a "pure" number or not, the pattern is that AMD is not doing as well as Intel on that front.
1. Your numbers aren't just not "pure", they're plain wrong. See my last post (which you've somehow missed, it seems) for why.
2. A few cherry-picked examples don't disprove the above, anyway.
 
Aha! Look what I have found: power draw figures for a HUGE pile of processors, measured at the ATX12V connector :)

http://www.behardware.com/articles/842-12/amd-fx-8150-and-fx-6100-bulldozer-arrives-on-am3.html

On that first bar graph, if you hover your mouse over the link that reads "ATX12V", the chart will change to direct power draw readings for each unit -- at idle, single-threaded and fully-threaded benchmarks. I tried to group them into competing pairs -- the i7 competes with the 8150, the Phenom II competes with the C2Q boxes.

Code:
			Idle	Load	TDP	Load / TDP (in %)
Intel Core i7 2600K	3.6W	63.6W	95W	66.9%
AMD FX-8150		4.8W	111.6W	125W	89.3%

AMD Phenom II X4 980	7.2W	82.8W	125W	66.2%
Intel Core2Quad Q9650	7.2W	58.8W	95W	61.9%

So, what does this give us? Looks relatively flat to me, give or take. There doesn't appear to be any specific winner or loser in the "ZOMG THEY TOTALLY CHEAT AT TDP" department.

Why are we discussing this, anyway? The topic is Trinity, and the point I have been making this whole time is that I don't believe the CES demo booth used a 17W processor. ToTTenTranz wanted to tell me it was certified by AMD at the show; turns out that AMD called it a "mainstream" version.

Dess shows up telling me how TDP is way overstated on AMD and way understated on Intel. The above raw data shows this is not the case. Dess then pops out with an AMD marketing slide for Trinity ULV, indicating the 17W Trinity model is available (which I also mentioned I never doubted; only that the CES demo wasn't one of them). That slide goes a long way toward proving my point also -- the slide was made to differentiate 'ULV' Trinity options from 'mainstream' Trinity options. So again: AMD telling us it's a 'mainstream' part at CES, and that slide showing us that the 17W models are being billed as 'ULV' and not 'mainstream'.

So, CES != 17W part. AMD's TDP is no more or less 'pure' than Intel's, and AMD's power consumption as a ratio of TDP is similarly no more or less 'pure' than Intel's.

I'm done with this off-topic nonsense in this thread. If you want to continue discussing TDP versus power consumption, let's start a new thread and go at it. If you want to continue discussing how TDP is a 'classification' rather than a meaningful metric, then we can start a thread for that too.

This thread should be for discussing the Trinity processor itself; the one shown at CES was >=35W. And no matter what you might think, that's a pretty damned good demonstration for a 35W part. There's no logical reason to muddy the waters by insisting it HAS to be a 17W part to be somehow more interesting. Let it stand on its own.
 