Old 25-Jan-2012, 18:51   #251
CarstenS
Senior Member
 
Join Date: May 2002
Location: Germany
Posts: 2,952
Send a message via ICQ to CarstenS
Default

At the Abu Dhabi Tech Day, there actually was a slide in the "morning session" that could be interpreted as Llano being able to exceed its maximum TDP. A case was given in which both the GPU and CPU power budgets have high priority, as in a load-balancing OpenCL app; the two TDP budgets together would be larger than the chip-level TDP.

On that slide, there is a remark that CPU power reduction in such a case takes place on a temperature basis, while at the same time an embedded system controller could limit the CPU cores to less than P0.

Maybe one of the AMD guys reading here can clarify.
__________________
English is not my native tongue. Before flaming please consider the possibility that I did not mean to say what you might have read from my posts.
Work| Recreation
Warning! This posting may contain unhealthy doses of gross humor, sarcastic remarks and exaggeration!
CarstenS is offline   Reply With Quote
Old 25-Jan-2012, 20:41   #252
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Quote:
Originally Posted by ToTTenTranz View Post
And here comes Albuquerque saying it's still not really explicit that the 17W part is a quad-core and the world+dog is just making up facts based on a vague slide.
I'm sorry, did you provide proof that this is the chip in the CES demo?

If not, then please carry on with your unfounded personal attacks while utterly proving my point. You're far too interested in attacking me and not interested enough in proving your statements that the CES demo model was explicitly claimed to be a 17W part by AMD.

AMD never claimed that their CES demo ran on a 17W chip, which is why you cannot prove it. The fact remains, contrary to the opinion you continue to espouse as fact, that AMD's own rep explicitly stated the laptop-in-a-desktop-case was using a 'mainstream' (their word, not mine) part, which (per AMD's own definition) means it wasn't a 17W processor, which means it has no bearing on or relation to Dess's ULV slide.

Do you wish to continue avoiding this topic, or are you more interested in continuing to take little pot-shots at me?

Because so far, I'm 2-0 on proving you wrong, and you're 0-2 for proving yourself right. Want to make it best of five?
__________________
"...twisting my words"
Quote:
Originally Posted by _xxx_ 1/25 View Post
Get some supplies <...> Within the next couple of months, you'll need it.
Quote:
Originally Posted by _xxx_ 6/9 View Post
And riots are about to begin too.
Quote:
Originally Posted by _xxx_8/5 View Post
food shortages and huge price jumps I predicted recently are becoming very real now.
Quote:
Originally Posted by _xxx_ View Post
If it turns out I was wrong, I'll admit being stupid
Albuquerque is online now   Reply With Quote
Old 25-Jan-2012, 21:42   #253
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Quote:
Originally Posted by dess View Post
NO, it's absolutely not true; it's indeed the other way around! Intel bases its classification on weighted average consumption, so you will find Intel CPUs with, f.ex., 130W TDP really consuming 150W at high loads. AMD, on the other hand, bases its classification on absolute maximal [averaged on short time-frames] consumption [at official clock rates], so you won't find any AMD CPU that ever consumes more [for longer than a split second, except with overclocking, of course] than its given TDP! Just look around; it's been common knowledge for a decade. "Sigh."
Really? And your only backup to this claim is a Semi-Accurate blog post? This sounds like a job for science. Or maybe with a proper way to measure direct power consumption of the CPU's, say for example, Anandtech?

Total power consumption of example systems at idle: http://www.anandtech.com/bench/CPU/63
Total power consumption of example systems, at full load (H264 encode): http://www.anandtech.com/bench/CPU/64

By diff'ing these two tables, we can loosely extrapolate CPU power consumption under load. Certainly RAM will be a consumer under load, and part of it will also be related to power losses in the VRM circuitry on the boards. Thus, our comparo will not be exact, but should let us identify if there are any common patterns...

Intel Core i7 2600k: 74 (idle) vs 127 (load) = 53W draw rated at 95W TDP (actual vs TDP is 55.8%)
AMD Phenom II X6 1090T: 88.5 (idle) vs 201 (load) = 112.5W draw rated at 125W TDP (actual vs TDP is 90%)

Intel Core i7 980X: 79.5 (idle) vs 185.5 (load) = 106W draw rated at 130W TDP (actual vs TDP is 81.5%)
AMD Athlon II X4 635: 79.5 (idle) vs 179.3 (load) = 99.8W draw rated at 95W TDP (actual vs TDP is 105%)

I picked the hottest Intel chips across two generations for the above, just to make it as bad as possible for my comparo. Unfortunately Anand doesn't have a decade of idle versus load power draw figures, but given the several data points available on that page, scientific measurement of draw versus TDP seems to be supporting my claims rather than yours.
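The delta arithmetic in the table can be sketched in a few lines (the figures are the AnandTech numbers quoted above; the method is only the rough load-minus-idle proxy described, with all its stated caveats):

```python
# Rough proxy used above: per-system delta between wall power at load and
# at idle, taken as an estimate of CPU draw, then compared to rated TDP.
systems = {
    # name: (idle W, load W, rated TDP W)
    "Core i7 2600K":      (74.0, 127.0, 95),
    "Phenom II X6 1090T": (88.5, 201.0, 125),
    "Core i7 980X":       (79.5, 185.5, 130),
    "Athlon II X4 635":   (79.5, 179.3, 95),
}

for name, (idle, load, tdp) in systems.items():
    draw = load - idle              # wall-power delta under CPU-only load
    ratio = 100.0 * draw / tdp      # delta as a percentage of rated TDP
    print(f"{name}: {draw:.1f}W draw vs {tdp}W TDP ({ratio:.1f}%)")
```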

Care to rebut?
__________________
"...twisting my words"
Quote:
Originally Posted by _xxx_ 1/25 View Post
Get some supplies <...> Within the next couple of months, you'll need it.
Quote:
Originally Posted by _xxx_ 6/9 View Post
And riots are about to begin too.
Quote:
Originally Posted by _xxx_8/5 View Post
food shortages and huge price jumps I predicted recently are becoming very real now.
Quote:
Originally Posted by _xxx_ View Post
If it turns out I was wrong, I'll admit being stupid
Albuquerque is online now   Reply With Quote
Old 25-Jan-2012, 22:11   #254
ronvalencia
Registered
 
Join Date: Jan 2012
Posts: 8
Default

Quote:
Originally Posted by Albuquerque View Post
Really? And your only backup to this claim is a Semi-Accurate blog post? This sounds like a job for science. Or maybe with a proper way to measure direct power consumption of the CPU's, say for example, Anandtech?

Total power consumption of example systems at idle: http://www.anandtech.com/bench/CPU/63
Total power consumption of example systems, at full load (H264 encode): http://www.anandtech.com/bench/CPU/64

By diff'ing these two tables, we can loosely extrapolate CPU power consumption under load. Certainly RAM will be a consumer under load, and part of it will also be related to power losses in the VRM circuitry on the boards. Thus, our comparo will not be exact, but should let us identify if there are any common patterns...

Intel Core i7 2600k: 74 (idle) vs 127 (load) = 53W draw rated at 95W TDP (actual vs TDP is 55.8%)
AMD Phenom II X6 1090T: 88.5 (idle) vs 201 (load) = 112.5W draw rated at 125W TDP (actual vs TDP is 90%)

Intel Core i7 980X: 79.5 (idle) vs 185.5 (load) = 106W draw rated at 130W TDP (actual vs TDP is 81.5%)
AMD Athlon II X4 635: 79.5 (idle) vs 179.3 (load) = 99.8W draw rated at 95W TDP (actual vs TDP is 105%)

I picked the hottest Intel chips across two generations for the above, just to make it as bad as possible for my comparo. Unfortunately Anand doesn't have a decade of idle versus load power draw figures, but given the several data points available on that page, scientific measurement of draw versus TDP seems to be supporting my claims rather than yours.

Care to rebut?
Your examples are desktop parts.

http://www.tomshardware.com/reviews/a8-3500m-llano-apu,2959-22.html



The power use in the above graph is a result of a controlled test on an external monitor, so we repeated this metric again, this time using the laptop's own display. The A8-3500M laptop lasted two hours and 12 minutes. Assuming the Intel laptop used the exact same battery, it would run for one hour and 22 minutes.
This is very impressive. Not only does the A8-3500M get about twice as much time out of its battery, it does so while delivering far better graphics performance. The implications of this are profound: a Llano laptop user might be able to play a mainstream 3D game for an entire two-hour flight with decent frame rates, while the Intel Core i5-based platform would only last for half of the flight with choppy performance. There does, in fact, seem to be validity in AMD's excitement over its improved power story, and of course this is a real advantage when it comes to mobile devices.
ronvalencia is offline   Reply With Quote
Old 26-Jan-2012, 00:05   #255
ToTTenTranz
Senior Member
 
Join Date: Jul 2008
Posts: 3,075
Default

Quote:
Originally Posted by Albuquerque View Post
I'm sorry, did you provide proof that this is the chip in the CES demo?
So first you say every tech news site is going through some sort of mass hysteria because there is no "hard info" about a 4-core 17W Trinity, even with dailynews and others claiming such a chip was mentioned during an interview with AMD.

Then you see how horribly wrong you were so you just decide to focus on the CES demo, where we have one guy stating it's a 35W chip and another stating it's a 17W chip.

Oh.. sorry, it's not 17W as that's an utter lie. He said "half the TDP" so it's 17,5W.
And gawd have mercy on people calling 17,5W on a 17W TDP chip!

But suuure, you get all the points you want.
2 - 0? Make that 17,5 - 0, if you will.



Quote:
Originally Posted by Albuquerque View Post
(...)
Care to rebut?
You think subtracting idle power usage of the total system from load power usage of the total system equals total CPU power usage.
I'm actually trying to count how many ways this can be rebutted..


What's the PSU's efficiency?
Let's say we're talking about a very good PSU with a high-end rectifying circuit (you know, those thingies that turn AC into DC, but not the rock band) that does some 87%. Efficiency even changes according to the power load, but let's not even go that way.
Then you can multiply all those values by 0.87 and you may get a number that's a bit closer to reality.
And only then can you subtract the RAM usage, the motherboard's voltage regulators, HDD/SSD consumption and some other stuff that kicks in when going from idle to load.

I'm pretty sure you'd reach the happy conclusion that the Athlon II 635 ends up pretty far below its announced 95W TDP.
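The two-step correction described above can be sketched like this; the 87% efficiency and the 10W "other components" delta are illustrative assumptions for the sake of the example, not measured values:

```python
# Illustrative only: efficiency and component deltas are assumed, not measured.
PSU_EFFICIENCY = 0.87  # assumed AC->DC conversion efficiency

def cpu_load_estimate(wall_idle_w, wall_load_w, other_delta_w=10.0):
    """Refine the naive wall-power delta:
    1) scale by PSU efficiency to get the DC-side delta,
    2) subtract the idle->load increase of RAM, VRMs, drives, etc.
    (other_delta_w is a hypothetical placeholder)."""
    dc_delta = (wall_load_w - wall_idle_w) * PSU_EFFICIENCY
    return dc_delta - other_delta_w

# The naive estimate for the Athlon II X4 635 was 99.8W; with these
# assumed corrections, the figure lands noticeably lower:
print(cpu_load_estimate(79.5, 179.3))
```

With these (made-up) numbers the naive 99.8W drops into the high 70s, which is the post's point: the raw wall delta overstates CPU draw.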


Besides, I'm not sure why you put some Intel values in there.
Just in case you didn't read properly, no one said Intel's CPUs often surpass their TDP in averaged results.
What was said (and you can consult Intel's documentation to confirm it) is that Intel CPUs often surpass their TDP for brief instants, when the system considers it unimportant for battery life and heat output.


BTW, it's a bit funny how you called out dess' reference to SemiAccurate while ignoring my AnandTech reference in the following post claiming the exact same thing. And then you made a post using AnandTech results.
ToTTenTranz is offline   Reply With Quote
Old 26-Jan-2012, 00:12   #256
dess
Junior Member
 
Join Date: May 2005
Posts: 28
Default

@Albuquerque: Your calculations are wrong. Consumption at full load does not equal full system load minus idle system (that's only the excess over idle [+ loaded system overhead]); it equals full system load minus idle system, plus the CPU's idle consumption, minus a few watts (chipset, RAM, etc. at load). The idle CPU consumption is not known here. (Neither is the last term.) So we can't tell the full-load consumption of the CPU.
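The accounting described above can be sketched as follows; the CPU idle draw and the platform's idle-to-load delta are exactly the unknowns the post mentions, so the values in the usage line are purely illustrative:

```python
def cpu_full_load(system_load_w, system_idle_w, cpu_idle_w, platform_delta_w):
    """Full-load CPU draw per the post's accounting: the wall delta
    excludes the CPU's own idle draw but includes the extra load draw
    of chipset/RAM/etc., so add the former and subtract the latter."""
    wall_delta = system_load_w - system_idle_w
    return wall_delta + cpu_idle_w - platform_delta_w

# Illustrative values only (both unknowns are guesses, as the post notes):
# the naive 99.8W figure for the Athlon II X4 635 shifts once they're included.
print(cpu_full_load(179.3, 79.5, cpu_idle_w=15.0, platform_delta_w=8.0))
```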

If you want direct CPU power consumption measurements, take a look at lostcircuits.com. I don't seem to remember a single case there where an AMD CPU consumed more than its TDP. Unlike with Intel. Although, yes, it's usually not the case here either (switched-off turbo, or inconsistency in following their own rules?).

Also, those Maximal and Sustained power numbers in the cpu-world.com table were official. Sadly, they later got out of the habit of publishing those along with TDP.

Quote:
Originally Posted by ToTTenTranz View Post
He said "half the TDP" so it's 17,5W.
And gawd have mercy on people calling 17,5W on a 17W TDP chip!
Well, it's 17W TDP, so it's 17W or even less. (See what I wrote about TDP classes.)

Last edited by dess; 26-Jan-2012 at 00:31.
dess is offline   Reply With Quote
Old 26-Jan-2012, 02:48   #257
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Quote:
Originally Posted by ToTTenTranz View Post
So first you say every tech news site is going through some sort of mass histeria because there is no "hard info" about a 4-core 17W Trinity, even with dailynews and others claiming such chip was mentioned during an interview with AMD.
The link you provided had the AMD rep say something about "it can use as little as half power", but never said what he was comparing it to. A later poster, here in this very thread, linked you to the same AMD booth at CES where the AMD rep clearly states it's a mainstream part, meaning a >=35W design.

Let me help you remember:
Quote:
Originally Posted by Paran View Post
It was a Mainstream Trinity used in the demo system, means 35W or 45W APU according to AMD.

http://www.youtube.com/watch?v=agJxehoSBmY
You've provided no links; in fact you've provided no evidence ANYWHERE except for your continued blather in this thread. Your very next reply should contain links to support your argument, or else I will simply assume that you have relented and agree that your viewpoint is unrealistic and unsupportable.

Quote:
Originally Posted by ToTTenTranz View Post
You think subtracting idle power usage in the total system to load power usage in the total system equals total CPU power usage.
Just like the rest of your reading comprehension, apparently, you skip any of the important bits and just go wildly assuming. Let me help you, yet again:
Quote:
Originally Posted by Albuquerque View Post
Certainly RAM will be a consumer under load, and part of it will also be related to power losses in the VRM circuitry on the boards. Thus, our comparo will not be exact, but should let us identify if there are any common patterns..
Hmm, that looks like me suggesting that it will not be a perfect result. But here's the kicker: Anand was building this table by using the same 'common' hardware -- same ram, same video card, same PSU, same drives, et al. The only specific points of variation were ones of necessity -- can't test a Phenom II in a Socket 1155 board, can we?

So, while I've already specifically mentioned we cannot use these values as any sort of direct consumption measurement, we can use them to analyze for trends when evaluating a load that is CPU-specific (for example, H264 encode using a software encoder -- which is exactly the benchmark I selected).

Dess wanted to tell me that AMD consumes less than TDP across the board, and does so much better than Intel. Given the data points provided, regardless of the CPU consumption being a "pure" number or not, the pattern is that AMD is not doing as well as Intel on that front.

If you would like to present any additional factual data that can refute what I've found, please go ahead!
Albuquerque is online now   Reply With Quote
Old 26-Jan-2012, 03:32   #258
dess
Junior Member
 
Join Date: May 2005
Posts: 28
Default

Quote:
Originally Posted by Albuquerque View Post
Dess wanted to tell me that AMD consumes less than TDP across the board
Yes, it does. We gave you several references already.

Quote:
and does so much better than Intel.
No, I didn't say such a thing. It's all relative to their own TDP.

Yes, lately Intel overrates many of their chips' TDP, but it wasn't this way all the time (certainly not in the P4 era, f.ex.) and there are still cases where the official TDP is significantly lower than the actual maximal consumption (look at SB-E, f.ex.). It seems Intel is quite inconsistent in this matter, playing with the TDP the way they like.

Quote:
Given the data points provided, regardless of the CPU consumption being a "pure" number or not, the pattern is that AMD is not doing as well as Intel on that front.
1. Your numbers aren't just not "pure", they're plain wrong. See my last post (which you seem to have missed) for why.
2. A few cherry-picked examples don't disprove the above, anyway.
dess is offline   Reply With Quote
Old 26-Jan-2012, 03:37   #259
rpg.314
Senior Member
 
Join Date: Jul 2008
Location: /
Posts: 4,218
Send a message via Skype™ to rpg.314
Default

Can we stop with this shouting match and try to get back on topic?
rpg.314 is offline   Reply With Quote
Old 26-Jan-2012, 04:22   #260
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Aha! Look what I have found: power draw figures for a HUGE pile of processors, measured at the ATX12V connector.

http://www.behardware.com/articles/8...es-on-am3.html On that first bar graph, if you hover your mouse over the link that reads "ATX12V", the chart will change to direct power draw readings for each unit -- at idle, single-threaded and fully-threaded benchmarks. I tried to group them into competing pairs -- the i7 competes with the 8150, the Phenom II competes with the C2Q boxes.

Code:
			Idle	Load	TDP	Load / TDP (in %)
Intel Core i7 2600K	3.6W	63.6W	95W	66.9%
AMD FX-8150		4.8W	111.6W	125W	89.2%


AMD Phenom II X4 980	7.2W	82.8W	125W	66.2%
Intel Core2Quad Q9650	7.2W	58.8W	95W	61.8%
So, what does this give us? Looks relatively flat to me, give or take. There doesn't appear to be any specific winner or loser in the "ZOMG THEY TOTALLY CHEAT AT TDP" department.

Why are we discussing this, anyway? The topic is Trinity; the point I have been making this whole time is that I didn't believe the CES demo booth was running a 17W processor. ToTTenTranz wanted to tell me it was certified by AMD at the show; turns out that AMD called it a "mainstream" version.

Dess shows up telling me how TDP is way over-stated on AMD and way under-stated on Intel. The above raw data shows this is not the case. Dess then pops out with an AMD marketing slide for Trinity ULV, indicating the 17W Trinity model is available (which I also mentioned I never doubted; only that the CES demo wasn't one of them). That slide goes a long way to proving my point also -- the slide was made to differentiate 'ULV' Trinity options from their 'mainstream' Trinity options. So again, AMD telling us it's a 'mainstream' part at CES, and that slide showing us that the 17W models are being billed as 'ULV' and not 'mainstream'.

So, CES != 17W part. AMD's TDP is no more or less 'pure' than Intel's, and AMD's power consumption as a ratio of TDP is similarly no more or less 'pure' than Intel's.

I'm done with this off-topic nonsense in this thread. If you want to continue discussing TDP versus power consumption, let's start a new thread and go at it. If you want to continue discussing how TDP is a 'classification' rather than a meaningful metric, then we can start a thread for that too.

This thread should be for discussing the Trinity processor itself; the one shown at CES was >=35W. And no matter what you might think, that's a pretty damned good demonstration for a 35W part. There's no logical reason to muddy the waters about how it HAS to be a 17W part to be somehow more interesting. Let it stand on its own.

Last edited by Albuquerque; 26-Jan-2012 at 18:56. Reason: fixed the rating on 2600K - Thanks DavidC
Albuquerque is online now   Reply With Quote
Old 26-Jan-2012, 08:56   #261
CarstenS
Senior Member
 
Join Date: May 2002
Location: Germany
Posts: 2,952
Send a message via ICQ to CarstenS
Default

Here are measurements of just the CPU and its VRC:
http://ht4u.net/reviews/2011/intel_s...re/index17.php
CarstenS is offline   Reply With Quote
Old 26-Jan-2012, 12:32   #262
DavidC
Member
 
Join Date: Sep 2006
Posts: 304
Default

Quote:
Originally Posted by Albuquerque View Post
Idle Load TDP Load / TDP (in %)
Intel Core i7 2600K 3.6W 63.6W 95W 90.0%
AMD FX-8150 4.8W 111.6W 125W 89.2%


AMD Phenom II X4 980 7.2W 82.8W 125W 66.2%
Intel Core2Quad Q9650 7.2W 58.8W 95W 61.8%

So, what does this give us? Looks relatively flat to me, give or take. THere doesn't appear to be any specific winner or loser in the "ZOMG THEY TOTALLY CHEAT AT TDP".
You got the percentage calculation wrong on the 2600K. 63.6/95 = 67%
DavidC is offline   Reply With Quote
Old 26-Jan-2012, 16:21   #263
dess
Junior Member
 
Join Date: May 2005
Posts: 28
Default

Quote:
Originally Posted by Albuquerque View Post
On that first bargraph, if you hover your mouse over the the link that reads "ATX12V", the chart will change to direct power draw readings for each unit -- at idle, single-threaded and fully-threaded benchmarks.
You have to add some 20W to the consumption of Nehalem and most probably Sandy Bridge, because the memory controller is not powered from the separate "ATX12V" lead. link (Also note that the memory controller and much of the IO is off-chip in the case of the Core 2 CPUs.)
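As a rough sketch of that adjustment, applied to the 2600K reading from the ATX12V table earlier in the thread (the ~20W uncore figure is the post's estimate, not a measurement):

```python
# The ~20W uncore adjustment is an estimate from the post, not a measurement:
# power not routed through the ATX12V lead gets added back before comparing
# the reading against TDP.
ATX12V_LOAD_W = 63.6    # i7-2600K, fully-threaded load, from the table
UNCORE_ADJUST_W = 20.0  # assumed draw not fed by the ATX12V connector
TDP_W = 95

adjusted = ATX12V_LOAD_W + UNCORE_ADJUST_W
print(f"{adjusted:.1f}W -> {100 * adjusted / TDP_W:.1f}% of TDP")
```

With the assumed adjustment, the 2600K's 83.6W lands at about 88% of its 95W TDP, close to the FX-8150's 89.2% from the same table.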

Quote:
Why are we discussing this, anyway?
For two simple reasons. One: you have said here that "almost half" of the consumption of a 35W TDP part cannot be 17W, which is wrong. Is it clear now? Then you've said: "an Intel 95W TDP chip will consume less than 95W; an AMD chip typically consumes more." The latter is plain wrong again, right? Yes, I've already admitted that Intel usually consumes less as well, but there were and are exceptions (perhaps mainly in the Xeon line). All in all (and this is the relation between these off-topics and the matter of the 17W Trinity), if they said the part consumed "almost half" of that of a 35W TDP one, it indeed could be 17W. Could you acknowledge it already, so that we can move on? Whether they were speaking about the actual part in the notebook is another topic.

Quote:
Dess then pops out with an AMD marketing slide for Trinity ULV slide, indicating the 17W Trinity model is available (which I also mentioned that I never doubted, only that the CES demo wasn't one of them.)
I think it's obvious that all the claims on the slide refer to the same product. So it's 4-core and 17W TDP. And I see no reason why it couldn't have been in that notebook at the show, as well. Did you see what even a Brazos is capable of, with only two cores and a lesser IGP?

Quote:
IF you want to continue discussion how TDP is a 'classification' rather than a meaningful metric, then we can start a thread for that too.
It doesn't need any discussion, regarding AMD parts. It is a known thing.

Quote:
THis thread would be for discussion the Trinity processor itself
Really? All these discussions could have been saved if you were more knowledgeable about AMD's TDP.

Last edited by dess; 26-Jan-2012 at 16:31.
dess is offline   Reply With Quote
Old 26-Jan-2012, 18:52   #264
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Quote:
Originally Posted by dess View Post
You have to add some 20W to the consumption of Nehalem and most probably Sandy Bridge, because the memory controller is not powered from the separate "ATX12V" lead. link (Also note that the memory controller and much of the IO is off-chip in the case of the Core 2 CPUs.)
If you can provide better numbers, so be it. My miscalculation for the i7-2600K (mentioned above) means that, even if the measurement was off by 20W, it would still be in line with the FX-8150. Nothing you've provided shows me that Intel's and AMD's power consumption versus TDP are far separated.

Quote:
Originally Posted by dess View Post
One: you have said here that the "almost half" of the consumption of a 35W TDP part cannot be 17W, which is wrong. Is it clear now?
Did you watch the video? The AMD PR person mentioned the CES Trinity unit was half the power of an undisclosed former mobile part. He did NOT say whether his claim was about thermal design power or actual power consumption. For his claim to have any validity, he had to be using the same units of measure (comparing the TDP of a former part to the actual power of a current part would be an outright lie). Thus, it is still my opinion that the new Trinity part is NOT anything near half the power of the former part.

Further, have you not watched the (thrice-linked in this very thread) video of the AMD spokesperson telling us definitively that the Trinity display was using a mainstream mobile part? Their mainstream parts have a TDP of >=35W, and the most likely former candidate would be Llano, which is also a >=35W TDP part. My opinion still has not changed; I do not believe that Trinity is half the power of Llano. Sorry, you haven't convinced me. Mostly because you've either put up marketing slides or made passing commentary on how AMD's TDP is somehow more reflective of -- something? -- than Intel's.

Quote:
Originally Posted by dess View Post
Then, you've said: [i]"an Intel 95W TDP chip will consume less than 95W; an AMD chip typically consumes more."
Interesting, because you told me I had it exactly backwards: that Intel always consumes more and AMD always consumes less. As it turns out, I corrected my own claim when I found real data to look at:
Quote:
Originally Posted by Albuquerque
So, what does this give us? Looks relatively flat to me, give or take. THere doesn't appear to be any specific winner or loser in the "ZOMG THEY TOTALLY CHEAT AT TDP".
So I came out and said that it looks flat after all, and yet here you are telling me I'm wrong and that I never admitted it... Really? I'm pretty sure you're projecting, because:

Quote:
Originally Posted by dess
I doesn't need any discussion, regarding AMD parts. It is a known thing.<snip> All these discussions could be saved if you were more knowledgeable about AMD's TDP.
The burden of proof is on you. It is you, not I, who is making the accusation that somehow AMD's TDP is more 'relevant' than Intel's. For you to make this claim, you will now go find the requisite material to prove it. If this is such a well-known fact as you say, then there will be data overflowing from countless websites out there, although somehow I'm not finding it as easily as you suggest.

Fact: We now have three sites that make some attempt at metering CPU power consumption. In all three, when comparing against rated TDP among chips that are roughly performance-equivalent, AMD and Intel prove to be roughly equal in terms of the relation between actual consumption and TDP rating.

Fact: AMD stated the CES Trinity demo was a mainstream mobile part, not ULV.

Fact: AMD and Intel only rate processor wattage in terms of TDP, so a "17W processor" in PR terms would mean a "17W TDP Processor."

Fact: The CES demo was not run on a 17W TDP processor.

Fact: There will be 17W TDP Trinity chips

Possibility: There may not be quad-core Trinity chips at 17W TDP. Your ULV slide does indeed specify that quad core will be available in the ULV space, but they do not say that it will be in the 17W TDP profile. They say ULV "starts at 17W".

Opinion, based on what we know about GloFo and AMD's ability to build processors: Trinity at CES is likely NOT half the power consumption of the prior Llano 35W TDP part. It's the same lithography process, with significantly more transistors thanks to more GPU and CPU cores. Sure, L3 cache goes missing, but is that going to save half of the power draw? Nope. Power doesn't go down in a scenario where your lithography stays the same, your computation power goes up, clocks stay flat, and you add transistors... Although, strictly speaking, it also doesn't mean TDP had to go up either.

Last edited by Albuquerque; 27-Jan-2012 at 00:11.
Albuquerque is online now   Reply With Quote
Old 26-Jan-2012, 19:02   #265
Albuquerque
Red-headed step child
 
Join Date: Jun 2004
Location: Guess ;)
Posts: 3,266
Default

Quote:
Originally Posted by CarstenS View Post
Here's measurements of just the CPU and it's VRC:
http://ht4u.net/reviews/2011/intel_s...re/index17.php
Interesting; it lends credence to Dess's claim of ~20W possibly gone missing from the SB platform due to uncore needs that aren't fed by the ATX12V. Also interesting: that article includes FX-8150 and old Phenom II X4 and X6 data, which also appear to be partially affected when referenced against the ATX12V data I found.

To my eye, it looks like both companies draw some amount of power that isn't fed directly by the ATX12V, so again, it's looking like a (generally) flat comparo to me. I mean, even if we say only Intel does it (which isn't correct, but whatever), we'd still end up with approximately the same percentages when calculating actual draw vs TDP. Not that this artificial benchmark has any true bearing on reality, but whatever.

Quote:
Originally Posted by DavidC View Post
You got the percentage calculation wrong on the 2600K. 63.6/95 = 67%
That table was painful to create, and then I go and muck it up with a bad calculation. Bleargh! Thanks for the correction; I've updated my post...
Albuquerque is online now   Reply With Quote
Old 27-Jan-2012, 05:49   #266
ronvalencia
Registered
 
Join Date: Jan 2012
Posts: 8
Default AMD A8 3500 vs Intel Core i5-2520M

Quote:
Originally Posted by Albuquerque View Post
Sigh.

It doesn't matter. Each company has a different way of measuring it; an Intel 95W TDP chip will consume less than 95W; an AMD chip typically consumes more. No matter what, AMD's own people rate the chip in that demo unit as one of their "mainstream" parts, which is not in the 17W class.

Thus, it doesn't matter how you personally want to spin it; the only people perpetuating the 17W myth are misinformed at best. Now that you know, you can help stop the myth.
From
http://www.tomshardware.com/reviews/a8-3500m-llano-apu,2959-22.html


The power use in the above graph is a result of a controlled test on an external monitor, so we repeated this metric again, this time using the laptop's own display. The A8-3500M laptop lasted two hours and 12 minutes. Assuming the Intel laptop used the exact same battery, it would run for one hour and 22 minutes.
This is very impressive. Not only does the A8-3500M get about twice as much time out of its battery, it does so while delivering far better graphics performance. The implications of this are profound: a Llano laptop user might be able to play a mainstream 3D game for an entire two-hour flight with decent frame rates, while the Intel Core i5-based platform would only last for half of the flight with choppy performance. There does, in fact, seem to be validity in AMD's excitement over its improved power story, and of course this is a real advantage when it comes to mobile devices.
ronvalencia is offline   Reply With Quote
Old 27-Jan-2012, 10:13   #267
CarstenS
Senior Member
 
Join Date: May 2002
Location: Germany
Posts: 2,952
Send a message via ICQ to CarstenS
Default

I cannot be sure, but it seems that they compare an actual notebook with a Mini-PC, with the resulting implications for components' pricing and efficiency.
__________________
English is not my native tongue. Before flaming please consider the possibility that I did not mean to say what you might have read from my posts.
Work| Recreation
Warning! This posting may contain unhealthy doses of gross humor, sarcastic remarks and exaggeration!
CarstenS is offline   Reply With Quote
Old 27-Jan-2012, 11:16   #268
DavidC
Member
 
Join Date: Sep 2006
Posts: 304
Default

Quote:
Originally Posted by CarstenS View Post
I cannot be sure, but it seems that they compare an actual notebook with a Mini-PC, with the resulting implications for components' pricing and efficiency.
Damnit, I thought that TH review was laptop vs. laptop. Those mini desktop systems tend to use somewhat more power than laptops, because they disable some power-saving features and sometimes use desktop components.

http://www.legitreviews.com/article/1636/5/
http://techreport.com/articles.x/21099/7

The first review is interesting in that it shows 3DMark06 results when running on battery and on AC. Other sites show the performance numbers on AC, but battery life is of course measured on DC.

Similar performance with 1.6x better battery life or 40% better performance on AC.
DavidC is offline   Reply With Quote
Old 27-Jan-2012, 11:23   #269
DavidC
Member
 
Join Date: Sep 2006
Posts: 304
Default

Quote:
Originally Posted by Albuquerque View Post
Possibility: There may not be quad-core Trinity chips at 17W TDP. Your ULV slide does indeed specify that quad core will be available in the ULV space, but they do not say that it will be in the 17W TDP profile. They say ULV "starts at 17W".
Albuquerque, the slide further down the page says "Quad core and 17W." Of course, that doesn't really tell us what the CES test system used.
DavidC is offline   Reply With Quote
Old 27-Jan-2012, 11:37   #270
dess
Junior Member
 
Join Date: May 2005
Posts: 28
Default

Quote:
Originally Posted by Albuquerque View Post
Nothing you've provided shows me that Intel nor AMD's power consumption versus TDP are far-separated.
My main concern here was your claim that AMD typically under-rates their CPUs' TDP. It's already clear that's not the case: the consumption at high loads is certainly between the value of the given TDP and the TDP one class below. Now, on Intel's side, look at the i3-2100: its real high-load consumption is some 26W (your source; it's unknown whether that includes the memory controller or not), yet Intel lists it as a 65W part. On the other hand, there is the i7-880: 102W from the ATX12V (your source, again) + 20W for the memory controller (it's Nehalem) = 122W, yet Intel lists it at 95W.
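The two examples above can be put side by side with a quick calculation. The wattages are the thread's figures (ATX12V readings plus the +20W Nehalem memory-controller estimate argued here), not official data:

```python
# Measured draw vs rated TDP for the two parts discussed above.
# All wattages are the thread's figures, not vendor specifications;
# the i7-880 entry adds the post's +20 W memory-controller estimate.
parts = {
    "i3-2100": (26.0, 65),          # (measured W, rated TDP W)
    "i7-880":  (102.0 + 20.0, 95),  # ATX12V reading + IMC estimate
}
for name, (measured, tdp) in parts.items():
    print(f"{name}: {measured / tdp:.0%} of rated TDP")
```

On these numbers the i3-2100 lands far below its rating and the i7-880 lands above it, which is the asymmetry dess is pointing at.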

Quote:
Did you watch the video?
I don't have time to analyse them, so I didn't comment on those; I just corrected some claims that seemed related to the topic at hand.

Quote:
So, I came out and said that it looks flat after all, and yet here you are telling me I'm wrong and never admitted it... Really?
Not really, as I did not say you didn't admit this one; I just answered your question about why we're discussing it.

Quote:
The burden of proof is on you. It is you, not I, who are making the accusation that somehow AMD's TDP is more 'relevant' than Intel's.
No. Note that it was you who "made accusations" first, claiming that AMD's TDP is "typically" under-rated... (and that Intel's never is).

Quote:
Fact: We now have three sites that make some attempt at metering CPU power consumption. In all three, when comparing against their rated TDP among chips that are roughly performance-equivalent, AMD and Intel prove to be roughly equal in terms of the relation between actual consumption and TDP rating.
Not really.

Quote:
Fact: AMD and Intel only rate processor wattage in terms of TDP, so a "17W processor" in PR terms would mean a "17W TDP Processor."
So it's 17W at most and can be even lower. (Not more as you first claimed.)

Quote:
Fact: The CES demo was not run on a 17W TDP processor.
It's not a fact; it's more like an assumption on your part.

Quote:
Possibility: There may not be quad-core Trinity chips at 17W TDP. Your ULV slide does indeed specify that quad core will be available in the ULV space, but they do not say that it will be in the 17W TDP profile. They say ULV "starts at 17W".
And I think the claims of "All the features of a premium 35W 'Trinity' notebook" and "The only available premium quad core, low voltage APU" are implicitly true for the whole ULV line, including the 17W TDP part.

Quote:
Opinion, based on what we know about GloFo and AMD's ability to build processors: Trinity at CES is likely NOT half the power consumption of the prior Llano 35W TDP part. It's the same lithography process, with significantly more transistors thanks to more GPU and CPU cores. Sure, L3 cache goes missing, but is that going to save half of the power draw? Nope. Power doesn't go down in a scenario where your lithography stays the same, your computation power goes up, clocks stay flat, and you add transistors... Although, strictly speaking, it also doesn't mean TDP had to go up either.
The most important factor in the max. power consumption of a given part is voltage, and then clock rate. The 17W Trinity is a ULV (Ultra Low Voltage) part, most probably at a relatively low clock rate...

Also note that the max. consumption can go down even upon a respin in some cases. GloFo could also fine-tune their processes in the meantime.

Last, AFAIK the ULV and the SHP (super high performance) processes are two distinct ones.

Last edited by dess; 27-Jan-2012 at 11:57.
dess is offline   Reply With Quote
Old 27-Jan-2012, 12:03   #271
ronvalencia
Registered
 
Join Date: Jan 2012
Posts: 8
Default

Quote:
Originally Posted by CarstenS View Post
I cannot be sure, but it seems that they compare an actual notebook with a Mini-PC with resulting implications on components' pricing and efficiency.
My post was in response to the "an AMD chip typically consumes more. No matter what" comment.

The AMD A8-35x0M is rated at 35 or 45 watts.

The posted TH graphs show that a 17-watt Trinity is reachable.
ronvalencia is offline   Reply With Quote
Old 27-Jan-2012, 17:45   #272
ToTTenTranz
Senior Member
 
Join Date: Jul 2008
Posts: 3,075
Default

People, I think it's time to stop trying to correct Albuquerque.
Everyone pretty much already knows his statement about AMD going above the TDP is wrong, and that the 17W Trinity has a 2-module Piledriver.
It's just that he's obsessed with this "score" of his, and he'll just keep looping through awfully random information (like calculating average power consumption on Intel CPUs, lol) until he's somehow proven right.


Now, just to make a quick review of all the info we know so far regarding the performance of the 17W part:

Hothardware took this video during CES:
http://www.youtube.com/watch?feature...&v=lsmTDb-Mlws
At approx. 1m40s, the AMD rep. says it's a quad-core.
Regarding the TDP, the people who took the video said:
Quote:
Originally Posted by hothardware
Particulars like clock speeds and the GPU configuration weren’t disclosed, but we can tell you that the Trinity APU in the notebook used during the demo was a 17W variant, and as AMD has already disclosed, the Trinity APU sports quad Piledriver cores, an update to Bulldozer that should offer better performance through not only architectural enhancements but frequency increases as well.
Next, there's this DailyTech interview with AMD's director of global product marketing, John Taylor, where they say:
Quote:
Originally Posted by DailyTech
The chip is still built on the 32 nm process and is expected to come in at 17 watts for lower clocked ultrathin models, and a 35 watt model for traditional laptops. AMD predicts 25 percent faster CPU performance and 50 percent better GPU performance, versus Llano.
(...)
AMD was showing off one such notebook by Taiwanese manufacturer ASUSTek Computer Inc. (TPE:2357). It was playing the DirectX 11 game Dirt and, unlike Intel's demo, there was no fakery -- we were actually able to physically verify that the notebook was actually running the game. AMD had humorously placed the ultrabook inside a desktop PC case, removing the side panel to reveal the glorious truth.


Then there are the newest leaked slides from computerbase.de:






Right here it's already pretty obvious: the 17W part is a 2-module chip.
There are also the pictures: the 17W BGA APU (on the left; AMD confirmed on the CES floor that the BGA part is 17W) is the same size as the socketed 35W APU. There's no production of a single-module Trinity so far, at least that we know of.
The 17W Trinity could be laser-cut with only 1 module working, but besides all the other confirmations, AMD has always made its lowest-voltage models with all the cores enabled: the ~6W Brazos Z-01 is a 1GHz dual-core, not a single-core; the C-60 (dual-core, 1GHz) has the same TDP as the C-30 (single-core, 1.2GHz), etc.
As with pretty much every CPU architecture introduced during the past 5 years, more cores at lower speeds seem to give better power consumption than fewer cores at higher speeds, so there's really no reason to assume AMD would need to cut a module from Trinity to achieve the 17W TDP.

There's also another factor here: AMD's demo at CES, with Dirt 3 @ Low + video conversion + video playback, can be done with a 35W Llano 3500M, as users on other forums reproduced it with laptops using that APU. The only differentiation here is the power consumption ("almost half the power consumption", as stated by the PR).



The performance of the 35/45W Trinity parts has been gradually uncovered (+25-50% performance over Llano A series), but the 17W part hasn't been discussed that much.

A couple of pages back I made the very wild assumption that the 17W Trinity could bring a 400% performance bump over the E-450 Brazos in the same power consumption class.
While that does sound way too wild, the truth is that AMD is claiming performance similar to the 35W Llano A-series. That means it could range from an A4 3300M (dual-core 1.9-2.5GHz, 240sp GPU @ 444MHz) to an A8 3520M (quad-core 1.6-2.5GHz, 400sp GPU @ 444MHz).
The latter would beat the E-450's GPU power by 3 to 4x easily, while CPU power would go up 2 to 3x, depending on workload.

Maybe a fair assumption is that it should be somewhere in the A6 3400M zone. That's a quad-core 1.4-2.3GHz CPU with a 320sp, 16 TMU, 8 ROP GPU @ 400MHz.
So this 17W Trinity could be around a ~1.2/1.3GHz (Turbo up to 2.0GHz?) quad-core/2-module Piledriver, along with a 256-320sp VLIW4 GPU with 16 TMUs and 8 ROPs @ ~400MHz.

While not a "steady" 4x performance bump over the E-450, this would still be damn impressive to carry in an 11.6-12" thin form factor.
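The GPU side of that estimate can be sanity-checked with a crude shader-count-times-clock calculation. Note the caveats: this proxy ignores VLIW5 vs VLIW4 efficiency, bandwidth, and TMU/ROP differences; the E-450 figures (80 SPs @ 508MHz, its HD 6320) are real specs, but the Trinity numbers are this post's guesses, not anything AMD has confirmed.

```python
# Back-of-envelope GPU throughput ratios for the speculation above.
# Shaders x clock is a crude proxy for raw throughput; the 256/320sp
# @ 400 MHz Trinity configs are hypothetical guesses from this thread.
def sp_throughput(shaders, mhz):
    return shaders * mhz

e450 = sp_throughput(80, 508)  # E-450's HD 6320: 80 SPs @ 508 MHz base
for label, sp in (("17W Trinity, low guess", 256),
                  ("17W Trinity, high guess", 320)):
    print(f"{label}: {sp_throughput(sp, 400) / e450:.1f}x E-450")
```

On this crude measure the guesses land around 2.5-3.1x the E-450, consistent with "not a steady 4x" above.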
ToTTenTranz is offline   Reply With Quote
Old 29-Jan-2012, 00:53   #273
french toast
Senior Member
 
Join Date: Jan 2012
Location: Leicestershire - England
Posts: 1,633
Default

So basically the 17W Trinity should have the GPU power of an Xbox 360! Pretty cool.
This, to me, seems to be the area AMD was designing for all along: CPU/GPU modules together on one die. Brilliant idea. Now picture if AMD didn't have the financial constraints it has and had actually got some of these revolutionary ideas to the consumer on time...

As it was, they stumbled over the finish line and Intel just copied their plans and threw zillions at it to catch up.

Off topic (perhaps someone in the know could PM me?): why didn't AMD patent their ideas over the years instead of letting Intel blatantly copy them?
french toast is offline   Reply With Quote
Old 29-Jan-2012, 01:14   #274
Paran
Member
 
Join Date: Sep 2011
Posts: 203
Default

Quote:
Originally Posted by ToTTenTranz View Post
Hothardware took this video during CES:
http://www.youtube.com/watch?feature...&v=lsmTDb-Mlws
At approx. 1m40s, the AMD rep. says it's a quad-core.
Regarding the TDP, the people to took the video said:

He said 50% more compute capability at almost half the power. Think twice about it: AMD announced at CES 50% more compute power for the 35W Trinity. 17W and +50% compute power doesn't match AMD's claim. You can be sure it is a marketing trick.

Something more serious:
http://www.youtube.com/watch?v=agJxehoSBmY

He clearly stated the demo system used a Mainstream APU. The Ultrathin APU was mentioned separately.
Paran is offline   Reply With Quote
Old 29-Jan-2012, 04:41   #275
3dcgi
Senior Member
 
Join Date: Feb 2002
Posts: 2,165
Default

Quote:
Originally Posted by french toast View Post
Off topic (perhaps someone in the know could PM me?) why didnt AMD patent their ideas over the years instead of letting Intel blatenly copy them?
The concept of integrating a CPU and GPU is not unique; only the details of the integration are. See the following link for why Intel is not copying anything, though AMD might have prompted Intel to rekindle the concept.
http://en.wikipedia.org/wiki/Intel_Timna
3dcgi is offline   Reply With Quote
