What's your point? That a PR manager didn't mention the exact value during a 120-second shot? There are official AMD slides stating a 17W TDP for the FP2 BGA Trinity. AMD can achieve it quite easily - they can adjust the GPU's TDP via PowerTune to whatever value they need. I think they'll do some kind of CPU/GPU TDP balancing, too (like Intel does).
My point is twofold. One, while there is certainly a 17W Trinity, I do not believe that specific part was the demo model shown in that video. Two, claims of "I TOLD YOU IT WAS SEVENTEEN WATTS DIDN'T YOU SEE THE VIDEO AND ALL THE PEOPLE CONFIRMING IT VIA THE VIDEO?!?!?!?" are obviously bunk, as no such claim was made in the video. On a tangent to that last point: PR guy or not, someone showing off that part during CES will absolutely know the sales pitch. And the sales pitch absolutely included power consumption, as it was a large part of the sell - an ultrathin laptop doing ALL of this awesomesauce. He was quite purposefully saying "almost" half the power, so that he isn't lying when it turns out not to be half the power.
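As an aside on the TDP-balancing idea in the quote above: conceptually it's just one shared package budget, with the GPU allowed to use whatever headroom the CPU leaves. Here's a purely illustrative sketch, with made-up numbers and a hypothetical helper (this is not AMD's actual PowerTune implementation):

```python
# Purely illustrative sketch of shared-TDP balancing between CPU and GPU.
# This is NOT AMD's actual PowerTune logic; every number and name below is
# a made-up assumption for the sake of the example.

PACKAGE_TDP_W = 17.0   # the 17W figure claimed for the low-power FP2 BGA part
GPU_MIN_W = 2.0        # assumed floor so the GPU is never starved entirely
GPU_MAX_W = 12.0       # assumed ceiling for the GPU's share of the budget

def gpu_power_budget(cpu_draw_w: float) -> float:
    """Hand the GPU whatever headroom is left after the CPU's current draw,
    clamped to an assumed min/max range."""
    headroom = PACKAGE_TDP_W - cpu_draw_w
    return max(GPU_MIN_W, min(GPU_MAX_W, headroom))

if __name__ == "__main__":
    for cpu_draw in (4.0, 8.0, 14.0):
        print(f"CPU at {cpu_draw:4.1f} W -> GPU budget {gpu_power_budget(cpu_draw):.1f} W")
```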
I believe you missed the point of the demonstration. Movie playback is performed by the UVD processor, media encoding is performed by the VCE processor, and the game runs on the 3D core (racing games aren't CPU-demanding)...
No, I got all of that. Nowhere did I claim tomfoolery or shenanigans on the part of that demonstration; I am reasonably convinced all three of those items were indeed happening in parallel and on the laptop device.
What I'm not convinced of is whether this is a great way to demonstrate the "power of Trinity." Given what you described (dedicated hardware for basically all of it), you might reasonably expect Ivy Bridge to pull off the same capabilities. Hell, I would reasonably expect Llano to be able to pull off that same stunt, given a bit of 'tweaking' to the various data streams going into and coming out of that box. Of course, Llano would be doing it at 35W or more...
Again, we have NO data on:
- Power consumption of that box
- Performance or quality data on the video encoding
- Performance or quality data on the video playback
- Performance or quality data on the game being played
If I'm encoding 640x480 video from my smartphone, while playing a DX11 (on paper? what makes it a full DX11 implementation?) racing game at 800x600 with no AA and limited AF, while playing back an NTSC DVD that was ripped to the local drive, I would expect a Llano (or gasp, even a Sandy Bridge) to get away with that pretty easily. I might be able to get my i5-520m to almost get away with it, depending on how 'truly' DX11 that video game is.
None of that ultimately matters. My initial point still stands: given what we know about AMD's current CPU and GPU architectures and what they've told us is going into Trinity, I have no reason to expect the 17W version of Trinity to be doing ALL of that work at quality levels that are meaningful. It's surely a 'cool' demo, but not a particularly meaningful one.