If... if... if... So what are they doing wrong now with the reports of the early units overheating? The other thing I'd like to point out is that we haven't even begun to consider the TDP for the CPU.
If by "early units" you mean the Wii U, no one outside Nintendo, IBM and AMD would know for sure (it is still very early hardware after all).
For a system built with gaming in mind, the GPU's TDP is higher than the CPU's most of the time.
We really don't know what kind of IBM CPU the Wii U will have, so I will base this on an Intel Sandy Bridge i3.
A 3.3 GHz i3 with its onboard GPU has a TDP of 65W, and some of that comes from the GPU. The i3 also has very high performance per clock, so I think a TDP of 35W would be a good upper limit for the Wii U's CPU (and that is not even getting into the fact that an i3's heatsink is about the same size as the Xbox 360 S's, yet can still handle the 95W TDP i7s).
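To make that budget explicit, here's a quick back-of-envelope sketch. The 65W package figure is Intel's spec; the 20W iGPU share is my assumption, not a published number:

```python
# Rough CPU-only power budget from the i3's package TDP.
I3_PACKAGE_TDP_W = 65   # Intel spec for a 3.3 GHz Sandy Bridge i3 (CPU + iGPU)
IGPU_SHARE_W = 20       # assumed share drawn by the on-die graphics (my guess)

cpu_only_tdp = I3_PACKAGE_TDP_W - IGPU_SHARE_W
print(cpu_only_tdp)     # 45 -> the x86 cores alone come in well under 65W
```

A console CPU binned and clocked more conservatively than a 3.3 GHz desktop part could plausibly land under the 35W ceiling I'm using here.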
What were the ambient conditions? How long were you playing at 100% load? Can you say the same for every household and weather condition? What about stuffing it in a home theater setup behind a window? I wish it were as easy as your single example. I don't have problems with my laptop in the winter, but I sure as hell can if the house jumps to 30C and I want to play for several hours; I know when to quit when I can feel the bottom burning my finger. How loud was your laptop, as well? Is that acceptable for a console, much less the Wii U? How much was the laptop? How big a heatsink are they going to need (i.e. extra cost, extra weight, extra shipping weight...)? Can it be done for under $300?
30C+, playing for a few hours at a time on a couch, with games like Red Faction: Guerrilla (which can push both the GPU and CPU), Mirror's Edge with GPU PhysX, WoW, DiRT 2, BFBC2, etc.
It was not really loud at all; I could clearly hear it when gaming, but it was only about a foot from my head (and it was much, much quieter than the original Xbox 360).
The laptop came out at about $1,200 two and a half years ago (but it has a screen, battery, HDD, etc.).
The heatsink in my laptop was not very big: it has one heat pipe going to the CPU (then on to the north bridge) and one to the GPU (a higher-TDP GPU than the HD6950M, BTW), and the heat pipes run to a radiator around 3 cm x 2 cm x 9 cm with a fan behind it blowing air over it and out of the case. I even found a pic:
http://www.notebookreview.ru/files/image/Reviews database/Asus/G51/G51VX/Big/4.jpg
If we take something like a 35W TDP CPU and a 50W HD6950M and put them on one chip, you should end up with an 85W TDP, easily manageable with a small heatsink like the stock Sandy Bridge heatsink (rated for 95W) or even the 360 S's.
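The thermal math is simple enough to check directly. This is just a sketch of my hypothetical single-chip scenario, not anything Nintendo has confirmed:

```python
# Hypothetical single-chip thermal budget vs. a common heatsink rating.
CPU_TDP_W = 35              # my assumed upper limit for the Wii U CPU
GPU_TDP_W = 50              # HD6950M-class mobile GPU
SB_STOCK_HEATSINK_W = 95    # stock Sandy Bridge cooler rating

combined = CPU_TDP_W + GPU_TDP_W
print(combined)                          # 85
print(combined <= SB_STOCK_HEATSINK_W)   # True: fits under a small stock cooler
```

Even with 10W or so of headroom for VRMs and memory, the total stays under what a very small off-the-shelf cooler is rated for.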
The die size of the HD6950M should be the same as the HD6850/6870 (Barts), 255 mm2 on 40nm, but keep in mind that it is a binned core using 960 of 1120 SPs and 48 of 56 TMUs, and it also has a 256-bit memory bus (vs. the 128-bit bus a console would likely use).
I am not saying that the Wii U would use anything like that but it is possible.
I don't mean to be snarky with these questions, but these are important considerations that go beyond just saying "oh, another device can do it IF there's enough money thrown at it, IF it has much sturdier construction, IF it's not playing games 100% of the time, IF we just reduce clock speeds, IF we ignore that there are hugely different ambient temperatures people can game in, IF we ignore that Nintendo is quite possibly wasting its money producing a larger chip and just clocking it down...".
And that's it. I'm done with the thread.
I would expect better from a Mod TBH.