Not for embedded/low-power use, though. The POWER7 cores were designed for very high power consumption and very high clocks. Each POWER7 quad-core module (the smallest variant available) was a 567mm^2 monster with 1.2B transistors: 4 cores, each with 4 threads and 4 FP64 floating-point units, designed to run at ~4GHz for a total throughput of 400 GFLOPS.
Oh God, I already knew any relevant server POWER core was out of the question. Those are server chips! A quad-core variant of Freescale's e600 would've been great, basically 4x PPC 7448s. Hell, even just two of those cores in the 2.0 GHz range would've been a huge improvement over the 3x PPC750, since the e600, like the PPC7448, has a true 4-wide AltiVec unit. A quad would've easily been competitive with Xenon in real workloads thanks to its short pipeline. I bet a PowerPC 970 with a 45nm die shrink could've been pulled off too. Sticking with PPC would've leveraged the PPC experience devs already had across all the systems of that era (Wii, 360, PS3).
They'd have been much better off keeping the damned ~750MHz PowerPC 750 for direct Wii compatibility, letting it handle the OS during Wii U games, and then giving developers a quad-core Cortex-A9 at 1.8-2.0GHz. Add 2GB of GDDR5 on a 128-bit bus, plus the single-channel DDR3 for the low-priority RAM they had been including since the Wii, and call it a day.
A retained PPC750 could've served as a secondary security processor for Wii U titles while handling BC for Wii and GameCube games, though that would still leave the Wii/GC GPU to worry about retaining. Better yet, Nintendo could've shown some level of intelligence and leveraged the Dolphin emulator team's work to build a comprehensive software emulator for previous systems' games. They should've offered to bring them in-house, pay them well, and in turn earn some good boy points from the community instead of the usual demonization schemes they tend to enact. There would've been no need to retain any old hardware, and older games could be run at higher resolutions with AA and AF to look better than before. Hell, it makes a case for using an AMD Llano APU instead of sticking with PPC; Nintendo would've been ahead of the curve as Sony and MS transitioned to x86.
That, plus a Juniper or even an RV740 GPU, would've put it close enough to the 2013 consoles to get multiplatform titles.
RV740 was exactly what we were all expecting when it was rumored that the Wii U would have an R700-based GPU, in 2012 of all years. I guess the feature set relative to the transistor count and GFLOPS made R700 make sense to Nintendo, since DirectX capability was irrelevant, but only 352 GFLOPS of GPU compute was pathetic. You're right that more capability would've kept the system relevant, if not somewhat competitive graphically, but it would've also forced Sony and MS to produce more powerful systems. I've tried running some newer games on a Radeon 4670 (basically the same config as the Wii U GPU) for poops 'n giggles, but by 2012 game and driver support had moved beyond the R700s. Suffice it to say, AC4 Black Flag barely ran, even on the lowest settings.
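For reference, that 352 GFLOPS figure falls straight out of the usual peak-throughput formula for AMD GPUs of that era: stream processors × 2 FLOPs (one fused multiply-add per cycle) × clock. A minimal sketch, assuming the commonly cited 320-shader, 550 MHz config for Latte:

```python
# Peak single-precision throughput for a VLIW5-era AMD GPU:
# stream processors x 2 FLOPs (one FMA per SP per cycle) x clock.
# Clock is taken in MHz so the common figures come out exact.
def peak_gflops(stream_processors: int, clock_mhz: int) -> float:
    return stream_processors * 2 * clock_mhz / 1000

# Latte's commonly cited config (assumption): 320 SPs @ 550 MHz.
print(peak_gflops(320, 550))  # -> 352.0

# Radeon HD 4670 for comparison: same 320 SPs, but @ 750 MHz.
print(peak_gflops(320, 750))  # -> 480.0
```

So the 4670 is the same shader array simply clocked higher, which is why it makes a reasonable stand-in for ballparking the Wii U GPU.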
The funniest thing is that this could probably all have fit inside a single ~180mm^2 chip fabbed at IBM, since most of Latte's die area was the 32MB of eDRAM. They didn't even have to go with three different, ridiculously sized chips on one substrate. The only additional cost here would be the four GDDR5 chips for the 128-bit bus.
I blame Nintendo's unwillingness to break hardware backwards compatibility, and perhaps the vendors also had a say in whether their IP could be combined on one die. AFAIK, Microsoft owns the chip designs for the 360, hence why they could create an "APU" for the later-model 360s. It might not have been the same for the Wii U.
Idealized Wii U Specs:
Freescale e600 Quad-core @ 2.0 GHz
AMD RV740 GPU, ported to 28nm, 640:32:16 @ 750+ MHz (960+ GFLOPS)
4 GB GDDR5 on 128 bit bus
No retained old hardware
With those specs, for at least a year Nintendo would've had the best versions of multiplatform games, at 1080p versus 720p on the 360 and PS3.
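Running the same shader math on the idealized spec shows where the 960+ GFLOPS figure comes from, and roughly where memory bandwidth would land (the GDDR5 data rate below is my assumption, picked as period-typical, not something from the spec list):

```python
# Peak throughput: stream processors x 2 FLOPs (FMA) x clock in MHz / 1000.
def peak_gflops(stream_processors: int, clock_mhz: int) -> float:
    return stream_processors * 2 * clock_mhz / 1000

# Memory bandwidth: bus width in bits / 8 (bytes) x effective data rate in GT/s.
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

# The proposed 28nm RV740 port: 640 SPs @ 750 MHz.
print(peak_gflops(640, 750))    # -> 960.0

# 128-bit GDDR5 at an assumed 4.0 GT/s effective.
print(bandwidth_gbs(128, 4.0))  # -> 64.0
```

That ~64 GB/s would already be a big jump over the Wii U's actual 12.8 GB/s of DDR3 main-memory bandwidth, even before counting any eDRAM.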