And it's slow CPU power compared to the onboard CPU, for all the reasons mentioned a million times. Cloud computing is very good at handling extremely many users at the same time, but it serves each one slowly compared to local resources. The one good thing is that they only have to allocate what amounts to a single i5 for every XBOX1, so maybe they can keep it fast enough. The memory access, though, is likely to be slow as "hell" compared to the consoles: local RAM answers in nanoseconds, while anything sitting in the cloud is a network round trip away, measured in tens of milliseconds. 20GB of RAM in the cloud is not likely to behave like local memory.
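Just to put rough numbers on that (these are ballpark assumptions, not measurements of any actual datacenter), here's the back-of-the-envelope comparison in a few lines of Python:

```python
# Back-of-the-envelope latency gap between local RAM and "RAM in the cloud".
# Both figures are assumptions for illustration, not measured values.
local_dram_access_ns = 100        # rough DRAM access latency on the console
network_round_trip_ns = 80e6      # assume ~80 ms internet round trip, in ns

ratio = network_round_trip_ns / local_dram_access_ns
print(f"one cloud round trip ~= {ratio:,.0f} local RAM accesses")
# -> one cloud round trip ~= 800,000 local RAM accesses
```

Even if the exact numbers are off by a factor of a few, the gap is several orders of magnitude, which is why that cloud memory can only hold data the game doesn't need this frame.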
Worst thing is that no matter how much CPU power you have in the cloud, the quality of your graphics will still depend on your GPU. The Wii U will still have to display the graphics with its own GPU. Granted, you could do the Cell trick and use some of the CPU power to lift a weak GPU, like in the U and the XBOX1, but it's not easy.
As I read this thread I am pretty confident that the number of games that will use this will be limited to 1st party titles from Nintendo, Sony or Microsoft. The technology developed here could be reused in the future (if deemed feasible). But as I have said before, if this were easy to do and something that could yield great gains, it's strange that others didn't do it first. Other software developers have been involved in online games far longer than Sony, Nintendo or Microsoft. Google has the biggest cloud out there and we haven't seen anything from them.. maybe they are getting old.
I think we still lack a well-written example of cloud computing lifting work off a console game in a way that makes it worth the effort. Maybe some of the technical post-mortems on Killzone, Halo etc. could be dissected and injected with cloud computing..
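In the meantime, here is a minimal sketch of the only pattern that seems plausible to me: keep the latency-sensitive work (rendering, player physics) local every frame, and ship latency-tolerant work like long-term AI planning to the cloud asynchronously, accepting that the answer arrives several frames late. This is plain Python, nothing console-specific; the function names, latencies and frame times are all made-up assumptions.

```python
import time
from concurrent.futures import ThreadPoolExecutor

CLOUD_ROUND_TRIP_S = 0.080   # assumed ~80 ms internet round trip
FRAME_BUDGET_S = 1 / 60      # 60 fps local frame budget (~16.7 ms)

def cloud_ai_planning(world_snapshot):
    """Pretend cloud job: heavy AI strategy that nobody notices arriving late."""
    time.sleep(CLOUD_ROUND_TRIP_S)           # network latency dominates the cost
    return {"orders": f"plan for tick {world_snapshot['tick']}"}

def local_frame(tick, latest_plan):
    """Latency-sensitive work that must finish inside the frame budget."""
    time.sleep(0.004)                        # stand-in for rendering/physics
    return f"frame {tick} drawn using: {latest_plan}"

executor = ThreadPoolExecutor(max_workers=1)
pending = None
latest_plan = {"orders": "default plan"}

for tick in range(10):
    start = time.perf_counter()

    # Collect a cloud result if one has come back; never block the frame on it.
    if pending is not None and pending.done():
        latest_plan = pending.result()
        pending = None

    # Kick off a new cloud request only when the previous one has landed.
    if pending is None:
        pending = executor.submit(cloud_ai_planning, {"tick": tick})

    print(local_frame(tick, latest_plan["orders"]))

    # Sleep off whatever is left of the frame budget.
    time.sleep(max(0.0, FRAME_BUDGET_S - (time.perf_counter() - start)))

executor.shutdown()
```

The whole point of the sketch is that the frame loop never waits on the network: the cloud answer shows up a handful of frames late and the game just uses the previous plan until then. Anything the console actually has to wait for inside those 16 ms, graphics included, simply cannot live in the cloud, which is the argument above in code form.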