Sigh, I really post too much off-topic stuff these days but...
I somehow doubt Maxwell will be anywhere near suitable for the tablet market: we're talking about an architecture that's bound to be very HPC-oriented, here.
First, it is very important to dissociate the Maxwell GPU core from the Denver CPU core; both are used in the first Maxwell-based chip, but not every Maxwell chip will necessarily use Denver. I'm not sure that's how NVIDIA thinks about its naming scheme (which, back in the G9x/GT2xx generation, one of their top engineers admitted to someone I know he couldn't keep up with anyway), but I can't find any clearer way to dissociate the two.
Maxwell's GPU and the system architecture of that first chip are very HPC-oriented, but the Project Denver CPU itself is nearly certainly not. Remember the idea is to run the FP-heavy stuff on the GPU, not the CPU. I'd be very surprised if we had more than a single 128-bit FMA here - which Cortex-A15 already has!
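To put that in perspective, a single 128-bit FMA unit doesn't buy much FP throughput anyway. A quick back-of-envelope sketch (the clock speed is a hypothetical placeholder, not a known Denver spec):

```python
# Peak FP32 throughput of a single 128-bit SIMD FMA unit.
# The clock is an assumed illustrative value, not a confirmed spec.

simd_width_bits = 128
fp32_bits = 32
lanes = simd_width_bits // fp32_bits        # 4 FP32 lanes
flops_per_fma = 2                           # fused multiply + add = 2 FLOPs
flops_per_cycle = lanes * flops_per_fma     # 8 FLOPs/cycle

clock_ghz = 1.5                             # hypothetical CPU clock
peak_gflops = flops_per_cycle * clock_ghz
print(f"{flops_per_cycle} FLOPs/cycle -> {peak_gflops:.1f} GFLOPS at {clock_ghz} GHz")
```

That's on the order of 12 GFLOPS per core at 1.5 GHz — a rounding error next to what the GPU side is meant to deliver, which is exactly why the CPU wouldn't need more.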
As for the GPU, AFAIK the next-generation Tegra GPU is only coming in Logan, which is likely slated for a late 2012/early 2013 tape-out on 28HPM with 2H13 end-product availability. That will also be the first Tegra with Cortex-A15, as 2012's Wayne is much more incremental. So the timeframes for the next-gen Tegra GPU and the Maxwell GPU are surprisingly not that different, but the former arrives earlier and is one process node behind.
So I think architectural convergence is very very unlikely, unless it is the Maxwell GPU itself that is a next-gen Tegra GPU derivative, which would be completely crazy but rather in line with Jen-Hsun's insistence that Tegra is the future of the company and that performance will be much more limited by perf/watt than perf/mm² (and already is).
As for ARM CPU adoption on PCs... I think there's a strong possibility that many notebooks will evolve towards also having a touchscreen over time. That makes Metro UI and the like more attractive, and significantly reduces the relative appeal of legacy application compatibility. But yeah, desktops? No way. Maybe hell has already frozen over now that Duke Nukem Forever is released, but there's no way desktops are ever switching to ARM. Maybe some niche 'desktop' functions like Windows HTPCs, but that's more likely to migrate towards ARM by moving away from Windows anyway.
Back on topic:
But actually, it's fortunate for NVIDIA that Intel's integrated graphics blows: it means they can still sell mid-range graphics cards to Intel users.
Yeah, 64-bit discrete GPUs are clearly a thing of the past though.
Llano is very impressive, but I wonder how bandwidth-limited it really is; I really wish someone would benchmark it with different DDR3 module speeds. If it's very limited, then there may not be much room to grow before DDR4 becomes mainstream, or some other clever trick is used (silicon interposers, as rumoured for Intel Haswell?).
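The peak numbers at stake are easy to sketch. Assuming a standard dual-channel setup with 64-bit (8-byte) channels — and keeping in mind sustained bandwidth is always well below peak — the scaling with module speed looks like this:

```python
# Theoretical peak bandwidth for dual-channel DDR3 at various module speeds.
# Assumes standard 64-bit (8-byte) channels; sustained bandwidth is lower.

def ddr3_peak_gbs(mt_per_s, channels=2, bytes_per_channel=8):
    """Peak bandwidth in GB/s for DDR3 at the given transfer rate (MT/s)."""
    return mt_per_s * channels * bytes_per_channel / 1000

for speed in (1066, 1333, 1600, 1866):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s peak")
```

Stepping from DDR3-1333 to DDR3-1866 is roughly a 40% peak-bandwidth jump, so if the IGP scaled anywhere near linearly with module speed in such a test, that would settle the question.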