Worrying about Moore's law may become academic before continuing it becomes physically and economically impossible.
Really, that should be the point of this thread: is a divergence from Moore's law any real cause for concern? It was never a law Moore intended to dictate, just an observation, and one he himself did not expect to hold in perpetuity.
There are ways we can keep doubling transistor counts -- make bigger dies, stack them, or shrink the transistors themselves. But each of these growth dimensions runs into physical limits, some more pressing right now than others.
Maybe the better question is: do we really need it to continue forever? We can tackle problems in different ways now. We no longer live in a world where only the primary CPU can do useful work; multi-core, multi-socket, multi-node-clustered, distributed, or otherwise "cloudified" compute can take on much more of that work, potentially at far lower cost than one monolithic, humongous CPU.
I think it's time we stop worrying about how to keep Moore's unsustainable law going and work toward solving the problems we actually care about.