There are internal projects at MS that use EC2 instead of Azure because of the cost.
And if a publisher rolls their own they can use it on both platforms.
The cost point is probably valid in some contexts, but for this discussion I think we just need to take Microsoft's word at face value and assume "it will be paid for." The bigger question to me is whether they really believe that transistors 'in the cloud' can have a meaningful impact on console gaming.
WRT developers "rolling their own," I assume there needs to be some level of hardware- or OS-level support for the type of enhancements that MS is touting, though? Otherwise the 360 and PS3 could equally have benefitted from server-side compute resources?
I think the bigger issue is going to be convincing anyone doing a cross-platform title to use it at all.
Any use of a remote resource like this is going to take significant planning, plus implementations in multiple environments.
Given that to do the same on a competitor's platform I'd have to host the servers myself, why would I go to the effort?
There is also the cost: Live as it stands certainly won't pay for the cloud if it is broadly used, so the cost probably falls on the publishers, which is another reason not to use it.
Are server-based AI routines an easy target for multiplatform titles, though? I think AI could benefit greatly from something like this, as the NPCs could "learn" over time based on thousands of people playing the game rather than just the code that's baked onto the disc. i.e. game 1 uses local code only, game 2 updates with server code if available...
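Roughly what I have in mind, as a minimal sketch in C++ (the names AiParams and FetchTunedParams and the tuning numbers are invented for illustration, and the network call is stubbed out): the title ships with baked-in defaults and simply overrides them with server-tuned values when the service is reachable, so the same code path degrades to the local behaviour on a platform, or a connection, that doesn't offer the service.

#include <iostream>
#include <optional>

struct AiParams {
    float aggression;    // how eagerly NPCs push the player
    float flankingBias;  // tendency to flank, tuned from aggregate player data
};

// Baked-on-disc defaults: the game must behave sensibly with only these.
constexpr AiParams kDiscDefaults{0.5f, 0.2f};

// Stand-in for a call to a publisher/platform AI service. Returns nothing
// when offline or when the service simply isn't available on this platform.
std::optional<AiParams> FetchTunedParams(bool serviceAvailable) {
    if (!serviceAvailable) return std::nullopt;
    return AiParams{0.7f, 0.6f};  // values "learned" from thousands of players
}

int main() {
    // "Game 1" situation: no service, local code only.
    AiParams params = FetchTunedParams(false).value_or(kDiscDefaults);
    std::cout << "offline aggression: " << params.aggression << "\n";

    // "Game 2" situation: same code path, server refreshes the tuning.
    params = FetchTunedParams(true).value_or(kDiscDefaults);
    std::cout << "online aggression:  " << params.aggression << "\n";
}

The nice property of doing it as data rather than code is that the server only ever supplies parameters the local AI already knows how to consume, so the cross-platform fallback stays trivial and nothing breaks when the connection drops.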