It depends on how the deals are structured.
What is the expected service level for specialized grid computing applications? For example, how long before the client expects the results back, and how many GB is the dataset? Are there applications that would otherwise spend time raising funds or waiting for supercomputer scheduling if no cheap enough alternative existed?
The client will still need to go to the expense of gathering data and packaging it into work units.
Its applications would have to be customized, and the workflow would now have to include means of verification and error checking.
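To make that verification point concrete, here is a minimal sketch of the kind of redundant checking such a workflow would need, assuming a BOINC-style scheme where each work unit is replicated across several untrusted hosts and a result is only accepted when a quorum agrees. The names, quorum, and replication numbers are illustrative, not taken from any real client.

# Minimal sketch: accept a work-unit result only when a quorum of replicas agree.
# Assumes results are canonicalized so identical answers compare equal byte-for-byte.

from collections import Counter
from dataclasses import dataclass

QUORUM = 2          # minimum number of matching replica results to accept
REPLICATION = 3     # how many untrusted hosts each work unit is sent to


@dataclass(frozen=True)
class Result:
    unit_id: int
    payload: bytes   # canonicalized output from one host


def validate(results: list[Result]) -> bytes | None:
    """Return the agreed payload for a work unit, or None if there is no quorum yet."""
    counts = Counter(r.payload for r in results)
    payload, votes = counts.most_common(1)[0]
    return payload if votes >= QUORUM else None


# Example: three replicas of unit 42 come back, one of them corrupted or dishonest.
replicas = [
    Result(42, b"energy=-1042.7"),
    Result(42, b"energy=-1042.7"),
    Result(42, b"energy=0.0"),     # bad host
]
assert validate(replicas) == b"energy=-1042.7"

All of that replication is pure overhead: every work unit gets computed two or three times just to establish trust in hardware the client does not control.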
Are the savings in front-end costs for building and maintaining an on-site system worth the consistently higher overhead of distributing work to third parties through middlemen?
Maybe, if the client just has this one big non-proprietary job and never intends to use the system again.
Otherwise, the only way to save money is to not pay somebody somewhere full cost (in this case, the user), and one gets the reliability one can expect from such a source.
Any proprietary data would have to be sent over an open network to thousands of insecure locations.
Aside from the risk of network interception, any such data would have to stay off the PS3's hard disk and reside entirely in its tiny RAM.
This would put a limit on the size of the work unit, and place a premium on network bandwidth and latency.
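To put rough numbers on that limit, here is a back-of-envelope sketch. The 256 MB figure is the PS3's published main-memory size, but the usable fraction and link speeds are assumptions chosen only for illustration.

# Back-of-envelope sketch of why RAM and bandwidth dominate the work-unit design.
# 256 MB is the PS3's main system RAM; the other numbers are assumptions.

RAM_BYTES = 256 * 1024**2          # PS3 main memory; OS + client leave much less
USABLE_FRACTION = 0.5              # assume roughly half is free for the work unit
DOWNLINK_BITS_PER_S = 10 * 10**6   # assume a ~10 Mbit/s home downstream link
UPLINK_BITS_PER_S = 1 * 10**6      # assume a ~1 Mbit/s home upstream link

max_unit_bytes = RAM_BYTES * USABLE_FRACTION
download_s = max_unit_bytes * 8 / DOWNLINK_BITS_PER_S
upload_s = max_unit_bytes * 8 / UPLINK_BITS_PER_S

print(f"max work unit: {max_unit_bytes / 1024**2:.0f} MB")
print(f"time to fetch it: {download_s / 60:.1f} minutes")
print(f"time to send results back on the assumed uplink: {upload_s / 60:.0f} minutes")

Under those assumptions a single work unit tops out around 128 MB and takes on the order of twenty minutes just to upload, which is why the work units have to be small and the scheduler has to care about each host's connection, not just its compute power.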
For serious jobs, the end product may not be something we run in the background while hanging out in PlayStation Home. It may mean running an image full-time (e.g., while the user sleeps), or even handling data on Blu-ray discs and external drives (I hope not).
I hope not as well. No way my financial data is going on a Blu-ray disc mailed to someone's house.
I'm not even getting into the legal and political issues of data crossing jurisdictions.