I'll go one step further. Enormous GPU compute is only possible with parallel workloads, meaning large datasets, yet the bandwidth limits of the internet massively cap the rate data can be sent to/from the servers. 1.2 TFLOPS burns through 1.2 trillion 4-byte single-precision floats a second, 4.8 terabytes/s. If you could upload 1 MB/s to the servers (which is an enormous 8 Mbps upload bandwidth), 1.2 TFLOPS could process each incoming 4-byte float with 4.8 million operations. There's no way you're going to want to process a value that many times to arrive at a final result! Compare that to MS's claimed 200 GB/s in Xbox One, which is enough for the GPU to process each float with 24 operations. There's no way to utilise massive processing power for sustained periods unless all the data is local to the server, which means running them more like game servers than distributed computing nodes.
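Putting that arithmetic in one place (a quick back-of-envelope script; the figures are the ones quoted above):

    # how many operations must the GPU spend per float, given how fast
    # fresh floats can arrive over the wire vs. from local memory?
    gpu_flops   = 1.2e12          # 1.2 TFLOPS
    float_bytes = 4

    upload_Bps = 1e6              # 1 MB/s (= 8 Mbps) internet upload
    local_Bps  = 200e9            # 200 GB/s Xbox One memory bandwidth

    ops_per_float_net   = gpu_flops / (upload_Bps / float_bytes)  # 4,800,000
    ops_per_float_local = gpu_flops / (local_Bps / float_bytes)   # 24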
Maybe that's how MS will provide their supposed performance? Each console has access to 4 TFLOPS, but only needs a microsecond timeslice of that power. That way a server with 4 TFLOPS of total power could deliver 4 TFLOPS to thousands of consoles, but only in tiny bursts. In essence, it's comparing a sprinter to a marathon runner, quoting a choice but unrealistic number to sound faster.
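In numbers (the share count and slice length are made up purely to illustrate the point):

    # a time-sliced server: huge peak per console, tiny sustained throughput
    server_flops = 4e12        # 4 TFLOPS total
    consoles     = 1000        # assumed number of consoles sharing it
    slice_len    = 1e-6        # a 1 microsecond burst

    burst_rate     = server_flops              # 4 TFLOPS during your slice
    sustained_rate = server_flops / consoles   # 4 GFLOPS averaged over time
    work_per_burst = server_flops * slice_len  # 4 Mflops of actual work per slice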
That is an interesting post, Shifty. I remember we once did a project on distributed high-performance computing where we connected two machines across Europe over the internet... we really had to work hard to hide the latency introduced by the slow transfer rate.
It was super hard to find a task with a small enough amount of data that still needed enough operations to make it worthwhile... in the end we found some configurations that were reasonable, but what helped us was that in large-scale HPC you often have timings of several seconds to get a critical result, due to the massive number of operations needed, with total run times in the range of hours.
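The break-even test we kept running into boils down to something like this (all the constants here are illustrative assumptions, not our project's actual figures):

    # offloading pays off only when remote compute time plus transfer time
    # plus round-trip latency undercuts doing the work locally
    def offload_wins(flops, bytes_moved,
                     local_flops=1.2e12,   # local GPU (the 1.2 TFLOPS above)
                     remote_flops=4e12,    # assumed cloud allocation
                     bw=1e6,               # 1 MB/s upload, as above
                     rtt=0.1):             # assumed 100 ms round trip
        t_local  = flops / local_flops
        t_remote = flops / remote_flops + bytes_moved / bw + rtt
        return t_remote < t_local

    offload_wins(1e9, 24)    # False: a millisecond-scale job drowns in latency
    offload_wins(1e12, 24)   # True: a seconds-scale job amortises the round trip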
So in conclusion, maybe high-fidelity simulation tasks can be done in the cloud. Just an example: fluid simulation. A character jumps into some water and generates fluid reactions. Resistance 2 did this by solving a wave-type equation with FFTs accelerated on the SPUs, with a source term to simulate the player. But you could see how low-res the simulation was, as individual triangles were visible in the waves because only a 32x32 simulation grid was used (iirc).
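For illustration, a minimal numpy sketch of that kind of spectral wave solver on a 32x32 grid (the wave speed, timestep, and Gaussian splash source are my assumptions, not Resistance 2's actual code):

    import numpy as np

    N  = 32          # simulation grid resolution, as in the post
    c  = 1.0         # wave speed, assumed
    dt = 1.0 / 60.0  # one 60 Hz frame

    k = 2 * np.pi * np.fft.fftfreq(N)   # angular wavenumbers, unit grid spacing
    kx, ky = np.meshgrid(k, k)
    omega = c * np.sqrt(kx**2 + ky**2)  # dispersion of the linear wave equation

    def step(h, v, source):
        """Advance height h and vertical velocity v by dt, exactly per mode."""
        H = np.fft.fft2(h)
        V = np.fft.fft2(v + dt * source)                  # inject splash as velocity
        cos_wt = np.cos(omega * dt)
        sin_wt_over_w = dt * np.sinc(omega * dt / np.pi)  # sin(w*dt)/w, safe at w=0
        H_new = H * cos_wt + V * sin_wt_over_w
        V_new = V * cos_wt - H * omega * np.sin(omega * dt)
        return np.real(np.fft.ifft2(H_new)), np.real(np.fft.ifft2(V_new))

    # player lands at cell (16, 16): a Gaussian source term
    gx, gy = np.meshgrid(np.arange(N), np.arange(N))
    splash = -5.0 * np.exp(-((gx - 16)**2 + (gy - 16)**2) / 4.0)

    h = np.zeros((N, N))
    v = np.zeros((N, N))
    h, v = step(h, v, splash)             # the impact frame
    for _ in range(59):                   # then let the waves ring out for a second
        h, v = step(h, v, np.zeros((N, N)))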
So basically, to get a high-fidelity version you just need the position of the player, and maybe the velocity, to calculate the source... with this small amount of data, a wave simulation, or with enough power a fluid simulation, kicks in on the cloud to compute the result, then transfers back 2D displacement data which is rendered locally on the box?!?
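Roughing out the traffic for that round trip (grid resolution, precision, and update rate are all assumed numbers):

    # upstream: player state; downstream: a 2D displacement map each update
    up_bytes   = 6 * 4              # position + velocity, six 32-bit floats = 24 B
    grid       = 256                # cloud grid, upgraded from the local 32x32
    down_bytes = grid * grid * 2    # 16-bit heights = 128 KiB per update
    hz         = 30                 # updates per second

    down_rate_MBps = down_bytes * hz / 1e6   # ~3.9 MB/s downstream

So the upstream cost is trivial, but even a modest displacement map eats serious downstream bandwidth unless it's compressed or limited to a small region around the player.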