We have conducted experiments in our lab in which Cell servers fed wireless handheld visual devices (PDAs). We found that a single Cell processor could software-render and compress hundreds of frames per second; the limiting factor became how many compressed frames we could push across the 802.11b wireless link. The prototype handheld encoded the user's inputs (GPS, digital compass, joystick, etc.) and shipped them to the Cell server, where the software renderer rendered the correct 3D view of the world, compressed the resulting 2D image, and delivered it back to the handheld client. With simple JPEG-like compression over 802.11b wireless, we were able to deliver 15 frames/sec to the handheld device.

Given this result, I believe that with Cell SMP servers and more aggressive compression, such as H.264, persistent-world 3D online games could be played on very low-power handheld clients. These clients would not need power-hungry 3D GPUs or large amounts of memory; they would only need to decompress and display streams of 2D images. Why send megabytes of 3D geometry to handheld gaming devices for storage and processing, and then constantly update it every frame, when the server can compute and send the finished 5KB frame?
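The request/response loop described above can be sketched in a few lines. The original system's wire format and codecs are not specified, so everything below is illustrative: the input packet layout, the fake framebuffer, and the use of zlib in place of the real JPEG-like (or H.264) encoder are all assumptions, with a local socket pair standing in for the 802.11b link.

```python
import socket
import struct
import zlib

# Hypothetical input packet: lat, lon (GPS), heading (compass),
# joystick x/y -- five little-endian floats, 20 bytes total.
INPUT_FMT = "<5f"

def encode_inputs(lat, lon, heading, jx, jy):
    """Client side: pack the user's sensor/input state for the server."""
    return struct.pack(INPUT_FMT, lat, lon, heading, jx, jy)

def render_and_compress(packet):
    """Server side (stub): 'render' a 2D frame for the requested view and
    compress it.  The real system ran the Cell software renderer and a
    JPEG-like encoder; here we fabricate a QVGA framebuffer and use zlib."""
    lat, lon, heading, jx, jy = struct.unpack(INPUT_FMT, packet)
    framebuffer = bytes([int(heading) % 256]) * (320 * 240)  # fake frame
    compressed = zlib.compress(framebuffer)
    return struct.pack("<I", len(compressed)) + compressed   # length-prefixed

# One round trip over a local socket pair standing in for the wireless link.
client, server = socket.socketpair()
client.sendall(encode_inputs(47.6, -122.3, 90.0, 0.0, 0.5))
server.sendall(render_and_compress(server.recv(64)))

reply = client.recv(4096)
(frame_len,) = struct.unpack("<I", reply[:4])
frame = zlib.decompress(reply[4 : 4 + frame_len])  # client decodes + displays
print(len(frame))  # prints 76800 -- one decompressed 320x240 frame
```

The point the sketch makes concrete: the client's uplink traffic is a 20-byte state packet per frame, and its only compute burden is decompressing the returned image, which is why no client-side GPU or geometry storage is needed.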