has more than a decade of GPU history really taught you nothing?
i mean, i'm all for software rasterizers, but you're being outright ridiculous here.
just one simple question for you: what hit do you think EMBM would impose on the SPE's LS (local store) scheme? mind you, flipper/hollywood have oodles of transistors dedicated to that task, combined with low-latency embedded memory at every level of the memory hierarchy.
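to put it concretely, here's a rough sketch of the per-pixel work EMBM implies in a software rasterizer (the struct layout, names, and power-of-two wrap are all made up for illustration, not any real codebase):

[code]
/* sketch of one EMBM pixel in a software rasterizer; layout and
   names are hypothetical, for illustration only */
typedef struct { signed char du, dv; } BumpTexel;   /* signed uv offsets */
typedef struct { unsigned char r, g, b, a; } Texel; /* env-map texel */

Texel embm_pixel(const BumpTexel *bump_map, const Texel *env_map,
                 int bump_pitch, int env_pitch, int env_w, int env_h,
                 int u, int v, int eu, int ev)
{
    /* fetch 1: bump map; addresses march predictably along the
       scanline, so this stream can at least be DMA'd into LS ahead */
    BumpTexel b = bump_map[v * bump_pitch + u];

    /* perturb the env-map coordinates with the bump sample; this makes
       fetch 2 a dependent read: its address isn't known until fetch 1
       lands, so it can't be prefetched, and on an SPE that means a
       software texture cache (tag checks, misses, DMA stalls) per pixel */
    int pu = (eu + b.du) & (env_w - 1); /* assumes power-of-two env map */
    int pv = (ev + b.dv) & (env_h - 1);

    return env_map[pv * env_pitch + pu];
}
[/code]

that dependent fetch is exactly the thing flipper/hollywood resolve in fixed-function hardware sitting right next to embedded memory, and exactly the thing a DMA-fed local store hates.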
but let's assume for the sake of argument that, after some superhuman heroics, a clever coder manages to squeeze high-enough framerates out of cell at some hollywood-biased tasks. what CPU power do you expect to have left after that for actual game code?
i think some people on these forums should really cut down on the hype dose..
Cell's not a typical CPU; it could likely handle software rasterizers an order of magnitude or two more powerful than what we've seen so far. Plus, the Wii's hardware is extremely outdated at this point. If you cut out some filtering algorithms that CPUs would struggle with, I believe a top-of-the-line CPU today could handle a bit more than a GeForce 3's level of graphics (so IQ would be worse, but I think it could reach similar polygon counts, resolution, and framerate).
As far as EMBM goes, until we start seeing Wii games making heavy use of it, I think it's more relevant that Cell would destroy Wii as a system in polygon throughput.
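To make the throughput point concrete, here's roughly the kind of inner loop I mean (a bare-bones sketch under my own assumptions: no clipping, no sub-pixel precision, no interpolation, and the function names are mine):

[code]
/* minimal half-space triangle filler; a sketch only */
#include <stdint.h>

/* signed area test: >= 0 when p lies on the inside of edge a->b
   (vertices assumed in one consistent winding order) */
static int edge(int ax, int ay, int bx, int by, int px, int py)
{
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax);
}

void fill_tri(uint32_t *fb, int pitch, uint32_t color,
              int x0, int y0, int x1, int y1, int x2, int y2)
{
    /* clamp the loops to the triangle's bounding box */
    int minx = x0 < x1 ? (x0 < x2 ? x0 : x2) : (x1 < x2 ? x1 : x2);
    int maxx = x0 > x1 ? (x0 > x2 ? x0 : x2) : (x1 > x2 ? x1 : x2);
    int miny = y0 < y1 ? (y0 < y2 ? y0 : y2) : (y1 < y2 ? y1 : y2);
    int maxy = y0 > y1 ? (y0 > y2 ? y0 : y2) : (y1 > y2 ? y1 : y2);

    for (int y = miny; y <= maxy; y++)
        for (int x = minx; x <= maxx; x++)
            /* inside when all three edge tests agree; pure multiply-add
               and compare, no texturing, no dependent reads; evaluating
               4 or 8 pixels per iteration maps naturally onto SIMD */
            if (edge(x0, y0, x1, y1, x, y) >= 0 &&
                edge(x1, y1, x2, y2, x, y) >= 0 &&
                edge(x2, y2, x0, y0, x, y) >= 0)
                fb[y * pitch + x] = color;
}
[/code]

Point being: flat fill like this is independent multiply-adds, so it scales across SPEs almost linearly. It's the EMBM-style dependent fetches that break that model, which loops back to your LS point.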
I agree that using Cell for a Wii-level software rasterizer wouldn't leave much of anything for actual game code, though.