Complete agreement here.
There are lots of algorithms: some run well on CPUs, some run well on GPUs, some run well on both, and some don't run well on either.
For games, I don't see the PC's CPU quite becoming 'a glorified joystick processor' (as Jeff Minter described the 68000 in the Atari Jaguar many years ago) - there's plenty of AI and physics for it to do (and the GPU would be totally useless at the former), and, as Chalnoth said, it still has to tell the GPU what to render.
In professional graphics, GPUs are finally starting to look interesting to the 'serious' people (universities, defence, Hollywood, etc.). And rightly so; GPUs are approaching (may already be at?) teraflop levels of performance, and a few hundred high-power GPUs might be able to seriously compete with a few thousand Linux boxes pretty soon... or alternatively, you could run a few thousand high-power GPUs and throw some even more complex algorithms at them!
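As a rough back-of-envelope sketch of that comparison (all the figures here are illustrative assumptions, not measurements): if a GPU peaks near 1 TFLOP/s but only sustains something like a tenth of that on general-purpose work, while a commodity Linux node sustains on the order of 10 GFLOP/s, then

\[
\underbrace{300 \times \left(0.1 \times 1\,\text{TFLOP/s}\right)}_{\text{a few hundred GPUs at}\ \sim 10\%\ \text{of peak}}
\;\approx\; 30\,\text{TFLOP/s}
\;\approx\; \underbrace{3000 \times 10\,\text{GFLOP/s}}_{\text{a few thousand CPU nodes}}
\]

which is the ballpark where "a few hundred GPUs vs. a few thousand boxes" starts to make sense.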