Admittedly I haven't seen anything about this "Fusion Unfolding" you mention, but I really don't understand this recurring idea of CPUs becoming GPUs :smile:
CPUs are general-purpose processors. Their strength isn't being brilliant at any one task; it's being solid all-round processing units that handle the whole Turing-complete, computable-problems thing quite nicely. Ever heard the phrase
"Jack of all trades, master of none"?
A CPU is therefore perfectly capable of doing graphics processing - look at all of the software rasterizers and ray-tracers that still keep graphics programmers entertained.
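Just to give a flavour of what I mean, here's a rough little sketch of the classic brute-force triangle fill done entirely on the CPU - nothing cleverer than an edge-function test per pixel, with the triangle coordinates and buffer size just made-up numbers for the example:

```cpp
#include <cstdio>
#include <vector>

// Signed area of the parallelogram spanned by (b-a) and (c-a);
// its sign tells us which side of the edge a->b the point c is on.
static float edge(float ax, float ay, float bx, float by, float cx, float cy) {
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}

int main() {
    const int W = 64, H = 64;
    std::vector<char> fb(W * H, '.');              // tiny "framebuffer"

    // One hard-coded triangle in screen space.
    float x0 = 8, y0 = 8,  x1 = 56, y1 = 16,  x2 = 24, y2 = 56;
    float area = edge(x0, y0, x1, y1, x2, y2);

    // Brute-force inner loop: test every pixel against all three edges.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            float px = x + 0.5f, py = y + 0.5f;
            float w0 = edge(x1, y1, x2, y2, px, py);
            float w1 = edge(x2, y2, x0, y0, px, py);
            float w2 = edge(x0, y0, x1, y1, px, py);
            if (area > 0 && w0 >= 0 && w1 >= 0 && w2 >= 0)
                fb[y * W + x] = '#';               // "shade" the pixel
        }

    // Dump the result as ASCII art so it runs anywhere.
    for (int y = 0; y < H; ++y)
        printf("%.*s\n", W, &fb[y * W]);
}
```

Obviously it's slow compared to dedicated hardware - a GPU is doing essentially that inner loop, but massively in parallel and with fixed-function help - which is exactly my point about specialisation below.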
GPUs are highly specialised pieces of hardware. Yes, they share a lot in common with CPUs, but the reason they whup the CPU's ass when it comes to performance is that they are specifically designed to do one thing extremely well (yes, you can argue that they're becoming more mainstream, but I've discussed this in another thread recently...).
I remember a lecturer at university openly laughing at people using Intel and AMD desktop computers to do encryption/decryption for secure communication. She went on to say that the security company she worked for had dedicated hardware specifically designed to handle encryption and related problems. It was rubbish at everything else but was an order of magnitude faster for encryption-related tasks. Same as my previous point: sure, a CPU can do encryption, but a dedicated chip can do it much better.
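She had a point about the specialised silicon, but the CPU really does cope - the round function of something like TEA (Wheeler and Needham's Tiny Encryption Algorithm, used here purely as an illustration, not a security recommendation) is nothing but shifts, adds and XORs, exactly the general-purpose instruction mix a CPU is built around:

```cpp
#include <cstdint>
#include <cstdio>

// One block of TEA: 64 bits of data, 128-bit key, 32 rounds of
// nothing but shifts, additions and XORs -- all ordinary CPU ops.
void tea_encrypt(uint32_t v[2], const uint32_t k[4]) {
    uint32_t v0 = v[0], v1 = v[1], sum = 0;
    const uint32_t delta = 0x9E3779B9u;
    for (int i = 0; i < 32; ++i) {
        sum += delta;
        v0  += ((v1 << 4) + k[0]) ^ (v1 + sum) ^ ((v1 >> 5) + k[1]);
        v1  += ((v0 << 4) + k[2]) ^ (v0 + sum) ^ ((v0 >> 5) + k[3]);
    }
    v[0] = v0;
    v[1] = v1;
}

int main() {
    uint32_t block[2] = { 0x01234567u, 0x89ABCDEFu };   // made-up plaintext
    const uint32_t key[4] = { 1, 2, 3, 4 };             // made-up key
    tea_encrypt(block, key);
    printf("%08X %08X\n", block[0], block[1]);          // ciphertext block
}
```

The dedicated chip wins because it can hard-wire the whole round function in silicon and pipeline blocks straight through it, rather than chewing through that loop one general-purpose instruction at a time.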
Okay, to get back to your point...
Why would you want it to be part of the CPU's instruction set unless you actually wanted the CPU to execute it? I suppose if you had some sort of super-chip that was a CPU and GPU combined then maybe... but even then I'd be sceptical of whether you'd actually want them to share an instruction set.
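The closest thing we've actually got is the SIMD stuff that has already crept into the CPU's instruction set - SSE and friends. Something like this (a trivial sketch, assuming an x86 compiler with SSE support) is about as "GPU-ish" as a CPU instruction gets, and it's still a very long way from a whole graphics pipeline:

```cpp
#include <immintrin.h>
#include <cstdio>

int main() {
    // Two 4-wide float vectors, added in a single SSE instruction.
    // Note _mm_set_ps takes its arguments in reverse (high-to-low) order.
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
    __m128 c = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, c);
    printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);  // 11 22 33 44
}
```
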
Jack