I guess my point about convergence was that the majority of graphics parts sold in the future will probably be processors like Fusion (from both AMD and Intel), just as the majority of graphics chips in use right now are integrated units rather than standalone cards.
The reason IGPs make sense is that most people don't care about games; in that part of the market, the product degenerates into a commodity. The strength of a company's brand name can mitigate this somewhat, however.
I don't think there is any fundamental reason why the same principle that applies to GPUs and IGPs cannot apply to CPUs: most people simply don't need more than that. So what's the justification for selling anything more than a 20mm² dual-core CPU in the 2010 timeframe, anyway?
Most consumers won't need more than that either. Most other demanding workloads are massively parallel, and I believe even a minuscule integrated GPU next to the CPU would be good enough for most consumer-oriented GPGPU workloads, such as voice recognition.
In a couple of years, it will be possible to create a single-chip architecture that includes everything 90%+ of consumers will ever need, with a die size of less than 60mm² on a foundry process (excluding analogue, although I wouldn't be surprised if the different kinds of analogue functionality converged into a single chip for footprint reasons too). The big question is whether that's a positive or a negative. In theory, it could expand the userbase. But it would also substantially reduce the gross profit per customer.
If your idea of convergence is that 'different architectures will merge into a SoC', then the dynamics are, imo, much, much more complicated than most people seem to think. It's more of an economic, marketing and political problem than a technical one.
So while we are currently at shader units doing FP32 calculations, it is not a leap to figure that within the next two years we will see higher-precision calculations on these chips, which will not only slightly improve graphics but also make these floating-point monsters even more significant in HPC and scientific/financial simulations.
We'll be seeing GPUs capable of quarter-rate FP64 within the next 9 months or so, and I don't think there's a good reason to go beyond that, either (there's a quick back-of-the-envelope below on what quarter-rate actually means).
As for whether that's useful in the consumer space... probably not very much. What NVIDIA and ATI are really selling is 'performance for a given level of image quality', and increasing the 'sweet spot' precision without any practical reason is not a good way to improve that.
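Just to put rough numbers on what quarter-rate FP64 would mean for peak throughput, here's a toy calculation; the ALU count and clock are made-up illustrative figures, not any actual chip's specs:

```python
# Back-of-the-envelope: peak throughput with quarter-rate FP64.
# The ALU count and clock below are made-up figures for illustration only.
shader_alus = 128      # hypothetical number of FP32 ALUs
clock_ghz = 1.5        # hypothetical shader clock (GHz)
flops_per_alu = 2      # one multiply-add per cycle = 2 flops

peak_fp32 = shader_alus * clock_ghz * flops_per_alu  # GFLOPS
peak_fp64 = peak_fp32 / 4.0                          # quarter-rate FP64

print(f"Peak FP32: {peak_fp32:.0f} GFLOPS")
print(f"Peak FP64 (1/4 rate): {peak_fp64:.0f} GFLOPS")
```

Even at a quarter of the FP32 rate, that would still be a lot of double-precision throughput compared to a CPU of the same era, which is exactly why it matters for HPC but buys you nothing for rendering.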
In another 10 years we should have really, really nice hardware that renders scenes close to reality at the high end (or at the very least at the level of real-time, high-quality CGI-style playback).
I'm not convinced about that. You know, this might seem ironic, but I'm going to point out that diminishing returns will prevent that from happening as quickly as you seem to think.
I'm not saying diminishing returns will be such a problem that you couldn't notice the improvement between one generation of hardware and the next. It'll still be very perceptible, I'm sure. But the improvement won't be as significant each time, simply because it (arguably) takes less effort to go from 'ugly' to 'bad' than from 'bad' to 'okay'. It's much easier to notice the difference between 500 polygons and 2000 than between 2000 and 8000: you can still tell them apart, but the gap is smaller (there's a toy calculation below to illustrate this).
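To make the polygon example concrete, here's a toy model I picked for illustration, not a claim about real meshes: approximate a unit circle with an inscribed regular N-gon and measure the worst-case gap between the two.

```python
import math

def max_deviation(n_sides: int, radius: float = 1.0) -> float:
    """Largest gap between a circle and its inscribed regular n-gon."""
    return radius * (1.0 - math.cos(math.pi / n_sides))

for n in (500, 2000, 8000):
    print(f"{n:>5} polygons -> max deviation {max_deviation(n):.2e}")

# Each 4x jump in polygon count cuts the remaining error ~16x here, but the
# absolute improvement (what you actually perceive) shrinks every step:
print(f"gain 500 -> 2000:  {max_deviation(500) - max_deviation(2000):.2e}")
print(f"gain 2000 -> 8000: {max_deviation(2000) - max_deviation(8000):.2e}")
```

The first quadrupling removes roughly 16x more absolute error than the second one does, which is exactly the diminishing-returns curve I'm talking about.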
It's hard to say how long until we hit diminishing returns so obvious that nobody will care about "what's next" anymore. I personally wouldn't be surprised at all if perceived visual quality could get *better* than real life (photosurrealism, anyone?) and we would still notice the difference.
After all, what looks 'good' or 'right' to the eye is extremely subjective and sometimes pseudo-random. If nature were the very definition of beauty, then why would anyone bother with cosmetic surgery anyway? You may argue this is not a very good comparison, but consider it in a more abstract way (rather than as "zomg boobs!!!") and I think you'll get my point...