I would more or less agree with regard to today's software, but with more cores available, future software may attempt things we don't bother with today in the consumer space, computer vision tasks in particular.
If anything like that takes hold, it would be counterproductive to forgo extra cores in favor of GPU-like elements. Since the majority of the populace isn't interested in high-performance graphics, a just-good-enough graphics solution provided by the many-core CPU would likely suffice.
I can see your point. The only question is whether the future applications that would require such resources are amenable to the more GPU-like elements. If we are comparing different combinations of multiple cores (heterogeneous or not), the issues of concurrency and dependencies remain in both scenarios.
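To illustrate that last point with a toy sketch (mine, not from the thread; the function names are made up): a per-pixel operation with no dependencies parallelizes well whether the cores are CPU-like or GPU-like, while a computation where each step depends on the previous result stays serial no matter how many cores of either kind you throw at it.

```python
from multiprocessing import Pool

def brighten(pixel):
    # Independent per-pixel work: scales with core count on
    # many-core CPUs and GPU-like elements alike.
    return min(pixel + 40, 255)

def running_max(pixels):
    # Each step depends on the previous one: a serial dependency
    # chain that extra cores of any kind cannot speed up by itself.
    out, acc = [], 0
    for p in pixels:
        acc = max(acc, p)
        out.append(acc)
    return out

if __name__ == "__main__":
    pixels = [10, 200, 30, 250, 90]
    with Pool(4) as pool:
        bright = pool.map(brighten, pixels)  # data-parallel part
    print(bright)                 # [50, 240, 70, 255, 130]
    print(running_max(pixels))    # [10, 200, 200, 250, 250]
```

Either way, the hard part is the same: exposing the independent work and living with the dependent part.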