It seems to me that the biggest advantage for "ATI"/GPU may be that AMD might spend more time doing custom-XX logic instead of using standard libraries for everything. On the other hand, customization seems to lengthen product development cycles enormously and is not necessarily a panacea, any more than writing an entire application in assembly language is better than writing it in C++. Sometimes it makes sense, in the same way that breaking out the most expensive hotspot code into assembly makes sense, but I'm not sure AMD holds any inherent advantage over Nvidia here, since the units most likely to be "optimized" (the ALUs) seem to be the ones Nvidia has already done custom R&D designs for.
I think Intel will have problems producing a competitive GPU core regardless of how many engineers they hire, just like NVidia isn't likely to whip out a Core 2/K8L competitor just by hiring a few hundred engineers to work on it. There are lessons learned and skills built from working on the same solution/product over several cycles: not just individual experience you can't get by reading patents, textbooks, and journal papers alone, but organizational experience as well. The industry is littered with companies that "staffed up", tried to crack the CPU and GPU markets, and fell off the charts. Most of them are gone now, except in the mobile and server spaces, and even there, commodity hardware from major providers is killing off the little guys and big guns alike. Will Niagara, PowerPC, et al, continue to survive? MIPS, PA-RISC, non-Niagara SPARC, et al, are all dead. Except in niche cases (like BlueGene/L or cheap datacenter hosting providers), x86 commodity hardware still seems to be eroding their market. The mobile market seems to be converging as well, onto XScale and OMAP; everyone except Intel and TI has been losing market share in recent years.
While I think it would be interesting to see an NVidia x86 main CPU, I think it would be a HUGE distraction from their core business. In order to even have a hope of being competitive, they'd have to devote enormous resources and effort to it, with limited possible results and a very long ROI timescale. If they fail (the most probable outcome), they'll end up like Cyrix, Transmeta, and umpteen other failed x86 clones. If they succeed, maybe they'd be another AMD, but AMD's business position isn't exactly spectacular compared to NV's.
If I were in NV's position, investing in the graphics market would look like it brings higher ROI/ROA/ROE and better margins, with fewer entrenched players (no Intel to fight). Investing in the CPU market looks like a sinkhole to throw money down: even in the best case, the margins and returns are worse than GPU, mobile, logic, etc. investments, and then you have a huge uphill battle to fight against Intel and AMD.
No, if I were NV and I were looking at making any kind of CPU, I'd go after the mobile market, where things are a lot more fluid and TI, STM, Motorola, Renesas, Qualcomm, NEC, et al, are far less entrenched and far less guaranteed a position, thanks to less technological path-dependency in those markets. NV could choose either to partner with one of those, or to design their own mobile DSP/CPU with an integrated graphics core for handhelds, mobile phones, iPods, etc.
Or, alternatively, they could buy Transmeta, produce a low-power x86-capable chip with integrated MCU and MGPU capabilities, and sell it for the ultra-lightweight notebook market (i.e. not bulky desktop replacements) and Origami devices. It could also be embedded in living-room media center devices (not computers, but stuff like iTV), as well as in automotive/telematics applications, especially since most GPS navigation systems I've demoed have TERRIBLE framerates.
But trying to go up against Intel/AMD in the mainstream x86 market seems like a terrible business proposition, and if I learned they were actually doing that, I'd sell off my NVDA shares and switch to shorting them.