> Yes indeed, I actually agree with most of your points, and the licence issue is most pertinent, but what else can Nvidia do if they foresee a future where all that's left for them in the PC space is the very high-end discrete market that they have to fight for against Intel and AMD/ATI?

I don't know what they can do, exactly. A few ideas and concerns I can offer off the top of my head as an armchair quarterback follow.
Anyone is free to debate any of these points. I don't have the in-depth knowledge of each field to be certain of any of my analysis.
Nvidia needs to contain the arenas in which AMD/ATI may have an advantage:
The very low-power/embedded space:
An AMD/ATI offering would permit lower platform costs from a single-chip configuration.
Nvidia should look into making a CPU for this space, but go for an ARM or MIPS license. x86 hasn't set the embedded world on fire, and thus far AMD is focused on its Geode processors, which are x86 parts.
Nvidia could just snap up one of the many small companies that already make hardware in this realm, or partner with one of the larger manufacturers.
Laptops:
The value end is basically an AMD/ATI playing field.
Even if Nvidia could create a decent x86 chip to pair with a GPU, they couldn't match a company with a high-end fab and the volume to cut costs. They will need a partner, or they'll have to find some way to make life very hard for AMD.
Nvidia is going to have to find some way to out-innovate and differentiate its products with features and tie-ins that make it worthwhile not to buy an integrated solution.
Until we see how the new culture and methods of operation form for AMD, we don't know how hard that will be. CPU manufacturers go by different rules, and that may lead to feature stagnation for ATI's future parts.
For everything between low-end and high-end laptops, Nvidia is going to do everything it can to portray the integrated solution in the most negative light possible. That means something like TWIMTBP on steroids: lots of sponsored developers, lots of software with Nvidia-only bells and whistles, and lots and lots of (probably crooked) benchmarks.
There are some unknowns about the integration of GPU and CPU. The power density is going to be higher, meaning a more robust means of cooling is needed in a very tight space. On the other hand, the removal of a separate GPU might clean up the layout and airflow of the case.
This may have some bearing in the ultra-thin laptop market. Even if it doesn't, Nvidia can still say it does and parade the necessary expert studies to prove it.
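To put rough numbers on that cooling concern, here is a back-of-envelope sketch in Python. The wattages are made-up but plausible figures for the era, not real part specs.

```python
# Back-of-envelope: how much heat one socket's cooler must handle once
# the GPU moves onto the CPU die. All figures are illustrative
# assumptions, not real part specs.

cpu_power_w = 65.0  # assumed CPU power draw
gpu_power_w = 35.0  # assumed mainstream mobile GPU power draw

# Separate parts spread the heat across two coolers on the board;
# a fused part pushes all of it through a single socket.
fused_hotspot_w = cpu_power_w + gpu_power_w

print(f"Separate parts: {cpu_power_w:.0f} W + {gpu_power_w:.0f} W at two spots")
print(f"Fused part:     {fused_hotspot_w:.0f} W through one socket")
print(f"Extra load on the socket cooler: "
      f"{(fused_hotspot_w / cpu_power_w - 1) * 100:.0f}%")
```

Under those assumed numbers, the cooler over the socket has to move roughly half again as much heat, while the board as a whole loses one hotspot to route airflow around.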
The higher end will still favor discrete components, unless AMD goes really high-risk and includes some on-package DRAM, which would require some very significant design changes and a much higher unit cost.
PC space:
The low-price, low-performance segment again is very much an AMD/ATI strong point.
Without partnering with Intel, it is very unlikely Nvidia can hold this field at all without some heavy price cutting, which AMD could probably match.
The minute performance becomes even remotely an issue, however, Nvidia can take the advantage. The mid-range might not be all that profitable for AMD, because it would need a commensurately larger silicon investment to match a discrete card, and cost scales quickly with larger die sizes.
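To see why, here is a quick sketch of the standard dies-per-wafer and yield arithmetic. The wafer cost and defect density are numbers I picked for illustration, not foundry figures.

```python
import math

# Rough model of cost per good die versus die size on a 300 mm wafer.
# Wafer cost and defect density are illustrative assumptions.

WAFER_DIAMETER_MM = 300.0
WAFER_COST_USD = 5000.0   # assumed cost of a processed wafer
DEFECTS_PER_MM2 = 0.002   # assumed defect density (0.2 per cm^2)

def cost_per_good_die(die_area_mm2):
    radius = WAFER_DIAMETER_MM / 2.0
    # Common dies-per-wafer approximation: gross area minus edge loss.
    dies = (math.pi * radius ** 2) / die_area_mm2 \
         - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2.0 * die_area_mm2)
    yield_fraction = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)  # Poisson yield
    return WAFER_COST_USD / (dies * yield_fraction)

for area_mm2 in (100, 200, 300):
    print(f"{area_mm2:3d} mm^2 die -> ~${cost_per_good_die(area_mm2):.2f} per good die")
```

With those assumptions, going from a 100 mm^2 die to 200 mm^2 roughly two-and-a-half-times the cost per good die, because fewer candidates per wafer and lower yield compound.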
Also, if combined CPU/GPUs make it into the higher levels of the PC market, then Nvidia has a chance of making money on every AMD/ATI chip sold.
The graphics board may be reduced to little more than a riser card in some low-cost computers, with just the RAMDAC and other associated port chips, but that also means graphics output goes through a standard PCI-Express slot.
Any number of enhancements could be made to the board, and AMD is unlikely to become more than a reference board manufacturer.
Since AMD and ATI are consolidating a position on the production end of the graphics pipeline, Nvidia needs to build a position on the other end.
Nvidia could get into the business of designing and providing the components to the new interface boards. It could even make its own boards, something unlikely but much more viable than trying to start up a CPU business.
It could become the main supplier of chips that add functionality to the interface boards for all AMD/ATI CPU/GPUs.
Imagine what the designers of an AIW board could do with all the real estate freed up that used to hold the GPU, its RAM, and their power-regulation circuitry. (Another biggie: what happens to all those bargain motherboards that need better power regulators?)
AMD would probably tell its ATI engineers to stick to their core competency.
Nvidia can branch out and find a niche it can use as leverage against its competitor; in fact, it has to.
Nvidia could even get into the physics processing unit business, since it will now have all that real estate AMD has freed up, and the integrated CPU/GPU would largely remove one leg of the triangle of contention between the PPU, CPU, and GPU.
Even better, once Intel gets into the game (assuming it doesn't partner with Nvidia), it would depress prices on AMD's CPU/GPUs.
If Intel steers clear of the board section, the only place with unrestricted margins would be the one that Nvidia carved out.
The high end could still be wide open. Unless AMD suddenly has spare capacity after all the multi-cores, laptop chips, and the rest, the high-end ATI parts are likely to be manufactured somewhere else, negating the process advantage (unless they go with Chartered, which has agreed to make some AMD CPUs and has been given process help).
AMD's going to be using a lot of fab space just keeping up with Intel, especially in the server market. It's basically given up on desktop performance leadership for at least a processor generation.
ATI could get a lot more help with circuit design, but that's assuming that AMD doesn't need all the designers it can get for the CPU side.
Could ATI do the same thing? It might, if AMD sees it as worthwhile; however, a PPU could hurt the viability of those quad-core CPUs, so cannibalization of sales is a concern.
Servers:
For servers that need little or no graphics, a quad-core AMD chip with one stripped-down GPU somewhere on die would be a good sell.
Look for Intel to quickly put its own product in.
I don't know if Nvidia was serious about this space before, and it's likely it won't be after.
Unless the server really needs serious graphics horsepower, I'm having a hard time thinking of a reason Nvidia can sell anything here.
Intel is the big wildcard. Nvidia needs Intel to keep pressuring the CPU side, so that the drain on resources hurts the graphics side of AMD/ATI.
As long as Intel keeps hurting AMD, ATI will be hurt as well.
Or they could count on the AMD/ATI thing just not working out.
A lot of things could go wrong very easily. Both AMD and ATI have had deadlines slip on them, but now they'd have double the deadlines that could slip.
If AMD/ATI get along amazingly well, look to see who buys Nvidia, or who investors will revolt against if it doesn't happen.
>> The GPU is going to need bandwidth. Four of them will need four times as much. They're so very parallel that they will use every bit of bandwidth available.

> You won't need pins as such; the transistors that make a graphics chip will be incorporated into a die, just like a second CPU core is now. They didn't need to add more pins when they added a second core. HyperTransport seems to be getting regularly upgraded in speed and bandwidth, so maybe that's how they'll handle the issue, along with faster and wider memory. I'm sure that AMD didn't just announce Fusion without some idea of how they will handle the memory issues that come with it.
Even with newer memory buses, the pin counts will be staggering, not counting the need for more power and ground pins.
Unfortunately, pad density doesn't follow Moore's law, and socket pin density advances even more slowly.
Look at the number of pads on the bottom of a GPU package and look at the number of pins in a motherboard's socket.
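For a sense of scale, here is a quick sketch of the pin arithmetic. The bus widths and data rates are typical-for-the-era assumptions, and the overhead multiplier is a guess, not a spec.

```python
# Rough pin arithmetic: what a CPU socket would need to match a
# discrete card's memory bandwidth. Widths, rates, and the overhead
# factor are illustrative assumptions.

def bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    return bus_width_bits / 8.0 * transfer_rate_gt_s

socket_bw = bandwidth_gb_s(128, 0.8)  # assumed dual-channel DDR2-800
card_bw = bandwidth_gb_s(256, 1.6)    # assumed 256-bit GDDR3 board

print(f"CPU socket:    {socket_bw:.1f} GB/s")
print(f"Discrete card: {card_bw:.1f} GB/s")

# Matching the card at the same DDR2 data rate means growing the
# socket's memory bus from 128 to 512 bits.
extra_data_pins = 512 - 128
overhead = 2.0  # guessed multiplier for power, ground, and control pins
print(f"Extra pins: ~{extra_data_pins * overhead:.0f} "
      f"on a socket of roughly 940 pins today")
```

Even with generous assumptions, matching discrete-card bandwidth nearly doubles the socket's pin count before the CPU's own needs grow at all.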
Nvidia's discrete GPUs will have a much less constrained pin budget, because they don't need to share pad space with a CPU.
It is also possible that the substrates the CPU/GPUs will be mounted on will be significantly more expensive and complex.
Mounting defects also lead to junked chips, so to maintain manufacturability, AMD may rein in the GPU devs significantly when it comes to pad allocation.
>> If it does not compete with a company that is also now a direct competitor, what's it going to do?

> I doubt Nvidia has the time or money to get into a price war with AMD, with whom they still have a lot of lucrative deals. Not only is "hoping the other guy breaks first" a poor strategy, it's just as likely to damage Nvidia.
If it takes price cuts to keep up sales in areas where a CPU/GPU makes performance sense, why not do it? It is risky to cede market segments to AMD and allow it to turn its focus to the areas where it isn't as competitive.
Nvidia certainly can't start building fabs and try to compete as an x86 manufacturer.
Via would kill it on the low-power super-cheap end, and the rest is AMD and Intel.
>> If AMD's gamble works, things will change.

> That's been mooted, but it seems that the personalities at the top won't let that happen.
If the execs will not change, there are boards that will change the execs, or competitors that will change the companies.