My long thoughts on this
Why developers liked NVidia
The truth of the matter is that until the Radeon 9700 shipped and the FX ran late, NVidia was on the way to totally owning the market for gamer-oriented 3D accelerators. Support for ATI and other chips was becoming more and more of an afterthought for developers--who in fact liked it that way. Things were easier if they only had to develop for the NVidia architecture, and since the XBox used that architecture too, it gave them an easy path to the console market. If the FX had come out on time (before the R300), NVidia's market dominance might have been unstoppable, and game developers would have happily targeted their games entirely at the NVidia architecture. As alternative video cards became less well supported, they would have become less popular, giving developers even less reason to support them--a spiral most developers would have regarded as a good thing.
NVidia also had the advantage of solid, frequently updated drivers with excellent OpenGL support, a close relationship with Microsoft, and strong developer relations--including monetary incentives for NVidia branding.
Some recently released titles were developed under the assumption of NVidia dominance, and many titles still in development were begun under that same model.
Cg and Developers
Part of the excellent developer support for NVidia was a web site with well-documented SDKs and tools. Cg built on that by providing a state-of-the-art tool for writing complex shaders, of the sort NVidia promised would be optimal for their upcoming hardware.
If you were a developer who shared the assumption of NVidia dominance, Cg seemed like a fairly easy choice to make. It would provide the best possible support for the dominant platform, and NVidia promised at least passable support for the "standards" (which in any case appeared to be becoming less important to the market of the future). It supported OpenGL, and therefore the latest NVidia extensions--including those in the drivers that emulated the chip which promised to cement NVidia's future dominance, the FX. If Cg captured enough developer support, the other IHVs would be forced to supply back-ends optimized for their hardware--to the extent possible.
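To make the back-end mechanism concrete, here is a minimal sketch of what targeting Cg looked like from a developer's point of view, using the C runtime that ships with NVidia's Cg toolkit (cg.h / cgGL.h). The shader source and entry-point name are placeholders of mine, not from any shipping title, and error handling is pared to the bone; the interesting call is cgGLGetLatestProfile(), which picks the best compilation profile the runtime knows for the installed hardware without the application changing.

/* A minimal sketch of pushing one shader through the Cg runtime
   (cg.h / cgGL.h from NVidia's Cg toolkit).  The shader source and
   entry name are placeholders, and a real application would have
   created a GL context before making any of the cgGL calls below. */

#include <stdio.h>
#include <stdlib.h>
#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Trivial Cg fragment shader: modulate a texture by a tint color. */
static const char *shader_src =
    "float4 tint_main(float2 uv : TEXCOORD0,            \n"
    "                 uniform sampler2D tex,            \n"
    "                 uniform float4 tint) : COLOR      \n"
    "{                                                  \n"
    "    return tex2D(tex, uv) * tint;                  \n"
    "}                                                  \n";

int main(void)
{
    CGcontext ctx = cgCreateContext();

    /* Ask the runtime for the best fragment profile the current
       GPU/driver pair supports -- an NV30 profile on an FX, the
       generic ARB profile on other DX9-class boards.  This is the
       "back-end" selection described above. */
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    cgGLSetOptimalOptions(profile);

    /* Compile the same source for whatever profile we were given. */
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, shader_src,
                                     profile, "tint_main", NULL);
    if (prog == NULL) {
        fprintf(stderr, "Cg compile failed: %s\n",
                cgGetErrorString(cgGetError()));
        return EXIT_FAILURE;
    }

    /* Hand the compiled program to the GL driver and enable it;
       in a real application this wraps the draw calls. */
    cgGLLoadProgram(prog);
    cgGLEnableProfile(profile);
    cgGLBindProgram(prog);

    /* ... draw calls would go here ... */

    cgGLDisableProfile(profile);
    cgDestroyProgram(prog);
    cgDestroyContext(ctx);
    return EXIT_SUCCESS;
}

The pitch, in other words, was that a developer wrote the shader once and the runtime quietly picked the strongest back-end; the catch was that only NVidia's own profiles were guaranteed first-class treatment, while everyone else got whatever the generic ARB profile could express.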
Expectations meet reality
Last summer ATI released the R300, a GPU with much greater speed and much more advanced capabilities than NVidia's flagship product of the time. At first it was assumed that NVidia's FX chip would trump the R300 soon after, but as it turned out NVidia would have no competitive chip for six months or more--and numerous questions remain about the FX's performance, availability, and driver support.
This gave ATI the opportunity to sell a lot of DX9 cards, and in fact to introduce a full line of them at various price points. ATI did not make great inroads in market share on a unit basis, but it gained significant ground among higher-end card buyers, who matter disproportionately to developers because they spend the most money on games. In fact, for any title shipping with DX9-specific features in the next few months, the only significant market will be ATI owners. The assumption of NVidia dominance has proven incorrect, at least for the next several product cycles.
Developers now have to take ATI owners into account or face an angry backlash from a large proportion of their most vocal audience (cf. Neverwinter Nights). ATI, in turn, can now afford to disdain Cg and urge developers toward the "IHV-neutral" "standard" platforms to ensure that ATI boards are supported properly.
If NVidia had counted on special optimizations from Cg back-ends to get optimal performance out of their future cards, they are now in a tenuous position. They have to sell Cg to developers even while the best available hardware for running next-generation shaders is made by a company that adamantly refuses to support Cg. And if Cg does not gain support among developers, NVidia may find its devices comparing less favorably under the other development platforms.