IMHO good developers adapt to new technologies/languages. I don't think people like John Carmack are using the same tools for Rage that they used to write Wolf3D.
Statistics dictate that most developers are not on the level of John Carmack. That's why most devs license engines and modify them.
Why would that happen? If the competitor has a better product and consumers buy that product, developers should target that, not Nvidia's inferior solution.
That's an idealized view of the world: that consumers will instantly choose the optimal product according to a set of criteria, and that developers will target it. In the real world, consumers buy products based on many criteria as well as persuasive marketing. In the real world, a company releasing a new, superior product does not (most of the time) instantly disrupt the installed base.
And, in the real world, time is money. So unless a product is truly disruptive, so revolutionary that it is a world apart, simply being a little bit better would not necessarily convince a developer to give up productive tools purely for performance.
Look at it this way: games are expensive and risky to develop. A developer wants to minimize development expense and maximize his market. To do that, he'll use off-the-shelf solutions when they solve his needs (as long as the costs are not too high), and he'll target the lowest common denominator.
What this means is that one or two quarters of strict ATI dominance cannot meaningfully transform the deployed base of cards, nor are they going to convince devs to just drop NVidia. If anything, the lower-performing card that is easier to develop on will be closer to what the average consumer actually owns.
Thus, I stand by my comments that in the short to medium term, ATI's cost advantage is not going to sway developers or alter how games are developed very much. If Nvidia surrenders market share for 1-2 years, things could look different. In the meantime, good developer relations, marketing, and sales pressure can act to minimize losses.
There is a big difference between what geeks/developers/enthusiasts like and what is good for the development business.
If the OpenCL abstraction layer is good enough you shouldn't care what the hardware looks like under the hood, right?
You could say the same for OpenGL and DirectX, yet ensuring optimal performance across GPUs often requires special code paths. Hell, NVidia and ATI ship special *driver* code paths (app detection) for some games. The reality is, the abstraction layer can only do so much. All RDBMSs support SQL, but the same query has different performance characteristics on different databases, such that almost all applications that target multiple databases end up with database-specific code anyway.
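To make that concrete, here's a minimal sketch of what "special code paths" tend to look like in OpenCL host code. Everything in it is an illustrative assumption, not a recommendation: the vendor-string matching and the particular workgroup sizes are placeholder guesses, and real tuning would depend on the actual kernels and hardware.

```c
/* A minimal sketch, assuming an OpenCL SDK is installed. The vendor
   strings and workgroup sizes below are illustrative guesses, not
   tuned values. The point is that even "portable" OpenCL host code
   tends to branch on the actual hardware underneath. */
#include <stdio.h>
#include <string.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    char vendor[256] = {0};

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL GPU found\n");
        return 1;
    }
    clGetDeviceInfo(device, CL_DEVICE_VENDOR, sizeof(vendor), vendor, NULL);

    /* Hypothetical tuning knob: good local work sizes depend on
       warp/wavefront width, local memory layout, and so on. */
    size_t local_size = 64;                  /* conservative default    */
    if (strstr(vendor, "NVIDIA"))
        local_size = 32;                     /* warp-sized groups       */
    else if (strstr(vendor, "AMD") || strstr(vendor, "Advanced Micro"))
        local_size = 64;                     /* wavefront-sized groups  */

    printf("Vendor: %s -> local work size %zu\n", vendor, local_size);
    /* In a real engine this branch multiplies: different kernels,
       different memory strategies, sometimes entirely different
       algorithms per vendor. */
    return 0;
}
```

And that's just one knob on one kernel; multiply it across an engine and you see why the abstraction layer alone doesn't make the hardware irrelevant.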