More Facts
Dave, Geoff B is wrong, as has already been corrected.
Guest, you're forgetting the 'profile' aspect of Cg and HLSL. That is the key to the compiler-vs.-language distinction.
Cg, the language, has the reserved keywords and syntactic rules to support primitive programs, full per-pixel object-oriented branching, programmable frame-buffer blending, application callbacks, etc.
The currently shipping "profiles" for the Cg compiler, however, support DX8 and OpenGL vertex and pixel shaders. The compiler source was released yesterday ( www.nvidia.com/developer ), so any IHV or ISV that wants something the current "profiles" don't support, such as displacement maps, can add it, although that one really belongs in the DX9 vertex profile.
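For reference, what those profiles actually compile is plain C-style shader code. A minimal Cg vertex program might look like the following sketch (the semantic names and the `modelViewProj` parameter are illustrative, not taken from the post):

```cg
// Minimal Cg vertex shader: transform the position and pass the color through.
// Compilable under a DX8-class vertex profile such as vs_1_1 / vp20.
void main(float4 position        : POSITION,
          float4 color           : COLOR,
          out float4 oPosition   : POSITION,
          out float4 oColor      : COLOR,
          uniform float4x4 modelViewProj)   // set by the application
{
    oPosition = mul(modelViewProj, position);
    oColor    = color;
}
```

The same source can target different profiles; the profile, not the language, decides which constructs (branching, texture ops, instruction counts) are actually allowed.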
The current shipping Cg compiler is sub-optimal for vertex shaders, but it beats 95% of coders at DX8/OpenGL pixel shaders (via nvparse). The version shipping in October is supposed to be much better at this.
Also, at Siggraph, 500 developers attended free Cg labs, where each had a computer and actually coded shaders using it. Probably 150 other developers from the game, tools, and off-line rendering communities worked hands-on with an earlier version of Cg. To think that NVIDIA came up with this behind closed doors with no feedback is crazy. Microsoft and NVIDIA had numerous developers give feedback, from syntax to general philosophy ( "make it like C!" ), for months. Just because it wasn't uploaded publicly does not mean that feedback wasn't accepted from many respected graphics companies for months.
Another point: programs can be compiled at runtime ( using D3DX in DX9, or NVIDIA's Cg compiler ) or off-line. I think 90% of developers will compile off-line, but runtime compilation is a nice way to assemble shaders on the fly for certain effects.
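The two paths could be sketched like this (assumes the NVIDIA Cg SDK is installed; `shaderSource` is a hypothetical buffer holding Cg text loaded by the application):

```c
/* Off-line, via the command-line compiler:
 *     cgc -profile vs_1_1 -entry main shader.cg -o shader.vs
 *
 * At runtime, via the Cg runtime's C API: */
#include <Cg/cg.h>

CGprogram compileAtRuntime(const char *shaderSource)
{
    CGcontext ctx = cgCreateContext();          /* one per application      */
    return cgCreateProgram(ctx, CG_SOURCE,      /* compile source text      */
                           shaderSource,
                           CG_PROFILE_VS_1_1,   /* target a DX8 vtx profile */
                           "main", NULL);       /* entry point, no args     */
}
```

The off-line route bakes the profile choice into your build; the runtime route lets the application pick a profile, or stitch shader fragments together, based on the hardware it finds.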
As far as tool integration goes, NVIDIA and tool-vendor employees busted their asses to get this ready and give it away to the community for free - and it works. Certainly this effort could be duplicated for other languages, and thanks to the initial work it will be much easier for others to do so.
It's not important that Cg become the only way to write high-level shaders for current and future hardware. The point is that Cg is a standard now, like it or not; it's here, it works, and it's improving.