It was rather monolithic in how it coupled the compiler to the Cg runtime. If you only needed the compiler ahead of time, it was fine.
But it meant huge downloads, and still does: if you want to see an example, you need to download the whole package and rely on a (long-time buggy) browser, and so on.
The NVIDIA demos from around 2000 were small, simple, and self-contained. They showed how to accomplish one specific task, and just that. Now you get huge, interconnected packages, which are quite a bit harder to analyze to see how they actually solve a task. And it's NVIDIA proprietary; it wasn't, some time ago.
And, um, no: Cg is well known to not work at all on pre-Radeon-9500 hardware (under OpenGL, that is), as there is no ps1.4 support (possibly there is now, I don't know).
It is just as well known for inserting extra swizzles that make tons of theoretically valid pixel shaders unnecessarily long, or for adding unneeded texture dependencies that make them unrunnable on 9500+ hardware. These things were long called bugs. It's still interesting to see how those bugs were free on NVIDIA hardware (and while swizzles and movs are not always free there, those were detected by all drivers and removed), yet made the shaders unrunnable on ATI hardware.
Either they are really bad at compiler writing, or, well... draw your own conclusions.
Fact is, Cg was for a long time unusable on any hardware other than NVIDIA's. But yes, during that time it was the only option besides "low-level shading".
The biggest fault of Cg is not actually the language, but the binding. Technically, Cg would never have needed to be more than:
[source]
struct CGShader; // defined below

CGShader compileShader(const std::string& shader, CGRequestedTarget target);

// with
struct CGShader {
    std::map<std::string, GLuint> variables; // parameter name -> location
    std::string compiledShader;              // compiled shader text
};
// or similar
[/source]
It would just have been a small .lib or .dll, plus an exe using the same code for offline compilation.
Everything beyond that is useless overhead, and especially in the beginning there was quite a bit more. I haven't looked at it recently, but I will do so now.