antlers4 said:
Cg was basically a marketing tool designed to encourage developers to write complex shaders--not in itself a very sinister goal. nVidia thought that their nV3x products would be unique in their ability to run complex shaders at acceptable performance levels, so having widespread adoption of Cg by developers would encourage gamers to buy nVidia cards to see the latest bells and whistles. I don't think nVidia had any agenda besides getting an HLSL out to developers and into gamers' mindshare sooner than either DX9 or OpenGL 2 would.
If Cg is a marketing tool designed to make people write longer shaders, then what the hell are OpenGL 2.0 and DirectX 9 HLSL? Are those also evil marketing ploys by 3DLabs and others to do the same thing?
I mean, are you seriously suggesting things happened like this:
Marketing Guy: We need a way to force people to write long shaders.
Marketing Guy #2: Let's get the tech guys to invent a language that makes writing long shaders easy.
Tech Guy: (talking to marketing guy) Ok, so you want us to invent a programming language and compiler so that we can lock developers into writing longer shaders and, hopefully, into our hardware. Ok, got it.
Tech Guy #2: Ok, I did what the marketing guy told me. I went and hacked up a C parser real quick, removed some stuff, and made a compiler for DX8, ARB, and our own NV30 extensions.
Marketing Guy: Thanks, now can we get developers to write NV30-only stuff, Muahahah!
In this scenario, the Cg concept was invented as a marketing strategy, and didn't come from engineers trying to come up with innovative ways to program ever more complex hardware.
You see, I would suggest that it happened more like this. NVidia's NV30 and ATI's R300 are the natural evolution of a trend towards more programmability. As CPUs/GPUs get more complex, the apps that you can write become more complex, and the details of the pipeline timings become too much for most assembler programmers to juggle.
The logical result of this trend is to design high level languages. I would suggest that when NVidia started designing the NV30, no HLSLs for real-time cards (OGL/DX) existed in the public marketplace. A group of software engineers got together and designed a language for programming more complex shaders. It was a prototype research project for a while, then finally got elevated in status, given a marketing trademark (Cg), and NVidia started evangelizing it.
It just so happens that other companies, 3DLabs and Microsoft, were also working on their own HLSLs, and there are probably more out there that we don't know about at this moment. None of them created their HLSL for some specific anti-competitive purpose. These languages are the natural result of general purpose programmable hardware coming into existence.
You are going to see MORE languages being invented in the next few years, not fewer. Programmers LOVE inventing new languages; that's why there are so many of them.
I myself think Pure Functional Programming fits GPUs better than C-like languages do, because a GPU can't modify external state during shader execution and can only return a final "result" at the end of the program, which nicely parallels a pure function. Thus, I think a pure-functional Lisp- or Scheme-style shader language would actually be better, and there are more optimizations you can do, because functional programming permits equational reasoning, much like symbolic Mathematica-like math packages (see the sketch below).
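To make that concrete, here's a toy sketch in Haskell standing in for a Lisp/Scheme-style pure shader language (the Vec3/diffuse names are invented for illustration, not any real shading API): the whole "shader" is one pure function from its inputs to a single result, and because nothing has side effects, algebraic rewrites are always safe.

```haskell
-- Toy sketch of a "shader as a pure function" (invented names; not a real
-- shading language). No external state is touched; the shader's only effect
-- is the value it returns, just like a GPU fragment program.

data Vec3 = Vec3 Float Float Float deriving Show

dot3 :: Vec3 -> Vec3 -> Float
dot3 (Vec3 ax ay az) (Vec3 bx by bz) = ax * bx + ay * by + az * bz

scale :: Float -> Vec3 -> Vec3
scale s (Vec3 x y z) = Vec3 (s * x) (s * y) (s * z)

-- The "fragment shader": normal, light direction, and albedo in, color out.
diffuse :: Vec3 -> Vec3 -> Vec3 -> Vec3
diffuse normal lightDir albedo =
  scale (max 0 (dot3 normal lightDir)) albedo

-- Equational reasoning: because scale is pure,
--   scale a (scale b v)  ==  scale (a * b) v
-- always holds, so a compiler can fold the two multiplies into one --
-- the same kind of algebraic simplification a Mathematica-style system does.

main :: IO ()
main = print (diffuse (Vec3 0 0 1) (Vec3 0 0 1) (Vec3 1 0.5 0.25))
```

That's all the example is meant to show: with no external state in play, those rewrites are always valid, which is where the extra optimization opportunities come from.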
Some other folks might think that stream-based languages might fit better. And still others might think concurrent languages are a better fit.
High level languages go beyond the small world of ATI vs NVidia, and the idea that engineers at NVidia are designing these things because they want to play anti-competitive games, and not because they want to build something technically cool, is, well, I think, absurd.
I've been in the industry for almost 2 decades now, and I have worked at several large "evil" corporations. The technical guys who worked in the engineering departments were just as geeky and clueless about marketing as the average d00d on, say, Slashdot. Most of them had huge egos and cared more about beating other engineers, or about doing something technically or insanely great to get some R-E-S-P-E-C-T, and had little concern for grand conspiracy theories.
I mean, do you think OpenGL Guy is sitting around thinking about ways he can use OpenGL drivers to disadvantage NVidia, or ways they can do political maneuvering at the ARB to hurt NVidia? Or do you think he is more concerned with optimizing OGL driver code, inventing the next cool OpenGL extension, or fixing bugs because the IHVs/ISVs are breathing down his neck with nasty emails? And if he does invent a cool new OpenGL extension that NVidia hardware can't implement, and he lobbies developers to use it, is he deliberately trying to be anti-competitive, OR is he just trying to get people to use the cool work he has done?
I think all too often, people impute malice and ill-will where there was none.