I'm still waiting to see how all of this (GLSL approach) has in fact practically benefitted the consumer. Get your own friggin clue.
Well, since OpenGL 1.5 was just ratified and it generally takes about 18 months to design a new game engine, perhaps you should be a little more patient. Many people have been waiting just as long for PS2.0, or the D3DX effects framework, to practically benefit them.
Because it was a piece of crap 1+ years ago? Or wasn't HLSL compiling BETTER (performance and bugs) than Cg exactly what happened? I could've sworn that the Cg compiler was a big sore spot of the whole effort...
The first versions of MS's FXC also produced poor code. Not as bad as Cg BETA, but poor nonetheless. Of course, you're still missing the essential fact that the biggest part of the compiler is in the drivers, which convert PS2.0 token streams into native code.
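To make that concrete: with GLSL there isn't even an offline token stream. The application hands the high-level source straight to the driver, and the driver's compiler generates native code at runtime. A minimal sketch in C, assuming the ARB_shader_objects / ARB_fragment_shader entry points are available (error checks omitted; the shader and the function name are just placeholders):

    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Trivial placeholder fragment shader; the point is who compiles it. */
    static const char *fragSrc =
        "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }";

    GLhandleARB buildProgram(void)
    {
        GLhandleARB shader  = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
        GLhandleARB program = glCreateProgramObjectARB();

        glShaderSourceARB(shader, 1, &fragSrc, NULL);
        glCompileShaderARB(shader);   /* the driver's compiler runs right here */

        glAttachObjectARB(program, shader);
        glLinkProgramARB(program);    /* driver generates and links native GPU code */
        return program;
    }

So when the driver's compiler improves, every GLSL application picks up the improvement without being repatched.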
So in any situation...you still need to download "a patch." And of course, with "compilers built into drivers" that bring "across-the-board increases / fixes," you seem to have glossed over the fact that they can BREAK apps that happened to rely on the older functionality.
Yay...id tells me to download the latest driver, which does wonders for Doom3...but breaks whatever other game someone decides to actually code in GL.
None of your arguments have anything to do with GLSL specifically. The problems you cite apply generally to the design of operating systems today.
Yes, you need to download a single patch instead of N patches. Sure, dynamically linked code is a central point of failure which can affect multiple applications, but this is no different from any other piece of code in modern software development. All operating systems have moved to shared libraries, shared components, and shared drivers. Why? Because the benefits outweigh the problems.
Here's an example of a problem with static linking. In the past, a majority of applications statically linked in functions to look up domain names, many from the libresolv DNS client library. In the mid-90s, I discovered a buffer overflow attack in this library which allowed intruders to seize control of just about ANY internet-enabled application (ftp, browsers, etc). Got my picture in the NYT, WSJ, WAPO. After the bug was fixed, *THOUSANDS* of applications that used libresolv had to be patched and fixed on millions of systems. Unlike when Microsoft issues a "hotfix" today and patches a single DLL or EXE, this affected almost every internet-enabled component and required recompiling everything.
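As a toy illustration (ordinary lookup code, not the vulnerable routine itself): if the program below is linked against a shared resolver library, dropping a fixed library onto the system fixes every program that loads it; if the resolver is linked in statically, the broken code is baked into the binary, which has to be rebuilt and redistributed.

    #include <stdio.h>
    #include <netdb.h>

    int main(void)
    {
        /* The lookup path runs resolver library code. Whether that code
           lives in a shared library or inside this binary is a link-time
           decision, and that decision determines who has to ship the fix. */
        struct hostent *h = gethostbyname("example.com");
        if (h == NULL) {
            fprintf(stderr, "lookup failed\n");
            return 1;
        }
        printf("resolved %s\n", h->h_name);
        return 0;
    }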
Yes, sharing code between applications means the shared code becomes a single point of failure. But it is also a single point of benefit. That's why today's operating systems have abstract device driver systems, shared components, processes, services, and libraries.
There are performance benefits to static linking (unless you have a runtime compiler), and some apps will resort to static linking if they want a specific older version of a library (but today, most OSes support versioning, so even that is starting to decline).
I'm also the one who has actually seen and touched a few games that have been released / compiled with HLSL shaders. Still waiting for those GLSL titles...for all its superiority...
Well, wait all you want. I'll wait for a clean API. OpenGL is in decline because of Microsoft's marketing muscle, so even after OGL2.0 ships and driver support is available, don't expect a plethora of games. This has less to do with HLSL vs GLSL and more to do with general OGL vs DX developer issues, Microsoft support (Microsoft *refuses* to update OpenGL32.dll for the new >1.2 bindings), and the fact that MS development tools and support push DX.
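To spell out what that frozen OpenGL32.dll means in practice, here's a rough sketch in C of how a Windows GL app has to fetch the newer entry points (including the GLSL ones) from the driver at runtime. It assumes a current GL context and the PFNGL* typedefs from glext.h:

    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* OpenGL32.dll itself only exports the old core functions, so the
       GLSL entry points have to come from the vendor's ICD at runtime.
       wglGetProcAddress returns NULL if the driver doesn't expose them. */
    static PFNGLCREATESHADEROBJECTARBPROC pglCreateShaderObjectARB;
    static PFNGLCOMPILESHADERARBPROC      pglCompileShaderARB;

    void loadGlslEntryPoints(void)
    {
        pglCreateShaderObjectARB = (PFNGLCREATESHADEROBJECTARBPROC)
            wglGetProcAddress("glCreateShaderObjectARB");
        pglCompileShaderARB = (PFNGLCOMPILESHADERARBPROC)
            wglGetProcAddress("glCompileShaderARB");
    }

Other platforms just update their GL library; on Windows you carry this boilerplate (or a loader library) around because the system DLL never moves forward.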
But I decline to get into a discussion on the merits of the Microsoft monopoly and their attempts to destroy OpenGL over the years.
I'll just note the results of the informal poll above.