WaltC said:
In my book I prefer DX9 because it doesn't support custom extensions, which I think are what has tremendously slowed the progress of the mythical "ARB committee"...Heh...
Isn't PS1.4 a custom extension for ATI? And what about DX9.0a/b/c/d/e/f/g...? Why so many of them? It almost seems like they are trying to match each new piece of hardware that comes along from NVIDIA or ATI.
Also, don't forget that GL is not games-only. There must be an ARB to decide what's best for everyone, and that includes CAD software as well as games.
WaltC said:
But DX I think delivers the more consistent approach of the two which I think is a better platform for proceeding with "the future of 3d" than is OpenGL with its wild, "old-west" extensions approach. Of course, being somewhat "cross platform" (although that phrase has nowhere near the meaning it had a few years ago, I think), OpenGL as a 3d API has just about got to be different from DX in these respects.
I also find the subject of extensions funny. If Doom3 demanded a specific version of GL, say 1.5, it would not require any extensions to run, exactly like any other game demanding DX9.0.
What is the difference between checking caps bits and checking for extensions? There is only one real problem with extensions: sometimes each vendor creates its own extension to do the same thing. But that doesn't happen that often (nice to see NVIDIA using ATI's extensions...).
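To make the comparison concrete, here's a rough sketch of both checks side by side. The D3D9 and GL calls are the real ones, but everything around them is assumed: you'd already have a created IDirect3D9* and a current GL context, and the particular cap and extension names I test are just examples I picked, not anything from a specific game.

[code]
// Sketch only: comparing a D3D9 caps-bit check with a GL extension-string check.
#include <windows.h>
#include <cstring>
#include <d3d9.h>
#include <GL/gl.h>

// Direct3D 9: fill a caps structure and test a field/bit.
bool HasPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}

// OpenGL: scan the space-separated extension string for a whole-token match
// (a bare strstr can be fooled by names that are prefixes of longer names).
bool HasGLExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!all)
        return false;
    const size_t len = std::strlen(name);
    for (const char* p = all; (p = std::strstr(p, name)) != NULL; p += len) {
        const bool startOk = (p == all) || (p[-1] == ' ');
        const bool endOk   = (p[len] == ' ') || (p[len] == '\0');
        if (startOk && endOk)
            return true;
    }
    return false;
}

// Usage is symmetrical either way:
//   if (HasPS20(d3d)) { /* use a ps_2_0 path */ }
//   if (HasGLExtension("GL_ATI_fragment_shader")) { /* PS1.4-class path */ }
[/code]

Either way the app asks the driver "can you do X?" and picks a code path. The only structural difference is that DX enumerates the questions for you, while GL lets the vendors add new ones.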
WaltC said:
ARB is just slow as old man Christmas, but considering the different applications for OpenGL as opposed to DX, the glacial progress of a formal OpenGL API structure has never seemed that big a deal to me because it could always be offset somewhat or completely through extensions. I'm very ambivalent about extensions. When Carmack first championed them I supported the idea, but in succeeding years have grown increasingly less fond of the concept in a 3d API.
Isn't DX10 supposed to have a formal extension mechanism?! If DX9 had one, developers could probably be using NV's depth bounds test by now...
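For reference, this is roughly what using NV's depth bounds already looks like in GL through the EXT_depth_bounds_test extension (NVIDIA-authored). Again just a sketch: it assumes a current GL context on Windows and that the extension check above succeeded; the bounds values are arbitrary.

[code]
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>   // GL_DEPTH_BOUNDS_TEST_EXT, PFNGLDEPTHBOUNDSEXTPROC

void EnableDepthBounds()
{
    // Extension entry points are fetched at runtime (wgl shown; glX is analogous).
    PFNGLDEPTHBOUNDSEXTPROC glDepthBoundsEXT =
        reinterpret_cast<PFNGLDEPTHBOUNDSEXTPROC>(wglGetProcAddress("glDepthBoundsEXT"));
    if (!glDepthBoundsEXT)
        return;

    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
    // Fragments are discarded where the depth already stored in the buffer
    // falls outside [0.1, 0.9] -- handy for culling stencil shadow volume fill.
    glDepthBoundsEXT(0.1, 0.9);
}
[/code]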
If super buffers are ratified by the end of the year, I think GL2.0 + super buffers will surpass DX9 (without the annoying SM version thingy).
And GL2.0 + super buffers + a topology processor would even equal DX10, right? That doesn't seem "slow as old man Christmas" to me...