Sigma said:
Isn't PS1.4 a custom ATI extension?
No. It's part of the DX API, so any developer or IHV who wants to can support things like ps2.0b, etc. A custom extension is something not formally supported by the API, but which a developer may electively support--take nVidia's "Ultra-Shadow" feature. Although unsupported in either API, nVidia can write a custom OpenGL extension for it in its drivers, and any game developer who chooses to can support that OpenGL extension in his game. (In DX games, it could be supported directly by the game engine *outside* of the DX API.)
However, the catch is that a developer has to decide if the work involved in supporting the custom extension is worthwhile, since it pertains to only a specific IHV's products, and often only a limited subset of that IHV's total 3d-product offerings. Sometimes, too, an IHV will make loud PR noises about a custom feature that it turns out the IHV doesn't actually support all that well in its drivers (if at all--the old "it's there but currently unexposed in the drivers" ploy...).
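For the curious, detecting a vendor extension at runtime looks something like this under classic OpenGL 1.x (a minimal sketch; the helper name is mine). nVidia's Ultra-Shadow depth-bounds test, for instance, is advertised through the GL_EXT_depth_bounds_test extension string:

```cpp
#include <GL/gl.h>
#include <cstring>

// Classic OpenGL 1.x extension check: the driver advertises everything it
// supports as one big space-separated string. Helper name is illustrative.
bool hasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!all) return false;  // no current GL context
    const size_t len = std::strlen(name);
    for (const char* p = all; (p = std::strstr(p, name)) != nullptr; p += len) {
        const bool starts = (p == all) || (p[-1] == ' ');
        const bool ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends) return true;  // whole-token match only
    }
    return false;
}

// e.g. the depth-bounds test behind Ultra-Shadow:
// if (hasExtension("GL_EXT_depth_bounds_test")) { /* enable fast shadow path */ }
```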
By restricting things to the API and avoiding an extension mechanism, developers need only write to the API, in whatever way they choose within the latitude provided by the version they target, and any IHV selling hardware advertised as compliant with that same API version *should* be able to run their software with a minimum of effort expended by the developer.
The purpose of APIs is to benefit 3d game developers directly by enormously simplifying 3d programming--sparing them from otherwise having to write separate 3d engines to support the hardware and drivers produced by multiple IHVs. Had 3dfx not instituted GLIDE, as an example, sales of its chips would have been greatly constrained by a lack of developer interest, which would have spelled a distinct lack of 3dfx-compatible 3d games. The absence of GLIDE as an API would have set back the clock on 3d by at least 3-5 years--if, that is, 3d as we know it today would ever have left the starting blocks at all. The 3d API is foundational bedrock for the 3d gaming industry.
Extensions needlessly complicate the process and may add considerable work at the developer level--as we saw with NWN, where Bioware's engine supported nVidia's OpenGL extensions but not ATi's. Thus at first things like "shiny water" wouldn't render on a 9700P, but rendered as expected on nV25. The situation wasn't remedied until Bioware changed its engine to support the ATi OpenGL extensions needed on R300 hardware to render "shiny water," which were different from the nVidia OpenGL extensions required for the same effect. The NWN case is a classic example of the kinds of needless problems and extra work the OpenGL extension mechanism can cause a developer.
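A rough sketch of the kind of branching this forces on an engine, reusing the hasExtension() helper from above. The extension names are real NV20-era and R200/R300-era extensions, but the path selection is purely illustrative--not Bioware's actual code:

```cpp
// Illustrative only -- not Bioware's code. The same "shiny water" effect
// needs a different vendor extension on each IHV's hardware.
enum class WaterPath { Flat, NvTextureShader, AtiFragmentShader };

WaterPath pickShinyWaterPath()
{
    // nVidia path: NV20/NV25-era texture shaders plus register combiners.
    if (hasExtension("GL_NV_texture_shader") &&
        hasExtension("GL_NV_register_combiners"))
        return WaterPath::NvTextureShader;

    // ATi path: the R200/R300 line exposes fragment shading through a
    // different, incompatible extension, so it needs its own engine path.
    if (hasExtension("GL_ATI_fragment_shader"))
        return WaterPath::AtiFragmentShader;

    return WaterPath::Flat;  // neither extension present: plain water
}
```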
Sigma said:
What about DX9.0a/b/c/d/e/f/g...? Why so many of them? It almost seems that they are trying to match each piece of hardware that comes along from NVIDIA or ATI.
The purpose of API development is to work *with* hardware IHVs (certainly not to control or dictate to them) to include formal API support for their newer hardware standards. Letter changes in DX API version schemes denote minor inclusions of newer capabilities within the formal structure of the API. They are not directly IHV-specific inclusions, and certainly not extensions, but direct inclusions within, and expansions of, the API. So any developer can write his game to support a specific version of DX9, because any IHV's 3d products advertised as compliant with that version of the API should run the game with little if any problem.
The difference is that had the hardware functionality required for NWN's shiny water been part of the formal structure of the OpenGL API, all Bioware would have had to do is write in shiny-water support against the API, and there'd have been no need to support multiple extensions from multiple IHVs (it would then be up to the IHVs to support the API functionality in their drivers); the engine's shiny water would render on everyone's hardware capable of rendering it. The difference seems fairly clear to me--see the sketch below.
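To make the contrast concrete, a hypothetical sketch of the single code path a core-API feature would allow--one version check instead of per-vendor extension branches. The 1.5 cutoff is an assumption for illustration only:

```cpp
#include <GL/gl.h>
#include <cstdio>

// Hypothetical contrast: with the feature in the core API, a single version
// check replaces all the per-vendor extension branches above.
bool coreSupportsShinyWater()
{
    // GL_VERSION begins "major.minor", e.g. "1.5.2 NVIDIA 61.77".
    const char* ver = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    int major = 0, minor = 0;
    if (!ver || std::sscanf(ver, "%d.%d", &major, &minor) != 2) return false;
    return major > 1 || (major == 1 && minor >= 5);  // assumed core version
}
```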
Sigma said:
Also don't forget that GL is not games only. There must be an ARB to decide what is best for everyone, and that includes CAD software as well as games.
I think I mentioned that (although not CAD specifically), including cross-platform necessities. OpenGL is much less specialized than DX, serving different purposes, and so therefore has to be different in some respects. But it's 3d-gaming comparisons between the APIs that I think we're talking about here.
Sigma said:
I also find the subject of extensions funny. If Doom3 demanded a version of GL, say 1.5, it would not require any extension to run. Exactly like any other game demanding DX9.0.
But of course--the problem is that, apparently, Doom3 doesn't do that: it leans on vendor extensions as well...
The other side of the problem is that nVidia and ATi then need to write drivers which maintain a certain set of extensions in order to run the older OpenGL games whose engines expect those extensions, while also writing newer OpenGL drivers supporting the newer API core functionality for newer games. This is, I think, markedly different from DX driver development, wherein each succeeding API version contains compatibility with all of the older versions as an inherent subset.
Sigma said:
What is the difference between checking caps bits and checking for extensions? There is only one problem with extensions: sometimes each IHV will create its own extension to do the same thing. But that does not happen very often (nice to see NVIDIA using ATI's extensions...).
Cap bits are defined within the version of the API; extensions are not. The game developer doesn't really *have* to check the driver's cap bits, since, if an IHV's DX drivers are advertised as compliant with a specific version of the API, the developer assumes the API-version cap-bit support is there. It's the IHV's job to ensure that its drivers support the cap bits the API version requires. Extensions, on the other hand, lack any formal cross-IHV, trans-API-version structure.
Problems can materialize for a developer when, say, he writes his engine to require a specific version of the DX API and then discovers that an IHV's hardware *won't run* his engine because it supports only parts of that API version, even though the IHV advertises its products as 100% compatible with the version the developer is targeting. That's when he needs to start looking at cap bits and so on to decipher what in the API version the IHV actually supports and what it doesn't, so that he can change his engine to accommodate the holes in the IHV's support.
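Concretely, that fallback inspection looks something like this under D3D9 (a minimal sketch; the helper name is mine):

```cpp
#include <d3d9.h>

// Minimal D3D9 sketch: rather than trusting the "DX9-compliant" label,
// read the caps back and confirm the device really exposes ps_2_0
// before committing to a ps_2_0 render path. Helper name is illustrative.
bool supportsPs20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // PixelShaderVersion packs major/minor into one DWORD; compare tokens.
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
```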
This superficially resembles a problem developers face with OpenGL extension support, but it is actually not the same thing at all. There are no extension requirements for an IHV--the whole notion of extensions is geared to custom IHV hardware support *outside* of the API. DX-version cap-bit support, however, is geared toward both the developer and the IHV knowing in advance what is required for functionality support.
Sigma said:
Isn't DX10 supposed to have a formal extension mechanism?! If it had one, DX9 developers could probably be using NV's depth bounds by now...
I can't imagine why they'd want to introduce inside of DX the same problems OpenGL now has with extensions (go back to the NWN example). Why degrade your API that way, when instead you can keep working with IHVs to advance it in a formal, universally supportable fashion? Certainly that's got to be a lot better for developers, I would think. I also consider it unlikely because M$ has always had the option of permitting extension support within the API, and has declined to do so precisely to avoid NWN-like scenarios. If, however, M$ intends DX10 to be the last DX version for a long, long time, then perhaps it might make more sense.
To turn it around on you a bit--what prevented nVidia from preparing some nice OpenGL-extension demos of its ps3.0 functionality in nV40? No need to "wait" on DX9.0c at all, right? But it seems like that's just what nVidia has done--wait on DX9.0c. Perhaps nVidia doesn't want to expose it through an extension, but rather through a new set of OpenGL drivers to support upcoming versions of the OpenGL API? I'd consider that likely--which tells me they didn't think supporting it through current OpenGL extensions would have been worth the effort. (I could get cynical and say that they didn't bother with an sm3.0-type OpenGL extension because there really isn't much in nV40 in the way of actual SM3.0 support, and that "waiting" on DX9.0c was a stalling tactic to give them time to do other optimizations they could claim to be SM3.0-related...but I won't...)
Sigma said:
If the super buffers are ratified at the end of the year, GL2.0 + super buffers will surpass DX9, I think (without the annoying SM versions thingy). And GL2.0 + super buffers + topology processor would even equal DX10, right? That doesn't seem "slow as old man Christmas" to me...
Well, not a lot of merit in comparing what is (DX9) to what *may be* (DX10 & ogl 2.0), is there?...
Again, yes, OpenGL core API structuring has been as slow as Christmas, no doubt about it, in comparison to D3d. But that isn't necessarily a criticism, considering the different purposes the OpenGL 3d API is intended to support as contrasted to D3d, as I mentioned in my initial post.