Xmas said:
Outside of the DX API? How do you imagine this to work?
I related to Z that I thought I had read somewhere that Splinter Cell was a D3d game which supported UltraShadow through an nVx engine-specific custom code path--though, as I told him, I could well be mistaken in that recollection. That's the only reason I added that bit in parentheses...
But your comments and Z's do underscore the basic structural difference between the APIs: whereas OpenGL currently permits IHV extensions unsupported by the API itself, D3d does not.
How is that different from features in the DX spec that are only supported by one IHV? Loud PR noises? Remember TruForm, RT patches, displacement mapping?
Thanks for bringing up TruForm....
Was it an ATi OpenGL-only feature? (I ask because I've never really used it.) Your statement implies that although TruForm was unsupported by D3d, it could nonetheless be supported by ATi's D3d drivers through game-engine-specific support--similar to how FSAA was originally supported relative to the APIs as implemented by 3dfx. That would lead me to think a D3d game engine could support something like nVx UltraShadow without direct support for it in the D3d API.
Anyway, it's a bit hard to follow the general "how is OpenGL different from D3d?" questions I've been getting, since I would assume that the differences are obvious to the point of being self-explanatory. And judging by the disparity in the numbers of game developers who elect to use/develop D3d engines as opposed to OpenGL engines, I should think it would be obvious that developers plainly see a difference between the APIs, from their point of view.
By "loud PR noises" I am referring to cases similar to the launch of nV30 wherein nVidia's PR talked incessantly about fp32 being superior to fp24 under DX9, but nV30's initial drivers (when the product "shipped" to reviewers months later) included support only for fp16 sans fp32--and fp16 support was only added later (post nV30's paper launch in '02), I believe, to the DX9 spec as pp.
Practically speaking, as we all know, fp32 was unworkable for D3d game development targeting nV3x, but it was only towards the end of '03 that nVidia finally came "clean" about that, at least to some extent, and started pushing fp16 like there was no tomorrow...
Developers always knew better, however, and didn't try fp32 for their nV3x-targeted game engines. (They didn't target fp24 precision levels either, but that doesn't change the fact that fp24 was workable on R3x0 *had developers chosen to support it*, whereas fp32 on nV3x was not--it was entirely too slow.)
OpenGL developers can do just the same. They can choose to do more.
Exactly the point. Why should they "choose to do more" work for an OpenGL engine--supporting multiple vendor-specific extensions just to get the *same engine functionality* (going back to NWN's shiny water again)--when that extra work isn't necessary under D3d to obtain the same results? Only because of the differences in how the APIs are structured, imo.
Why is that a problem? John Carmack apparently saw no problem in that at all.
Isn't this obvious?...
Since he voluntarily constrains himself to OpenGL as his exclusive 3d API (unlike companies like Epic, which use both APIs--although Epic is "officially" only D3d), he is prepared to "choose to do more work" in maintaining multiple OpenGL code paths and multiple vendor-specific extensions, because it is his personal choice to do so. If you read between the lines in his various commentaries, however, you can see that he pines for the day when his ARB2 path will be the only coding he has to do...
Can't really blame him for that, right?
Since most new core functionality is basically unchanged former extension functionality, driver developers have to write almost no extra code at all. All they have to do is to provide the same API entry points with two different names (and many extensions only define a few new constants, which means no additional work at all).
Heh...
Why are you confusing functionality with the code supporting that functionality?
Hopefully, what the OpenGL 1.5-2.0 specs do is make functionality addressable through the API in a single, unified way. That shifts the burden from the game developer to the IHV, who must make sure the appropriate functionality in its hardware can be reached through the unified interface that developers write their engines against.
OTOH, if you take all of the vendor-specific extensions which now exist and throw them, as is, into a large messy stew, and simply call that approach "core functionality"--it doesn't seem to me you've accomplished anything...
I would think that moving hardware feature support into the API core is precisely *doing away* with the current situation in which such functionality can only be supported through vendor-specific extensions. But then you've got the demon of "backwards compatibility" to consider. Old game engines are going to expect very specific things from vendor-specific extensions (not the least of which is that they be present in the IHV drivers), and chances are those expectations won't jibe with the new specs for OpenGL core API functions. So, for a time at least, IHV OpenGL drivers will have to support both--or so it looks to me.
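Just to put that "same entry point, two different names" business into concrete terms, here's a rough sketch (C++, Windows-style entry-point loading; nobody's actual shipping code) of how a promoted extension looks from the application side, using ARB_vertex_buffer_object--which became core functionality in OpenGL 1.5--as the example. It assumes a GL context is already current:

[code]
// Sketch only: GL_ARB_vertex_buffer_object (the extension) versus the same
// functionality promoted to core in OpenGL 1.5. Assumes a current GL context.
#include <windows.h>
#include <GL/gl.h>

// The extension token and the core token even share the same numeric value
// (values as listed in glext.h).
#define GL_ARRAY_BUFFER_ARB 0x8892   // ARB_vertex_buffer_object name
#define GL_ARRAY_BUFFER     0x8892   // same thing, core name in GL 1.5

typedef void (APIENTRY *PFNGLBINDBUFFERPROC)(GLenum target, GLuint buffer);

// Returns a usable glBindBuffer pointer, preferring the core name.
// On a GL 1.5 driver both names resolve to the same driver code; an older
// driver might expose only the ARB name.
PFNGLBINDBUFFERPROC GetBindBuffer()
{
    PFNGLBINDBUFFERPROC fn =
        (PFNGLBINDBUFFERPROC)wglGetProcAddress("glBindBuffer");         // core, GL 1.5
    if (!fn)
        fn = (PFNGLBINDBUFFERPROC)wglGetProcAddress("glBindBufferARB"); // extension
    return fn;
}
[/code]

From the driver's side that really is just a second name on the same code; from the developer's side, though, the core name is the one you can count on being there across IHVs once a driver claims GL 1.5.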
Let's go back to the NWN shiny water thing again to illustrate. Had the functionality behind "shiny water" been part of the core OpenGL API spec at the time Bioware wrote its engine, then Bioware could have written its shiny-water support once, in one way, and the hardware support would have been there for all hardware truly compliant with that OpenGL spec, right? But as it was, Bioware had to implement vendor-specific extension support in its engine, with the result that shiny water initially worked only on nV3x--obviously, given the nature of "vendor-specific extensions," you would not expect ATi's and nVidia's extensions to be the same.
And indeed they were not, and so shiny water worked on nVidia but not on ATi. The deficiency, then, resided exclusively in the Bioware engine, and not, of course, in the ATi hardware. It took Bioware a few months to add *NWN engine support* for the ATi extensions required for shiny water, and the upshot is that supporting shiny water through the mechanism of OpenGL extensions took a lot more work and a lot more time than it would have had the required functionality been inside the core API from the beginning.
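I'm obviously not privy to Bioware's source, but the general shape of the problem looks something like the sketch below--the extension names here (NV_register_combiners, ATI_fragment_shader) are just illustrative stand-ins, not necessarily what NWN actually used for its water:

[code]
// Illustrative only: how an OpenGL engine ends up with per-vendor render
// paths for the same effect. Assumes a GL context is current.
#include <cstring>
#include <windows.h>
#include <GL/gl.h>

// Crude substring search of the extension string -- typical of the era,
// though strictly speaking it can match prefixes of longer names.
bool HasExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts != NULL && std::strstr(exts, name) != NULL;
}

enum WaterPath { WATER_NONE, WATER_NV, WATER_ATI };

// The engine has to pick a vendor path at startup; each path then needs its
// own setup, its own shader/combiner programs, its own maintenance.
WaterPath ChooseShinyWaterPath()
{
    if (HasExtension("GL_NV_register_combiners"))
        return WATER_NV;    // nVidia-specific path: shiny water on nV hardware
    if (HasExtension("GL_ATI_fragment_shader"))
        return WATER_ATI;   // ATi-specific path: the one added months later
    return WATER_NONE;      // no vendor path found: plain, non-shiny water
}
[/code]

Two code paths, two sets of bugs, two test matrices--for one visual effect. That's the extra work I'm talking about.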
The DX specs define some minimum capabilities for "compliancy" with a certain version; however, these minimum caps are quite low.
You can't write a modern game engine based on these minimum caps.
I think you're being a tad rationalist here...
By "minimum compliancy" I can only think that you mean "backwards compatability" with earlier DX versions. For instance, I've got several Dx5/6/7/8 games that run just fine under my DX9.0b installation, under my DX9-compliant 3d hardware and drivers. If by "modern game engine" you mean a game engine requiring functionality support exclusive to DX9, then of course it's true that a DX9 game won't run if it only supports cap bits relative to DX6...
But then, it would not be a "modern game engine" in the first place, would it--it would be a DX6 game engine, right...? There's nothing within DX9, however, that would prevent a developer who wanted to from writing a game restricted to DX6-level cap bits, is there?
Again, the difference between successive DX versions and OpenGL extensions is that under DX everything stays uniform from version to version as a standard function of the core API, for all IHVs, which simplifies things greatly for game developers. Under the OpenGL extensions mechanism, support for IHV hardware is exactly the opposite of uniform and predictable. And as I see it, that is the problem with an extensions mechanism in the first place, from a developer's point of view.
Extensions, otoh, lack any formal cross-IHV, trans-API-version structure.
Extensions differ by nature from one IHV to the next, even when they are written to support the same basic hardware functionality. This is not true of DX, where it's up to the IHV to support the API function (and not up to the developer to directly support the IHV's custom extension). That's what I mean by "cross-IHV." Extensions also have little or nothing to do with the API version, whereas each successive version of the DX API contains, as a subset, formal support for all previous versions of the API--which is what I mean by "trans-API version."
There's just no way to get around the fact that vendor-specific extensions for hardware function support are very different from formal function support in the core API. I mean, it seems like night & day to me...
"100% compatible" does not mean it supports all features. It hardly means anything. There is no card that ever supported all features of any given DX version.
I disagree--I think it means everything...
Of course, I'm assuming that the cap bits checked actually relate to function support in the drivers--otherwise, as you say, checking cap bits that are non-functional would be no different from checking extensions that don't do anything...
I do agree that thinking about it by way of "cap bits present" is a very superficial and incomplete analysis. But the one thing we can be sure of is that when a cap bit is absent, so is the functionality. DX has an expected list of cap bits pertaining to each version of the API, though--there's no comparable expectation for OpenGL extensions, which is my point, since the core API at present would be content with none of them.
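For comparison's sake, here's roughly what the caps-checking side looks like under D3d--one query, identical on every IHV's driver, with only the reported values differing (D3D9, pixel shader version used as the example cap; device creation and error handling omitted):

[code]
// Rough sketch of uniform D3D9 caps checking. The same query works the same
// way against any IHV's driver; only the values it reports differ.
#include <d3d9.h>

// Returns true if the default adapter's HAL device reports at least PS 2.0.
bool AdapterSupportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;                                    // couldn't query caps
    // One check that reads identically for ATi, nVidia, or anyone else.
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
[/code]

If the bit (or version) isn't there, you fall back or refuse to run--but you're always asking the API the same question in the same way, regardless of whose hardware is underneath.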
Relying on anything other than the minimum caps for a certain DX version without caps checking is a bug. It's no different from OpenGL extension checking.
The difference, of course, is that under DX it's the *API* which expects the cap bits first, and then the game engine, based on how it is written. With OpenGL extensions, it's only the game engine expecting the extensions; the API doesn't enter the picture...
We keep coming back to this core difference, don't we? Last time I checked there's a big difference between writing a game engine to an API, and writing it to support custom vendor-specific extensions.
Isn't DX10 supposed to have a formal extension mechanism?! If it had one, DX9 developers could probably be using NV's depth bounds by now... WGF is intended to stay for a long, long time. Extensions are a necessity here.
Extensions are only a "necessity" when core API function support hardly changes over time--as has been true of OpenGL to date. However, this is somewhat circular, because it can be convincingly argued that the *reason* OpenGL's core function support has been so slow to evolve is precisely its tolerance for vendor-specific extensions. That might also be why game developers overwhelmingly choose the non-extensions approach of DX over the extensions approach of OpenGL.
What I meant is that *extensions* might be rational if DX10 were going to be the last DX version M$ would establish until, say, 2015, or something like that...
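Incidentally, the depth bounds test mentioned above is a good illustration of just how small these things are on the OpenGL side: one new enable and one new entry point via the EXT_depth_bounds_test extension. A minimal sketch, assuming the extension string has already been checked and a context is current (token value as listed in glext.h):

[code]
// Minimal sketch: NV's depth bounds test through the EXT_depth_bounds_test
// OpenGL extension. Assumes a current context and a prior extension check.
#include <windows.h>
#include <GL/gl.h>

#define GL_DEPTH_BOUNDS_TEST_EXT 0x8890   // token value from glext.h

typedef void (APIENTRY *PFNGLDEPTHBOUNDSEXTPROC)(GLclampd zmin, GLclampd zmax);

// Restrict rasterization (e.g., of shadow volumes) to a depth range.
void EnableDepthBounds(double zmin, double zmax)
{
    PFNGLDEPTHBOUNDSEXTPROC glDepthBoundsEXT =
        (PFNGLDEPTHBOUNDSEXTPROC)wglGetProcAddress("glDepthBoundsEXT");
    if (!glDepthBoundsEXT)
        return;                              // extension not exposed: skip it
    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);      // one new enable...
    glDepthBoundsEXT(zmin, zmax);            // ...and one new entry point
}
[/code]

Under D3d, by contrast, there's simply no sanctioned way to get at that hardware feature until (and unless) it shows up in a later version of the API.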
Yes?.....
Which of these was written by nVidia and accompanied nV40 when it went out on the review circuit earlier this year at launch, when nVidia was telling us all about the splendors of SM3.0....? More importantly, which of these demonstrates functionality (or distinct and clear advantages) impossible with SM2.0 and only possible with SM3.0? Heh...
All I heard at the time was that it was necessary to wait for DX9.0c, and that until then we'd have to take nVidia's word on the subject of nV4x's SM3.0 implementation.
In closing, I'd like to state that nothing I've written here represents an "attack" on the OpenGL 3d API. Not even close...
I'm simply trying to frame an overview of just why it might be that the great majority of developers eschew OpenGL for their 3d-game-engine development and choose DX instead, despite the assumed "cross-platform" advantages OpenGL ostensibly offers them. I'm not manufacturing information when I say that developers clearly prefer DX, and that DX9 represents to many of them a kind of "watershed" in the evolution of the D3d API. I'm simply trying to understand why they choose DX over OpenGL so consistently...