Is Doom 3's release going to change the OpenGL community?

Chalnoth said:
I never thought I'd hear anybody say they thought GL2 was being rushed....

With regard to building the right API, I think the currently proposed API is being rushed out ahead of 3Dlabs' proposed changes with respect to the "pure" API. It's really just OpenGL 1.6, nothing special except the 2.0 name. If, for some reason, they produce the originally proposed GL 2.0, I will be surprised...

I think, initially, the original discussions (of 2.0) were meant to spark interest in the API and continue its progression... currently it's been dumbed down. Until significant API changes are introduced, we SHOULD NOT move forward to 2.0.
 
Chalnoth said:
I never thought I'd hear anybody say they thought GL2 was being rushed....

While I welcome an enhanced API, I expect a major revision to bring major functionality and API-specific enhancements, not just incremental ones (mostly already-existing functionality promoted from extensions). According to the last OpenGL meeting notes, OpenGL 2.0 won't really be much of an upgrade over 1.5, except for making a few things "core" in 2.0, like GLSL and a few other things.

Read the meeting notes to see what I mean....
 
AndrewM said:
However, they _can_ be evil when you cover the same functionality, like ATI and NV's vertex array extensions. That was silly.

It sure can be silly sometimes, but in this particular case I don't think there was much silliness involved. NV added their VAR extension when their cards supported it. However, that extension sucked, just way too low-level and overall inconvenient. So I don't think ATI was silly when they opted for another approach. Now the VAO extension was a good deal better, but wasn't perfect either, so I think it made a lot of sense for the ARB then to make VBO a bit different than the VAO.
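
For reference, a rough sketch of what the ARB's eventual compromise, VBO, ended up looking like in application code (the Vertex struct, vertexData pointer and numVerts count are just stand-ins the app would define):
Code:
GLuint vbo;
glGenBuffersARB(1, &vbo);
glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
glBufferDataARB(GL_ARRAY_BUFFER_ARB, numVerts * sizeof(Vertex), vertexData, GL_STATIC_DRAW_ARB);

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, sizeof(Vertex), 0);   /* 0 = offset into the bound buffer, not a pointer */
glDrawArrays(GL_TRIANGLES, 0, numVerts);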
 
Sigma said:
Isn't PS1.4 a custom extension to ATI?

No. It's a part of the DX API, so any developer or IHV who wants can support things like ps2.0b, etc. A custom extension is something not formally supported by the API, but which may electively be supported by a developer--take nVidia's "Ultra-Shadow" feature. Although unsupported in either API, nVidia can write a custom extension for OpenGL in its drivers to support it, and any game developer who chooses to can support that OpenGL extension in his game. (In DX games, it could be supported directly by the game engine *outside* of the DX API.)
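
To make that concrete, electing to use such an extension on the GL side looks roughly like this (taking the depth bounds test behind "UltraShadow" as the example; the flag and the values are just illustrative):
Code:
const char *ext = (const char *) glGetString(GL_EXTENSIONS);
int has_depth_bounds = ext && strstr(ext, "GL_EXT_depth_bounds_test") != 0;
if (has_depth_bounds)
    glDepthBoundsEXT(0.0, 0.5);   /* limit stencil-shadow fill to this depth range */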

However, the catch is that a developer has to decide if the work involved in supporting the custom extension is worthwhile, since it pertains to only a specific IHV's products, and often only a limited subset of that IHV's total 3d-product offerings. Sometimes, too, an IHV will make loud PR noises about a custom feature that it turns out the IHV doesn't actually support all that well in its drivers (if at all--the old "it's there but currently unexposed in the drivers" ploy...;)).

By restricting things to the API and avoiding an extensions mechanism, developers have only to write to the API in any way they choose within the latitude provided by the version of the API they target, and any IHV who sells hardware advertised to be compliant with the same API version *should* be able to run their software with a minimum of effort being expended by the developer.

The purpose of APIs is to directly benefit 3d game developers by enormously simplifying the 3d programming involved, sparing them from otherwise having to write separate 3d engines to support hardware and drivers produced by multiple IHVs. Had 3dfx not instituted Glide, as an example, the sale of their chips would have been greatly constrained due to lack of developer interest, which would have spelled a distinct lack of 3dfx-compatible 3d games. Lack of Glide as an API would have set back the clock on 3d by at least 3-5 years, if, that is, 3d as we know it today would ever have left the starting blocks at all. The 3d API is a foundational bedrock for the 3d gaming industry.

Extensions needlessly complicate the process and may add considerable work at the developer level--as we saw with NWN, where Bioware's engine supported nVidia's OpenGL extensions but not ATi's. Thus at first things like "shiny water" wouldn't render on a 9700P, but rendered as expected on nV25. The situation wasn't remedied until Bioware changed its engine to support the ATi OpenGL extensions needed to render "shiny water" on R300 hardware, which were different from the nVidia OpenGL extensions it had been using for the same effect. The NWN thing is a classic example of the kind of needless problems and extra work the OpenGL extensions mechanism can cause a developer.

What about DX9.0a/b/c/d/e/f/g..... ? Why so many of them? It almost seems that they are trying to map each piece of hardware that comes along from NVIDIA or ATI. :rolleyes:

The purpose of API development is to work *with* hardware IHVs (certainly not to control or dictate to them) to include formal API support for their newer hardware standards. Letter changes in DX API version schemes denote minor inclusions of newer capabilities within the formal structure of the API. They are not directly IHV-specific inclusions, and certainly not extensions, but are direct inclusions within and expansions of the API. So any developer can write his game to support a specific version of DX9 because any IHV's 3d products advertised to be compliant with that version of the API should run the game with little if any problem.

The difference is that had support of the hardware functionality required for NWN's shiny water been a part of the formal structure of the OpenGL API, then all Bioware would have to have done is to write in shiny water API support, and there'd have been no necessity to support multiple extensions from multiple IHVs (it would then be up to the IHVs to support API functionality in their drivers) so that the game engine's shiny water support would render on everyone's hardware capable of rendering it. The difference seems fairly clear to me.

Also don't forget that GL is not games only. There must be an ARB to decide the best for everyone, and that includes CAD software and games.

I think I mentioned that (although not CAD specifically), including cross-platform necessities. OpenGL is much less specialized than DX, serving different purposes, and so therefore has to be different in some respects. But it's 3d-gaming comparisons between the APIs that I think we're talking about here.

I also find the subject of extensions funny. If Doom3 demanded a version of GL, say 1.5, it would not require any extension to run. Exactly like any other game demanding DX9.0.

But of course--the problem is that it doesn't, apparently...;) The other side of the problem is then that nVidia and ATi need to write drivers which maintain a certain set of extensions in order to run the older OpenGL games whose engines expect those extensions, as well as writing newer OpenGL drivers supporting the newer core API functionality for newer games. This is, I think, markedly different from DX driver development, wherein each succeeding API version contains compatibility with all of the older versions as an inherent subset.

What is the difference between checking caps bits or checking for extensions? There is only one problem with extensions: sometimes each IHV would create its own extension to do the same thing. But that doesn't happen that often (nice to see NVIDIA using ATI's extensions....)

The cap bits are supported by the version of the API, extensions are not. The game developer doesn't really *have* to check the driver cap bits, since, if an IHV's DX drivers are advertised as compliant with a specific version of the API, the API-version cap bit support is assumed by the developer to be there. It's the job of the IHV to ensure that his drivers support the cap bits needed to support the API version. Extensions, otoh, lack any formal cross-IHV, trans-API-version structure.

Problems can materialize for a developer when, say, he writes his engine to require a specific version of the DX API, and then discovers that an IHV's hardware *won't run* his engine because it supports only parts of the API version, even though the IHV advertises his products as 100% compatible with the version of the API the developer is targeting. That's when they need to get into looking at cap bits and so on to decipher what in the API version the IHV actually supports and what he doesn't, so that they can make changes to their engine to accommodate the holes in the IHV's support.

This resembles superficially a problem with OpenGL extension support developers face, but is actually not the same thing at all. There are no extension requirements for an IHV--the whole notion of extensions is geared to custom IHV hardware support *outside* of the API. DX-version cap bit support, however, is geared toward both the developer and the IHV knowing in advance what is required for functionality support.


Isn't DX10 supposed to have a formal extension mechanism?! If it had one, DX9 developers could probably be using NV's depth bounds by now...

I can't imagine why they'd want to introduce the same problems inside of DX that OpenGL now has relative to extensions (go back to the NWN example). Why degrade your API in that fashion, when instead you can continue to work with IHVs to advance the API in a formal, universally supportable fashion? Certainly, that's got to be a lot better for developers, I would think. Also, I consider it unlikely because M$ has always had the option of permitting extension support within the API, but has declined to do so to avoid NWN-like scenarios. If, however, M$ intends DX10 to be the last DX-version for a long, long time, then perhaps it might make more sense.

To turn it around on you a bit--what prevented nVidia from preparing some nice, OpenGL-extension demos of its ps3.0 functionality in nV40? No need to "wait" on DX9.0c at all, right? But it seems like that's just what nVidia has done--wait on DX9.0c. Perhaps nVidia doesn't want to expose it through an extension, but rather through a new set of OpenGL drivers to support upcoming versions of the OpenGL API? I'd consider that likely--which tells me they didn't think supporting it through current OpenGL extensions would have been worth the effort. (I could get cynical and say that they didn't bother with an sm3.0-type OpenGL extension because there really isn't much in nV40 in the way of actual SM3.0 support, and that "waiting" on DX9.0c was a stalling tactic to give them time to do other optimizations they could claim to be SM3.0-related...but I won't...;))

If the super buffers are rectified at the end of the year, GL2.0 + super buffers will surpass DX9, I think (without the annoying SM versions thingy).

And GL2.0 + super buffers + a topology processor would even equal DX10, right? That doesn't seem "slow as old man Christmas" to me...

Well, not a lot of merit in comparing what is (DX9) to what *may be* (DX10 & ogl 2.0), is there?...;) Again, yes, OpenGL core API structuring has been as slow as Christmas, no doubt about it, in comparison to D3d. But that isn't necessarily a criticism, considering the different purposes the OpenGL 3d API is intended to support as contrasted to D3d, as I mentioned in my initial post.
 
WaltC said:
Sigma said:
Isn't PS1.4 a custom extension to ATI?

No. It's a part of the DX API, so any developer or IHV who wants can support things like ps2.0b, etc.
It's an exact mapping of R200's fragment processing capabilities, so for all intents and purposes, it's an ATI-extension to DX8. Now, DX's versioning constraints mandate that any PS2.0 capable driver must also accept PS1.4 code, ie emulation of non-native features for the sake of backwards compatibility.
For PS1.4, that really only happened with DX9, though, which is important to note. When DX8.1 was the latest and greatest, this was a simple question of R200 vs the rest of the world. Smells like a vendor specific extension to me.

Nitpick: PS2.0b is not a shader version. There's a "ps_2_b" HLSL compiler target profile, which you were probably referring to. "2.0" is a shader version, "2.x" is another one. There are no other shader versions starting with "2". PS2.0 is extended via a number of optional caps and, in the process, becomes PS2.x (which is not a true version but rather a category of messes; it can be anything between PS2.0 and PS3.0). If it weren't for the fact that all other IHVs except for ATI and NVIDIA have slowly withered and died, disaster would be upon us.
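
To see just how much of a grab-bag "2.x" is, here's a rough D3D9 sketch of the optional caps you end up poking at to find out which flavour you actually got (assuming an already-created IDirect3DDevice9 *dev and d3d9.h):
Code:
D3DCAPS9 caps;
dev->GetDeviceCaps(&caps);
if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
{
    /* "2.x" is whatever these optional caps happen to say */
    DWORD ps20 = caps.PS20Caps.Caps;
    bool  longShaders = caps.PS20Caps.NumInstructionSlots > 96;   /* 96 on plain 2.0, up to 512 */
    bool  predication = (ps20 & D3DPS20CAPS_PREDICATION) != 0;
    bool  arbSwizzle  = (ps20 & D3DPS20CAPS_ARBITRARYSWIZZLE) != 0;
}
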
WaltC said:
A custom extension is something not formally supported by the API, but which may electively be supported by a developer
<deferred>
Like "two sided stencil", which is a caps bit?
The difference between caps and extensions is that extensions come when they are ready, and caps bits come when Microsoft releases a new runtime. And that's the whole story. Your narrow definition of "extension" is amusing, but not particularly relevant for development purposes.
WaltC said:
--take nVidia's "Ultra-Shadow" feature. Although unsupported in either API, nVidia can write a custom extension for OpenGL in its drivers to support it
<deferred>
"UltraShadow" is already fully exposed in OpenGL in the form of, you guessed it, NVIDIA extensions.
WaltC said:
, and any game developer who chooses to can support that OpenGL extension in his game.
Just like DX Graphics code can take advantage of two-sided stencil support, but must provide a fallback, I suppose.
WaltC said:
(In DX games, it could be supported directly by the game engine *outside* of the DX API.)
Wrong. Unless you want to poke random PCI registers and cross your fingers.

WaltC said:
However, the catch is that a developer has to decide if the work involved in supporting the custom extension is worthwhile, since it pertains to only a specific IHV's products, and often only a limited subset of that IHV's total 3d-product offerings.
Like two-sided stencil ... stream frequency dividers ... floating point render targets ...
This is getting silly. I'll stop now.
 
WaltC said:
No. It's a part of the DX API, so any developer or IHV who wants can support things like ps2.0b, etc. A custom extension is something not formally supported by the API, but which may electively be supported by a developer--take nVidia's "Ultra-Shadow" feature. Although unsupported in either API, nVidia can write a custom extension for OpenGL in its drivers to support it, and any game developer who chooses to can support that OpenGL extension in his game. (In DX games, it could be supported directly by the game engine *outside* of the DX API.)
Outside of the DX API? How do you imagine this to work?

However, the catch is that a developer has to decide if the work involved in supporting the custom extension is worthwhile, since it pertains to only a specific IHV's products, and often only a limited subset of that IHV's total 3d-product offerings. Sometimes, too, an IHV will make loud PR noises about a custom feature that it turns out the IHV doesn't actually support all that well in its drivers (if at all--the old "it's there but currently unexposed in the drivers" ploy...;)).
How is that different from features in the DX spec that are only supported by one IHV?
Loud PR noises? Remember TruForm, RT patches, displacement mapping?

By restricting things to the API and avoiding an extensions mechanism, developers have only to write to the API in any way they choose within the latitude provided by the version of the API they target, and any IHV who sells hardware advertised to be compliant with the same API version *should* be able to run their software with a minimum of effort being expended by the developer.
OpenGL developers can do just the same. They can choose to do more.


I also find the subject of extensions funny. If Doom3 demanded a version of GL, say 1.5, it would not require any extension to run. Exactly like any other game demanding DX9.0.

But of course--the problem is that it doesn't, apparently...;)
Why is that a problem? John Carmack apparently saw no problem in that at all.

The other side of the problem is then that nVidia and ATi need to write drivers which maintain a certain set of extensions in order to run the older OpenGL games whose engines expect those extensions, as well as writing newer OpenGL drivers supporting the newer core API functionality for newer games. This is, I think, markedly different from DX driver development, wherein each succeeding API version contains compatibility with all of the older versions as an inherent subset.
Since most new core functionality is basically unchanged former extension functionality, driver developers have to write almost no extra code at all. All they have to do is to provide the same API entry points with two different names (and many extensions only define a few new constants, which means no additional work at all).
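
From the application's side the promotion looks much the same, too; a rough Windows-flavoured sketch (the function pointer typedef is the one from glext.h):
Code:
PFNGLBINDBUFFERPROC myBindBuffer =
    (PFNGLBINDBUFFERPROC) wglGetProcAddress("glBindBuffer");       /* GL 1.5 core name */
if (!myBindBuffer)                                                  /* older driver: try the ARB_vertex_buffer_object name */
    myBindBuffer = (PFNGLBINDBUFFERPROC) wglGetProcAddress("glBindBufferARB");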

What is the difference between checking caps bits or checking for extensions? There is only one problem with extensions: sometimes each IHV would create its own extension to do the same thing. But that doesn't happen that often (nice to see NVIDIA using ATI's extensions....)

The cap bits are supported by the version of the API, extensions are not. The game developer doesn't really *have* to check the driver cap bits, since, if an IHV's DX drivers are advertised as compliant with a specific version of the API, the API-version cap bit support is assumed by the developer to be there. It's the job of the IHV to ensure that his drivers support the cap bits needed to support the API version.
The DX specs define some minimum capabilities for "compliancy" with a certain version, however these minimum caps are quite low.
You can't write a modern game engine based on these minimum caps.
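
Even something as basic as two-sided stencil is an optional cap you have to test for at runtime; roughly (again assuming a created IDirect3DDevice9 *dev):
Code:
D3DCAPS9 caps;
dev->GetDeviceCaps(&caps);
if (!(caps.StencilCaps & D3DSTENCILCAPS_TWOSIDED))
{
    /* fall back: render front and back faces of the shadow volume in two passes */
}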

Extensions, otoh, lack any formal cross-IHV, trans-API-version structure.
:?:

Problems can materialize for a developer when, say, he writes his engine to require a specific version of the DX API, and then discovers that an IHV's hardware *won't run* his engine because it supports only parts of the API version, even though the IHV advertises his products as 100% compatible with the version of the API the developer is targeting. That's when they need to get into looking at cap bits and so on to decipher what in the API version the IHV actually supports and what he doesn't, so that they can make changes to their engine to accommodate the holes in the IHV's support.
"100% compatible" does not mean it supports all features. It hardly means anything.
There is no card that ever supported all features of any given DX version.

If you could rely on a feature just being there, caps bits wouldn't exist in the first place.
Relying on anything else than just the minimum caps for a certain DX version without caps checking is a bug. It's no different than OpenGL extension checking.


Isn't DX10 supposed to have a formal extension mechanism?! If it had one, DX9 developers could probably be using NV's depth bounds by now...

I can't imagine why they'd want to introduce the same problems inside of DX that OpenGL now has relative to extensions (go back to the NWN example). Why degrade your API in that fashion, when instead you can continue to work with IHVs to advance the API in a formal, universally supportable fashion? Certainly, that's got to be a lot better for developers, I would think. Also, I consider it unlikely because M$ has always had the option of permitting extension support within the API, but has declined to do so to avoid NWN-like scenarios. If, however, M$ intends DX10 to be the last DX-version for a long, long time, then perhaps it might make more sense.
WGF is intended to stay for a long, long time. Extensions are a necessity here.

To turn it around on you a bit--what prevented nVidia from preparing some nice, OpenGL-extension demos of its ps3.0 functionality in nV40? No need to "wait" on DX9.0c at all, right? But it seems like that's just what nVidia has done--wait on DX9.0c. Perhaps nVidia doesn't want to expose it through an extension, but rather through a new set of OpenGL drivers to support upcoming versions of the OpenGL API? I'd consider that likely--which tells me they didn't think supporting it through current OpenGL extensions would have been worth the effort. (I could get cynical and say that they didn't bother with an sm3.0-type OpenGL extension because there really isn't much in nV40 in the way of actual SM3.0 support, and that "waiting" on DX9.0c was a stalling tactic to give them time to do other optimizations they could claim to be SM3.0-related...but I won't...;))
NV_fragment_program2
NV_vertex_program3
And, most importantly: GLSL
 
WaltC said:
To turn it around on you a bit--what prevented nVidia from preparing some nice, OpenGL-extension demos of its ps3.0 functionality in nV40? No need to "wait" on DX9.0c at all, right? But it seems like that's just what nVidia has done--wait on DX9.0c. Perhaps nVidia doesn't want to expose it through an extension, but rather through a new set of OpenGL drivers to support upcoming versions of the OpenGL API? I'd consider that likely--which tells me they didn't think supporting it through current OpenGL extensions would have been worth the effort.

I believe nothing prevented them. All of their demos have always used GL, and that includes all the demos of the NV40 launch... Download Nalu and see for yourself...

WaltC: you wrote a lot so I will not quote you on everything. :oops:
But I will try to summarize it... :D

NVIDIA has an extension for the "ultra shadow". But you said it could be used in a DX game, "*outside* of the DX API"?! How?
And in the case of this extension, its use in GL would resemble something like this:
Code:
if (depth_bounds) {
    glDepthBoundsEXT(zmin, zmax);        /* e.g. the light's depth extent */
    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
}
/* render normally */
if (depth_bounds) { glDisable(GL_DEPTH_BOUNDS_TEST_EXT); }
What is the trouble with this?!

The purpose of an API is to enormously simplify the 3D programming involved AND give access to the hardware features. Unless the API changes every 6 months, it has to have extensions.

WaltC said:
But of course--the problem is that it doesn't, apparently...;) The other side of the problem is then that nVidia and ATi need to write drivers which maintain a certain set of extensions in order to run the older OpenGL games whose engines expect those extensions, as well as writing newer OpenGL drivers supporting the newer core API functionality for newer games. This is, I think, markedly different from DX driver development, wherein each succeeding API version contains compatibility with all of the older versions as an inherent subset.

Oh yes it does. It just demands versions with extensions. When an IHV writes an extension, it is effectively core for that particular driver. The difference between GL versions is that a set of extensions becomes automatically available: you just need to check the version and the functionality will be there. Just like knowing that DX9 has PS2.0. But then, that doesn't mean the hardware supports PS2.0, so you have to have Caps Bits hell...
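
Checking the version really is that simple; a rough sketch:
Code:
int major = 1, minor = 0;
sscanf((const char *) glGetString(GL_VERSION), "%d.%d", &major, &minor);
int has_vbo = (major > 1) || (major == 1 && minor >= 5);   /* VBO is core as of GL 1.5 */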

Because every developer has to check for cap bits, just like Carmack does for extensions...

And of course you forget that there are extensions that are not IHV-only: every ARB extension. What is better is that sometimes one IHV even implements another IHV's extensions... The only difference, really, between GL and D3D is that D3D offers a single common mechanism, whereas sometimes (sometimes) in GL, IHVs drift apart and provide different ways of doing the same thing (and even that doesn't last forever).

The problem with IHVs is that they tend to go different ways in the hardware (PS a/b/c/d, 3Dc, etc). So for D3D to support both, one of them has to lose and some capabilities are left out. A good example is fragment_program. The ARB hasn't updated the extension because the move is towards GLSL and not the low-level interface. BUT, NVIDIA decided to extend the language and voilà! Every part of the NV40 hardware is available in GL and only in GL. If a developer wants to support it, he can. If not, simply don't use the extension.

"but has declined to do so to avoid NWN-like scenarios". It is just one, so why do you say "scenarios"? That decline has prevented D3D developers from using depth bounds and the F-Buffer.
 
nutball said:
It strikes me that with DX10 being so closely tied in to Longhorn (2006?), that the ARB might have a bit of a breathing space to develop OGL to a point where it will at least equal DX10 when that arrives.
Since Longhorn has been delayed, X-Box2 is the current driver for DX10. And honestly, it has always been a lot more significant than Longhorn for the promotion of DirectX.

XB2 is supposedly targeted for Christmas '05, and anyone who is serious about releasing near the launch date is already well into development. I don't see a lot of breathing space.
 
It's somewhat of a misconception that DX reduces the number of code paths. DX in fact has much more granular queryable options, the capability bits, which cause much more branching in your code than in a typical OpenGL app, which is a result of OpenGL's "nothing in core is optional" philosophy.
 
Last month you were suggesting that this was not the case and that developers just package according to the general capabilities (i.e. tying FP filtering/blending to SM3.0 because those are the configurations currently available, rather than testing for caps bits).
 
Yes, I was suggesting that developers don't target APIs, they target HW classes (performance and features). There will be a number of code paths equal to the number of pieces of HW or HW classes they choose to support, regardless of the API (OGL/DX). This is orthogonal to the issue of whether OGL or DX "forces additional code paths". For example, regardless of what the 5200 Ultra supports API-wise, developers are going to treat it like a DX7/8 part, regardless of whether the API/driver reports "I support feature X". Your intimation that OGL makes the developer's life harder or increases the need to write more code paths over and above DX9 is what I am criticizing, because DX9 has way more runtime driver capabilities which are optional and must be queried for.

An OGL developer has to deal with the presence or non-presence of very coarse grained OpenGL extensions. A DX developer has to deal with the presence or non-presence of hundreds of fine grained capabilities. But both have to deal with the actual real world HW, whose implementation of the API may leave something to be desired performance wise or completeness wise.
 
Ah, I see that Beyond3d's very own fiction author, WaltC makes his triumphant return.

Several people have already dealt with most of what was written, but they missed (IMO) the most hilarious paragraph of the lot, the one where he informs us:
The purpose of APIs is to directly benefit 3d game developers by enormously simplifying the 3d programming involved, sparing them from otherwise having to write separate 3d engines to support hardware and drivers produced by multiple IHVs. Had 3dfx not instituted Glide, as an example, the sale of their chips would have been greatly constrained due to lack of developer interest, which would have spelled a distinct lack of 3dfx-compatible 3d games. Lack of Glide as an API would have set back the clock on 3d by at least 3-5 years, if, that is, 3d as we know it today would ever have left the starting blocks at all. The 3d API is a foundational bedrock for the 3d gaming industry.
and then proceeds to hold Glide (of all APIs) up as an example of an industry uniter. ROOFLE! :LOL:
 
What I haven't seen mentioned much yet, is the NATURE of OpenGL extensions vs D3D functionality.
Sure, every IHV is free to implement every extension from every other IHV, but that is purely theoretical. In practice, such extensions are usually closely related to the hardware they were designed on, and making them run on other hardware is not worth the trouble... It would more or less be the reverse of the NV3x and ARB2/ps2.0 situation: the hardware was not capable of executing such shaders efficiently, so why bother supporting them at all?

Which brings me to the next point... IHVs are not free to just support a subset of shaders or other features. There are a number of rules for hardware to be compliant with a certain version of Direct3D, or its shaders. There is a minimum subset of features defined for each version, which MUST be supported. And supporting shader version N means you MUST support all shader versions < N as well.
To get back to the ps1.4 point... This means it is not an ATi-specific shader, since every IHV knew that they were eventually going to be forced to implement it, if they were ever to support shaders > 1.3. ATi happened to be the first. In retrospect, 1.4 is quite similar to 2.0 or even ARB2, so it doesn't seem to be all that specific anyway?

It works both ways... Developers will know that ps1.4 will be supported by all current hardware at some point, so they can freely develop for it. And any IHV can simply write a driver with ps1.4 support, and know that all D3D software can take advantage of it.

With OpenGL that is very different. Vendor-specific extensions will rarely be supported by any other IHV. And even ARB extensions are exactly that: extensions. They aren't part of the standard, and therefore no IHV is forced to include them in their OpenGL driver. And from what I've read, support from the less popular IHVs such as XGI or Matrox is already quite limited.

Anyway, as mentioned before with NWN, with OpenGL you get the strange situation that superior hardware is unable to render effects that inferior hardware can, simply because the 'wrong' extension was used (or, if you want to put it another way: because the hardware is the 'wrong' brand. OpenGL is plagued a lot by that anyway; NV historically gets the best OpenGL support, which was nice when they were the #1 selling card, but now Radeons have caught up and things need to change). In D3D this can never happen. Support the caps, and the software will work. It's just that simple. In OpenGL there are often 2 or more extensions that essentially abstract the same functionality, but they were developed by different IHVs. And the only way to ensure decent software support is for the IHVs to implement them all. Which they don't.

That's also why I found it rather silly that someone mentioned that NV register combiner extensions were available before shaders in D3D. Sure they were, but what use were they on anything but NV hardware itself? D3D shaders were vendor-independent from the start, and that (at least to me) is a key issue for decent support. With OpenGL, you need ARB extensions (and some luck), and generally, ARB extensions are available years after the functionality was introduced, while D3D has much more up-to-date support, and forces the support too, to a certain extent. Which results in better support by IHVs, so software actually works. Buying the latest Radeon and finding that a simple water effect doesn't work is a joke.
To this day, an entire generation of shader-capable cards is simply ignored by the ARB, which I find ... well I have no words for it, really.
 
OpenGL existed prior to Glide.

Glide kick-started the 3d industry; it did not unite it or make it easy for developers to code across multiple IHVs.

If 3dfx had opened Glide and allowed its use by other IHVs, they would likely still be alive and well today in spite of all their other dodgy business decisions.

Then again, I'm rather glad that didn't happen either; you tend to forget just how nasty small textures are until you run a game through a Glide wrapper and compare with the results from D3D or OpenGL (U9: Ascension and Deus Ex being two good illustrative examples).
 
Well, I'd be more inclined to say that the 3dfx Voodoo kickstarted the 3D industry. If 3dfx had produced OpenGL drivers from the start, there may never have been a need for Glide (at the time, Direct3D was pretty bad...). But, drivers were not 3dfx's strong suit....
 
Perhaps, but there were chips before the Voodoo that were decent for their time, yet nearly all of them lacked an accessible way of harnessing their capabilities and showcasing them (S3's ViRGE, Rendition's Vérité, 3Dlabs' Permedia).

3dfx were clever enough to realise that a chip is only as good as the API that supports it. Unfortunately they locked themselves into that API and locked others out of it, so their early lead was never going to be sustainable as competing chips and APIs matured around them.

Just look at the abilities of the Voodoo5 at the bitter end - it barely matched up to a TNT-1 featurewise, probably because it was designed more around what Glide could do than what D3D and OpenGL could do.
 
Chalnoth said:
Well, I'd be more inclined to say that the 3dfx Voodoo kickstarted the 3D industry. If 3dfx had produced OpenGL drivers from the start, there may never have been a need for Glide (at the time, Direct3D was pretty bad...). But, drivers were not 3dfx's strong suit....

Didn't it take till like the Voodoo4 era for 3dfx to even get a working OpenGL ICD?
 
Didn't it take till like the Voodoo4 era for 3dfx to even get a working OpenGL ICD?

Not sure... I'm quite sure it appeared late in the Voodoo3's existence though (well, more like late in 3dfx's existence)

Believe it or not, the Voodoo 5 came before the Voodoo 4 (in terms of getting it out, at least that's how I remembered it)

Just look at the abilities of the Voodoo5 at the bitter end - it barely matched up to a TNT-1 featurewise, probably because it was designed more around what Glide could do than what D3D and OpenGL could do.

Actually, the Voodoo 5 did match up fairly well... but you are referencing the wrong generation: the Voodoo 3 coincided with the TNT generation, while the Voodoo 5 appeared late, towards the GF2 generation.

The TNT had a lot of features... but most were likely unusable for that generation (like the use of 4096x4096 texture sizes)... for the Voodoo 3 at the time, it was no big deal...

Then again, I'm rather glad that didn't happen either; you tend to forget just how nasty small textures are until you run a game through a Glide wrapper and compare with the results from D3D or OpenGL (U9: Ascension and Deus Ex being two good illustrative examples).

At that point in time, it most likely looked better than what came before... now, upping the usage of such features makes you see the difference current hardware (even last generation) brought to the table...

Well, I'd be more inclined to say that the 3dfx Voodoo kickstarted the 3D industry. If 3dfx had produced OpenGL drivers from the start, there may never have been a need for Glide (at the time, Direct3D was pretty bad...). But, drivers were not 3dfx's strong suit....

Well, their "OpenGL drivers" in the form of MiniGL wasn't all that bad. They did end up making some (not sure if they were truly compliant.. as running them through OpenGL Extension Viewer shows nay).... however, they wrapped to Glide calls (to their Glide driver of course)... their Direct3D wasn't bad at all though.

3dfx could claim to some extent that their drivers were as good as NVidia's (personally I had no problem)... but... there goes their legacy...
 