Will DirectX replace OpenGL in game development?

BrandonFurtwangler said:
I totally disagree. DX9 isn't going to disappear. In fact, it will run on both Vista and XP. So why would they jump to OGL when they could _continue_ to use DX9 for the interim period?

What _will_ make them jump to DX10 is all the new features it enables. If you look at OGL 2.0, you basically see a less elegant version of DX9 (although much better than 1.5). It may be cross-platform, which is great, but I don't think that's a top priority for game developers.

[opinion alert]
As I see it, OpenGL's extension mechanism is what killed it for game developers. It's like coding in DOS. You have to use 10 different extensions to do any one thing. Sure, you can use the ARB extensions, but half the time they run so much slower that you end up needing to write vendor- and card-specific render paths. With D3D you still have the different feature sets, but you are at least (mostly) spared the vendor-specific BS.

When it comes down to it, I think OpenGL isn't going anywhere soon, but you can't deny that the API is a relic of the past. Some people find it less daunting to get started with such an API design, but when you try to do more complicated graphics I find it horribly contrary to everything software engineers have learned in the past 15 years.
[/opinion alert]

ARB runs so much slower? Slower than what? I see almost no speed difference between ARB and GLSL. Well, a little bit, but it varies from shader to shader: sometimes ARB is faster, sometimes GLSL is faster, but it's usually a marginal difference.

Optimization with OGL tends to be easier because you have 10 different functions to do the same thing; you can pick and choose the best one for the operation.
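For instance, a rough sketch of the same triangle submitted two of the several possible ways in plain GL 1.1; which path is fastest varies by driver and usage pattern (and there are more options still: display lists, compiled vertex arrays, VBOs...):

[code]
/* The same triangle, two different submission paths. */
GLfloat verts[] = { -0.5f, -0.5f,   0.5f, -0.5f,   0.0f, 0.5f };

/* Path 1: immediate mode */
glBegin(GL_TRIANGLES);
glVertex2f(verts[0], verts[1]);
glVertex2f(verts[2], verts[3]);
glVertex2f(verts[4], verts[5]);
glEnd();

/* Path 2: vertex arrays */
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, verts);
glDrawArrays(GL_TRIANGLES, 0, 3);
glDisableClientState(GL_VERTEX_ARRAY);
[/code]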

So far, if vendors support certain features they tend to become standard in OGL. I see things in OGL that aren't available in DX, and the other way around. And of course the spec gets updated on a periodic basis as well.
 
K.I.L.E.R said:
Sorry, but I've been reading a lot of doom-and-gloom scenarios that say OpenGL will be replaced by DX.
Not if John Carmack can help it. I hear he's a pretty important fella in this area.
 
Razor1 said:
ARB runs so much slower? Slower than what? I see almost no speed difference between ARB and GLSL. Well, a little bit, but it varies from shader to shader: sometimes ARB is faster, sometimes GLSL is faster, but it's usually a marginal difference.

Optimization with OGL tends to be easier because you have 10 different functions to do the same thing; you can pick and choose the best one for the operation.

So far, if vendors support certain features they tend to become standard in OGL. I see things in OGL that aren't available in DX, and the other way around.


You make a good point that you can choose the best of 10 different ways to do something, but the problem in practice is that the same way isn't always the best on every card. Just read John Carmack's .plan files from when he was working on Doom 3. The point I was making was that developers don't want to allocate the time to test all the combinations of ways of doing things so that everyone gets the best possible graphics. With DX, instead, it's up to the GPU manufacturers to optimize for D3D's single way to do something (well, per generation, with the advent of shaders). Under this model, if it runs slow on ATI, then it's ATI's fault, not good old John Carmack's. (BTW, I'm sure HE loves that kind of optimizing, but I'd say he's from the old school.)

Of course, I will acknowledge that the DX model has problems. Changing the API every few years isn't easy on people trying to learn it. The thing people forget is that DX is fully backwards compatible, so if you know DX9 you can still use it. Given how fast graphics cards are changing, it's actually kind of nice (as I see it) that DX changes fast enough to keep up with, or even drive, new features.

I'm not saying OpenGL is bad... in fact, I use it quite a bit at school. I'm just saying why I like DX better.
 
So why would they jump to OGL when they could _continue_ to use DX9 for the interim period?
Because unlike D3D, OpenGL is forward compatible too. You can add new features much later in your development cycle, because you don't need to completely recode your graphics to work with DX n+1.

As I see it, OpenGL's extension mechanism is what killed it for game developers. It's like coding in DOS.
Have you actually built a game with a reasonably wide target audience using D3D? It's worse than OpenGL. Not only do you need to check 57 cap bits, but even if two cap bits are set to indicate that two features are supported, the combination of those two features may not work together. For some features (like 8 combiner stages), you need to set some completely unrelated state (that's normally invalid) to gain access to the feature.
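To illustrate, a minimal sketch against the real D3D9 caps API; the particular bits checked here are just examples:

[code]
// Cap-bit checking via IDirect3D9::GetDeviceCaps.
#include <d3d9.h>

bool SupportsMyEffect(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    if (!(caps.TextureCaps & D3DPTEXTURECAPS_CUBEMAP))  // one bit...
        return false;
    if (caps.MaxSimultaneousTextures < 4)               // ...another bit
        return false;

    // Even with both checks passing, nothing guarantees the
    // *combination* of the two features actually works together.
    return true;
}
[/code]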

You end up with plenty of card-specific code either way. At least OpenGL makes it explicit.
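Roughly like this (a sketch; the has_extension helper and the path functions are hypothetical, and a robust check should match whole extension tokens rather than substrings):

[code]
#include <GL/gl.h>
#include <string.h>

// Hypothetical renderer paths, implemented elsewhere.
void use_vbo_path();
void use_var_path();
void use_vao_path();
void use_plain_vertex_arrays();

// Naive check: substring matching can false-positive on prefixes.
static bool has_extension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != 0;
}

void choose_vertex_path()
{
    if (has_extension("GL_ARB_vertex_buffer_object"))
        use_vbo_path();             // portable ARB path
    else if (has_extension("GL_NV_vertex_array_range"))
        use_var_path();             // NVIDIA-specific path
    else if (has_extension("GL_ATI_vertex_array_object"))
        use_vao_path();             // ATI-specific path
    else
        use_plain_vertex_arrays();  // lowest common denominator
}
[/code]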
 
jb said:
FYI, there are a number of features that do not work in the OGL port of UT2k4. A minor one is scripted textures, which are used to place player names on the licence plate of the Hellbinder, for example (or on the moving score boards in some user-made maps, digital ammo counters on some user-made weapons, etc.).
It's render-to-texture that isn't supported, and there's a good reason for that: render-to-texture support in OpenGL was almost nonexistent at the release of UT2k4. But now we have pixel buffer objects in OpenGL, so I don't think there's any longer any reason not to have full support for all features in UE3.
 
BrandonFurtwangler said:
I totally disagree. DX9 isn't going to disappear. In fact, it will run on both Vista and XP. So why would they jump to OGL when they could _continue_ to use DX9 for the interim period?
Well, I didn't say it would. What I'm suggesting is that it's going to be more challenging to support DX9 and DX10 in the same game than it would be to support OpenGL 2.0 (current OpenGL) alongside OpenGL 2.0 + extensions exposing the new hardware features arriving in the same timeframe as Vista.

What _will_ make them jump to DX10 is all the new features it enables. If you look at OGL 2.0, you basically see a less elegant version of DX9 (although much better than 1.5). It may be cross-platform, which is great, but I don't think that's a top priority for game developers.
Less elegant? In what way?

As I see it, OpenGL's extension mechanism is what killed it for game developers. It's like coding in DOS. You have to use 10 different extensions to do any one thing. Sure, you can use the ARB extensions, but half the time they run so much slower that you end up needing to write vendor- and card-specific render paths. With D3D you still have the different feature sets, but you are at least (mostly) spared the vendor-specific BS.
Today the ARB paths are very commonly used. See Doom 3, for example. I'd really like you to give some specific examples of particular extensions that you have problems with.
 
Chalnoth said:
Today the ARB paths are very commonly used. See Doom 3, for example. I'd really like you to give some specific examples of particular extensions that you have problems with.

This kind of thing is often used as an excuse to bash extensions in general. IMO, if you can't handle using extensions and their issues, then stop programming. In my experience, it's not a big deal either way.

The biggest gripe I've personally had was with the vertex buffer functionality. A few years ago that was a REAL pain. It was dealt with through various programming techniques to hide each implementation (VAO on ATI and VAR on NV), but these days it's all fine. Most other things at that time were handled the same way (see Carmack's multiple "paths" stuff).
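Something along these lines (a sketch; the class and method names are made up, and only the ARB VBO variant is filled in):

[code]
#include <GL/glew.h>  // loads the extension entry points
#include <cstddef>

// One abstract interface, one subclass per vendor extension.
struct VertexBuffer {
    virtual ~VertexBuffer() {}
    virtual void upload(const void* data, std::size_t bytes) = 0;
    virtual void bind() = 0;
};

// GL_ARB_vertex_buffer_object version; the NV_vertex_array_range and
// ATI_vertex_array_object versions would implement the same interface.
struct VBOBuffer : VertexBuffer {
    GLuint id;
    VBOBuffer()  { glGenBuffersARB(1, &id); }
    ~VBOBuffer() { glDeleteBuffersARB(1, &id); }
    void upload(const void* data, std::size_t bytes) {
        glBindBufferARB(GL_ARRAY_BUFFER_ARB, id);
        glBufferDataARB(GL_ARRAY_BUFFER_ARB, bytes, data, GL_STATIC_DRAW_ARB);
    }
    void bind() { glBindBufferARB(GL_ARRAY_BUFFER_ARB, id); }
};
// The rest of the renderer only ever touches VertexBuffer*,
// so the extension mess stays in one place.
[/code]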

In short, people like to bash extensions.
:)
 
neliz said:
With this month's Catalyst (5.8), the OGL lead held by NV will be as good as gone...


Really?... Are you sure? Any links or insider info? Is this the rumoured OpenGL driver rewrite that we have been hearing about for so long?

Maybe I'm a bit jaded, but if ATI removes or even reverses the now-slim lead that Nvidia has in OpenGL titles, what benefit is that really for ATI in the face of SLI and the 7800 GTX?
 
Chalnoth said:
It's render-to-texture that isn't supported, and there's a good reason for that: render-to-texture support in OpenGL was almost nonexistent at the release of UT2k4. But now we have pixel buffer objects in OpenGL, so I don't think there's any longer any reason not to have full support for all features in UE3.


Yes, I know that, but I didn't think the person I was replying to would know what that meant as far as the Unreal world goes, whereas scripted textures are much easier to explain. At least I thought so... :)
 
It's render-to-texture that isn't supported, and there's a good reason for that: render-to-texture support in OpenGL was almost nonexistent at the release of UT2k4. But now we have pixel buffer objects in OpenGL
Pixel buffers have little to do with rendering to textures. I think what you're looking for is EXT_framebuffer_object.
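With EXT_framebuffer_object, render-to-texture is only a few calls. A minimal sketch, assuming the extension's entry points are already loaded (e.g. via GLEW) and with error handling trimmed:

[code]
GLuint tex, fbo;

// Create the texture that will receive the rendering.
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, 0);

// Attach it to a framebuffer object as the color target.
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) ==
        GL_FRAMEBUFFER_COMPLETE_EXT) {
    // Draw here: everything lands in 'tex' instead of the window.
}
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);  // back to the window
[/code]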
 
If OpenGL ever really dies, it'll die a Monty Python death ("I'm not *quite* dead yet!!"). Seriously though, DX won't own the gaming market simply because it's MS and MS alone. People developing on non-MS platforms won't use DX (at least not those versions), just as DX won't ever exist on any non-MS platform. Prior to the explosion of extensions, OpenGL was superior to DX in almost all respects. Sure, DX was fast back then, but it was messy and impractical, and you had to write volumes of code to do just about anything.

Using the extensions is a fairly common practice now, but that kills the cleanliness GL was originally known for and forces you to write the same thing several different ways. You can't deny that part of the beauty of OpenGL is its pick-up-and-go nature, which DirectX will probably never achieve; it's one of the many reasons OpenGL is the standard in academia. The extensions are a real hindrance to that, no matter how usable you make them. The only reason they exist at all is that vendors wanted to make the unique features of their particular cards usable -- which is another reason to use GL if you're developing something that specifically uses features not formally in the spec (as opposed to DirectX, where anything not in the spec isn't supported at all, which makes DX more suitable for targeting arbitrary hardware).
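That pick-up-and-go nature in practice: a complete, working GL program with classic GLUT fits on one screen, something early DirectX needed pages of COM setup to match.

[code]
#include <GL/glut.h>

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("triangle");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/code]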

One of the things the so-called "pure" GL 2.0 was *supposed* to bring was to clear out all the extensions, roll the important ones directly into the API as core features, and get rid of the fixed-function pipe.
 
I think what OpenGL 2.0 originally planned to do was release a new library and header to replace the aging ones that currently come with Windows. It would include all the core extensions as basic functionality, while leaving the rest as extensions. Extensions themselves would not disappear; they're part of the strength of the entire API.
 
Mordenkainen said:
I don't know if this is what you meant but MS left the OGL ARB two years ago IIRC.

That's probably what I meant :oops:

It seems odd that MS would take themselves out of a situation where they could put a spanner in the works. Methinks they were pushed.
 
I think what OpenGL 2.0 originally planned to do was release a new library and header to replace the aging ones that currently come with Windows. It would include all the core extensions as basic functionality, while leaving the rest as extensions. Extensions themselves would not disappear; they're part of the strength of the entire API.
Possibly, but the proposals that eventually came up were much more radical and seemed driven to try to change everything; then the 2.0 spec came out, and it was basically the 1.5 spec all over again. Also, I don't think the intention was to remove the possibility of extensions so much as to clear out the jumbled mess that was there: clean out the extensions that existed up to that point, construct a new API that drops fixed-function and rolls in a lot of core functionality from the extensions, and let the modification begin anew.

Either way, regarding the PS3, the fact that you've got fixed hardware makes extensions meaningless, so everything unique to the RSX would likely be rolled in as standard API features.
 
ShootMyMonkey said:
Possibly, but the proposals that eventually came up were much more radical and seemed driven to try to change everything; then the 2.0 spec came out, and it was basically the 1.5 spec all over again. Also, I don't think the intention was to remove the possibility of extensions so much as to clear out the jumbled mess that was there: clean out the extensions that existed up to that point, construct a new API that drops fixed-function and rolls in a lot of core functionality from the extensions, and let the modification begin anew.

Either way, regarding the PS3, the fact that you've got fixed hardware makes extensions meaningless, so everything unique to the RSX would likely be rolled in as standard API features.
The thing is that there don't have to be extensions for core functionality (including extensions that were upgraded to core functionality). The problem is that the default OpenGL library and header for Windows are bloody old! I think it's OpenGL 1.2... If that library and header were updated with each spec update, then only optional functionality would remain as extensions.

(I'm not sure what the situation is in Linux or Unix, though Apple historically has incredibly good OpenGL support. I'm actually quite jealous. ;))
 
Ostsol said:
The thing is that there don't have to be extensions for core functionality (including extensions that were upgraded to core functionality). The problem is that the default OpenGL library and header for Windows are bloody old! I think it's OpenGL 1.2... If that library and header were updated with each spec update, then only optional functionality would remain as extensions.

(I'm not sure what the situation is in Linux or Unix, though Apple historically has incredibly good OpenGL support. I'm actually quite jealous. ;))

OpenGL support is great under Linux, with Nvidia anyway. Nvidia has OpenGL 2.0 support for Linux, and the drivers come with everything you need (up-to-date header files) for development.
 
Chalnoth said:
It's render-to-texture that isn't supported, and there's a good reason for that: render-to-texture support in OpenGL was almost nonexistent at the release of UT2k4. But now we have pixel buffer objects in OpenGL, so I don't think there's any longer any reason not to have full support for all features in UE3.

That's just plain wrong: WGL_ARB_render_texture has been available for years. From the extension registry (http://oss.sgi.com/projects/ogl-sample/registry/ARB/wgl_render_texture.txt):

Status: Complete. Approved by ARB on June 13, 2001
Version: Last Modified Date: July 16, 2001
 
Ostsol said:
The thing is that there don't have to be extensions for core functionality (including extensions that were upgraded to core functionality). The problem is that the default OpenGL library and header for Windows are bloody old! I think it's OpenGL 1.2... If that library and header were updated with each spec update, then only optional functionality would remain as extensions.

(I'm not sure what the situation is in Linux or Unix, though Apple historically has incredibly good OpenGL support. I'm actually quite jealous. ;))


AFAIR it's OpenGL 1.1; it was supposed to be updated to 1.2 with Win2k, but that never happened.
There are a few small libs that let you use the extensions painlessly, like GLEW...
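For reference, this is the chore GLEW automates: since opengl32.dll stops at 1.1, every newer entry point has to be fetched by hand at runtime. Two functions are shown; a real loader does this for hundreds.

[code]
#include <windows.h>
#include <GL/gl.h>
#include "glext.h"  // function-pointer typedefs from the extension registry

PFNGLGENBUFFERSARBPROC glGenBuffersARB = 0;
PFNGLBINDBUFFERARBPROC glBindBufferARB = 0;

// Must be called with a current GL context.
void load_entry_points()
{
    glGenBuffersARB = (PFNGLGENBUFFERSARBPROC)
        wglGetProcAddress("glGenBuffersARB");
    glBindBufferARB = (PFNGLBINDBUFFERARBPROC)
        wglGetProcAddress("glBindBufferARB");
    // ...and so on for every post-1.1 function; GLEW's glewInit()
    // does all of this in one call.
}
[/code]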
 