Will DirectX replace OpenGL in game development?

Discussion in 'Architecture and Products' started by K.I.L.E.R, Aug 1, 2005.

  1. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    ARB runs so much slower? Slower than what? I see almost no speed difference between ARB and GLSL; well, a little bit, but it varies from shader to shader. Sometimes ARB is faster, sometimes GLSL is faster, but usually by a marginal amount.

    Optimization with OGL tends to be easier because you have 10 different functions to do the same thing; pick and choose the best one for the operation.

    So far, if vendors support certain features, they become a standard in OGL. I see things in OGL that aren't available in DX, and the other way around. And of course there are updates on a periodic basis too.
     
    #21 Razor1, Aug 2, 2005
    Last edited by a moderator: Aug 2, 2005
  2. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Not if John Carmack can help it. I hear he's a pretty important fella in this aspect.
     
  3. BrandonFurtwangler

    Newcomer

    Joined:
    Aug 2, 2005
    Messages:
    4
    Likes Received:
    0

    You make a good point that you can choose the best of 10 different ways to do something, but the problem in practice is that the same way isn't always the best for every card. Just read John Carmack's .plan files from while he was working on Doom 3. The point I was making was that developers don't want to allocate time to test all the combinations of ways to do things so that everyone gets the best possible graphics. Instead, with DX it is up to the GPU manufacturers to optimize for D3D's single way to do something (well, per generation, with the advent of shaders). With this model, if it runs slow on ATI, then it's ATI's fault, not good old John Carmack's. (BTW, I'm sure HE loves such optimizing, but I would say he's from the old school.)

    Of course, I will acknowledge that the DX model has problems. Changing the API every few years isn't easy on people trying to learn it. The thing people forget is that DX is fully backwards compatible, so if you know DX9 you can still use it. The fact that graphics cards are changing so fast makes it actually kind of nice (as I see it) that DX changes fast enough to keep up with (or even drive) new features.

    I'm not saying OpenGL is bad... in fact I use it quite a bit at school. I'm just saying why I like DX better.
     
  4. Bob

    Bob
    Regular

    Joined:
    Apr 22, 2004
    Messages:
    424
    Likes Received:
    47
    Because unlike D3D, OpenGL is forward compatible too. You can add new features much later in your development cycle, because you don't need to completely recode your graphics to work with DX n+1.

    Have you actually built a game with a reasonably wide target audience, using D3D? It's worse than OpenGL. Not only do you need to check 57 cap bits, but even if two cap bits are set to indicate that some features are supported, the combination of those two cap bits together may not work. For some features (like 8 combiner stages), you need to set some completely unrelated state (that's normally invalid) to gain access to that feature.

    You end up with plenty of card-specific code either way. At least, OpenGL makes it explicit.
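    The cap-bit pitfall described above can be sketched in a few lines of C. This is an illustrative mock, not the real D3D9 API: the flag names and the struct are invented stand-ins for D3DCAPS9-style bitfields, showing why two individually-set cap bits don't guarantee the combination works and why a separate check is needed.

    ```c
    #include <stdio.h>

    /* Hypothetical cap-bit flags, loosely modelled on D3DCAPS9-style
     * bitfields; these names are illustrative, not real SDK constants. */
    #define CAP_RENDERTARGET_FMT  (1u << 0)
    #define CAP_BLENDING          (1u << 1)

    struct mock_caps {
        unsigned flags;
        /* Some combinations need their own query: two bits being set
         * does not imply the pair works together. */
        int blend_on_rendertarget_ok;
    };

    int pair_supported(const struct mock_caps *c)
    {
        return (c->flags & CAP_RENDERTARGET_FMT)
            && (c->flags & CAP_BLENDING)
            && c->blend_on_rendertarget_ok;   /* the extra, combined check */
    }

    int main(void)
    {
        /* Both individual bits are set, but the combination is not supported. */
        struct mock_caps card = { CAP_RENDERTARGET_FMT | CAP_BLENDING, 0 };
        printf("both bits set, pair supported: %d\n", pair_supported(&card));
        return 0;
    }
    ```

    The point of the sketch is that a naive `flags & A && flags & B` test passes while the hardware path still fails, which is exactly the trap the post describes.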
     
  5. Farid

    Farid Artist formely known as Vysez
    Veteran Subscriber

    Joined:
    Mar 22, 2004
    Messages:
    3,844
    Likes Received:
    108
    Location:
    Paris, France
    Note that Sony Online Entertainment is a division of Sony Pictures, not Sony Computer (Playstation business).
     
  6. Temporary Name

    Newcomer

    Joined:
    May 21, 2004
    Messages:
    65
    Likes Received:
    1
    I'd be interested to hear what he thinks of DX10. Nudge nudge.
     
  7. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    It's render-to-texture that isn't supported, and there's a good reason for that: render-to-texture support in OpenGL was almost nonexistent at the release of UT2k4. But now we have pixel buffer objects in OpenGL, so I don't think there's any reason any longer not to have full support for all features in UE3.
     
  8. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Well, I didn't say it would. What I'm suggesting is that it's going to be more challenging to support DX9 and DX10 in the same game than it would be to instead support OpenGL 2.0 (current OpenGL) and OpenGL 2.0 + extensions that support the new hardware features available at a similar timeframe to Vista.

    Less elegant? In what way?

    Today the ARB paths are very commonly used. See Doom 3, for example. I'd really like you to give some specific examples of particular extensions that you have problems with.
     
  9. AndrewM

    Newcomer

    Joined:
    May 28, 2003
    Messages:
    219
    Likes Received:
    2
    Location:
    Brisbane, QLD, Australia
    It's often used as an excuse to bash extensions in general. IMO, if you can't handle using extensions and their issues, then stop programming. In my experience, it's not a big deal either way.

    The biggest gripe I've personally had was with vertex buffer functionality. A few years ago that was a REAL pain. It was dealt with through various programming techniques to hide each implementation (VAO on ATI and VAR on NV), but these days it's all fine. Most other things at that time were handled the same way (see Carmack's multiple "paths" stuff).

    In short, people like to bash extensions.
    :)
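    The "hide each implementation" technique mentioned above can be sketched as a small dispatch table in C. This is a hedged illustration, not real driver code: the backend functions are stand-ins for the vendor paths (NV_vertex_array_range vs. ATI_vertex_array_object), and the rest of the engine only ever sees the common interface.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Illustrative sketch: hiding vendor-specific vertex-buffer paths
     * behind one interface. The upload functions are mock stand-ins,
     * not real GL extension entry points. */
    typedef struct {
        const char *name;
        void (*upload)(const float *verts, int count);
    } vbuf_backend;

    static void upload_var(const float *v, int n)  /* NV path stand-in */
    {
        (void)v;
        printf("VAR upload: %d verts\n", n);
    }

    static void upload_vao(const float *v, int n)  /* ATI path stand-in */
    {
        (void)v;
        printf("VAO upload: %d verts\n", n);
    }

    /* Pick a backend from the (mock) extension string at startup;
     * everything after this call is vendor-agnostic. */
    static vbuf_backend pick_backend(const char *ext_string)
    {
        if (strstr(ext_string, "GL_NV_vertex_array_range"))
            return (vbuf_backend){ "NV", upload_var };
        return (vbuf_backend){ "ATI", upload_vao };
    }

    int main(void)
    {
        vbuf_backend b = pick_backend("GL_ATI_vertex_array_object GL_ARB_multitexture");
        float tri[9] = {0};
        b.upload(tri, 3);   /* prints "VAO upload: 3 verts" */
        return 0;
    }
    ```

    This is essentially the "multiple paths" approach: the selection happens once, and each path is maintained separately behind the shared interface.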
     
  10. Cowboy X

    Newcomer

    Joined:
    May 22, 2005
    Messages:
    206
    Likes Received:
    2

    Really? Are you sure? Any links or insider info? Is this the rumoured OpenGL driver rewrite that we have been hearing about for so long?

    Maybe I'm a bit jaded, but if ATI removes or even reverses the now-slim lead that Nvidia has in OpenGL titles, what benefit is that really for ATI in the face of SLI and the 7800 GTX?
     
  11. jb

    jb
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,636
    Likes Received:
    7

    Yes, I know that, but I did not think that the person I was replying to would know what that meant as far as the Unreal world went. Whereas scripted textures are much easier to explain. At least I thought so... :)
     
  12. Bob

    Bob
    Regular

    Joined:
    Apr 22, 2004
    Messages:
    424
    Likes Received:
    47
    Pixel buffers have little to do with rendering to textures. I think what you're looking for is EXT_framebuffer_object.
     
  13. ShootMyMonkey

    Veteran

    Joined:
    Mar 21, 2005
    Messages:
    1,177
    Likes Received:
    72
    If OpenGL ever really dies, it'll die a Monty Python death -- ("I'm not *quite* dead yet!!"). Seriously, though, DX won't own the gaming market simply because it's MS and MS alone. People developing on non-MS platforms won't use DX (at least not those versions), just as DX won't ever exist on any non-MS platform. Prior to the explosion of extensions, OpenGL was superior to DX in almost all respects. Sure, DX was fast back then, but it was messy and impractical, and you had to write volumes of code to do just about anything.

    Using extensions is a fairly common practice now, but that kills the cleanliness that GL was originally known for, and it forces you to write the same thing several different ways. You can't deny that part of the beauty of OpenGL is its pick-up-and-go nature, which DirectX will probably never achieve. It's one of the many reasons why OpenGL is the standard in academia. And the extensions are really a hindrance to that, no matter how usable you might make them. The only reason they even exist is because people wanted to make the unique features of their particular cards usable -- which is another reason to use GL if you're developing something specifically around features that aren't formally in the spec (as opposed to DirectX, where anything not in the spec is not supported at all, which makes DX more suitable for arbitrary hardware).

    One of the things so-called "pure" GL 2.0 was *supposed* to bring was to clear out all the extensions, roll the important ones directly into the API as core features, and get rid of the fixed-function pipe.
     
  14. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    I think what OpenGL 2.0 originally planned to do was release a new library and header to replace the aging ones that currently come with Windows. It would include all the core extensions as basic functionality, while leaving the rest as extensions. Extensions themselves would not disappear; they're part of the strength of the entire API.
     
  15. Ragemare

    Regular

    Joined:
    Apr 8, 2004
    Messages:
    333
    Likes Received:
    7
    Location:
    England
    That's probably what I meant :oops:

    It seems odd that MS would take themselves out of a situation where they could put a spanner in the works. Methinks they were pushed.
     
  16. ShootMyMonkey

    Veteran

    Joined:
    Mar 21, 2005
    Messages:
    1,177
    Likes Received:
    72
    Possibly, but the proposals that eventually came up were much more radical and seemed driven to change everything; then the 2.0 spec came out, and it was basically the 1.5 spec all over again. Also, I don't think the intention was to remove the possibility of extensions so much as to clear out the jumbled mess that was there: clean out the extensions that existed up to that point, construct a new API that drops fixed-function and rolls in a lot of core functionality from the extensions, and let the modification begin anew.

    Either way, regarding the PS3, the fact that you've got fixed hardware makes extensions meaningless, so everything unique to RSX would likely be rolled in as standard API features.
     
  17. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    The thing is that there do not have to be extensions for the core functionality (including extensions that were upgraded to core functionality). The problem is that the default OpenGL library and header for Windows are bloody old! I think it's OpenGL 1.2... If that library and header were updated with each spec update, then only optional functionality would remain as extensions.

    (I'm not sure what the situation is on Linux or Unix, though Apple historically has incredibly good OpenGL support. I'm actually quite jealous. ;))
     
  18. skirst

    Newcomer

    Joined:
    Sep 19, 2003
    Messages:
    5
    Likes Received:
    0
    Location:
    Cincinnati, Ohio USA
    OpenGL support is great under Linux, with Nvidia anyway. Nvidia has OpenGL 2.0 support for Linux, and the drivers come with everything you need (up-to-date header files) for development.
     
  19. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    That's just plain wrong.
    WGL_ARB_render_texture has been available for years.
    From the extension registry: http://oss.sgi.com/projects/ogl-sample/registry/ARB/wgl_render_texture.txt
     
  20. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.

    AFAIR it's OpenGL 1.1; it was to be updated to 1.2 with Win2k, but that never happened.
    There are a few small libs that let you use the extensions painlessly, like GLEW...
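    What loaders like GLEW automate under the hood is, in part, correct extension-string checking. A minimal sketch of that check, assuming nothing beyond the C standard library: a plain `strstr()` is not enough, because one extension name can be a prefix of another, so the token has to match exactly between separators.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Check for an exact space-delimited token in a GL-style extension
     * string. Illustrative stand-alone code; GLEW and friends wrap this
     * (plus entry-point loading) so you never write it by hand. */
    static int has_extension(const char *ext_string, const char *name)
    {
        size_t len = strlen(name);
        const char *p = ext_string;
        while ((p = strstr(p, name)) != NULL) {
            int start_ok = (p == ext_string) || (p[-1] == ' ');
            int end_ok = (p[len] == ' ') || (p[len] == '\0');
            if (start_ok && end_ok)
                return 1;    /* exact token match */
            p += len;        /* substring only; keep scanning */
        }
        return 0;
    }

    int main(void)
    {
        const char *exts = "GL_ARB_vertex_program GL_ARB_vertex_buffer_object";
        /* "GL_ARB_vertex" is a prefix of both tokens but not a token itself. */
        printf("%d %d\n", has_extension(exts, "GL_ARB_vertex_program"),
                          has_extension(exts, "GL_ARB_vertex"));
        return 0;    /* prints "1 0" */
    }
    ```

    Getting this wrong was a classic bug in hand-rolled extension checks, which is one of the reasons small helper libraries caught on.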
     


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.