OpenGL 4.0

Wow, I had totally lost hope for OpenGL, but new versions are popping up fast now.
Is the reason there's now an OpenGL 4.0 that it won't work on pre-DX11 hardware (or hardware without double-precision support)? If it's fully backwards-compatible, I don't understand why OpenGL 3.3 was released as well.

If that's the case, it's getting a bit cluttered IMHO.
DX9 hardware - use OpenGL 2.x, or OpenGL 3.x in "non-pure" mode (or whatever that was called)
DX10 hardware - use OpenGL 3.x, with some features like geometry shaders available only as extensions
DX11 hardware - use OpenGL 4.0

It would be a mess having to support all of that at once.
 
Wow, I had totally lost hope for OpenGL, but new versions are popping up fast now.
Is the reason there's now an OpenGL 4.0 that it won't work on pre-DX11 hardware (or hardware without double-precision support)? If it's fully backwards-compatible, I don't understand why OpenGL 3.3 was released as well.

As it looks to me, OpenGL 3.3 is intended for DirectX10-class devices, exposing features common to such devices that somehow didn't make it into previous 3.x revisions (instancing with divisors, sampler objects, and dual-source blending being the biggest ones). This feature set was evidently large enough to justify producing this revision alongside 4.0.
 
Has OpenGL 4.0 caught up with DX11? I mean, is there any missing functionality now, apart from multi-threading? My guess is that the ARB expects devs to create different contexts on different threads and then share objects among them. Is that possible?
 
As it looks to me, OpenGL 3.3 is intended for DirectX10-class devices, exposing features common to such devices that somehow didn't make it into previous 3.x revisions (instancing with divisors, sampler objects, and dual-source blending being the biggest ones). This feature set was evidently large enough to justify producing this revision alongside 4.0.

Everything from GL 3.0 through 3.3 requires DX10-class (SM4) hardware.
 
Has OpenGL 4.0 caught up with DX11? I mean, is there any missing functionality now, apart from multi-threading? My guess is that the ARB expects devs to create different contexts on different threads and then share objects among them. Is that possible?
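Sharing objects between per-thread contexts is possible, yes: each thread gets its own context, and the contexts share textures, buffers, and shaders. A rough pseudocode sketch using the WGL names (error handling omitted; `dc` and `worker_dc` stand in for whatever device contexts your window system gives you):

```
/* Main thread: create two contexts that share objects. */
HGLRC main_rc   = wglCreateContext(dc);
HGLRC loader_rc = wglCreateContext(dc);
wglShareLists(main_rc, loader_rc);   /* objects created in one are visible in the other */
wglMakeCurrent(dc, main_rc);
/* ... hand loader_rc to a worker thread ... */

/* Worker thread: */
wglMakeCurrent(worker_dc, loader_rc);
/* glGenTextures / glTexImage2D / glBufferData here;
   the finished objects can then be used from main_rc */
```

A context can only be current on one thread at a time, and you still need your own synchronization to know when an object uploaded on the worker is ready for use on the main thread.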

There are a handful of smaller features that DirectX11 supports that the OpenGL 4.0 core spec still lacks:
  • Atomic operations and Unordered-Access-View accesses from the Pixel Shader (presumably useful for order-independent transparency; no OpenGL extension appears to exist)
  • BC6H and BC7 compressed texture formats (available as ARB extension)
  • Texture operations: gather4 and LOD-query (available as ARB extension)
  • Neither OpenGL nor OpenCL have any constructs equivalent to DirectX11's append/consume buffers.
Still, it's much better than the situation with OpenGL 3.0 - which targeted DX10-class devices but lacked major headline features (e.g. geometry shaders).

There will probably be an OpenGL 4.1 somewhere down the line to tie up some of these loose ends.
 
There are a handful of smaller features that DirectX11 supports that the OpenGL 4.0 core spec still lacks:
  • Atomic operations and Unordered-Access-View accesses from the Pixel Shader (presumably useful for order-independent transparency; no OpenGL extension appears to exist)
  • I think they expect this to be covered by OpenCL.

    The missing texture-format support is disappointing, though. Why leave it out of the spec when all the hardware you're targeting supports those formats?
 
Yeah, the new BC formats not being core is a disappointment, but at least it's an ARB extension. Also, I'm very glad to see new versions of OpenGL coming out a lot closer to the DX version they're meant to compete with. But I don't know how Khronos can state "most widely adopted 2D and 3D graphics API" with a straight face, without any contextualisation like "in the super high-end CAD market". OTOH, they pretty much yielded the 3D games field to DX, so perhaps it's not ye olde spin. Not that it matters much.

Did anyone read the checked version? I wonder if the original 3.0 deprecation model couldn't just be erased from all the history books like a bad memory. Is anyone not using a forward-compatible context, without the 3.2+ compatibility profile?
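(For anyone following along: a forward-compatible core context is requested through context-creation attributes. An illustrative fragment using the real WGL_ARB_create_context names - `dc` is assumed to be a valid device context, and GLX has equivalent `GLX_CONTEXT_*` attributes:)

```c
/* Request a 3.2 core, forward-compatible context; deprecated
   functionality (including the compatibility profile) is excluded. */
const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 2,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    WGL_CONTEXT_FLAGS_ARB,         WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    0  /* attribute list terminator */
};
HGLRC rc = wglCreateContextAttribsARB(dc, NULL, attribs);
```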

Pet deprecated-feature gripe: what are people using instead of line widths > 1 (glLineWidth)?
 
OpenGL

I have watched the 3D and gaming industry for a long time, and I truly believe that developing in OpenGL is going to be more important over the long haul. Unfortunately most new developers are being herded into learning strictly DirectX. I feel that OpenGL has a much longer history, and its open-source nature makes it something that will always be better - maybe not the first drivers for some games or video cards, but a more complete API in general. Most video card manufacturers have built their rush demos using OpenGL, and I think that is a very important point to consider.

Along with that, Microsoft's dominance in the OS business has been eroding over time. I am also a big fan of Linux, so there is some bias, but what an OS Linux is... it can be scaled down to run on a cell phone! When developers think about building systems that are ultra fast and stable, Linux and Unix variants like OpenBSD come to mind. Even the Mac runs Linux at the core, and that says a great deal. Not that I use Linux on a regular basis - I am primarily a Windows IT specialist, and I do all my 3D and compositing with Windows programs.

I have been around long enough to remember how advanced SGI was with their development of OpenGL, and I think there are many pieces of legacy technology from their work that have been overlooked.
 
Even the Mac runs Linux at the core, and that says a great deal.

If you mean Apple's Macs, OS X has lots of FreeBSD in it, but not Linux.

I have been around long enough to remember how advanced SGI was with their development of OpenGL, and I think there are many pieces of legacy technology from their work that have been overlooked.

Agreed, turning the IRIS GL library into the open OpenGL specification was definitely a massive Good Thing, but which overlooked pieces do you mean?
 
Unfortunately most new developers are being herded into learning strictly DirectX. I feel that OpenGL has a much longer history, and its open-source nature makes it something that will always be better - maybe not the first drivers for some games or video cards, but a more complete API in general.
The issue is that OpenGL has just been too far behind the times over the past few years. I'm not talking mainly in terms of features (most of which OpenGL 4.0 now adds) but more in terms of the basic API itself... from that point of view I was disappointed to see that no overhaul of the state management mechanism made it into 4.0 (although they have been discussing direct state access and other proposals, which is a good sign).

So it's great to see OpenGL catching up "more quickly" than in the past, but I'd love to see them take a leadership role - say, with some speculative features, or maybe going to a fully functional-style API. Right now they're doing an okay job of providing a "DirectX for other platforms" (assuming driver support is swift, which I expect it will be, considering they're just following DX at this point), but there's still no compelling reason to use it on Windows.
 
So it's great to see OpenGL catching up "more quickly" than in the past, but I'd love to see them take a leadership role - say, with some speculative features, or maybe going to a fully functional-style API. Right now they're doing an okay job of providing a "DirectX for other platforms" (assuming driver support is swift, which I expect it will be, considering they're just following DX at this point), but there's still no compelling reason to use it on Windows.
It seems like at this stage the situation is somewhat of a mirrored version of the DX7 days - back then, D3D was catching up fast because (a) it had a model to follow, and (b) it was gaining an audience. These days GL is gaining an audience on non-Windows desktops ('look, mom, I can play games on non-Windows!'), has a model to follow, and the GL ES synergy surely helps give it a nudge there too.

Apropos, I notice that many talented developers from the new generations choose to use GL on Windows. The rationale is simple - if you even envision your code running on non-Windows platforms, GL is the only choice. And this is before we even consider multiplatform and embedded developers (like me), for whom going DX could be anything from moronic to suicidal.
 