Will OpenGL 2.0 be the equivalent of DirectX 10/Next?

j^aws

Veteran
Just trying to gauge the relative features, or proposed features, of these two APIs. Are they comparable in features, or are there significant differences, advantages and disadvantages? :p
 
No.

GL2 is supposed to integrate GLSL into the core. Much like current ATI and NVIDIA drivers already do, but with official ARB blessing, whatever that means.

So you'll have a high level language to program vertex and fragment shaders, compiled at runtime by the driver itself. There will be no artificial functionality limits and, as a logical consequence, there will be no shader profiles like in the current DXG toolchain.

There's no need to wait for an official OpenGL 2.0 though. The functionality is already available.
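For illustration, here's a minimal sketch of that runtime compilation model, using the ARB extension entry points (GL_ARB_shader_objects / GL_ARB_fragment_shader) that current ATI and NVIDIA drivers expose. It assumes a current GL context, glext.h declarations, and that the ARB function pointers have already been fetched via wglGetProcAddress/glXGetProcAddressARB; error checking is omitted:

Code:
/* compile and bind a GLSL fragment shader at runtime; the driver,
 * not an offline tool, generates code for the actual hardware */
static void use_red_shader(void)
{
    const char *src =
        "void main()"
        "{"
        "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);" /* solid red */
        "}";

    GLhandleARB shader  = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    GLhandleARB program = glCreateProgramObjectARB();

    glShaderSourceARB(shader, 1, &src, NULL);
    glCompileShaderARB(shader);
    glAttachObjectARB(program, shader);
    glLinkProgramARB(program);
    glUseProgramObjectARB(program); /* fragment pipeline now runs the shader */
}

No profiles and no fixed instruction-count targets baked into the source: the compiler in the driver decides what fits.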
 
zeckensack said:
No.
.....
There's no need to wait for an official OpenGL 2.0 though. The functionality is already available.

So it's just an update to make things 'tighter' and 'faster' rather than any functionality enhancement, then?

Joe DeFuria said:
Standard GL will eventually get DX10 functionality (whatever that turns out to be), it'll just be 2 years or so later.

Curious as to why GL2.0 will get whatever DX10 functionality 2 years after DX10? :?
 
Well, it'll be some 2.x that will get it. The core of 2.0 will basically be DX9. It'll be the first OpenGL spec to require programmable fragment and vertex pipelines. I suspect that further versions after 2.0 will make other parts of the API programmable, such as primitive processing (something supposedly to be in DX Next), as well as providing more shader models.

EDIT: The reason why it may be 2 years is that it'll take that long for the ARB to settle on a spec. ATI and NVidia are sure to have their own extensions implementing the new functionality that will be available in DX Next, but there's no telling how those extensions will differ.
 
Jaws said:
zeckensack said:
No.
.....
There's no need to wait for an official OpenGL 2.0 though. The functionality is already available.

So it's just an update to make things 'tighter' and 'faster' rather than any functionality enhancement, then?
It's an update that makes GLSL an official part of a core revision. That integration isn't required to use GLSL, though. Because of the extension mechanism, you don't need a particular OpenGL version to get at functionality. This is as true for GLSL as it is for everything else in OpenGL.
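To make that concrete, here's a sketch of the check an application actually performs. The extension string names are the real ones; the helper functions are hypothetical and assume a current context:

Code:
#include <string.h>
#include <GL/gl.h>

/* naive substring match; a robust check would tokenize the string */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

/* an "OpenGL 1.x" driver can advertise GLSL today, no 2.0 required */
static int glsl_available(void)
{
    return has_extension("GL_ARB_shading_language_100")
        && has_extension("GL_ARB_shader_objects");
}

The version number the driver reports never enters into it.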

The main benefits of this move are marketing and, perhaps, that an integration into the core encourages IHVs to come through with implementations. This isn't really an issue if you're concerned with ATI and NVIDIA. Their drivers already support GLSL. It may however be something that motivates Intel, S3 and XGI to put that stuff into their drivers. After all, a "GL 2.0" checkbox can be printed on the box, just beside the "DX9 compatible" checkbox, and marketed. Much easier than "current drivers feature GLslang related extensions".

edit:
Jaws said:
Joe DeFuria said:
Standard GL will eventually get DX10 functionality (whatever that turns out to be), it'll just be 2 years or so later.

Curious as to why GL2.0 will get whatever DX10 functionality 2 years after DX10? :?
That's just Joe talking out of his ass again. Unextended OpenGL 1.2 already has more features than a DX9 device with no caps bits set. But who cares about that? Unextended (i.e. "standard") OpenGL implementations don't exist in practice, and neither do DX9 devices with zero caps set.
 
zeckensack said:
That's just Joe talking out of his ass again.

So, what's your estimate of when unextended GL will gain DX Next functionality?

Unextended OpenGL 1.2 already has more features than a DX9 device with no caps bits set. But who cares about that?

1) As a gamer I care what the hardware capabilities are, and don't care much about software fall-backs. I'm pretty sure that game developers care about hardware capabilities in GL as well, even if transparent fall-back exists.

2) I'll take your word about GL 1.2...though I was under the impression that DX9 (PS2.0) type fragment shader standard was only recently ratified? And in any case, how long after they were exposed in the DX9 API was the standard in GL ratified?
 
A committee of rivals who are only working together because they don't want to be too dependent on Microsoft is never going to equal the pace of Microsoft's more or less enlightened dictatorship. Even the ARB can't keep up, let alone core OpenGL.

If all the things mentioned in the DX Next presentation come to pass, it will be a huge leap beyond GL2. Ever since hardware overtook OpenGL, the API has been reactive, and there's no reason to assume that is going to change.
 
Joe DeFuria said:
zeckensack said:
That's just Joe talking out of his ass again.

So, what's your estimate of when unextended GL will gain DX Next functionality?
Don't know. It depends on the feature set of DX Next, in particular on whether or not PPPs (programmable primitive processors) are in. If PPPs are not in DX Next, the answer is 2.0.
Joe DeFuria said:
Unextended OpenGL 1.2 already has more features than a DX9 device with no caps bits set. But who cares about that?

1) As a gamer I care what the hardware capabilities are, and don't care much about software fall-backs. I'm pretty sure that game developers care about hardware capabilities in GL as well, even if transparent fall-back exists.
Exactly, hardware capabilities. Installing DX9 on a system doesn't guarantee that PS2.0 will work. It merely guarantees that PS2.0 can be used by applications written to the DXG API if the device supports it. Not all devices do. A GeForce 4 MX certainly won't.

That's a fine line, and it's the basic problem with your comparison. Core GL doesn't have a lot of features because it doesn't need a lot of features. The driver writers can expose whatever they want through extensions (and many extensions are cross-vendor and widely supported). DXG, OTOH, needs new runtime versions that accommodate lots of features, because any DX version limits the maximum usable feature set of the device (with the interesting exception of FourCC codes).

It's just not right to compare the maximum feature set you can use via a version of DirectX Graphics against the minimum feature set allowed to claim conformance to an OpenGL version. This is where I was going with the "no set caps bits" thing. Compare the minimum possible feature set for both. That would be silly, but fair.
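To illustrate the caps side of that comparison, a sketch in C (assuming d3d is an already-created IDirect3D9 interface; the runtime only reports what the device can do):

Code:
#include <d3d9.h>

/* DX9 being installed says nothing about PS2.0; ask the device */
static int device_has_ps20(IDirect3D9 *d3d)
{
    D3DCAPS9 caps;
    IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                             D3DDEVTYPE_HAL, &caps);
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}

A GeForce 4 MX fails this check no matter which runtime is installed.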
Joe DeFuria said:
2) I'll take your word about GL 1.2...though I was under the impression that DX9 (PS2.0) type fragment shader standard was only recently ratified? And in any case, how long after they were exposed in the DX9 API was the standard in GL ratified?
A DX9 device with zero caps bits set cannot even perform texture filtering. An OpenGL 1.2 implementation can. But as we all know, such things don't exist in practice. In practice, for any given graphics card, you'll have the same hardware capabilities available through both APIs, with possibly a few extras in OpenGL, and possibly more unified cross-vendor access to these features in DXG.

ARB_fragment_program (the assembly-style, floating-point-aware fragment shading API) is a bit older. See the revision notes at the end of its spec.

GLslang (the high level language stuff) came much later, and still appears to be in flux. Should be "final" RSN :LOL:
However, you can already play around with it on Radeon 9500+ and GeForce FX cards. Compilation will obviously fail if you exceed hardware limits.
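For example, hitting those limits is detectable at runtime through the same extension. A sketch, reusing the GL declarations from the earlier snippet (shader is the object handle from the compile call):

Code:
#include <stdio.h>

static void report_compile_status(GLhandleARB shader)
{
    GLint ok = 0;
    glGetObjectParameterivARB(shader, GL_OBJECT_COMPILE_STATUS_ARB, &ok);
    if (!ok) {
        char log[1024];
        glGetInfoLogARB(shader, sizeof(log), NULL, log);
        /* e.g. the shader exceeds the hardware's resource limits */
        fprintf(stderr, "GLSL compile failed:\n%s\n", log);
    }
}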
 
MfA said:
A committee of rivals who are only working together because they don't want to be too dependent on Microsoft is never going to equal the pace of Microsoft's more or less enlightened dictatorship. Even the ARB can't keep up, let alone core OpenGL.

If all the things mentioned in the DX Next presentation come to pass, it will be a huge leap beyond GL2. Ever since hardware overtook OpenGL, the API has been reactive, and there's no reason to assume that is going to change.

Exactly.

The ARB structure for standardizing the OpenGL environment has its pros and cons vs. the "Microsoft Dictatorship" way.

Speed of implementation of the standard happens to be one "con" of the committee model (GL ARB).

This is just a fact of life.
 
Joe DeFuria said:
Standard GL will eventually get DX10 functionality (whatever that turns out to be), it'll just be 2 years or so later.
What about those points where OpenGL is already matching WGF functionality, years ahead? Like GLSL?
 
I predict DX10 will get DX Next functionality approximately 2 years after DX9 :)

I also predict that DX10 will slap in a bunch of new tech with poorly designed APIs, which will be fixed in DX11 about the same time OGL 2.x ships with a more sanely designed spec.
 
Joe DeFuria said:
Exactly.

The ARB structure for standardizing the OpenGL environment has its pros and cons vs. the "Microsoft Dictatorship" way.

Speed of implementation of the standard happens to be one "con" of the committee model (GL ARB).

This is just a fact of life.
No, because we'll have vendor-specific extensions pretty much immediately upon the release of new hardware. There won't be anything you can do in DirectX that you won't be able to do in OpenGL. It may be somewhat more of a hassle, depending upon how well the IHVs work together, but you'll still be able to do it.
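A sketch of what that looks like in practice, reusing the hypothetical has_extension helper from earlier in the thread (the extension names are real shipping ones, picked for illustration):

Code:
/* pick the best available fragment path at runtime;
 * no new core GL revision required */
static void pick_fragment_path(void)
{
    if (has_extension("GL_NV_fragment_program")) {
        /* NVIDIA path: NV3x features, exposed when the hardware shipped */
    } else if (has_extension("GL_ATI_fragment_shader")) {
        /* ATI path */
    } else {
        /* ARB / core fallback */
    }
}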
 
Hopefully we'll know the final spec in a few weeks, but you can see what has been proposed for the next version in the latest meeting notes (from a couple of months ago).

I've said it before, and I'll say it again: the "2.0" version number seems to be more marketing-driven than feature-driven.
 
Well, given NVIDIA's and ATI's adoption of each other's extensions, it is apparent that they can work together within OpenGL. The real issue with the ARB and extensions is that one vendor's implementation may not support something that the other vendor has, or it may require something that that vendor does not support. The result may end up being a compromise that does not entirely expose all of a vendor's capabilities within the extension. The ARB version of NV_point_sprite, for example, lost the capability to specify the R texture coordinate.
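For reference, that difference shows up right in the API. A sketch; the tokens come from the two specs:

Code:
/* NV_point_sprite: the R texture coordinate is controllable */
static void enable_point_sprites_nv(void)
{
    glEnable(GL_POINT_SPRITE_NV);
    glTexEnvi(GL_POINT_SPRITE_NV, GL_COORD_REPLACE_NV, GL_TRUE);
    /* NV-only: choose what feeds the R texture coordinate */
    glPointParameteriNV(GL_POINT_SPRITE_R_MODE_NV, GL_S);
}

/* ARB_point_sprite: same mechanism, minus the R-mode control */
static void enable_point_sprites_arb(void)
{
    glEnable(GL_POINT_SPRITE_ARB);
    glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);
    /* no GL_POINT_SPRITE_R_MODE equivalent made it into the ARB spec */
}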
 
Joe DeFuria said:
Chalnoth said:
No, because we'll have vendor-specific extensions pretty much immediately upon the release of new hardware.

We're talking about standardized functionality, not vendor-specific extensions.

Sorry mate, but I don't think you "get it". (especially after the last discussion on extensions and whatnot)
 