When OpenGL 3 comes out it will only be for Shader Model 2 cards, and a few months later the spec for Shader Model > 2 will come out. In between, how will developers interface with GL for support of SM3.0+?
> Will they use the Mt Evans interface with extensions to access SM3+ features or will things be as they are now?

OpenGL doesn't have "shader models". A GLSL if statement will compile down to a branch on SM3 hardware if the driver feels it'll be faster that way. Any formats for textures, buffers and render targets will also be supported out of the box, as creation will just fail if the hardware doesn't support the chosen format.
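To illustrate the branching point, here is a minimal sketch of a GLSL fragment shader (the uniform names are invented for the example). The same source compiles on SM2- and SM3-class parts; the driver decides whether the `if` becomes a real dynamic branch or a compute-both-and-select:

```glsl
uniform sampler2D baseMap;   // invented name
uniform float useDetail;     // toggled by the application; invented name

void main()
{
    vec4 color = texture2D(baseMap, gl_TexCoord[0].xy);
    // On SM3-class hardware the driver may emit a true branch here and
    // skip the extra work; on SM2-class hardware it will typically
    // evaluate both paths and select the result.
    if (useDetail > 0.5) {
        color.rgb *= 0.5 + 0.5 * color.a;
    }
    gl_FragColor = color;
}
```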
> It's just not correct to say that OpenGL 3 won't support SM3+ hardware.

And now I see you were actually asking a question, not making a statement. Apologies.
To an outside observer like me it seems Microsoft is very prolific compared to OGL, with what, 2-3 revisions of DX9 and now DX10 & DX10.1 within their API's lifetime. Is that a good thing for OGL devs or...?
OpenGL 3.0 is the first major rewrite of OpenGL since it was born in the early 90's. They are making sure it'll be as simple as possible, so that drivers are more robust and faster, and of course as future-proof as the previous OpenGL was...
Starting anew, I take it they will be using today's hardware as a baseline and their imagination as the future? How do you see to it that extensions are supported by HW devs?
Thank you for the informative link, Killer-Kris.
It's just not correct to say that OpenGL 3 won't support SM3+ hardware. It'll support a lot of features of this hardware, but not all of them (geometry shaders, for instance).
> So from day one it was basically on par with SM4.0+ with supported features

That's terribly untrue... language design issues aside, GLSL needs at the very least EXT_gpu_shader4 to get anywhere near SM4.0-style features and functionality... and that extension changes a big chunk of stuff in GLSL!
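As a sketch of what that extension changes (the sampler name is invented for the example), integer texel fetches and bitwise operators only become available in GLSL once EXT_gpu_shader4 is enabled:

```glsl
#version 120
#extension GL_EXT_gpu_shader4 : require

uniform sampler2D dataMap; // invented name

void main()
{
    // texelFetch2D: exact integer addressing, no filtering --
    // an EXT_gpu_shader4 addition, mirroring SM4-style loads.
    vec4 texel = texelFetch2D(dataMap, ivec2(gl_FragCoord.xy), 0);
    // Bitwise operators on integers are likewise new with this extension.
    int bits = int(texel.r * 255.0) & 15;
    gl_FragColor = vec4(vec3(float(bits) / 15.0), 1.0);
}
```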
> Of course the problem with that is that the hardware does not necessarily support it and it leaves it up to the developer to deal with it.

Incidentally, this is one of the biggest issues with using GL in a production environment. You may write some nice code that works flawlessly on whatever hardware you are testing/debugging on, but something as seemingly innocuous as a driver update could make everything break. New cards and different IHVs almost always require new paths for non-trivial programs, which poses a maintainability and support nightmare. FBOs are a particularly terrible spec where these issues are concerned (there's no guarantee that any combination of formats will ever work... the driver is free to just veto stuff at will, with no way to query what DOES work), but GLSL has its share of similar issues.
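The FBO complaint boils down to the "create and check" pattern below; a sketch in GL2-era C using EXT_framebuffer_object (it assumes an extension loader has set up the entry points, and the function name is invented):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* The spec gives no way to ask up front whether a format combination
 * works as a render target -- you must build the FBO and check after
 * the fact, and "unsupported" is the only answer you get back. */
GLboolean try_fbo_format(GLenum internal_format, GLsizei w, GLsizei h)
{
    GLuint fbo, tex;
    GLenum status;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    glGenFramebuffersEXT(1, &fbo);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                              GL_TEXTURE_2D, tex, 0);

    status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);

    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    glDeleteFramebuffersEXT(1, &fbo);
    glDeleteTextures(1, &tex);

    return status == GL_FRAMEBUFFER_COMPLETE_EXT;
}
```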
> Surely geometry shaders will be supported via extensions within the object model when GL 3.0 comes out?

Possibly, but I think they'd rather wait to implement it until 3.1, where they're going to add SM4 support "for real". It's supposed to be released about 6 months after the 3.0 spec. (So that's at least a year if history repeats.)
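For reference, this is roughly what exposing geometry shaders via extension looks like on SM4 hardware today: a minimal pass-through geometry shader using EXT_geometry_shader4 (a sketch; the input/output primitive types and vertex counts are configured from the API side):

```glsl
#version 120
#extension GL_EXT_geometry_shader4 : require

// Pass every input vertex through unchanged.
void main()
{
    int i;
    for (i = 0; i < gl_VerticesIn; i++) {
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }
    EndPrimitive();
}
```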
> The driver is free to just veto stuff at will with no way to query what DOES work

Isn't that one of the main things they're looking to change in OGL 3, though? Hardware is still fairly free to expose what features it supports, but if you try to do something it doesn't, creation of that format object, VAO, render target, etc. will fail. GL3 should never fall back to software rendering like GL2 can at times.
> So do you believe we will see a break in backwards-compatibility at OGL3, Andy?

I'm not Andy, but GL3 will be a clean break from GL2. This is done to clean up the API (there's just too much legacy crud in there that no one should use) and consequently make the driver writers' jobs easier. There is supposed to be some interoperability between GL2 and GL3, though, meaning you can switch to GL3 while still using some of your legacy GL2 code (resources like textures and buffers can be used across versions, etc.). I don't think this has been properly confirmed yet, though; more of a thing they hope they're able to do, I think.
> Possibly, but I think they'd rather wait with implementing it until 3.1 where they're going to add SM4 support "for real". It's supposed to be released about 6 months after the 3.0 spec. (So that's 1 year at least if history repeats.)

I've never really understood how OpenGL worked. From a Direct3D developer's point of view, it is pretty simple: DirectX 9.0 supports SM2, DirectX 9.0c supports SM3, DirectX 10 supports SM4. Each DirectX revision contains a "must-have" set of features (especially true of DirectX 10, where most features are guaranteed to "just work").
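The Direct3D side of that contrast is a plain up-front capability query rather than create-and-see; a sketch in C (the helper name is invented), checking for SM3 support before creating a device:

```c
#include <d3d9.h>

/* Ask the runtime what shader versions the HAL device supports.
 * With D3D this is known before any resource is created. */
int supports_sm3(IDirect3D9 *d3d)
{
    D3DCAPS9 caps;
    if (FAILED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps)))
        return 0;
    return caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0)
        && caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
}
```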
Now you can of course argue that a capability lookup system would be better than the "create and see" scheme (many have), but that's another story.