OpenGL deprecation myths

With or without the additional shader stages?

Unfortunately I only had access to the high-level planning for Direct3D 10, where tessellation was listed as a feature. But based on what I have seen, I expect it was equivalent to the tessellation on the Xbox 360.
 
If it's not going to help AMD/NV/Intel move more hardware, I don't see it happening. Besides that, DX provides not only 3D but a complete framework (audio, input, etc.), whereas OGL covers only 3D graphics, hardly the whole picture.

Does Khronos sit down with game developers to gather feedback? I know Carmack has posted comments over the years on what he felt OpenGL needed to be a good API for games, but I don't remember many others doing so.

Well, Blizzard, EA and TransGaming are contributing members of Khronos, and they gave input on 3.0.
Additionally, Khronos held a "school" at GDC, so they do seem interested in getting developers to use it, but (as far as I can tell) not quite to the point of going out and asking for input.

repi twittered that he was going to a meeting "with the real powers that be at Intel in a few weeks about future gamedev parallel SW/HW utopia."

Khronos could probably benefit from doing something similar with lead graphics devs from the game industry to find out what they think the major hurdles are to using OpenGL instead of DX.
 
And how is it done there?

It's a fixed-function unit that can generate barycentric coordinates as additional vertex shader inputs. These coordinates are based either on a fixed tessellation factor or on per-edge factors read from memory. Adaptive tessellation would therefore require two passes instead of one.
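For illustration, here is a minimal C sketch (not actual Xbox 360 SDK code; the struct and function names are made up) of what that amounts to: the tessellator only hands the vertex shader a set of barycentric weights per generated vertex, and the shader blends the patch's control vertices itself.

```c
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Blend three control vertices of a triangle patch by the barycentric
 * weights (b0, b1, b2) that the fixed-function tessellator emits. */
static Vec3 blend_barycentric(Vec3 v0, Vec3 v1, Vec3 v2,
                              float b0, float b1, float b2)
{
    Vec3 p;
    p.x = b0 * v0.x + b1 * v1.x + b2 * v2.x;
    p.y = b0 * v0.y + b1 * v1.y + b2 * v2.y;
    p.z = b0 * v0.z + b1 * v1.z + b2 * v2.z;
    return p;
}

int main(void)
{
    Vec3 tri[3] = { {0, 0, 0}, {1, 0, 0}, {0, 1, 0} };
    /* The tessellator would supply coordinates like these for each
     * generated vertex; (1/3, 1/3, 1/3) is the triangle's centroid. */
    Vec3 c = blend_barycentric(tri[0], tri[1], tri[2],
                               1.0f / 3, 1.0f / 3, 1.0f / 3);
    printf("centroid: %f %f %f\n", c.x, c.y, c.z);
    return 0;
}
```

Since only fixed or pre-written per-edge factors are available to the unit, computing the factors on the GPU first and then tessellating against them is what forces the second pass mentioned above.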
 
OpenGL was not created for games, and it shows. MS is in full control of DX and can add whatever features they want without having to run them through a committee, and they clearly targeted games. DX being the dominant API for games is irrelevant because those games are written for Windows, so you might as well use Microsoft's proprietary APIs anyway if it makes your life as a developer easier.

If you want to go multiplatform you use OpenGL, of course.

I view OpenGL's model as something similar to X Windows: strict separation of functionality based on the usage requirements and hardware of the time, resulting in very efficient use of resources, but also a design that can be very restrictive and break down severely.

A while back, I was running my PCI-Express slot at 1x. In Windows with DirectX, games often slowed to a crawl (even old DX7 games) on a 9800 GTX+. Under Wine, translated to OpenGL, the same games still ran at full speed. It's not identical code, but I'd assume it's as close as you can get, probably closer even than a port of a D3D engine to OpenGL. My guess: OpenGL lends itself more to a dumb display model and not so much to a coprocessor model.
 
Sounds like a driver issue to me, not an API issue. In Windows, some resources were probably being placed in system memory, and that caused slow performance over the slow PCIe slot. The OpenGL driver under Wine was probably putting those resources into the GPU's board memory.
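To make the placement point concrete, here's a minimal sketch in C, assuming an OpenGL 1.5+ context and an already-initialized extension loader such as GLEW (the function name is made up): the usage hint passed to glBufferData is only a hint, and it's ultimately the driver that decides whether the data sits in system memory (crossing the PCIe bus on every draw) or in video memory.

```c
#include <GL/glew.h>

GLuint upload_static_mesh(const float *verts, GLsizeiptr bytes)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    /* GL_STATIC_DRAW: written once, drawn many times. A driver that
     * honors the hint keeps the data in video memory; one that leaves
     * it in system memory forces a bus transfer per draw, which is
     * exactly the kind of thing that crawls on a 1x PCIe slot. */
    glBufferData(GL_ARRAY_BUFFER, bytes, verts, GL_STATIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    return vbo;
}
```

The D3D side has equivalent levers (pool/usage flags), which is why the symptom above points at driver placement decisions rather than the API itself.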
 