codedivine
Regular
OpenGL 4.3 specifications are now up on the registry:
http://www.opengl.org/registry/
New addition: Compute Shaders! Wha?
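For anyone curious what the new stage actually looks like: a GL 4.3 compute shader is ordinary GLSL plus a declared work-group size, reading and writing images directly. A minimal, untested sketch (the 16x16 group size and the rgba8/binding-0 choices here are arbitrary, not from the spec announcement):

```glsl
#version 430
// One work group = 16x16 invocations; size is a tuning choice.
layout(local_size_x = 16, local_size_y = 16) in;

// Image bound from the host with glBindImageTexture at unit 0.
layout(rgba8, binding = 0) uniform image2D img;

void main() {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);
    vec4 c = imageLoad(img, p);
    imageStore(img, p, vec4(1.0) - c);  // invert colours in place
}
```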
There are some pretty specific scenarios Khronos is targeting with OpenGL compute shaders, not to mention a desire to make OpenGL more approachable to DirectX developers.
CL-GL interop is not exactly a substitute for something that exposes similar functionality yet lives in the same API and employs the same scheduling mechanism. This was a reasonably big advantage for DX CS, IMHO.
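To make the "same API, same scheduling" point concrete: a compute dispatch sits in the same command stream as the draw calls around it, with no acquire/release handshake on shared objects. A rough host-side sketch, assuming a current GL 4.3 context and an already linked GL_COMPUTE_SHADER program `prog` (names are illustrative):

```c
/* Sketch only: assumes a current GL 4.3 context; `prog` was built the
 * usual way (glCreateShader(GL_COMPUTE_SHADER), glCompileShader,
 * glAttachShader, glLinkProgram). */
glUseProgram(prog);
glDispatchCompute((width + 15) / 16, (height + 15) / 16, 1); /* same queue as draws */
glMemoryBarrier(GL_SHADER_IMAGE_ACCESS_BARRIER_BIT);
/* ...and keep drawing with the results immediately: no CL/GL interop,
 * no glFinish-style sync point, no second scheduler involved. */
```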
WRT CL, what the bloody hell? 6 years after G80 and all they can offer is G80+CLU. The hw is miles ahead of what CL is offering. Pathetic.
True, but does that mean OCL is a dead end and we'll see Khronos adding more features into compute shaders going forward? So now we have two ways of doing the same thing.
But but but...extensions!!! and it's open!!!
About OCL, not really, but OCL is far more generalist and doesn't directly concern gaming or graphics in general; it is more flexible and aimed at "computing" in a broad sense. OpenGL, being for graphics and games, needs to bring a more suitable solution directly with it for those specific purposes.
but OCL is far more generalist and doesn't directly concern gaming or graphics in general
Hasn't that been a big problem for OGL as a whole for years now?
Obviously GPU compute is useful for a lot of things outside of graphics, but it's also important to note that GPUs are really good for graphics too! In that regard compute shaders have been a major advantage (IMO) for D3D compared to OpenGL. In D3D, if I wanted to render to a texture and then use a compute shader to do some fancy maths on it, it was easy, since compute shaders are a first-class citizen and no interop is required. Or if I wanted to have a whole slew of shader code shared between a compute shader used for deferred rendering and a more standard forward-rendering pixel shader, that was also easy, since both shaders use the same language, compiler, and API.
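With GL 4.3, that first workflow finally works the same way: render to a texture via an FBO, then hand it straight to a compute shader in the same API. A hedged sketch, assuming a GL 4.3 context; `tex`, `blurProg`, and the dimensions are made-up names, and the program is assumed to declare a 16x16 work group:

```c
/* Sketch: `tex` was just rendered to through an FBO, `blurProg` is a
 * linked compute program reading/writing image unit 0. */
glBindImageTexture(0, tex, 0, GL_FALSE, 0, GL_READ_WRITE, GL_RGBA8);
glUseProgram(blurProg);
glDispatchCompute((texW + 15) / 16, (texH + 15) / 16, 1); /* round up to cover edges */
glMemoryBarrier(GL_TEXTURE_FETCH_BARRIER_BIT);
/* The texture can now be sampled by an ordinary fragment shader:
 * one API, one GLSL compiler, one command stream. */
```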
lol I love how there are still people convinced that extensions are a great thing. It really must be me...
Laughing at someone's opinion without explaining why it's funny isn't much of a contribution.