The final OpenGL 2 Shading Language spec has been posted by 3DLabs, along with a bunch of other stuff.
http://www.3dlabs.com/support/developer/ogl2/index.htm
Hyp-X said:
It looks like Microsoft and 3DLabs made sure the two shading languages became as incompatible as possible.

I thought from the description given in NVIDIA's Siggraph 2003 paper that there weren't that many differences between the various "C-like" shading languages. Perhaps I just haven't looked closely enough at them all?
DemoCoder said:
Well, they made some really dumb choices, like instead of calling something float4 to remain in sync with HLSL and Cg, they call it vec4.

Please feel free to tell me I'm deranged, but wasn't the OGL 2 spec/proposal out before the others?
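For readers following along, the naming divergence being argued about looks like this in practice (a minimal, hand-written sketch; the type names are the standard built-ins of each language):

```glsl
// GLSL (3DLabs / OpenGL 2.0 proposal): the four-component
// floating-point vector type is spelled "vec4".
vec4 color = vec4(1.0, 0.0, 0.0, 1.0);

// The same declaration in HLSL or Cg, for comparison:
//   float4 color = float4(1.0, 0.0, 0.0, 1.0);
```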
Popnfresh said:
I thought your original argument was that they should harmonize with existing practice, which they have done.

DemoCoder said:
vec4 is a vector of what? ints? floats? doubles? float4 is a vector of floats. "float" implies IEEE. What does "vec" imply?

Nothing, and that's the point. It explicitly gets away from "this is represented as this", in the same way that everything else in GL does.
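For what it's worth, in GLSL as specified the element type is carried by a prefix rather than by the word "vec", and none of the names pin down a bit-level representation (a minimal sketch using the spec's built-in types):

```glsl
// GLSL vector types: the prefix, not "vec" itself, selects the
// element type -- and no IEEE representation is mandated.
vec4  position = vec4(0.0);         // four floating-point components
ivec4 indices  = ivec4(0, 1, 2, 3); // four integer components
bvec4 mask     = bvec4(true);       // four boolean components
```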
DemoCoder said:
Again, according to your argument, existing practice in high-level languages is to have "Int32" or "Integer" and "Float" objects defined as typedefs or in APIs. Therefore, if they harmonized against existing practice across all programming languages, they should have used "Integer" or "Int32" instead of "int".

I'm a big C fan, but C is terrible in this respect. The wishy-washy rules of "int is this big, but might be bigger" are terrible. Why else would so many projects explicitly use "int32", "uint32", and similar typedefs - and why do so many projects standardise on "byte", "word", and "dword" types?
DemoCoder said:
I mean, why take an adversarial not-invented-here position on something so trivial to fix?

I don't think it's meant to be adversarial - just wanting to do it 'properly'.
DemoCoder said:
IMHO, there should be one C-style HLSL, and the only differences between the MS and GL versions would be the standard library of functions provided, the global variables, and the API state bindings - just like with C compilers today. Single syntax; multiple compiler implementations, provided libraries, and OS bindings.

Would there being only one HLL be a good thing? Whether it is or not, it's not likely to happen, because everyone believes every HLL has its flaws.