nooneyouknow
Newcomer
Doomtrooper said:If Chalnoth only knew who he was talking to
I am not sure that would help
pocketmoon_ said:nooneyouknow said:I only read 2 pages of this and stopped reading because it was too funny.
Here is the bottom line folks:
Don't you just hate it when people say that?
Cg and HLSL are, if not twins, close cousins. Cg was developed to be as close to HLSL as possible. Cg is capable of supporting OpenGL on all platforms that support the required ARB (note - not Nvidia) extensions. It gives you the added ability to squeeze the best out of Nvidia hardware and the ability to carry your shaders, unchanged, from OpenGL to DX and back again.
My findings when comparing the compiled output are that Cg and HLSL are very close in terms of optimisation. On some shaders HLSL nudged ahead, on others Cg. There were one or two cases where Cg blew HLSL out of the water and vice versa! Personally I'd like to see Intel write a back-end for Cg; they really know how to write a compiler.
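To illustrate pocketmoon_'s point about carrying shaders unchanged between the two languages, here is a trivial pixel shader (a hypothetical example, not from the thread) whose source is accepted by both the Cg and the HLSL compilers, since the two share syntax, types, and intrinsics such as dot, normalize, and saturate:

```hlsl
// Simple N.L diffuse pixel shader -- compiles as both Cg and HLSL.
// (Illustrative only; semantics shown are the common COLOR/TEXCOORD set.)
float4 main(float3 normal   : TEXCOORD0,
            float3 lightDir : TEXCOORD1,
            uniform float4 diffuseColor) : COLOR
{
    // Lambertian term, clamped to [0, 1]
    float ndotl = saturate(dot(normalize(normal), normalize(lightDir)));
    return diffuseColor * ndotl;
}
```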
Chalnoth said:So? I don't see how this means much of anything. Other companies can write their own back-ends, if they choose to do so.
nooneyouknow said:The HLSL compiler was written with input from all IHVs (Nvidia and ATI at the very least). The CG compiler was written with input ONLY from Nvidia.
Chalnoth said:Why? Do you know who was on both teams? Are you aware of how many resources Microsoft and nVidia dedicated to their compilers? Do you know how much experience with compilers the relevant engineers had?
nooneyouknow said:MS compiler guys are FAR better than Nvidia compiler guys.
Just because Microsoft also develops other compilers doesn't mean that they'd do better at this one.
Chalnoth said:Cg is still a language, and any other IHV can make their own back-end. The front end is also open-source. Another IHV could add anything they wanted to.
nooneyouknow said:MS is an independent controller of HLSL; Nvidia, a competing IHV, is the SOLE controller of CG.
Chalnoth said:Cg supports OpenGL.
nooneyouknow said:HLSL supports ALL versions of Pixel and Vertex Shaders. CG does not.
Chalnoth said:Where did you get this from?
nooneyouknow said:HLSL will supply better shader code than CG.
Chalnoth said:The languages are close to identical.
nooneyouknow said:AND FOR THE LOVE OF GOD, HLSL is NOT the same as CG!!!!! Microsoft will tell you that any time you ask!!
Doomtrooper said:I always get flamed on this subject, and have since the original thread about CG was posted a year ago, but the first DX9 title that used CG didn't even run on a 9700 Pro in December, though it ran on a GeForce 3.
Gunmetal Demo.
I've released a new version of 3DA, if someone wants to try the demo on his 9500/9700. It's a DX8 game, so how can they use DX9 features? Maybe in the full version....
It basically checks some supported texture format/frame buffer combinations, which the GeForce and the reference rasterizer support, but it never actually uses those combinations, because it runs without errors on the Radeon 9500/9700; I just had to return different results in the "check if" phase.
Regards,
Thomas
_________________
http://www.tommti-systems.com
Hmm.. Let me ask you a question. How would ATI get features added to CG? Because if it is simply telling Nvidia what they want, or submitting code to Nvidia, I doubt ANY IHV would do so. Giving away feature sets of future hardware is not wise.
Doomtrooper said:No one said CG wouldn't run on a 9700, since CG is limited to two-year-old PS 1.1 technology....
Chalnoth said:The languages are close to identical.
nooneyouknow said:AND FOR THE LOVE OF GOD, HLSL is NOT the same as CG!!!!! Microsoft will tell you that any time you ask!!
It simply shows that they both have their strengths.
nooneyouknow said:Also, why do a couple of developers that support CG also support HLSL? Hmmm.. Oh yeah, they want something that works on everyone's card
nooneyouknow said:And if they are soooo identical, then why did Nvidia write it? They had to have known MS was doing HLSL. Heck, what about vice versa.
Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.
nooneyouknow said:Also, why do a couple of developers that support CG also support HLSL?
nooneyouknow said:Reverend said:Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.
nooneyouknow said:Also, why do a couple of developers that support CG also support HLSL?
Rev, I was merely trying to point out that CG must not be that great, since developers who have been using CG have also decided to use HLSL. My interpretation of that is that CG is obviously great for Nvidia hardware and HLSL is great for everyone else. Again, my opinion.
DaveBaumann said:The instruction scheduler on R300 is pretty effective as it is, so using Cg will not exactly "screw" ATI's hardware as its performance will still be pretty good.