CG==HLSL or CG != HLSL?

-------------------------------------------------
1. The Cg/HLSL Language
-------------------------------------------------
1.1: What's the difference between nVidia's Cg and Microsoft's HLSL
(High Level Shading Language)?

A: Cg and HLSL are actually the same language! Cg/HLSL was co-developed
by nVidia and Microsoft. They have different names for branding
purposes. HLSL is part of Microsoft's DirectX API and only compiles
into DirectX code, while Cg can compile to DirectX and OpenGL.

In this FAQ, Cg and HLSL can be used interchangeably.
This is 100% wrong, AFAIK.
 
When Cg was announced, Nvidia stated that they developed Cg and HLSL along with Microsoft and that the two were very similar. I don't think you can say they're the same thing, though. I haven't developed with Cg; that's just what I've read.
 
Cg and HLSL have some syntax differences. The Cg compiler is, however, HLSL-compatible AFAIK.
 
Cg is also a high-level shader language, just not the one M$ primarily developed for DX9.

Cg has float indices on arrays, doesn't it?
 
I do remember someone at Nvidia claiming in an interview that you should be able to write Cg code that will also compile using HLSL... though I've never tested that claim myself. Like Humus said, I'm sure there are some syntax differences that would require some minor rewriting.
 
They are the same language. The implementations are different, however.

You can take an HLSL shader and compile it with Cg, and the other way around if you wish.
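As a rough illustration of that claim, here is a minimal vertex shader written in the common Cg/HLSL subset. The types, semantics, and `mul()` intrinsic are shared by both languages; the entry-point name `main` and the parameter names are just conventions, not required by either compiler.

```hlsl
// A minimal vertex shader in the common Cg/HLSL subset.
// float4, float4x4, the POSITION semantic, uniform parameters,
// and mul() are all accepted by both compilers.
float4 main( float4 pos : POSITION,
             uniform float4x4 modelViewProj ) : POSITION
{
    // Transform the vertex from model space into clip space.
    return mul( modelViewProj, pos );
}
```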
 
Well, I can't really say that I've looked at Cg; I was under the impression that it was different from HLSL. Unless you can mix and match both Cg and HLSL in the same shader, I would still call them different. I would just say the Cg compiler also implements an HLSL compiler, and vice versa.
 
Those that saw the full leaked ATI presentation a little while back may also remember a quote from MS decrying Cg's similarity to HLSL.
 
OMG this means OGL supports MS HLSL via CG ( and nvidia extension ) LOL what did we need GLSlang for?
 
bloodbob said:
OMG this means OGL supports MS HLSL via CG ( and nvidia extension ) LOL what did we need GLSlang for?
So that it'd be supported (efficiently) on something other than an NV chip?
 
Simon F said:
bloodbob said:
OMG this means OGL supports MS HLSL via CG ( and nvidia extension ) LOL what did we need GLSlang for?
So that it'd be supported (efficiently) on something other than an NV chip?
Meh, you'll have to write a compiler for GLSlang anyway, so it wouldn't be all that different from writing the backend for Cg.

Of course, GLSlang gives some other features for your work, plus is not from that pit of evil, NVIDIA.
 
stevem said:
Simon F said:
Seriously though, this Siggraph Paper describes some of the differences between CG, HLSL, and GLSLANG.
That's an interesting paper. I also found his papers on VLIW Fragment Pipeline & F-Buffer useful. Thanks for the link.
No worries. By chance I sat next to Bill Mark at this year's Graphics Hardware conference - he seemed like a really nice guy.

FWIW, I can recommend Tim Rowley's Web site (easy to find with Google) - it has a lot of links to recent Siggraph and Graphics Hardware Papers.
 
dominikbehr said:
but really, CG is supposed to be a superset of HLSL
so it would be CG >= HLSL
It is not a superset of HLSL. They are similar, but different.

Code:
// In Cg:
tex1Dproj( sampler1D, float4( x, y, z, w)); // fails to compile - ambiguous overloaded function reference
tex2Dproj( sampler2D, float4( x, y, z, w)); // fails to compile - ambiguous overloaded function reference

//In HLSL:
tex1Dproj( sampler1D, float4( x, y, z, w)); // equivalent to tex1D( sampler1D, float( x/w));
tex2Dproj( sampler2D, float4( x, y, z, w)); // equivalent to tex2D( sampler2D, float2( x/w, y/w )) ;
-mr. bill
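For portability across both compilers, the projective divide can also be written out by hand, which sidesteps the overload ambiguity entirely. A sketch in the common subset (the helper name `projSample` is made up for illustration):

```hlsl
// Portable alternative: do the projective divide explicitly,
// then use plain tex2D, which both Cg and HLSL accept unambiguously.
float4 projSample( sampler2D tex, float4 coord )
{
    return tex2D( tex, coord.xy / coord.w );
}
```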
 