Richard "ex-NVIDIA" Huddy talks Cg

Guest

Newcomer
Some quotes from the DirectX Mailing List :

HLSL is the abbreviation which Microsoft has been using for their High
Level Shading Language. It's part of the impending DX9 release.

And with Microsoft's compiler team I'd reckon they will do an awesome
job of optimizing code which is submitted to HLSL.

When Microsoft deliver HLSL it seems to me that the arguments for using
Cg (on DirectX) suddenly become much weaker.

>> The optimization can't be better than what you can do with the vertex
>> shader. So Cg has an additional layer above the vertex shader and has
>> the same restrictions at least.
>
> Yes, that's clear. But I was wondering if they know something we don't
> know about instruction re-arranging, etc. Or even vendor-specific things,
> although I remember someone from nVidia saying that Cg is 100% free from
> something like that. Anyway, it's still nice to have your shaders
> optimized automatically for you and then do your own further
> optimizations.

Strictly speaking they're working with the same facts that you get to work with, so you might expect little or no difference.

You'd be wrong.

Human coding is very time consuming - especially vectorizing some algorithms which don't naturally express themselves in that way. And a compiler has the persistence to get that kind of stuff right.

Plus, yes, it can know all of the hardware specific issues relating to the platform. That means it can eliminate almost all possible stalls and should be able to do a great job.

And, then on the other hand... The June version of the compiler is pretty naïve about some things and may cost you as much as twice as many instructions as hand coding it.

So, that's complicated. Right now I'd say the optimiser has a long way
to go. But as a long-term bet it's pretty safe to say that a compiler is a much more efficient way of handling the problem.

And because NVIDIA have released the source to the Cg compiler on an open-licence basis it should be straightforward for other IHVs to compete - if they choose to do so.

On the other hand I'm not sure why a DirectX programmer would choose Cg over HLSL. They're supposed to be fully compatible so maybe it makes no difference, but at least you _know_ that HLSL will work with all vendors' hardware.

Link : http://discuss.microsoft.com/SCRIPTS/WA-MSD.EXE?A1=ind0207d&L=directxdev#20 See under Shader Optimizations.

Sounds more and more like Cg is, in essence, a compiler for Microsoft's HLSL (not really NVIDIA's, since the NVIDIA language is morphing into the Microsoft-created language) that happens to compile to NVIDIA OpenGL extensions.

G~
 
Yep, before you start praising him don't forget that he seems to consider platform independence a total non-issue :) DirectX all the way.
 

Some quotes from the DirectX Mailing List :

HLSL is the abbreviation which Microsoft has been using for their High
Level Shading Language. It's part of the impending DX9 release.

And with Microsoft's compiler team I'd reckon they will do an awesome
job of optimizing code which is submitted to HLSL.

When Microsoft deliver HLSL it seems to me that the arguments for using
Cg (on DirectX) suddenly become much weaker.

Hrm, well this appears to say that there is really nothing all that special about nvidia's Cg at all. Much ado about nvidia.... It looks as though MS will have an HLSL out with DX 9, so... what is the point of Cg?

And because NVIDIA have released the source to the Cg compiler on an open-licence basis it should be straightforward for other IHVs to compete - if they choose to do so.

Why would anyone "choose" nvidia's solution over Microsoft's solution?

On the other hand I'm not sure why a DirectX programmer would choose Cg over HLSL. They're supposed to be fully compatible so maybe it makes no difference, but at least you _know_ that HLSL will work with all vendors' hardware.

Ummm, correct me if I am wrong here, but it seems that nvidia is attempting to have developers believe that its solution would be better than other HLSLs? Then after developers begin to use it, force other IHVs to use their compiler? .. or WTF... Cg looks like a PR move in light of DX 9. Does anyone have an advantage by using Cg outside of nvidia? Or have I gathered the wrong impressions from the above quotes from Mr Huddy? Just trying to make an objective decision on Cg is really turning into a bloody headache....
 
Hrm, well this appears to say that there is really nothing all that special about nvidia's Cg at all. Much ado about nvidia.... It looks as though MS will have an HLSL out with DX 9, so... what is the point of Cg?

The point is to allow developers to make faster shaders, and to port those shaders to GL more easily, for Nvidia hardware. Nope, nothing so special that other companies aren't doing, or aren't going to do, the EXACT same thing or similar things to make shaders easier/better on their platform (see RenderMonkey).

Why would anyone "choose" nvidia's solution over Microsoft's solution?

So they too can create optimized shaders and port to GL. Other IHVs only have to write the backend to allow their customers to do the same. Or they can choose to do their own thing as well.

Ummm, correct me if I am wrong here, but it seems that nvidia is attempting to have developers believe that its solution would be better than other HLSLs?

Yes, they are trying to make it more convenient for developers to make optimized shaders for Nvidia gfx cards on DX and GL.

Then after developers begin to use it, force other IHVs to use their compiler? .. or WTF... Cg looks like a PR move in light of DX 9.

Force? No. Nvidia is not in a position to do so. Developers are. If they like it they may demand such tools from other IHVs. Other IHVs can and will make their own tools to facilitate optimized shaders on their cards for many platforms.

Does anyone have an advantage by using Cg outside of nvidia?

Developers may find it convenient to use. Other IHVs may use it as a beginning/example for their own similar tools.

For crying out loud people! All 3D companies make tools and have developer relations to help make it easier for developers to write applications for their hardware. Of course these companies will promote their tools/technology to advertise to developers. They want developers to make games that play well on their hardware so that they can sell more cards. Simple as that.
 
For crying out loud people! All 3D companies make tools and have developer relations to help make it easier for developers to write applications for their hardware. Of course these companies will promote their tools/technology to advertise to developers. They want developers to make games that play well on their hardware so that they can sell more cards. Simple as that.

CG is not a tool, it's a language. CG has a proprietary backend, controlled by Nvidia, which optimizes for NV cards; Nvidia controls what changes are made, when and where. It is not completely open source.
On the other hand, DX9 HLSL does not cater to anybody and is controlled by the company that wrote the OS, as is the ARB's language. It is a conflict of interest to think ATI or other IHVs would use a competitor's software.

Rendermonkey is in no way the same as CG; Rendermonkey uses completely open-source plugins which work with DX9 and OGL HLSL. Rendermonkey is a tool.
 
We don't need to know anything about programming or coding
to understand what's going on with Cg.

Common sense tells us that Cg is not useless!!! At least for NVIDIA!
Nvidia is looking out for what's best for them, like ATI and any other
company you can name.

So I can easily imagine that this Cg thing MUST be a step forward
for Nvidia: a way to increase the efficiency and/or speed of the process
by which game developers and 3D app companies implement and use Nvidia's
new technology, like the Nv30/Nv3x, as soon as possible.

The fact is that Nvidia wants to go faster, not slower.
So this Cg thing should help them go faster,
and I have no problem with PROGRESS.

If you observe the way the game industry works,
you will easily notice that there is a SLOW process between the time
the features of video card X or Y become available and the time
we start seeing those new features in games.

I can imagine it must be PAINFUL for NVidia, every year, showing
game developers how this and that works: hundreds of hours, if not more, teaching game developers how to use incoming GeForce5/6/7 features
in their games.

I think this Cg language should make it easier for Nvidia
and game developers to implement the new tech in games.

Nvidia is aware of the LIMITATIONS it has to challenge
every time they release a new card, and wants to see the Nv3x's and Nv40's features (already in development) used when those cards are released.
And I think it would be a little difficult for them to throw in the garbage
MILLIONS of engineering hours spent on new features that
will NEVER be used, just because Microsoft's Direct3D or the ARB's OpenGL
are not compatible with their new tech at that time.

Just look at ATI: they are releasing a DirectX9-compatible card
without DirectX9. Why? Because Microsoft says so. :)

So it can happen to Nvidia too.

OpenGL improvements have been SLOW as HELL;
it has taken years to go from version 1.2 to 2.0,
while MS Direct3D has been continuously improving. However,
you know shit happens, and you don't know whether in the future the MS
and NVidia relationship will break, making Nvidia's life
a little harder.

So Nvidia is aware that they need to do something,
or their progress will continue to be restricted by
Microsoft/ARB decisions about the future.

So I think Cg is not useless at all;
it is important. I doubt Nvidia don't know what they are doing;
just look at the rate they have expanded and the NEW technology
they have shown in games in less than 3 years!!!

Cg is a solution for Nvidia's future plans, nothing less,
nothing more. And I think making Cg open source is
some guarantee that it can be improved and used by anyone else
who likes it, without limitations.

Whether this is important to others is another question.
 
Doomtrooper said:
CG is not a tool, its a language.

Languages don't have back ends. Compilers do. Compilers are tools.

Doomtrooper said:
CG has a proprietary backend, controlled by Nvidia, which optimizes for NV cards; Nvidia controls what changes are made, when and where. It is not completely open source.
It doesn't have to be. The backend is a plugin. Each IHV writes their own optimized plugin. NVIDIA even offers a skeleton. This is a complete non-issue.

Doomtrooper said:
Rendermonkey is in no way the same as CG; Rendermonkey uses completely open-source plugins which work with DX9 and OGL HLSL. Rendermonkey is a tool.
Except for that whole middle part where it does the compilation, simplification, etc. That part doesn't seem to be open sourced. Nor does the XML language for feeding the compiler. Of course, I'm only theorizing here, because the Siggraph presentation is mighty sketchy as to what exactly Rendermonkey is, but I'm guessing it's not nearly as different from Cg as you're trying to make it out to be.

Cg defines a language, and provides a compiler to transform Cg to some internal format, which is then passed to a backend (aka exporter plugin).

Rendermonkey defines an XML language, and provides "importer plugins" to convert different shading languages to its XML language, which is then passed into the compiler (which may or may not be open source) to transform it into some form to be passed to an exporter plugin. (Again, their slides are mighty sketchy.)

Do you see? The only difference is the company that's making the tool, and the fact that Rendermonkey seems to have this extra transformation thing on the front end. Yet you harp on Cg like it's the spawn of Satan.

Personally, I like the idea of using whatever language you want; but at the same time, I like the idea of using the same language to target multiple platforms. Both of these ideas hold a tremendous amount of merit, but you insist on tearing Cg down, mostly over items that you don't seem to understand but have latched onto, even though they're patently wrong.
 
You can twist your thinking around all you like Russ, CG is a High Level Shading Language...even NVIDIA states it :rolleyes:

What is Cg?
C for Graphics. Cg is the high level language for programming GPUs, developed by NVIDIA in close collaboration with Microsoft.


Who maintains the Cg Language Specification?
NVIDIA maintains the Cg Language Specification and will continue to work with Microsoft to maintain compatibility with the DirectX High Level Shading Language (HLSL).

Is Cg open or proprietary?
a.The Cg Language Specification is published and is open in the sense that other vendors may implement products based on it. To encourage this, an open-source implementation of a Cg compiler front-end is expected to be available by Siggraph ’02.
b.Vendor implementations of Cg compilers are typically proprietary and owned by their creators. NVIDIA has developed the NVIDIA Cg Compiler, and we expect other vendors to develop their own Cg compiler products.

http://developer.nvidia.com/view.asp?IO=cg_faq
 
Doomtrooper said:
You can twist your thinking around all you like Russ, CG is a High Level Shading Language...even NVIDIA states it :rolleyes:
Can anyone tell me what the L in the term "DX9 HLSL" stands for?
 
Because I'm feeling punchy:

NO SH!T SHERLOCK!

Of course it's a language. But you're harping on the LANGUAGE'S proprietary back end. Languages don't have back ends; compilers do.

Why not harp on ATI's proprietary compiler? While the plugins seem open sourced, nothing is mentioned about the compiler's status (at least in their slides). Why not harp on ATI's proprietary XML language? (It's certainly got to be controlled by somebody, else the tool will immediately fracture as people take it and pull it in as many different directions as there are people.)

For every single thing you harp on about Cg, you'll find, if you open your eyes, that it's directly applicable to Rendermonkey as well.
 
Rendermonkey is not a language; it plugs into DX9 or OGL or rendering software... CG is a 'C' language that is a subset/superset of DX9 HLSL... or in other words, another frickin' High Level Shader Language.

I dare you to find comparisons from Rendermonkey to Cg.. ;)
 
RussSchultz said:
Of course its a language. But you're harping on the LANGUAGE'S proprietary back end. Languages don't have back ends, compilers do.

Here you seem to contradict yourself: first you say it IS a language, then you say it is a compiler. So to make this logic work, one could say that Cg is a proprietary compiler that translates a High Level Shading Language and manages optimization for nvidia hardware through the proprietary back end that nvidia owns and which is not open... Does that make sense?

RussSchultz said:
Why not harp on ATIs proprietary compiler? While the plugins seem open sourced, nothing is mentioned about the compiler's status (at least in their slides). Why not harp on ATIs proprietary XML language? (its certainly got to be controlled by somebody, else the tool will immediately fracture as people take it and pull it in as many different ways as there are people).

For every single thing you harp on Cg, you'll find if you open your eyes, its directly applicable to Rendermonkey as well.

Are you saying here that ATI's Rendermonkey is not open and is a proprietary HLSL from ATI? That would be the first time I have ever heard this suggested.
 
Ummm, correct me if I am wrong here, but it seems that nvidia is attempting to have developers believe that its solution would be better than other HLSLs?

Yes, they are trying to make it more convenient for developers to make optimized shaders for Nvidia gfx cards on DX and GL.


Then after developers begin to use it, force other IHVs to use their compiler? .. or WTF... Cg looks like a PR move in light of DX 9.

Force? No. Nvidia is not in a position to do so. Developers are. If they like it they may demand such tools from other IHVs. Other IHVs can and will make their own tools to facilitate optimized shaders on their cards for many platforms.

Hrm, of course it is the developers who will force other IHVs to use Cg; nothing like stating the obvious. But suppose that nvidia has implemented Cg with the intention of FORCING other IHVs to use their proprietary HLSL compiler VIA developers... Developers wouldn't be forcing anyone to use the Cg compiler if it hadn't come from nvidia in the first place. All the while nvidia claims that it is "open source".


Does anyone have an advantage by using Cg outside of nvidia?

Developers may find it convenient to use. Other IHVs may use it as a beginning/example for their own similar tools.

Again, nvidia will force other IHVs to use their proprietary HLSL compiler via developers. But it is the developers who may find that it isn't so "convenient" to use if nvidia is their only target market. Unless of course nvidia is totally successful in convincing developers that its proprietary HLSL compiler is better than an OPEN SOURCE HLSL. Argh, I don't see why a developer would be in a rush to take nvidia's proprietary compiler when they know that MS DX 9 will have its own non-proprietary HLSL that will automatically be compatible with all IHVs' hardware and not have a proprietary back end like nvidia's.

For crying out loud people! All 3D companies make tools and have developer relations to help make it easier for developers to write applications for their hardware. Of course these companies will promote their tools/technology to advertise to developers. They want developers to make games that play well on their hardware so that they can sell more cards. Simple as that.

Yeah, but nvidia is claiming an open source HLSL with Cg when it is clearly not. Further, it is attempting to hijack developers into using their proprietary HLSL compiler in an attempt to force other IHVs to use nvidia's software. Correct me if I am wrong here, but this isn't the way things normally go.
 
Doomtrooper said:
I dare you to find comparisons from Rendermonkey to Cg.. ;)
Erm, didn't he do that in his last two posts? The thing is, you just ignore the things he presented, the same thing you have done in the past. Let me draw some comparisons that are hopefully "in your face" enough, and then mention some of my concerns about things that have not been sufficiently clarified so far in regard to Rendermonkey, IMHO.

Disclaimer: Of course both products have more and different capabilities than this, but to claim they are not in fact comparable is plain silly:

- both RM and Cg have a language of their own
- both can compile shaders based on their respective languages directly into DX or GL
- both are apparently not fully open-source
- both are made by one single IHV
- both are copyrighted
- both can, but do not necessarily have to, contain optimizations for a specific IHV's hardware
- both are released as free shader development tools for developers
- both will be able to plug into important content creation applications in the future to ease workflow for artists


Now concerns and problems:

- Is Rendermonkey™ fully open-source? Its available documents speak of an "open, extensible" tool built on a plug-in architecture, which just means that it is, well, open and extensible; nowhere is the word open-source mentioned.

- ATI's docs mention that source code for the importer, exporter, editor and viewer plugins will be made available. What about the compiler; is that part of the exporter plugins, or something separate?

- There is not yet sufficient information to effectively answer these and other remaining concerns about Rendermonkey. There are still loads of possible problems with it, we just can't tell yet.

- What we can tell, though, is that Rendermonkey does have its own XML-style language. While not an HLSL by name, it is still a new language for defining shaders, and a new one for developers to learn. You went to great pains over the past weeks to make clear that any new or additional language is a bad thing, right? So suddenly claiming that only HLSLs would cause more work for developers is ridiculous. If anything, learning Rendermonkey's XML-style language is a lot "worse". Cg is at least basically identical to DX9 HLSL, so no learning done on Cg is wasted, and code should be interchangeable.

- It remains to be seen just how effective a shader can be if it is going through several different layers of shading languages. E.g. Import a renderman shader, optimize it a bit in Rendermonkey XML, export it to DX9 HLSL and then compile to DX9 low-level assembly. I can see some headroom for wasted performance compared to coding directly in a HLSL.

- Neither the XML language nor Rendermonkey's compiler seems to be open-sourced, based on current information. Rendermonkey's compiler is mentioned separately from the plugins in the rest of the presentation, so there is little reason to believe it is open-source until we hear otherwise.

All of this, yet we don't hear even a word of criticism, or at least scepticism, from the same people who constantly hammer on Cg over possible concerns based mostly on speculation. That is not applying the same fair standards to different companies; that is just obviously biased arguing.

Geek_2002 said:
Here you seem to contradict yourself. First you say it IS a language then you say it is a compiler.. So to make this logic work one could say that Cg is a proprietary compiler that translates a High Level Shading Language that manages optimization for nvidia hardware as a result of the proprietary back end that nvidia owns and is not open... Does that make sense?
It's hard to make sense of what you tried to say there; that logic argument of yours is not at all what Russ said, though. I suggest you read Russ' statement again carefully. Try to understand what he's saying, don't instantly bash it. If you lack information on how Cg or other languages interact with a compiler, or on what a back-end is, then please read up on it before throwing the terms into a mixer like that.
 
I find it strange that people have difficulty separating a language from a compiler. Cg the language is a high-level way of designing a shader. The compiler parses, tokenises, and optimises the Cg language into shader assembly according to a set of defined rules contained in the profile for that shader.

For example, which is easier to understand:

#include <stdio.h>

int main(void)
{
    printf("Hello World");
    return 0;
}

or

push cs
pop ds
mov dx,offset MSG
mov ah,9
int 21h

MSG db 'Hello World',0

?
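The same contrast holds in the shader domain. As a rough sketch from memory (the high-level form is the style Cg and DX9 HLSL share; the low-level form is DirectX 8 vs.1.1 vertex shader assembly; treat the details as illustrative rather than exact):

```
// High-level (Cg / DX9 HLSL style): transform a vertex by a matrix
float4 main(float4 pos : POSITION,
            uniform float4x4 modelViewProj) : POSITION
{
    return mul(modelViewProj, pos);
}
```

versus what a compiler profile might emit:

```
; Low-level vs.1.1 assembly: four dot products against the matrix rows
vs.1.1
dp4 oPos.x, v0, c0
dp4 oPos.y, v0, c1
dp4 oPos.z, v0, c2
dp4 oPos.w, v0, c3
```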
 
I'm not going to argue, believe what you will. More quotes from the FAQ

http://developer.nvidia.com/view.asp?IO=cg_faq

Quotes from NVIDIA's site:

What’s the difference between the Cg Language and a Cg Compiler?

The Cg language has a syntax and grammar suitable for real-time programmable GPUs.
A Cg Compiler is an application that accepts Cg Language input, and produces output in one of several standard assembly language formats that are accepted by modern programmable GPUs.

Is Cg open or proprietary?

The Cg Language Specification is published and is open in the sense that other vendors may implement products based on it. To encourage this, an open-source implementation of a Cg compiler front-end is expected to be available by Siggraph ’02.
Vendor implementations of Cg compilers are typically proprietary and owned by their creators. NVIDIA has developed the NVIDIA Cg Compiler, and we expect other vendors to develop their own Cg compiler products.


What is NVIDIA providing to developers?
NVIDIA is initially providing developers with the NVIDIA Cg Toolkit 1.0 (Public Beta) comprised of:

NVIDIA Cg Compiler Public Beta 1.0 (supporting DirectX 8: Vertex and Pixel Shaders, OpenGL NV_vertex_program)
Cg Language Specification 1.0
NVIDIA Cg Standard Library 1.0
NVIDIA Cg Runtime Libraries (supporting DirectX 8 and OpenGL 1.0)
Cg User’s Manual 1.0
NVIDIA Cg Browser 4.0 (with many example shaders and demos)

If developers like it, they will use it. If they don't, they won't.
 
- What we can tell, though, is that Rendermonkey does have its own XML-style language. While not an HLSL by name, it is still a new language for defining shaders, and a new one for developers to learn. You went to great pains over the past weeks to make clear that any new or additional language is a bad thing, right? So suddenly claiming that only HLSLs would cause more work for developers is ridiculous. If anything, learning Rendermonkey's XML-style language is a lot "worse". Cg is at least basically identical to DX9 HLSL, so no learning done on Cg is wasted, and code should be interchangeable.

This is an amazing stretch. The XML file for Rendermonkey is a set of rules determining how it handles converting a particular HLSL to a particular set of shader code.

First, why would a developer ever have to learn it?
Second, you cannot prevent an XML spec for syntax rules and code expression from being suited to any language or code spec at all unless you cripple it. Regardless of what capabilities, precedence, or data types are expressed in the target code, it will still abide by the same syntax rules and compilation principles as other target code. The XML specification for compilation rules will never have anything to do with how a developer targets their shader code (since it doesn't attempt at all to specify a shader HLSL), only with how effective RenderMonkey is at converting their shader code to the target specification. Since Rendermonkey is not an HLSL, there is no inertia to prevent the developer from switching to some other compilation tool for their completely separately specified HLSL if Rendermonkey doesn't do the job for them.

And to reiterate MY problem with Cg, not to be confused with anyone else's:

Who maintains the Cg Language Specification?
NVIDIA maintains the Cg Language Specification and will continue to work with Microsoft to maintain compatibility with the DirectX High Level Shading Language (HLSL).

I believe I termed this vague marketing-speak, and in my evaluation it is a pretty weak assurance. But we've covered that, and I don't seek to convert anyone, or really to discuss it outside the context of that long post I kept referencing in that other thread, as it would necessitate quite a bit of repetition.
 