NVIDIA Stepping Back from Cg?

A response from NVidia

http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/010167.html

Cg is not being deprecated in favor of GLslang.
NVIDIA does recommend that developers working in a pure DirectX development environment use Microsoft's HLSL.

NVIDIA is 100% committed to OpenGL, and will fully support the OpenGL Shading Language. It is worth noting that NVIDIA is one of only two vendors to sign the OpenGL Shading Language contributor agreement. In addition, NVIDIA has contributed more than 25% of the code in the reference compiler and pre-parser.

We are also pleased to announce that we will be offering a new profile for Cg, so that developers can continue to use the Cg Shading Environment on any hardware that supports the OpenGL Shading Language. This will also allow users to take advantage of high-level Cg features such as CgFX and subshader interfaces.

NVIDIA is dedicated to providing the best graphics development experience, regardless of which shading language, API or operating system you choose.

In the future, if anyone has any questions regarding NVIDIA's products or strategy, please feel free to ask us directly.

Thanks,

Simon Green
NVIDIA Developer Relations
 
Re: A response from NVidia

Popnfresh said:
http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/010167.html

Cg is not being deprecated in favor of GLslang.
NVIDIA does recommend that developers working in a pure DirectX development environment use Microsoft's HLSL.

...

Simon Green
NVIDIA Developer Relations

I find this a fascinating statement. It's not surprising that nVidia is "100% committed" to OpenGL, well, at least when you consider the context of where this message was posted....;) I also have to wonder who the "other vendor" is who signed on to the OSL contributor agreement and what "percentage" of the code that vendor has contributed.

Seems pretty obvious though that the only thing nVidia is going to do with Cg is create a profile for using it with OSL. It's hard to tell from this whether they're building a Cg front end that emits OSL, or plugging OSL directly into the Cg toolchain. Not sure I get the point--why use Cg to access OSL when you could just use OSL itself...?
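For what it's worth, the presumable point of a Cg front end would be writing one shader and compiling it to both DirectX profiles and OSL. The two languages are close enough that a trivial shader looks almost the same in each; the snippets below are illustrative sketches only, not NVIDIA code:

```
// Cg fragment program (illustrative): sample a texture
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D tex) : COLOR
{
    return tex2D(tex, uv);
}
```

And roughly the same thing in the OpenGL Shading Language:

```
// GLSL fragment shader (illustrative)
uniform sampler2D tex;

void main()
{
    gl_FragColor = texture2D(tex, gl_TexCoord[0].xy);
}
```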

Anyway, nice to see them making statements moving away from attempts to set industry standards towards working co-operatively with other companies to set standards everyone can agree on.
 
So, you guys are saying that without OSL, Cg is natively unable to support CgFX and appropriate subshaders...? Or is this merely a comment on the value of Cg used specifically to support nVidia hardware?

I don't understand the idea of "bridging the gap between OGL and DX". In the first line of the statement Green says nVidia officially recommends M$ HLSL for DX shader development, and treats OSL as a separate issue. Not disputing your interpretation--just saying it doesn't sound like the optimal approach from the standpoint of generating efficient code for the two APIs.
 
Well, from my read, he recommends DX-HLSL when "working in a pure DirectX development environment". In other words, when you don't need to support both DX and OpenGL.
 
My take on things

My reading of this is that the Cg language is being de-emphasized.

They recommend using HLSL when using DirectX, leaving Cg for OpenGL or mixed environments. There's some hostility against Cg in the OpenGL community, however, due to the way it was introduced: NVidia made statements about OpenGL 2.0 not being needed and basically blind-sided the ARB. That makes many OpenGL developers unlikely to adopt Cg, and truly mixed environments are fairly rare.

CgFX and subshader interfaces are mentioned by Mr. Green, which suggests to me that NVidia will be shifting emphasis away from the Cg language and towards the Cg toolchain. The CgFX file format, for example, allows encapsulating multiple shaders for different types of hardware and rendering-quality levels in a single file.
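As a rough sketch of what that encapsulation looks like (the variable and technique names below are illustrative, not taken from any shipping effect), a CgFX file can carry several techniques for one effect and let the application pick whichever its hardware supports:

```
// Illustrative CgFX-style effect file
float4 tint = { 1.0, 0.5, 0.2, 1.0 };

float4 psMain() : COLOR
{
    return tint;
}

// Preferred technique, compiled to the ARB fragment program profile
technique HighQuality
{
    pass p0
    {
        FragmentProgram = compile arbfp1 psMain();
    }
}

// Fallback technique, compiled to an older register-combiner-era profile
technique Fallback
{
    pass p0
    {
        FragmentProgram = compile fp20 psMain();
    }
}
```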
 
Re: My take on things

Popnfresh said:
They recommend using HLSL when using DirectX, leaving Cg for OpenGL or mixed environments. There's some hostility against Cg in the OpenGL community, however, due to the way it was introduced: NVidia made statements about OpenGL 2.0 not being needed and basically blind-sided the ARB. That makes many OpenGL developers unlikely to adopt Cg, and truly mixed environments are fairly rare.

And that was likely to be their single largest target market. MS are hardly likely to be evangelising its use to developers, and have probably been pressuring NVIDIA as well.
 
RussSchultz said:
Well, from my read, he recommends DX-HLSL when "working in a pure DirectX development environment". In otherwords, when you don't need to support both DX and OpenGL.

Which accounts for what, 99+% of the PC games out there? :LOL:

EDIT: Just wanted to add that in the DCC world that number is certainly not true, although there seems to be a slow shift towards DirectX in the game-related tools. Not sure if they are DX-only though (thinking of the new Maya and 3DS Max).
 
3D Studio Max has a plugin architecture for rendering its 3D display. It has shipped with software, D3D, and OpenGL renderers for some time now.

Maya is purely OpenGL. There aren't even abstracted drawing functions for plugins. Plugins that need to draw use OpenGL directly.
 
I thought the new Maya had Direct 3D support as well. Lemme check...

EDIT: Checked and no mention of DX anywhere. Wonder what program I was thinking of. SoftImage maybe?
 
I didn't see anyone mention this here...

But why should nVidia care about Cg anymore, with the NV40 coming soon?
The NV40 is a lot nearer to all accepted standards than the NV30. Sure, that makes the NV30 look worse, because the NV30 would benefit more if developers used Cg. But as others noted, they're doing as much as they can to make it work better with HLSL and work well with OSL.
So considering the NV30 might be legacy in less than a year for nVidia (and yes, they're moving quickly - you hopefully all know why by now though, hehe), it isn't really that important anymore. Any game that now uses Cg most likely isn't gonna switch back, and any game not yet using Cg and thinking of switching to it most likely isn't far enough along in development to ship before the NV30 is "practically legacy".


Uttar
 
Uttar said:
But why should nVidia care about Cg anymore, with the NV40 coming soon?
nVidia may not care about Cg anymore, but everyone who spent 3, 4, or 5 hundred dollars on an FX graphics card should care. We know the nv3x cards need some tweaking to get competitive performance; Cg helped make that tweaking a little bit easier.

I'm glad to see Cg and nv3x cards falling by the wayside, mainly because of the FX's slow pixel shaders. IMO, large-scale adoption of the FX as the standard would've stunted the growth of gaming graphics for the next couple of years. Picking between slow performance and reduced image quality is not a happy choice.

Man, I'm glad ati finally got their act together! That 9700pro arrived just in the nick of time.
 
Uttar said:
I didn't see anyone mention this here...

But why should nVidia care about Cg anymore, with the NV40 coming soon? The NV40 is a lot nearer to all accepted standards than the NV30.
I hope their IKOS machine is working this time ;).
 
digitalwanderer said:
Kalbaz said:
what do they mean by 'legacy language' though? Is Cg that much less comprehensive and capable than HLSL?
I think that "legacy language" just refers to the fact that it never caught on and that no one is using it. :)

Damn, this IS good news! :D

Why? ATI seemed to like Cg, since the Dawn demo ran so fine on it. :LOL:
 