Cg 1.5 Beta 1 is available

pharma

NVIDIA has released Cg 1.5 Beta 1. Cg is NVIDIA's high-level programming language for graphics processors. According to NvNews:

NVIDIA Cg Compiler Release 1.5 Beta 1 consists of the following components:
  • NVIDIA Cg Compiler Release 1.5 Beta 1
  • Unified Cg/CgFX Runtime
  • Cg User's Manual
  • Cg Language Specification
The 1.5 Beta 1 release of Cg introduces several new features:
  • OpenGL GLSL profiles (see the sketch after this list)
  • Direct3D9 SM3.0 profiles
  • Procedural API for Effect creation (COLLADA support)
  • Program combining for inter-program optimization
  • Rewrite of major portions of the Cg Runtime for enhanced performance
  • Cg 1.5 Beta 1 now ships with native implementations for Win32, Win64, Linux (32-bit and 64-bit), MacOS 10.3 (Panther) and 10.4 (Tiger), and Solaris
  • Cg 1.3 and 1.4 style Effects files supported
  • Cg 1.5 Beta 1 should be backward compatible with apps written against Cg 1.4.1.
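
For a quick feel for the new GLSL profiles, here is a rough sketch of compiling a Cg shader through the glslf profile and printing the GLSL the compiler generates. Function and profile names are as listed in the Cg headers; the shader file is a placeholder and this hasn't been run against the beta, so treat it as illustrative only:

    #include <stdio.h>
    #include <Cg/cg.h>

    int main(void)
    {
        CGcontext ctx = cgCreateContext();

        /* Compile Cg source through the new GLSL fragment profile. */
        CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                                 CG_PROFILE_GLSLF, "main", NULL);
        CGerror err = cgGetError();
        if (err != CG_NO_ERROR) {
            fprintf(stderr, "%s\n", cgGetErrorString(err));
            cgDestroyContext(ctx);
            return 1;
        }

        /* Print the GLSL source the compiler produced for this program. */
        printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));

        cgDestroyContext(ctx);
        return 0;
    }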

Pharma
 
What do "profiles" do for you? I can't believe that means that SM3 wasn't supported before. . .or does it?
 
Hi geo,

Profiles define specific execution environments for output code generation. Previous versions of Cg had SM3.0 class support under OpenGL, but only via vendor-specific assembly extensions. The portable SM3.0 support via Cg -> GLSL in OpenGL and Cg -> SM3.0 ASM in Direct3D9 is indeed new in Cg 1.5.
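
In runtime terms it looks roughly like this (just a sketch; the shader file and entry point are placeholders, and the fallback is whatever assembly profile the driver reports as best):

    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    /* Prefer the new portable GLSL fragment profile when the driver exposes
       it, otherwise fall back to the best assembly profile the GL runtime
       reports. Assumes a current OpenGL context. */
    CGprogram load_fragment_program(CGcontext ctx)
    {
        CGprofile fp = cgGLIsProfileSupported(CG_PROFILE_GLSLF)
                         ? CG_PROFILE_GLSLF                       /* Cg -> GLSL path */
                         : cgGLGetLatestProfile(CG_GL_FRAGMENT);  /* vendor asm fallback */
        cgGLSetOptimalOptions(fp);

        CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                                 fp, "main", NULL);
        cgGLLoadProgram(prog);
        return prog;
    }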

Thanks -
Cass
 
Thanks for the clarification, cass. You registered for that? Part of the Cg team, maybe? :smile:
 
That looks interesting, but what I really need is the full source code for an HLSL frontend I could integrate with my OpenGL framework to translate glSlang (or similar) into ARB or whatever I use as the ATTILA shader ISA. Does anyone know where I could find one of those? :LOL:

Just joking.

It seems like in a few months I will be changing my sig to 'the complete GPU simulator', not because it will be complete (these kinds of things are never complete) but because I will be finished with it, as my (relatively long) PhD life nears its end and I won't be able to continue working on it (other than looking at how other people work on it and complaining about that ;)). In fact I have done nearly nothing since last January. Of course I will be releasing the source code (or at least the part of the source code I can claim as mine) and, I hope, some documentation (it looks like I will never find the time to write it ...) about the low-level details of my messy implementation of a 'modern' (well, in the end it only has 2002-era features other than the unified shader bit, but blame OpenGL game developers for that, not me) GPU pipeline.
 
RoOoBo said:
It seems like in a few months I will be changing my sig to 'the complete GPU simulator', not because it will be complete (these kinds of things are never complete) but because I will be finished with it, as my (relatively long) PhD life nears its end and I won't be able to continue working on it (other than looking at how other people work on it and complaining about that ;)). In fact I have done nearly nothing since last January. Of course I will be releasing the source code (or at least the part of the source code I can claim as mine) and, I hope, some documentation (it looks like I will never find the time to write it ...) about the low-level details of my messy implementation of a 'modern' (well, in the end it only has 2002-era features other than the unified shader bit, but blame OpenGL game developers for that, not me) GPU pipeline.

Oh man, I just love such things. Thanks for sharing your work with us!

On a side note:
RoOoBo gpu3d log said:
The last part is a small experiment 'inspired' in the RV530 architecture that is used just as an excuse to present some of the stadistics and graphics we can obtain with our simulator.

so which is it: statistics or sadistics? : )
 
geo said:
Thanks for the clarification, cass. You registered for that? Part of the Cg team, maybe? :smile:

No problem, geo. Yes, I manage the Cg team at NVIDIA.

You're not a stranger to me, but this is the first time I've had occasion to do more than lurk. ;)

Thanks -
Cass
 
cass said:
No problem, geo. Yes, I manage the Cg team at NVIDIA.

You're not a stranger to me, but this is the first time I've had occasion to do more than lurk. ;)

Thanks -
Cass

Hey, as long as it's not from my picture on the dart board at HQ, then I'm happy to have a new acquaintance. :LOL:

So a question you may or may not want to answer, but. . .this new version of Cg, do we have the Sony partnership to thank to some degree for getting it some extra lovin'? If so, and this is now coming over to the PC side, then count me in favor of synergy and leveraging. . . :smile:
 
geo said:
. . .this new version of Cg, do we have the Sony partnership to thank to some degree for getting it some extra lovin'?

Yes, Sony's involvement with Cg has been very positive - for some time now. Much of the Cg 1.5 feature set is due to their vision for shader/content management.
 
cass, I installed Cg and it broke the apps I program; the reason being it puts the Cg include directories at the top of the list of places the compiler looks for include files. The specific problem was with GL/glext.h.
True, it's a minor thing that is easily rectified, but since I rarely do full builds the problem only cropped up quite a while after I installed Cg, which left me scratching my head wondering how that happened.
zed
 
zed said:
cass, I installed Cg and it broke the apps I program; the reason being it puts the Cg include directories at the top of the list of places the compiler looks for include files. The specific problem was with GL/glext.h.
True, it's a minor thing that is easily rectified, but since I rarely do full builds the problem only cropped up quite a while after I installed Cg, which left me scratching my head wondering how that happened.
zed

Hi zed,
We've reworked the installer, and probably got too aggressive with updating environment variables. We'll get this fixed before the next public release.
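
In the meantime, if anyone else gets bitten, a quick way to confirm which GL/glext.h the compiler is actually picking up is to print its version macro; just a sanity-check sketch:

    #include <stdio.h>
    #ifdef _WIN32
    #include <windows.h>   /* gl.h needs APIENTRY and friends on Windows */
    #endif
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* glext.h defines GL_GLEXT_VERSION; if the value (or the header path in
       your compiler's include trace, e.g. cl /showIncludes or gcc -H) isn't
       what you expect, another glext.h is shadowing your SDK's copy. */
    int main(void)
    {
    #ifdef GL_GLEXT_VERSION
        printf("GL_GLEXT_VERSION = %d\n", GL_GLEXT_VERSION);
    #else
        printf("GL_GLEXT_VERSION is not defined by this glext.h\n");
    #endif
        return 0;
    }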

Thanks for the feedback!
Cass
 
cass, another, more serious problem (not your department, though):
I sometimes get a popup message with my GFFX 5900 saying that, due to limited power, performance on the GFFX has been degraded to prevent damage to the hardware. Which is good.

But I've had it where, due to the app I'm running, this popup gets created over and over again (thus 30+ instances of the warning popup running). The last thing you want when you're trying to shut down an app is for another app (the warning) to steal focus.
Perhaps pass along to whoever's responsible that whenever they want to display this warning, they should first check whether another instance is already running.
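
Something as simple as a named mutex would do it; a rough sketch (the mutex name here is just made up):

    #include <windows.h>

    /* Create a named mutex before showing the warning. If it already exists,
       another copy of the popup is already up, so skip creating a new one.
       The name is arbitrary; it just has to be the same for every instance. */
    static BOOL warning_already_showing(void)
    {
        HANDLE m = CreateMutexA(NULL, FALSE, "nv_power_warning_popup");
        if (m != NULL && GetLastError() == ERROR_ALREADY_EXISTS) {
            CloseHandle(m);  /* the first instance keeps its own handle */
            return TRUE;
        }
        /* keep the handle open for this instance's lifetime */
        return FALSE;
    }
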
ta zed
 