Cg released

Crusher said:
Personally, if I were a game developer, I'd be wondering why ATI hadn't provided me with a similar program (or at least an announcement that they would be providing their own optimized compiler/profile/whatever to work with the Cg front end).

Well that would be because ATI, Matrox, and 3Dlabs are working on OGL 2.0 and DX9 HLSL, wouldn't it? :rolleyes:
 
Well that would be because ATI, Matrox, and 3Dlabs are working on OGL 2.0 and DX9 HLSL, wouldn't it?

As is NVIDIA. There is a need *right now* for a high-level shading language (and a direct path for content creation in 3D modeling packages).

You're putting a lot of stock on OpenGL 2.0 being finished (and ready for use) sometime soon, when I think that isn't going to be the case at all. The "standard" isn't finalized (even the shading language is just a proposal at this point), and once it is, it will take at least a year before OpenGL 2.0 ICDs are stable and ready for everyday use, let alone development.

And I think OpenGL 2.0's shading language is deficient in the way that it handles texture fetching, as it will require far more resources to be spent on compiler development than a model such as Cg's (where the texture fetch instructions are bundled with the output profile), resulting in either worse performance or higher cost, without any tangible benefits.
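
To make that concrete, here is roughly what I mean, sketched in C against the Cg runtime as I remember it from the toolkit; the exact function names (cgCreateContext, cgCreateProgram, cgGetProgramString) and the profile enum are my assumptions, so treat this as illustrative rather than gospel.

/* Sketch: compile a tiny Cg fragment program against a specific output
 * profile and dump the generated assembly.  The tex2D() call in the source
 * becomes whatever texture-fetch instruction that profile defines, which is
 * why the compiler back-end stays comparatively simple.
 * API names and the CG_PROFILE_PS_1_1 enum are from memory -- assumptions. */
#include <stdio.h>
#include <Cg/cg.h>

static const char *source =
    "float4 main(float2 uv : TEXCOORD0,\n"
    "            uniform sampler2D decal) : COLOR\n"
    "{\n"
    "    return tex2D(decal, uv);\n"
    "}\n";

int main(void)
{
    CGcontext ctx = cgCreateContext();
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, source,
                                     CG_PROFILE_PS_1_1, "main", NULL);

    if (prog)
        printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    else
        printf("compile failed: %s\n", cgGetErrorString(cgGetError()));

    cgDestroyContext(ctx);
    return 0;
}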
 
You're putting a lot of stock on OpenGL 2.0 being finished (and ready for use) sometime soon, when I think that isn't going to be the case at all

I'm putting a lot of stock in DX9 HLSL also; with DX9 coming in approximately the fall, that isn't very far away...
For another graphics vendor to start working on a profile for their GPU right now (where are the profile docs??), by the time it was ready, DX9 HLSL would be here.
 
And Cg will have a DX9 profile, so all DX9 compliant cards will be able to take advantage of it as soon as DX9 is ready.
 
gking said:
And Cg will have a DX9 profile, so all DX9 compliant cards will be able to take advantage of it as soon as DX9 is ready.

Why have a profile for another HLSL if Nvidia is working on DX9 HLSL??
 
Where does anything say that? That sounds completely at odds with what David Kirk told me when I asked if other companies were free to make their own compilers.

Are you high or did you just make that up? Read "any" interview and it clearly states that ATI or anyone else can make their own compilers.
Don't be such a tool because you don't like Nvidia. I mean, 3DFX is dead, and it was dead before Nvidia bought them; don't hate them just because they bought 3DFX. It's like you have such a grudge against anything that's positive about Nvidia. I have been reading this forum for quite some time, and it's like you are never bullish when it comes to anything positive about Nvidia.

I've asked you to contribute something useful before - this isn't it.

I think telling you to get a girlfriend or a wife is a contribution to your social life and social skills.

As for 3D-related items, I told you that Carmack was not using an NV30 chip and I was right. I told you that the chips might be taped out, but the silicon wasn't in developers' hands, and I was also right.

I also told you that Carmack was using a GeForce 4 chipset and comparing that to the new ATI chipset, and I was right again. What do you not get about me not contributing?

I see you contribute theories on how Nvidia sucks at architecture and how you don't care for them, but that's all I see you post. Big whoop.
 
Docwiz said:
Where does anything say that? That sounds completely at odds with what David Kirk told me when I asked if other companies were free to make their own compilers.

Are you high or did you just make that up? Read "any" interview and it clearly states that ATI or anyone else can make their own compilers.
Don't be such a tool because you don't like Nvidia. I mean, 3DFX is dead, and it was dead before Nvidia bought them; don't hate them just because they bought 3DFX. It's like you have such a grudge against anything that's positive about Nvidia. I have been reading this forum for quite some time, and it's like you are never bullish when it comes to anything positive about Nvidia.

I've asked you to contribute something useful before - this isn't it.

I think telling you to get a girlfriend or a wife is a contribution to your social life and social skills.

As for 3D-related items, I told you that Carmack was not using an NV30 chip and I was right. I told you that the chips might be taped out, but the silicon wasn't in developers' hands, and I was also right.

I also told you that Carmack was using a GeForce 4 chipset and comparing that to the new ATI chipset, and I was right again. What do you not get about me not contributing?

I see you contribute theories on how Nvidia sucks at architecture and how you don't care for them, but that's all I see you post. Big whoop.

What do you post then, ace??
 
There's nothing stopping game developers from only optimizing for NVIDIA cards as it is, so if your paranoid fears of a single company dominating the market are correct, it probably began long before Cg was announced. I also don't think anyone honestly believes Cg is going to be an industry standard in its current form and without support from other hardware manufacturers, but that doesn't make it useless. Personally, if I were a game developer, I'd be wondering why ATI hadn't provided me with a similar program (or at least an announcement that they would be providing their own optimized compiler/profile/whatever to work with the Cg front end).

Take a course in critical thinking. Your entire argument is based on a fallacy of black-and-white thinking. nVidia already rules the 3D arena, why stop now? That's great there, buddy, but a monopoly would be far worse than a competitive/collaborative atmosphere in this marketplace.
 
gking said:
And Cg will have a DX9 profile, so all DX9 compliant cards will be able to take advantage of it as soon as DX9 is ready.

Assume for a moment that NVIDIA does not have displacement mapping in NV30; do you then think that the functionality to use the displacement mapping value in the vertex shader will go into Cg? There will be a DX9-As-Nvidia-Sees-Fit profile when DX9 is ready and NV30 is ready.

Personally, if I were a game developer, I'd be wondering why ATI hadn't provided me with a similar program (or at least an announcement that they would be providing their own optimized compiler/profile/whatever to work with the Cg front end).

Quite possibly because NVIDIA told no competitor about this? Quite possibly ATI and others only got access to the syntax and functionality on the same day as (or even later than) the developers? Or do you want yet another language? The more languages we get, the more it goes against what we are trying to reach: write the program once, use it on all hardware and all platforms, and reuse it in the future without too much work. If there are 10 languages, each with different advantages, then the whole idea is up in the air.

G~
 
Docwiz said:
Are you high or did you just make that up? Read "any" interview and it clearly states that ATI or anyone else can make their own compilers.
Don't be such a tool because you don't like Nvidia. I mean, 3DFX is dead, and it was dead before Nvidia bought them; don't hate them just because they bought 3DFX. It's like you have such a grudge against anything that's positive about Nvidia. I have been reading this forum for quite some time, and it's like you are never bullish when it comes to anything positive about Nvidia.

Grow up. This has nothing to do with 3dfx or a hatred of nVIDIA; this comes from the fact that I sat face to face in a nice little meeting with David Kirk on Friday, and when I asked whether they would allow others to make compilers for their own hardware, and whether that would be OK by nVIDIA, the answer I got was certainly not a straight 'Yes'. Now, admittedly, I was tired when I got there and I haven't listened to the tape again, but it seemed to me as though he was basically saying "Why fragment things again?".

Interestingly, in the only other interview I've read with Kirk, he skirts around the issue of other vendors creating their own compilers:

http://www.hothardware.com/hh_files/S&V/nvcg(2).shtml

You're releasing a tool suite and compiler for this new "open source" language. Do you think the rest of your competitors are likely to follow suit and release their own versions?

We are hopeful that the entire graphics community will embrace Cg. It’s clearly good for everyone. However, they don’t actually need to do any work to support Cg. Since the language is layered on top of the major graphics APIs, OpenGL and DirectX, Cg will run on all hardware that supports those APIs, assuming that they are conformant to the standards.

Of course, the other issue is that NVIDIA owns all the syntax, so other vendors cannot create syntax for features that are not currently exposed in Cg.

I think telling you to get a girlfriend or a wife is a contribution to your social life and social skills.

If you really think that, then you're obviously the one who's troubled.

I see you contribute theories on how Nvidia sucks at architecture and how you don't care for them, but that's all I see you post. Big whoop.

Well, read again, because you obviously read incorrectly.

Consider this a warning.
 
Democoder-
I see Cg as nothing more than a CASE tool.

On the surface, this is exactly how I see it. Nothing "earth-shattering" about it, and I think discussions of all this hoopla are way out of proportion. Speaks volumes for the effectiveness of PR and marketing. :)

On the surface, it seems innocuous enough. The real test will be if a hypothetical ATI or 3DLabs or *whoever* comes out with their own compiler that can be fed NVIDIA-designed Cg code and pop out a working set... and what proprietary, legal, or IP wars would surround such a translation project.

If this can be done fairly painlessly from a legal standpoint, the last vestige of debate will have to do with the level of the initial framework: obviously, if the framework level is dictated by NVIDIA, this means they will have to hold the pieces for the highest common denominator. In the case of GF3 vs. 8500, this would not be the case, as PS/VS code levels would be intentionally crippled to the maximum capabilities of an NVxx chipset and would likely not have the framework for higher levels of API compliance.

Hopefully NVIDIA will be the highest common denominator with DX9 going forward. The debate begins in the hypothetical situation where they are a notch or more below this level, as they couldn't possibly be accountable for a framework that would not support their own chipsets... it would be ludicrous for NV to make a framework that produces code their own VS/PS couldn't handle, and thus begins the controlling aspect. Cg quickly becomes useful from the NV level downwards, never exploiting anything new or innovative if such ever appears in other chipsets and the APIs in use.

Luckily, I don't see this being earth-shattering enough to have such an impact in the realm of games/3D entertainment. As said previously, it's a simple CASE tool, likely of only partial interest to game coders. Most pros are going to prefer back-of-the-envelope shaders over a tool and tinker at the lowest level. All the John Carmacks of the world aren't "wizard" users; they are slide-rule and Polish-notation calculator users. It's of great significance to plug-in designers, demos, 2D/3D application designers and the like, but I just don't see its game impact being more than a novelty for the "lower half" of the developers that pretty much pawn off pretty demos with horrible models, textures and gameplay now. They will finally be able to have nice water and shader effects in their buggy, 2-star games. :)

Just my $0.02,
-Shark
 
Ultimately, it will be up to game developers to try Cg and decide whether it's a good or a bad thing. Only "field testing" by the people it was intended for will determine Cg's usefulness (or lack thereof); no amount of armchair salivating about the "great potential" or moaning about "unfair advantage" will change that.
 
Geeforcer said:
Ultimately, it will be up to game developers to try Cg and decide whether it's a good or a bad thing. Only "field testing" by the people it was intended for will determine Cg's usefulness (or lack thereof); no amount of armchair salivating about the "great potential" or moaning about "unfair advantage" will change that.

I agree wholeheartedly. The developers will be the ones to decide whether this becomes the industry standard; no amount of hyping by NVIDIA or Microsoft will help if developers don't adopt it.
 
DaveBaumann said:
Interestingly, in the only other interview I've read with Kirk, he skirts around the issue of other vendors creating their own compilers:

http://www.hothardware.com/hh_files/S&V/nvcg(2).shtml

NVIDIA's Nick Triantos mentions that other vendors are at liberty to create their own Cg compilers.

http://www.cgshaders.org/forums/viewtopic.php?t=18

This approach could cause nightmares for developers wanting to make use of high-level shading languages. In fact, the industry might want to go ahead and adopt Microsoft's popular slogan:

"What Cg compiler do you want to use today?" :) "
 
Nick Triantos said:
Other vendors can certainly implement Cg compilers that include their vendor-specific profiles, if they find that the vendor-independent ones are in some way inadequate.

Then why didn't David just say that, if that's the case?

However, this still doesn't get around the fact that NVIDIA owns the IP and controls the Cg language, so if there's no support in the language for something, then other vendors still can't expose that functionality. Hopefully all DX9 hardware will support DX9 functionality equally, in which case the playing field will be levelled somewhat, as the DX9 Cg profile will cater for all hardware that supports DX9.

This approach could cause nightmares for developers wanting to make use of high-level shading languages.

Well, supporting separate compilers is not as difficult as supporting separate languages; it is similar to me taking some C code and compiling it on HP-UX and Sun to allow the application to run on the respective hardware and OS.

However, it does raise the question of how they would do this. Do you ship the Cg code on the game CD and use a runtime compiler, thus hoping that all the other vendors either have their own runtime compiler in their drivers or are shipping nVIDIA's generic one? It doesn't seem likely that that will happen anytime soon. Alternatively, do you generate the assembly code through each compiler and possibly have hardware-specific executables (errr, no), or attempt to piece each one together and produce code branches for the particular hardware that's being supported? Ouch, although probably not too dissimilar to what may actually be occurring for those that wish to use PS1.1/1.3/1.4 (even though they have to do it longhand).
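
For what it's worth, the first option (ship the Cg source, compile at load time) would presumably look something like the sketch below. I'm going from memory of the cgGL runtime here, so names like cgGLGetLatestProfile and cgGLLoadProgram are my assumptions rather than confirmed API, and a GL context is assumed to be current already.

/* Hypothetical load-time path: shader.cg ships on the CD, the runtime picks
 * the best fragment profile the installed card/driver supports, compiles the
 * source, and hands the result to the driver.  This is exactly the point
 * where another vendor would need either their own profile or the generic
 * one.  Function names are from memory of the cgGL runtime -- assumptions. */
#include <stdio.h>
#include <Cg/cg.h>
#include <Cg/cgGL.h>

CGprogram load_fragment_shader(CGcontext ctx, const char *path)
{
    CGprofile profile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, path,
                                             profile, "main", NULL);
    if (!prog) {
        fprintf(stderr, "Cg compile failed: %s\n",
                cgGetErrorString(cgGetError()));
        return NULL;
    }

    cgGLLoadProgram(prog);    /* compiled code goes to the driver */
    return prog;
}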

Either way, this is cumbersome and annoying, which is why it should be transparent for the developer to support the features they want.

As an aside, I wonder why nVIDIA haven’t created a code path for PS1.3 yet?
 
My guess is Cg will be good for some work but not all work.

High-speed, great-looking FPS titles will probably be hand-coded and optimized.

Some people (any guess?) will continue to do "graphics acrobatics" that are not in the books 8)
 
I was watching my friend Gump use the Cg compiler on his PC, running an Athlon and Radeon 8500. Some of the shader effects did not work properly. Gotta make you wonder if the whole thing is just biased towards Nvidia...
 
I was watching my friend Gump use the Cg compiler on his PC, running an Athlon and Radeon 8500. Some of the shader effects did not work properly.

That's not exactly a conclusive, exhaustive test; after all, that could just be a driver issue.
 
it is similar to me taking some C code and compiling it on HP-UX and Sun to allow the application to run on the respective hardware and OS

Bad example. If you've ever ported ANSI C code between GNU, HP/UX and Solaris C compilers, you'd know what a freaking nightmare this is. :)
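
Just to illustrate the kind of thing that bites you (a toy example of my own, nothing to do with Cg itself): the fragment below "works" when built with gcc on x86, but the misaligned dereference gets you a SIGBUS on SPARC/Solaris and a trap (or a painfully slow kernel fixup) on PA-RISC/HP-UX.

/* Classic porting gotcha: dereferencing a misaligned pointer is undefined
 * behaviour.  x86 quietly tolerates it; strict-alignment RISC targets don't. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned char packet[8];
    int value = 42;
    int out;
    int *p;

    memset(packet, 0, sizeof packet);
    memcpy(packet + 1, &value, sizeof value);   /* portable copy in */

    p = (int *)(packet + 1);    /* misaligned pointer */
    printf("%d\n", *p);         /* fine on x86, dead on SPARC */

    memcpy(&out, packet + 1, sizeof out);       /* portable copy out */
    printf("%d\n", out);
    return 0;
}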
 