How Cg favors NVIDIA products (at the expense of others)

demalion said:
What if they proceed by specifying specific alterations as time progresses, in such a way that it adds functionality as they see fit?
Hmm...what if a suite of development tools tuned to producing output for the particular specification in question evolved, but only along the path of the altered vendor-centric specification, and became popular among developers?
[snip]


Now that nVidia has, apparently, fully open-sourced the language, is there any reason why nVidia alone now has control over the specification? Anyway, I still think that the only potential Cg really has is as a cross-API language...only if nVidia can manage to merge both the DX and GL HLSLs into Cg will it be truly successful.

If this merging happens, then Cg will effectively be out of nVidia's hands. In order to retain compatibility with DX and GL, Cg would need to include any other specification changes that other IHVs push forward (or, perhaps more accurately, nVidia would have to allow other IHVs to include such changes in their own versions of the compiler).

I just hope that nVidia has done a good job of making it easy for other IHVs to make "plugins" that work well with nVidia's Cg, so that runtime Cg can compile to any possible profile.
 
It's not FULLY open sourced...the back end is still proprietary...that's the part that I'd like to see...and the only way we can prove this 'technical reason' would be if that portion was opened up.

Why would Nvidia hold back OpenGL?? What's the purpose?? Think about it...
 
Chalnoth said:
demalion said:
What if they proceed by specifying specific alterations as time progresses, in such a way that it adds functionality as they see fit?
Hmm...what if a suite of development tools tuned to producing output for the particular specification in question evolved, but only along the path of the altered vendor-centric specification, and became popular among developers?
[snip]


Now that nVidia has, apparently, fully open-sourced the language, is there any reason why nVidia alone now has control over the specification?


I know we have a thread title stating that, but I was under the impression that the thread title was misleading? AFAIK, your statement here is inaccurate...if it isn't, please enlighten.
 
Doomtrooper said:
It's not FULLY open sourced...the back end is still proprietary...that's the part that I'd like to see...and the only way we can prove this 'technical reason' would be if that portion was opened up.

Why would Nvidia hold back OpenGL?? What's the purpose?? Think about it...

From the press release at 3DGPU that this thread is based on:

Available in August for download from the developer.nvidia.com and www.cgshaders.org Web sites, this code will contain the parser that reads the language and creates intermediate code for compilation, as well as a generic back-end.

Granted, this may mean that nVidia will not give out the code for their specific PS1.x and 2.x profiles, but it will certainly allow other IHVs to make their own backends.
 
So, in the three hours I've been gone to inspect the progress on my house, I see only one vaguely technical argument, and that one is sorely lacking in factual basis.

The one semi-technical argument is that the back end is not open source, or that NVIDIA prevents other IHVs from developing their own backends.

Which is, from what I can tell, hogwash.

NVIDIA will be releasing source code to the "front-end" of the compiler, and a simple back-end. This code will contain the parser and basic non-back-end-specific parts of the compiler, and the back-end we create will walk through the parsed program and print out some human-readable output.

Right there. The front end is open, and they give an example back-end, which is meant to be rewritten by each IHV for its own hardware anyway. Since the back end is essentially open source and is meant to be a plug-in, how can NVIDIA somehow make bad code for everybody else if everybody else is providing their own back end?
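To make the "plug-in" arrangement concrete, here is a minimal sketch in plain C. All of the type and function names below are hypothetical, not NVIDIA's actual interfaces: the idea is simply that the shared front end parses Cg into some intermediate representation, and each IHV supplies its own code generator that walks that IR and emits whatever its hardware wants.

```c
/* Hypothetical sketch of a pluggable Cg back-end; none of these names are
 * NVIDIA's actual interfaces. The shared front end parses the language and
 * hands the back end an intermediate representation to walk. */
#include <stdio.h>

/* Stand-in for whatever IR the real front end produces after parsing. */
typedef struct IRNode {
    const char    *op;      /* e.g. "mul", "add", "dot" */
    struct IRNode *next;
} IRNode;

/* A back end is just a code generator the toolchain can swap out. */
typedef void (*backend_emit_fn)(const IRNode *program, FILE *out);

/* The "generic" back-end NVIDIA describes: walk the parsed program and
 * print something human-readable. An IHV would register a function that
 * emits its own hardware's instructions instead. */
static void generic_backend_emit(const IRNode *program, FILE *out)
{
    for (const IRNode *node = program; node != NULL; node = node->next)
        fprintf(out, "op: %s\n", node->op);
}

int main(void)
{
    IRNode add = { "add", NULL };
    IRNode mul = { "mul", &add };

    backend_emit_fn emit = generic_backend_emit;  /* choose a back end */
    emit(&mul, stdout);
    return 0;
}
```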

And, thank you very much, every one of you who immediately launched into a political discussion. Even more thanks to the person who decided the technical discussion in general didn't merit addressing, and THEN launched into a political discussion.

But let's go down that path brought up about the potential of Cg being morphed into something that benefits NVIDIA over other companies by manipulating the language specification:

Is Cg anything more than loops, arrays, and function calls? What conceivable things could NVIDIA add to it to make it better for NVIDIA and worse for everybody else? Anybody? Bueller?
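For what it's worth, the constructs in question really are just C constructs. Here is a minimal sketch in plain C, standing in for Cg's built-in vector types and intrinsics (the names below are illustrative, not Cg's standard library), showing the kind of thing the language surface consists of: loops, arrays, and function calls.

```c
/* Illustrative only: plain C standing in for Cg's C-like surface syntax.
 * Real Cg has built-in types like float3 and intrinsics like dot(), but
 * the language constructs are the same: loops, arrays, function calls. */
typedef struct { float x, y, z; } vec3;

float dot3(vec3 a, vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Accumulate simple diffuse lighting over an array of light directions. */
float diffuse_sum(vec3 normal, const vec3 lights[], int num_lights)
{
    float sum = 0.0f;
    for (int i = 0; i < num_lights; ++i) {
        float d = dot3(normal, lights[i]);
        if (d > 0.0f)
            sum += d;
    }
    return sum;
}
```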
 
pascal said:
I am a consumer; I would rather have an open standard, not Cg or RenderMonkey or M$ HLSL.
Stanford RTSL, go go go! :)

Seriously though, as a consumer, I couldn't care less about open standards, as long as we finally get some sort of standard that helps increase the number of advanced features in upcoming games. If I have a choice between two similarly powerful alternatives, and one of them is open, I'd choose it any day. But if I have two similarly powerful choices, yet the open standard is going to be months or even years late, I'm all for going with the proprietary solution for the time being.

Things are constantly evolving; just because one standard might be chosen today doesn't mean it's gonna last forever. Newer, better things are going to appear over time and replace the first couple of solutions; it's always worked like this...


Chalnoth said:
If this merging happens, then Cg will effectively be out of nVidia's hands. In order to retain compatibility with DX and GL, Cg would need to include any other specification changes that other IHVs push forward.
Hmm, something worth considering, haven't thought about it that way yet...
 
RussSchultz said:
So, in the three hours I've been gone to inspect the progress on my house, I see only one vaguely technical argument, and that one is sorely lacking in factual basis.

The one semi-technical argument is that the back end is not open source, or that NVIDIA prevents other IHVs from developing their own backends.

Which is, from what I can tell, hogwash.

NVIDIA will be releasing source code to the "front-end" of the compiler, and a simple back-end. This code will contain the parser and basic non-back-end-specific parts of the compiler, and the back-end we create will walk through the parsed program and print out some human-readable output.

Right there. The front end is open, and they give an example back-end, which is meant to be rewritten by each IHV for its own hardware anyway. Since the back end is essentially open source and is meant to be a plug-in, how can NVIDIA somehow make bad code for everybody else if everybody else is providing their own back end?

And, thank you very much, every one of you who immediately launched into a political discussion. Even more thanks to the person who decided the technical discussion in general didn't merit addressing, and THEN launched into a political discussion.

But let's go down that path brought up about the potential of Cg being morphed into something that benefits NVIDIA over other companies by manipulating the language specification:

Is Cg anything more than loops, arrays, and function calls? What conceivable things could NVIDIA add to it to make it better for NVIDIA and worse for everybody else? Anybody? Bueller?

I stated why I think your request is a faulty one, and provided a line of comments indicating why.

We do not know how the demands of the industry will evolve, so when you ask for technical justification you are placing the burden of predicting the industry, and of justifying that prediction, on the wrong side of the argument. As I attempted to illustrate with my "what if" examples, the indication is that benefit at the expense of competitors is the more reasonable expectation, not the less reasonable one.

Couldn't you say HTML is nothing more than tags and syntax? Would you agree that Microsoft has effectively locked down HTML's evolution to fit its own context, and delayed adoption to suit its own pace and goals? How about Java?

I think my post is very relevant to a technical discussion about this...it is not political; its focus is on a scenario that is directly and logically relevant to the discussion. Or does logic not fit in with a technical discussion?

EDIT: Please save me time by not confusing my line of discussion with Doomtrooper's, and address our points separately.
 
RussSchultz said:
Is Cg anything more than loops, arrays, and function calls? What conceivable things could NVIDIA add to it to make it better for NVIDIA and worse for everybody else? Anybody? Bueller?

For crying out loud...Gking already posted earlier that Cg could expose the more powerful register combiners on NV1x and NV2x...
What about NV30...what if there is no DX 9.1? What's gonna utilize the extra instructions? DX9 is not going to...enter Cg...

I can reverse this argument: provide proof to us non-believers that there are no Nvidia-specific optimizations in Cg :rolleyes:
In fact, unless you have the source code to prove this, your argument holds less water than ours...thx to Gking in the early Cg thread about exposing specific Nvidia strengths...that's all I needed to read.
 
But let's go down that path brought up about the potential of Cg being morphed into something that benefits NVIDIA over other companies by manipulating the language specification:

Is Cg anything more than loops, arrays, and function calls? What conceivable things could NVIDIA add to it to make it better for NVIDIA and worse for everybody else? Anybody? Bueller?

Decide to charge a licensing fee to anyone that creates a compiler that works on the language defined by its specification?

Russ, as long as we're playing these games, try this one:

Why would nVidia bother to retain ownership of the language specification, if it's nothing more than "loops, arrays, and function calls" which have no practical bearing on IHV implementation whatsoever? Why would nVidia create the language spec AT ALL, considering another group of fully compatible "loops, arrays, and function calls" (DX9 HLSL) is around the corner?
 
What cracks me up is that some of you want to chastise a company for spending their own money developing something and not releasing it as 100% free and open source so that their competitors can directly see as much benefit from the work as they do.

Next thing you know, you'll be saying it's wrong for beer companies to make beer unless they give everyone the recipe and instructions on how to brew it using their competitor's brewery equipment.

If you don't like it, move to China, I hear there isn't as much of a problem with capitalism there. Or better yet, spend a few thousand hours writing your own shading language and include support for every feature of every video card, distribute it freely and openly, and don't expect anything in return. After all, if you want something done right (i.e. the way you want it), do it yourself.
 
Doomtrooper said:
For crying out loud...Gking already posted earlier that Cg could expose the more powerful register combiners on NV1x and NV2x...
What about NV30...what if there is no DX 9.1? What's gonna utilize the extra instructions? DX9 is not going to...enter Cg...

Um, no. Cg compiles to assembly instructions. It needs the assembly support from DX or OpenGL in order to work. In other words, if Microsoft refuses to reveal certain functionality through DX (highly unlikely), then nVidia can't use Cg as a workaround whenever DX is in use.
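To put that concretely: the sketch below uses the Cg runtime's C API roughly as it appears in shipping Cg Toolkit releases (the 2002-era beta may differ in detail). It compiles a trivial Cg vertex program against an existing OpenGL assembly target, ARB_vertex_program, and prints the result. The point is that the output is ordinary shader assembly, so Cg can only reach functionality that DX or GL already exposes.

```c
/* Rough sketch against the Cg runtime C API (per later Cg Toolkit releases;
 * the 2002 beta may differ). Cg source is compiled to an *existing* assembly
 * profile -- here OpenGL's ARB_vertex_program -- and the result is ordinary
 * shader assembly that the underlying API must already support. */
#include <stdio.h>
#include <Cg/cg.h>

int main(void)
{
    const char *source =
        "void main(float4 pos : POSITION,"
        "          out float4 opos : POSITION,"
        "          uniform float4x4 mvp)"
        "{ opos = mul(mvp, pos); }";

    CGcontext ctx = cgCreateContext();
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, source,
                                     CG_PROFILE_ARBVP1, "main", NULL);
    if (prog) {
        /* Plain ARB_vertex_program assembly text comes back out. */
        puts(cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    } else {
        fprintf(stderr, "compile failed: %s\n",
                cgGetErrorString(cgGetError()));
    }
    cgDestroyContext(ctx);
    return 0;
}
```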
 
BS...this has nothing to do with releasing it for free...what about RenderMonkey? ATI is releasing it for free; same goes here. There is a specific reason why Nvidia started working on Cg; there is no reason why an IHV would start its OWN shader language when they were supposed to be working on the two independent HLSLs for DX and OGL.
 
Chalnoth said:
Doomtrooper said:
For crying out loud...Gking already posted earlier that Cg could expose the more powerful register combiners on NV1x and NV2x...
What about NV30...what if there is no DX 9.1? What's gonna utilize the extra instructions? DX9 is not going to...enter Cg...

Um, no. Cg compiles to assembly instructions. It needs the assembly support from DX or OpenGL in order to work. In other words, if Microsoft refuses to reveal certain functionality through DX (highly unlikely), then nVidia can't use Cg as a workaround whenever DX is in use.

http://www.beyond3d.com/forum/viewtopic.php?t=1259&postdays=0&postorder=asc&start=50


It's worth pointing out that "adding functionality" to DX often takes months after hardware is available because it *is* Microsoft controlled, and even then not all functionality is added, as Microsoft plays political games with hardware manufacturers. The register combiners in NV1x and NV2x series chips are far more powerful than what Microsoft exposed with texture environments and pixel shaders.
 
There's an excellent reason, actually.

It's so that game developers start to use more advanced features sooner, to make their high-end products have more appeal.

Don't forget that Cg (and RenderMonkey, to be fair) are out before the HLSLs for DX or OpenGL.
 
What cracks me up is that some of you want to chastise a company for spending their own money developing something and not releasing it as 100% free and open source so that their competitors can directly see as much benefit from the work as they do.

Actually, I think you have that backwards. ;)

The people "chastising" nVidia are the ones that RECOGNIZE that when a company does something like this, OF COURSE they aren't going to do it so that the competitors benefit as much from the work as they do.

I think it's completely REASONABLE for nVidia to hold on to as much IP as they can...I agree....they did do the work after all.

That being said, we can still question whether or not the move is a benefit to the industry as a whole, in addition to being a benefit to nVidia...OR if it doesn't really progress the industry at all (or perhaps even hurts it).

I think we all agree that HLSLs are the next step. But there is a valid argument to be had about whether or not "the industry" needs a Cg language in addition to coming "industry standards" like DX9 HLSL, OpenGL 2.0, RenderMan, etc.
 
You're right Doomtrooper, everyone should pass their hardware design documents through Microsoft and the ARB before they even think about making a product, because heaven forbid they have features not covered by an API and decide to make their own software tools to help expose them.

You bitch about vendor specific extensions.

You bitch about Microsoft controlling features.

You bitch about anything and everything developed by NVIDIA.

You bitch about how much Microsoft and NVIDIA are ruining the world and holding the rest of the world down.

Here's a suggestion: install Linux and spend your time watching ATI demos and take comfort in knowing that NVIDIA and Microsoft haven't touched anything on your computer.
 
Can one of you anti-Cg people please post a complete side-by-side analysis of how DX9 HLSL and Cg differ? Joe? Where's your evidence?
 
Thank you for dragging this potential discussion into the toilet.

Could we have a moderator delete this thread? It's obviously a continuation of the longer NV30 vs. R300 thread and isn't serving a purpose on its own.

Next time, when I ask you to keep politics out of the conversation, could you at least humor me? If you don't like the rules of "my" thread, go make your own damn thread.

Oh, and as a parting shot, because I'm mighty pissed at the moment: Some of you folks are just plain ignorant, which is OK by me--there's a lot of things I am clueless about too. What's so f'kng irritating is that you apparently like it that way and don't seem to think it should impede you from opening your mouth over and over and over again. I'm sure Ben6 got a little crick in his neck when I said that, but I just couldn't control myself.
 
My evidence of how DX9 HLSL and the Cg language differ? I haven't said they differ. I said they are functionally compatible.

That's my point:

Why would the two need to exist if they are compatible / interchangeable, as nVidia has claimed?
 
Joe DeFuria said:
My evidence of how DX9 HLSL and the Cg language differ? I haven't said they differ. I said they are functionally compatible.

That's my point:

Why would the two need to exist if they are compatible / interchangeable, as nVidia has claimed?
Excellent question.
 