How Cg favors NVIDIA products (at the expense of others)

DemoCoder said:
OGL 2.0 will be accepted, ratified, and maintained and updated by industry representatives, not 3dlabs. I'm not sure if I made my viewpoint clear on 3dlabs in a post or a PM, but I have stated before that if this wasn't the case, it would be just as fair to hold this discussion about it.

So will Cg, now that it has been submitted to the ARB.

I do believe I've also said that if indeed Cg is opened up to the industry to evolve as it sees fit, it would be a terrific thing. The problem is that my memory of the board meeting notes gives me an impression directly in contradiction to this. If you can correct my impression, or post the links I've asked for, you'll have answered my questions. I will point out that the only thing that has stopped you from doing this for the past 6 pages or so is your choosing to ignore my posts.

The question is: if 3DLabs' proposal is shot down and Cg is adopted, will you be happy?

If the above is true, well, certainly! If not, I think it would be disastrous. I don't think you went and read the post I gave a link to.

Secondly, if 3DLabs ends up shipping their proposal with the P10 as a beta toolkit BEFORE OGL 2.0 is ratified, are you going to criticize them as fiercely as you have criticized NVidia?

I can't help but think you aren't reading carefully... if ANYONE prevents the industry from controlling the future of the HLSL specification, leaving it instead to one vendor who has reason to consider only its own interests when determining when and how the standard will adapt to future technologies... it is undesirable, whether that vendor is 3dlabs or nVidia or whomever. I don't have a problem with either 3dlabs or nVidia developing it, or benefiting from the first optimized implementation of it, and since I'm sure I've stated that before, I tend to think this is the result of your lumping me in with Doom and Hell.

Both Nvidia and 3DLabs now have a proposal before the ARB.

Yep, I'm aware.
 
Some of the Cg naysayers make the claim (paraphrased) "Nvidia is a business in a competitive industry, so Cg is clearly intended to give them an advantage over their competitors. Why else would they have developed it and be pushing it." They then go on to extrapolate all kinds of Doomsday scenarios where Nvidia destroys the competition through unfair control of Cg.

Others have already poked many holes in this argument. E.g. even if there was a way to disadvantage other chipmakers with control of the Cg spec, could there be a better/faster way to piss off developers ("my game won't work on ATI hardware anymore, there goes half my potential market. Thanks Nvidia.") and destroy your reputation in the marketplace?

Here's another motive for Nvidia to push Cg, and one which I think better follows Occam's Razor (the simplest explanation is usually the best): Nvidia wants Cg to become and remain the premier shading language in the future simply because it looks good for them. They want developers, hardware manufacturers, and software vendors across the entire 3D industry to support Cg, an Nvidia-developed language, because it is a demonstration of their technical prowess. They want Cg to become an accepted standard because it gives them clout. It's advertising. It's publicity. It gets their name out there as a leader in the world of 3D.

And that's why they'll do everything they can to keep Cg fair and open, and NOT disadvantage any other chip manufacturers. If they use Cg to gain a technical advantage, then (as DemoCoder and others repeatedly point out) developers will simply move on to other tools and Cg will die in the marketplace. Who thinks Nvidia wants to see that happen? Nvidia has spent too much money in development and marketing of Cg, they don't want to see it die. Pretty simple really.
 
SteveG said:
Here's another motive for Nvidia to push Cg, and one which I think better follows Occam's Razor (the simplest explanation is usually the best): Nvidia wants Cg to become and remain the premier shading language in the future simply because it looks good for them. They want developers, hardware manufacturers, and software vendors across the entire 3D industry to support Cg, an Nvidia-developed language, because it is a demonstration of their technical prowess. They want Cg to become an accepted standard because it gives them clout. It's advertising. It's publicity. It gets their name out there as a leader in the world of 3D.
The problem is that they may be doing this at the expense of the standard itself. If Nvidia had actually collaborated with other industry leaders on the development of Cg, then it wouldn't be so easy for them to slap their logos and advertising over any mention of the name. So they've deliberately kept information about Cg close to their chest for as long as possible in order to retain more ownership of it - and possibly to keep as large a lead as possible over the competition in the critical area of compiler technology.

Compare and contrast this with the 3DLabs approach to OpenGL 2.0. They've already been working with other graphics companies for months on its development, and they don't try to lay claim to it and take credit for it the way Nvidia is doing with Cg. ATI was already showing OGL 2.0 code running at SIGGRAPH, before the final spec was even completed. That's the way an open industry standard should be.

SteveG said:
And that's why they'll do everything they can to keep Cg fair and open, and NOT disadvantage any other chip manufacturers. If they use Cg to gain a technical advantage, then (as DemoCoder and others repeatedly point out) developers will simply move on to other tools and Cg will die in the marketplace. Who thinks Nvidia wants to see that happen? Nvidia has spent too much money in development and marketing of Cg, they don't want to see it die. Pretty simple really.
This makes sense, but it doesn't look like they've figured out yet what it will take to achieve their goals. It seems like they figured their solution was so superior to anything else that they just assumed everyone would want to use it. Now they're slowly starting to accept reality, and considering things like open-sourcing the back-end compiler, merging Cg with the OGL 2.0 HLSL, etc. The question is, how much control will they be willing to relinquish for the good of the industry?
 
To my knowledge, nothing has changed in how nVidia is treating Cg since the original announcement (though I admit I thought it had at one point... that the backend was open-sourced...).

nVidia is still opening everything up except the profiles that compile to specific vertex/pixel shader versions. They are apparently releasing an open-source example backend.
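
For anyone unclear on what a profile is: it's the back-end target the Cg compiler emits code for, and it's the piece nVidia is keeping to itself. A minimal sketch of the idea, assuming cgc's documented -profile switch; the shader itself is just a hypothetical example:

// The same Cg source can be compiled against different profiles, e.g.
//   cgc -profile vs_1_1 minimal.cg   (DirectX 8 vertex shader target)
//   cgc -profile arbvp1 minimal.cg   (OpenGL ARB_vertex_program target)
// The profiles are the closed part; the language and front-end are open.
void main(float4 position  : POSITION,
          out float4 oPos  : POSITION,
          uniform float4x4 modelViewProj)
{
    // Transform the vertex into clip space.
    oPos = mul(modelViewProj, position);
}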
 
No, I'd say the Cg critics like myself have reason to be skeptical... the other factor I have is contacts at ATI itself... add the two together.

The other factor here is simple: we don't need three different High Level Shader Languages. The pro-Cg people harp that Cg was needed... hardly. RenderMonkey is the proper way to approach the issue.

The real REASON these High Level Shader Languages were coming was to aid in the coding of shading effects.
RenderMonkey does all this, and it works with the current industry-standard rendering software and the other two industry-standard shading languages, DX9 HLSL and OGL 2.0 HLSL.

Supposedly Nvidia is working with Microsoft on the DirectX 9 High Level Shader Language, along with the other players, so why would Nvidia all of a sudden release their own HLSL when they should have been putting their efforts into DX9... there is a reason why they would put manpower towards both their own language and DX9... some people are too blind to face facts here.

Nvidia is supposed to be supporting and pushing OGL ARB standards forward, working on OGL 2.0 with the other ARB members, but at the latest ARB meeting they, along with M$, block the extension and then offer Cg... chess, you say, is correct.

DemoCoder talks about the ARB being slow; maybe he should spend the time reading the ARB notes and see why the ARB has been so slow :rolleyes:
 
SteveG said:
Some of the Cg naysayers make the claim (paraphrased) "Nvidia is a business in a competitive industry, so Cg is clearly intended to give them an advantage over their competitors. Why else would they have developed it and be pushing it." They then go on to extrapolate all kinds of Doomsday scenarios where Nvidia destroys the competition through unfair control of Cg.

Others have already poked many holes in this argument. E.g. even if there was a way to disadvantage other chipmakers with control of the Cg spec, could there be a better/faster way to piss off developers ("my game won't work on ATI hardware anymore, there goes half my potential market. Thanks Nvidia.") and destroy your reputation in the marketplace?

Here's another motive for Nvidia to push Cg, and one which I think better follows Occam's Razor (the simplest explanation is usually the best): Nvidia wants Cg to become and remain the premier shading language in the future simply because it looks good for them. They want developers, hardware manufacturers, and software vendors across the entire 3D industry to support Cg, an Nvidia-developed language, because it is a demonstration of their technical prowess. They want Cg to become an accepted standard because it gives them clout. It's advertising. It's publicity. It gets their name out there as a leader in the world of 3D.

And that's why they'll do everything they can to keep Cg fair and open, and NOT disadvantage any other chip manufacturers. If they use Cg to gain a technical advantage, then (as DemoCoder and others repeatedly point out) developers will simply move on to other tools and Cg will die in the marketplace. Who thinks Nvidia wants to see that happen? Nvidia has spent too much money in development and marketing of Cg, they don't want to see it die. Pretty simple really.

Oh, I am sorry,
now nVidia is not a company anymore :rolleyes:
All I want is a non-proprietary open standard for the good of all consumers.

And their intention (from the bottom of their hearts) is to be "fair and open", and you can guarantee that. :LOL: :LOL:
 
Doomtrooper said:
No, I'd say the Cg critics like myself have reason to be skeptical... the other factor I have is contacts at ATI itself... add the two together.

And of course ATI, Nvidia's biggest competitor at the moment, is supplying you with completely unbiased information.
 
Bjorn,

ATI didn't write an HLSL, did they... they helped develop an industry STANDARD HLSL with DirectX 9 and are also working on OGL 2.0; then, to aid the use of shaders, they developed a plugin that works with industry-standard rendering software (companies have a lot of money invested in this software for licensing fees, training, etc.) and the upcoming industry standards DX9 and OGL 2.0...

I can see why you think ATI is in the wrong :LOL:
 
I assumed from your post you think ATI is lying or providing false information, or not addressing the real reason HLSLs are here in the first place :-? ....
 
Doomtrooper said:
I assumed from your post you think ATI is lying or providing false information, or not addressing the real reason HLSLs are here in the first place :-? ....

There is a difference between what a company is saying and what they're doing. Besides, something that is biased doesn't necessarily have to be false.
 
Reverend said:
Cg = DX9 HLSL

My name is Nick Triantos. One of my responsibilities at NVIDIA is managing the development of many parts of the Cg Toolkit, including the language specification and the compiler. I wanted to address one of the points made on this thread:

Anonymous wrote:
Why do you insist on calling Cg a standard? It's created by one company without any input from others. NVIDIA makes it sound like they worked with MS on this, but all they did was check with MS to make sure that the code that is generated is actually compatible with the MS Pixel/Vertex Shader Instructions.


Actually, you're not correct. NVIDIA has evolved Cg in close collaboration with Microsoft, and we've also had the language reviewed by many software developers from game companies, rendering companies, tool providers, etc. Both NVIDIA and Microsoft have made changes to our respective languages, so that the high-level languages are completely compatible.

Compatible but not one and the same.
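
To illustrate: a program written only with constructs the two languages share should compile as either Cg or DX9 HLSL, while each language still has features and profiles the other lacks. A hypothetical fragment program in that shared subset:

// Uses only the common subset of Cg and DX9 HLSL: the same types,
// the same binding semantics, and the same intrinsic functions.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap,
            uniform float4 tint) : COLOR
{
    float4 base = tex2D(baseMap, uv); // texture fetch, identical in both
    return lerp(base, tint, tint.a);  // blend toward the tint by its alpha
}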
 
pascal said:
And their intention (from the bottom of their hearts) is to be "fair and open", and you can guarantee that. :LOL: :LOL:

Yes. Because if they do otherwise, then Cg fails. It's really that simple. The paranoia on this board is out of control.

I'm not arguing that Nvidia's motive is altruism. Clearly they hope to benefit from Cg. However, as I stated above, those benefits are contingent upon Cg being embraced by the market. If they are not fair and open, Cg fades away (remember Glide?), and then they reap zero benefit.
 
SteveG said:
pascal said:
And their intention (from the bottom of their hearts) is to be "fair and open", and you can guarantee that. :LOL: :LOL:

Yes. Because if they do otherwise, then Cg fails. It's really that simple. The paranoia on this board is out of control.

I'm not arguing that Nvidia's motive is altruism. Clearly they hope to benefit from Cg. However, as I stated above, those benefits are contingent upon Cg being embraced by the market. If they are not fair and open, Cg fades away (remember Glide?), and then they reap zero benefit.

Sent you a PM earlier...maybe I should have posted it here.
 
Doomtrooper said:
The other factor here is simple: we don't need three different High Level Shader Languages.

Erm... why not? How can having an alternative harm me as a developer in any way? Who are you to say that Pascal should never have been born because we have C?
Embrace the Tao, young apprentice
"Each language has its purpose, however humble. Each language expresses the Yin and Yang of software. Each language has its place within the Tao.

But do not program in COBOL if you can avoid it."
:D :D

Just FYI, I'm an independent developer, working on my first bigger project right now (spare time). Because the focus has been on quite *ahem* unusual aspects of the final product up until now, I have been using an ages-old software engine for 3D when I need it at all. I'm no 3D guru myself, I just know the basics.
In a while I will get to work on all the bells'n'whistles, and this includes rendering code as well (some of you Doom/Quake fans might think I have got the project development stages all backwards, but it's none of your business :p )
At the moment I will be staying with OGL, and for shaders it's gonna be Cg all the way, targeting generic DX8.0-class HW (especially Xabre and XP4 ;) ). Why? GL because it's cross-platform and I won't bother learning all the tricks of vendor-specific extensions; plus, I'm not much into PS assembler... I used to be a fan of x86 assembly once, but after 8088 it got messy :p
Now tell me: why should I abandon this plan?
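
For what it's worth, targeting DX8.0-class HW keeps the fragment side tiny; profiles at the ps_1_1 level only allow a handful of texture and arithmetic ops per pass (and whether a given vendor's OpenGL driver exposes a suitable profile is another question). A hypothetical Cg fragment program that fits inside that budget:

// Simple enough for a DX8-class fragment profile: one texture fetch
// modulated by the interpolated vertex color.
float4 main(float2 uv      : TEXCOORD0,
            float4 diffuse : COLOR0,
            uniform sampler2D baseMap) : COLOR
{
    return tex2D(baseMap, uv) * diffuse;
}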
 
SteveG said:
pascal said:
And their intention (from the bottom of their hearts) is to be "fair and open", and you can guarantee that. :LOL: :LOL:

Yes. Because if they do otherwise, then Cg fails. It's really that simple. The paranoia on this board is out of control.

I'm not arguing that Nvidia's motive is altruism. Clearly they hope to benefit from Cg. However, as I stated above, those benefits are contingent upon Cg being embraced by the market. If they are not fair and open, Cg fades away (remember Glide?), and then they reap zero benefit.
Please stop with this "The paranoia on this board is out of control", because I could start saying that there are too many naive people here.

Good for you that you don't believe in altruism.
When someone is gambling, they usually try to get as much as possible.

There are many players (nVidia, ATI, 3DLabs, M$, etc...).
Is it too much to ask for a non-proprietary open standard?

3dfx died first, not Glide (for other reasons). Many people still like to play Unreal with their V5.
 