How Cg favors NVIDIA products (at the expense of others)

Doomtrooper said:
Chalnoth, they are going to code a game that the R300 couldn't handle :LOL: ...and we still haven't got more than a handful of DX8 games on the market now, with DX9 releasing in a couple of months...

You think I was trying to say that? "Less than optimal" does not equate to "does not run."

So you think Cg HLSL will be more popular than Microsoft's official DX9 HLSL... I don't.

Considering the two may well turn out to be the same language, there wouldn't be much difference, other than the compiler, between supporting one or the other. We will see, but this could be a totally moot point.

In other words, if the two languages turn out to be the same, it would be silly for any developer using DX9's HLSL not to use the Cg compiler for at least nVidia's cards, as that would produce optimal performance for those cards. If ATI doesn't make their own profiles, then it's their loss.
 
No, it wouldn't... RenderMonkey and DX9 HLSL will ensure ATI's interests are maintained.

Let's be clear here... Cg is a subset of DX9 HLSL... they are not the same...
 
also freaking pissing me off because there are a bunch of know-nothings like Doomtrooper and HellBinder making comments about the development process

First... I am not a know-nothing...

Second, my arguments, as I stated, were political/corporate/PR in nature... not the nitty-gritty of it. I also CLEARLY stated that technical people (like yourself) would not understand the argument because you are incapable of backing off and looking at the big picture...

However, you can think of me any way you like..
 
'Cg is a subset of DX9 HLSL...'

By the time DX9 ships, it will probably be the OTHER WAY AROUND.

As of right now, it looks to be a near isomorphism.

Some of you guys seem to not understand what's been written 20 times already.

A developer writes code for Cg, and even if there are no optimizations for ATI (if they don't write their own profile), it will default to DX9 HLSL and whatever the OGL extension consensus is right now. This will presumably COMPILE and RUN on any ATI card that supports DX9 and OGL 1.xx (edit: 2.0 in the future). That's the point at least!

In fact, assuming the developer writes fairly simple shaders, say for DX 8.1, these too will compile and run on cards like the GeForce2, Parhelia, and R8500.

If Cg becomes widespread, it will be in ATI's interest to write their own compiler, so they can squeeze all the performance out of it that they can.
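To make the single-source, multiple-profile idea concrete, here's a rough sketch. The shader below is a made-up example (the struct and parameter names are mine, not from any shipping demo), and the profile names are the ones documented for the public Cg toolkit, so the exact set in the current beta may differ:

struct VertIn {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

struct VertOut {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

// Transform to clip space and pass the vertex color straight through.
VertOut main(VertIn IN, uniform float4x4 modelViewProj)
{
    VertOut OUT;
    OUT.position = mul(modelViewProj, IN.position);
    OUT.color    = IN.color;
    return OUT;
}

The same source file can then be compiled for whichever back end the target card supports, for example:

cgc -profile vs_1_1 simple.cg    (DirectX 8 vertex shader assembly)
cgc -profile arbvp1 simple.cg    (ARB_vertex_program output for OpenGL)

An ATI-written profile would just be one more target for the same source; nothing in the source itself is NV30-specific.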
 
This will presumably COMPILE and RUN on any ATI card that supports DX9 and OGL 2.0. That's the point at least!

OpenGL 2 doesn't exist at the moment. What you will have in the interim will be NVIDIA-specific extensions for NV30, or OpenGL-NV30 as NVIDIA calls it.
 
OGL2 doesn't exist, but it's still possible to write a backend for it and test it against 3DLabs' shader compiler as a unit test. It won't be possible to do an end-to-end test until 3DLabs writes an OGL2.0 ICD prototype driver to test against. But presumably, the backend can be maintained to track the evolving OGL2.0 HLSL language and use 3DLabs' tool to compile the shaders. If the generated shaders pass 3DLabs' compiler, then presumably they will run on hardware. If not, then there is a problem in the HLSL spec. There's also no guarantee that 3DLabs' proposal will be the OGL2.0 language. It's just a proposal. Often, proposals go through several revisions and are merged with other proposals before becoming final.

Anyway, can you blame NVidia for making extensions now? ATI will probably make extensions for OpenGL 1.4 too. No one can wait 1+ years until OGL2.0 is approved by the ARB.

One reason DirectX has advanced so quickly is because it has a benevolent dictator, Microsoft, who simply forces the IHVs' hands and won't let DX releases be tied up in committee debate for years.

I'll bet that Cg will actually become identical to DX9 HLSL; in other words, MS and NVidia will resolve any incompatibilities, and whatever extensions NVidia added (not NV30-specific, but actually good ideas, like the #pragma additions you see now) will get rolled into DX9 HLSL.

In the end, Cg won't be NVidia's "language", but will be their toolset for using DX9 HLSL, integrating it with DCC tools (one of the #pragma extensions), and supporting OpenGL before OGL2.0 HLSL is ready. It will also mean NVidia's "runtime", the cg* functions for loading the compiled HLSL and having the library automatically do all the nasty DirectX calls for you, without having to manually load each shader and invoke it via DX.
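For anyone who hasn't poked at the toolkit yet, here's a minimal sketch of what that runtime usage looks like. It uses the OpenGL flavor of the runtime (the DirectX flavor is analogous); the function names are the ones from the public Cg releases, so the current beta may differ slightly, and "simple.cg" / "modelViewProj" are just the made-up names from the shader sketch earlier in the thread. Error checking omitted:

/* Minimal Cg runtime sketch (OpenGL flavor). Assumes a GL context is
   already current. */
#include <Cg/cg.h>
#include <Cg/cgGL.h>

void bind_simple_shader(void)
{
    CGcontext ctx  = cgCreateContext();
    CGprofile prof = CG_PROFILE_ARBVP1;   /* any vertex profile the card supports */

    /* The runtime compiles the high-level source for the chosen profile
       and hands the generated program to the driver for us. */
    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "simple.cg",
                                             prof, "main", NULL);
    cgGLLoadProgram(prog);

    /* Parameters are set by name; no hand-written vertex-program
       assembly or manual constant-register bookkeeping. */
    CGparameter mvp = cgGetNamedParameter(prog, "modelViewProj");
    cgGLSetStateMatrixParameter(mvp, CG_GL_MODELVIEW_PROJECTION_MATRIX,
                                CG_GL_MATRIX_IDENTITY);

    cgGLEnableProfile(prof);
    cgGLBindProgram(prog);
    /* ...issue draw calls here... */
}

The point is that the application never touches the generated shader assembly itself; the runtime does the loading, binding, and parameter plumbing for you.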
 
Anyway, can you blame NVidia for making extensions now? ATI will probably make extensions for OpenGL 1.4 too. No one can wait 1+ years until OGL2.0 is approved by the ARB.

Did I blame them? I'm just pointing out some facts.
 
BTW,
demalion, HellBinder, et al: when you propose a theory, you usually have to prove it. If you can't prove it directly, you can at least refute its negative or provide an example.


Many of the arguments here look like this:

Ranter: I have a new Theory!
Doubter: Show me a sound argument, or give me an example

Ranter: No, you prove my theory false. If you can't disprove it, it must be true!
Doubter: I ask you for a simple example that fits your theory

Ranter: You're missing the point. Politically, my theory is true!
Doubter: Evidence?

Ranter: You never proved it incorrect in that other thread!
Doubter: Please post one example!

Ranter: Ok, here's an example (cut and paste)
Doubter: Your example doesn't work!

Ranter: You missed my point again! I told you in my Theory that the example won't work, and it doesn't!
Doubter: Actually, the example works even better than you thought and disproves your theory.

Ranter: I have a new theory! (cycle back)

Ranter#2: Ok, so technically I am wrong, but you are missing the big picture. Ok, so no technical evidence can be found today to substantiate my claims, but things could change in the future! See the big picture
Doubter: So there is no evidence for anything you said from the beginning, but I still must take you on your word that by some other means, The Theory will become true, by some unknown mechanism.


For me, the issue is cut and dried. If I were to claim that OGL2.0 was "3DLabs-specific" and that their proposal unfairly disadvantaged NVidia, I could certainly use their own publicly available proposals to show a simple example! Or maybe I could claim that RenderMonkey unfairly disadvantages any other 3D hardware if you use it to generate shaders. But I'd be expected to show an example of input to RenderMonkey that causes pathological behavior on other architectures!


This is a fine tradition in Western philosophy, that the person making the claim bears the responsibility of substantiating it. And extraordinary claims demand extraordinary evidence.
 
Hellbinder[CE] said:
Second, my arguments, as I stated, were political/corporate/PR in nature... not the nitty-gritty of it. I also CLEARLY stated that technical people (like yourself) would not understand the argument because you are incapable of backing off and looking at the big picture...

Well, it's a fact that non-technical people don't understand technical stuff (because if they do, then they aren't really non-technical).

But why should technical people be incapable of looking at the big picture?

Edit: hehe, great post Democoder
 
DemoCoder said:
BTW,
demalion, HellBinder, et al: when you propose a theory, you usually have to prove it. If you can't prove it directly, you can at least refute its negative or provide an example.


Many of the arguments here look like this:

Ranter: I have a new Theory!
Doubter: Show me a sound argument, or give me an example

Ranter: No, you prove my theory false. If you can't disprove it, it must be true!
Doubter: I ask you for a simple example that fits your theory

Ranter: You're missing the point. Politically, my theory is true!
Doubter: Evidence?

Ranter: You never proved it incorrect in that other thread!
Doubter: Please post one example!

Ranter: Ok, here's an example (cut and paste)
Doubter: Your example doesn't work!

Ranter: You missed my point again! I told you in my Theory that the example won't work, and it doesn't!
Doubter: Actually, the example works even better than you thought and disproves your theory.

Ranter: I have a new theory! (cycle back)

Ranter#2: Ok, so technically I am wrong, but you are missing the big picture. Ok, so no technical evidence can be found today to substantiate my claims, but things could change in the future! See the big picture
Doubter: So there is no evidence for anything you said from the beginning, but I still must take you on your word that by some other means, The Theory will become true, by some unknown mechanism.


For me, the issue is cut and dried. If I were to claim that OGL2.0 was "3DLabs-specific" and that their proposal unfairly disadvantaged NVidia, I could certainly use their own publicly available proposals to show a simple example! Or maybe I could claim that RenderMonkey unfairly disadvantages any other 3D hardware if you use it to generate shaders. But I'd be expected to show an example of input to RenderMonkey that causes pathological behavior on other architectures!


This is a fine tradition in Western philosophy, that the person making the claim bears the responsibility of substantiating it. And extraordinary claims demand extraordinary evidence.

Well, DemoCoder, you can be technically right about TECHNICAL aspects, but if I were ATI & others I would never use a Cg compiler when:
-Cg is a proprietary trademark
-The Cg language's evolution is controlled by the opponent

3DLabs is contributing to the OpenGL 2.0 specs. They are not the owners of the OGL trademark and don't control OGL's evolution.

What do you want? An ATI press release saying "ATI decided to use the Cg language"? ROTFLMAO

In the end (sometime down the road) some companies will probably have their own set of tools for the M$ HLSL and OGL 2.0, and the discussion will be over ;)
 
Thanks. I know my blood boils sometimes and I might sound like a condescending know-it-all asshole, but my 3D knowledge is far below that of others like Simon, Mfa, ERP, Fafalada, et al. However, I know enough to spot a phony or a liar.

Imagine I went into a physics forum and said, "Einstein's theory of relativity sucks! I have a new Radical Theory X." Then I am pressed for evidence and I stonewall everyone. The frustration of the people in the forum with some knowledge of the domain is going to escalate until I am declared a crackpot. At first, people may be diplomatic, but after I cause so much noise and trouble and demonstrate that I am not there to learn or listen, but to start trouble, certainly I am going to be dismissed as a loony.

And this is not a hypothetical situation. For a big laugh, look up the Usenet NET.LEGENDS FAQ, which documents people who practice this as a profession. You can see how normally diplomatic people are trolled into aggressive flame wars.
 
DemoCoder, I think you'll be better off just accepting that a few of the people in question are beyond pedagogical/educational reach. ;)
 
pascal said:
Well, DemoCoder, you can be technically right about TECHNICAL aspects, but if I were ATI & others I would never use a Cg compiler when:
-Cg is a proprietary trademark
-The Cg language's evolution is controlled by the opponent

3DLabs is contributing to the OpenGL 2.0 specs. They are not the owners of the OGL trademark and don't control OGL's evolution.

Fact #1: We don't know whether the Cg "language" will be controlled by NVidia or by the Microsoft DX9 HLSL spec. NVidia has not announced their intentions in this regard; all they have done is release a tool.
Fact #2: Nvidia has submitted Cg as a proposal to ARB as well.
Fact #3: Not all the facts are in. Don't you think it is a bit premature to be basing all these wild political theories on a beta toolkit?

Question: Isn't it possible that Cg will, in fact, be NVidia's implementation of the DX9 HLSL language?

Question #2: Isn't it further possible that Cg HLSL and 3DLabs' HLSL will be merged (the differences aren't that big; they both derive from C)? Moreover, it is even possible that DX9 HLSL could be merged in as well.

If we don't know how things will play out, why are so many people jumping down NVidia's throat for releasing a development tool that they have every right to release? Cg is not evil, and there is no reason for all these rants.

pascal said:
What do you want? An ATI press release saying "ATI decided to use the Cg language"? ROTFLMAO

In the end (sometime down the road) some companies will probably have their own set of tools for the M$ HLSL and OGL 2.0, and the discussion will be over ;)


No, I want people to stop making wild accusations about how the sky is falling when it isn't. Or at least, if the sky is falling, provide the evidence. If Cg turns out to be something completely different from OGL HLSL and DX9 HLSL and no other vendors support it, who cares?

Either NVidia will make it really good at generating code for the R300 and other cards (in which case, what's the problem?) or developers will refuse to use it and it will die in the marketplace.

NVidia has market power, but they don't have the power to simply kill off 90% of the game developers' market, which, incidentally, doesn't include DX8-capable hardware.


The scenario people are constructing here is ludicrous:

NVidia's master plan to rule the world
Step 1: Release Cg, make it support all hardware
Step 2: Get all developers onboard and everyone is using it
Step 3: Modify Cg so that it only works with NV40, put R400 and everyone else at disadvantage
Step 4: Developers are locked in and have no choice but to use Cg 3.0 with NVidia's NV40 extensions
Step 5: Consumers refuse to buy R400 because games run horribly slow on it due to usage of NVidia features
Step 6: NVidia wins

The fatal flaw in these theories is that developers AREN'T LOCKED IN, and developers don't target bleeding-edge API/hardware features.

Furthermore, NVidia doesn't control the rendering layer (OpenGL/DirectX); they'd only control a compiler tool.

This is analogous to saying you'll put Intel and AMD out of business because you control the compiler. But you don't control the OS, and anyone can write another compiler for another language.

Programming languages are a DIME A DOZEN. There are thousands of them. No one has ever successfully controlled a market by owning a programming language, because there are too many substitute goods.

All it would take to unlock a developer from Cg would be a tool that parsed Cg and serialized it back into whatever other shading language they wanted.

Far more control comes from hooking developers into an API, like OpenGL or DirectX. Hence the Win32 monopoly. That's because it is hard to efficiently port code that aggressively uses one API to another (e.g., turning Win32 MFC C++ code into Unix KDE code). But it is relatively easy to translate between Java and C#, or C and Pascal, etc.
 
DemoCoder said:
Fact #1: We don't know whether the Cg "language" will be controlled by NVidia or by the Microsoft DX9 HLSL spec. NVidia has not announced their intentions in this regard; all they have done is release a tool.
Fact #2: Nvidia has submitted Cg as a proposal to ARB as well.
Fact #3: Not all the facts are in. Don't you think it is a bit premature to be basing all these wild political theories on a beta toolkit?

Question: Isn't it possible that Cg will, in fact, be NVidia's implementation of the DX9 HLSL language?

Question #2: Isn't it further possible that Cg HLSL and 3DLabs' HLSL will be merged (the differences aren't that big; they both derive from C)? Moreover, it is even possible that DX9 HLSL could be merged in as well.

If we don't know how things will play out, why are so many people jumping down NVidia's throat for releasing a development tool that they have every right to release? Cg is not evil, and there is no reason for all these rants.

Hell, DemoCoder, NOBODY is spinning wild political theories :devilish:

FACT #1: Cg is a language and a compiler
FACT #2: Cg is an nVidia trademark
FACT #3: The Cg language's evolution is nVidia's to decide
FACT #4: We don't know what is happening (negotiations) between M$ and nVidia
FACT #5: The Cg language was submitted to the ARB, and the ARB is ANALYSING it
FACT #6: nVidia is a COMPANY that makes MONEY in a highly competitive segment, not an open-source organization or a wildlife protection organization

Nobody is going to jump down nVidia's throat, but nobody will give their a... to nVidia either :rolleyes:

DemoCoder said:
No, I want people to stop making wild accusations about how the sky is falling when it isn't. Or at least, if the sky is falling, provide the evidence. If Cg turns out to be something completely different from OGL HLSL and DX9 HLSL and no other vendors support it, who cares?

Either NVidia will make it really good at generating code for the R300 and other cards (in which case, what's the problem?) or developers will refuse to use it and it will die in the marketplace.

NVidia has market power, but they don't have the power to simply kill off 90% of the game developers' market, which, incidentally, doesn't include DX8-capable hardware.


The scenario people are constructing here is ludicrous:

NVidia's master plan to rule the world
Step 1: Release Cg, make it support all hardware
Step 2: Get all developers onboard and everyone is using it
Step 3: Modify Cg so that it only works with NV40, put R400 and everyone else at disadvantage
Step 4: Developers are locked in and have no choice but to use Cg 3.0 with NVidia's NV40 extensions
Step 5: Consumers refuse to buy R400 because games run horribly slow on it due to usage of NVidia features
Step 6: NVidia wins

The fatal flaw in these theories is that developers AREN'T LOCKED IN, and developers don't target bleeding-edge API/hardware features.

Furthermore, NVidia doesn't control the rendering layer (OpenGL/DirectX); they'd only control a compiler tool.

This is analogous to saying you'll put Intel and AMD out of business because you control the compiler. But you don't control the OS, and anyone can write another compiler for another language.

Programming languages are a DIME A DOZEN. There are thousands of them. No one has ever successfully controlled a market by owning a programming language, because there are too many substitute goods.

All it would take to unlock a developer from Cg would be a tool that parsed Cg and serialized it back into whatever other shading language they wanted.

Far more control comes from hooking developers into an API, like OpenGL or DirectX. Hence the Win32 monopoly. That's because it is hard to efficiently port code that aggressively uses one API to another (e.g., turning Win32 MFC C++ code into Unix KDE code). But it is relatively easy to translate between Java and C#, or C and Pascal, etc.

DemoCoder, big companies are competing, Cg is part of nVidia's chess game, and this game is highly adaptive.
 
NVidia isn't a software company. They aren't selling Cg and they don't make their money from it. They make their money from selling hardware. Whether or not people use Cg, NVidia will continue to sell their hardware.


Sun invented Java. Sun controlled Java. For years, detractors (including Microsoft) claimed that Java would end up with some "built-in" advantage for Solaris and SPARC hardware. They said it would run "best" on Solaris and drive people to buy Sun servers over Wintel or IBM.

In fact, Sun's homegrown language ran worst on their own servers and always trailed the Win32 and IBM versions.

If Cg is a chess piece in NVidia's game, it's a very weak piece, and possibly not even on the game board.

I guess you could say that every single developer tool now on developer.nvidia.com is a ploy to make money. In one sense, you are correct: no company, not NVidia, not ATI, and not 3DLabs, has developers and users in mind when they promote anything. There's no altruism involved.

So yes, anything NVidia releases to "help" developers implicitly raises brand awareness for NVidia hardware, and thus developers will make sure their games run well on NV hardware. This was true before Cg and it will be true afterwards.

But on another level, it's false. These tools and demos are cranked out by engineers who honestly play around with the APIs because it's FUN and INTERESTING. These are later picked up and demoed at conferences and put on the developer website. It is a serendipitous development, not a chess game.

Cg, if anything, was an internal development to create an HLSL for the NV30 because there was an IN-HOUSE NEED. There is no shipping HLSL that fulfills this need, so something had to be created. Later, this was turned into a developer tool for outside use, and later still it was probably positioned as positive PR. Now, I would be willing to bet that MS and NVidia are syncing up.

I have seen this happen over and over again. Right now I serve on a working group where three different companies invented the same thing at the same time, and all three are being merged together with a fourth effort driven by a consortium. These companies developed these proprietary languages because THERE WAS NO SHIPPING SOLUTION AT THE TIME.
Master chess game? No. Business need? Yes.


BTW, you go from
FACT #1: Cg is a language and a compiler
FACT #2: Cg is an nVidia trademark
FACT #3: The Cg language's evolution is nVidia's to decide
FACT #4: We don't know what is happening (negotiations) between M$ and nVidia
FACT #5: The Cg language was submitted to the ARB, and the ARB is ANALYSING it
FACT #6: nVidia is a COMPANY that makes MONEY in a highly competitive segment, not an open-source organization or a wildlife protection organization

to this
DemoCoder, big companies are competing, Cg is part of nVidia's chess game, and this game is highly adaptive.

But you just said that Cg was submitted to the ARB and that we have no idea what's going on between NVidia and MS, yet you insinuate that something nefarious is happening on NVidia's part! Isn't it true that the alternate futures (Cg = DX9, or Cg merged with 3DLabs) are also possibilities?

If either of these futures pans out, are you going to feel sorry for all of this negative politicking?
 
nVidia is a company; they sell hardware and software too (drivers), or the software is used to sell more hardware.

nVidia did a lot of press releases quoting many developers as supporting Cg, nice web pages, etc.

Who used the word nefarious? It is all part of the game. Everybody is gaming; don't you believe that? If the others do not play well and lose because of a weak game, all we can do is laugh :LOL:
 
DemoCoder said:
BTW,
demalion, HellBinder, et al: when you propose a theory, you usually have to prove it. If you can't prove it directly, you can at least refute its negative or provide an example.

My biggest problem, hell I guess it is my only problem, with you, DemoCoder, is this exact crap, which is exactly as useless as any ranting you can ascribe to Hellbinder or Doomtrooper. I go to great pains to clearly state points and provide logical support for an argument, and respond to your post to further clarify and give you an opportunity to answer me in a very clear and direct manner, and the ONLY response that addresses me by name is this generic rant that begins "demalion, Hellbinder, et al".

To answer the implication of your rant, again: my contention is that I view the assertion that Cg will not be able to be selectively adapted to suit nVidia's competitiveness going forward as a theory, and a weaker theory than the assertion that it will. Your description of the sequence of events, at least as I see it applying to myself, is only valid if you accept your own viewpoint in this matter as not a theory but a fact, which, in the absence of your supplying the links and information I have asked for, it is not. To me this is strongly reminiscent of exactly the type of "ranting" I criticize Doomtrooper and Hellbinder for performing (visit Rage3D.com if you want many examples of this). It strikes me as a stance constructed to ignore any statements disagreeing with your viewpoint based on logic, and it sets up a scenario, which I've described before, where someone else's viewpoint is a "theory" that needs proof, and your viewpoint is... what? A "fact" that does not?


For me, the issue is cut and dried. If I were to claim that OGL2.0 was "3DLabs-specific" and that their proposal unfairly disadvantaged NVidia, I could certainly use their own publicly available proposals to show a simple example! Or maybe I could claim that RenderMonkey unfairly disadvantages any other 3D hardware if you use it to generate shaders. But I'd be expected to show an example of input to RenderMonkey that causes pathological behavior on other architectures!

OGL 2.0 will be accepted, ratified, and maintained and updated by industry representatives, not 3dlabs. I'm not sure if I made my viewpoint clear on 3dlabs in a post or a PM, but I have stated before that if this wasn't the case, it would be just as fair to hold this discussion about it.

Also, I have made abundantly clear that it isn't the current spec that is the problem but who controls when and how the spec adapts to technologies that appear in the future. See prior posts you haven't replied to for more detail on this point.

This is a fine tradition in Western philosophy, that the person making the claim bears the responsibility of substantiating it. And extraordinary claims demand extraordinary evidence.

And you aren't making a claim? Oh wait, that's right, your viewpoint isn't a "claim"; due to your technical knowledge it is "fact". That clears up my confusion.
Please read this post again(?), as answering or refuting my line of reasoning there would actually get us somewhere.
Or you could just ignore my posts, respond only to Doomtrooper and Hellbinder, and lump me under the label "ranter" and dismiss my points without addressing them. I know you could do this, because you've done it successfully so far.

EDIT: changed "agreeing" to "disagreeing"
 
Everything you say about NVidia's status as a for-profit company could equally be applied to ATI or 3DLabs, and those two companies have their own "chess game" going. Likewise, so does Microsoft, and in fact, so did SGI (controlling OpenGL for so long sure helped sell their hardware and lock in their market, didn't it!!! :)), and don't forget Pixar, who controls RenderMan.


What I don't understand is why all this focus on NVidia. Where were you when SGI was dominating GL and using it to checkmate the industry? (Oh, they lost because they couldn't keep up with the rising consumer hardware.) Why aren't you bashing RenderMonkey, another tool to sucker developers and content creators into making stuff render and look best on R300 hardware (if we follow the same reasoning as the anti-Cg people)?

And isn't 3DLabs' ARB submission really just a money-making ploy? They couldn't dictate by market force, so they channel their hardware features into a language and push it through a political committee!


Nah.. ATI and 3DLabs only care about lofty goals like advancing the freedom and openness of APIs.
 
OGL 2.0 will be accepted, ratified, and maintained and updated by industry representatives, not 3dlabs. I'm not sure if I made my viewpoint clear on 3dlabs in a post or a PM, but I have stated before that if this wasn't the case, it would be just as fair to hold this discussion about it

So will Cg, now that it has been submitted to the ARB.

The question is: if 3DLabs' proposal is shot down and Cg is adopted, will you be happy? Secondly, if 3DLabs ends up shipping their proposal with the P10 as a beta toolkit BEFORE OGL2.0 is ratified, are you going to criticize them as fiercely as you have criticized NVidia?

Both NVidia and 3DLabs now have a proposal before the ARB.
 