NVidia Cg... now we know where the "Glide" rumors came from

If this Cg thing is at a high enough level, then by definition it is *not* tailored for any specific hardware, and ATI should be able to write an R300 backend for it that uses their specific features, just as NVidia can do the same with the NV30.

How can the design for some high-level language be "proprietary" to low-level hardware?

My 2c of speculation:

The front-end and language, and maybe even the translators to DX and OpenGL are open. Anyone is free to write a backend compiler for their hardware. However, the back-ends are proprietary... maybe they could even bypass DX/OpenGL drivers completely?

How does NVidia benefit vs. other companies? As the developers of Cg, they probably have high-performance backends ready. Other companies would need to play catch-up on that front...
 
DaveBaumann said:
Matt Burris said:
Told you it was under NDA. :p

You've also confirmed to everyone that this is true so essentially you broke NDA as well ;)

True true, guess I'll go back and delete all my posts since PC.IGN took down their confirmation of Thursday's event. :D
 
Why would NVidia release CG for free or allow/encourage it to be used on other hardware besides NVidia?

Why did Sun, which is essentially a hardware company selling Sparc processors, workstations/servers built around that chip, and the Solaris operating system, invent Java, which makes it possible to write apps without any knowledge of the Sparc CPU or Solaris?

Furthermore, why did they give it away? More than that, why did they release implementations for their arch-enemy Microsoft?

Why did SGI invent VRML and then give it away?

NVidia might be hoping that if they can get developers to target the capabilities inherent in CG, they will drive demand for higher-end hardware. This may benefit their competitors as well, but if NVidia wants to avoid market saturation and push people to adopt/upgrade to super-high-end cards, they need tools that help developers write games that can take advantage of next-generation hardware.

Moreover, NVidia might think it has a future in software as well as hardware. By giving away CG, they could sell/license developer tools that work on top of it, such as a "Shader Designer/Debugger", or an IDE.

And as the Java case shows, Sun benefited *tremendously* from Java, as it enhanced their brand and drove sales of their servers, even though the vast majority of Java VMs were deployed on Win32. Sun has yet to succeed in selling ANY Java software, but for some reason, a platform-independent programming language, mostly used on Win32 and Linux, also helped sell their hardware as well.

Who knows what NVidia's licensing, pricing, or intellectual property terms will finally be, but I think it is premature to assert that this is bad for OpenGL or bad for the industry, or that NVidia is evil.

Let's just wait and see how this pans out.
 
Guest said:
Think MONEY, it's all about making MONEY; by handing stuff free to your competitors you do not MAKE MONEY, you LOSE MONEY. In the end NVIDIA wants to sell as many boards as possible, and ways to do that are exclusive features or extra performance, both of which are possible with a well-shielded and protected proprietary high-level language.
Substitute 'any well-run public company' for NVIDIA and I'd agree. Profit is the motive, efficient delivery of consumer products is the byproduct. Three cheers for capitalism!
 
OpenGL guy said:
I'd love to see nvidia enter the software market... then they can leave the hardware market to us :D

Software is the gatekeeper to the hardware. As Microsoft has proven, he who controls the software controls everything. Besides, Nvidia is only entering the software market after successfully dominating all their own markets (PC OEM, workstation, etc.). I don't think their forays into mobile and console detracted that much from their ability to do the NV30, and those 100+ 3dfx engineers didn't hurt either.

Besides, devoting software engineers to projects like this has hardly any impact on the hardware teams. The software guys would otherwise be maintaining the drivers and updating the developer SDK.


If you're not giving developers great tools to develop on your platform (hardware), how are you going to attract them?
 
DemoCoder said:
And as the Java case shows, Sun benefited *tremendously* from Java, as it enhanced their brand and drove sales of their servers, even though the vast majority of Java VMs were deployed on Win32. Sun has yet to succeed in selling ANY Java software, but for some reason, a platform-independent programming language, mostly used on Win32 and Linux, also helped sell their hardware as well.

Sun makes a nice chunk of change (a couple of hundred thousand dollars per license) with every J2EE license they sell to interested application-server vendors. The revenue from yearly/specification-specific licenses adds up.

Not to mention it helps sell their app-server hardware. By providing a language that developers want to use, and that runs on their servers as well as their OSes, it acts as an attack on (or limits the loss due to) MS-centric solutions, which do not run on their hardware or OS.

I can only hope that Nvidia has good intentions with this Cg shader language. Any chance this is part of, or derived from, what they inherited/acquired from 3dfx? We know (strong/accurate speculation) that Rampage was to have some amazing shading/texture computing effects. And 3dfx did openly provide their FXT1 compression. Perhaps this is a sign of good things to come. A kinder, gentler, open Nvidia?

--|BRiT| 'Just another J2EE app-server developer'
 
To me this sounds like NVIDIA lost control of the high level language that will be part of OpenGL2.0

Not to mention Dx9... Perhaps we shouldn't be so cynical... I'm sure Nvidia is particularly keen to address issues of developer support for advanced hardware vis-a-vis the legacy user base of hardware. They're not actually subverting Dx/OGL as APIs - yet... ;)
 
DemoCoder said:
Why would NVidia release CG for free or allow/encourage it to be used on other hardware besides NVidia?

I wonder what Microsoft, who has more programming power and money, would think if they had to pay licensing fees for this.. :-?
 
Doomtrooper said:
DemoCoder said:
Why would NVidia release CG for free or allow/encourage it to be used on other hardware besides NVidia?

I wonder what Microsoft, who has more programming power and money, would think if they had to pay licensing fees for this.. :-?

Microsoft already has their own high-level language in DirectX9's D3DX library, called the High Level Shading Language. Go look at the GDC 2002 presentations. If NVidia was smart, they'd cozy up to MS and build on top of what HLSL has already established. I doubt there'd be any licensing fees, since NVidia wants *maximum* developer adoption. Like I said, if they were smart, they'd give away as much as possible. We'll see what happens on Thursday.

I think you guys are jumping the gun. The IGN leak had very little information. All we know is they have a new high level language and compiler tool. For all we know, it could be a clone of Stanford's RTSL or even an early implementation of OGL2.0 concepts.
 
DemoCoder said:
Why did SGI invent VRML and then give it away?

Off topic, but I don't believe SGI invented VRML. I read about it in a book called "The Playful World", but I don't remember all of the details.
 
3dcgi said:
DemoCoder said:
Why did SGI invent VRML and then give it away?

Off topic, but I don't believe SGI invented VRML. I read about it in a book called "The Playful World", but I don't remember all of the details.

Nope, VRML 1.0 is a subset of Open Inventor's file format. The original idea of a Web-based VR language was proposed during a working-group BOF held by Tim Berners-Lee of the W3C, but SGI was the first with an actual implementation, because they already had such a language for their workstations: Open Inventor.
 
There are two possibilities:

1) Cg comes with its own language definition. Even if it only differs slightly from the DX9 HL shading language, and even if competitors are able to build in their own plugins, this is VERY VERY VERY bad for the industry, as then three APIs are competing for support, one of which (Cg) is controlled by the market leader. I see this is hard to understand for fanboys ("You guys are so pessimistic when it comes to NVIDIA").

2) Cg will use EXACTLY the same language definition and NO (not even 1) additional methods/calls beyond the DX9 HL shading language, and essentially the new thing behind Cg is that you can compile the shading C code to OpenGL calls as well. Then Cg is basically just a tool and two compilers, packed together in a nice-looking toolset. The side effect is that they can control the OpenGL part with even more proprietary extensions, effectively making Cg/OGL their own native API.

As Nvidia is losing control over OGL and DX, a move like this was to be expected. They pissed off both (M$ by charging lots of $$ for DX8 shaders, and the ARB by blocking progress with lame IP issues).
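For what it's worth, the "tool plus two compilers" picture in point 2 can be sketched in a few lines. Everything below is invented for illustration (the intermediate representation, the backend names, the instruction syntax); it just shows the same frontend output being lowered by two different API backends, which is the architecture being speculated about:

```python
# Toy sketch of "one language, two compilers": a shared frontend
# produces an intermediate form, and per-API backends lower it to
# DirectX- or OpenGL-flavored shader code. All names are made up.
from dataclasses import dataclass

@dataclass
class Transform:
    """Intermediate representation: dest = matrix * vector."""
    dest: str
    matrix: str
    vector: str

def backend_dx8(op: Transform) -> list[str]:
    # DirectX 8-style vertex shader assembly: four dot products,
    # one per component of the result.
    return [f"dp4 {op.dest}.{c}, {op.vector}, {op.matrix}[{i}]"
            for i, c in enumerate("xyzw")]

def backend_gl(op: Transform) -> list[str]:
    # Hypothetical OpenGL vendor extension: a single macro op.
    return [f"MUL_MAT4 {op.dest}, {op.matrix}, {op.vector};"]

ir = Transform(dest="oPos", matrix="mvp", vector="v0")
print(backend_dx8(ir))  # four-instruction DX8-style lowering
print(backend_gl(ir))   # one-line hypothetical OpenGL lowering
```

The worry in point 2 maps onto `backend_gl`: whoever controls the OpenGL backend can quietly target proprietary extensions that only their own hardware exposes.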
 
Mephisto,

I can hardly see how a third party tool which assists in code generation can be a bad thing for the industry. Are you saying that programmers should be forced to code in either OpenGL2.0 or DirectX9 and AREN'T ALLOWED TO USE ANY HIGHER LEVEL TOOLS unless they are INDUSTRY STANDARDS?

According to your philosophy, I should be forced to use Microsoft's MFC libraries, and Borland's OWL or Delphi are *dangerous* to the industry. wxWindows is dangerous to the industry. Qt is dangerous to the industry. All of these are separate high-level APIs that build on top of the underlying GUI libraries, and they all have separate IDEs for RAD development. I really fail to see how this has harmed the industry.


Cg is a TOOL. It uses a C-like language to generate lots of hideous and boring vertex shader and pixel shader assembly code. Why does a middleware tool need to be a single standard? Why are people so up in arms about this? Are they afraid developers will like it so much that all of them will start using it, and that NVidia will have succeeded in commodifying OpenGL/DirectX by abstracting it away?
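To illustrate what "hideous and boring" means here, consider a toy sketch (my own illustration, not NVidia's actual tool): a single high-level statement like `o.pos = mul(modelViewProj, pos)` expands into four dot-product instructions in DX8-style vertex shader assembly, and a code generator exists precisely to write that boilerplate for you:

```python
# Toy sketch: "compiling" one high-level matrix transform into the
# four dp4 instructions a DX8 vertex shader programmer writes by hand.
# Not NVidia's tool -- just the general code-generation idea.

def compile_mat4_transform(dest: str, src: str, const_base: int) -> list[str]:
    """Lower a 4x4 matrix * vector multiply into dp4 assembly lines.

    The matrix is assumed to occupy four consecutive constant
    registers starting at const_base, one row per register.
    """
    return [
        f"dp4 {dest}.{component}, {src}, c{const_base + row}"
        for row, component in enumerate("xyzw")
    ]

for line in compile_mat4_transform("oPos", "v0", 0):
    print(line)
# Produces the familiar boilerplate:
# dp4 oPos.x, v0, c0
# dp4 oPos.y, v0, c1
# dp4 oPos.z, v0, c2
# dp4 oPos.w, v0, c3
```

Multiply that by every transform, light, and blend in a real shader and the appeal of a C-like front end is obvious.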

Let's hypothetically assume that in 2 years, 90% of developers are using Cg to generate OpenGL/DirectX code. So what? They still have the option of not using it and writing OpenGL/DirectX directly. Moreover, anyone can write their own code-generation tools as well. The Quake3 engine is very popular. By licensing it, you can avoid having to write a lot of DX/OGL code. Quake3 effectively abstracts the immediate-mode APIs. Does that mean id now controls the graphics industry? They exert heavy influence, but they do not control it.

Finally, one thing no one here has considered is: what if NVidia goes open source?

P.S. I doubt this project was started because of any political aspects with regard to DX and OGL. Much more probably, developers at NVidia wanted a tool to help write "renderman-like" shaders. It's an obvious idea, and NVidia isn't the first to investigate it. Moreover, encapsulation and abstraction are very common paradigms, especially when one has to deal with multiple platforms and APIs (Linux, Mac, Win32, DX vs. OGL drivers).
 
I'm kinda with Democoder on this one.

If this turns out to be a toolkit (of which, as anyone using non-Win32 platforms knows, there are many), it's all good.

The key points will mainly be:
1) How it presents itself to the developer (i.e. at which level and to what end).
2) Extensibility for alternate platform support.
3) Natural desirability for usage versus funded adoption.

I don't think there is any information to make such determinations until v1.0 of the SDK/toolkit is released, so for now it's all a shot in the dark.

I'd welcome more standardized toolkits for Win32/3D coding, as every other platform has a plethora of high-to-medium-level toolkits. The problem with Win32 has always been that it's a much larger moving target given its API immaturity, so maintaining a toolkit would require a lot more effort than on most other platforms. A company such as NVIDIA would definitely be well suited to (and have much to gain from) this effort if it were managed by them.

So the only real key points we need to judge on Thursday are the presentation and extensibility, before any stones can be thrown. :)
 
DemoCoder said:
Are you saying that programmers should be forced to code in either OpenGL2.0 or DirectX9 and AREN'T ALLOWED TO USE ANY HIGHER LEVEL TOOLS unless they are INDUSTRY STANDARDS?

IF the other tool comes with a language whose capabilities are optimized for specific hardware, then yes.

According to your philosophy, I should be forced to use Microsoft's MFC libraries, and Borland's OWL or Delphi are *dangerous* to the industry. wxWindows is dangerous to the industry. Qt is dangerous to the industry. All of these are separate high-level APIs that build on top of the underlying GUI libraries, and they all have separate IDEs for RAD development. I really fail to see how this has harmed the industry.

You didn't get the point. Borland does not sell hardware; Borland wants to sell software. A program written with Delphi does not work worse on a P4 than it does on an AthlonXP. So who cares?

Cg is a TOOL.

How do you know? What if it is not just a tool but also an extended DX9 HL C-language? You remember Microsoft's plans to "extend" Java in a way that it would only have worked on a Windows OS if you used those extensions? What if Nvidia "extends" the DX9 language capabilities to make better use of their current and future hardware? Maybe the first version of Cg won't bring these extensions, but you never know what will happen in the future.

I won't have a problem with Cg if it is just a tool.
 
Come on, when have altruistic moves for the sake of the good of 3D development been part of Nvidia's M.O.? Sure, this will be pushed under the banner of being good for everyone, but ultimately it's pretty obvious that this is a move designed to be good for Nvidia.

Democoder says that they've done this so they can have a tool with renderman-like shader capability; well, isn't that OGL2's remit as well? And don't you think it's a little odd that 3Dlabs are working hard with the rest of the ARB to define a decent higher-level shader language while nvidia sit there smiling, nodding their heads and agreeing (or disagreeing, it seems), and in the meantime developing their own 'standard', apparently without anyone else knowing!

Add to that, you hear that nvidia's developer relations is currently headed up by a former Sony Computer Entertainment dev-rel manager, and that he's running it much as he would for the Playstation, trying to score 'exclusive' titles, and you really do wonder about the motives behind this.

Sorry, but all this tells you it's correct to be skeptical about the motives behind 'Cg'.
 
The link has been pulled and replaced with another article, so I think it's true!

Oh, so the 3dfx spirit is back? In Nvidia, or in Talon?
 
In the past, Nvidia has been a good example to others when it comes to developer support; they showed others how it was supposed to be done. If ATi hadn't dramatically changed and adopted a similar style of developer support around the Radeon launch, they would probably be dead in the water by now. This, along with decent hardware, is actually what made Nvidia so successful in the first place, so why is everyone suddenly so reluctant to hear that Nvidia will give developers a programming tool to assist in making games? Sure, there is a possibility that Cg might end up being useless, expensive, or favouring Nvidia hardware, but we don't know that yet! Yet people almost go out of their way to only point out those negative possibilities, ignoring the positive innovations it might bring!

iRC said:
Come on, when have altruistic moves for the sake of the good of 3D development been part of Nvidia's M.O.? Sure, this will be pushed under the banner of being good for everyone, but ultimately it's pretty obvious that this is a move designed to be good for Nvidia.
Nobody suggests that this is an altruistic move for the sake of 3D development; heck, this is an *industry*, you know. Sometimes companies actually try to produce something that might be useful and make them money at the same time. I can't believe how the name Nvidia alone is obviously enough to bring up all kinds of alarm bells and conspiracy theories already, even though we know almost nothing about Cg! Yet some people are already spelling doom for the 3D industry because the oh-so-evil nvidia *might* produce a language to help developers code advanced effects (probably even independent of any specific API). We'll have to see what Cg really is before we can judge whether it's a good or bad thing, but the pessimistic attitude towards it is almost breathtaking and IMHO not at all justified. What are you afraid of, that this might be adopted by developers and that Nvidia will use this new leverage to patch Cg in a way that makes other hardware look *bad* in games? That would do them more harm than good in the long run. Get a life, get laid, watch less X-Files ...

Mephisto said:
1) Cg comes with its own language definition. Even if it only differs slightly from the DX9 HL shading language, and even if competitors are able to build in their own plugins, this is VERY VERY VERY bad for the industry, as then three APIs are competing for support, one of which (Cg) is controlled by the market leader. I see this is hard to understand for fanboys ("You guys are so pessimistic when it comes to NVIDIA").
It's just that, from what that leaked article suggests, it is not a full-blown API; it's a high-level programming language, which pretty much contradicts the possibility of it being a full-blown API like DX. So can you just hold off on the VERY VERY VERY bad comments until you can actually back up your claims?
IMHO it's also ridiculous of you to end this point by accusing anybody who's not sceptical of this of being a fanboy. Sorry, but if Cg is bashed by people for being bad for the industry based on a vague, short, leaked article on IGN, then I think it is very legitimate to ask others to lay off and wait for some hard facts to back up their apocalyptic visions of Nvidia ruining the industry. If Cg sucks, it will fail ...

It's much more likely that Cg is mainly a programming toolset, though; it will have its own C-like language for programming advanced shading and geometry effects among other things, and probably compile code for different APIs in the end. If it is a good, powerful and easy enough to understand language, this might significantly help the development of games that use the latest features of hardware, and these games will likely often look a lot better than current games too. Once more of these games are out, people with older hardware will miss the eye-candy and might upgrade their hardware. In the end, increased sales of next-gen hardware would probably be the money-maker for Nvidia (as market leader) should Cg be fully free. Just because Nvidia (like every other company) is in this for the money doesn't automatically imply they want to harm the competition; as long as their hardware continues to be well received, anything increasing demand for it is good for them.

Basically, if Cg helps make games look better in the end, then it'd be good for the whole industry IMHO. The specific pros and cons will have to wait until we know more ...
 
Gollum said:
It's just that, from what that leaked article suggests, it is not a full-blown API; it's a high-level programming language, which pretty much contradicts the possibility of it being a full-blown API like DX. So can you just hold off on the VERY VERY VERY bad comments until you can actually back up your claims?

It doesn't matter whether it is a full-blown API like DirectX or not. The shader part of the API is becoming the most important part for 3D graphics vendors.

Gollum said:
If Cg sucks, it will fail ...

Windows 3.x sucked; it didn't fail. Just an example of what you can do if you have enough market share (>60% in Nvidia's case), good marketing and money.

it will have its own C-like language for programming advanced shading and geometry effects among other things, and probably compile code for different APIs in the end. If it is a good, powerful and easy enough to understand language, this might significantly help the development of games that use the latest features of hardware, and these games will likely often look a lot better than current games too.

What would be the benefit of bringing its own language in addition to MS' HL shader language and OpenGL 2.0? Why not just adopt one of the two and add two compilers for DX/OGL low-level assembler? It would all be about control, about harming the competition by being the first to know what your language can do and what kind of hardware capabilities you will need to fit the language's capabilities best. Do you really think that is good for the industry? Even the current situation with DX is not optimal, as Microsoft sells 3D hardware too, but it's still better than what could happen with Cg.

All of this, IF Cg brings its own language.
 