Cg released

Bad example. If you've ever ported ANSI C code between GNU, HP/UX and Solaris C compilers, you'd know what a freaking nightmare this is.

Not done it for a while, but all our original C code was written for Sun and ported over to HPUX; it was only really the threads that caused an issue, as HPUX doesn't support threads.
 
I just want to know why gking keeps sidestepping these popular questions:

1) Why do we need 3+ HLSL compilers??
2) If Nvidia is working on DX9 HLSL, why do we need to make a profile in Nvidia Cg for DX9 HLSL? And the same goes for every other graphics card vendor.

I don't buy the 'we need it now' theory, as DX9 HLSL is coming very soon.. :-?
 
Sharkfood said:
it is similar to me taking some C code and compiling it on HPUX and Sun to allow the application to run on the respective hardware OS

Bad example. If you've ever ported ANSI C code between GNU, HP/UX and Solaris C compilers, you'd know what a freaking nightmare this is. :)


Quite the contrary, it is a good example then!


On a sidenote, how will Cg behave in OpenGL? Will it generate code using the proprietary extensions from Nvidia, or only normal OpenGL / ARB code?

If it generates proprietary code, then I would think it is impossible to run the same game/application on other hardware, because the extensions from Nvidia and ATi/Matrox/ARB etc. are not 100% compatible, so you cannot get the same result on other hardware. Or the recompilation will need a lot of additional work to map the art onto different code/hardware.
 
mboeller said:
On a sidenote, how will Cg behave in OpenGL? Will it generate code using the proprietary extensions from Nvidia, or only normal OpenGL / ARB code?

Looking at the thread in which nick posted, it would seem they are making use of the NV_ shader extensions until ARB ones are ratified (which may not be until OpenGL 1.5). Looking at the presentation slides, nVIDIA allude to OpenGL-NV30, which sounds like they will be supporting a bunch of NV30-specific extensions via Cg as well.
 
Doomtrooper--
I just want to know why Gking keeps sidestepping these popular questions

1) As for why we need 3+ shading languages -- why haven't 3D Labs decided to work with Microsoft in designing OpenGL 2.0's shading language? Why does OpenGL even need its own unique shading language? I can't tell you the answer you really want to hear, though.

2) No other vendor needs to make a Cg profile for DX9. The reason one will exist is so that the usage semantics for both OpenGL and D3D will remain the same, if a developer wishes to use the runtime compiler.

DaveBaumann -
cannot create syntax for features that are not currently exposed in Cg.

Take a look at Cg's spec again -- none of the tex instructions that are NV20 specific (e.g., texm3x3spec) are part of the language specification. These are parts of the profiles, and the code is automatically generated by the code generator.

All ATI would need to do to provide new functionality would be to provide a header file that declares functions exposing their functionality (probably prefaced with __internal), and the code generator could output ATI-specific stuff straight into the compiled shader.

In its current form, Cg is really more akin to DX9 HLSL + an extension mechanism, with the side benefit that it also works in OpenGL.
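To make that extension mechanism concrete, such a vendor header might look something like the sketch below (Cg-style syntax; the function names and signatures are purely illustrative assumptions, not anything ATI or NVIDIA has published):

```
// HYPOTHETICAL vendor extension header -- none of these names are real.
// Declarations like these would let the code generator emit
// vendor-specific instructions straight into the compiled shader.
__internal float4 ati_dependent_lookup(sampler2D map, float4 coord);
__internal float3 ati_bump_env(sampler2D bumpMap, sampler2D envMap, float2 uv);
```

A vendor's code generator would then recognize calls to these functions and translate them into the corresponding hardware instructions in the output shader.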
 
gking said:
Take a look at Cg's spec again -- none of the tex instructions that are NV20 specific (e.g., texm3x3spec) are part of the language specification. These are parts of the profiles, and the code is automatically generated by the code generator.

All ATI would need to do to provide new functionality would be to provide a header file that declares functions exposing their functionality (probably prefaced with __internal), and the code generator could output ATI-specific stuff straight into the compiled shader.

How would they go about creating a profile for PS1.4 then?
 
Once the front-end is open-sourced this summer, they could download it and create a code generator for PS 1.4.

PS 1.4 would be kind of awkward due to the way it handles dependent look-ups (a lot of shaders that would seem permissible wouldn't be, due to the separation of addressing from blending), but the code generator should detect that.
 
gking said:
Once the front-end is open-sourced this summer, they could download it and create a code generator for PS 1.4.

PS 1.4 would be kind of awkward due to the way it handles dependent look-ups (a lot of shaders that would seem permissible wouldn't be, due to the separation of addressing from blending), but the code generator should detect that.

Again why would ATI want to do this when they are working on a standard format in DX9 HLSL ???
 
I think gking was answering Dave Baumann's question on how it could be done, not necessarily if it will be done, Doom...
 
Doomtrooper, you just don't get it.

Cg == subset (DX HLSL, profile)

Today, we have the HTML 4.0 and XHTML 1.0 standards. Those standards were too complex to work in mobile phone browsers. As a result, cHTML was created, which is a subset of HTML. It is still HTML, and cHTML will work in any HTML browser. It is a simpler subset of the language. WAP 2.0 is a subset of XHTML.

Then, after several mobile phone manufacturers realized the need for a simpler HTML, XHTML Basic was created.

If you now visit the w3.org site, almost every new markup language has a "Mobile" and a "Tiny" profile.

What NVidia did was take HLSL and realize that it is too complex to run on TODAY'S hardware. So they created a compiler which can compile a PROPER SUBSET of the language, given an execution profile.

Cg shaders, with minimal changes, should compile with DX9's HLSL compiler as well. It is FORWARDS COMPATIBLE with the future DX9 HLSL standard, and backwards compatible with older OpenGL 1.1/DX8.1 hardware.


Imagine the following: I write a C++ compiler that doesn't have exception handling or templates, only classes. I release this compiler with documentation that says, "This is C++--; you can use it to do object-oriented programming, you just can't use exceptions or templates."

All C++-- code will compile and run on ANY C++ compiler, it just lacks the full power of C++. C++-- code is future compatible with "Full C++" compilers in the future.

So what is the harm in C++--? With C++--, you can write code that anyone with C++ can compile and execute.


The same thing happened with Java. Sun created Java. Then they created PersonalJava (for PDAs). Then they created EmbeddedJava. Then they created Java Card. Each was a subset of the former. After that, they created CLDC (Connected Limited Device Configuration) and CDC (Connected Device Configuration). Again, both of these are subsets.

Any code written for CLDC or CDC will run on EmbeddedJava, PersonalJava, or full Java. The code is fully reusable.

I see no harm in NVidia subsetting HLSL to make it run on today's hardware.

-DC

P.S. C++-- is a reality. For years after C++ came out, most compilers didn't implement templates or exception handling properly. The only truly portable, compiler/platform-independent C++ was one in which you ignored almost everything except basic OO.
 
If NVidia released an IDE for developing games, I have a feeling most of the people bitching here would raise hell. Basically, any tools that NVidia provides to developers to help them develop games scares the people here.

They don't want developers to get hooked on any NVidia specific tool for fear that in the future, NVidia could use this to lock people into NVidia hardware.


Well guess what, 3dfx had the ultimate lock-in: hundreds of games totally dependent on the Glide API, and a whole year or so head start, with many developers doing all their learning and 3D developing totally in Glide.

In the end, they still lost the market and got crushed.

And in many ways, Glide is far far worse (tighter binding to the developer, more expensive switching cost) than Cg. It is harder to port a graphics engine written in Glide to DirectX or OpenGL unless you already wrote a total abstraction layer around 3D rendering. With Cg, since it is a subset of HLSL anyway, it wouldn't be hard to reuse most of your Cg code in DX9 HLSL, or to write a parser to transform Cg into DX9 HLSL.


Anyway, despite the advantages of a big head start -- hundreds of games written to a low-level hardware API tied to 3dfx-specific hardware features -- it did not help them maintain their dominant position.

My opinion: the fearmongering here is much ado about nothing.

Developers will use what works. Microsoft has consistently won in the market because it is not the most ELEGANT approach that wins, but the one that actually delivers TODAY. Release early, release often, evolutionary improvement.

If you spend 3 years trying to build the ultimate API/engine/language/whatever, most of the time, you will get your ass handed to you in the market.
 
DemoCoder said:
Imagine the following: I write a C++ compiler that doesn't have exception handling or templates, only classes. I release this compiler with documentation that says "This is C++ --, you can use it to do object oriented programming, just can't use exceptions or templates"

Sounds just like Embedded Visual C++ 3.0 :devilish:

No Exceptions = Fun, fun, fun! :D

-Colourless
 
It's not like there weren't a shitload of game engines which were developed mainly for Glide and which made very shoddy transitions to other APIs/hardware, causing pain for the end user, now were there?

Competition will always be possible, but the more level the playing field, the better it is for the end user (unless someone manages to change the rules of the game entirely with something revolutionary ... but Cg can hardly be termed such). Let's hope ATI has a little balls and pumps out its own profile/implementation in a couple of months; DX8-level hardware will be with us for a while. IMO it would be a shame for ATI to give up on PS 1.4. Even if the fraction of developers who make the effort to support both -- even when tools make it as easy for both (like the present situation without Cg) -- is not big, it's a sign of weakness to just give up :/
 
Anyone remember the 3dfx MiniGL drivers? 3dfx subsetted OpenGL. That was much better than Glide when it comes to porting your engine.

Subsetting is better than having a completely proprietary standard.
 
Well, I've been a developer since 1987. I have sat on standards groups at the W3C and IETF, and I am very familiar with the process of creating profiles for restricted devices and writing device-independent specifications. I say subsetting is a good way to achieve language scalability. The industry seems to think so too, which is why there is a proliferation of "profiles" going around.

Oh, but you say so.
 
<feeling like he's dipping his feet into water that he already knows is too hot>

In an ideal world, Cg would be completely open sourced. However, I think we all knew days before its official announcement that this wouldn't be the case--Nvidia is too aggressive, and too smart, for such a move. And in retrospect, I never recall having a problem with Glide (the market was wide open for a company to establish itself in the way 3dfx did back in '96-'97), so I do feel that it would be hypocritical on my part to become threatened by Cg. Besides, if it helps developers add support for new hardware features, other major vendors such as ATi won't be ignored... developer management will probably feel comfortable using whatever time Cg might've saved in programming man-hours to ensure decent support for non-Nvidia hardware. Or so I hope.
 
I remember 3dfx taking HUGE flak for years from developers and critics for not having a full OpenGL ICD, and now you are trying to tell me this is a good thing.
I respect your credentials, but that doesn't mean you are right :p
 