Battle of three HLSLs: OpenGL 2.0 / DX9 / Cg

Based on what we know, what can be said about the strengths and weaknesses of these three languages? Some say in the end it's all much the same, while some stand behind one or the other. Technically, what are the merits of each, and how do they differ in pursuing near-CG quality on the PC? Furthermore, what is the impact of HLSLs on the convergence of real-time 3D and high-quality cinematic rendering?

Discuss.
 
The best language would be the one that is easiest and quickest for developers to write and use, without sacrificing in-game performance or features.

From what I know of the different languages, Cg is the best one, as an engine's shaders can be written in Cg and then ported over to DirectX, OpenGL and the Xbox.
 
Well... the way I see it: HLSL languages = better and more complex shader effects, done much faster!

Basically, while today programmers are pretty skeptical of this entire affair and stay away from creating more complex shader effects simply because it takes so much time to code them in ASM, the future looks much brighter!
First of all, we have now finally moved towards fully programmable graphics architectures, supporting thousands of instructions in both the vertex and fragment stages (well, in fragment processing the R300 offers 160 instructions max, but that should still be more than enough for a while... otherwise, there is always the option of multipassing). That is a major departure from the very limited programmability of the previous generation of graphics architectures, which supported only a handful of instructions. Keep in mind that the next generation (R300, NV30, etc.), even without flow control, can bring a significant boost to the graphics industry, with programmers taking advantage of the immense instruction counts to produce better, much more sophisticated imagery in less time. That's exactly what HLSLs were created for: to assist programmers, to help them adapt faster to the ever-evolving graphics industry, so they can take better advantage of the latest state-of-the-art features while investing less time and saving themselves the headaches of programming in tightly constrained low-level ASM!
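
To make that concrete, here's a minimal sketch of a per-pixel diffuse (N dot L) shader in DX9-style HLSL. Everything in it (lightDir, diffuseColor, the PS_INPUT struct, the entry point name) is my own illustrative naming, not taken from any SDK sample:

    // Per-pixel N.L diffuse lighting, DX9 HLSL, ps_2_0-class target.
    // All identifiers here are hypothetical/illustrative.
    float3 lightDir;      // set by the application, assumed normalized
    float4 diffuseColor;  // material color, set by the application

    struct PS_INPUT
    {
        float3 normal : TEXCOORD0;  // interpolated surface normal
    };

    float4 main(PS_INPUT IN) : COLOR
    {
        // Renormalize after interpolation, then clamp N.L to [0,1]
        float3 n = normalize(IN.normal);
        float ndotl = saturate(dot(n, lightDir));
        return diffuseColor * ndotl;
    }

Hand-writing the equivalent in pixel shader assembly means a string of dp3/mul/mov instructions with manual register allocation, and that bookkeeping is exactly what the compiler takes off your hands.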

If HLSLs fulfill their purpose as intended, the DX9 era will bring a revolution, and if that's the case, it will hopefully continue into the future and become a well-established trend.

Second of all, mainstream DX9 parts from ATI are already on the market, with NVIDIA and (hopefully) other vendors soon to follow. Considering how fast prices drop these days, that's a win-win situation: it gives programmers more freedom to target the common denominator in the market and adapt faster to the DX9 era, which the way I see it marks a major departure from anything we've seen before...

Rest assured, the future holds plenty of surprises for us, and if everything works out as it should, exciting times are ahead! :)
 
I hate to say it but Microsoft DirectX is going to win out in the end no matter what. o_O

So we may as well just start discussing which features of OpenGL and Cg, if any, eventually get incorporated into it. :-?
 
With the other two HLSLs there is no need for Cg... I know this opinion is not shared by some members, or even some developers, but if Cg merged with DX9 HLSL, that would make things a lot easier. Since M$ and Nvidia seem to be partners anyway (the Xbox, and DX9 will release alongside the NV30... how convenient), I see no reason to worry.

Having one IHV basically control the future of graphics is not good for the PC business. I don't care what anyone says: there is a competitive reason why Nvidia would spend lots of $$$ making their own HLSL rather than helping move OGL 2.0 and DX9 HLSL along quicker. If it's to expose more power in their GPUs (my opinion), then we end up with developers making another Neverwinter Nights, Dronez, etc... I'm sorry, we don't need that.

I do hope this gets settled quickly. IMO, DX9 HLSL will be the target for any sane developer: not everyone will be running Nvidia cards. And of course Richard Huddy says DX9 HLSL is superior... we'll have to see.
 
Merge? How can Cg merge with DX9 HLSL without just going away? Cg is portable, DX9 HLSL is not and never will be.
 
There is no three-way battle. It is still OpenGL vs. DirectX.

Cg compiles to DirectX code. It does this to make the developer's life simpler (rather than writing ASM shaders, you write C-like shaders).
 
I thought the big advantage for a developer of writing a shader in Cg was not being tied down to one API.


So a developer would write the shader in Cg and then port it over to Linux, the Xbox, DirectX and OpenGL, for ATI, Nvidia, Matrox, etc. hardware. Isn't that the whole point of Cg?
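
As a rough sketch of that write-once idea (the entry point and uniform names below are my own; the profile names are the ones I've seen in nVidia's Cg toolkit documentation):

    // transform.cg -- a trivial Cg vertex program, written once.
    // Direct3D target:  cgc -profile vs_1_1 -entry main transform.cg
    // OpenGL target:    cgc -profile arbvp1 -entry main transform.cg

    struct VS_OUTPUT
    {
        float4 position : POSITION;
        float4 color    : COLOR0;
    };

    VS_OUTPUT main(float4 position : POSITION,
                   float4 color    : COLOR0,
                   uniform float4x4 modelViewProj)
    {
        VS_OUTPUT OUT;
        // One matrix multiply; the chosen profile decides the target
        // instruction set (D3D vertex shader ASM or ARB_vertex_program text).
        OUT.position = mul(modelViewProj, position);
        OUT.color = color;
        return OUT;
    }

The source stays the same; only the profile you compile against changes per API and per chip generation.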
 
Cg is DX9 HLSL. nVidia just offers additional compiler profiles for OpenGL shaders.

So, the argument is really OpenGL 2.0's HLSL vs. Cg. Right now, it looks like 3dlabs' proposal is the one that GL2 is going for. The primary benefit of this proposal is that it seeks to standardize the shading language, not the assembly language.

This means that each vendor will have its own proprietary extensions and compiler for OpenGL 2.0, but all will interface with the same HLSL. I think that this is definitely a good idea, for a variety of reasons. We'll have to see what the future holds...
 
Yes, there is that also.

The take-home message is that Cg does not seek to overtake OpenGL or Direct3D; rather, it aims to make them easier to code for, regardless of target platform.

All the target platform has to do is provide support for Cg.

In an industry that changes as rapidly as graphics, that has to be a good thing for both developers and end users.
 
I believe nVidia pushes Cg because of their interest in getting greater developer support.
And the main reason is probably that they couldn't wait for DX9 or, especially, OGL2.

They need to attract developers, but how do you program for their hardware when there is no API support, or when (in the case of OGL extensions) it's way too complex to use?

The irony is: their hardware is not here...

So we have Cg, but we don't need it now. The only cards it would make sense for are the R300-based ones, but it doesn't support them (of course).

And by the time we have the NV30, DX9 will have been out for some time with an HLSL that supports ATi cards too, so I don't think many DX developers will use Cg. I know I won't.

So it will likely end up with Cg being killed by the delay of the NV30.
 
Hyp-X said:
So we have Cg, but we don't need it now. The only cards it would make sense for are the R300-based ones, but it doesn't support them (of course).

And there is nothing stopping ATi or anybody else from implementing Cg. nVidia would like all industry participants to adopt Cg.

I have not double-checked, but I believe it costs other companies nothing in the way of licensing etc. to use Cg. All that is required is Cg-aware drivers on their part.
 
Other companies don't need to implement Cg in drivers; they just need to write a profile for Cg, so that the Cg compiler knows how to compile for that specific hardware. I think Cg already works on the Radeon 9700, because it can compile to ARB_vertex_program. There is no profile for ARB_fragment_program (yet).
 
I don't know OpenGL 2.0 that well (checking out the docs as we speak, though...), but I would also not underestimate the value of the support tools DirectX gives you.

As far as I understood from the DX9.0 presentations floating around on the net, there will be support in DX for LOD selection, for packaging several shaders, for attaching shaders to models, for editing shaders in modelling packages...

Up to now in OpenGL, programmers had to do everything on their own (D3DXComputeTangent, D3DXLoadTextureFromFile, D3DXLoadModelFromX... anyone?). Everyone will admit that having these tools available (even if you never use them) is an advantage...
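
For what it's worth, here's a minimal sketch of what the "packaging several shaders" point looks like in a DX9 effect (.fx) file; all identifiers below are hypothetical:

    // simple.fx -- bundles vertex + pixel shaders into a technique,
    // so the application can bind the whole unit to a model at once.
    float4x4 worldViewProj;  // set by the application per object

    float4 SimpleVS(float4 pos : POSITION) : POSITION
    {
        return mul(pos, worldViewProj);  // object space -> clip space
    }

    float4 SimplePS() : COLOR
    {
        return float4(1.0f, 0.0f, 0.0f, 1.0f);  // constant red, kept minimal
    }

    technique Simple
    {
        pass P0
        {
            VertexShader = compile vs_1_1 SimpleVS();
            PixelShader  = compile ps_1_1 SimplePS();
        }
    }

The application loads this with D3DXCreateEffectFromFile and picks a technique at runtime, which is exactly the kind of glue OpenGL currently leaves you to write yourself.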

I hope GL2 will give us more support (and updates) in these terms ...

Jurgen
 
Hyp-X said:
So it will likely end up with Cg being killed by the delay of the NV30.

Although the NV30 isn't here yet, developers can make use of the NV30 emulation feature in the Detonator drivers.
 
MikeC said:
Hyp-X said:
So it will likely end up with Cg being killed by the delay of the NV30.

Although the NV30 isn't here yet, developers can make use of the NV30 emulation feature in the Detonator drivers.

If you're a developer who wants to get going with DX9 shader effects, what are you going to use: emulation or hardware support (9700)?
 
If you're a developer who wants to get going with DX9 shader effects, what are you going to use: emulation or hardware support (9700)?

It depends on who's buttering your bread.. *cough* MadOnion + "Advanced" pixel shader tests *cough*.

In the developer game, money talks... be it money in the form of additional resources or actual cold, hard cash.
 
Speaking of which, it looks like DX9 might become official very soon:

http://mirror.ati.com/playati/gamasutra/index.html

One of the most significant new advances in real-time 3D game development is the introduction of the Direct3D® High Level Shading Language in DirectX® 9. Utilizing this industry standard shading language, game developers are able to increase productivity and creativity by developing advanced shader effects for chips such as ATI's RADEON™ 9700 and RADEON™ 9500 more quickly and easily than ever before. This online NetSeminar, presented by ATI, will introduce the new high level shading language and illustrate its use with a variety of advanced visual effects...

The seminar is slated for Dec 11. I'm not sure ATI would be doing this if DX9 weren't going to be public by then...
 