What was that about Cg *Not* favoring Nvidia Hardware?

OK Randell, I stand corrected... 8 months :p ...

Anyhow, my point was that even the developer was in on the sales push for one brand of video card, and didn't plan on supporting Shiny Water on ATI cards at all... there were 20 threads locked on their forum from people demanding support for a game they laid cash on... and eventually the developer gave in under pressure.
Looking at the dates on their forum, their attitude went from "No, we will not be implementing this" to "Stay tuned, we will be making an announcement soon."
 
RussSchultz said:
The market will determine whether Cg lives or dies.

If most developers find it useless, it'll die. These little flame wars will not make one bit of difference.

Edit: gorsh. That was a complete non sequitur on my part. I have no idea what I was agreeing with.

  1. It is useless (except for nVidia hardware)
  2. It will die - a horrible and glorified death
  3. I will help build that coffin - but if we go for a cremation, I'm bringing the blow torch
  4. We will not forget 3Dfx Glide

That is all

Whoever decided that we needed another one of these should be assigned to work on the DNF team. We'd never hear from him/her again.
 
And there we have it.

From now on, Cg's claim to fame will simply be:

"That which brought Derek Smart, Doomtrooper, and Hellbinder together!"

And people thought Cg wasn't anything "special!" ;)
 
 
OMFG! That's hilarious! :LOL:

"And one tool to bind them..."
 
Joe DeFuria said:
And there we have it.

From now on, Cg's claim to fame will simply be:

"That which brought Derek Smart, Doomtrooper, and Hellbinder together!"

And people thought Cg wasn't anything "special!" ;)

I'm shocked! :eek:
 
Sharkfood said:
The guy that wrote the 3DAnalyze set of Direct3D wrappers has already made a kludged wrapper to pass the failing API checks... at which point the demo is now reported as working.

Yo..! :)

Well, looks like the game had nothing to do with Cg after all, since the only thing that prevented the demo from running was a failed check for an nVidia card.

-dksuiko
 
Derek Smart [3000AD] said:
It is useless (except for nVidia hardware)

Why, Derek? nVidia's most recent release has full support for ARB-extension profiles, as well as generic DX9 profiles.

It will die - a horrible and glorified death

Again, why? There are developers out there who want to produce cross-platform engines. Cg is perfect for this. Additionally, Cg allows the possibility of producing IHV-specific compilers, which is a bonus. If ATI doesn't want to capitalize on this, it's their own fault. Cg will still be better than Microsoft's HLSL, as long as the generic compilers are optimized well enough.
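
To make the profile point concrete, here's a minimal sketch: one hypothetical Cg fragment shader, run through NVIDIA's offline compiler cgc against several back-end profiles (profile names as shipped in the Cg 1.x toolkit; the shader and file name are my own illustration, not anything from this thread):

    // simple.cg -- a minimal, hypothetical Cg fragment shader:
    // modulate a texture sample by a uniform tint color.
    float4 main(float2 uv            : TEXCOORD0,
                uniform sampler2D tex,
                uniform float4    tint) : COLOR
    {
        return tex2D(tex, uv) * tint;
    }

    // One source, several targets:
    //   cgc -profile arbfp1 simple.cg   (ARB_fragment_program -- cross-vendor OpenGL)
    //   cgc -profile fp30   simple.cg   (NV_fragment_program -- NV30-only OpenGL)
    //   cgc -profile ps_2_0 simple.cg   (generic DirectX 9 pixel shader)

Whether the generic arbfp1/ps_2_0 output is as well optimized for non-nVidia hardware as the fp30 path is for NV30 is, of course, exactly what this thread is arguing about.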
 
Chalnoth said:
There are developers out there who want to produce cross-platform engines. Cg is perfect for this. Additionally, Cg allows the possibility of producing IHV-specific compilers, which is a bonus. If ATI doesn't want to capitalize on this, it's their own fault. Cg will still be better than Microsoft's HLSL, as long as the generic compilers are optimized well enough.



The two NV30 profiles in the Cg compiler compile to two new, NVIDIA-proprietary OpenGL extensions, NV_vertex_program2 and NV_fragment_program. You cannot use them on a 9700.


http://www.cgshaders.org/forums/viewtopic.php?t=419&highlight=9700

Yep, compiles to two NEW PROPRIETARY EXTENSIONS :LOL:
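
You can see the split right in the compiler output: the first line of the generated assembly names the target extension. (The headers below are from the extension specs as I recall them; treat them as an assumption.)

    // cgc -profile fp30 shader.cg
    //   output starts with "!!FP1.0"    -- NV_fragment_program assembly; an R300 driver won't load it
    // cgc -profile arbfp1 shader.cg
    //   output starts with "!!ARBfp1.0" -- ARB_fragment_program assembly; loads on R300 and NV30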
 
Again, why? There are developers out there who want to produce cross-platform engines.

I've seen that argument plenty of times already, and it makes little sense to me.

How many developers / programs want to produce and support dual DX and GL platforms? If a developer really wants to support multiple OS platforms (like Windows, Linux, Mac), the obvious API to use is GL. Why code for DirectX on Windows, and port to GL on the other platforms? Just code for GL.

And from what I've read, if you are porting DX to GL for cross-platform support, shader compatibility is the least of your troubles.

On the other hand, EVERYONE wants to write code that supports multiple 3D hardware platforms. And this is where Cg fails. Who has any confidence that Cg code runs efficiently on anything but nVidia hardware?

Additionally, Cg allows the possibility of producing IHV-specific compilers, which is a bonus.

Actually, that can be a curse.

For stability reasons, we're probably better off with one compiler, but one that has been created with input from the IHVs. I'd bet software vendors would tire rather quickly of IHV-specific compilers generating "different" results for different hardware platforms. (When I use nVidia's compiler, I get result X. When I use ATI's compiler, I get result Y...)

If ATI doesn't want to capitalize on this, it's their own fault.

If nVidia doesn't want to concentrate their efforts on ensuring Microsoft's DX9 compiler is as robust as possible for their own hardware...rather than spending money creating their own languages and compilers....that's their own fault.

Cg will still be better than Microsoft's HLSL, as long as the generic compilers are optimized well enough.

I see it as: HLSL will still be better than nVidia's Cg, as long as the HLSL compilers are optimized such that EVERY platform gets to within a few percent of the performance of hand-tuned assembly for that hardware.
 
Derek Smart,

Awesome. I agree with you on the un-necessity of Cg, and its intent.

Now hurry up and get that totally Awesome BC Generations game out.. I am REALLY looking forward to it.
 
Joe DeFuria said:
How many developers / programs want to produce and support dual DX and GL platforms? If a developer really wants to support multiple OS platforms (like Windows, Linux, Mac), the obvious API to use is GL.

Ummm, add Xbox to the list and you're forced into DirectX plus OpenGL.

So, just as an interesting aside: if Cg wasn't made by NVIDIA how much argument would there be?

Isn't a cross-platform high-level shader language that can target OpenGL and multiple flavors of DX, based on a runtime back end, a good thing? It boggles my mind to see the violent protest over such a useful idea.
 
Russ, in theory: yes. In this particular instance: no.
 
The two NV30 profiles in the Cg compiler compile to two new, NVIDIA-proprietary OpenGL extensions, NV_vertex_program2 and NV_fragment_program. You cannot use them on a 9700.


http://www.cgshaders.org/forums/viewtopic.php?t=419&highlight=9700

Yep, compiles to two NEW PROPRIETARY EXTENSIONS :LOL:

Whooptie f$cking doo... Anybody who's paid even a smidgen of attention to Cg and the NV30 has known about the fp30 and vp30 profiles... :rolleyes: Guess what? fp20 and vp20 don't work on ATi hardware either; wanna guess why? Of course there's arbvp1 and arbfp1 if you want cross-platform support. The only real gripe I have about this is that older programmable hardware (NV20, NV25, R200, RV250, Parhelia; dunno about P10/9 hardware) doesn't support ARB_fragment_program (just R300 and NV30 do), so for those devices you're stuck with vendor-specific extensions (NV_texture_shader, ATI_fragment_shader, MTX_fragment_shader), of which Nvidia has, of course, provided Cg profiles only for their own...

BTW, you don't need to YELL! :devilish:
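
For anyone following along, the way a cross-platform Cg app is supposed to cope with this profile mess is to ask the runtime at startup which profile the driver actually exposes, rather than hard-coding fp30. A minimal sketch in C, assuming the Cg 1.x GL runtime and a hypothetical shader file simple.cg (a real app would create a GL context first and check cgGetError after each call):

    #include <stdio.h>
    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    int main(void)
    {
        /* ...create and make current a GL context here (GLUT, SDL, ...)... */

        CGcontext ctx = cgCreateContext();

        /* Best fragment profile the current driver supports: arbfp1 on an
           R300, fp30 on an NV30, fp20 on an NV20 -- CG_PROFILE_UNKNOWN if
           the hardware has no usable fragment profile at all. */
        CGprofile prof = cgGLGetLatestProfile(CG_GL_FRAGMENT);
        if (prof == CG_PROFILE_UNKNOWN) {
            fprintf(stderr, "no supported fragment profile\n");
            return 1;
        }
        cgGLSetOptimalOptions(prof);
        printf("compiling for profile %s\n", cgGetProfileString(prof));

        /* Compile the (hypothetical) shader for whatever profile we got */
        CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "simple.cg",
                                                 prof, "main", NULL);
        cgGLLoadProgram(prog);
        cgGLEnableProfile(prof);
        cgGLBindProgram(prog);

        /* ...draw... */

        cgDestroyContext(ctx);
        return 0;
    }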
 
Chalnoth said:
Derek Smart [3000AD] said:
It is useless (except for nVidia hardware)

Why, Derek? nVidia's most recent release has full support for ARB-extension profiles, as well as generic DX9 profiles.

I'm not Derek, but I would've thought the answer would be obvious - DX9 and its own HLSL is quite simply the better alternative.

Cg is not a "necessity", nor is it a better time-saver than DX9 HLSL, and time-saving is really what HLSL is most appealing for (to me, at least!).

It will die - a horrible and glorified death

Again, why? There are developers out there who want to produce cross-platform engines. Cg is perfect for this. Additionally, Cg allows the possibility of producing IHV-specific compilers, which is a bonus. If ATI doesn't want to capitalize on this, it's their own fault. Cg will still be better than Microsoft's HLSL, as long as the generic compilers are optimized well enough.

A smart (no pun intended) developer will want to take advantage of the most widely used platform.

On a side note - whether Cg will die a "glorified" death will ultimately rest on how many developers choose to use it in a smart, un-swayable way and how they may ultimately fail.

DX9 HLSL, by virtue of being almost exactly identical to Cg (see the sketch at the end of this post), will be used simply because it is the smarter thing to do.

PS. Where OGL1_2 is concerned, I personally have found no real benefits in using Cg (it's all really much simpler than D3DX). Hopefully, OGL2_0 will be further proof of this.
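
To put the "almost exactly identical" point in code: the snippet below is legal as both Cg and DX9 HLSL source, so the very same file goes through NVIDIA's cgc or Microsoft's fxc unchanged. (The shader is my own hypothetical illustration; tool flags are from the Cg toolkit and DX9 SDK as I recall them.)

    // tint.cg / tint.hlsl -- one hypothetical source, two toolchains.
    sampler2D tex;   // texture, bound by the runtime
    float4    tint;  // uniform tint color

    float4 main(float2 uv : TEXCOORD0) : COLOR
    {
        return tex2D(tex, uv) * tint;
    }

    // cgc -profile ps_2_0 -entry main tint.cg    (Cg toolkit)
    // fxc /T ps_2_0 /E main tint.hlsl            (DX9 SDK)

Which compiler produces the faster ps_2_0 code for a given card is the real question, and that's the one the market will answer.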
 