What was that about Cg *Not* favoring Nvidia Hardware?

archie4oz said:
Yes it is... Vertex shaders and pixel shaders are largely optional rendering paths within DX. Just like MMX, SSE, SSE2, 3DNow!, 3DNow! Enhanced and 3DNow! Professional (read: SSE) are optional extensions to x86.

Wrong. Vertex shaders and pixel shaders are the *only* rendering paths now. Nobody is writing new games using the FFP any more. Your analogy is crazy.
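To make that concrete, here's a minimal sketch (the entry point and parameter names are illustrative, not from any shipping code) of what the programmable path means in practice: a Cg vertex program that does nothing but what the FFP used to do for you, i.e. transform the position and pass the colour through.

// Sketch of the old fixed-function transform expressed as a Cg vertex program.
// Names are illustrative only.
struct VertOut {
    float4 position : POSITION;
    float4 color    : COLOR0;
};

VertOut main(float4 position : POSITION,
             float4 color    : COLOR0,
             uniform float4x4 modelViewProj)
{
    VertOut OUT;
    OUT.position = mul(modelViewProj, position); // what the FFP transform stage did
    OUT.color    = color;                        // pass the vertex colour straight through
    return OUT;
}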

archie4oz said:
If it is, it's just as nuts as endorsing 2 3D APIs. Supporting Cg doesn't entail "putting all their eggs into one basket"; at worst it's just distributing them around to hedge your bets.
Likewise, they could build their own Cg compiler with support for themselves.

Why on earth do you think ATI would want to encourage Cg? Cg getting more popular is bad for ATI and very good for NVIDIA. If ATI wrote a profile for Cg, it suddenly becomes a serious option, rather than NVIDIA's proprietary-but-open little attempt to grab developers.
If you were CEO at ATI, and you naively said "hey, let's publicly announce that we are behind NVIDIA's Cg initiative and plan to publish profiles and provide full support, seems like a cool idea, the more HLSLs we support the better, right?", you'd get voted off by the board in an emergency meeting.

If you seriously think it's in ATI's best interests to support Cg, why haven't they already? As a matter of fact, why is there not a single IHV behind Cg except for NVIDIA?
 
Sharkfood said:
I think the only information the consumer has to date is:
1) Cg doesn't have any form of PS1.4 profile available
2) The only Cg game demo code we have doesn't run on all IHVs' hardware.

Although #2 has little to do with Cg itself, one might question what perception is created when something touting NVidia features, written in an NVidia shader language, only runs natively on NVidia hardware.

As you've already pointed out, the reason for it not running on non-nVidia hardware has little to do with Cg itself. According to the link you posted, it runs fine once it gets past that (rather complicated) hardware check. In which case, the only Cg game demo that we have does run on non-nVidia hardware. Also, hasn't nVidia always released demos that only run on their hardware? It didn't really surprise me that they included the hardware check in this demo, whether it uses Cg or not. So there really shouldn't be any mystery about what perception nVidia is trying to send by making this particular demo run on nVidia hardware only... it's just how they always make their demos.

-dksuiko
 
790 said:
Why on earth do you think ATI would want to encourage Cg? Cg getting more popular is bad for ATI and very good for NVIDIA. If ATI wrote a profile for Cg, it suddenly becomes a serious option, rather than NVIDIA's proprietary-but-open little attempt to grab developers.

Why is it bad for ATI? All Cg getting popular does is give nVidia a feather in their cap. Nothing more.

Consider this scenario:
1. ATI releases their own backends for Cg that deliver slightly higher performance on their hardware.

2. Cg gains widespread usage, due to higher performance for both ATI and nVidia hardware.

3. nVidia changes the Cg spec so that it works better on their hardware.

(This is what you're imagining, right?)

4. ATI releases their own new version of Cg that works better on ATI hardware, but is not 100% compatible with nVidia's version of Cg.

Put another way, the way I see it the only problem with ATI supporting Cg is that a lack of cooperation may one day lead to the language diverging. If that happens, then another HLSL will be used instead. It's not like the use of Cg will lock anybody into it. As many have said, it's just a language. The only thing it will do to other languages, if it comes into widespread use, is force them to compete.
 
First of all, it's not just a language, it's an ever-growing toolset, with plugins and doodads up the wazoo.

Secondly, ATI has DX HLSL and soon OGL2 HLSL, as does the rest of the industry. They have no reason to care about Cg at all; it's not a threat, and it stands no chance of superseding those HLSLs without adoption from them.

And you didn't answer my question: if this is so great for ATI, why are they refusing to write a Cg profile, as are all other major IHVs? I guess they just haven't had you to enlighten them yet.
 
790 said:
Secondly, ATI has DX HLSL and soon OGL2 HLSL, as does the rest of the industry. They have no reason to care about Cg at all; it's not a threat, and it stands no chance of superseding those HLSLs without adoption from them.

And you didn't answer my question: if this is so great for ATI, why are they refusing to write a Cg profile, as are all other major IHVs? I guess they just haven't had you to enlighten them.

I don't think I ever said it was wonderful for ATI. I'm just not seeing why it's bad. The way I see it, it's just an investment that ATI may or may not need to make.

Anyway, it's only really good for ATI to support it themselves if it becomes popular. It can only become popular if Cg performance on ATI hardware is as good as or better than that of competing HLSLs, which may actually preclude ATI from bothering to support it.
 
If ATI wrote their own HLSL, would Nvidia use it? Would Chalnoth recommend that Nvidia adopt ATI's HLSL?

790 has stated the obvious: no IHV in their right mind would support a language controlled by a competing IHV.
That's why there are techno nuts and businessmen; if the techno nuts ran the show, the company would fold in a month :LOL:
 
It's irrelevant whether ATI decides to use it or not at this time. If Cg is used the way it was designed to be, it will output to OGL or DX9 HLSL. If the developer chooses, he can very well make Cg work on ATI cards.
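For illustration, here's a trivial Cg fragment program, with assumed cgc command lines in the comments showing how the same source could be pointed at a Direct3D profile or an OpenGL profile; supporting another target, which is all a "backend" amounts to, just means adding another profile.

// Trivial Cg fragment program: modulate the interpolated colour by a constant tint.
// The profile names below are real Cg profiles; the exact command lines are
// assumptions, shown only to illustrate how one source maps to multiple targets:
//   cgc -profile ps_2_0 tint.cg    (Direct3D 9 pixel shader output)
//   cgc -profile arbfp1 tint.cg    (OpenGL ARB_fragment_program output)
float4 main(float4 color : COLOR0,
            uniform float4 tint) : COLOR
{
    return color * tint;
}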

If the language is attractive enough from a developer's perspective (in terms of simplifying dev cycles and making things easier), and it gains widespread adoption in developer circles (and hence in the gaming market, which is who ultimately buys video cards), it would be in ATI's interest to write a backend for it, or risk having unoptimized benchmarks in future titles that may sway the gaming populace to buy one card over the other.

The whole issue of whether Cg becomes important or not is in the developers' hands now (not with fanboys on this forum, or with IHVs with vested interests), and frankly that's the best place for it to be IMO (in terms of the ultimate benefit to the consumer).
 
790 said:
Secondly, ATI has DX HLSL and soon OGL2 HLSL, as does the rest of the industry. They have no reason to care about Cg at all; it's not a threat, and it stands no chance of superseding those HLSLs without adoption from them.

Without trying to be purposely obtuse, if it's not a threat, then what's the big deal?

Does CG have any benefit at all to the consumer?
 
Hi there,

I think it's quite simple, really--Cg is a tool. Like any tool, people need to use it, or it becomes obsolete. If devs use Cg--for whatever reason--it's a good thing. If they don't, it's a good thing, too, as they will have thought of other ways to implement what they wanted the application to feature. I don't care whether e.g. Softimage viewports show shaders thanks to Cg or customised D3D / OpenGL code. Whatever works is fine.

The market will either accept Cg solutions or it won't, devs will either use Cg or they won't. No reason to argue about the reasons behind Cg, really.

Ante P said:
what's next "Coca Cola: the stuff you're supposed to drink while you play like it's meant to be played without FSAA and Aniso because the card is too slow" ;)
LOL ! *copy/paste*

:D

ta,
-Sascha.rb
 
The whole issue of whether Cg becomes important or not is in the developers' hands now (not with fanboys on this forum, or with IHVs with vested interests), and frankly that's the best place for it to be IMO (in terms of the ultimate benefit to the consumer).

I am totally SICK of the term "fanboi" getting thrown around like you guys actually understand the meaning... as if it actually has meaning anymore. It's just another meaningless term tossed into an argument every time someone wants to make themselves feel superior to people with a differing opinion. :rolleyes:

Also, it is completely naive to think that Nvidia has no stake, or that their vested interest in Cg is of no consequence. Nvidia has direct influence on 90% of all developer studios in the world. Nvidia sets the rules for Cg, they influence developers to use it, they get their *Nvidia: the way it's meant to be played* pasted across half the games on the market now... and they will have *Made with Nvidia Cg* plastered everywhere soon too.

And you are actually going to try to push the idea that Nvidia is an idle participant who will just sit by and see how it all plays out?? :rolleyes:
 
Hellbinder[CE] said:
And you are actually going to try to push the idea that Nvidia is an idle participant who will just sit by and see how it all plays out?? :rolleyes:

Where did Fred say that Nvidia is going to sit on the sidelines regarding CG? Of course they aren't. That doesn't mean that CG is going to help Nvidia conquer the world either though.

Fred is, imo, 100% correct. Glide had WAY more mindshare than Cg does now and was about the only choice for 3D acceleration (unlike OGL & DX now), yet it died.

Fact is that developers aren't zombies and DO have a will of their own, thank you very much.

One demo that DOES run on ATI hardware. Wow, now what was this about again?
 
Ailuros said:
Dunno Rudolf the dear is behaving a bit strange at the moment :D

Hmmmm....Rudolf the deer is a dear?? Either you've been hittin the eggnog or else...... I don't wanta know! :eek:

MERRY CHRISTMAS everyone, even the fanbois!
 
790

Wrong. Vertex shaders and pixel shaders are the *only* rendering paths now. Nobody is writing new games using the FFP any more. Your analogy is crazy.

What about the GeForce4 MX? I guess millions of users don't agree with you.
 
Mummy said:
790

Wrong. Vertex shaders and pixel shaders are the *only* rendering paths now. Nobody is writing new games using the FFP any more. Your analogy is crazy.

What about the GeForce4 MX? I guess millions of users don't agree with you.

They didn't really have a choice... an example of an excellent marketing move and the ignorant consumer.
 
Doomtrooper said:
790 has stated the obvious: no IHV in their right mind would support a language controlled by a competing IHV.

I don't think you get what I'm trying to say. If ATI supports Cg, then it's no longer absolutely controlled by nVidia.
 
Mummy said:
What about the GeForce4 MX? I guess millions of users don't agree with you.

The 4MX users will be screwed if they don't upgrade soon enough. Users don't make the decisions, developers do, and we are all using the programmable pipeline now; the only exceptions are projects which started before 2002 or so.

To wrap up the original topic: I'm a developer. I use NVIDIA tools daily, as well as ATI tools, and tools from all over the web. The bottom line for me is that I need speed and support. Cg isn't faster than HLSL, and without official IHV support, I can't afford to invest the future of my projects in it, only to find that NVIDIA's output is non-optimal or unsupported for a certain card, past, present, or future.

I always need somewhere to turn when there are problems, someone to blame, and with Cg, NVIDIA don't care if it doesn't run properly on my Radeon 9700 or Parhelia, nor do ATI or Matrox.
 