What was that about Cg *Not* favoring Nvidia Hardware?

There might be games with effects that you can only enable if you have a GeForce FX, even if other cards support the features the effect uses.
I don't think this is unreasonable if the game company has a close relationship with Nvidia and gets feedback from Nvidia during the development of the game.
I think a company like that might use Cg if Nvidia tells them that it's the best option.

I know that I would be pretty pissed if a game that I would like to play supported some fancy stuff on a lesser board than mine just because Nvidia helped them develop the game.
 
Novdid said:
There might be games with effects that you can only enable if you have a GeForce FX, even if other cards support the features the effect uses.
I don't think this is unreasonable if the game company has a close relationship with Nvidia and gets feedback from Nvidia during the development of the game.
I think a company like that might use Cg if Nvidia tells them that it's the best option.

I know that I would be pretty pissed if a game that I would like to play supported some fancy stuff on a lesser board than mine just because Nvidia helped them develop the game.

I would not like that myself.
But some companies have Nvidia logos and ads on their homepages and obviously get help from Nvidia.
Some of them mostly release Xbox games and sometimes release them for PC too.
The game can of course be fun to play anyway.

It's not good at all if some companies welcome help from Nvidia because they need it, perhaps both money and advice.
That's of course something that can hurt consumers who own other cards.

Regards!
 
ps1.4 was a fluke, a blip on the radar screen. Forget that it even existed; ps2/3 won't support it. Cg will compile to vs/ps2, which both the GeForce FX and ATI cards support. I think Cg supports a general version 2 profile, if I'm not mistaken. ATI should write their own back end for it.

Look guys, it isn't easy for devs to use MS HLSL right now. God knows how long it will be before MS ships proper HLSL docs with plenty of examples. The DX9 docs are in shambles, as anyone who is learning shaders can attest, myself included. I can't believe MS released the docs in this poor state: examples have numerous errors, the docs contradict themselves in places, etc. So for some devs the more complete Cg is a godsend.

Yes, Nvidia controls the language; perhaps they should standardize it or something. However, shaders move too fast, so I'm not sure that's a good thing to do now. Maybe Nvidia could collaborate with ATI on the language. I don't know, but it wouldn't hurt. Cg might eventually be replaced by MS HLSL, but not now, and now is where it's at. Every day counts. You don't expect devs to put shaders on the back burner while MS gets in gear, do you?

Remember, it's a person writing the shaders, not some machine optimized for Nvidia. You don't have to use loops etc. in your shaders, and can very well write them toward the standard ps2 semantics. The important thing is that devs now have a good tool to work with.
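To make JD's point about writing toward the standard ps2 semantics concrete, here is a minimal sketch of what such a shader looks like. This is hypothetical illustration code, not from any shipping game: the parameter names (`baseMap`, `tint`) are made up, and Cg and MS HLSL are nearly identical at this level, so the same source can feed either compiler against a generic ps_2_0-class profile.

```cg
// Hypothetical Cg/HLSL-style pixel shader written against the generic
// ps_2_0 feature set -- no vendor-specific tricks, so any back end
// targeting a ps2-class part can compile it.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap,
            uniform float4    tint) : COLOR
{
    // Sample the base texture and modulate by a constant tint colour.
    return tex2D(baseMap, uv) * tint;
}
```

Nothing in a shader like this ties it to one vendor; it only becomes vendor-specific if the author reaches for profile-only features.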
 
Hope you all had a good xmas. JD, a few points, before I let this topic die:
1. 1.4 is used by all 9000/8500 cards, and is significantly more advanced than 1.1/1.3; it is not a blip (UT2003 makes use of 1.4, among many other games). On the other hand, Cg supports ps1.3, which adds hardly any functionality worth using and is only supported by GeForce4 cards.
2. ATI will not collaborate or write backends for Cg.
3. We've been using the DX HLSL for 5 months now, we have piles of shaders, and there's nothing that isn't covered in the docs. It's reference style though, not tutorials. If you're looking for those, see the ATI and MS shaders, and hold your breath for more. In fact, if you're having any specific problems with HLSL, you can post them to the DirectX groups, or message me; I'd be happy to help.
 
An interesting observation: it really is beginning to look like nVidia has taken the worst of 3dfx to heart. Way back when it was a 3dfx vs. nVidia thing, the nVidites ;) always bemoaned the fact that Glide was taking away from the support of DX & OGL. I cannot count the times I have heard diatribes on the evilness of Glide… and still do! Back when the original Voodoo was available, it was Glide that gave us an inkling of just what 3D graphics were about. But, as both DX & OGL matured, well, even 3dfx realized it was time to put Glide out to pasture. And there was much rejoicing among the nVidites. Now, nVidia has resurrected the specter (pun intended) of Glide: Cg! A hardware-specific API… Not? Well, check this out:

http://www.gamingnext.com/stalkerinterview.asp

"To secure the best of visual impression, we cooperate closely with Nvidia, implementing their latest CG technologies, so the owners of GeForce FX boards will be especially privileged here."

The whole purpose of Cg is just this: make developers use our features at the expense of our competition, period. They may sprinkle sugar on top of it, but it still tastes like shite. It is totally counterproductive to the idea of free enterprise and open competition. No matter what manufacturer you support, this cannot be good for the whole industry and, eventually, for you. It's the same, but even more subversive, attitude that nVidia used to strong-arm many internet sites a few years ago… now, instead of those sites, they are trying to strong-arm the developers. Bottom line for nVidia: business is war, take no prisoners, the end justifies the means, all is fair, ethics are for losers!
 
I think you're overreacting. "Make developers use"? "Strong-arming" the developers by offering them another tool? Adding a tool to the mix is "counterproductive to the idea of free enterprise and open competition"? Your statements are so ironic it's humorous.

You're waxing very poetic by equating Glide and Cg. Glide was completely closed, and companies and individuals were sued to keep it so. Cg is an option open to any individual to write a profile or backend.

Glide was widely accepted because that's all there was. Once DirectX and OpenGL became viable, it died its own death and there was nothing 3dfx could do to prevent it. All they could do was prevent other companies from offering Glide ports for their chips, but in the end it offered no benefits to the developer wanting to reach as wide a market as possible and hence was designed out. There was nothing 3dfx could do to force developers to use Glide, and similarly there's nothing NVIDIA can do to force developers to use Cg.

Of course Cg can expose the limits of NVIDIA cards--because they've gone and written back ends for their chips! This doesn't mean it's a hardware-specific API; that exact same option is available to every other IHV.
 
Let me paint a picture for you:
Let's say ATI decides to support Cg, and decides to write a backend.
Within a year, a boatload of games come out using Cg because the "big two" are pushing it. Since ATI supports it, the games would run fine on ATI cards. So far so good.

Just one problem for ATI though: every game that ships carries an Nvidia logo, and of course that "The Way It's Meant to Be Played" intro as well.

How can that be good for ATI?

I don't pretend to know the technical differences between Cg and MS HLSL, or the technical advantages and disadvantages of the two. Purely from a business standpoint, ATI has nothing to gain by supporting Cg. Nvidia, on the other hand, has everything to gain. Jmho.
 
Cg is of course not like Glide, but it's intended to help Nvidia: to make sure that games look as good as possible on Nvidia hardware and that more features (supported by Nvidia) are used.

Some companies that have a relationship with Nvidia and decide to use Cg as a development tool will release games that look better and have more (or better-looking) effects when you use Nvidia hardware.
At least I very much suspect that there will eventually be a few examples like this.

If I buy a game like that, I would in fact blame Nvidia if I couldn't enable all effects when using an ATI (or other) card that supports the features needed for the effect.
Because I can understand why other companies refuse to support Cg, I would never blame them in this case.

I think there must be a common standard for both APIs and tools that prevents the release of games like that.
I think that this is the reason why it's best to use DX and the related tools supported by MS.

Regards!
 
Fuz said:
Just one problem for ATI though: every game that ships carries an Nvidia logo, and of course that "The Way It's Meant to Be Played" intro as well.
And this is in some licensing agreement? If you use Cg, you must have this NVIDIA logo and "The Way It's Meant to Be Played" in your game? That would be heinous, but I don't believe it's true (feel free to show me wrong).

If anything, that logo and marketing slogan are there because NVIDIA paid for it, not because the game is using Cg.
 
One point on which we all can agree is that Cg would pose no threat to ATI if they could compete in the Cg space on the same terms as NVidia (the same way they can compete in the Direct3d HLSL space).

The difference is that some think that because the Cg spec originated with and is controlled by NVidia, ATI will never be able to compete on the same terms. Others seem to think that somehow NVidia's control is irrelevant.
 
RussSchultz said:
And this is in some licensing agreement? If you use Cg, you must have this NVIDIA logo and "The Way It's Meant to Be Played" in your game? That would be heinous, but I don't believe it's true (feel free to show me wrong).

If you spent thousands (maybe millions) and worked years to design a tool that helped me make something and sell it, wouldn't you expect some recognition?
 
Just as a note, any PS / VS 2.0 / 3.0 part will support PS 1.4. They are supposed to be backward compatible according to the DirectX specification. So, it is still a viable fall-back option from 2.0 (and beyond) Shaders since ATI has, and will continue to sell, boatloads of DX8.1 parts, especially when their integrated parts get them :)

As for Cg, since developers can support both HLSL and Cg, there is no problem with them doing so. But for a smart developer who wants a high-level shading language that supports ALL cards, HLSL is the ONLY way to go. Once developers get educated to this fact, it will become a moot issue unless Nvidia has a licensing agreement with a developer/publisher enforcing it. And I really doubt any developer/publisher worth anything will allow themselves to be FORCED into using Cg EXCLUSIVELY, so I don't think there is anything to worry about.

Especially since MS will be constantly updating HLSL as well, so developers who like anything about Cg can have their opinions go into the revisions of HLSL. Sounds like a winning option to me.
 
nooneyouknow said:
Just as a note, any PS / VS 2.0 / 3.0 part will support PS 1.4. They are supposed to be backward compatible according to the DirectX specification. So, it is still a viable fall-back option from 2.0 (and beyond) Shaders since ATI has, and will continue to sell, boatloads of DX8.1 parts, especially when their integrated parts get them :)

And because of this CG will support it? I don't think so.........
 
martrox said:
nooneyouknow said:
Just as a note, any PS / VS 2.0 / 3.0 part will support PS 1.4. They are supposed to be backward compatible according to the DirectX specification. So, it is still a viable fall-back option from 2.0 (and beyond) Shaders since ATI has, and will continue to sell, boatloads of DX8.1 parts, especially when their integrated parts get them :)

And because of this CG will support it? I don't think so.........

Martrox, I wasn't saying that. What I was saying is that if a developer makes PS 1.4 shaders, any PS 2.0 part can understand them and they will work. Cg doesn't make PS 1.4 shaders...
 
Sorry... not meant as a putdown, nooneyouknow, just trying to make a point... for the resident nVidia worshipers ;)
 
Fuz said:
If you spent thousands (maybe millions) and worked years to design a tool that helped me make something and sell it, wouldn't you expect some recognition?

Show me an agreement where the badging is made compulsory by the use of Cg in a product. Idle speculation doesn't count.
 
Russ,
More than 80% of the games that come out using Cg will, guaranteed, have the NV logo on the box. I don't have any proof of this, nor do I need to show you any. If you cannot agree on this point, then discussing this further with me will be a waste of your time.
 
I think that I should be able to buy any card that supports DX9 and any DX9 game.
When I install the card and the game, everything should work: all effects should be possible to use, regardless of which company made the game and which DX9-compliant card I use.

If it's not working, there is a problem. It could be drivers, or it could be that some effects in the game are implemented in a way that is not fully DX9 compliant.

Any tool that can create problems like this is not in the consumers' best interest to support.

I think that the best solution to this problem would be if MS tested games for DX compliance, like they test WHQL drivers.
The game company would have to let MS test the game to be allowed to market it as a DX game.
If it failed the test, it would not be allowed to be called a DX game.
This would protect both the consumer and the integrity of DX and MS.

Regards!
 
Fuz said:
More than 80% of the games that come out using Cg will, guaranteed, have the NV logo on the box. I don't have any proof of this, nor do I need to show you any.
Your skills of persuasion are phenomenal.

If you cannot agree on this point, then discussing this further with me will be a waste of your time.
What a surprise. I've come to that exact same conclusion.
 
Fuz said:
RussSchultz said:
And this is in some licensing agreement? If you use Cg, you must have this NVIDIA logo and "The Way It's Meant to Be Played" in your game? That would be heinous, but I don't believe it's true (feel free to show me wrong).

If you spent thousands (maybe millions) and worked years to design a tool that helped me make something and sell it, wouldn't you expect some recognition?

The new UT has the NV slogan you refer to, and it's not using Cg afaik. One does not imply the other, unless you have evidence of this in the licensing agreement?

Don't games sometimes have multiple labels on their boxes including both ATI & Nvidia?
 