What was that about Cg *Not* favoring Nvidia Hardware?

Well, Nvidia has released a *game demo* on their site.. called Gun Metal.. that features Cg and DX9

http://www.nvidia.com/view.asp?IO=game_gunmetal

It says...

Gun Metal from Yeti Studios is a futuristic action packed experience where you take control of the fully transformable, prototype combat vehicle known as the Havoc Suit. Use the Havoc Suit to battle ground based enemies, or transform in the blink of an eye into an agile jet and take on aircraft in high-speed dogfights. Either way, get ready for mind-blowing action and out of this world graphics. Gun Metal is one of the first available games to utilize Cg during its development. By doing so, Yeti have been able to create a wealth of movie-quality eye-catching special effects that bring the gaming experience to another level of excitement.
Scorch the earth, crush rocks under foot, fell trees, torch crops, tear chunks out of towering buildings and raze entire settlements to the ground as you take the fight to the hordes of enemies through 14 diverse and engaging missions and marvel at the unparalleled level of destruction.

Cg Specific Graphical Features:

CG support for all materials
CG: 'Motion Blur' effect for the plane, proportional to speed
CG: Realistic 'Water Refraction' effect
CG: Further effects under consideration
DX9: 'Occlusion Query' for optimised rendering and realistic flare effects
DX9: Use of 128bit floating point buffers, enabling use of high contrast colour and overflows (such as retina bleach)

Now..

It runs on all GF3 and GF4 cards.. but it does not run *at all* on Radeon cards.. not even the 9500/9700, with DX9 and Cat 3.0... which is funny, because they are the only TRUE DX9 cards available.. yet it runs on all the DX8 Nvidia cards... :rolleyes:

When you try to run it you get...

Time: 21/12/2002 07:49:37
No supported 3D card found

What's more... I remember getting into HUGE arguments with people on this very site, who called me a FANBOI and far, far worse, for saying this is EXACTLY what Nvidia would do with Cg.

Well, what do you have to say about this now? The day after DX9 gets released, and right after ATI releases DX9 drivers... Nvidia releases a game demo that they CALL a DX9 game, which won't run on the only true DX9 hardware...

Why don't all you really *balanced* and *knowledgeable* people explain this one.. when was the last time you saw ATI do something like this? Or PowerVR... or ANY other company than Nvidia.... :devilish:
 
Hellbinder[CE] said:
Well, what do you have to say about this now?

Well, the same thing that I have been saying all along! :p

One of the key things in Cg, IMO, is the ability to use hardware-specific shader profiles.

Apparently they decided to put some NV_ string into those profiles. It's their choice - and a bad one at that, I might add.
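Roughly, this is what the profile machinery looks like through the Cg runtime - just a sketch of the OpenGL path; the file name and entry point below are placeholders:

```cpp
// Minimal sketch of runtime profile selection with the Cg runtime (OpenGL path).
// "shader.cg" and the "main" entry point are placeholder names.
#include <Cg/cg.h>
#include <Cg/cgGL.h>
#include <cstdio>

CGprogram loadVertexShader(CGcontext ctx)
{
    // Asks the runtime for the best profile the current card supports.
    // On NVIDIA hardware this typically hands back an NV-specific profile
    // (the NV_vertex_program style targets); if an app then *requires* that
    // profile instead of falling back to a generic one like CG_PROFILE_ARBVP1,
    // it simply won't run anywhere else.
    CGprofile profile = cgGLGetLatestProfile(CG_GL_VERTEX);
    if (profile == CG_PROFILE_UNKNOWN) {
        fprintf(stderr, "No supported 3D card found\n");  // sound familiar?
        return NULL;
    }

    CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "shader.cg",
                                             profile, "main", NULL);
    cgGLLoadProgram(prog);
    return prog;
}
```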
 
It probably requires the "Cg emulation" or whatever it's called that's built into NVidia's recent drivers. In other words, it does not use Cg the way Microsoft's HLSL is used, but as shader code that's sent straight to the driver (which does the compiling).
 
Hi there,

Hellbinder[CE] said:
Well, what do you have to say about this now?

The same thing as I said before--Cg on its own doesn't favour NV hardware, but implementations using Cg might. And apparently do. That's a design decision by the dev studio (for whatever reasons), and not really Cg's fault.

Or do you really think that Rage will release a game that runs only on NV hardware, in its full version?

ta,
-Sascha.rb
 
Have you tried installing the DX8 Cg runtime?

Not that the game would work if it is testing for nVidia cards, but I think you would at least need some sort of Cg runtime for any Cg app.
 
I guess this depends on whether the game ships with precompiled Cg-generated code or runtime Cg code - if it's runtime then it will need a profile in the drivers. Of course, the developer could have just compiled it themselves.
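Something like this, roughly - a sketch of the runtime route, compiling against a generic DX profile and pulling out the generated assembly (the shader string is a made-up example). Shipping that output is the "compile it themselves" option:

```cpp
// Sketch: compiling Cg source to a generic DX pixel shader profile at runtime
// and grabbing the resulting assembly. A developer could do this once at build
// time and ship the assembly, needing no Cg support at all on the user's machine.
#include <Cg/cg.h>
#include <cstdio>

int main()
{
    // Trivial made-up shader for illustration.
    const char* src =
        "float4 main(float4 color : COLOR) : COLOR { return color * 0.5; }";

    CGcontext ctx = cgCreateContext();
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, src,
                                     CG_PROFILE_PS_2_0,  // generic DX9 target
                                     "main", NULL);
    if (!prog) {
        fprintf(stderr, "Cg error: %s\n", cgGetErrorString(cgGetError()));
        return 1;
    }

    // The compiled output is plain ps_2_0 assembly that any DX9 card's
    // driver can consume -- nothing NVIDIA-specific about it.
    printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));

    cgDestroyContext(ctx);
    return 0;
}
```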
 
Yeah, for a while there I thought DX and OpenGL favored ATI cards, because Humus's demos wouldn't work on my NVIDIA card.

Turns out I was just being stupid. He hadn't tested it with any NVIDIA cards and had to do a few tweaks to make it work. Phew.
 
I remember some serious debates about this prior to the DX9 release: how Cg is good for the industry, how it exports generic shader data that any card can run... how Cg is great for the industry... I remember being flamed for being skeptical...

Where are these people now...


 
DT, please, do yourself a favour and stop talking about things you obviously don't understand. You and your smilies are getting pretty boring.
[EDIT] sorry, I didn't see Russ's post...
 
This one demo, that is a port of an XBox game, which is only available from NVIDIA's website, created by a development company that has NVIDIA propaganda plastered all over their website, is an accurate representation of what Cg can and will be used for by the industry as a whole. We should lynch NVIDIA for creating such a blasphemous piece of software, and we should boycott Yeti Studios for supporting them in their attempt at world domination. Fuck NVIDIA and Yeti Studios; fuck them in their stupid asses.
 
Ahhh, the usual suspects: Russ, the legend-in-his-own-mind know-it-all; Nao, a Nvidia follower from way back; and Crusher, the Nvidia defender claiming Nvidia is the nicest IHV in the business.. plays clean... Cg is good for the PC graphics industry...

Your defense of this is just another example of what people do to defend their favorite IHV... and they claim they're not biased??

:LOL:

Nao, that's just too bad, isn't it.
 
Cg can be used to support multiple vendors' hardware, and can conceivably evolve to support the evolution of graphics hardware going forward.

Speaking for myself (though it seems appealing to lump me in with people whose comments don't necessarily match mine), the concern I've expressed is nVidia utilizing it as a tool to leverage support for their hardware over other hardware (whether to the degree of glide, where it prohibits feature support for other hardware, or whether it simply makes implementation for other hardware more difficult or slower). I proposed, with supporting commentary that never got addressed that I recall, that proof this would not occur was actually required for there to be any good reason for the industry to support Cg, as other HLSLs (those not controlled or maintained by an IHV) would be more likely to adopt specification changes if modifications are required going forward. I also asked if there was any advantage offered by Cg to offset this likely disadvantage.

So far, the indication seems pretty clear that nVidia intend to utilize it as I proposed, and that what it can do is not being fulfilled for various reasons (probably having something to do with nVidia's intent, I'd guess). What remains true is that Cg is capable of being used otherwise, but it is also true that glide could have been implemented on hardware other than 3dfx's and could have evolved as a specification. But it didn't, because that didn't serve the interest of the IHV that maintained it. To be fair, I don't see how Cg could be as deficient as glide, but was glide deficient when it was first released? Also, there seem to be equivalent alternatives to use in any case... And yes, I do think nVidia is trying to leverage Cg as they would an API, even though it is a HLSL.

BTW, question marks at the end of sentences mean "questions". And contrary to what seems common practice (it has taken me a while to realize this), I really do welcome answers; I just want them to be pertinent. If you have them, enlighten me (us?).

What I remain curious about is: why use Cg instead of DX9 HLSL, except to favor nVidia over other IHVs? Are the Cg nVidia optimizations impossible for the DX9 HLSL? Has someone who has looked at the DX9 HLSL formed an opinion?
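For anyone comparing, the DX9 HLSL route looks roughly like this - a sketch assuming the D3DX library from the DX9 SDK, with a made-up shader string:

```cpp
// Sketch of the Microsoft-controlled alternative: compiling HLSL through D3DX
// straight to a generic ps_2_0 target. No IHV sits between the developer and
// the compiled shader here. (The shader string is a made-up example.)
#include <d3dx9.h>
#include <cstring>

bool compileHlslShader()
{
    const char* src =
        "float4 main(float4 color : COLOR) : COLOR { return color * 0.5; }";

    LPD3DXBUFFER code = NULL;
    LPD3DXBUFFER errors = NULL;

    HRESULT hr = D3DXCompileShader(src, (UINT)strlen(src),
                                   NULL, NULL,          // no macros, no includes
                                   "main", "ps_2_0",    // entry point and target
                                   0, &code, &errors, NULL);
    if (FAILED(hr)) {
        if (errors) errors->Release();
        return false;
    }
    // 'code' now holds ps_2_0 tokens ready for CreatePixelShader().
    code->Release();
    return true;
}
```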
 
Yes, Doomtrooper, I love NVIDIA so much, I'm going to support them no matter what. That's why I recommend Radeons to people.

Ever stop to think that people might not necessarily be defending NVIDIA out of devotion to the company, but rather simply rebutting your idiotic beliefs that Cg is going to ruin the world?
 
Crusher said:
Ever stop to think that people might not necessarily be defending NVIDIA out of devotion to the company, but rather simply rebutting your idiotic beliefs that Cg is going to ruin the world?
One HLSL to rule them all, one HLSL to find them,
One HLSL to bring them all and in the darkness bind them

See, CG is the Master HLSL. It will be the RUIN of this world, unless we cast it back into the fiery chasm from whence it came.
 
The one thing that I still think is a bit pathetic is that they didn't add support for Pixel Shader 1.4 now that they have 2.0 support.
That's plain childish, man.
 
The HLSL usually matches hand-written assembly, and for me it has beaten it on occasion, and I'm no newbie to asm shader coding. Cg compiles out to assembly for the DX target, so there's nothing magical they can do. In fact, all my tests have shown it to be less efficient. Come on, Microsoft have the best compiler writers in the world, minus a couple at Intel.

The Cg language is open for NVIDIA to change, and they are free to update the compiler as only they see fit -- just as they decided that we developers didn't need 1.4 (and many of us did), they'll decide in the future that we don't need the next technological change which an IHV supports but NVIDIA doesn't. If ATI release 3.0-capable hardware first, don't expect NVIDIA to rush to get out a 3.0 profile. They'll claim nobody needs it yet.

In effect, you are putting support for COMPETITORS in NVIDIA's hands (remember, all IHVs have said they will *NOT* write backends for NVIDIA's Cg), and that's just a ridiculous notion.
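The 1.4 gap is easy to show, by the way. A quick sketch against the Cg runtime, assuming the profile list as it ships today:

```cpp
// Quick sketch: asking the Cg runtime for DX pixel shader targets by name.
// As of this writing, Cg ships ps_1_1 through ps_1_3 and ps_2_0 profiles but
// no ps_1_4, so that lookup should come back CG_PROFILE_UNKNOWN.
#include <Cg/cg.h>
#include <cstdio>

int main()
{
    const char* targets[] = { "ps_1_1", "ps_1_3", "ps_1_4", "ps_2_0" };
    for (int i = 0; i < 4; ++i) {
        CGprofile p = cgGetProfile(targets[i]);
        printf("%-7s -> %s\n", targets[i],
               p == CG_PROFILE_UNKNOWN ? "unsupported" : "supported");
    }
    return 0;
}
```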
 
I agree that it's all in the implementation. There aren't really any optimizations that Cg could have built into it to favour NVidia cards, except when it comes to the NV30, which will support extended versions of 2.0 shaders.

As I said, this demo appears to not use compiled Cg code, but raw code that is compiled by NVidia's drivers. I find it quite perplexing why any developer would wish to do this if they were marketing a game to the computer gaming world. This is, of course, merely NVidia PR. I think that any developer utilizing Cg will use a method more open to other video cards -- such as pre-compiling the code, or having the game itself compile the code.

Just to bash Cg more:

One of the aspects of Cg and their products that NVidia is touting is the ability to run shaders in software on video cards that don't support a high enough shader version, or don't support shaders at all. However, looking further into the DX9 SDK, I see the following: Software Shaders. DirectX 9 allows software emulation of shaders up to version 3.0 -- on any video card.
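For reference, that emulation is just a device-creation flag away - a rough sketch, with the window handle assumed to exist already:

```cpp
// Sketch: creating a DX9 device with software vertex processing, which gives
// runtime-emulated vertex shaders (up to vs_3_0) regardless of what the video
// card supports. Pixel shaders still need hardware (or the slow REF device).
// 'hwnd' is assumed to be an existing window handle.
#include <d3d9.h>

IDirect3DDevice9* createSoftwareVsDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT,
                      D3DDEVTYPE_HAL,                       // real card rasterizes
                      hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING,  // CPU runs vertex shaders
                      &pp, &dev);
    return dev;  // NULL on failure
}
```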
 
Ahhh, Crusher, since when has www.fileplanet.com been nVidia's homepage? Never (if you didn't know).
If you think that nVidia made Cg just to be "good", you are not very clever. They made it so they could get more market share... and, in the end, a monopoly. And if they get a monopoly they will abuse it... just look at the prices of GeForce cards when there wasn't any real competition.
 