What was that about Cg *Not* favoring Nvidia Hardware?

RussSchultz said:
Doomtrooper said:
Where are these people now...

Right here, still thinking you're being a reactionary who doesn't understand what you're talking about. ;)

Doomtrooper said:
I remember some serious debates about this prior to the DX9 release, how CG is good for the industry, how it exports generic shader data that any card can do....how CG is great for the industry...I remember being flamed for being skeptical...

Where are these people now...



I disagree with you, Russ. Doom's point has been proven, and I for one think it is a disgrace that one company is trying to dominate by manipulating developers. You are entirely entitled to disagree, and so am I.
 
Eh, I don't consider FilePlanet to be a website at all... Gamespy Industries is worse than any hardware company could ever be. Regardless, I wasn't aware they had the demo; it didn't turn up there when I Googled a while ago.

Anyway, please feel free to send Yeti some e-mails letting them know you disapprove of their demo and won't be buying the game because of it. That will probably do a lot more towards preventing NVIDIA-specific titles than complaining about it here will.

I don't like the idea of that happening, and I never claimed I did. In fact, I don't think anyone here has ever said they thought that was a good idea. But this does nothing to prove that what we've said about Cg is false. Nobody claimed Cg wouldn't have the capability to make NVIDIA-specific games, only that the same code could also be compiled to run as standard DX9 HLSL code. Trying to use this demo to prove otherwise is just plain ignorant.

The only thing this demo proves is that NVIDIA is willing to convince developers to help them promote their hardware, which is nothing new. They are also going to be using their proprietary back end for Cg to encourage optimized shaders for their hardware, which is to be expected. That doesn't mean developers have to cater their games towards NVIDIA hardware if they use Cg, and probably the only ones who will do that are the ones who have arrangements with NVIDIA (and it appears Yeti Studios is one of the few doing that).
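For what it's worth, the dual nature people keep arguing about is visible right in the toolkit. As a minimal sketch (assuming the stock cgc command-line compiler and a made-up water.cg source file; check cgc's help output for the exact profile list your toolkit version supports):

cgc -profile ps_1_1 -entry main -o water_dx.asm water.cg
cgc -profile fp30 -entry main -o water_nv30.asm water.cg

The first command emits plain DirectX pixel shader assembly that any compliant card can run; the second targets NVIDIA's NV30-only OpenGL fragment profile. Same source both times, and which output a developer ships is their choice, not the tool's.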
 
NVIDIA will support PS1.4 in a future version.. rest assured.

Heh, that's all I got that was interesting... oh, and that ATI would like developers to always use DX9 and OGL 2.0 HLSL.

Correct, no ps_1_4 yet: not too many people are using this profile, but rest assured we will support it in a future version.

From an NVIDIA employee, by the way... Matthias Wloka <mwloka@NVIDIA.COM>
 
I should also point out that I really don't care if nobody uses Cg. If Tim Sweeney wants to use DX9 HLSL for the next Unreal engine, that's fine with me. I'm not saying people should pick Cg instead of HLSL, just that I don't see a need to go on a campaign to denounce Cg as something evil that will ruin the industry, as others here do. And if a developer writes their shaders in the HLSL tools and compiles them, then copies the code into Cg and compiles it there so it runs a little faster on NVIDIA cards than the HLSL version does, I don't see anything wrong with that either. If a developer chooses to use Cg and only support NVIDIA cards, then they'll have to live with the fact that they aren't going to sell as many copies of their game. And rich as NVIDIA may be, they can't afford to buy off the entire industry, so I think the occurrences of that sort of thing will be few and far between (and hopefully limited to demos).
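To make that copy-paste workflow concrete, here's the sort of thing I mean (a trivial made-up example, nothing from the demo). Since Cg and DX9 HLSL share their core syntax, a simple shader like this should go through both compilers unchanged:

// Trivial textured pixel shader; the same source compiles
// as both Cg and DX9 HLSL.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap) : COLOR
{
    // Sample the texture and return it as the output color.
    return tex2D(baseMap, uv);
}

Build it with the DX9 HLSL tools for the generic path, feed the identical source through Cg for an NVIDIA-tuned build, and nobody gets locked out.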
 
Crusher, read this:
_Without_ ps.1.4 shader support, and with _only_ NVIDIA-specific optimisations, it's about as valuable as Glide was. Both are proprietary standards owned by companies which want to own the future of computer graphics to the exclusion of all other parties.
 
LeStoffer said:
Alrighty then, I shouldn't be adding fuel to the fire, but what the heck... Take a look at what ex-nVidia employee Richard Huddy is saying about Cg in the "Re: DX9 Is out! ...NVidia?" thread:

http://discuss.microsoft.com/SCRIPTS/WA-MSD.EXE?A1=ind0212c&L=directxdev#15

I think that Hellbinder and Doomtrooper might just fall in love over this! ;)

It's not only Richard Huddy's comments that are worth reading.

Mike Burrows <mikebur@MICROSOFT.COM> said:
As DX:
Cg is nVidia's compiler for nVidia hardware. No other hardware IHV (ttbomk) is working on optimising backends for Cg. nVidia obviously will take maximum advantage of their hardware nuances and optimisation pathways.
HLSL has been designed by us to run optimally across all programmable video hardware. We have worked with all the vendors of programmable hardware parts, past and future, to ensure the generated output is not only optimised from an instruction-redundancy standpoint, but will run best for each targeted assembly model, based on the hardware optimisation tips given to us by the hardware manufacturers.
If you're targeting Windows or Xbox, then you would think a software company's core competency, and owning the platform the hardware vendors write to, may actually help ;)

As a general comment:
Resources being assigned to a software language, which will supposedly remain identical to HLSL, rather than concentrating on increased driver quality and helping us optimise the back end compiler seems odd. I hope that working better together we can help drive the industry forward towards an exciting future...
 
I also think this is worth taking into account:
Cg was one good reason for my not wanting to stay at NVIDIA. I saw it as every bit as divisive as Glide. And I saw it (maybe wrongly, but I don't think so) as NVIDIA's attempt to control the future of computer graphics.
 
As I already said in another thread, Cg's point (for nVidia) was to bring support for their NV30 card before the competition and before the main APIs were there.

Now we have the R300 (and Cg doesn't support it, of course), and we have DX9. And the NV30 is still nowhere to be seen.

There's no point in using Cg now. It was killed by their inability to deliver.

But don't expect them to admit it...
 
Years ago when I ran mostly 3DFX products, Voodoo 1's and 2's, everything was peachy... went to the store... bought the title... came home, installed it, and it worked!! Glide was rolling along....

Then 3DFX died... so did Glide, and so did support from developers.

I then tried other cards, a Geforce 2... a Radeon... and I started to notice what it was like for other users during the 3DFX reign... specific code paths and optimizations for one IHV...
I didn't like being on the other side of the fence. Developers during the 3DFX days preferred Glide due to its ease of programming... well, those excuses can't be used now. All the current cards on the market can produce decent graphics if coded for an API like DX9, and there appear to be lots of tools to help with that...

So here we are, almost at 2003, and we have developers still stupid enough to put all their eggs in one basket by pimping one IHV..
Classic example... the Neverwinter Nights tech support forums...

This is for a water pixel shader effect... something Humus has been doing forever...

http://nwn.bioware.com/forums/viewtopic.html?topic=142653&forum=49

http://nwn.bioware.com/forums/viewtopic.html?topic=142644&forum=49&highlight=shiny+water
http://nwn.bioware.com/forums/viewtopic.html?topic=134670&forum=49&highlight=shiny+water
http://nwn.bioware.com/forums/viewtopic.html?topic=143597&forum=49&highlight=shiny+water

Or how about GTA 3...

Hellbinder may remember this...
Subj: RE: Talonsoft Technical Support
Date: 05/06/2002 14:10:05 GMT Standard Time
From: ssharpe@Talonsoft.com
To: Jut1233456@aol.com


To be honest, in a strict hardware and driver comparison only, the lowest end GeForce card can run rings around almost any Radeon card. As far as assumptions go, you can assume whatever you want. As far as the truth goes, I'm just passing along information that I got in a .pdf format document because they were so secretive in the production that they wouldn't let us know anything else about it. Now, to answer your questions:

1. To run the game at an acceptable level, I'd say a desktop computer that meets the recommended requirements, with a video card that has a faster processor than most Radeons (i.e. GeForce 2, which now costs about fifty bucks and will last a couple of years)

2. To receive email notifications on new game information or possible patches or updates, you can go to our website and register here: http://www.take2games.com/register because Technical Support cannot supply information of a patch until an official release has/has not been made.


Stacey Sharpe
Level Two Technician
410-933-9191
ssharpe@take2baltimore.com
Take2 Baltimore

www.TALONSOFT.COM


Secret .PDF document... hmmm what .pdf is that :D
 
While you can label me an NVIDIOT all you want, it doesn't change the fact of the matter: a demo that used Cg to compile to a vendor-specific profile and doesn't run on ATI hardware is not proof positive that Cg is bad for the industry. All it means is somebody made a demo that only runs on one platform. I presume they're aiming for the XBOX, so why shouldn't they?

Furthermore, is it a surprise that Richard Huddy and the guy from Microsoft aren't chatting up Cg? Ever consider that they've got their own interests (ATI because it's mindshare for NVIDIA, and Microsoft because Cg enables OpenGL)? Or are they pure of heart simply because they're not from NVIDIA? Notice, I'm not saying that they're wrong, but they've got their own motives at heart, not the good of the industry.

I'm sorry if being called a reactionary who doesn't know what he's talking about makes you angry. It's true, though. You don't seem to demonstrate a grasp of the technical issues you rail for or against. You predictably react for ATI and against NVIDIA, which is irritating, and it's just as irritating the other way around.

You can call me a know-it-all all you want; I take it as a compliment. I try my best to understand the issues I'm talking about before opening my mouth. I do this because I know I'd look like a posing idiot to the people on this forum who do know what they're talking about--and there's plenty of them.
 
I realize that it was not really clear in my original post..

But the main point that I have always made is that Nvidia would use Cg as a tool to promote Nvidia hardware over everything else, and, where possible, find ways to make routines run better on Nvidia hardware.

In this case, while Cg is not directly responsible for the incompatibility, it is nonetheless true and evident that they went out of their way to tie Cg (and DX9) to this game demo, which *ONLY* runs on Nvidia hardware.

I have already seen comments written at Bluesnews, Rage3D, and other sites where people are slamming Ati drivers and hardware, or saying *why doesn't Ati support Cg, it's obvious that it's going to be widely supported by developers*...

From the beginning I have *stressed* that this is the key issue with Cg: the public perception of Nvidia/Cg, and how Nvidia would use it in spite of all their claims of innocently *wanting to help the game community*....
 
As I stated on the other thread, the perceived outcome of something that splashes "Cg!" all over itself and then only runs on a single platform is damaging to the perception of what Cg is all about.

Yeti/NVidia can't have fathomed what kinds of overtones this raises in the public arena, even if the usage of Cg has absolutely no tie to the platform dependence. It's kinda like unveiling a new colorsafe washing machine, and then out comes pink and blue striped underwear from dye bleeding. Even though it was because some loon put bleach in the load, the washing machine will accordingly be faulted due to its "color safe" introduction.

Perceived value is substantially more important to the consumer than actual value. Marketing departments have thrived on this notion for the past several years.

As far as arguments for Cg being non-platform-specific go, the same can be said for Glide. After all, all one needs is a "backend" library to use Glide instructions... it's even been done at run-time through the use of wrappers. Glide was also made public domain, so other companies could freely make their own implementations of the library. This isn't an apples-to-apples comparison at the level of the underlying architecture, but from a conceptual and argumentative approach, it is identical.

If there is consumer hesitation to buy Cg-powered games due to this fiasco, it can only hurt Cg's propagation, and therefore this wasn't a very bright move on Yeti/NVidia's part. The best-case scenario for combating the perceptual bias is developers using Cg but keeping it under wraps/non-public for fear of provoking buyer hesitation. That does nothing to advance the usage of the tool, but instead makes it less desirable to use... even if it is not at fault.

If this isn't the desired effect for this game demo, then Yeti and NVidia need to go into total damage-control mode to stop such urban lore from spreading: either by retracting the demo and replacing it with one that runs well on multiple platforms, or by putting up a huge disclaimer announcing that the intent was single-platform compatibility and that the full game will have no such limitations when completed (or something similar). The latter approach would still leave much hesitation until the final, multi-platform version can be visually inspected and tested to ensure this "Cg marvel" truly is multi-platform friendly and runs well on non-NVidia hardware.
 
All it means is somebody made a demo that only runs on one platform. I presume they're aiming for the XBOX, so why shouldn't they?

You sure do read before you speak... or maybe not. If you had, you would've known that Gun Metal is also aimed at the PC.
Hehe, if the fact that a demo only runs on one type of card isn't bad, what is bad then?
 
Furthermore, is it a surprise that Richard Huddy and the guy from Microsoft aren't chatting up Cg? Ever consider that they've got their own interests (ATI because it's mindshare for NVIDIA, and Microsoft because Cg enables OpenGL)?

There's probably another reason why MS isn't all that taken by Cg.

MS hated Glide not just because it was a 'competitor' to DirectX, but because it was something out of their control – invariably, if something went wrong with a game, they would be the first port of call and would end up fielding all kinds of support issues that were totally out of their control, because the game was using 3dfx's API.

Exactly the same thing can happen with Cg – if something goes wrong with Cg, MS will end up taking a lot of the support, but it's entirely out of their control. "It says DX9, and I have DX9 installed and a Radeon 9x00, why doesn't it work? I thought DX9 would make everything work," says the user. "Sorry, you'll need to speak to the software vendor or NVIDIA," says Microsoft. "Eh, NVIDIA? Why, I don't have any NVIDIA hardware – stoopid Windows doesn't work." I doubt they'll have had many calls like that for this demo, given the nature and location of the download, but they will get them if games begin to be released in a similar fashion – it happened before and it will happen again.
 
Rexer said:
All it means is somebody made a demo that only runs on one platform. I presume they're aiming for the XBOX, so why shouldn't they?

You sure do read before you speak... or maybe not. If you had, you would've known that Gun Metal is also aimed at the PC.
Hehe, if the fact that a demo only runs on one type of card isn't bad, what is bad then?


A demo that only runs on one platform obviously shows, without doubt, that Cg is meant to and will destroy the industry if allowed to exist. NVIDIA will rule the world unless you convince all the wrong-thinking people of the world to cast Cg out into the light of day, where it will be revealed as the spawn of Satan that it is.

Hopefully I don't have to add a sarcasm tag for you to catch my drift.

You're right, I should have looked before opening my mouth. Regardless, that doesn't change the facts. A demo does not proof make.
 
Well, it doesn't matter if the public perception is "it's obvious that it's going to be widely supported by developers".

That doesn't make it true.

And I just don't see why a developer would use Cg.

Want to speed up shader development?
There's DX9 HLSL.

Want to create an application that runs on both D3D and OpenGL?
Then shader compatibility is the last thing to worry about.

Want to do OpenGL only development?
Then it's no better than any other vendor-specific extension - it supports one vendor only. The need to support other extensions really nullifies the advantages Cg provides.

There's no point in using Cg - there never was.

And if they want to sell hardware through a proprietary API - that's a dead end.

nVidia should know that.
That's how they beat 3dfx: supporting standards instead of proprietary APIs and getting on well with Microsoft.
That's exactly what ATI is doing now.
 
The market will determine whether Cg lives or dies.

If most developers find it useless, it'll die. These little flame wars will not make one bit of difference.

Edit: gorsh. That was a complete non sequitur on my part. I have no idea what I was agreeing with.
 
Want to do OpenGL only development?
Then it's no better than any other vendor-specific extension - it supports one vendor only. The need to support other extensions really nullifies the advantages Cg provides.

Yes it is... It also works with more than one vendor under OpenGL (I've built demos with it that run on ATI hardware, or at least on 9500/9700 boards)...
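That works because Cg can also target the vendor-neutral ARB extensions rather than NVIDIA's own. A sketch, assuming a toolkit version that includes the ARB profiles and a made-up skin.cg file:

cgc -profile arbvp1 -entry main skin.cg

That emits a standard ARB_vertex_program listing, which any OpenGL driver exposing the extension can load - ATI's 9500/9700 drivers included - and likewise arbfp1 for ARB_fragment_program where the toolkit supports it.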

However that also leads me to another option...

There's no point in using Cg - there never was.

How about writing or extending an OpenGL application on a platform other than a Microsoft one, like Linux or MacOS? DX9 doesn't do you a whole lot of good there, and OpenGL 2.0 is still too far over the horizon to sit around and wait for...
 