What was that about Cg *Not* favoring Nvidia Hardware?

To whomever mentioned it, yes: most people here would be aware that Richard Huddy now works at ATI and is thus not impartial. But I made the link because of this quote from Huddy, which I found surprisingly candid:

As for the inside scoop? Well, obviously I'm bound by the NDA in my previous contract of employment - but yes, I had the inside scoop.

Cg was one good reason for my not wanting to stay at NVIDIA. I saw it as every bit as divisive as GLide. And I saw it (maybe wrongly, but I don't think so) as NVIDIA's attempt to control the future of computer graphics.

I still believe Cg was made to be able to support the extras of NV30 and future generation NV40 in case Microsoft started to f*ck things up.

When Cg went into development there were, AFAIK, only going to be VS 2.0 and PS 2.0, and with DX10 slipping way into the future with Longhorn, there's no doubt why nVidia felt that they had to react in some way.

Things settled down and today we've got VS 2.0+/3.0 and PS 2.0+/3.0 with DX9, so they probably decided to change the marketing focus of Cg.

I still think that Cg has a key benefit in being able to optimize shaders for specific hardware (via specific compilers + profiles), but if no other IHV will make optimized compiler backends and profiles for their products, it will never be dominant. On the other hand, if Cg takes off - for whatever reason - I'm pretty damn sure that ATI will make the backend compiler/profiles.

In fact, they might even have had a shot at it in a secret underground R&D center, just to be in the know about it for the future. But - pssst! - they would have to kill you for even asking! ;)
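
To make the profile point concrete, here's a minimal sketch using the public Cg runtime (the shader, entry point, and the ps_2_0 profile choice are mine, purely for illustration, not from any shipping title). The whole idea is that the source stays the same and only the profile handed to the compiler changes; a vendor-optimized back end would simply be another profile selected here.

Code:
#include <Cg/cg.h>
#include <cstdio>

// Illustrative only: a trivial pixel shader in Cg source form.
static const char* kShaderSrc =
    "float4 main(float4 color : COLOR) : COLOR { return color * 0.5; }";

int main()
{
    CGcontext ctx = cgCreateContext();

    // Pick the back-end profile by name; this is where a vendor-specific
    // optimizing back end (ATI or otherwise) would plug in.
    CGprofile profile = cgGetProfile("ps_2_0");

    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kShaderSrc,
                                     profile, "main", 0);
    if (prog)
        std::printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    else
        std::printf("compile failed: %s\n", cgGetErrorString(cgGetError()));

    cgDestroyContext(ctx);
    return 0;
}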

A last quote from Richard, old chap:

I'm talking about the objective facts - like whether Cg and HLSL are certain to be and to remain the same (there is not now, and can be, no guarantee), whether Cg optimises for non-NVIDIA hardware (it does not), and whether Cg will be OpenGL's next high level language (it won't).
 
"
Now what do you guys think that will accomplish? People getting pissed at Nvidia? Or will they think that Nvidia is *cooler* and *better*? If your perception was that most game developers were going to support something that will *apparently* not run on your card... it's pretty damn obvious what you will do: buy the card that *will* do it.

Some of those in this forum have a really bad habit of assuming that everyone in game land has the same inside or technical understanding to separate the wheat from the chaff.

Sure, this is just one small game demo... but what are you guys going to use for your arguments in 6 months, and 12 months? This is only the beginning.
"

That is what I have been saying for months now.
So what?
That is just fine with me.
ATI can feel free to jump on that CG train.
If not, well, it could happen that they lose ground for the reasons above.

If I buy an Nvidia card and a certain game runs pretty well on my hardware, and my friend has, for example, an ATI card on which the game doesn't run that great, it doesn't mean the game doesn't run at all.
Just imagine the game runs on the ATI card, but not as well as on my Nvidia card.
What will my friend do?
He will try to sell his ATI card and get a Nvidia card.
You can bet that he will not blame the game developer, because, you know, when the game says it is a DX8/9 or OpenGL game, it should run on his hardware too.
The average user will not blame the game developer. They will blame the hardware company and next time buy the card from the competitor.

But, well, who cares? That is how business works. Nvidia has the market share to set standards. If they are successful with that, ATI or any other competitor had better support it.

You know, it's the same as the whole driver story.
ATI's drivers are not that great. They are far away from Nvidia's.
But not every game that has problems on ATI cards has them because of crappy ATI drivers. Some of those games are developed on Nvidia hardware, and often they already include workarounds for minor problems in Nvidia's drivers, and then... you have your problem on other cards.
And then look into the forums. The user will not blame the game developer for this.
Standard answer is "crappy ATI drivers......"
Period.
Is it fair?
Well what is fair?
Business is business. I have no problem with that. ATI can support Cg too.
If they don't, well, then they take a certain risk.
 
Doomtrooper said:
Reverend said:
Nobody is going to go out and buy a NVIDIA video card because they discovered the game they just bought won't run on anything else.

Disagree on a grand scale. I linked Neverwinter Nights forums about shiny water... not only was the game developer's tech support stating that if you want shiny water you need an Nvidia card, half of the members were stating the same thing.
So you don't think those threads would basically force that person to get rid of the card that isn't supporting all the 'cool' features? Because it's very true... especially if the game developer is basically telling them to buy an NV card.
Er... but did anyone who read what the developer said actually go out and buy an NV card? Okay, maybe a few did, but is that true of the majority of NWN buyers who don't own an NV card?
 
Sharkfood said:
Nobody is going to go out and buy a NVIDIA video card because they discovered the game they just bought won't run on anything else.

Just had to point out that this is indeed a bit of fiction.

Quake is what originally put 3dfx on the roadmap.

And what other real alternative to 3dfx was there back then?

Quake3 sold a ton of NVIDIA graphics cards, and Tribes2 forced hundreds and even thousands of non-NVIDIA owners to flock towards getting that new GTS or MX in order to play the game correctly.

Perhaps this is true... are these in surveys or in some forums I could read up on? I, for one, never felt the urge to upgrade from a V5 to a GeForce to play Q3. Perhaps I'm different.

And remember - I was talking about the inability to play a game on a non-NV card, not about extra performance or features of a NV card that runs a game better. Yeti's game is very different in this aspect.
 
BTW, I hope this doesn't back me further into the NVIDIA camp's corner :rolleyes:

I do not think Cg is bad. I do not think any developer using Cg is bad.

I do object to developers using Cg (or not) to make a game that will only run on NV cards (shit, I couldn't care less whether such a developer has considered that they may make less money this way... I only care that I have an ATI card and don't have the money to buy an NV card to play this particular game, which may well be a great game to own/play).

Lastly, I truly believe Yeti is but the minority - I refuse to believe that the majority (or even close to the majority) of developers will make games the way Yeti made this game of theirs (and certainly NV can't have that deep a game-funding pocket nor that stupid a CEO). Certainly not Epic, certainly not id, certainly not EA, certainly not a thousand other developers/publishers. This demo of Yeti's has raised eyebrows simply because, IMO, it is something a few individuals like to bring up. I am fairly certain this is but a storm in a teacup. I don't like what Yeti has done, but my opinion is that the decision lies with the developer, and such developers really aren't that many.
 
Since I didn't follow the thread all the way from the beginning: is the game in question one of those "sponsored" games that are nothing else but fancy techdemos in game format with abysmally bad gameplay? If yes then I only see a massive waste of energy here. If anyone wants to I can give him DroneZ and Gunlok for free; see above why.
 
Ailuros said:
Since I didn't follow the thread all the way from the beginning: is the game in question one of those "sponsored" games that are nothing else but fancy techdemos in game format with abysmally bad gameplay? If yes then I only see a massive waste of energy here. If anyone wants to I can give him DroneZ and Gunlok for free; see above why.

Ding! Ding! The door prize goes to Ail. :p
 
Reverend said:
And remember - I was talking about the inability to play a game on a non-NV card, not about extra performance or features of a NV card that runs a game better. Yeti's game is very different in this aspect.

So, the inability to play a game on a non-NV card is better than a game running poorly on a non-NV card? It indicates nothing about the released game, or any issues it may have?

Ailuros said:
Since I didn't follow the thread all the way from the beginning: is the game in question one of those "sponsored" games that are nothing else but fancy techdemos in game format with abysmally bad gameplay? If yes then I only see a massive waste of energy here. If anyone wants to I can give him DroneZ and Gunlok for free; see above why.

Vulpine GLMark is only a benchmark, surely people are too smart to use that to compare products.

Surely DroneZ and Gunlok indicate nothing about nVidia's intent in the past, and so this current example in turn indicates nothing about Cg's future. Surely the ridiculous and obvious nature of their attempt precluded such factors from impacting any subsequent commercial games...

Surely, if 5 or more distinct people said they were ditching Radeons and buying GeForce cards based on the issues a (hypothetical) commercial game had, or comments of developers, or even moderators parroting "common knowledge", they are the only 5 or more people who did such a thing. Surely their perception does not indicate future buying patterns nor the success of any of the tactics mentioned above. And, most definitely, no one would be foolish enough to "upgrade" to a GF 4 MX based on such a perception of enhanced performance due to "brand name recognition" or some other silly concept, consumers don't do things like that.

We live in an ideal world where consumers always make informed decisions, and the perceptions of the uninformed have little to do with what products are successful. Strategies based on deception are doomed to failure, and should not concern us because things Always Work Out.

---

This is my view of the viewpoints that say that this does not warrant discussion, or that "the market will decide (so don't talk about it)", or that "things will work out". I have no intention of convincing people who believe this, but I do tend to respond to them when they try to convince me. I do think they ignore what has happened in the past in proposing this.

This is not my view of the viewpoints that say just "the market will decide" (this is a reasonable expectation IMO), or that it is "likely things will work out". In this case, I end up mainly arguing against the idea that no harm will be done in the meantime if the person proposes that in conjunction with this viewpoint.

I hope I've repeated it enough times for those who don't read the entire thread or are prone to confusing my statements with those of others.
 
All gamers I know would buy a new gfx card if they couldn't play one of their favorite games with the one they have.
Gamers sometimes read about a game for years before it is released and then buy it as soon as possible.
And when it's released they will upgrade all hardware that's not good enough to play it with good picture quality and framerates.

People like that buy hardware to play games. They don't buy games to use the hardware.
It makes no difference to such people whether there is a problem with the hardware itself or the game is merely optimized for different, but not better, hardware.
If there is a problem they buy something that works.
That's of course not true for every game, but it is for the games they really want to play: the best games that are very popular and that people talk about for years before they are released.

Regards!
 
Here's my guess on why Yeti Studios put out this demo for only one vendor. NVIDIA probably knew that when Microsoft released DirectX 9, ATI were going to come out with some really great stuff, which they did: brand new Catalyst drivers with support for DX9, DX9 demos and screensavers, and a ton of fanfare. Like a parent not to be outdone by the other parent in giving presents to the kids, NVIDIA went to a developer, Yeti Studios in this case, and asked them to come up with a demo that exploits Cg and DX9, and in return NVIDIA would give them massive attention on their website. NVIDIA have a lot of developers under their wing, because they give them a ton of support, so it's no surprise that NVIDIA were able to drum up a developer to do this for them. It's just something to give NVIDIA attention, and not let ATI have it all. NVIDIA didn't think they'd have new drivers released or anything, and figured a game demo was in order.

This is just a guess on my part; I haven't talked to either NVIDIA or Yeti about this. I did email Yeti Studios to see if they can give me some answers, which I'll post if I get any. As for Cg, I think a few people, namely Democoder, already answered most of the confusion that took place in this thread. Only time will tell whether Cg will live on or not.
 
Ailuros said:
Since I didn't follow the thread all the way from the beginning: is the game in question one of those "sponsored" games that are nothing else but fancy techdemos in game format with abysmally bad gameplay? If yes then I only see a massive waste of energy here. If anyone wants to I can give him DroneZ and Gunlok for free; see above why.

What? No Evolva for free? ;) Pretty game at the time, shoddy gameplay that was fun for the first 10 minutes. That was another heavily promoted game demo by NVIDIA back in the GF256 days. Let's not forget Dagoth Moor Zoological Gardens, those people never even came out with a game. :rolleyes: It sure was pretty though ...
 
Reverend said:
Sharkfood said:
Quake is what originally put 3dfx on the roadmap.

And what other real alternative to 3dfx was there back then?

You don't count the Rendition Verite line of cards as real alternatives? I remember massive vQuake vs. glQuake Usenet discussions.



Perhaps this is true... are these in surveys or in some forums I could read up on? I, for one, never felt the urge to upgrade from a V5 to a GeForce to play Q3. Perhaps I'm different.

I never felt the urge to upgrade from my V5 to anything else for Q3 or even Tribes2. I do remember the urge to upgrade from my V2-sli to V3 at the time of Q3-test era. A little bit after Q3A was out, I did upgrade to V5, but more for the FSAA and smoother play on Unreal.

My friends are gamers, but not hardcore hardware people. One just upgraded from a GF2 to a Ti-4200, and the other still runs his Matrox G550. My second and third systems [AMD Tbird 1.6GHz + ATI 8500, AMD Tbird 1.4GHz + V5] are more gaming-capable than 50% of my friends' primary systems. They spend more time gaming than I do. As for the scenario where I have a game that runs great on my video card, and a friend has a different card which doesn't play the game too well... in every case, the friends will merely find a different game to play. I believe this to be the normative case.
 
Re: Gun Metal Demo runs on the Radeon 9500/9700

I've released a new version of 3DA, in case someone wants to try the demo on his 9500/9700. It's a DX8 game, so how can they use DX9 features? Maybe in the full version...

The demo basically checks for some supported texture format / frame buffer combinations which the GeForce and the reference rasterizer support, but it never actually uses those combinations, because it runs without errors on the Radeon 9500/9700; I just had to return different results in the "check if" phase.
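
Roughly speaking (this is my own illustrative sketch, not the actual 3DA source, and the helper name is made up), the trick amounts to intercepting the DX8 capability probe and reporting the GeForce-only combination as supported:

Code:
#include <d3d8.h>

// Hypothetical helper, NOT the real 3D-Analyze code: forward the format
// probe to the driver, but report success even when it fails. The demo
// only uses the answer in its "check if" phase and never actually uses
// the probed combination, so the Radeon 9500/9700 runs fine afterwards.
HRESULT CheckDeviceFormatForced(IDirect3D8* d3d, UINT adapter,
                                D3DFORMAT adapterFmt, DWORD usage,
                                D3DRESOURCETYPE rtype, D3DFORMAT checkFmt)
{
    HRESULT hr = d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFmt,
                                        usage, rtype, checkFmt);
    return FAILED(hr) ? D3D_OK : hr;   // lie about unsupported combinations
}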

Regards,
Thomas
 
One of the reasons for Cg, in my mind, is the ability to get nVidia's programmable features into titles faster, so the gamer can actually enjoy the programmable features within the lifetime of the card.

We have DirectX 9 features, yet no DirectX 9 titles to really take advantage of them. Gamers have always been hearing about this feature and that feature, but where are they within the lifetime of the card?

So, what I think nVidia was doing and showing is: look, ATI is offering DirectX 9 to the world now, and yet there are no gaming titles.

Look at us, nVidia is saying; we already have a gaming title utilizing Cg now.

This is the PR theme I am getting from this.
 
Wow, one insignificant demo and Nvidia have conquered the world! I'd better rip out my R9700 Pro NOW! :rolleyes:
 
If you are doing things in the offline world, then it all comes down to which compiler you use (since Cg and DX9 HLSL are 99% compatible). The problem is if you want to compile your shaders at runtime, since some ps_1_x shaders that Cg produces simply crash the Radeon (though if Cg can make it crash, you can also crash it "manually"; and if you are hand-coding things you can always write a ps_1_4 shader or bypass the issue).
Currently Cg does not support ps_1_4, which makes it a bit limited, and DX9 HLSL does not support the extended 2.0 shaders (ps_2_x and vs_2_x), so it can't take advantage of the new features that the GeForceFX brings (and some of those features are very hard to multipass: dynamic branching in the VS, gradient instructions, ...).
Both tools will improve as time goes by and it should really be a developers call which one to use.
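
For comparison, here's a minimal sketch of picking a compile target with the DX9 HLSL compiler via D3DX (the shader is made up for illustration). The target string plays the same role as a Cg profile; per the above, you can ask D3DX for ps_1_4 or ps_2_0 but not the extended ps_2_x, while Cg covers the NV30 extras but has no ps_1_4 profile.

Code:
#include <d3dx9.h>
#include <cstring>
#include <cstdio>

// Illustrative only: a trivial HLSL pixel shader.
static const char* kShaderSrc =
    "float4 main(float4 c : COLOR0) : COLOR0 { return c * 0.5f; }";

int main()
{
    LPD3DXBUFFER code = 0, errors = 0;

    // "ps_2_0" is the compile target; swap in "ps_1_4" etc. as needed.
    HRESULT hr = D3DXCompileShader(kShaderSrc, (UINT)strlen(kShaderSrc),
                                   NULL, NULL, "main", "ps_2_0",
                                   0, &code, &errors, NULL);
    if (FAILED(hr) && errors)
        std::printf("compile failed: %s\n",
                    (const char*)errors->GetBufferPointer());

    if (code)   code->Release();
    if (errors) errors->Release();
    return 0;
}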
 
Matt said:
Ailuros said:
Since I didn't follow the thread all the way from the beginning: is the game in question one of those "sponsored" games that are nothing else but fancy techdemos in game format with abysmally bad gameplay? If yes then I only see a massive waste of energy here. If anyone wants to I can give him DroneZ and Gunlok for free; see above why.

What? No Evolva for free? ;) Pretty game at the time, shoddy gameplay that was fun for the first 10 minutes. That was another heavily promoted game demo by NVIDIA back in the GF256 days. Let's not forget Dagoth Moor Zoological Gardens, those people never even came out with a game. :rolleyes: It sure was pretty though ...

No; Leadtek bundled their cards with those two POS impersonations of games, *ahem*.

Anyway, I think my point is clear; the debate seems to be about an apparently underwhelming demo or game, and not some major release by one of the big development houses, which should be smart enough not to step into these traps nowadays, especially as ATI is constantly and consistently gaining more ground in the 3D graphics market.

Inevitably, every time one of those ventures pops up from now on, there will be pro/contra Cg arguments.
 
Doomtrooper said:
Now what the water looked like for Nvidia card owners for a year..

Geforce3/Geforce4

Uh, make that 6 months; it was only released in June this year. :)

Well, I can't blame nVidia. Personally, I think ATI missed a trick by not having a full-blown game demo like Breed launched to coincide with the DX9 release, with DX9 effects that could currently only run on the 9500/9700. That would have been smart.
 
First post, been lurking for ages it seems. Some may recognise me from MURC.

Agreed, Randell, but what ATI needs most right now is 3DMark2003! :p
 