NVidia Cg...now we know where the "Glide" rumors came from

I'll just throw my 0.02 back in the ring, then I'll retreat until Thursday when we'll learn more.

NVidia is doing this because they feel it is best for their business, not "for the good of the industry". nVidia (or any other company) doesn't care about "the industry" beyond the extent to which they need the industry to sell their products and services. That's what businesses do. Now, that being said, just because this is a direction that NVidia feels is best for their own business doesn't mean that it can't ALSO be good for the industry as a whole. The two ideas are NOT mutually exclusive, as some would like to believe.

Thus, this can be a good thing, or a bad thing, depending on the specifics and quality of the implementation. (Who has control and influence on the language and compilers, "openness" of the architecture, relationship to other high-level interfaces like DX9 HLSL, any licensing fees, etc.)
 
On a side note...the March OpenGL ARB meeting suggested that the next ARB meeting would "probably be June 11-12. Exact dates to be settled soon." Does anyone know if that's the actual meeting date? (Yesterday and today?)

If that's the case, is it just an odd coincidence that NVidia would announce Cg immediately AFTER the ARB meeting adjourns for another few months...or is it possible that the timing is purposely meant to avoid the possibility of NVidia getting publicly ripped in the meeting notes? ;)

I do hope that whatever Cg is, nVidia has the backing of the GL ARB and Microsoft on this. Such endorsements would basically validate nVidia's approach.
 
Oh, the timing of this is even more interesting for 'other reasons', but more on that next week.

I've got through to nVIDIA's PR agency, so hopefully we'll bring you more about this when the time is right.
 
Mephisto Wrote:
Windows 3.x sucked, it didn't fail. Just an example what you can do if you've enough market share (>60% in case of Nvidia), good marketing and money.

Sucked compared to what? What other OSes were available at that time for the PC...like DOS? Compared to what we have now, of course it sucked, but back then it was impressive (as were others: NeXT, MacOS, OS/2). Ultimately it was successful because it made things easier.

We know very little right now; maybe this is a developer tool that will help CONSUMERS get the most out of their graphics card purchases sooner rather than later. That's the promise, and whoever delivers it first will get my vote.

I read countless posts from people disillusioned by the slow adoption of advanced features (T&L, pixel & vertex shaders), and then, when a company attempts to resolve it (albeit potentially for their own benefit), the naysayers blame the company that innovates.

If you don't like it, don't use it. If it delivers on solving the above problem, it will be successful; it's that simple. Don't be mad because Nvidia thought of it first...Innovate or die...
 
Mephisto said:
Windows 3.x sucked, it didn't fail. Just an example what you can do if you've enough market share (>60% in case of Nvidia), good marketing and money.
I don't think you can really compare those two situations. Nvidia has 60% of the 3D hardware market, but this is software we're talking about; just because 60% of people have Nvidia-based cards does not mean that any developer has to start using Cg in their development cycle. There are many alternatives, and Nvidia doesn't have the leverage of a monopoly. If Cg doesn't make sense for developers, then they simply won't use it. Cg will *have* to offer something compelling over the alternatives for developers to start using it, so why should this spell the doom of the industry, or be dangerous, bad or threatening? Don't make Nvidia more influential than they are.
In the Win 3.x scenario, just about 99% of the PC market was using MS DOS or a clone of it, and if you wanted a graphical user interface, no other alternative made sense unless you wanted to change your main OS. Nvidia's influence is big, but not nearly as big as MS' on the OS/software market ...

Of course Nvidia is in this for their own benefit, but like Joe DeFuria said, that doesn't necessarily mean it can't be good for the industry or (what I care more about) end-users in general.
 
... this is a developer tool that will help CONSUMERS get the most out of their graphics card purchases sooner rather than later, that's the promise and who delivers it first will get my vote.

As I said several times, if it is just a tool with a standard language which is not under Nvidia's control in any way, I will be a very happy guy.

But seeing Nvidia's behaviour within the ARB, as well as with DX8.0, I somehow doubt Nvidia is spending $$$$ on software that is just a tool and does not give their hardware any advantage over the competition.

BTW: The consumer argument is lame anyway, IMO. Windows XP also helps some consumers get the most out of their PC. But at what cost? (price, privacy,...)
 
I also see this as being in license-agreement territory: if you want to use this, you pay us X amount of $...If I'm wrong, I'll eat my own words :)

Nothing is free, there is always fine print..
 
Well, so much for me shutting up until tomorrow....

The "questions/concerns" I have are relatively complex, and you probably shouldn't bother with them until after the details are given out tomorrow, because they might be answered. It all basically centers on the Cg compiler. How much of a "good thing" Cg is for the industry as a whole depends almost entirely on how nVidia sets up the "rights" for the compiler's use and further development. This is more important than even the language itself, IMO.

Presumably, nVidia will distribute a compiler with the SDK. To illustrate my concern: suppose someone writes Cg for an effect that would traditionally be coded using DX 8.0 or DX 8.1 pixel shaders. (I'm using DirectX in this illustration, though it applies to OpenGL as well.)

1) Will nVidia's compiler be able to produce both PS 1.1 code and PS 1.4 code? Although no nVidia hardware supports PS 1.4, it is a DirectX standard, and the compiler should be built with DirectX as the end target, not nVidia hardware, if this is "for the good of the industry."

2) If nVidia's compiler only supports PS 1.1 (and generally, only those features that are supported by nVidia hardware), can other vendors implement compiler support? If support for Cg output is limited solely to what nVidia decides to support in the compiler, that "would suck." I would have to assume there is some mechanism for other IHVs to add support for their own GPUs' capabilities in "the compiler." (Otherwise, this would be a blatant attempt to hijack the APIs, which no one would tolerate.)

3) Assuming other IHVs can get support in the compiler for their GPUs, how is this achieved? (Both physically and financially?) Ideally, I would like to see a SINGLE compiler, with advancement of the compiler's capabilities achieved preferably by (from best to worst):

* open source;
* an independent source (non-hardware-affiliated);
* a group of industry IHVs (like the GL ARB);
* nVidia only, with other IHVs lobbying nVidia to include compiler support for specific GPU capabilities.

4) A different approach would be to allow other vendors to build their own compilers. This raises its own set of questions. From a developer's perspective, this seems to be a "bad idea": would it mean that to get robust code for multiple hardware targets, you'd have to run Cg through several compilers from different vendors? Would NVidia charge a licensing fee to 3rd-party IHVs to be able to release Cg compilers? (If nVidia "owns" the Cg language, presumably they could charge fees if they wished?) Will NVidia allow IHVs access to the code of its own compiler to use as a base, or will nVidia force other IHVs to start from scratch?

5) Yet another approach would be to allow 3rd parties to "extend" NVidia's compiler functionality with 3rd-party compiler "plug-ins." This type of approach has its own pros and cons for compatibility and performance, and I'm not sure how comfortable 3rd-party IHVs would be with using an NVidia compiler as the "basis" for code generation. It would probably be less hassle for developers, but would also probably result in less optimal code for non-nVidia hardware. Again, we also have the question of what fees, if any, would be involved for hardware vendors writing GPU-specific plug-ins.
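To make the plug-in idea in (5) concrete: nothing below is real Cg tooling, and every name is hypothetical, but the architecture being debated can be sketched abstractly as one shared front end plus per-IHV code generators registered against a target profile. A toy Python sketch:

```python
# Hypothetical sketch of a pluggable shader-compiler back end.
# One shared front end; each IHV registers a code generator
# ("plug-in") for its own target profile. All names are invented
# for illustration -- this is not NVIDIA's actual design.

BACKENDS = {}

def register_backend(profile):
    """Decorator: an IHV registers a code generator for its profile."""
    def wrap(fn):
        BACKENDS[profile] = fn
        return fn
    return wrap

@register_backend("ps_1_1")
def gen_ps11(ops):
    # Lower each high-level op to a PS 1.1-style instruction (toy mapping).
    return ["ps.1.1"] + [f"{op} r0, t0, v0" for op in ops]

@register_backend("ps_1_4")
def gen_ps14(ops):
    # A different IHV's profile gets its own lowering rules.
    return ["ps.1.4"] + [f"{op} r0, r0, v0" for op in ops]

def compile_shader(ops, profile):
    """Front end is shared; the requested profile selects the plug-in."""
    if profile not in BACKENDS:
        raise ValueError(f"no back-end registered for {profile}")
    return BACKENDS[profile](ops)
```

The open questions above then reduce to: who is allowed to call `register_backend`, and at what price.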

If we're lucky...some of these questions will be answered tomorrow...
 
Windows 3.x sucked, it didn't fail.

Win3.x did not suck. It wasn't a game development platform, of course, but for business and general use it was a godsend. The primary focus of Win3.x was a standardized GDI and (most importantly) a standardized printing interface with standard printer drivers. The business world cares about paper output, you know... and Win3.x provided a standard at a time when every DOS program needed its own custom drivers to output to laser printers.

Printing is what made Win3.x successful. The standardized GUI and control framework was just icing on the cake.

And, lastly, the Amiga was superior to every other PC at the time, yet it sank with Commodore because it wasn't marketed correctly. They marketed it as a high-end gamer's machine, when they could have dominated the business market if they'd just had the apps and the marketing direction to do so.

Marketing isn't evil, you know... it just is.
 
Mephisto said:
1) Cg comes with its own language definition. Even if it only differs slightly from the DX9 HL Shading Language, and even if competitors are able to build in their plugins, this is VERY VERY VERY bad for the industry, as then three APIs are competing for support, of which Cg is controlled by the market leader. I see this is hard to understand for fanboys ("You guys are so pessimistic when it comes to NVIDIA").

I guess having the inside information and every detail straight from NVIDIA that the PC.IGN people weren't privy to, and knowing full well what is going on, automatically gets me branded as a fanboy (guess this word isn't banned anymore, heh) by someone who thinks he clearly knows more about the situation than I do. :cry: Bummer.

I still think you guys should wait until tomorrow before making judgements or flinging insults at people. That's just MHO anyway.

Edit: Doh, guess it is banned for me. :p
 
Doomtrooper said:
If I'm wrong I'll eat my own words :)

Better than declaring you'd chop off your poor chap :p and eat it with salsa sauce..

Anyone know the update on that, or has the offer been withdrawn? ;)
 
Withdrawn nothing! How can one withdraw a bet with any sense of honor? "PowerVR" still has a few months to pull through for Teasy. ;)
 
Just a few thoughts:

Assumption 1: cg is a fully open standard with specs available to anyone and no lingering IP issues. Everyone can implement a compiler that either generates DX9 or OGL code, or code that interacts directly with the driver in any way it sees fit.

Assumption 2: Writing a compiler that works well on hardware with limited programmability is difficult because of the high level nature of cg. (Many possible ways to solve a problem that are difficult to map efficiently to the underlying hardware).

Assumption 3: nVidia's next chip focuses more on programmability than any competing product; hence (per assumption 2) it's easier to write an efficient compiler for the next-gen nVidia chip than for any other chip for the time being. Its output still works reasonably well on other products, but best on nVidia hardware.

Assumption 4: cg significantly eases development (therefore shortening development cycles) for the developers thanks to its high level nature.

Conclusion:
- cg is an open standard which is generally considered a good thing. It's superior to DX on that account.
- cg cuts down development time for all hardware platforms, therefore it's a good thing for the industry. That it doesn't work as well on non-nVidia platforms is unfortunate, but all of us power users know that the other companies are just inferior to nvda regarding chip design abilities and driver quality. Tough luck. (No flamewars over this remark, please.)

So, despite there technically being nothing wrong with the cg offering, nVidia will still be the primary beneficiary.
 
Heh...lots of assumptions there. ;)

As I said before, just because something is done in the best interests of "The Company" doesn't mean it can't be good for the industry as a whole. If those assumptions above turn out to be "facts", then no one would have an issue with Cg. However, I do believe that no company would do this unless they believed that they WERE the primary beneficiary, directly (through licensing costs and IP control) and/or indirectly, in that Cg would simply accentuate their "superior" (*ahem*) hardware, as you dictated above.

That being said...I think Assumption number one is highly unlikely to turn out to be true.

Now, I don't think NVidia should be criticized outright if number 1 isn't true. It's not reasonable or "fair", IMO, to expect any company to spearhead an effort like this "for free" with no strings attached. It's reasonable to expect some "strings". The debate, IMO, is going to be about how "fair" the strings are....
 
Joe DeFuria said:
Withdrawn nothing! How can one withdraw a bet with any sense of honor? "PowerVR" still has a few months to pull through for Teasy. ;)

I won't think less of Teasy if he withdraws his promise.
If PowerVR fails to deliver by the time the deadline is up (August 2002, from what I remember), and Teasy does go ahead with chopping off his own penis (surely it's OK to say penis, it is a medical term, no?), then I will definitely think less of him... hehehe

The NDA's been lifted...? Read a few articles and all I can say is SMOKIN'

Even ILM and the co-founder of SGI have 'bigged up' Cg. I think NVIDIA are going to benefit enormously from Cg: as developers begin to use it, more and more games will require the newer hardware to run. Hopefully this will drive gfx card sales too. Come on, the TNT2 (or was it the TNT2 M64?) was NVIDIA's highest-selling 'product' last year!

That 60% market share means diddly squat to gamers if most people are using ATI Rage, NVIDIA TNT2 M64 or Sis 315 in their machines!

That's why I applauded NVIDIA's NFORCE platform... raising the minimum standard to DX7. Now let's try and raise it again to DX8-plus.

I am sure programmers, and crucially artists, are counting their lucky stars too, as most have no interest in assembly or in trying to learn it. :)

EDIT: typos
 