3DLabs Cg Rebuttal

Doomtrooper said:
Standard PR for EVERY company is to hype up what you have and downplay what you don't

You said it best, and Nvidia doesn't have good relations with the ARB or MS :)

What has that got to do with what I mentioned before? And where did you get the idea that NVidia has a poor relationship with either ARB or MS? Proof please.

Doomtrooper said:
My point is not with developers. My point is 'how easy' DemoCoder and Gking are trying to make CG sound for ATI and Matrox and 3Dlabs to implement. Yet again, for the fifth time: there has been no press release stating so; in fact, there has been a rebuttal :rolleyes:

Well, you should focus on developers, because they are the ones CG is aimed at. Try to understand that dev teams generally do NOT wish to lock themselves down to a single platform. The simple reason for this is that having a game run on as many platforms as possible is a good thing for them. With this in mind, NVidia knows that if CG locks developers in or prevents them from easily optimizing their code, then developers are NOT going to use it.

Exactly what kind of press release are you looking for? ATI stating that CG is a great innovation that they feel they could contribute easily to? A press release stating that CG is nothing? The first is nonsensical since NO press release has ever praised a competitor. The second will come IF ATI feels they need to address it, much like Codeplay did.

I stand by my opinion that when Opengl 2.0 and Dx9 HLSL is released CG will be just a memory.

Then why are you so worried?

dksuiko said:
Unless the single body is ATI, right Doomtrooper? :)

You think so? ;)


Edit for grammar
 
I stand by my opinion that when Opengl 2.0 and Dx9 HLSL is released CG will be just a memory.

And my guess would be that this will happen a lot faster now that Cg has been released.

Cg has put pressure on the other vendors, which usually means they will try even harder to come up with:

their HLSL compilers
OpenGL 2.0 HLSL
and so forth

It's like a soccer game, for example. Usually, one team needs to score a goal before things start to happen.
 
Unless the single body is ATI, right Doomtrooper?

hehe...Come on Doomtrooper, you know...and I know...that it would have been a totally different deal had ATI come forward with a similar compiler.

Let's just see how this thing pans out...If, at the end of the day, this tool stimulates developers to the point where we _finally_ have games employing DX8 functionality...then who would complain?

I just keep coming back to the fact that after more than a year, we have essentially squat to show for any DX8 features in actual games... Whatever it takes, at this point, is what I'm looking at.
 
Well, I am no game developer guru, but... it is entirely possible nvidia held this conference on Cg for developers (with all the bells and whistles, free drinks ;) and food, etc.) and managed to get these fellows to give their first impressions on Cg.

It is also possible, as Doomtrooper has pointed out, that they received a bit of cash for their pockets in the process. BTW, I didn't see John Carmack on that list of developers that support nvidia's Cg. If he were there, I guess I would have to say that Doom was incorrect, as JC, as far as I know, is not up for hire (taking bribes). So it is IMHO very possible that nvidia lined the pockets of a few developers to get a desired outcome. (God knows they have the cash.)

Another good point that Doomtrooper brings up is the fact that no other graphics chip company has praise for nvidia's Cg. In fact, we have an official rebuttal from 3Dlabs. I would very much like this Cg to be good for the entire industry, but I have serious reservations about that.

Why would nvidia speed up development of software so that it runs just as well on other companies' hardware? That would truly be an act of altruism on nvidia's behalf. (From what I know, there really is no such thing as altruism... but that is another argument.)

I guess I will wait to see what performance disparities, if any, there are on other companies' hardware before I outright claim that Cg is proprietary in some way. But until then I will be wary of the possibility that nvidia has done something altruistic for developers.

Further, I will also wait and see if ATi has any sort of rebuttal to Cg. If the software is to be a standard, I don't see why nvidia did not work with the OpenGL ARB to make this "developer tool". Again, it is a wait-and-see thing.

Sabastian
 
DemoCoder said:
Nope, nothing. Except that the interface doesn't exist in D3DX's compiler API right now for vendors to plug into the backend, at least from what I've read.

Well, if MS stipulate that it can't be a runtime compiler, then there will be no possibility of anyone optimising - just generic DX code. However, if MS do use a runtime compiler, then that opens the door for possibilities like that. I would guess it depends on whether MS feel confident that the hardware vendors won't cock up their compilers.
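As an aside, here's a minimal sketch of what that runtime-compile door looks like, using the Cg runtime API (cgCreateContext / cgCreateProgram / cgGetProgramString - the profile names and exact calls are from memory of the toolkit, so treat it as illustrative, not gospel). The point is that because the source is compiled at run time, the profile - and potentially a vendor-supplied backend - can be chosen per machine:

Code:
#include <cstdio>
#include <Cg/cg.h>

// One Cg source, compiled at runtime for whichever profile the installed
// hardware supports. Runtime compilation is what makes per-vendor code
// generation possible at all.
static const char *kSource =
    "struct VOut { float4 pos : POSITION; };\n"
    "VOut main(float4 pos : POSITION, uniform float4x4 mvp) {\n"
    "    VOut o; o.pos = mul(mvp, pos); return o;\n"
    "}\n";

int main()
{
    CGcontext ctx = cgCreateContext();

    // CG_PROFILE_VS_1_1 targets DX8 vs.1.1; a better profile could be
    // swapped in on better hardware without touching the source above.
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kSource,
                                     CG_PROFILE_VS_1_1, "main", 0);
    if (prog)
        std::printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    else
        std::printf("compile failed: %s\n", cgGetErrorString(cgGetError()));

    cgDestroyContext(ctx);
    return 0;
}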

DemoCoder said:
Again, here's the current dilemma:

1) You are building a game engine and need to target near-term hardware that's in the marketplace *TODAY*

2) You don't want to write assembly code.

I wasn't talking about now, or NVIDIA being 'evil' - I was offering a potential solution to MDolenc's question!

Sabastian said:
BTW, I didn't see John Carmack on that list of developers that support nvidia's Cg. If he were there, I guess I would have to say that Doom was incorrect, as JC, as far as I know, is not up for hire.

Two things there. First, I should imagine JC is currently too busy to concentrate on much else other than DoomIII right now - AFAIK he generally goes through periods of 'study' and research in between game engines. Second, Cg doesn't answer the basic underlying problem that I imagine JC has with OpenGL and shaders, which is that there is no single path yet; once the ARB shader routines are out and Cg supports them, it will probably be more interesting to him.

We've yet to hear any comments from JC about the OpenGL 2 movement either, and I'm very interested in what he has to say about it. Of course, it could also be the case that Carmack simply prefers the assembly!
 
Yeah, you are likely right with regard to JC having more pressing matters. But until a name like his is attached to Cg and other hardware manufacturers endorse it, there is good reason to be sceptical.

Sabastian
 
Let's try it again from the beginning...
Creating and using vertex & pixel shaders in D3D works like this:
Shader assembly -> D3DXAssembleShader -> Compiled object code
D3DXAssembleShader is a helper function, and it DOES NOT know anything about the hardware. It assembles a shader from readable assembly language into binary form.
It is EXACTLY THE SAME with ANY high-level shading language:
DX9 High Level Shading Language -> D3DX -> Compiled object code
Cg -> Cg compiler -> Compiled object code
Then you need to pass this compiled object code to Create*Shader, where it is processed by the Direct3D runtime and only then passed to the driver. Now, and ONLY NOW, is the driver allowed to OPTIMISE the shader.
You can only make general optimisations in assembly, and any high-level shading language will only ever be able to make general optimisations. It is totally up to the driver what it does with the compiled object code.
You also CANNOT pass anything non-standard through Create*Shader, and if you do, Create*Shader will fail!
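For anyone who hasn't written one, here is a minimal sketch of that exact path in DX8-era C++ (signatures recalled from the DX8 SDK, so treat the details as illustrative rather than exact):

Code:
#include <d3d8.h>
#include <d3dx8.h>

// Assemble a trivial vertex shader and hand it to the runtime.
HRESULT CreateShaderFromAsm(IDirect3DDevice8 *device, DWORD *handle)
{
    // Step 1: D3DXAssembleShader - hardware-agnostic. It only turns
    // readable assembly into the binary token stream.
    const char src[] =
        "vs.1.1\n"
        "dp4 oPos.x, v0, c0\n"
        "dp4 oPos.y, v0, c1\n"
        "dp4 oPos.z, v0, c2\n"
        "dp4 oPos.w, v0, c3\n";

    LPD3DXBUFFER code = 0, errors = 0;
    HRESULT hr = D3DXAssembleShader(src, sizeof(src) - 1, 0,
                                    0, &code, &errors);
    if (FAILED(hr)) return hr;

    // Vertex declaration: one stream, float3 position in v0.
    DWORD decl[] = {
        D3DVSD_STREAM(0),
        D3DVSD_REG(D3DVSDE_POSITION, D3DVSDT_FLOAT3),
        D3DVSD_END()
    };

    // Step 2: Create*Shader - the runtime validates the tokens and only
    // then passes them to the driver. THIS is where the driver may optimise.
    hr = device->CreateVertexShader(decl,
             (const DWORD *)code->GetBufferPointer(), handle, 0);

    code->Release();
    if (errors) errors->Release();
    return hr;
}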

How many of you here have actually tried to write even one simple pixel or vertex shader? None? And yet you are talking about HOW developers, Microsoft, NVidia and others should and should not optimise shaders...
 
And do you know what NVidia's vertex programming extension looks like? It looks exactly like DX8 vertex shaders. If NVidia had succeeded with their proposal, DX8 vertex shaders and OpenGL vertex programs would be natively compatible (you could write vertex shader assembly code in DX8 and it would work in OpenGL as well). It is also interesting how long the ARB needed to accept a standard ARB extension for render-to-texture operations. I think it was around 2 years!?
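For illustration, the same trivial transform in both syntaxes (written from memory of the two specs, so take the exact spelling as approximate):

Code:
; DX8 vertex shader assembly
vs.1.1
dp4 oPos.x, v0, c0
dp4 oPos.y, v0, c1
dp4 oPos.z, v0, c2
dp4 oPos.w, v0, c3

!!VP1.0 # NVidia NV_vertex_program
DP4 o[HPOS].x, v[OPOS], c[0];
DP4 o[HPOS].y, v[OPOS], c[1];
DP4 o[HPOS].z, v[OPOS], c[2];
DP4 o[HPOS].w, v[OPOS], c[3];
END

Same instructions, same register model - only the spelling differs.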
Now who is holding up developers?
 
:eek:

I wouldn't mind if ATi did something like this, purely because ATi doesn't have as close a relationship with Microsoft as Nvidia does, and wasn't involved in the mysterious downfall of a company with a product only 1 or 2 months from release, only to suddenly buy all their IP and snub the remaining customers (yes, I'm still sore over 3dfx). But, of course, that's a whole other thread. :-?

But that's what I think, and I'm willing to wait and see how Cg plays out. Frankly, I have to ask: why would this small spark of controversy start unless there were things to argue over? Surely if it were all fine and dandy we wouldn't have this thread. :eek:
 
Don't be naive. Praise is cheap. As someone said, standard PR. Gathering a few developer quotes - that's easy.

Matt Burris said:
Point me to ANY of the major 3D players in the industry saying CG is good for them, then maybe you will have an argument !!

Well, you've got Kurt Akeley praising it, and even though he is a part-time employee of NVIDIA, he is also the co-founder of SGI and OpenGL, which should say a lot right there. Then you have these guys, the guys who are responsible for making Cg either fail or succeed. There are a lot of major 3D players in there, if you ask me.
 
MOf1l9V said:
Don't be naive. Praise is cheap. As someone said, standard pr. Gather a few developer quotes, thats easy.

That's funny, I counted many more than just a few. Time to go back to Kindergarten! ;)

Praise is cheap? Are opinions cheap too? Is excitement cheap as well? :rolleyes:
 
Doomtrooper said:
Then why are you so worried?

Worried? LOL, dude, I've got A LOT more important things in my life than a video card compiler. I'm stating my OPINION just like you; if you don't like it, don't read it.
My opinion obviously reflects 3Dlabs and the OGL ARB, so am I crazy? Sure - only to the Nvidia camp :LOL:

So now, if people don't agree with your opinion, it's not OK to discuss it with you? NOWHERE in my post did I intimate that you shouldn't post. There's no need to be defensive, but it seems to me that your 'search' for the truth on CG was really just to confirm that NVidia = the devil, hence CG = the devil.

And I see you haven't answered ANY of my points btw.

Read what Reverend wrote: if you want CG, you can try it out now; if you want OGL 2.0, then what, as a developer, are you going to do?

Just remember, ATI is a VERY big player; they too could afford to push out their own compiler.
 
Cg is obviously a competitive weapon. If this had been a development the whole industry wanted, Cg would have been designed in cooperation with the other manufacturers, or at the very least the other manufacturers would have been kept up to date on Cg's development so they could prepare their own backends.

This is not what has happened.

That is not to say that Cg is useless. If it were useless, nVidia couldn't hope to garner any support from developers. But it is equally obvious that their competitors have been kept in the dark. Small wonder that they are not thrilled. It is also interesting that nVidia feels the need to make misleading claims about OGL2... Now why would they want/need to do that...?

Entropy
 
Entropy said:
It is also interesting that nVidia feels the need to make misleading claims about OGL2... Now why would they want/need to do that...?

Good catch, Entropy.

Regardless of the claims that ATI was driving DX9 development, I'm fairly confident that nVidia is betting on Microsoft (and DX9/10 + HLSL) over OpenGL to define the future of high-quality real-time 3D.

Please read the fuss on www.extremetech.com again and take note of this:

We just discovered important OpenGL news in talks with 3Dlabs today-- at last week's OpenGL ARB meeting, Microsoft disclosed IP that may be setting OpenGL 1.4 shader extension development into a holding pattern. The Microsoft IP is assumed for DX8, and relates to the mechanism used to implement the ARB_vertex_program assembly language extension (even though the extensions themselves were derived from Nvidia OpenGL shader extensions). At this juncture, the future of generalized programmable shader extensions to OpenGL 1.4, 1.5, etc. is up in the air.

This is related to OpenGL 1.x, but the move shows that Microsoft is [again] taking OpenGL head-on. How do nVidia's misleading claims about OpenGL 2.0 come into this, then? Well, if they are following/have teamed up with Microsoft, they want OpenGL 2.0 to be much more like DX9/10 HLSL than like 3Dlabs' proposal.
 