SIGGRAPH RenderMonkey Details and More...

Status
Not open for further replies.
There are no conditional branches in pixel shaders on NV30. R300 has SIN/COS. The only thing NV30 has extra is the derivatives.


Secondly, RenderMonkey is just an IDE for editing a high-level scripting language in XML format that represents the composed script actions. It is no different than Photoshop's ability to create an enormous number of effects by composing simple action scripts: apply convolution, do histogram, run warp, crop, blend layer, etc.

So RenderMonkey *IS* a programming language, it's just a really high level one with an IDE to generate the script code. RenderMonkey then takes the XML language representation of this script and generates DX9 HLSL or OpenGL HLSL. It is no different than Cg in this regard, since Cg will also generate either DX9 or OpenGL compliant execution code.
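As a sketch of what that kind of XML-to-HLSL generation might look like (the XML schema, tag names, and emitted shader skeleton here are entirely hypothetical, since ATI hasn't published RenderMonkey's actual file format):

```python
# Hypothetical sketch of a declarative-XML-to-HLSL generator, in the
# spirit of what RenderMonkey is described as doing. The XML schema and
# the emitted HLSL are invented for illustration only.
import xml.etree.ElementTree as ET

EFFECT_XML = """
<effect name="diffuse">
  <pass>
    <op kind="texsample" sampler="base" coords="uv" out="albedo"/>
    <op kind="mul" a="albedo" b="lightColor" out="result"/>
  </pass>
</effect>
"""

def emit_hlsl(xml_text):
    """Walk the declarative op list and emit procedural shader text."""
    effect = ET.fromstring(xml_text)
    lines = ["float4 %s_ps(float2 uv : TEXCOORD0) : COLOR" % effect.get("name"), "{"]
    for op in effect.iter("op"):
        if op.get("kind") == "texsample":
            lines.append("    float4 %s = tex2D(%s, %s);"
                         % (op.get("out"), op.get("sampler"), op.get("coords")))
        elif op.get("kind") == "mul":
            lines.append("    float4 %s = %s * %s;"
                         % (op.get("out"), op.get("a"), op.get("b")))
    lines += ["    return result;", "}"]
    return "\n".join(lines)

print(emit_hlsl(EFFECT_XML))
```

The same walk could just as easily emit GLSL or shader assembly, which is the architectural point: the XML is the real source language, and the textual target is interchangeable.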


Where RenderMonkey falls flat is that it is only a prototyping tool. There will be things you simply can't do in its high-level XML batch language that you can do by writing shaders directly, since it is a declarative language instead of a procedural one.

RenderMonkey is really just an artist tool, and may be useful for artists to prototype effects, but when it comes time for Tim Sweeney or John Carmack to include the effect in their game, they will probably hand code it in HLSL.


Still, it's hypocritical for the ATI fanboys to defend it on one hand, and attack NVidia on the other.

Finally, I still assert that DX9 is not isomorphic to the R300's feature set.
 
Doomtrooper said:
It's getting out of hand here, there are no rules anymore

Dr. Peter Venkman: This city is about to face a disaster of biblical proportions.
Mayor: What do you mean, "biblical?"
Dr. Raymond Stantz: We mean real wrath-of-God type stuff. Plagues, darkness--
Winston Zeddemore: The dead rising from the grave!
Dr. Egon Spengler: Forty years of darkness! Earthquakes, volcanoes--
Dr. Peter Venkman: Riots in the streets, dogs and cats living together, mass hysteria!

DemoCoder said:
All Cg2.0 will do for NV30 is allow it to generate longer programs without going to multipass, the exact same thing a programmer would have to do if coding by hand!
I do not think this is a fight that will ever end. Intel makes their own compilers; boo hoo, AMD. If Microsoft made a Visual DX++, all hell would break loose regarding the end of OpenGL! Frankly, masturbating with sandpaper is a bit more exciting than reading some of these posts on Cg/RMonkey.
 
That doesn't make sense to me. RenderMonkey will produce HLSL code?!

It is no different than Cg in this regard, since Cg will also generate either DX9 or OpenGL compliant execution code.

That would be quite different from Cg, because Cg doesn't create HLSL code. It creates "assembly" code for DX9/GL, compiled from the Cg HLSL.

I still don't believe that RenderMonkey produces the HLSL code rather than assembly (or the code that the calling app needs to generate the assembly)...I'll try and look over the docs myself, but could someone else verify that claim?

Still, it's hypocritical for the ATI fanboys to defend it on one hand, and attack NVidia on the other.

And I think it's hypocritical for nVidia fan boys to say that Cg and RenderMonkey are fundamentally different things on one hand, and then claim that one can't criticize one approach over another.
 
RenderMonkey is a code generator. RenderMan, for example, is listed as one of the targets in ATI's presentation. All it does is take a declarative, high-level XML batch file format and turn it into procedural HLSL code. RenderMonkey doesn't have a language itself. I have written numerous such XML compilers myself for generating web sites in multiple languages (HTML, JavaScript, VBScript, WML, Flash, VoiceXML, etc.).


The point is, RenderMonkey has its own programming language, the XML format it uses to store the special effect. Cg also has its own programming language. The only difference between the two is that RenderMonkey is higher level, but it's still proprietary in the same way that Cg is, unless they are donating their XML language to ARB for DTD standardization.

So yes, it is both the same and different. It is the same in that both are trying to use a high-level proprietary language and transform it into multiple programming languages. The only difference is that Cg generates assembly and RenderMonkey doesn't necessarily. But it's a distinction without a difference, architecturally.


Point is Joe, that you, and the other anti-Cg people are attacking NVidia for shipping a multi-api/multi-language tool, but praising ATI for doing the same. Whether RenderMonkey generates assembly or HLSL is irrelevant. The idea is that developers use ATI's IDE, which generates an XML file, which is converted to another language.

Therefore, the shaders designed with ATI's tool are *Locked into ATI's proprietary file format/language* just as Cg shaders are locked into NVidia's syntax. ATI controls the RenderMonkey tool, and the XML DTD that it uses.


I find it almost breathtakingly useless to even debate half the people on this board, given their refusal to post any factual information about Cg or DX HLSL (instead of vacuous speculation) and their failure to understand even basic development practices.
 
Point is Joe, that you, and the other anti-Cg people are attacking NVidia for shipping a multi-api/multi-language tool, but praising ATI for doing the same.

1) I am attacking the Cg "language" (not any other aspect of Cg), because I don't see any point to it considering DX9 HLSL and OpenGL 2 are on the horizon, if not imminent.

2) Where is ATI's "C-like, procedural" language?

What I can offer is ATI's "Motivation" for RM:

Create an environment that is language agnostic allowing ANY high level shading language to be supported via plug-ins

Also,

The Importer plug-ins allow support for any INPUT shading language: RenderMan, DirectX, GL, Maya...

So it seems to me that RenderMonkey can take HLSL as the INPUT, not the output as you describe.

According to the Workflow, HLSL can be used as input, RenderMonkey Compiles, the streams are fed back to the "application" which feeds the results to the driver. Within the RenderMonkey compiler is the option to target the resultant code for DX or GL.

That seems pretty different to me than what you described.
 
Finally, I still assert that DX9 is not isomorphic to the R300's feature set.

We'll have to agree to disagree about instruction count, then, until more information is available.

There are probably some subtle differences; however, there is nothing in any of the ATI SIGGRAPH shading presentations that indicates that the 160-instruction count is a no-strings-attached number.

The first slide about pixel shaders in Mitchell's presentation is either put together incorrectly, or asserts that Radeon 9700's pixel shaders support 64 ALU instructions, and DirectX 9 isn't mentioned anywhere in the course notes for ATI hardware shading.

SIN/COS aren't mentioned as hardware-implemented ALU instructions, but the wording about instruction sets on both presentations doesn't necessarily rule out the possibility. However, given the nature of the presentations, I would believe that a company would want to publish as many positive aspects about its products as possible.
 
JoeHoe said:
1) I am attacking the Cg "language" (not any other aspect of Cg), because I don't see any point to it considering DX9 HLSL and OpenGL 2 are on the horizon, if not imminent.
That's absurd. Why "attack" Cg at all when ultimately it is a choice? Developers can choose to use it or ignore it. NVIDIA offers it; they don't force-feed it down developers' throats.
 
Revnerd,

"Attack" wasn't the right word to use. (DemoCoder accused me of "attacking" nVidia, so I used the same word but the point is that my "problem" is with the Cg language.) I just don't see the POINT or VALUE of it (beyond other HLSLs), other than being a means to expose nVidia features (like GL proprietary extensions are). And as I said before several times I don't have any problem with that. Who could fault nVidia or any other vendor for providing a means to access their hardware's features?

I would have a problem with the language if it evolves into a standard, and control of it remains isolated to a single IHV. Especially given the emergence of non "single IHV" controlled languages like OpenGL and DX9.
 
In order for anything to evolve into a standard, it would need widespread adoption, would it not? I say, let the industry regulate itself.

Back when Glide was a superior API for game developers, it was widely adopted. When the competing APIs improved and 3dfx began to stagnate, Glide was largely abandoned in favor of other APIs.

If Cg offers tangible benefits to developers (who are trying to reach the widest possible audience), it will be adopted. If it does not, no one will bother. If it is adopted but over time loses its usefulness to developers, it will be abandoned in favor of something else.
 
Well, on a tangent, one undeniably good thing Cg will do is provide a convenient tool for Xbox designers, such that they can view their shaders in Maya and other relevant software.

I don't think many people can argue that what's good for the Xbox is good for our short-term gaming prospects, since by and large few targeted DX8 games are readily available.

Long term, I tend to agree with SA that these sorts of languages are going to converge to a research consensus.
 
Joe, regarding your last paragraph (and I mean the exact words in your last para) - it will never happen. You can stop with your needless "worrying", if you are in a position to worry about this.
 
Reverend said:
Joe, regarding your last paragraph (and I mean the exact words in your last para) - it will never happen. You can stop with your needless "worrying", if you are in a position to worry about this.
How do you know?
Not to be a dick, but how can you be sure?
Why should i trust that YOU alone can stop this from happening?
Monopolies are the inevitable byproduct of capitalism.
 
DemoCoder said:
There is no evidence, NOT ONE IOTA, that Cg will only work on NV30.


Secondly, are you people complete idiots? The only functional difference posted so far between NV30 and DX9 pixel shaders is the increased instruction count. The R300 is also functionally different than DX9. DX9 specifies that pixel shaders are a maximum of 96 instructions long, but the R300 extends this to 160!

All Cg2.0 will do for NV30 is allow it to generate longer programs without going to multipass, the exact same thing a programmer would have to do if coding by hand!
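A toy sketch of that multipass fallback (the spill/reload pseudo-ops and the greedy packing are invented for illustration; a real compiler or hand-coder would also have to track register liveness across the split):

```python
# Toy sketch of splitting an over-long shader into multiple passes, as a
# hand-coder (or compiler back end) would when exceeding the per-pass
# instruction limit. The 96-instruction DX9 limit comes from the post;
# everything else here is hypothetical.
def split_into_passes(instructions, limit=96):
    """Greedily pack ops into passes, spilling intermediates between them."""
    passes, current = [], []
    for instr in instructions:
        # A pass after the first needs a reload slot as well as a spill slot.
        reserve = 2 if passes else 1
        if len(current) >= limit - reserve:
            prefix = ["reload_intermediate"] if passes else []
            passes.append(prefix + current + ["spill_intermediate"])
            current = []
        current.append(instr)
    prefix = ["reload_intermediate"] if passes else []
    passes.append(prefix + current)
    return passes

shader = ["op%d" % i for i in range(200)]   # a 200-instruction shader
passes = split_into_passes(shader)
print(len(passes), [len(p) for p in passes])  # 3 passes, none over 96 ops
```

The point of the sketch is the cost: every extra pass re-reads the intermediate from a render target, which is exactly the overhead a longer native program limit avoids.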


Doomtrooper, you don't sound like a developer, so why are you even participating in this discussion?

It's my thread, Coder, and it was supposed to be about RenderMonkey, but since it's degraded to the usual trash... you can post whatever biased drivel you wish.
I care as a gamer and a consumer, and you don't need to be a developer to take part in discussions here; in fact, looking at what I've seen posted here by 'developers', I see why monopolies exist.
It's your thread now... have fun with it.
 
Joe DeFuria said:
I would have a problem with the language if it evolves into a standard, and control of it remains isolated to a single IHV. Especially given the emergence of non "single IHV" controlled languages like OpenGL and DX9.

Okay, maybe it's time to come out with an unpopular view: the worst thing that could happen is that Cg turns into a standard for some but not all. Then other hardware vendors wouldn't be pressed into accepting it and writing their own profiles/compilers. Given that, I would much prefer that Cg turn into a true standard, in which case I think nVidia would be much more pressed to change Cg to follow other companies' requests for features.

And please remember this about Cg: you know the language/code and what it does. You also know the end result in the form of the assembly language, so only the "conversion" to assembly code is kept somewhat in the dark. And why should everyone be so upset about that (which, BTW, you can license and thereby get insight into)?
 
Doomtrooper said:
It's my thread, Coder, and it was supposed to be about RenderMonkey, but since it's degraded to the usual trash... you can post whatever biased drivel you wish.
I care as a gamer and a consumer, and you don't need to be a developer to take part in discussions here; in fact, looking at what I've seen posted here by 'developers', I see why monopolies exist.
It's your thread now... have fun with it.
In case you missed it, I was poking fun at you up above for being overdramatic. You are really giving me too many opportunities to get myself banned.
 
Sorry, but this is just enough! I don't find it amusing at all, Doomtrooper, that you now try to make it look as if Democoder were somehow responsible for the way this thread has evolved. Yes, this thread was about RenderMonkey, and I am highly annoyed that there has been almost no actual RenderMonkey discussion going on, but its degrading into the "usual trash" has just as much or more to do with you than with Democoder or anybody else here. Last time I read through those last 4 pages (and unfortunately a few other, even longer threads where you participated), you were one of the main contributors of the inflammatory political arguing that has become so annoying these past weeks!

At least Democoder constantly tries to bring up technical points when arguing, and judging from the last years of his postings here, he obviously knows a *lot* more about development and programming than most of us combined, supposed Nvidia bias or not. Your constant bickering, however, is pretty much limited to quoting what others said (you must have a huge library of bookmarks for that purpose alone) and making dramatically exaggerated political assumptions and accusations. All of this "end of the 3D world", "there are no rules anymore", etc. of yours is getting so incredibly annoying, ridiculous and out of hand it's not even funny anymore! And if I hear that incredibly ignorant "why can't you people see the obvious truth" statement once more I'm probably gonna go nuts... o_O
 
Gollum said:
Sorry, but this is just enough! I don't find it amusing at all Doomtrooper, that you now try to make it look as if Democoder were somehow responsible for the way this thread has evolved. [...]

Don't like it? Go play checkers... take a hike.
 
You can continue proclaiming your melodramatic opinions all you want; I'll just do my best to ignore them the next time they get out of hand. Just don't try to make yourself look innocent by blaming somebody else afterwards, it's rather pathetic...
 
What's pathetic is that there was not one 'technical' discussion about RenderMonkey... instead it turned into 'Cg roxzors', and in half the posts the poster didn't even read the .pdf.

Just don't try to make yourself look innocent by blaming somebody else afterwards, it's rather pathetic...

:LOL: ...this is better than a soap opera.
 