Well if Rendermonkey supports PS 1.1-2.0, what tool is more complete ??
This is just too funny. Come on dude, do I really have to take the time and dig out your quotes/responses to CG?
Man, if it ain't the pot calling the kettle black. Before, it was just a _bad_ idea...not good for the industry as a whole, because you wouldn't want _any_ 3D chip maker to spec any sort of standard. OK, I can see the point of view on that...and you continued on, stating all the _bad_ things about CG...things that went totally beyond, for example, the technical aspects of CG.
Now, all of a sudden, it appears that ATI is going to bring a similar solution to the table, which _totally_ goes against EVERYTHING you just stated a mere 2-3 weeks ago! Rather than continue in the same manner as 2-3 weeks ago, now it's suddenly...
Well if Rendermonkey supports PS 1.1-2.0, what tool is more complete ??
Didn't detect one ounce of dissatisfaction with the direction that ATI is apparently pursuing whatsoever. I mean, come on...a spade is a spade.
If you're going to have a strong opinion on a given issue...and make it very clear, beyond a shadow of a doubt...You just cannot possibly, all of a sudden, go "Yeah...This is the most feature complete tool now!"
OK, just for the heck of it...
and it's still very silly to introduce another HLSL format with DX9 and OpenGL 2.0 (with 3Dlabs leading the way) coming soon with their own...makes absolutely no sense whatsoever and IMO shows Nvidia has lost its influence with MS and the ARB.
I see no reason for CG being here. If Nvidia and Microsoft are seeing eye to eye, why not just assist in developing the DX9 HLSL??
Nvidia has designed this language to be optimized for their chips' code path, as pointed out by Codeplay...so a game developed on this will run optimally on Geforce hardware..hmmm
Another example of an optimization that another vendor would not get
Who isn't for better graphics in games etc.??? I want it now...but I also don't want a graphics card company dictating the rules or getting any special treatment through developers, be it specific code enhancements etc. I would rather wait for the standards that have been in place for a very long time, OpenGL ARB and DirectX; at least this way there is a table where everyone's interests are heard.
The day Nvidia is calling the shots on the overall graphics industry is the day PC gaming dies...we don't need three compilers, and it will fail...mark my words.
When a graphics card company tries to market a compiler that will give developers ease of programming with their OWN cards and does not include proper docs on how to implement profiles, when Codeplay states it's optimized specifically for Nvidia only, when Renderman software already does it in 3 lines of code vs. CG's 2, when DX and OpenGL 2.0 are releasing their own HLSLs...it's an attempt to get developers to buy into this CG HLSL so games will always perform better on Nvidia-based cards, while the rest of the video cards will be underperforming due to optimizations their cards are not getting. Since you own a Nvidia card it's not a big deal, but for myself and thousands of other non-Nvidia users this means SQUAT to us.
There is nothing wrong with the current standard AT ALL
Here's the absolute kicker...
I would never want ATI to own any standard on HLSL; I would expect them, like any other company, to go through the proper channels like they HAVE been doing...they were appointed to full membership on the ARB because ATI played fair. Having a graphics card manufacturer develop software for game developers, optimized for themselves, when a standard is coming in 6 months, is also a MONOPOLISTIC move IMO.
Edit: Let me just state that all of these quotes came from one single thread @ Rage3D concerning CG...so it wasn't as if I had to dig around the net finding these quotes. *If* this turns out to be ATI's answer to CG (and we really don't know at this point), I'm not entirely sure I necessarily disagree with it...
Remember what Carmack said in his last .plan: the last thing we need, at this stage of the game, is a bunch of different tools all trying to achieve the same thing. I think the downside to this is that it makes things a little more difficult for developers...The flip side, however, is that it could be the best-case scenario for the likes of ATI/nVidia, since it would presumably allow developers to really optimize routines for the 2 biggest 3D players. It would ultimately come down to the quality of the compilers and/or tools...If they're both _very_ easy to use, then it shouldn't be a problem...But it would be really nice if, somehow, a Universal/Unified shading language could be developed/adopted by all (ATI/nVidia/Matrox/etc.) that could also be utilized in both OGL and D3D.