2 good Extremetech Articles (parhelia and Cg)

Well, they didn't really rebut anything. We still don't know when OpenGL 2.0 will be ready, and having read the OpenGL 2.0 proposals myself, I don't see anything specifying how they are going to make OpenGL 2.0 shaders compile backwards-compatibly for older hardware.

I like 3DLabs' proposal, but let's not forget that they have an agenda as well.
 
Two things

Matrox Reveals Parhelia Particulars

TABLE OF CONTENTS


• Introduction
• Parhelia Game Support
• Beyond 3D: DVD Playback
• What about Linux Drivers?
• Texture Engines, Filter Modes, and Memory
• Surround Gaming vs. Stretching
• Who Needs a Parhelia?

The Parhelia will lead the industry in quad texturing, anisotropic filtering and anti-aliased performance
 
DemoCoder said:
Well, they didn't really rebut anything. We still don't know when OpenGL 2.0 will be ready, and having read the OpenGL 2.0 proposals myself, I don't see anything specifying how they are going to make OpenGL 2.0 shaders compile backwards-compatibly for older hardware.

I like 3DLabs' proposal, but let's not forget that they have an agenda as well.

Yes, and that agenda is governed by a table of participants, not a single body :)
 
You should check out the screens I linked to in Nappe's Par..Par..Parhelia thread. :) The MURCers have been given the green light to post some pics ahead of the NDA.
 
Doomtrooper said:
DemoCoder said:
Well, they didn't really rebut anything. We still don't know when OpenGL 2.0 will be ready, and having read the OpenGL 2.0 proposals myself, I don't see anything specifying how they are going to make OpenGL 2.0 shaders compile backwards-compatibly for older hardware.

I like 3DLabs' proposal, but let's not forget that they have an agenda as well.

Yes, and that agenda is governed by a table of participants, not a single body :)

NVidia is also a member of the OpenGL ARB. 3DLabs' agenda != OpenGL's agenda. 3DLabs' agenda == making money for 3DLabs investors == selling 3DLabs hardware == using standards committees where appropriate.

We all know that Microsoft drove SOAP through the standards committees because they wanted Web Services governed by a single body, right?


Did 3DLabs describe how code I write today for the OpenGL 2.0 shading language is going to work on a GF3/Radeon? Does 3DLabs have a stop-gap measure for today's developers, not 2004's developers? Did 3DLabs really address the criticism that OpenGL 2.0 will be a LONG time coming to market? Do you still not understand that Cg is going to be 100% compatible with DX9 HLSL, and that all Nvidia is really shipping is a tool to generate code for older hardware when you develop using a subset of the language?
 
3Dlabs fully acknowledge that they are pursuing OpenGL 2.0 for their own goal, which is to create a method of giving programmers access to the programmability of the P10 chip. However, when they were deciding how to go about it, they did discuss whether they should do something on their own or take the route they have; ultimately they felt it would be better for them to go the OpenGL route because it could become a standard.

Nothing that 3Dlabs propose will go through the ARB without (at least) a majority of the ARB agreeing on it. Thus far they have had strong support from the majority of the full-time vendors on the ARB for the work they have carried out. So, if 3Dlabs haven't thought of a solution that will cater for the GF3/Radeon, then it's up to ATI and NVIDIA to ensure they do, not necessarily 3Dlabs; however, it's more likely to be ratified quickly if they have. Have 3Dlabs addressed the criticism of being a long time to market? Obviously they have, since they are the most proactive members of the ARB in this case; that can't be a criticism of 3Dlabs, but rather a criticism of the rest of the ARB for not doing anything earlier.

[edit] Incidentally, Apple are currently defining the specification for the ARB shader extensions for inclusion in (probably) OpenGL 1.5, so if it's included there, then OpenGL 2.0 will cover shader support for the GF3/4/Radeon 8500 by default.
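For anyone who hasn't read the white papers, a hypothetical minimal vertex shader in the proposed OpenGL Shading Language looks something like the sketch below (syntax per the 3Dlabs white papers; details may still change before ratification):

```
// Minimal vertex shader in the proposed OpenGL Shading Language.
// Transforms the vertex by the built-in modelview-projection matrix
// and passes the vertex colour through unchanged.
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = gl_Color;
}
```

The built-in `gl_*` state variables are what let the high-level language hide the fixed-function plumbing that you'd otherwise manage by hand.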
 
Let's say you are a smaller player in the market. You develop new hardware that would require a totally new API to take full advantage of. Your options are to create a proprietary API or go through the standards committees.

If you go the proprietary route, you have the problem of convincing developers to USE your API. But since you are a minor player with small market share, there will be no overwhelming reason to target your API. You have a huge problem convincing developers and consumers to buy your hardware platform.


Therefore, your only route is to change the standards through the standards committees to suit your product's features. Then the developer-adoption issue is taken care of, because developers by and large have already adopted the standard.


Now let's say you are first to market, or a market-dominating player. Because you have so much market share, you can pretty much drag developers along, so the pressure to push everything through a standards committee is lower. (BTW, Carmack is in the same position. He can code for the lowest-common-denominator TNT2/V3 "standard", or he can drag people kicking and screaming to the level of performance he wants, forcing them to upgrade. When you are in a dominant position, you can force people to do what you want.)


The downside of standards committees for vendors is that you don't always get what you want. You usually get a "lowest common denominator" or watered-down spec. Thus there is a huge risk that your super-duper, feature-laden hardware will go unused. That's why hundreds of OpenGL extensions exist. Evolution is also a lot slower. Sometimes a dictator is a useful thing (e.g. MS, Intel) for driving the market to standardize quicker, instead of political bickering. Remember sound cards before DirectSound?


But make no mistake about it: the standards committees have nothing to do with the "greater good" or altruism. They are about politics and marketing. It's sad, but most companies don't care about interoperability. The DVD standard, for example, almost disintegrated at one point as each vendor tried to add their own proprietary stuff, even though it was in their financial best interest to have one standard format, to save on the added expense of marketing their own format and distribution chain!

I sit on 2 standards groups. Most of the people there are technogeeks. However, the only reason I am on the group is to make sure the standard reflects our company's product features, and to use the fact that it is a standard to market our product against a proprietary Microsoft one, using "proprietary" as an epithet.


The fact that people keep missing is that, regardless of the inevitability of the OGL2.0 and DX9 shading languages, developers need tools *TODAY*. NVidia simply produced a compiler for DX9 HLSL that can compile shaders for OGL and DX for today's hardware. It does this by telling you to limit the size of your programs and to use a subset of the full language.

There is nothing NVidia specific about Cg. Cg should compile and run under DX9 when it comes out. All NVidia produced was a tool. ATI should produce one also.

Ideally, each and every vendor should have their own compiler for HLSL which can produce OPTIMAL code for their platform, instead of using the generic DX9 built-in one.
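To make that concrete, here is a hypothetical minimal vertex program in the shared Cg/DX9-HLSL-style syntax (the names and the compile commands below are illustrative, not taken from any shipping SDK):

```
// Minimal Cg / DX9-HLSL-style vertex program (illustrative sketch).
// mvp is a uniform modelview-projection matrix supplied by the app.
struct VertOut {
    float4 pos   : POSITION;
    float4 color : COLOR0;
};

VertOut main(float4 pos   : POSITION,
             float4 color : COLOR0,
             uniform float4x4 mvp)
{
    VertOut OUT;
    OUT.pos   = mul(mvp, pos);   // transform to clip space
    OUT.color = color;           // pass vertex colour through
    return OUT;
}
```

Because a program this small stays within the subset of the language that older hardware can handle, the same source could in principle be compiled with NVidia's cgc targeting a DX8-class profile (something like `cgc -profile vs_1_1 shader.cg`) today, and recompiled against a full DX9 profile later without changes.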
 
Let's say you are a smaller player in the market. You develop new hardware that would require a totally new API to take full advantage of. Your options are to create a proprietary API or go through the standards committees.

If you go the proprietary route, you have the problem of convincing developers to USE your API. But since you are a minor player with small market share, there will be no overwhelming reason to target your API. You have a huge problem convincing developers and consumers to buy your hardware platform.

Apart from pointing out the obvious, that doesn't necessarily hold exactly true for 3Dlabs, since they were primarily interested in one market when they began OpenGL2 development, the workstation market, where they have a much larger slice of the pie, so developers are already working with them.
 
Err, no. Jon Peddie, IDC, JPR, Gartner, et al, say NVidia is tops in total board shipments. If you confine it to only "high-end" boards, Peddie says 3DLabs. However, the "high-end" workstation segment is a vastly smaller chunk compared to the workstation graphics market as a whole.

Again, 3DLabs may dominate high-end workstations, but they still don't have the market power to get Alias, Autodesk, et al, on board to support a proprietary API. Therefore their only recourse is to push changes into OpenGL specifically tailored to their new architecture, in order to get the software vendors on board.


I'm still waiting for someone to suggest what a developer should do right now, not 2 years from now, to avoid assembly-language programming for today's hardware. He has two options: build his own high-level abstraction tool, or adopt a third-party one.
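For anyone who hasn't written shaders at the assembly level, this is roughly what "today's hardware" looks like without a high-level tool: even a bare matrix transform is four dot products against hand-assigned constant registers (an illustrative sketch of a DX8 vertex shader):

```
; DX8 vertex shader assembly (illustrative): transform the position
; by the modelview-projection matrix stored in constants c0-c3,
; and pass the vertex colour (v1) through as diffuse output.
vs.1.1
dp4 oPos.x, v0, c0
dp4 oPos.y, v0, c1
dp4 oPos.z, v0, c2
dp4 oPos.w, v0, c3
mov oD0, v1
```

The same transform in a Cg/DX9-HLSL-style language is a one-liner, `return mul(mvp, pos);`, which is exactly the productivity gap a high-level tool exists to close.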
 
Doomtrooper said:
I don't think Dx9 HLSL is 2 years away Democoder....

It's not only about 'when' though. It's about time. Saving it means getting your product out the door that much quicker. Saving time means adhering to your budget and being able to keep your dev team employed. Though this may be greatly exaggerated, try to imagine yourself as a dev team building a house without any tools. Cg comes along in the form of a hammer. Then again, it seems to me that you'd prefer to continue pounding the nails in with your forehead. Gotta use it for something, eh? ;)
 
Ty said:
It's not only about 'when' though. It's about time.

That's fine and dandy, but the DX9 beta is already available. It was most likely available before Nvidia's Cg too. I'd think it's in better shape than Cg as well, considering it will be released soon(ish) [~4 months] and most likely has a larger development team working on it.

Now, given that DX9 HLSL is available before Nvidia's Cg, and has more support, which would you choose?

I don't think Cg would save a true game developer any time at all!

--|BRiT|
 
DemoCoder said:
Err, no. Jon Peddie , IDC, JPR, Gartner, et al, say NVidia is tops in total board shipments. If you confine it to only "high-end" boards, Peddie says 3DLabs.

On what planet does the phrase 'larger slice of the pie' equate to 'tops in total board shipments'? I was talking in relation to any other market.
 
Ty said:
Doomtrooper said:
I don't think Dx9 HLSL is 2 years away Democoder....

It's not only about 'when' though. It's about time. Saving it means getting your product out the door that much quicker. Saving time means adhering to your budget and being able to keep your dev team employed. Though this may be greatly exaggerated, try to imagine yourself as a dev team building a house without any tools. Cg comes along in the form of a hammer. Then again, it seems to me that you'd prefer to continue pounding the nails in with your forehead. Gotta use it for something, eh? ;)

As I've said many times, I'm not against tools for developers; I'm against tools developed by a video card company to optimize for their product only (PS 1.1, etc.).
I also see that the only people on this forum who are backing it are Nvidia employees and the usual suspects, fan-site webmasters and long-time Nvidia fans ;)
Now Nvidia fans think Nvidia is ABOVE OGL and DX... pathetic. If my point of view is so wrong, why was the rebuttal even published? Think about it. :-?
 
I don't think they've 'rebutted' anything specific about the concept of Cg, other than pointing out its proprietary nature; the rebuttal was more about clearing up some facts concerning OpenGL 2.0 development.
 
Dave, I refer to these comments:

Contrary to Nvidia's claim, OpenGL 2.0 WILL be 100% backward compatible with existing OpenGL levels. This has been stated in every presentation on OpenGL 2.0 since the beginning.

Contrary to their implied positioning, Nvidia's is not planning to offer Cg to the OpenGL Architecture Review Board for consideration as a standard of any type. Rather, they have stated that they fully intend to control the specification and implementation. Other graphics hardware vendors would be offered the ability to implement this Nvidia-specified language, under Nvidia licensing terms, for their own hardware.

Contrary to Nvidia's claim, the specification work for OpenGL 2.0 is well along. This week, 3Dlabs has provided the OpenGL Architecture Review Board with specifications for the OpenGL Shading Language and three extension specifications that implement support for vertex shaders and fragment shaders that use this high-level shading language. The original OpenGL 2.0 white papers were published nine months ago, and 3Dlabs has been refining those white papers, taking input from public reviewers, including other ARB members, and are now turning them into specification documents.

All of this is what Gking and DemoCoder have been using as reasons why Cg needs to be here; looking at this, none of their reasons hold any water.
 
BRiT said:
Ty said:
It's not only about 'when' though. It's about time.

That's fine and dandy, but the DX9 beta is already available. It was most likely available before Nvidia's Cg too. I'd think it's in better shape than Cg as well, considering it will be released soon(ish) [~4 months] and most likely has a larger development team working on it.

Now, given that DX9 HLSL is available before Nvidia's Cg, and has more support, which would you choose?

I don't think Cg would save a true game developer any time at all!

--|BRiT|

I think they actually both became available within roughly the same time period.

We haven't had the time to really look at Cg vs. DX9 HLSL yet, so I can't comment on which could help us more.
 