Which API is Better?

  • DirectX9 is more elegant, easier to program
  • Both about the same
  • I use DirectX mainly because of market size and MS is behind it

  Total voters: 329
Chalnoth said:
One case in point: specifications. OpenGL specifications are very specific. DirectX's are not. DirectX seems to simply give a vague idea of what something is supposed to do, then puts out a reference renderer and then Microsoft basically says, "do what the reference does."
OpenGL and Direct3D specifications (contrary to public opinion, they both exist in written form) aren't worth the paper they're written on if there is no way of checking that the IHVs keep to them.
Example: ARB_fragment_program.
NVIDIA and ATI both produce differing output with regard to shadow maps. (NVIDIA claims ATI pushed through a change of spec that disadvantages their hardware.)

At least when NVIDIA tried to change the Direct3D PS_2_0 spec, I had an outside organisation to shout at. Where would I have gone with OpenGL? Who do I go to when an IHV knowingly breaks the spec and refuses to fix it?

D3D vs OpenGL arguments are totally pointless. It was ~8 years ago that I was involved in my first one, and as I really don't care which API I use (I prefer RedLine personally ;-) ) I'll duck out of this thread.
 
Chalnoth said:
Joe DeFuria said:
No one is arguing that the GL Model doesn't have advantages. Both GL and DX models have some balance between central control / stability and flexibility. For the consumer environment, I just believe that the DX model's balance (toward stability) is better suited. Not that GL is utter crap or something.
I think it's the exact opposite.

OpenGL's "balance" has always been more towards stability than DirectX's has.

Implicit in my statement are stability and consistency. And I disagree with you in any case: OpenGL's balance has always been toward flexibility. Just look at the extension mechanism.
 
DX3-DX6. 'Nuff Said about "stability"

Anyway, I find it hard to believe people are actually arguing against open standards now, just because you can't "complain" to an open standards organization and have them beat down vendors.

We're talking about approaches: open vs. proprietary. Forward-looking and extensible vs. closed. Dynamic vs. static. This thread is a technical and philosophical discussion about what's possible in the future. The naysayers say "well, GLSlang isn't here yet; here and now, all we have is DX9". If GLSlang were actually here, this thread wouldn't exist, because we would actually have HW and drivers to discuss.

From a technical point of view, I find this thread ludicrous. The "anti-compiler-in-driver" people don't seem to get that today's DX9 drivers already include a compiler. All the arguments about increased driver complexity, potential bugs, and apps breaking because of compiler updates apply to the here and now of DirectX9.

The only difference between the OGL2.0 approach and the DX9 approach is that the front-end parsing and tree/IR building is done by the driver. But as I have explained till I'm blue in the face, parsing is the most trivial part of a compiler, and it is entirely automated. Hardly anyone writes parsers for programming languages by hand anymore. They go to a language spec, get a copy of the grammar, massage it into LL or LR form, and then use a tool to generate the code needed to do the parsing. The only exception is very small grammars, where writing an LL(1) recursive descent parser by hand is simple.
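To make that concrete, here's a minimal sketch of the kind of hand-written LL(1) recursive descent parser I mean, over a toy expression grammar (everything here is illustrative, not from any real shading language spec):

Code:
// A toy hand-written LL(1) recursive descent parser for expressions like
// "a + b * (c - d)". Grammar (illustrative only):
//   expr   := term  (('+' | '-') term)*
//   term   := factor (('*' | '/') factor)*
//   factor := IDENT | '(' expr ')'
#include <cctype>
#include <memory>
#include <stdexcept>
#include <string>
#include <vector>

struct Node {                                    // parse-tree node
    std::string value;                           // operator or identifier
    std::vector<std::unique_ptr<Node>> kids;     // operands (empty for leaves)
};

class Parser {
public:
    explicit Parser(std::string src) : s_(std::move(src)) {}

    std::unique_ptr<Node> parse() {
        auto root = expr();
        if (peek() != '\0') throw std::runtime_error("trailing input");
        return root;
    }

private:
    std::string s_;
    size_t pos_ = 0;

    char peek() {                                // one char of lookahead: LL(1)
        while (pos_ < s_.size() && std::isspace((unsigned char)s_[pos_])) ++pos_;
        return pos_ < s_.size() ? s_[pos_] : '\0';
    }

    std::unique_ptr<Node> expr() {               // expr := term (('+'|'-') term)*
        auto left = term();
        while (peek() == '+' || peek() == '-') {
            auto op = std::make_unique<Node>();
            op->value = s_[pos_++];              // consume operator
            op->kids.push_back(std::move(left));
            op->kids.push_back(term());
            left = std::move(op);
        }
        return left;
    }

    std::unique_ptr<Node> term() {               // term := factor (('*'|'/') factor)*
        auto left = factor();
        while (peek() == '*' || peek() == '/') {
            auto op = std::make_unique<Node>();
            op->value = s_[pos_++];
            op->kids.push_back(std::move(left));
            op->kids.push_back(factor());
            left = std::move(op);
        }
        return left;
    }

    std::unique_ptr<Node> factor() {             // factor := IDENT | '(' expr ')'
        if (peek() == '(') {
            ++pos_;                              // consume '('
            auto inner = expr();
            if (peek() != ')') throw std::runtime_error("expected ')'");
            ++pos_;                              // consume ')'
            return inner;
        }
        if (!std::isalpha((unsigned char)peek()))
            throw std::runtime_error("expected identifier");
        auto leaf = std::make_unique<Node>();
        while (pos_ < s_.size() && std::isalnum((unsigned char)s_[pos_]))
            leaf->value += s_[pos_++];
        return leaf;
    }
};

Parser("a + b * c").parse() yields the tree; a yacc/bison-style generator spits out the equivalent of this whole class from the grammar alone. That's my point about how trivial this step is.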


Instead of "parsing" source, DX9 drivers have to "parse" a binary format stream containing the compiled shaders. They then have to reconstruct the same datastructures the command line compiler use: use/def or ssa, register interference graph, dominators/control flow graph/dependence graph (depending on if loops and branches are used)

Both ATI and NVidia are still shipping updated drivers with better optimizations, proving that DX9 drivers aren't the trivial, no-bugs-no-hassle "stable" panaceas they are being made out to be; in fact, they are quite complex and take a loooong time to mature. Why? Because all of the supposed "benefits" of the DX9 approach (Microsoft doing the initial parsing and IR generation for you) don't solve the biggest problems.
 
DeanoC said:
It makes no difference if a bug is fixed in a driver; once it's there, you HAVE to work around it. You simply cannot ask the user to install new drivers.

Lol, I suppose then we should just use DX7 too. After all, we can't expect that users upgrade to the latest version of DirectX, can we?

Of course you can ask the user to upgrade his drivers. You can check the driver version at installation time, see if it's a known buggy one, and in that case ask the user to install a new one, just like when your DX version is too old. Just place a working driver set on the game CD for all IHVs.
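Something along these lines, as a rough sketch using the real IDirect3D9::GetAdapterIdentifier call (the "minimum good build" threshold below is a made-up placeholder, not a real driver release number):

Code:
// Install-time driver check, D3D9 style.
#include <d3d9.h>

bool driverNeedsUpdate() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;                       // no D3D9 runtime at all

    D3DADAPTER_IDENTIFIER9 id = {};
    HRESULT hr = d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->Release();
    if (FAILED(hr)) return false;

    // DriverVersion packs product.version.subVersion.build into four WORDs.
    WORD build = LOWORD(id.DriverVersion.LowPart);

    // Placeholder threshold; a real check would also key on id.VendorId
    // and id.DeviceId so each IHV gets its own known-bad list.
    const WORD kMinimumGoodBuild = 4523;
    return build < kMinimumGoodBuild;             // prompt for the CD driver set
}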
 
Joe DeFuria said:
Implicit in my statement are stability and consistency. And I disagree with you in any case: OpenGL's balance has always been toward flexibility. Just look at the extension mechanism.

No, before OpenGL became popular for games, it was used in the CAD industry, where stability and pixel-perfect accuracy were needed. OpenGL implementations went through far more rigorous examination and certification. The OpenGL spec places specific requirements on the pipeline functionality and ensures that any APPROVED specifications are consistent with existing functionality.

For christsakes, OpenGL is used in the cockpit of commercial and military planes, and implementations must be certified by multiple government agencies before they can go into a cockpit.

The DX API is the ultimate in non-consistency. The damn thing has transmorphified many many times since DX1; all the while, OpenGL's core and style has remained relatively the same. The ARB extensions that got approved are usually entirely consistent with the spirit of the rest of GL.
 
Joe DeFuria said:
MS makes more work for compiler authors, when GL provides no type of compiler at all?

"More work for compiler authors", indeed. Compiler authors need to do more work. They need to reconstruct the information the MS compiler stripped away from the shader.

Joe DeFuria said:
Let's see how long it takes someone like XGI to support GLSLang...

I would be more concerned about how long it will take them to come up with competitive hardware.
 
DemoCoder said:
No, before OpenGL became popular for games, it was used in the CAD industry, where stability and pixel-perfect accuracy were needed.

Right...but only for a handful of applications, each of which actually tends to have its own "certification".

That's the difference between "professional" and consumer: orders of magnitude of difference in the number of applications, as well as in the pace at which advances occur (and therefore need to be controlled).

OpenGL implementations went through far more rigorous examination and certification.

As above. Far fewer apps to "examine / certify" with, and far fewer end users to deal with.

For christsakes, OpenGL is used in the cockpit of commercial and military planes, and implementations must be certified by multiple government agencies before they can go into a cockpit.

Wow....specially designed closed systems.

The DX API is the ultimate in non-consistency. The damn thing has transmorphified many many times since DX1, all the while,...

EXACTLY.

It has "transforphied", while at the same time maintained a level of backwards compatibility and support on a HUGE number of software and hardware implementations.

It took a centralized level of control to make that happen.

OpenGL's core and style has remained relatively the same.

Right...slow moving.

Humus...you're just another developer who prefers the GL interface. Most do. That's fine, I take your word for it. This doesn't mean the GL structure as a whole is best for the consumer space.
 
So open standards aren't a good thing anymore? Now it's strong central control we want? Oh, how the times change.
 
Humus said:
So open standards aren't a good thing anymore? Now it's strong central control we want? Oh, how the times change.

Sigh....

How many times must I repeat myself? Let me say it again, but put another way.

I never said open standards aren't a good thing.

But depending on the target market, one single controlling authority (who takes input from the players) can be a better model than 'standards by committee'.
 
I see standards by committee as being too cumbersome, too late, and taking a please-everyone approach which doesn't always work, e.g. JEDEC.

The MS DX approach is a good thing, as there is plenty of input from developers and IHVs to ensure their product (DX) is used successfully and easily.
 
If there was good input from developers and IHVs, it wouldn't have taken 7 versions of the API to get it to the stage where it was usable.

OGL2.0 wasn't "designed by committee." ARB put out an RFP, 3dLabs submitted a proposal, and that proposal was essentially adopted with minor modifications.

I don't know many standards that are "designed by committee". I've worked in standards groups, and the vast majority of the time, a company with a successful implementation will simply put it down on paper and submit it to be standardized.

I would call most standards processes "edited and approved by committee"

Obviously, some standards groups are better than others.

Anyway, I'd call waiting several API revisions to get to OGL quality (DX1 to DX7) "slow moving". Microsoft software appears to be "quick moving" because they release something like one new version every year; unfortunately, the first 3 versions are usually totally unusable, versions 4-5 are usually mediocre, and it is only after 7-8 that they start to get good.

Is this really demonstrating the superiority of MS's "design by monopoly"? 3-4 sucky versions, software mired in a buggy quagmire for the first 3-4 years?
 
Ah, but DemoCoder, I think with DX8 and then DX9 MS has made a push in the right direction and still has the momentum to carry the standard forward. In fact, I appreciate the slowdown that is predicted from DX9 to DX10.

Can you tell me, was OGL2.0 'late'? I hope it is successful, but almost every developer on the PC platform seems to have switched over to DX (for better or worse).
 
DemoCoder said:
If there was good input from developers and IHVs, it wouldn't have taken 7 versions of the API to get it to the stage where it was usable.

If the GL's design by committee was so great, it wouldn't be behind DX in many respects, given the relative newness of DX relative to GL.

OGL2.0 wasn't "designed by committee." ARB put out an RFP, 3dLabs submitted a proposal, and that proposal was essentially adopted with minor modifications.

The ARB is design / approval by committee.

Is this really demonstrating the superiority of MS's "design by monopoly"? 3-4 sucky versions, software mired in a buggy quagmire for the first 3-4 years?

Again, spoken purely from a developer perspective. The pace at which MS improved and innovated DX over a short period of time, while maintaining compatibility is testament to MS's / DX design approach.
 
Joe DeFuria said:
If the GL's design by committee was so great, it wouldn't be behind DX in many respects, given the relative newness of DX relative to GL.
How, where, what? Direct3D was behind the curve from its inception until (but excluding) DX8.
So you had this one release that was 'better' in one area, and that marks the abandoning of OpenGL in the games market? I beg to differ. GL has been under pressure from mudslinging PR people rephrasing the old "Well, GL is not designed for games, but Direct3D was, you know" up to the point where they started believing it themselves. That's one of the two reasons Direct3D even exists (as in "survived this long"); the other reason is OpenGL itself. Without OpenGL, there would have been no foundation to 'borrow' the fundamental know-how from (e.g. the already mentioned "DX7" hw T&L, but also more basic things such as texture sampling positions and filters, watertight rasterization, the whole combiner paradigm, stenciling, etc).

On the "DX9" techlevel, OpenGL offers the same shader functionality as Direct3D runtime and is less restrictive. High level compilation in DXG, pardon me, just went horribly wrong. There's no point in shipping DXG applications with high level code. Offline tools, that's all you get, albeit in a more complicated way.

That leaves us with the single apparent* benefit of using DX Graphics:
Pixel shaders at the NV2x/R200 tech level are sort of unified. I.e. NV2x hardware capabilities were brutalized to put them just below R200, so that a vendor-neutral PS1.1~1.3 could be defined. PS1.1 runs on R200, while in OpenGL there is no such common ground: R200 fragment shaders and NV_register_combiners/NV_texture_shader must be explicitly supported in separate codepaths. This is just a reflection of reality; the hardware is vastly different, after all.
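A rough sketch of what that codepath selection looks like at startup (the enum and function names are mine; the extension strings are the real ones just named; requires a current GL context, and note strstr is a naive substring match):

Code:
#include <GL/gl.h>
#include <cstring>

enum class FragmentPath { ARB_fp, ATI_fs, NV_combiners, Fixed };

static bool hasExt(const char* all, const char* name) {
    // Naive substring check; a robust version would tokenize the string.
    return all && std::strstr(all, name) != nullptr;
}

FragmentPath pickFragmentPath() {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (hasExt(ext, "GL_ARB_fragment_program"))   return FragmentPath::ARB_fp;
    if (hasExt(ext, "GL_ATI_fragment_shader"))    return FragmentPath::ATI_fs;
    if (hasExt(ext, "GL_NV_register_combiners") &&
        hasExt(ext, "GL_NV_texture_shader"))      return FragmentPath::NV_combiners;
    return FragmentPath::Fixed;                   // fixed-function fallback
}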

From a 'business' pov this certainly makes DXG look more attractive. OTOH it tends to underexpose the hardware, which equals performance loss.

Add on top of that all the political issues that have spawned from this backwards-compatible versioning scheme (3DMark2k1 "advanced pixel shaders"; PS1.4 in 3DMark2k3; PS2.0a, anyone?). DX Graphics just isn't fair, particularly not in the context of neutral benchmarks. Admittedly this is a side issue, but it didn't tilt my sympathy towards DXG.


*Adding an ARB_fragment_program backend to my pet project took me eight hours, including debugging, validation and going final. The backend runtime-generates shaders from a rendering state vector and caches them in API objects via hash; approx. 10,000 possible permutations. It's not that much work to support multiple paths.
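Roughly like this, as a sketch: the state-vector fields and the generator/compiler stubs are illustrative, only the hash-and-cache pattern is what I actually do:

Code:
#include <cstdint>
#include <string>
#include <unordered_map>

struct StateVector {
    uint8_t texEnvMode[4];   // per-stage combine mode (illustrative field)
    uint8_t fogEnabled;      // uint8_t rather than bool: no padding surprises
    uint8_t alphaTest;

    uint64_t hash() const {  // FNV-1a over the raw bytes
        const uint8_t* p = reinterpret_cast<const uint8_t*>(this);
        uint64_t h = 1469598103934665603ull;
        for (size_t i = 0; i < sizeof(*this); ++i) { h ^= p[i]; h *= 1099511628211ull; }
        return h;
    }
};

class FragmentProgramCache {
public:
    unsigned get(const StateVector& sv) {
        const uint64_t key = sv.hash();
        auto it = cache_.find(key);
        if (it != cache_.end()) return it->second;       // hit: reuse program object
        unsigned prog = compile(genProgramText(sv));     // miss: generate + compile
        cache_.emplace(key, prog);
        return prog;
    }

private:
    std::unordered_map<uint64_t, unsigned> cache_;       // state hash -> program id

    static std::string genProgramText(const StateVector&) {
        // Real code emits "!!ARBfp1.0 ..." text driven by the state fields.
        return "!!ARBfp1.0\nMOV result.color, fragment.color;\nEND\n";
    }
    static unsigned compile(const std::string&) {
        // Stub: real code calls glGenProgramsARB / glProgramStringARB.
        static unsigned next = 1;
        return next++;
    }
};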
 
Joe DeFuria said:
Humus said:
So open standards aren't a good thing anymore? Now it's strong central control we want? Oh, how the times change.

Sigh....

How many times must I repeat myself? Let me say it again, but put another way.

I never said open standards aren't a good thing.

But depending on the target market, one single controlling authority (who takes input from the players) can be a better model than 'standards by committee'.

That's what I call a closed standard. Centrally enforced, and in this case from a third party.
 
Joe DeFuria said:
If the GL's design by committee was so great, it wouldn't be behind DX in many respects, given the relative newness of DX relative to GL.

This argument has been heard at all times and has been false pretty much all the time. Certain features have appeared in DX first, but they are fewer than the other way around.

Joe DeFuria said:
The ARB is design / approval by committee.

Wrong. It's design by workgroup, approval by committee. Vendor A does initial research and takes it to the ARB; the ARB decides to go forward and creates a workgroup with people from vendors A, B and C. These work together to design the extension, usually beginning from the original draft shown to the ARB. This work is done in the workgroup, outside the ARB. Once the workgroup is done with its work, the result is taken back to the ARB for approval.

Joe DeFuria said:
Again, spoken purely from a developer perspective. The pace at which MS improved and innovated DX over a short period of time, while maintaining compatibility is testament to MS's / DX design approach.

DX has not "improved and innovated" particularly much over "a short period of time". Even if it had, I would still have preferred the API that was for the most part done right from the beginning. It took them until DX7 to get anywhere close to GL. And I'm still waiting for MS to fix some stupid design choices, such as texture upload by locking, and to fix the default state to something that makes sense.
 
Joe DeFuria said:
DemoCoder said:
If there was good input from developers and IHVs, it wouldn't have taken 7 versions of the API to get it to the stage where it was usable.

If the GL's design by committee was so great, it wouldn't be behind DX in many respects, given the relative newness of DX relative to GL.

Behind how? DirectX was behind OGL until DX8. DX8 and OGL1.4 have parity. It is only DX9 that is "ahead", by going forward with a non-vetted design that seems overly tilted towards one HW implementation.


OGL2.0 wasn't "designed by committee." ARB put out an RFP, 3dLabs submitted a proposal, and that proposal was essentially adopted with minor modifications.

The ARB is design / approval by committee.

It wasn't designed by the ARB, only approved. And just how do you think DX9 was designed? Think MS invented all of it and handed it down directly to ATI and NVidia? No, the majority of it came from one or two specific vendors and was worked out in a secret working group, with MS's final blessing. The only difference between the ARB's approach and MS's approach is that ARB meetings are public, anyone can read the meeting minutes, and the final approval comes down to a vote.

The way MS's system works is that MS is like a permanent member of the security council. They can simply veto or rubberstamp anything.




Again, spoken purely from a developer perspective. The pace at which MS improved and innovated DX over a short period of time, while maintaining compatibility is testament to MS's / DX design approach.

Short time? Try 6 years, after ripping off OGL concepts as a base. Why could I play OpenGL games on Windows NT, but I couldn't play any DX games on NT until 2000? How's that for an end user's perspective?

My perspective is that Quake1 runs fine on my system, but games written for DX3 choke under XP. My perspective is that for 5+ years, I've been able to get OGL games that run fine, but up until DX7, I had to worry about whether or not games would run correctly, and had to force upgrade my API 5 times.

Arguably, the games with the largest audience and longevity started their life on OpenGL. I still play Counter-Strike about 2 hours a day, a game engine that started life a half decade ago on OGL.

You honestly think the end user experience of DX over the last five or more years has been better than GL? You don't remember the era of bug ridden DX3/5 games? Games that choke if you don't have the right CAPS bits? Games that choke because of the screen resolution picked, or whether or not you have 16-bit Z or 32-bit Z, etc?

I suppose next you'll tell us that the Windows 95 OS is better than Unix from a consistency and stability point of view?
 
Has anybody brought this up yet?

[Image: sidenoteopengl.gif]


From one of NVidia's presentations at GDC 2003 -- "Batch, Batch, Batch"
 
Yes, nVidia recommends sending a couple thousand triangles per batch.

And the poor DirectX performance with few triangles per batch was the reason why some of the first DirectX T&L games (*cough* Test Drive *cough*) performed poorly.
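In D3D9 terms, the slide's advice amounts to something like this sketch (device and buffer setup omitted; DrawIndexedPrimitive is the real D3D9 call, the Mesh layout is illustrative):

Code:
// Many tiny DrawIndexedPrimitive calls are CPU-bound on per-call overhead,
// so concatenate meshes sharing render state and issue one large call.
#include <d3d9.h>
#include <vector>

struct Mesh { UINT minVertex, numVertices, startIndex, primCount; };

// Slow path: one call (and its fixed CPU cost) per small mesh.
void drawUnbatched(IDirect3DDevice9* dev, const std::vector<Mesh>& meshes) {
    for (const Mesh& m : meshes)
        dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m.minVertex,
                                  m.numVertices, m.startIndex, m.primCount);
}

// Fast path: meshes with identical state were concatenated into contiguous
// index ranges up front, so the whole run collapses into a single call.
void drawBatched(IDirect3DDevice9* dev, UINT totalVertices, UINT totalPrims) {
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                              totalVertices, 0, totalPrims);
}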
 
Humus said:
Just place a working driver set on the game CD for all IHVs.

Most publishers won't allow it, for legal reasons...

It's not for want of trying, I can tell you...

The basic problem is the chance that some user installs a driver off a game CD, Office runs badly, and they then blame the publisher. So you're left saying "Install this driver to make this game work, but if it breaks your machine don't blame us", which the legal departments don't like. It also increases support calls, and each support call wipes out the entire profit from the sale of a PC game; publishers will do almost anything to reduce the chance of the user phoning up and saying 'this game doesn't work'.
 