Which API is better?

  • DirectX9 is more elegant, easier to program
  • Both about the same
  • I use DirectX mainly because of market size and MS is behind it

Total voters: 329
DemoCoder said:
Short time? Try 6 years, after ripping off OGL concepts as a base.

Ahhh...yes, now MS is just ripping off GL. :rolleyes:

Why could I play OpenGL games on Windows NT, but I couldn't play any DX games on NT until 2000? How's that for an end user's perspective?

Hello? It's called a target market? WTF have I been talking about?

The NT kernel was not a consumer OS until XP. For the 20th time: I am talking about the mass consumer space here, not a handful of professionals or developers.

You honestly think the end user experience of DX over the last five or more years has been better than GL?

Yup.

How many games support DX? How many support GL?

You don't remember the era of bug ridden DX3/5 games?

Who's talking about bugs? GL games don't have bugs?

I suppose next you'll tell us that the Windows 95 OS is better than Unix from a consistency and stability point of view?

I suppose you'll keep on missing the entire point.

I suppose you'll tell us that the Windows OSs have not always been a better consumer choice than Linux/Unix over all these years? Puh-lease.
 
Humus said:
Wrong. It's design by workgroup, approval by committee. Vendor A does initial research, takes it to the ARB, the ARB decides to go forward and creates a workgroup with people from vendors A, B and C. These work together to design the extension, usually beginning from the original draft shown to the ARB. This work is done in the workgroup and outside the ARB. Once the workgroup is done with their work, it's taken back to the ARB for approval.

A committee with not one games developer on it. Put John Carmack on the ARB and the ARB's approval rating among professional games programmers will improve. The ARB is currently an IHV-only club... I'd be much happier trusting the future of the graphics API to them if there was at least one games developer on the ARB.
 
Humus said:
This argument has been heard at all times and has been false pretty much all the time. Certain features have appeared in DX first, but they are fewer than the other way around.

It's called a "trend" Humus.

GL obviously had a head start. It had lots of things "first."

As time went on, DX first had pixel shading, now it's first with HLSL support, etc.

Wrong. It's design by workgroup, approval by committee....

Blah...blah..."Approval by committee". Whatever. Doesn't change my point.

DX has not "improved and innovated" particularly much over "a short period of time".

How can you say this?

Even if it would have, I still would have preferred the API that was for the most part done right from the beginning.

Again. You're a developer. Of course you think this way.
 
Damn, I forgot I wasn't going to comment on this thread again! Doh! Bad me :)

The thing about API wars is that they're utterly pointless, because (as many women will tell you) it's how you use it, not how big it is :)

It's amazing that people still find time to argue about this after so many years. Could we at least argue about something new, say OGL ES vs D3D Pocket? (I'm in the OGL ES camp myself, let the flames begin :) )
 
Joe, name some shipping DX9 + HLSL games today that really utilize these features. Your supposed DX9 lead is vaporware. You can already achieve PS2.0 by using ARB_fragment_program. If you want to use HLSL, you can use NVidia's Cg compiler to compile MS HLSL into ARB_fragment_program. Voilà, OGL + HLSL.

So today, one can already utilize tools with OpenGL to achieve what you can under DX9: a command line HLSL compiler, with "assembly" as intermediate language. When ARB approves the HLSL extension, OGL will do what DX9 does, plus, allow integrated compilers.
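To be concrete, the GL side of that is tiny once you have the compiled ARB_fragment_program text in hand. A rough sketch only: it assumes the ARB_fragment_program entry points are available (on Windows you'd fetch them with wglGetProcAddress or an extension loader), and the shader file name fed to cgc is made up.

Code:
/* Rough sketch: hand precompiled ARB_fragment_program text to the driver.
 * 'fpText' would be the output of an offline compile, e.g. something like
 *     cgc -profile arbfp1 myshader.cg        (file name is hypothetical)
 * Build with GL_GLEXT_PROTOTYPES, or fetch the ARB entry points yourself. */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <stdio.h>
#include <string.h>

GLuint loadFragmentProgram(const char *fpText)
{
    GLuint prog = 0;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fpText), fpText);

    if (glGetError() != GL_NO_ERROR) {
        /* Driver rejected the program text; report where and why. */
        GLint errPos = -1;
        glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &errPos);
        fprintf(stderr, "fragment program error at %d: %s\n", errPos,
                (const char *)glGetString(GL_PROGRAM_ERROR_STRING_ARB));
        return 0;
    }
    return prog;
}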

As for the number of DX games, I remember that before DX7, most popular games were either OpenGL-based or Glide-based, not DX.

End user experience is subpar. First, the end user experience of pre-XP OSes sucked: crashes all the time. And even to this day under XP, the experience with MS software by and large is: viruses, worms, crashes, viruses, worms, crashes. I'm amazed when I can plug a freshly bought DELL laptop into a network and, within a few minutes, it's been infected.

Likewise, with DX, half the time you installed a game, the game came with a new DX runtime and driver. And boy did I love those numerous bugs related to ALT-tabbing with early DirectX games. Microsoft software patches your OS half the time you install it. Go install any new version of Office and watch it overwrite system DLLs. Then watch as third-party software breaks, à la the infamous Office "file dialog" problem.


Let's see:

1. DX more advanced than OGL? Nope. OGL has VS/PS equivalents shipping today.
2. DX model produces less buggy drivers? Nope, drivers now require more code in them to deal with optimizing poor FXC code.
3. DX has better end user experience? Frequent reboots and forced runtime updates. Let's see how Doom3's end user installation fares against HL2 when it ships (oops, is it April 2004 now? September 30th my ass, Valve, you liars). The most popular game engines ever, Quake/Quake2/Quake3 and soon Doom3, use GL.

4. DX has better developer experience? I think not. Many developers asked when DX came out, "why oh why didn't they adopt OGL's style?" This spawned the infamous Carmack "takes 3 pages of code to draw a triangle" letter.

5. Did DX advance fast? No, it took 7 API revisions to get to a non-mediocre stage, and 2 more to get to where DX9 is today. And it had OGL to build on, but spent the first five years trying to do something "not OGL", and failed, having ended up with GL-style functions anyway.


Summing up: No one proved their point that DX's HLSL model would be "less buggy", so now the argument has switched to "worse end user experience" and "rate of change". Semantic dodge.
 
DemoCoder said:
Summing up: No one proved their point that DX's HLSL model would be "less buggy",

No one claimed that DX's HLSL model results in less buggy programs.

so now the argument has switched to "worse end user experience" and "rate of change". Semantic dodge.

End user experience is my entire point. Maybe someone else is trying to semantically dodge that?
 
DemoCoder said:
Joe, name some shipping DX9 + HLSL games today that really utilize these features. Your supposed DX9 lead is vaporware. You can already achieve PS2.0 by using ARB_fragment_program. If you want to use HLSL, you can use NVidia's Cg compiler to compile MS HLSL into ARB_fragment_program. Voilà, OGL + HLSL.

Not to mention ISL (OpenGL Shader) has been around for years now (if you want to take it outside the realm of games).

Deano said:
The ARB is currently an IHV-only club...

Are you including entities who are also software vendors (e.g. Apple & IBM)?

Deano said:
A committee with not one games developer on it. Put John Carmack on the ARB and the ARB's approval rating among professional games programmers will improve

Perhaps... Then again, while he may not be a voting member (after all, voting members are just reps of companies that are members), he can still attend the meetings and contribute... (but he doesn't, even though he's stated he should).

As it stands, I think game developers are fairly well represented by the various IHVs and system vendors by proxy. Their input may be more diluted vs. a DX board, most likely because OpenGL simply has more widespread use outside of games...

Deano said:
OGL ES vs D3D Pocket?

OGL ES! :p
 
Joe DeFuria said:
DemoCoder said:
Summing up: No one proved their point that DX's HLSL model would be "less buggy",
No one claimed that DX's HLSL model results in less buggy programs.
That's DeanoC's main point.

Joe DeFuria said:
DemoCoder said:
so now the argument has switched to "worse end user experience" and "rate of change". Semantic dodge.
End user experience is my entire point. Maybe someone else is trying to semantically dodge that?
Um, okay...
The arguments against the GL model have primarily focused around getting developers to use it. But it's obviously great for the end-user. After all, the easier shader optimization should provide noticeable performance improvements for some hardware.
 
Hmm. Great discussion!

A parser is just something that splits the source code up and reduces it to the grandfather of all computer models: a state machine. As it should.

During that process, it builds a parse tree: a (binary) tree that specifies what goes where, i.e. which variable holds which value at which time and where it resides.

That logic only has to be described once. There is no real need to try and improve upon it. You just copy it and feed it to your lexical analyzer of choice.

When you want your code to run on specific hardware, you also need a code generator: a thingy that consists of a lookup table, specifying which instructions to execute to shift (or reduce, but that is in between) the state of your state machine to another one.

You only have to fill in the lookup table and feed it, together with the output of your lexical analyzer, to your code generator (compiler-compiler) of choice.

If you make errors in the above, it will produce code that doesn't work as expected. Fix them, and you're done.

And that's all! That's how you generate a DX driver, a GLSlang compiler, a C++ compiler or whatever. It works, or it doesn't. And you generate them all in the same way.
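To make that lookup table concrete, here is a toy example. Every name in it (the opcodes, the table, the instruction struct) is invented purely for illustration; a real driver back end is obviously far more involved.

Code:
/* Toy sketch of the lookup-table code generator described above:
 * the front end's abstract opcodes are mapped one-for-one to whatever
 * the hardware actually executes.  All names here are made up.        */
#include <stdio.h>

typedef enum { OP_MOV, OP_ADD, OP_MUL, OP_MAD, OP_COUNT } AbstractOp;
typedef enum { HW_COPY, HW_ADD, HW_MUL, HW_FMA } HwOp;

/* The whole "code generator" is this table: abstract opcode in,
 * hardware opcode out.  Filling it in is the per-chip work.       */
static const HwOp hwTable[OP_COUNT] = {
    [OP_MOV] = HW_COPY,
    [OP_ADD] = HW_ADD,
    [OP_MUL] = HW_MUL,
    [OP_MAD] = HW_FMA,
};

typedef struct { AbstractOp op; int dst, src0, src1, src2; } Instr;

static void emit(const Instr *in, int count)
{
    for (int i = 0; i < count; i++)
        printf("hw_op %d  r%d, r%d, r%d, r%d\n",
               hwTable[in[i].op], in[i].dst, in[i].src0, in[i].src1, in[i].src2);
}

int main(void)
{
    /* "mad r0, r1, r2, r3" followed by a move, as the front end might emit it. */
    Instr prog[] = { { OP_MAD, 0, 1, 2, 3 }, { OP_MOV, 4, 0, 0, 0 } };
    emit(prog, 2);
    return 0;
}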

But wait! Is there a DX lexicon readily available? Hm. I think not. Because M$ generates 'assembly' that your hardware should be able to run without modifications. You are allowed to map the opcodes to the actual ones used by your hardware, but that's it. You're not supposed to optimize it, otherwise we get those horrible debacles about 'allowed' optimizations (while the hardware manufacturers have little choice but to replace low-level functions, like trilinear, with others, like bilinear).

OGL takes a quite different approach. It hands you an input that you can actually use to generate a sequence of instructions that makes sense for your specific hardware.

While it would be better (from a developer's point of view) to have just one rigid API that would produce the same result on any hardware, M$ is falling short of that mark. To deliver that, the hardware would have to run their assembly output unmodified. And no hardware is able to do that.

I am sure that M$ would like nothing better than specifying exactly how that hardware should perform. But they're (fortunately) not the only ones with a stake in that. If they should succeed, it would turn a PC into a console: fixed hardware.

Of course, that's what the hardware manufacturers want as well: to become the standard. Duh.

For a developer, it's a no-brainer: OGL will do it all, and much better at that. For a businessman it's a no-brainer as well: follow Microsoft's lead.

That's all, folks!
 
Joe DeFuria said:
Ahhh...yes, now MS is just ripping off GL. :rolleyes:

Depends on how you see it. First they tried to use other models than the GL model, and it sucked and developers were disgruntled. So they copy'n'pasted the successful GL concepts more and more, and finally in DX7 we had a reasonably usable model that basically handles most things like GL, except through a more painful interface.

Joe DeFuria said:
You honestly think the end user experience of DX over the last five or more years has been better than GL?

Yup.

How many games support DX? How many support GL?

In what way is the number of games using each API relevant to the user experience?


Joe DeFuria said:
You don't remember the era of bug ridden DX3/5 games?

Who's talking about bugs? GL games don't have bugs?

Games can have bugs, and GL games have bugs too. But a stupidly designed API will boost the number of bugs, which is what DX < 7 did, and what later revisions of the API do to some extent too.
 
DeanoC said:
A committee with not one games developer on it. Put John Carmack on the ARB and the ARB's approval rating among professional games programmers will improve. The ARB is currently an IHV-only club... I'd be much happier trusting the future of the graphics API to them if there was at least one games developer on the ARB.

Of course, having some major game developers on the ARB would be an improvement. That doesn't change, however, that the ARB model is superior to the MS central-control model.
 
Joe DeFuria said:
It's called a "trend" Humus.

GL obviously had a head start. It had lots of things "first."

As time went on, DX first had pixel shading, now it's first with HLSL support, etc.

It's called a trend and you don't seem to see the trend. You conveniently forgot T&L, occlusion queries, volumetric texturing, multisampling, polygon offset, texture LOD... etc. For every new version of DirectX so far I've read the "What's new?" section and found plenty of old GL functionality going in there, and thought "about time" to myself.

Joe DeFuria said:
Wrong. It's design by workgroup, approval by committee....

Blah...blah..."Approval by committee". Whatever. Doesn't change my point.

What was your point then?

Joe DeFuria said:
DX has not "improved and innovated" particularly much over "a short period of time".

How can you say this?

Because most of the "innovation" has been copying already proven concepts from OpenGL, and the process has not been particularly fast.
 
Joe DeFuria said:
DemoCoder said:
Summing up: No one proved their point that DX's HLSL model would be "less buggy",

No one claimed that DX's HLSL model results in less buggy programs.

Really? It's the main argument against the GL shader model so far, and this thread and others are packed with such claims.

Joe DeFuria said:
so now the argument has switched to "worse end user experience" and "rate of change". Semantic dodge.

End user experience is my entire point. Maybe someone else is trying to semantically dodge that?

End user experience speaks against DX. OpenGL games and drivers have in general been less buggy.
 
John Carmack is listed as a contributor to GL1.5.
Page 296.

Reading through the acknowledgment lists for GL1.5 and previous versions (preserved in the 1.5 spec), you'll find more mention of software people. Mostly middleware vendors and people with an interest in workstations.

I also fail to see how the ARB should in any way be inferior to MS as a standards setting body. The ARB has all the major graphics chip designers on board for crying out loud. What's Microsoft's level of expertise in graphics? What is it based on?

And no, features aren't late in GL. Final ARB extensions may be late; that's an important distinction. Starting on EXT or even vendor-specific extensions is fine for development purposes. 'Porting' to the ARB versions as soon as they're ratified is generally painless. In the majority of cases, all you need to do is change string constants from EXT_* to ARB_* and you're done.
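For the doubters, that 'port' usually amounts to nothing more than this kind of check. A sketch only, using texture_env_combine purely as an example pair (for this pair the enumerant values are the same, only the extension string differs), and it obviously needs a current GL context:

Code:
/* Sketch of how small an EXT -> ARB "port" typically is: probe the ARB
 * name first, fall back to the EXT name, and use the same code path
 * either way.  texture_env_combine is just an example pair here.       */
#include <GL/gl.h>
#include <string.h>

static int hasExtension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    /* strstr is good enough for a sketch; a real check should match
     * whole, space-delimited names.                                     */
    return all != NULL && strstr(all, name) != NULL;
}

int combineSupported(void)
{
    return hasExtension("GL_ARB_texture_env_combine")
        || hasExtension("GL_EXT_texture_env_combine");
}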
 
DeanoC said:
It makes no difference if a bug is fixed in a driver; once it's there you HAVE to work around it. You simply cannot ask the user to install new drivers.
Having ATI or NVIDIA fix it doesn't matter; if it ships in a WHQL driver it's an issue for the entire life of the card.

I.e. most people with an ATI R3x0 will never even have an OGL 1.5+ driver, let alone a fix!

Publishers want the easiest/cheapest way of getting the product onto the customer's machine. Anything that increases the test matrix is bad in their eyes; that's why MS won with D3D. D3D is cheaper than OpenGL for the publisher who wants to target the mainstream PC user. This is a business, and if D3DX HLSL saves $0.10 per copy sold, then this argument is moot.
There's no if ... then, and there's no moot; this argument just doesn't hold water. What you're claiming here is that the D3D layered driver model somehow reduces your need to test different driver versions, right? I can see you coming from the precompiled shader code POV, but that's completely off the mark in the scope of your own comments.

Drivers have bugs. There are many drivers for a given product. DXG drivers are updated at the same rate as OpenGL drivers (same package, usually). While doing what you're asking for (work around every known bug in every WHQL driver for every chipset), there's no difference in effort between DX Graphics and GL. But frankly, that's a waste of time anyway. Might be justified for stuff targeted at 'casual' gamers, like, say, Pong 2k3 :D

Anyone with a real interest in games (which includes having the hardware and getting info on a regular basis) has experience with driver updates. Driver updates even come on the CDs tacked onto games magazines. I know publishers can be hard to educate in terms of target hardware. The driver update thing is a question of simplicity, which equals money. That they do grasp when explained right. It shrinks the "test matrix" :)

UT2k3 readme said:
(rough translation from German readme, for your convenience)
The NVIDIA 40.41 drivers are known to have visual and performance-related issues (stuttering) with Unreal Tournament 2003. As these issues aren't observed with the 30.82 drivers, and later drivers fix the problem, you have two choices if you are currently using the 40.41 drivers:
1. downgrade to the older 30.82 driver, or
2. install a newer driver as soon as it's available

The most recent NVIDIA drivers can be found at the following URL:

http://www.nvidia.com/content/drivers/drivers.asp
 
Chalnoth said:
JohnH said:
Which cheat "angle" ?

JohnH said:
Why is the OGL2.0 approach to HLSL parsing not correct at this time ?
...
2) IHVs can individually tweak the syntax ("illegally") for their own devious reasons

Nothing to do with cheating, everything to do with market control; this was stated in another post.

John.
 
Humus said:
JohnH said:
Tell me how you guarantee that something written for GLSlang and tested on one driver/HW is guaranteed to work on any piece of HW in the field?

If a shader is correct, then the shader specification is your guarantor. Sure, driver bugs can exist, and driver bugs have always existed and will exist in the future too. Nothing new under the sun. The MS compiler doesn't change that: the driver can still have a bug. And that's the whole reason why the developer's QA department needs to run their newly written game on the graphics cards on the market. There's no way around that problem. You need to do that if you're using HLSL too.

But there continues to be _NO_ HW in existence that meets the GLSlang requirements of things like unlimited code size, unlimited temporaries, unlimited flow control depth, unlimited call depth, etc. Given this, the spec guarantees nothing. Until such HW is available this is a real problem, which the DX profiles should fix.
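Either way, an application cannot just assume the driver will take a given GLSlang shader; it has to ask at run time. A minimal sketch using the ARB_shader_objects entry points (assuming they have already been loaded, e.g. via wglGetProcAddress or an extension loader):

Code:
/* Sketch only: compile a GLSlang fragment shader and see whether this
 * particular driver/HW combination actually accepts it.
 * Build with GL_GLEXT_PROTOTYPES, or fetch the ARB entry points yourself. */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <stdio.h>

GLhandleARB compileFragmentShader(const char *src)
{
    GLhandleARB sh = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(sh, 1, &src, NULL);
    glCompileShaderARB(sh);

    GLint ok = 0;
    glGetObjectParameterivARB(sh, GL_OBJECT_COMPILE_STATUS_ARB, &ok);
    if (!ok) {
        /* This is where a spec-legal shader can still die on limited HW. */
        char log[4096];
        glGetInfoLogARB(sh, sizeof(log), NULL, log);
        fprintf(stderr, "GLSlang compile failed:\n%s\n", log);
        glDeleteObjectARB(sh);
        return 0;
    }
    return sh;
}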

John.

 
Chalnoth said:
DeanoC said:
DemoCoder said:
What's the NV30 driver to do? Try to "recognize" a sequence of 8 instructions and assume it's a SINCOS?
And that's a bug in the HLSL compiler, in the same way that a GLSLANG compiler could (and sometimes certainly will) produce code that 'forgets' to use a hardware instruction. All code will have bugs...
No. It's a direct result of the forced low-level standard intermediate format. It's not a bug, it comes from a design decision.
No, this is a bug; these macros exist in the assembly language for a reason, you know.

John.
 
DemoCoder said:
Joe DeFuria said:
and since the D3DX compiler is statically linked, the MS fixes won't help end users at all.

Nor will the MS fixes break the end-user's stuff.

But MS's "fixes" do introduce bugs; only D3DX changes won't. On the other hand, to work around D3DX limitations, IHVs have to write far more intelligent drivers to perform optimizations, so all the MS model does is make more work for the compiler authors and increase the chance that IHVs will ship a buggy DX9 driver optimizer.
The intermediate DX formats do not add much, if any, complexity over GLSlang's model; if anything it's simplified by the constraints imposed by the profiles.

John.
 