Which API is better?

Which API is Better?

  • DirectX9 is more elegant, easier to program
  • Both about the same
  • I use DirectX mainly because of market size and MS is behind it

  Total voters: 329
Seriously, someone should do a scientific study of the underlying processes and probabilities of threads turning highly off-topic. :)
 
GL2 for sure.

I had the 'pleasure' of porting our engine over to DX9 a while ago. It was a REAL pain in the ass, the interfaces have changed a lot from DX8. I'm sick and tired of DX interfaces constantly changing (anyone who's been around since DX3 will know what I'm talking about). GL never changes interfaces; it just introduces new extensions. Our (very old and out of date) OpenGL renderer still worked the same on today's hardware. I incrementally added features to our GL renderer, and soon I had shaders, float textures, etc. all working in much less time than DX9. GLSL is already available to developers through an alpha ATI driver (don't know about NVidia) on the 9600+.

We dumped DX in favour of GL for future PC games. The Xbox is hardly an argument for keeping with DX. The Xbox is stuck with DX8, and I'm sure by the time Xbox2 comes out it'll have some hacked up version of DX10. So you win nothing by keeping with DX. Our next PC games are due out in 2004, so we have some time for GLSL support to stabilize. If GLSL support is still not there, then I have a fallback to the 1.4 ARB_*_program extensions.

If an IHV other than MS were to embrace a graphics API, which one would they choose? Definitely not DX. There's that to consider as well.
 
fresh said:
I had the 'pleasure' of porting our engine over to DX9 a while ago. It was a REAL pain in the ass, the interfaces have changed a lot from DX8.

It took me 5 hours to port our engine to DX9.
I was happy with it, since there wasn't much change in the interfaces, and the changes that were there benefited our engine code (they allowed us to clean up hacks).
The only exception is SetSamplerState, which is pointless, but it required only 1-2 minutes of replacing.
 
fresh said:
If an IHV other than MS were to embrace a graphics API, which one would they choose? Definitely not DX.

I'd guess you have that completely wrong.

Seems to me that GL places a larger burden on the IHV for driver (and soon compiler) development. I'd wager that IHVs would rather embrace DX, while ISVs generally prefer GL.
 
Humus said:
Seriously, someone should do a scientific study of the underlying processes and probabilities of threads turning highly off-topic. :)

Trust me, if I had a few million bucks to waste, that'd be the first project I'd finance :LOL:


Uttar
 
Joe DeFuria said:
fresh said:
If an IHV other than MS were to embrace a graphics API, which one would they choose? Definitely not DX.

I'd guess you have that completely wrong.

Seems to me that GL places a larger burden on the IHV for driver (and soon compiler) development. I'd wager that IHVs would rather embrace DX, while ISVs generally prefer GL.

Sure, GL requires a higher implementation effort, but the extensibility of the GL API is pure gold for an IHV. It's not cool for IHVs to wait for MS to come out with a new API to be able to demo the new tech.
 
Humus said:
Sure, GL requires a higher implementation effort, but the extensibility of the GL API is pure gold for an IHV.

nVidia does this.

It's not cool for IHVs to wait for MS to come out with a new API to be able to demo the new tech.

ATI does this.

Seems like ATI doesn't view this the way you do...
They even waited for the ARB extensions instead of coming up with their own.
 
Hyp-X said:
They even waited for the ARB extensions instead of coming up with their own.

I disagree; my take is that ATI promotes EXT/ARB over vendor-specific extensions, which is better for the programmer and, IMO, the way to go.
They didn't "wait for" an ARB extension; they worked hard on it to have a single common extension to ease developers' lives.
 
Indeed. The way I see it, ATi tend to take their new extensions to the ARB. If the ARB likes one, great: a new industry standard. If the ARB doesn't, ATi won't just ditch the idea; instead they release it as a vendor-specific extension. NVidia tend to go the other way: first they make an extension that suits them perfectly and release it. Then, as others show interest, those vendors either have to provide their own versions or get the ARB to work together on an ARB extension.
 
fresh said:
GL2 for sure.

I had the 'pleasure' of porting our engine over to DX9 a while ago. It was a REAL pain in the ass, the interfaces have changed a lot from DX8. I'm sick and tired of DX interfaces constantly changing (anyone who's been around since DX3 will know what I'm talking about). GL never changes interfaces; it just introduces new extensions. Our (very old and out of date) OpenGL renderer still worked the same on today's hardware. I incrementally added features to our GL renderer, and soon I had shaders, float textures, etc. all working in much less time than DX9. GLSL is already available to developers through an alpha ATI driver (don't know about NVidia) on the 9600+.

We dumped DX in favour of GL for future PC games. The Xbox is hardly an argument for keeping with DX. The Xbox is stuck with DX8, and I'm sure by the time Xbox2 comes out it'll have some hacked up version of DX10. So you win nothing by keeping with DX. Our next PC games are due out in 2004, so we have some time for GLSL support to stabilize. If GLSL support is still not there, then I have a fallback to the 1.4 ARB_*_program extensions.

If an IHV other than MS were to embrace a graphics API, which one would they choose? Definitely not DX. There's that to consider as well.

Err, yes Dx9 interfaces are new; if you want to use an old interface you still can. Admittedly, with OGL extensions you can occasionally move forward without having to tweak the majority of your API calls; then again, if you'd been going from Dx8 to Dx9 instead of OGL to Dx9 you would probably have found this much easier.

John.
 
JohnH said:
Err, yes Dx9 interfaces are new; if you want to use an old interface you still can. Admittedly, with OGL extensions you can occasionally move forward without having to tweak the majority of your API calls; then again, if you'd been going from Dx8 to Dx9 instead of OGL to Dx9 you would probably have found this much easier.

John.

I have 3 renderers: 1 OpenGL, 1 D3D8 and 1 D3D9.
I wrote the OpenGL one first; moving to D3D8 was a pain, but definitely less than going from OpenGL to D3D6, which lacked plenty of features.
Moving from D3D8 to D3D9 wasn't easy; a few things changed here and there, just enough to piss you off...
On the other hand, I had no problem adding new features to the GL renderer (although I did not have GL2 extensions as of yet).
 
Ingenu said:
JohnH said:
Err, yes Dx9 interfaces are new; if you want to use an old interface you still can. Admittedly, with OGL extensions you can occasionally move forward without having to tweak the majority of your API calls; then again, if you'd been going from Dx8 to Dx9 instead of OGL to Dx9 you would probably have found this much easier.

John.

I have 3 renderers: 1 OpenGL, 1 D3D8 and 1 D3D9.
I wrote the OpenGL one first; moving to D3D8 was a pain, but definitely less than going from OpenGL to D3D6, which lacked plenty of features.
Moving from D3D8 to D3D9 wasn't easy; a few things changed here and there, just enough to piss you off...
On the other hand, I had no problem adding new features to the GL renderer (although I did not have GL2 extensions as of yet).

Do you really think that going to GL2 ext's would have been that much less painful than Dx8->9, for example if you wanted to add use of MRTs?

John.
 
JohnH said:
Do you really think that going to GL2 ext's would have been that much less painful than Dx8->9, for example if you wanted to add use of MRTs?

John.

mmhh... I'm not sure what MRT is. It's Multiple Render Targets, OK, but does that mean rendering to more than one target at once, or simply having more than one render target available at a given time?

If it's the latter, then I believe the WGL_ARB_pbuffer extension manages that, and it's not OpenGL 2.0...
 
MRT implies the capability to render to multiple buffers at the same time, in the same pass, from the same pixel shader program, with different data going to each buffer. This is not the same as older pbuffers, where you need one pass for each pbuffer you wish to render to - although you could easily simulate MRT operation with a collection of same-sized pbuffers (albeit at reduced performance).
 
Ingenu said:
JohnH said:
Do you really think that going to GL2 ext's would have been that much less painful than Dx8->9, for example if you wanted to add use of MRTs?

John.

mmhh... I'm not sure what MRT is. It's Multiple Render Targets, OK, but does that mean rendering to more than one target at once, or simply having more than one render target available at a given time?

If it's the latter, then I believe the WGL_ARB_pbuffer extension manages that, and it's not OpenGL 2.0...

Didn't think pbuffers were as capable as MRTs; could be wrong, I haven't looked at it recently...
 
arjan de lumens said:
MRT implies the capability to render to multiple buffers at the same time, in the same pass, from the same pixel shader program, with different data going to each buffer. This is not the same as older pbuffers, where you need one pass for each pbuffer you wish to render to - although you could easily simulate MRT operation with a collection of same-sized pbuffers (albeit at reduced performance).

Trying to think if the perf differential is really that much... Hmm, I guess it depends on whether you end up needing to re-submit your geometry multiple times in order to generate all the bits you need in your various pbuffers. Not really simulating MRTs, just doing multi-pass really?

John.
 
Let's not anthropomorphise companies; yes, different companies have different cultures... but a large part of NVIDIA's strategy was born of it being the market leader, and a large part of ATI's was born of it not being.

Nice to see the extensions to that future-proof GL2 already rolling in, BTW ;)
 
But it's not a core part of the API, it's yet another extension; I'm not sure how this doesn't lead to a fragmented API.

I'll shut up now; I'm the only one defending Dx, and I'm starting to feel lonely :(

John.
 
JohnH said:
I'll shut up now, I'm the only one defending Dx, and I'm starting to feel lonely :(
I also would like to defend it, but this thread/poll is pretty much pointless... Everyone uses the library he/she likes, and if it's OK for him/her, why should we care?
 