Which API is Better?

  • DirectX9 is more elegant, easier to program
  • Both about the same
  • I use DirectX mainly because of market size and MS is behind it

  Total voters: 329
DiGuru said:
Even if we would all agree that one of them is better than the other one, would that mean that more applications will be written that support it? Who makes the decision to use DX9 or OGL? The developer (that wants the best tool for the job) or the management (that chooses the one with the largest market share)?
Somewhat off topic, but no publisher should be under the utterly false assumption that DirectX somehow has "more marketshare" than OpenGL. All consumer hardware supports both APIs quite well.

Regardless, what I hope is that DirectX adopts a model more similar to GLSlang, and totally drops the intermediate assembly (or, at least, drops the shader version paradigm).
 
Chalnoth said:
DiGuru said:
Even if we would all agree that one of them is better than the other one, would that mean that more applications will be written that support it? Who makes the decision to use DX9 or OGL? The developer (that wants the best tool for the job) or the management (that chooses the one with the largest market share)?
Somewhat off topic, but no publisher should be under the utterly false assumption that DirectX somehow has "more marketshare" than OpenGL. All consumer hardware supports both APIs quite well.

Regardless, what I hope is that DirectX adopts a model more similar to GLSlang, and totally drops the intermediate assembly (or, at least, drops the shader version paradigm).

Yes, any PC will run OGL just as well as DXx. But I am not sure everyone (read: management) knows that.

If DX would drop the fixed intermediate format, that would be very nice indeed. I'm not against Microsoft, I'm against things that halt innovation.

I want to live in interesting times. :D
 
Chalnoth, DemoCoder, Humus and Xmas:

;-D

One paragraph would do for all of us. Good thing we agree the way we do.
 
JohnH said:
Humus said:
No, the problem is the intermediate language. You can't optimize for both architectures with a common profile. Yes, the GFFX hardware is slow too, but that's just another problem. Both vendors' hardware could be faster had they had the opportunity to compile the code themselves, though the difference would probably be larger on the GFFX side.
I would be prepared to bet a large sum of money that you are completely wrong here (well, within a few percentage points).

John.


Well, be prepared to pay then. 8)
One example is that removing common subexpressions is a good optimization for the R300 since register usage is for free. On the NV30 on the other hand it's not an optimization at all, rather the opposite since register usage is costly. There's no way a common intermediate version can be optimal for both, so the compiler needs to either unfairly favor one, or come up with a good compromise that's decent but not optimal for either card.
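The trade-off can be sketched in ps_2_0-style assembly (a schematic, hypothetical fragment, not output from any real compiler):

```
; Without CSE: the product v0*v1 is recomputed where it is needed,
; so r0 is dead in between and that register slot can be reused.
mul r0, v0, v1      ; t = a*b
add r1, r0, c0      ; first use of t
; ...long stretch of unrelated instructions; r0 is free here...
mul r0, v0, v1      ; recompute t
mul r2, r0, c1      ; second use of t

; With CSE: one mul is saved, but r0 must stay live across the
; whole stretch, raising peak register pressure.
mul r0, v0, v1      ; t = a*b, kept alive
add r1, r0, c0
; ...long stretch of unrelated instructions; r0 stays occupied...
mul r2, r0, c1
```

On the R300 the second form is a pure win; on the NV30, where each extra live register costs throughput, the first form can be the faster one.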
 
JohnH said:
Humus said:
The R9700 also has major problems with certain GL_ARB_fragment_program code.
Such as? Last I heard DoomIII didn't have any issues with ARB frag on ATi HW. Seriously, out of interest, what sort of problems?

For instance, non-native swizzles. A single one can expand to many instructions.
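As an illustration (a made-up fragment, not taken from any particular app): ARB_fragment_program allows arbitrary per-component swizzles, but hardware that natively supports only a limited set has to expand them using write masks:

```
# One instruction in the program...
MOV r0, r1.yzxw;

# ...can become several on hardware without native arbitrary swizzles:
MOV r0.x, r1.y;
MOV r0.y, r1.z;
MOV r0.z, r1.x;
MOV r0.w, r1.w;
```

A handful of such swizzles in a shader can noticeably inflate the instruction count.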
 
Humus said:
Well, be prepared to pay then. 8)
One example is that removing common subexpressions is a good optimization for the R300 since register usage is for free. On the NV30 on the other hand it's not an optimization at all, rather the opposite since register usage is costly. There's no way a common intermediate version can be optimal for both, so the compiler needs to either unfairly favor one, or come up with a good compromise that's decent but not optimal for either card.

Also, LRP and CMP are expensive on the NV3x, but SINCOS is very cheap. FXC has an affinity for choosing the former over other constructs. IF_PRED would be more efficient for the NV30, and the SINCOS instruction is way cheaper than a power-series expansion. The power-series expansion is devastatingly inefficient on the NV3x because it eats up multiple extra registers. Other examples are the BIAS and SHIFT operations, which on DX8 HW and some DX9 HW are hardware supported. But DX9 can't represent them, so "(x - 0.5)*2" generates code like

def c0, 0.5, 2.0, 0, 0
sub r0, v0, c0.x
mul r0, r0, c0.y

Which requires the driver to do some real heavy lifting to figure out what is going on, since it has to inspect the contents of the constant registers themselves to determine whether it can generate a HW BIAS/SCALE modifier or not. And if the code is 2*x - 1, a GLSLANG compiler could still figure out how to use the HW bias/scale via strength-reduction techniques, but FXC will merrily generate raw code for it.
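For reference, the strength reduction in question is just the identity (x - 0.5)*2 = 2x - 1, which DX8-class hardware applies for free with the _bx2 source modifier. A schematic ps_1_1-style fragment (hypothetical, for illustration only):

```
; Without the modifier: two ALU instructions and a temp register.
; (c0 is assumed to hold 0.5.)
sub r0, t0, c0      ; r0 = t0 - 0.5
add r0, r0, r0      ; r0 = (t0 - 0.5)*2

; With the modifier: the bias/scale rides along for free on the source.
dp3 r0, t0_bx2, v0  ; t0_bx2 = 2*t0 - 1, applied in the same instruction
```

A compiler that sees the whole expression tree can spot the pattern directly; one that only sees the generic sub/mul sequence has to reverse-engineer it from the constant values.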

Oh, did I mention that FXC doesn't do constant folding correctly? I've noticed that it will sometimes waste a register adding two constants together at runtime when the result could have been folded at compile time.

FXC is more efficient for a register-combiner-like phased pipeline (e.g. R300), and unfortunately it doesn't take kindly to the NV3x's choice of specialized SINCOS and predication HW.

I don't believe the NV3x's pipeline will ever beat an R300, even if both are optimized to the max. But the issue is not whether the R300 is a killer card that destroys the NV3x; the issue is whether DX9 will restrict future pipelines that have more flexibility. It has hurt the NV3x already, and I'm worried that when real HW branching is introduced in the R300's successor, we are going to run into significant problems.
 
Joe DeFuria said:
Really? Then how come if you try to use a GL driver from, say, Matrox for anything other than Quake-engine games, you're likely not to get satisfactory results? Yet I bet the DX driver is a lot more robust (plays a wider variety of games).

I'd love to see you take any Matrox card, a 3dfx 5500, a Kyro II, S3's shipping chips, Intel's solutions.... and run it on a variety of GL apps and games vs. the same chips on a variety of DX games.
Sorry, I know this was said a long time ago, but: The specific (Matrox) example you've used hasn't been accurate since ~Mid-2000 or so.

If you'd like I can run some games on my Parhelia or dust off the old G400Max.
 
keegdsb said:
Sorry, I know this was said a long time ago, but: The specific (Matrox) example you've used hasn't been accurate since ~Mid-2000 or so.

If you'd like I can run some games on my Parhelia or dust off the old G400Max.

I was talking about developers. Are developers going to be happy using Matrox GL as a development platform?
 
Joe DeFuria said:
I was talking about developers. Are developers going to be happy using Matrox GL as a development platform?
No, because Matrox' cards have almost no marketshare and are rather lacking in the feature department. OpenGL doesn't come into it.
 
One only needs to look at the recent game titles being released to see what developers prefer. If that is based on the market being M$-driven, well, that is just the nature of the beast (the beast being M$).
 
There's a difference between what developers prefer, and what developers are forced to use. The difference between joy and pain, work and pleasure.

iD is pretty much the only developer with the cachet to tell MS to piss off.
 
Mostly, it's not developers who decide what to use, but the management. Who think DX is their best option. Because of Microsoft, of course. Most don't even know that OGL would do just as well and would make no difference in market share. Sad, but true.
 
DemoCoder said:
iD is pretty much the only developer with the cachet to tell MS to piss off.

Correct. iD (and ironically, 3dfx) is the only reason we have GL for consumer cards today.

DiGuru said:
Mostly, it's not developers who decide what to use, but the management. Who think DX is their best option.

For what reason is "management" insisting on DX?
 
Joe DeFuria said:
DiGuru said:
Mostly, it's not developers who decide what to use, but the management. Who think DX is their best option.

For what reason is "management" insisting on DX?

Because Bill Gates is the richest man on the planet. He knows best. And they might even be able to shake hands and talk with him if they choose DX and make a big hit! That's Important. It's the stuff you tell your grandchildren.
 
DiGuru said:
Because Bill Gates is the richest man on the planet. He knows best. And they might even be able to shake hands and talk with him if they choose DX and make a big hit! That's Important. It's the stuff you tell your grandchildren.

Can you give me a real answer?
 
Joe DeFuria said:
DiGuru said:
Because Bill Gates is the richest man on the planet. He knows best. And they might even be able to shake hands and talk with him if they choose DX and make a big hit! That's Important. It's the stuff you tell your grandchildren.

Can you give me a real answer?

Honest, that's it. Ask any manager. That, and the certainty that they cannot go wrong by choosing Microsoft. If you have to make a decision, choose the market leader. Just like anyone else.
 
DiGuru said:
Honest, that's it. Ask any manager. That, and the certainty that they cannot go wrong by choosing Microsoft. If you have to make a decision, choose the market leader. Just like anyone else.

Sorry, I just don't buy it. Of course, management is going to target Win32 platform, but choosing the API has little to do with that.
 
Joe DeFuria said:
DiGuru said:
Honest, that's it. Ask any manager. That, and the certainty that they cannot go wrong by choosing Microsoft. If you have to make a decision, choose the market leader. Just like anyone else.

Sorry, I just don't buy it. Of course, management is going to target Win32 platform, but choosing the API has little to do with that.

Just ask any manager.

They're not interested in technical details anyway. They have other things to do. If choosing OGL would be very much cheaper, it might be different. But only if a lot of others would use and recommend it as well.

A technical person recommending one over the other would be OK if he took all the blame should it go bad. Otherwise, the risk is too great.
 
DiGuru said:
Just ask any manager. They're not interested in technical details anyway.

I don't know any managers at gaming development houses.
I'm a manager for other types of development. And I'm not necessarily interested in the tech details either....I'm mostly interested in the job getting done.

And that means I ask the people doing the job what they need. If it doesn't cost me anything extra, they get it.

If choosing OGL would be very much cheaper, it might be different. But only if a lot of others would use and recommend it as well.

You've got it backwards. If OGL isn't more expensive, then there's no reason I wouldn't allow them to use it if that's what they're asking for.

No offense, but it sounds to me like you're talking out of your ass a bit with respect to "managers."
 