If Doom 3 had been written in D3D....

I suspect the primary reason John continues to develop in OpenGL (aside from comfort) is that id puts out versions of their games for both the Linux and Macintosh platforms. Direct3D would be problematic for both those platforms.
I always thought this as well, until I looked at other multiplatform games/engines. The biggest players in the engine market are id Software, Epic Games, and LithTech/Monolith. Of the three companies, all have games running on other platforms (hardware), but only id uses OpenGL for its primary renderer on Windows. If D3D->OpenGL conversions are so difficult, why do Epic and LithTech insist on doing them?

Carmack's bias toward OpenGL had technical merits in the past, but those merits have been slowly swept aside with every DX release. Now I'd say it's just personal preference. He's entitled to that, for sure, and I have no problem with his games running in OpenGL.
 
see colon said:
If D3D->OpenGL conversions are so difficult, why do Epic and LithTech insist on doing them?

Maybe the question should be: why should JC use DX on Windows when he intends to port to OpenGL anyway and can get the job done just as well with either API?
 
see colon said:
I suspect the primary reason John continues to develop in OpenGL (aside from comfort) is that id puts out versions of their games for both the Linux and Macintosh platforms. Direct3D would be problematic for both those platforms.
I always thought this as well, until I looked at other multiplatform games/engines. The biggest players in the engine market are id Software, Epic Games, and LithTech/Monolith. Of the three companies, all have games running on other platforms (hardware), but only id uses OpenGL for its primary renderer on Windows. If D3D->OpenGL conversions are so difficult, why do Epic and LithTech insist on doing them?

Carmack's bias toward OpenGL had technical merits in the past, but those merits have been slowly swept aside with every DX release. Now I'd say it's just personal preference. He's entitled to that, for sure, and I have no problem with his games running in OpenGL.

Obviously you CAN port an engine using D3D to other platforms; it's just that if your primary API is OpenGL (and OpenGL is supported by your target platforms), that port is going to be a WHOLE lot easier. Out of curiosity, I wonder what the primary renderer is for the Unreal engine on the Macintosh platform? But I also agree comfort surely plays a part in it too. Comfort + portability benefits = little reason for him to switch.
 
Not to mention Carmack tends to code for specific graphics architectures, not just a general archetype. OGL allows IHVs to give him extensions for the new hotness, so he doesn't have to stick with the middle of the road and the possibly old and busted.
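To make that concrete: extension use in GL boils down to probing the driver's extension string at runtime and taking the vendor path when it's there. A minimal sketch of the pattern (the extension name is just an example of the idea, not something lifted from id's code):
Code:
#include <GL/gl.h>
#include <cstring>

// true if the driver advertises the named extension
// (note: naive substring match; a robust check would compare whole tokens)
static bool hasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != NULL && std::strstr(ext, name) != NULL;
}

// ...at renderer init:
// if (hasExtension("GL_NV_register_combiners")) { /* take the NV fast path */ }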
 
Lol. I suspect the primary reason John continues to develop in OpenGL (aside from comfort) is that id puts out versions of their games for both the Linux and Macintosh platforms. Direct3D would be problematic for both those platforms...

Actually, DirectX is available for Mac: www.macdx.com. Quite a lot of DirectX games have been ported to Mac this way (stuff like Halo and Max Payne 2, for example).
As for Linux... well, unless you have an NVIDIA card, OpenGL support is bad or non-existent anyway, as I understand it. So as far as I'm concerned, the portability issue is mostly theoretical.
And of course the rendering API used is only a small portion of the entire code. It's not that big a deal to port Direct3D code to OpenGL or something else, or vice versa. Some games support multiple APIs, and many games are ported to consoles like the PS2 or GameCube, which don't support OpenGL or Direct3D directly either.
And well... there is a Doom3 port for XBox, so... :)

Personally I don't get why Carmack uses OpenGL... and I heard that he himself doesn't know anymore either... I'm not sure where I read it, but I believe that Carmack said somewhere that his next engine would be in DirectX.
 
There is a whole lot more to bad OpenGL performance on Linux than the drivers supplied with your video card. It starts with XFree86 being in a slump. The team that controls XFree86 has become a hierarchical committee of managers, so to speak. Change is bad, but they decide what goes, which has been almost nothing for years.

Therefore, two developments have forked off. Mesa, which started out as a software emulation layer, combined with the DRI (Direct Rendering Infrastructure) offshoot from SGI to form MesaDRI, which has evolved over the last few years into a combined graphical API that supports hardware-accelerated OpenGL under XFree86 (which is forking as well).

It is quite a bit more complicated than that, of course, but the underlying GL API is improving very rapidly.

For example, as I have an M9200 in my laptop, no Linux drivers are supplied by ATI; they direct you to the manufacturer. As this laptop is a clone, I could forget hardware-accelerated 3D until I installed the MesaDRI package. Since then, it has been working like a charm.

So not only has the underlying API that you need to do 3D under XFree86 improved dramatically, very good drivers for just about any video card are supplied as well. You just have to go and get the recent package from the new MesaDRI code tree.

Things are looking good for 3D under Linux.
 
DiGuru, I don't quite have a grasp of all the issues, but shouldn't Xorg help in that space quite dramatically? From what I'm seeing, their rate of progression is excellent and thanks to XFree86's license issues, their adoption rate is insanely high.
 
Saem said:
DiGuru, I don't quite have a grasp of all the issues, but shouldn't Xorg help in that space quite dramatically? From what I'm seeing, their rate of progression is excellent and thanks to XFree86's license issues, their adoption rate is insanely high.

I haven't got any experience with it yet, but it sure looks promising. Only a year ago, people were still hoping that the forks would push the old XFree86 members back into action, but at the moment it seems unlikely that will happen. Xorg with MesaDRI seems to be becoming a very nice (and much better!) alternative. I will definitely try it out in the near future.
 
It would look worse and be slower. D3D lacks the texture crossbar, which helps lower the number of passes, and it doesn't have register combiners, which allow nifty maths on NV hardware. There is also an issue with heavy state changes under D3D. The worse looks and speed are something I experienced by writing a Doom3 renderer under both D3D and GL. I quickly dropped D3D after that.
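For readers wondering what register combiners buy you: they expose per-pixel math (dot products, signed arithmetic) that the stock texture environment can't express. A minimal sketch of the classic DOT3 bump-mapping setup, written from the NV_register_combiners spec rather than from any Doom 3 code:
Code:
// one general combiner stage computing spare0.rgb = dot(normal map texel, light vector)
glEnable(GL_REGISTER_COMBINERS_NV);
glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);
// A = unit 0's texel, B = light vector in primary color; both expanded from [0,1] to [-1,1]
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                  GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                  GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);
// the AB output becomes a dot product, written to the spare0 register
glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB, GL_SPARE0_NV, GL_DISCARD_NV,
                   GL_DISCARD_NV, GL_NONE, GL_NONE, GL_TRUE, GL_FALSE, GL_FALSE);
// final combiner computes A*B + (1-A)*C + D; zero A and C so the output is just D = spare0
glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_SPARE0_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);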
 
D3D lacks the texture crossbar, which helps lower the number of passes, and it doesn't have register combiners, which allow nifty maths on NV hardware.

What is a crossbar, and who cares about register combiners when you have programmable shaders?
 
For example, as I have an M9200 in my laptop, no Linux drivers are supplied by ATI; they direct you to the manufacturer. As this laptop is a clone, I could forget hardware-accelerated 3D until I installed the MesaDRI package. Since then, it has been working like a charm.

I would be interested in benchmarks of these drivers against the official Windows drivers. In short, I can't imagine that a bunch of amateurs manage to write better drivers in their spare time than full-time professionals who have inside information on the chip's architecture.
 
RejZoR said:
I'd call Carmack "The Master of OpenGL". And as someone mentioned, OpenGL is a cross-platform API, DX is not.
Unless OpenGL is a practical API for the PS2 (and I don't think that's the case), cross-platform capabilities are practically worthless from a gaming perspective. Mac games are barely worth making, and Linux games are an act of charity.
 
JD said:
There is also an issue with heavy state changes under D3D.
Actually, I see this as the biggest problem for OpenGL going forward.

DX9 is now pretty lightweight, because everyone in the Direct3D software chain has been encouraged by a great many game developers to greatly improve small-batch performance. GL, in contrast, is hamstrung somewhat by the complexity of GL and the sheer mass of overlapping and interlocking extensions, because there's no prospect of introducing backwards incompatibility to thin the validation out. This seems to me only likely to get worse as GL takes on more functionality.

There's already one well-known pathological case (SetVertexShaderConstant vs. glProgramEnvParameter) where D3D can be an order of magnitude faster.
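To show the shape of that case: with the DX8-style SetVertexShaderConstant named above, a whole block of constants goes up in one validated call, while ARB_vertex_program takes one call, and one round of driver validation, per vec4. A sketch, assuming a DX8 device pointer:
Code:
// D3D (DX8 style): 32 vec4 constants uploaded and validated in a single call
float c[32][4];
device->SetVertexShaderConstant(0, c, 32);

// OpenGL (ARB_vertex_program): one call per vec4, each with its own validation
for (unsigned i = 0; i < 32; ++i)
    glProgramEnvParameter4fvARB(GL_VERTEX_PROGRAM_ARB, i, c[i]);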
 
Yeah, OGL being cross-platform really makes a difference... especially on home consoles.

OpenGL can easily be ported to almost any console due to its open-source nature... but D3D is Windows only. Plain and simple.
 
Blade said:
Yeah, OGL being cross-platform really makes a difference... especially on home consoles.

OpenGL can easily be ported to almost any console due to its open-source nature... but D3D is Windows only. Plain and simple.

You're being sarcastic, right?

How is it any easier to write
Code:
// GL-style shim: a free function forwarding to the console's native call
void glTriangle( stuff )
{
   DoConsoleThing( stuff );
}
vs
Code:
// D3D-style shim: the same forwarding call, wrapped in a device object
class FakeD3DDevice
{
  void DrawPrimitive( stuff )
  {
      DoConsoleThing( stuff );
  }
};
Not that you would do either, as it's the wrong level of abstraction to drive a renderer at.

I know of 'near'-compatible versions of DX8 (or was it 7...) and OpenGL for the PS2, but performance sucks for obvious reasons.

It's not unusual for ports from one platform to write compat layers to get things up and running quickly. You then usually migrate performance-critical functions to 'native' functions.

SH2PC had a pretty good emulation of XBOX DirectX at one point...
 
Dio said:
There's already one well-known pathological case (SetVertexShaderConstant vs. glProgramEnvParameter) where D3D can be an order of magnitude faster.
Can you elaborate a bit, please?
Are local params more efficient than env params? Or is it the whole distinction (and possible mix) between the two that causes headaches?
I always thought that env params were easier to implement than local params. Local params imply a copy per program object, so that's undesirable if I want a param to affect a multitude of programs.
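For anyone who hasn't touched ARB_vertex_program: env params live in shared per-target state that every program sees, while local params are stored per program object, which is where the copy-per-program cost comes from. A quick sketch (the program handle is hypothetical):
Code:
float x = 1.0f, y = 0.0f, z = 0.0f, w = 1.0f;  // example values

// env param: one write, visible to every program bound to this target
glProgramEnvParameter4fARB(GL_VERTEX_PROGRAM_ARB, 0, x, y, z, w);

// local param: affects only the currently bound program object
glBindProgramARB(GL_VERTEX_PROGRAM_ARB, skinningProg);  // hypothetical handle
glProgramLocalParameter4fARB(GL_VERTEX_PROGRAM_ARB, 0, x, y, z, w);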
 
Approved by ARB on February 16, 2001.

Didn't we have pixel shaders by then anyway? And does this extension not require cards with pixel shaders to actually support it? I mean, I don't expect just any card to support this, since it requires specific hardware to pull it off.
In which case, OGL would be too little, too late.
 
Scali said:
Approved by ARB on February 16, 2001.

Didn't we have pixel shaders by then anyway? And does this extension not require cards with pixel shaders to actually support it? I mean, I don't expect just any card to support this, since it requires specific hardware to pull it off.
In which case, OGL would be too little, too late.
Remember all the stuff about the original Radeon supposedly having pre-SM1.0 pixel shader support? Something about ATI having been screwed by Microsoft in DirectX 8.0 through a lack of support for its "shaders"? Well, the crossbar, as well as some other fancy multitexturing features (including some ATI extensions, in OpenGL), is what basically made up that shading technology. It was still heavily related to fixed-function shading, but had some added functionality to make it more versatile.
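To make the crossbar concrete: plain ARB_texture_env_combine only lets a stage read its own unit's texture, while ARB_texture_env_crossbar lets a stage's combiner source name any unit's texture, so more work folds into one pass. A minimal sketch:
Code:
// on unit 1, modulate unit 0's texel with unit 1's own texel in a single pass
glActiveTextureARB(GL_TEXTURE1_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE0_ARB);  // crossbar: unit 0's texture
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE);       // this unit's own texture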
 