Is Carmack the only one still using OpenGL?

epicstruggle said:
Hmm, didn't realize that NWN was OGL.

Most ATI card owners realized that...
They had to beg Bioware for quite some time to implement the water for ATI cards as well.
If it was in DX8 it would have worked right from the start...
 
Hyp-X said:
Most ATI card owners realized that...
They had to beg Bioware for quite some time to implement the water for ATI cards as well.
If it was in DX8 it would have worked right from the start...
You can't assume that it would have worked straight away - Morrowind is a good example where a D3D pixel-shaded material did funny things on an ATI card (to begin with, at least).
 
Neeyik said:
OpenGL guy said:
You can't assume that it would have worked straight away - Morrowind is a good example where a D3D pixel-shaded material did funny things on an ATI card (to begin with, at least).
I never saw any weird pixel shader problems in Morrowind on my 9700.
 
Hyp-X said:
If it was in DX8 it would have worked right from the start...

If it was in DX8 you wouldn't be able to have advanced effects on GF<3 cards. Same with doom3, if it was written in DX8 you wouldn't be able to run it on GF<3 cards (with GF3 quality, that is).
 
I'd think most professional game companies would choose DirectX over OGL for the same reasons John Carmack often explains when he's in the home stretch of his latest engine: all the cleanup and codepath work he needs to add to make his games look and perform well on different hardware. Doing this at scale requires someone with a fairly deep understanding of the hardware, more of a technology buff than just a coder, which is why id Software games are generally good value for anyone, whatever hardware they own.

Contrast this with Bioware, which has been in "back to the drawing board" mode ever since NWN was released. It still has severe performance and IQ problems on non-NVIDIA hardware.

OpenGL is still the only choice if you want cross-platform capability. Take a mostly-C code base written against OGL and, with only a little elbow grease, you can pop out a Linux port or a Mac port. The main thing left is the Win32 sound/music routines, but most coders are smart enough to wrap those behind a high-level interface, so only a small low-level library needs to be recreated. The end result is a more platform-friendly game.
 
That's what I keep saying: people still think there's a problem, but I could never see one on my 9700 Pro.

OpenGL guy said:
You can't assume that it would have worked straight away - Morrowind is a good example where a D3D pixel-shaded material did funny things on an ATI card (to begin with, at least).

Neeyik said:
I never saw any weird pixel shader problems in Morrowind on my 9700.
 
Tonyo said:
If it was in DX8 you wouldn't be able to have advanced effects on GF<3 cards. Same with doom3, if it was written in DX8 you wouldn't be able to run it on GF<3 cards (with GF3 quality, that is).

NWN doesn't have those "advanced effects".
(I assume you mean per-pixel specular lighting...)
 
K.I.L.E.R said:
That's what I keep saying: people still think there's a problem, but I could never see one on my 9700 Pro.
OpenGL guy said:
You can't assume that it would have worked straight away - Morrowind is a good example where a D3D pixel-shaded material did funny things on an ATI card (to begin with, at least).

Neeyik said:
I never saw any weird pixel shader problems in Morrowind on my 9700.
Just remember that Morrowind was released when the 9700 was not available. OpenGL guy stated "to begin with." That means, to me, that we're not talking about the timeframe of the 9700, but rather when Morrowind was released, meaning in the middle of the 8500's lifetime.

On topic:
I don't think OpenGL has declined any in the past few years. Back in the days of Quake2, JC seemed to be the only one writing engines for OpenGL. Now we have quite a few that work under OpenGL (Unreal, Serious Sam, NWN, etc.). One might say that Direct3D has grown faster because the games market as a whole has grown immensely since the days of Quake2, but I don't think it's valid to say that OpenGL has declined.
 
Hyp-X said:
NWN doesn't have those "advanced effects".
(I assume you mean per-pixel specular lighting...)

NWN will use the NV_register_combiners (GF1+) and/or NV_texture_shader (GF3+) extensions, if they are available, to perform "advanced effects". That's what ATI users were complaining about: ATI has an extension as capable as those two (ATI_fragment_shader, for the R200), but they weren't able to see those "advanced effects".
 