Chalnoth said: In the past, smaller IHVs and ATI have managed the issues by optimizing specifically for certain games.
Scali said: After all these years of driver writing, one would expect at least ATi to have decent OpenGL drivers, if it were no harder than Direct3D drivers, which they did manage quite well.
We'll see... Doom3 apparently has given ATi's OpenGL team a new impulse. But I doubt they will improve all that much, really.
DaveBaumann said: The DX driver is a clean slate for R300 onwards. OpenGL wasn't - not only do they have the historical code in there, but also a bunch that was added after they bought FireGL.
Naturally, though, there is only so much that a driver can do, and over half the passes taken in each frame of Doom 3 will favour NVIDIA's architecture more than ATI's.
Scali said: Wrong. The OS apparently has no way of knowing that a window has exclusive fullscreen, or that it has its own resolution, like in Direct3D.

You can't be serious about that. Ignored.
Scali said: So apparently the OpenGL context is attached to the window, but not the other way around. OpenGL knows about the window, but the window does not know about OpenGL.

Why should it? A window is just a container for stuff. And what do you mean anyway? Just because OpenGL relies on the "canvas" provided by a window doesn't mean that the window would somehow magically be able to care about OpenGL.
Scali said: Stupid point? I think not. Alt-tab in Doom3 and you see what I mean.

OpenGL doesn't give a fuck about and has nothing to do with ALT+TAB. What happens on ALT+TAB is determined by the message pump.
Scali said: Obviously ChangeDisplaySettings is a GLOBAL function, not related to the window itself, let alone to any OpenGL contexts attached to it. Which is where the problems stem from.

Global? Another straw?
Scali said: But you knew this, I hope?

No, this is all completely new and intriguing to me k'thx.
Xmas said: Btw, an OpenGL context is never "lost" *snickers*.

Scali said: Yes, how exactly is that possible?

Maybe because OpenGL driver writers are graphics card driver writers? And maybe because surfaces are just surfaces and do not necessarily need to be gone forever upon a modeswitch? Maybe because this is what happens to all your "regular" windows, and keeping the OpenGL state alive separately is no effort anyway?
Scali said: Obviously when you switch away from a fullscreen application, the videomemory is trashed. Apparently OpenGL leaves this to be handled by the driver in some way. Perhaps this again has something to do with that architecture, which is harder to implement than the Direct3D one?
Scali said: Again, calling a GLOBAL Windows API function is the opposite of Windows integration. If you want to look at integration: in Direct3D you set the gamma on the D3DDevice itself. Which is also why a crashing D3D app has no problems resetting the global settings; its settings were only local.

See above. And one more thing: if Direct3D rolls its own in these things, it's not more integrated with the OS, it's more agnostic of the OS. It ignores, disobeys, overrides and replaces the OS as long as it's visible in fullscreen. I wouldn't call that "integrated". I'd call that a glideism.
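The local-vs-global distinction being argued here can be sketched in a few lines. This is only an illustration, not real API code: `Desktop`, `D3DStyleDevice` and `GlobalGammaApp` are invented names; the real Windows calls would be IDirect3DDevice9::SetGammaRamp (local to the fullscreen device) versus the system-wide GDI function SetDeviceGammaRamp.

```cpp
#include <cassert>

// Toy model of the argument above: a device-local setting dies with the
// device, while a global setting outlives a crashed process. All names
// are invented for illustration; the real calls would be
// IDirect3DDevice9::SetGammaRamp (device-local) and the GDI function
// SetDeviceGammaRamp (system-wide).

struct Desktop {
    float gamma = 1.0f;   // stands in for the system-wide gamma setting
};

// D3D-style: the ramp lives on the device object, not on the desktop.
struct D3DStyleDevice {
    float gamma = 1.0f;
    void setGammaRamp(float g) { gamma = g; }   // never touches Desktop
};

// Global-call style: the app pokes the system-wide setting directly.
struct GlobalGammaApp {
    Desktop* desktop;
    void setGamma(float g) { desktop->gamma = g; }  // global side effect
};
```

If the device-local object disappears, the desktop gamma was never touched; if an app using the global call dies, the changed value simply stays behind.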
Scali said: I hope this was educational for you.

No, this kind of discussion is in fact quite detrimental to my respect for human life in general. But I'm looking forward to sentences written by you that you actually spent some thought on.
Scali said: No, I think that if problems aren't fixed, they are not mentioned in the changelog. Especially with ATi, we all know they have performance problems and bugs in a lot of OpenGL applications, but a lot of them don't get fixed, so they are not in the changelog. So the changelog is not a good measurement for bugs.

Then why did you suggest looking at changelogs in the first place? What is a good measurement then? Your subjective experience?
Scali said: I never claimed that the OpenGL API was slower. I claimed that all manufacturers (possibly NVIDIA excluded) have relatively poor performance in OpenGL compared to Direct3D, and the benchmarks show that. This is common knowledge anyway. Don't pretend you don't know that ATi/Matrox/S3/XGI perform poorly in OpenGL.

From those benchmarks you could equally claim NVIDIA has poor D3D performance, and OpenGL performance is as it should be. Just using a different reference point than you did.
That's about as much effort as is necessary in OpenGL. Just a window message handler that restores the desktop settings in one case, and restores the game state in another. You need those functions in your engine anyway (just like in D3D you need the code to create the unmanaged resources regardless of task switching), so the additional work is close to zero.

Yep, window message handling is indeed an utter mess, and a major hack. It should be done away with.
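The message-handler approach described here can be sketched as follows. This is a portable mock, not real Win32 code: `DisplayState`, `FullscreenApp` and `handleActivate` are invented names; in a real app the logic would live in the window procedure's WM_ACTIVATE handling, with ChangeDisplaySettings(NULL, 0) restoring the desktop mode and ChangeDisplaySettings(&devMode, CDS_FULLSCREEN) re-applying the game's mode.

```cpp
#include <cassert>
#include <string>

// Portable mock of the "restore on focus loss, re-apply on focus gain"
// message handler. Names are invented; real code would react to
// WM_ACTIVATE (WA_ACTIVE / WA_INACTIVE) in the window procedure and
// call ChangeDisplaySettings to switch the global display mode.

struct DisplayState {
    std::string mode = "desktop";   // stands in for the global display mode
};

class FullscreenApp {
public:
    explicit FullscreenApp(DisplayState& display) : display_(display) {}

    // Stand-in for the WM_ACTIVATE case of the window procedure.
    void handleActivate(bool nowActive) {
        if (nowActive)
            display_.mode = "game";      // re-apply the game's mode
        else
            display_.mode = "desktop";   // restore the desktop's mode
    }

private:
    DisplayState& display_;
};
```

Alt-tabbing away delivers the "inactive" case, so the desktop comes back in its own mode; switching back to the game re-applies the game mode.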
Scali said: Indeed, like with Direct3D, where alt-tab simply works for fullscreen applications. That's because the resolution and gamma settings etc. are local settings to the window. When the window loses focus, the desktop settings are restored. The programmer only needs to handle the unmanaged resources that are lost, and all is well. This is very simple and clean to do. E.g. a simple callback function that releases all unmanaged resources, and another one that recreates them.
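The two-callback scheme mentioned here (release unmanaged resources when the device is lost, recreate them on reset) can be sketched with a mock device. The names `Device`, `releaseUnmanaged` and `recreateUnmanaged` are invented for the example; in real Direct3D 9 code the trigger would be IDirect3DDevice9::TestCooperativeLevel returning D3DERR_DEVICELOST, followed by Reset() once the device can be restored.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Mock of the lost-device callback pattern. In real D3D9 code,
// TestCooperativeLevel() reporting D3DERR_DEVICELOST is the cue to
// release D3DPOOL_DEFAULT ("unmanaged") resources, and a successful
// Reset() is the cue to recreate them.

using Resources = std::vector<std::string>;
using Callback = void (*)(Resources&);

class Device {
public:
    Device(Callback onLost, Callback onReset)
        : onLost_(onLost), onReset_(onReset) {}

    void loseDevice() {                 // e.g. alt-tab away from fullscreen
        if (!lost_) { lost_ = true; onLost_(resources); }
    }
    void reset() {                      // e.g. focus regained, reset succeeds
        if (lost_) { lost_ = false; onReset_(resources); }
    }

    Resources resources;                // stands in for default-pool resources
private:
    bool lost_ = false;
    Callback onLost_, onReset_;
};

// Engine-side callbacks: one releases everything unmanaged, one rebuilds it.
inline void releaseUnmanaged(Resources& r)  { r.clear(); }
inline void recreateUnmanaged(Resources& r) { r = {"render_target", "dynamic_vb"}; }
```

The engine registers the pair once; the rest of the code never needs to know whether a task switch happened.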
And you just didn't get it. Let me help you: switch your desktop resolution. Right now. Are your windows still there after the switch? Can you create new windows after the switch? Oh wait, don't bother, both answers are "yes".
So which problems?
Why is it obvious to you that videomemory is lost upon a modeswitch? Of course I know where you got that impression from, but the video driver handles the freaking modeswitch itself, so it knows what's going to happen and can do whatever is necessary before "trashing" anything.
Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.
As you undoubtedly recall, I ignored the first part. Try to find out why.
For both D3D and OpenGL, you can find some games where you can't task switch, and you can find some where you can. This is definitely not an issue of the API.
Xmas said: Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.
Correct, this would never have happened with a D3D application. Sadly you lack the comprehension.
Scali said: I don't recall suggesting anything like that. Bugreports perhaps, but not changelogs.

This too.
Scali said: Also, if we compare the relative workloads and framerates of the different applications, we will see that NVIDIA has the closest match between OpenGL and D3D, while others have a significant gap between OpenGL and D3D.

I'd like to know the criteria you use in this comparison.
Scali said: The difference being that in D3D, it is not in the application itself. If the application is closed down ungracefully, the desktop settings are restored. OpenGL just hacks the global desktop settings itself, and they are not restored. And we all know how lovely it is to get stuck in a 640x480 desktop with gamma turned up.

Either you can't read my answers, or you don't want to. Either way, I see little use in continuing this discussion.
Either you still don't understand the problem, or you don't want to see it.
Xmas said: OpenGL doesn't give a fuck about and has nothing to do with ALT+TAB. What happens on ALT+TAB is determined by the message pump.

What problem? OpenGL just isn't aware nor concerned about ALT+TAB, window resizes, whatever. It's the application's responsibility. The application can tell OpenGL that the viewport size has changed in reaction to a window resize, it can throw away the GL context in reaction to the display window being minimized. And it indeed can do so because the OS informs the application about these events, even before they happen, via a mechanism commonly referred to as the "message queue". Got it?
Thank you for proving that you don't understand anything about the problem.
Scali said: Okay, let me explain this again, since you don't understand (and have the arrogance to claim I didn't get it?! People should be shot for that, really). Global means that it applies to the system as a whole, not to just a particular window. The problem here is that if the window is somehow lost (application crash, for example), the OS doesn't bother restoring the resolution, since it a) doesn't know that there is a resolution that needs to be restored, and b) doesn't know which resolution it would have to be restored to.

True for Win9x, wrong for Win2k. Win2k will restore the desktop resolution if an application bombs out of a fullscreen scenario.
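The failure mode being argued about here (a crash leaves the global mode behind, because only the application knows how to restore it) can be illustrated with a small sketch. Everything below is invented for illustration: `g_displayMode` stands in for the system-wide mode that ChangeDisplaySettings would alter, and the scope guard only covers normal exits and C++ exception unwinding, not a hard crash, which is exactly the case under dispute.

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

// Sketch of the "global setting left behind" problem. g_displayMode
// stands in for the system-wide mode that ChangeDisplaySettings would
// change; nothing in this model knows it should ever be restored.

static std::string g_displayMode = "desktop";

// Naive flow: the restore only happens if the code reaches it.
bool runGameNaive(bool simulateCrash) {
    g_displayMode = "640x480";             // global side effect
    if (simulateCrash)
        return false;                      // "crash": restore never runs
    g_displayMode = "desktop";             // normal exit restores
    return true;
}

// Scope guard: restores on scope exit, including exception unwinding.
// A hard crash (process killed) still skips the destructor.
struct ModeGuard {
    explicit ModeGuard(const std::string& gameMode) { g_displayMode = gameMode; }
    ~ModeGuard() { g_displayMode = "desktop"; }
};

bool runGameGuarded(bool throwError) {
    ModeGuard guard("640x480");
    if (throwError)
        throw std::runtime_error("boom");  // unwinding still runs ~ModeGuard
    return true;
}
```

The guard narrows the window in which the desktop can get stuck, but only the OS (as described for Win2k above) can cover the case where the process dies outright.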
Scali said: Only the application can handle that, and if the application is closed ungracefully, it doesn't handle it. Obvious problems, since your desktop will be screwed; it will still have the resolution and gamma settings that the OpenGL window used.

Res has been handled, and as for gamma, let me just say that it's the same thing as for any "normal" Windows application. OpenGL applications on Windows are just Windows applications. They don't come with their own execution environment.
Xmas said: No, this is all completely new and intriguing to me k'thx.

Scali said: That is painfully obvious, yes.

It's not my fault if you just invent your facts out of thin air. Of course I am surprised by your creativity, especially in its current form.
Scali said: Why would the driver have to handle that? The application has a much better idea of what needs to be handled. It would also make the work for the driver easier, which was my point.

No, applications don't have a better idea than the driver about when a modeswitch or any other external event might cause reallocation of video memory resources. It's the driver that performs these reallocations in the first place. Got it?
Scali said: Whatever you call it, it works, and OpenGL doesn't.

How does OpenGL not work?
Scali said: But of course you'd rather discuss nomenclature than the actual problems.

What problems? Gimme!
Xmas said: Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.

Scali said: Correct, this would never have happened with a D3D application. Sadly you lack the comprehension.

Thief 3 is purely a DirectX Graphics game. It's based on some version of the Unreal engine, but it ships without the multi-API support stuff. Comprehend that.
Xmas said: As you undoubtedly recall, I ignored the first part. Try to find out why.

Scali said: Because you are either unable to understand the problem with the way OpenGL works, or because you are an OpenGL zealot and want to defend OpenGL any way possible, for emotional reasons only. My bet is the second, since you are using all kinds of emotional arguments now.

I ignored the first part because it just was so strikingly stupid that it could only have been something you quickly made up to have, well, something. I figure if you had just reread that one sentence maybe a couple of times and had taken roughly twenty seconds to think, you would have seen yourself that it's just not worthy of writing, let alone commenting on.
Xmas said: It's silly actions that cause silly reactions.
I'm glad non-existent issues won't change...
Scali said: Judging from all the silly reactions, I have hit a sore spot regarding OpenGL.

That, or based on all the silly dribble you've spewed in this thread, the only sore spot is the hole you've dug for yourself.
Scali said: Go ahead and call me names, and pretend you don't know what I mean (or are you not pretending, and are you really that thick?).

Calling you names isn't going to change the DirectX issues, nor is it going to create any OpenGL issues that don't exist. You're grasping at straws, making hugely generalized logical fallacies to substantiate whatever vendetta it is you have against OpenGL. In short, you really suck at defending your point of view, to the point of making yourself look completely uneducated and borderline absurd.
Scali said: Calling me names isn't going to change the OpenGL issues.
Scali said: I have nothing more to add, because I have already given all the answers before, they just haven't been read, or understood.

No, I think you have nothing more to add because you have no retort for any of the logical data that was presented to you, NOR did you ever provide any sensible response. If I respond to you this way: "well the deflamulator izbot has gorked your nyewt-fran", I should have no logical expectation that you will give me any sort of answer. That's the gibberish you're spewing, and the reason that you're not getting any feedback.