* Next Generation Cards - will they need 4.5+ GHz CPUs?

There are parts of the OpenGL driver that are more challenging to write, but that doesn't mean that they shouldn't be written properly.

In the past, smaller IHVs and ATI have managed the issues by optimizing specifically for certain games.
 
Chalnoth said:
In the past, smaller IHVs and ATI have managed the issues by optimizing specifically for certain games.

Presumably that was just a slip, and you meant to say all the IHVs (and not just in the past).
 
Scali said:
After all these years of driver writing, one would expect at least ATi to have decent OpenGL drivers by now, if they were no harder to write than Direct3D drivers, which they did manage quite well.
We'll see... Doom3 apparently has given ATi's OpenGL team a new impulse. But I doubt they will improve all that much, really.

It's not as simple as that - you should take a look at this:

DaveBaumann said:
The DX driver is a clean slate for R300 onwards. OpenGL wasn't - not only do they have the historical code in there, but also a bunch that was added after they bought FireGL.

Naturally, though, there is only so much that a driver can do, and over half the passes taken in each frame of D3 will favour NVIDIA's architecture more than ATI's.

http://www.beyond3d.com/forum/viewtopic.php?p=329700#329700
 
Doom3 is actually the least of my concerns. It uses so many extensions and so little geometry that the OpenGL driver itself doesn't get much of a chance to be the bottleneck.
I was thinking about other applications that have so far been buggy or performed below average with OpenGL, and that would benefit from the Doom3 impulse. Think of 3ds max, for example.

Doom3 doesn't really run badly on ATi cards; it's just that it runs so well on NVIDIA cards.
Actually, I sorta expect ATi to do some cheating in Doom3, to catch up. Or at least... if they do catch up, I suspect cheats.
 
I haven't paid attention to DX and OpenGL performance differences in public benchmarks of 3ds max, but over the past few years I've used OpenGL in 3ds max with my Matrox and ATi cards because it has performed better with my scenes, which are fairly low-poly. I haven't used 3ds max much in the past year, and I was using version 4 at the time of my benchmarks. For a quick benchmark I would usually load up the included scene of the dragon breathing fire and turn on FPS reporting for viewport playback.
 
Scali said:
Wrong. The OS apparently has no way of knowing that a window has exclusive fullscreen, or that it has its own resolution, like in Direct3D.
You can't be serious about that. Ignored.
Scali said:
So apparently the OpenGL context is attached to the window, but not the other way around. OpenGL knows about the window, but the window does not know about OpenGL.
Why should it? A window is just a container for stuff. And what do you mean anyway? Just because OpenGL relies on the "canvas" provided by a window doesn't mean that the window would somehow magically be able to care about OpenGL.

It's just a surface in the card's memory.
Scali said:
Stupid point? I think not. Alt-tab in Doom3 and you'll see what I mean.
OpenGL doesn't give a fuck about and has nothing to do with ALT+TAB. What happens on ALT+TAB is determined by the message pump.

Scali said:
Obviously ChangeDisplaySettings is a GLOBAL function, not related to the window itself, let alone to any OpenGL contexts attached to it. Which is where the problems stem from.
Global? Another straw?
And you just didn't get it. Let me help you: switch your desktop resolution. Right now. Are your windows still there after the switch? Can you create new windows after the switch? Oh wait, don't bother, both answers are "yes".
So which problems?
Scali said:
But you knew this, I hope?
No, this is all completely new and intriguing to me k'thx.
Scali said:
Btw, an OpenGL context is never "lost" *snickers*.
Yes, how exactly is that possible?

Obviously when you switch away from a fullscreen application, the video memory is trashed. Apparently OpenGL leaves this to be handled by the driver in some way. Perhaps this again has something to do with that architecture, which is harder to implement than the Direct3D one?
Maybe because OpenGL driver writers are graphics card driver writers? And maybe because surfaces are just surfaces and do not necessarily need to be gone forever upon a modeswitch? Maybe because this is what happens to all your "regular" windows, and keeping the OpenGL state alive separately is no effort anyway?

Why is it obvious to you that video memory is lost upon a modeswitch? Of course I know where you got that impression from, but the video driver handles the freaking modeswitch itself, so it knows what's going to happen and can do whatever is necessary before "trashing" anything.
Scali said:
Again, calling a GLOBAL Windows API function is the opposite of Windows integration. If you want to look at integration: in Direct3D you set the gamma on the D3DDevice itself. Which is also why a crashing D3D app has no problems resetting the global settings: its settings were only local.
See above. And one more thing: if Direct3D rolls its own in these things, it's not more integrated with the OS, it's more agnostic of the OS. It ignores, disobeys, overrides and replaces the OS as long as it's visible in fullscreen. I wouldn't call that "integrated". I'd call that a glideism.

Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.
Scali said:
I hope this was educational for you.
No, this kind of discussion is in fact quite detrimental to my respect for human life in general. But I'm looking forward to sentences written by you that you actually spent some thought on.

As you undoubtedly recall, I ignored the first part. Try to find out why.
 
Scali said:
No, I think that if problems aren't fixed, they are not mentioned in the changelog. Especially with ATi, we all know they have performance problems and bugs in a lot of OpenGL applications, but a lot of them don't get fixed, so they are not in the changelog. So the changelog is not a good measurement for bugs.
Then why did you suggest looking at changelogs in the first place? What is a good measurement then? Your subjective experience?

I never claimed that the OpenGL API was slower. I claimed that all manufacturers (possibly NVIDIA excluded) have relatively poor performance in OpenGL compared to Direct3D, and the benchmarks show that. This is common knowledge anyway. Don't pretend you don't know that ATi/Matrox/S3/XGI perform poorly in OpenGL.
From those benchmarks you could equally claim NVidia has poor D3D performance, and OpenGL performance is as it should be. Just using a different reference point than you did.

Yep, window message handling is indeed an utter mess, and a major hack. It should be done away with.

Indeed, like with Direct3D, where alt-tab simply works for fullscreen applications. That's because the resolution and gamma settings etc. are local settings to the window. When the window loses focus, the desktop settings are restored. The programmer only needs to handle the unmanaged resources that are lost, and all is well. This is very simple and clean to do. E.g. a simple callback function that releases all unmanaged resources, and another one that recreates them.
That's about as much effort as is necessary in OpenGL. Just a window message handler that restores the desktop settings in one case, and restores the game state in another. You need those functions in your engine anyway (just like in D3D you need the code to create the unmanaged resources regardless of task switching), so the additional work is close to zero.
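To make that concrete, here is a minimal sketch of such a handler in Win32/C++. GameWndProc and g_gameMode are illustrative names, not anything mandated by either API, and the sketch only shows the focus-change case:

    #include <windows.h>

    // Assumed to have been filled in when the game first switched modes (hypothetical).
    static DEVMODE g_gameMode;

    // Sketch: restore the desktop mode when focus is lost, re-apply the
    // game's mode when focus comes back.
    LRESULT CALLBACK GameWndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_ACTIVATE:
            if (LOWORD(wParam) == WA_INACTIVE)
            {
                ChangeDisplaySettings(NULL, 0);                     // back to the desktop mode
                ShowWindow(hWnd, SW_MINIMIZE);
            }
            else
            {
                ChangeDisplaySettings(&g_gameMode, CDS_FULLSCREEN); // game's mode again
                ShowWindow(hWnd, SW_RESTORE);
            }
            return 0;
        }
        return DefWindowProc(hWnd, msg, wParam, lParam);
    }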

For both D3D and OpenGL, you can find some games where you can't task switch, and you can find some where you can. This is definitely not an issue of the API.
 
OpenGL doesn't give a fuck about and has nothing to do with ALT+TAB. What happens on ALT+TAB is determined by the message pump.

Thank you for proving that you don't understand anything about the problem.

Global? Another straw?
And you just didn't get it. Let me help you: switch your desktop resolution. Right now. Are your windows still there after the switch? Can you create new windows after the switch? Oh wait, don't bother, both answers are "yes".
So which problems?

Okay, let me explain this again, since you don't understand (and have the arrogance to claim I didn't get it?! People should be shot for that, really).
Global means that it applies to the system as a whole, not just to a particular window. The problem here is that if the window is somehow lost (application crash, for example), the OS doesn't bother restoring the resolution, since it a) doesn't know that there is a resolution that needs to be restored, and b) doesn't know which resolution it would have to restore to.
Only the application can handle that, and if the application is closed ungracefully, it doesn't handle it. Obvious problems, since your desktop will be screwed: it will still have the resolution and gamma settings that the OpenGL window used.

No, this is all completely new and intriguing to me k'thx.

That is painfully obvious, yes.

Why is it obvious to you that video memory is lost upon a modeswitch? Of course I know where you got that impression from, but the video driver handles the freaking modeswitch itself, so it knows what's going to happen and can do whatever is necessary before "trashing" anything.

Why would the driver have to handle that? The application has a much better idea of what needs to be handled. It would also make the work for the driver easier, which was my point.

See above. And one more thing: if Direct3D rolls its own in these things, it's not more integrated with the OS, it's more agnostic of the OS. It ignores, disobeys, overrides and replaces the OS as long as it's visible in fullscreen. I wouldn't call that "integrated". I'd call that a glideism.

Whatever you call it, it works, and OpenGL doesn't. But of course you'd rather discuss nomenclature than the actual problems.

Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.

Correct, this would never have happened with a D3D application. Sadly you lack the comprehension.

As you undoubtedly recall, I ignored the first part. Try to find out why.

Because you are either unable to understand the problem with the way OpenGL works, or because you are an OpenGL zealot and want to defend OpenGL any way possible, for emotional reasons only.
My bet is the second, since you are using all kinds of emotional arguments now.
 
Then why did you suggest looking at changelogs in the first place? What is a good measurement then? Your subjective experience?

I don't recall suggesting anything like that. Bugreports perhaps, but not changelogs.

From those benchmarks you could equally claim NVidia has poor D3D performance, and OpenGL performance is as it should be. Just using a different reference point than you did.

This would be an unlikely claim, since NVIDIA is at the top of the performance scale in both D3D and OpenGL. So if we were to assume that NVIDIA's D3D performance is poor, we would have to conclude that NVIDIA's hardware is significantly faster than that of the competition. This is highly unlikely.
Also, if we compare the relative workloads and framerates of the different applications, we will see that NVIDIA has the closest match between OpenGL and D3D, while others have a significant gap between OpenGL and D3D.
We can draw a conclusion either way:
- In general, Direct3D is significantly faster than OpenGL, and NVIDIA is the exception.
- Theoretically, OpenGL and Direct3D are capable of the same performance levels, but only NVIDIA manages this in practice. The rest is somehow unable to develop drivers with the same performance level in OpenGL.

Either way, OpenGL loses on non-NVIDIA hardware. I think the first conclusion is nonsense, I'd pick the second one.

That's about as much effort as is necessary in OpenGL. Just a window message handler that restores the desktop settings in one case, and restores the game state in another. You need those functions in your engine anyway (just like in D3D you need the code to create the unmanaged resources regardless of task switching), so the additional work is close to zero.

The difference being that in D3D, the restore is not left to the application itself. If the application is closed down ungracefully, the desktop settings are still restored. With OpenGL the application just hacks the global desktop settings itself, and they are not restored. And we all know how lovely it is to get stuck in a 640x480 desktop with the gamma turned up.
Either you still don't understand the problem, or you don't want to see it.

For both D3D and OpenGL, you can find some games where you can't task switch, and you can find some where you can. This is definitely not an issue of the API.

That's like saying "we can find bugs in both mail clients and webservers, so it is not a mail client issue".
Perhaps you don't understand the problem under the surface (and neither does zeckensack; are OpenGL advocates somehow less knowledgeable?), even after I explained it in detail multiple times. But these are entirely different problems in D3D and OpenGL, and as I explained before, they have entirely different solutions.
They just appear the same ("alt-tab doesn't work properly").
 
Scali said:
Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.

Correct, this would never have happened with a D3D application. Sadly you lack the comprehension.
:oops: :oops:
This is sad...

Scali said:
I don't recall suggesting anything like that. Bugreports perhaps, but not changelogs.
This too.

Also, if we compare the relative workloads and framerates of the different applications, we will see that NVIDIA has the closest match between OpenGL and D3D, while others have a significant gap between OpenGL and D3D.
I'd like to know the criteria you use in this comparison.

The difference being that in D3D, the restore is not left to the application itself. If the application is closed down ungracefully, the desktop settings are still restored. With OpenGL the application just hacks the global desktop settings itself, and they are not restored. And we all know how lovely it is to get stuck in a 640x480 desktop with the gamma turned up.
Either you still don't understand the problem, or you don't want to see it.
Either you can't read my answers, or you don't want to. Either way, I see little use in continuing this discussion.
 
three things to scali:

first: with windows integration, he meant integration with the OS, not a particular window. global functions count as well, as they are part of the windows sdk, i.e. functions to interoperate with your windows system.

second: you're dump

third: you're plain wrong with a lot of your statements, mixing up tons of stuff. better get your facts straight, or keep quiet. for your own good.



ps.: even though i wasn't advanced in gamedev at all, in the first weeks i worked with opengl i NEVER had ANY problem getting opengl to alt-tab. no matter which hw and driver combination. never.
 
I list a few things below which people might like to consider:

- How large is each API?
- How clean is each API (in terms of side-effects, interactions, etc.)? How many paths are there through the driver?
- How is the API specified and enhanced?
- How well integrated are the operating system and the driver?
- How much code is reused between drivers from different vendors?
- How many applications use each? (Consider the interactions of this with the number of paths through the driver).
- How many different platforms must the driver run on?
 
Scali said:
OpenGL doesn't give a fuck about and has nothing to do with ALT+TAB. What happens on ALT+TAB is determined by the message pump.

Thank you for proving that you don't understand anything about the problem.
What problem? OpenGL just isn't aware of nor concerned about ALT+TAB, window resizes, or whatever. It's the application's responsibility. The application can tell OpenGL that the viewport size has changed in reaction to a window resize; it can throw away the GL context in reaction to the display window being minimized. And it indeed can do so because the OS informs the application about these events, even before they happen, via a mechanism commonly referred to as a "message queue". Got it?
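For illustration, the resize case boils down to something like this. OnSize is a hypothetical helper, and a current GL context is assumed:

    #include <windows.h>
    #include <GL/gl.h>

    // Sketch: the application reacts to a resize event from the message queue
    // and passes the new client size on to OpenGL.
    void OnSize(LPARAM lParam)
    {
        int width  = LOWORD(lParam);   // new client width, as delivered with WM_SIZE
        int height = HIWORD(lParam);   // new client height
        if (height == 0)
            height = 1;                // avoid a degenerate viewport
        glViewport(0, 0, width, height);
    }

    // ... in the window procedure:
    //   case WM_SIZE: OnSize(lParam); return 0;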

Scali said:
Okay, let me explain this again, since you don't understand (and have the arrogance to claim I didn't get it?! People should be shot for that, really).
Global means that it applies to the system as a whole, not just to a particular window. The problem here is that if the window is somehow lost (application crash, for example), the OS doesn't bother restoring the resolution, since it a) doesn't know that there is a resolution that needs to be restored, and b) doesn't know which resolution it would have to restore to.
True for Win9x, wrong for Win2k. Win2k will restore the desktop resolution if an application bombs out of a fullscreen scenario.
a) the OS is aware that ChangeDisplaySettings has been called
b) the current desktop resolution is stored in the registry
This behaviour merely requires using the proper flag in ChangeDisplaySettings (CDS_FULLSCREEN). I'd expect WinXP to do the same.
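For reference, a sketch of what such a temporary, OS-trackable mode switch might look like. EnterGameMode/LeaveGameMode are hypothetical helper names, not part of any API:

    #include <windows.h>

    // Sketch: a temporary mode switch the OS can undo by itself.
    void EnterGameMode(int width, int height, int bpp)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize       = sizeof(dm);
        dm.dmPelsWidth  = width;
        dm.dmPelsHeight = height;
        dm.dmBitsPerPel = bpp;
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

        // CDS_FULLSCREEN marks the change as temporary: it is not written to the
        // registry, so the original desktop mode remains on record.
        ChangeDisplaySettings(&dm, CDS_FULLSCREEN);
    }

    void LeaveGameMode(void)
    {
        ChangeDisplaySettings(NULL, 0);   // revert to the registry/desktop mode
    }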

And "global" or not, well, whatever. If you had designed the Win32 API you'd probably have provided a mechanism to switch resolutions for multiple windows on the same screen separately and simultaneously. It would sure be fun to see that.

And please look up SetDeviceGammaRamp. It takes an HDC, so it's technically per-window, not "global". The problem appears to be that common hardware only has a single global gamma LUT per RAMDAC. Per-window gamma could be supported by the Win32 API, but it won't be seen on common hardware.
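A rough sketch of how an application might save and restore the ramp through a window's DC. SaveGamma/RestoreGamma and hWnd are hypothetical names, assumed to belong to the application:

    #include <windows.h>

    // Sketch: saving and restoring the gamma ramp through a window's DC.
    WORD savedRamp[3][256];               // one 256-entry LUT each for R, G and B

    void SaveGamma(HWND hWnd)
    {
        HDC hdc = GetDC(hWnd);
        GetDeviceGammaRamp(hdc, savedRamp);   // remember the current ramp
        ReleaseDC(hWnd, hdc);
    }

    void RestoreGamma(HWND hWnd)
    {
        HDC hdc = GetDC(hWnd);
        SetDeviceGammaRamp(hdc, savedRamp);   // put things back, e.g. on exit
        ReleaseDC(hWnd, hdc);
    }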
Scali said:
Only the application can handle that, and if the application is closed ungracefully, it doesn't handle it. Obvious problems, since your desktop will be screwed: it will still have the resolution and gamma settings that the OpenGL window used.
Res has been handled, and as for gamma let me just say that it's the same thing as for any "normal" Windows application. OpenGL applications on Windows are just Windows applications. They don't come with their own execution environment.
Scali said:
No, this is all completely new and intriguing to me k'thx.
That is painfully obvious, yes.
It's not my fault if you just invent your facts out of thin air. Of course I am surprised by your creativity, especially in its current form.
Scali said:
Why would the driver have to handle that? The application has a much better idea of what needs to be handled. It would also make the work for the driver easier, which was my point.
No, applications don't have a better idea than the driver about when a modeswitch or any other external event might cause reallocation of video memory resources. It's the driver that performs these reallocations in the first place. Got it?

The explicit possibility that a device might be lost at any time is just a Direct3D design choice, carried over from old ages I dare not speak of.
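For reference, that design choice shows up in application code roughly like this. This is a D3D9-style sketch; the device pointer, the present parameters, and the three application hooks are hypothetical names of the application's own, not fixed API:

    #include <d3d9.h>

    // Hypothetical application hooks: release/recreate D3DPOOL_DEFAULT resources, draw a frame.
    void OnLostDevice(void);
    void OnResetDevice(void);
    void RenderFrame(IDirect3DDevice9* device);

    // Sketch of the "lost device" handling, run once per frame.
    void RunFrame(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS* presentParams)
    {
        HRESULT hr = device->TestCooperativeLevel();
        if (hr == D3DERR_DEVICELOST)
        {
            Sleep(50);                        // device gone (e.g. after alt-tab); wait
        }
        else if (hr == D3DERR_DEVICENOTRESET)
        {
            OnLostDevice();                   // release D3DPOOL_DEFAULT resources
            device->Reset(presentParams);     // take the device back
            OnResetDevice();                  // recreate what was released
        }
        else
        {
            RenderFrame(device);              // all clear, draw as usual
        }
    }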

Scali said:
Whatever you call it, it works, and OpenGL doesn't.
How does OpenGL not work?
Scali said:
But of course you'd rather discuss nomenclature than the actual problems.
What problems? Gimme!
And your nomenclature comment ... you were the one who said that OpenGL does not integrate with Windows and that Direct3D does. If you wouldn't spew around such nonsense, you wouldn't have to endure my "nomenclature". Maybe you could come forward to explain what the hell that was supposed to mean, instead of cranking up the diversion. "Glideism" was supposed to mean that Windows GDI is turned off completely during fullscreen rendering, which is, in my book, the rough opposite of integration. Sounds better?
Scali said:
Btw, the Thief 3 demo crashed. After the incident, my desktop refresh rate was somehow at 60 Hz ... you really should blame OpenGL for that, too, while you're at it.
Correct, this would never have happened with a D3D application. Sadly you lack the comprehension.
Thief 3 is purely a DirectX Graphics game. It's based on some version of the Unreal engine, but it ships without the multi-API support stuff. Comprehend that.
Scali said:
As you undoubtedly recall, I ignored the first part. Try to find out why.
Because you are either unable to understand the problem with the way OpenGL works, or because you are an OpenGL zealot and want to defend OpenGL any way possible, for emotional reasons only.
My bet is the second, since you are using all kinds of emotional arguments now.
I ignored the first part because it was just so strikingly stupid that it could only have been something you quickly made up to have, well, something. I figure if you had just reread that one sentence maybe a couple of times and taken roughly twenty seconds to think, you would have seen for yourself that it's just not worthy of writing, let alone commenting on.

Quick rehash:
The OS and/or gfx driver (whatever) does not know when a modeswitch occurs. The OS and/or gfx driver (whatever) does not know when a window covers the whole screen.
If that's not obviously wrong to you, there is no hope left. I will not explain what's wrong though. You will never learn if you don't try for yourself.

Your psychological analysis of me is both flattering and unsurprising. If I may try my own humble analytic skills on you, I'd say that you already decided which conclusion you wanted to reach ("OpenGL sucks in all respects" or similar) before you went looking for supporting arguments. And I also sense an imbalance of energy spent between these two tasks: most of it went into the first one.
 
Judging from all the silly reactions, I have hit a sore spot regarding OpenGL :p

Go ahead and call me names, and pretend you don't know what I mean (or are you not pretending, and are you really that thick?).
Calling me names isn't going to change the OpenGL issues.

I have nothing more to add, because I have already given all the answers before, they just haven't been read, or understood.

As for Davepermen, if you want to call someone dumb, at least spell it right :)
 
It's silly actions that cause silly reactions.

I'm glad non-existent issues won't change... ;)
 
Xmas said:
It's silly actions that cause silly reactions.

I'm glad non-existent issues won't change... ;)

Just because I'm outnumbered by underqualified OpenGL-zealots doesn't mean I wasn't right.
 
Here's a name for you: Troll.

There are plenty of OpenGL games that handle alt-tab correctly, and a handful that don't.
There are plenty of DirectX games that handle alt-tab correctly, and a handful that don't.
It doesn't require some s00per-SMRT leet coder to do it properly under either API.

ATI themselves have admitted that their OpenGL performance isn't up to scratch, and that they're working on it.

If a game failing to restore the gamma when it crashes is the biggest problem with OpenGL you can find, then I say just lock this thread right now and cast it into the deepest pit of internet hell the mods can find, because there can be no value in allowing it to live any longer.
 
Since I'm new to the thread, I thought I'd interject my opinions :)
Scali said:
Judging from all the silly reactions, I have hit a sore spot regarding OpenGL :p
That, or based on all the silly drivel you've spewed in this thread, the only sore spot is the hole you've dug for yourself.
Go ahead and call me names, and pretend you don't know what I mean (or are you not pretending, and are you really that thick?).
Calling me names isn't going to change the OpenGL issues.
Calling you names isn't going to change the DirectX issues, nor is it going to create any OpenGL issues that don't exist. You're grasping at straws, making hugely generalized logical fallacies to substantiate whatever vendetta it is you have against OpenGL. In short, you really suck at defending your point of view, to the point of making yourself look completely uneducated and borderline absurd.
I have nothing more to add, because I have already given all the answers before, they just haven't been read, or understood.
No, I think you have nothing more to add because you have no retort for any of the logical data that was presented to you, NOR did you ever provide any sensible response. If I respond to you this way: "well the deflamulator izbot has gorked your nyewt-fran", I should have no logical expectation that you will give me any sort of answer. That's the gibberish you're spewing, and the reason that you're not getting any feedback.

I'm not a super duper programmer, but I know enough to Google search developer forums on how things work. I spent the last four hours just perusing through a few articles and code snippets and whatnot long enough to realize that you haven't the first fucking clue what's going on.

I already knew the ALT-TAB bit, and so should anyone who has two braincells worth of common sense. You're lambasting an entire API based on what a few companies choose to optimize for. The arguments you use against OGL I could very easily apply straight to DirectX. Problems with gamma, refresh rate, ALT-TABbing, botched resolution changes, bad driver calls, poor performance, you name it.
 
Well now that we've vented that for the last 3 pages, and are hopefully feeling more adjusted today - could we possibly escape back to the topic?

Or should I just re-name this thread to the discussion you spawn and re-start my topic under a new thread?

I have a piece of advice - next time you want to hijack one of my threads, don't.
 
My apologies for not offering an on-topic response to your thread :oops: :(

It would be my general opinion that most "top end" cards are purchased by the enthusiast market; these are the same folks that will have similar "top end" processors and other supporting equipment. So it stands to reason that, when the next generation of hardware becomes available, the vast majority of people who purchase it will also have the next "generation" of processor.

I say generation in quotation marks because, frankly, I'm not sure we're headed for any sort of generational leap from processors anytime soon. AMD's move to 64 bit on the desktop was a good decision, but 64 bit processing has been around far too long to call it a generational change. Just like RAID has been around for years, but is only now surfacing in IDE flavors on desktop boards. Good move, but not a generational change...

IMO of course ;)
 