Derek Smart on ATI Driver issues

Hellbinder[CE] said:
He is not talking about game workarounds in the driver code. He is talking about game developers who have worked around issues, meaning that if the real driver issue were addressed... then the game code would have to be patched to reflect the driver changes, else it could become corrupt.
I'm glad this topic was brought up. Developers should NOT be working around driver issues for different IHVs. Now that I've got that off my chest... do you know how hard it is to debug an application that behaves differently for different video cards? Or what if IHV X has incorrectly implemented feature Y, and application Z takes note of this and uses the incorrect implementation? Now IHV W is SOL because the application relies on the wrong implementation of the feature.

This is why IHVs have devrel people: Report bugs to them so they can be fixed! :devilish:

It's bad enough that drivers need to be hacked for applications that do silly things (like crashing if you report too many texture formats... this is a famous one), but having to put bugs into the driver so the behavior matches another IHV's driver is ridiculous.
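
The "too many texture formats" crash typically boils down to a fixed-size array and an unchecked enumeration callback. A hypothetical sketch (the names and the DX7-style callback shape are invented for illustration, not taken from any real title):

Code:
// Hypothetical sketch of the classic "too many texture formats" crash:
// the game sizes its array for the formats it saw during development
// and never bounds-checks the callback the driver invokes.
struct PixelFormat { int bitCount; };

const int MAX_FORMATS = 8;           // "no card reports more than 8, right?"
PixelFormat g_formats[MAX_FORMATS];
int g_formatCount = 0;

// Called once per texture format the driver reports.
void OnTextureFormat(const PixelFormat& fmt)
{
    // BUG: no bounds check; a driver that reports more than MAX_FORMATS
    // formats writes past the end of the array and crashes the game.
    g_formats[g_formatCount++] = fmt;
}
// The one-line fix: guard with if (g_formatCount < MAX_FORMATS).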
 
Jeff Royle from ATI Dev Relations posted this on Rage3D a year or so ago:

There are a large number of games that were developed on non-ATI boards, which means any driver bugs encountered may have been worked around in code. If the game is not tested on ATI boards before release and these bugs are never found, the game goes gold and ships that way. When a bug is eventually found and determined to be a game bug, we contact the developers of the game and let them know. We can then request a patch if they are willing, and even offer advice on how to fix it. ATI will not knowingly break a driver to make a game work.

In rare cases, developers will not create a patch, and then we can only take note of the title and try to remember the bug for future reference. The state of the development community seems to be shifting for the better these days, and many bugs are hammered out well in advance of shipping, some later on. We do our best to get all titles tested and all bugs found.

The biggest problem we encounter is that end users don't always realize it's not a driver issue causing the problems. When the game is written as above, on different graphics hardware, and bugs are just accepted and worked around in code, then it's hard for us to say "But the problem is in the game," because end users see that it works for other people on different graphics hardware. ATI already has a bad rap for drivers, and yet we won't intentionally leave a bug in a driver. Competitors occasionally will leave a known bug in the driver, maybe because they are afraid of what everyone will think when they actually fix it.

Recently I've seen a trend where developers let us know of bugs in competitors' drivers, which acts as a heads-up for us. Then when other developers come across a problem, we can offer the advice that it may not be a bug on our side.
 
OpenGL guy said:
I'm glad this topic was brought up. Developers should NOT be working around driver issues for different IHVs.

The way I see it, as Derek Smart so abrasively put it, sometimes such workarounds are necessary to get your game to work on video card X. Of course, at the very least, the program should most certainly check that it's a video card from the manufacturer who has the bug... that is, these workarounds should only affect the hardware that has the bug.
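
A minimal sketch of that kind of guard, assuming a DX8 application; the helper and its buggyVendorId parameter are invented names, though the PCI vendor IDs are the standard ones:

Code:
#include <windows.h>
#include <d3d8.h>

// Well-known PCI vendor IDs.
const DWORD VENDOR_ATI    = 0x1002;
const DWORD VENDOR_NVIDIA = 0x10DE;

// Enable a workaround only when the default adapter comes from the vendor
// whose driver actually has the bug, so other hardware is unaffected.
bool ShouldEnableWorkaround(IDirect3D8* d3d, DWORD buggyVendorId)
{
    D3DADAPTER_IDENTIFIER8 ident;
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        return false; // can't identify the card, so leave the workaround off

    return ident.VendorId == buggyVendorId;
}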

This is why IHVs have devrel people: Report bugs to them so they can be fixed! :devilish:

Well, at least from Derek Smart's perspective, he has been reporting bugs, then working around them when no response was forthcoming.
 
OpenGL guy said:
RussSchultz said:
I think it has something to do with failing WHQL if you have overclocking options on your property tabs, though I might be completely off base there.
To me, having an overclocking utility built into the driver is a big Pandora's box... First, some people will break their cards from overclocking. Second, overclocking can reduce the lifespan of the product. Third, some people might be tempted to overclock the card in the driver in order to increase benchmark results. Fourth, there are many other issues that seem more important.

The average user is not an overclocker, so is it really a worthwhile investment to add this feature to the driver? Don't forget that there are third party overclocking utilities that work just fine.

If there were no third party applications to fill this niche, then I would be more supportive of it.

P.S. This is all my own opinion. I don't speak for ATi (and never do).

This is the general feeling I get from all my dealings with ATI regarding my work on Rage3D Tweak. They won't put OC'ing in the drivers for the reasons you listed and others, but they are happy that we provide that capability to their users... they'll even answer some questions now and then if we bug them enough. :)
 
honestly though...

If they conceal the overclocking panel altogether, but pop it up when some obscure registry key is detected, such as "MrMcGoo," then I don't see the problem.
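
For illustration, a minimal sketch of gating a panel on such a value, borrowing the "MrMcGoo" name from the joke above; the key path is invented:

Code:
#include <windows.h>

// Show the overclocking panel only if an obscure registry value exists.
// The key path here is made up for the sketch; only the idea matters.
bool OverclockPanelUnlocked()
{
    HKEY key;
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                      "SOFTWARE\\SomeIHV\\Display\\Tweaks",
                      0, KEY_READ, &key) != ERROR_SUCCESS)
        return false;

    DWORD value = 0, size = sizeof(value), type = 0;
    LONG rc = RegQueryValueExA(key, "MrMcGoo", NULL, &type,
                               (LPBYTE)&value, &size);
    RegCloseKey(key);

    // The panel appears only when the value is present and nonzero.
    return rc == ERROR_SUCCESS && type == REG_DWORD && value != 0;
}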

Heck...I know it's somewhat unfashionable to do a 1-to-1 comparison between nVidia and ATI...However, nVidia even now provides an overclocking panel with their reference drivers...

This may seem like a small thing... But when you read between the lines, I think it's somewhat obvious (at least to me) that the reason it's in there is to dispel the notion that nVidia's standard reference drivers are/were inferior to ATI's, and that ATI provided a heck of a lot more control for their products...

In essence, it's my opinion that nVidia users can directly thank ATI for providing the incentive to have such a control...

Likewise, I wouldn't be all too surprised to see ATI do the same.

Is it a coincidence that this last nVidia driver finally provided a control for D3D Anisotropic Filtering? I mean, how long has that feature been available now? Does anybody honestly think they hadn't read some of the comparisons where reviewers stated, "We refuse to use a 3rd-party tweaker or registry hack to enable this feature... therefore, we're going to show you how good/bad nVidia's products look compared to the Radeon's when their A.F. selection is enabled."
 
Good point, Type, but I think in all likelihood ATi won't provide an overclocking "tab." Why?

Well, clearly there are samples of the R9700 out there that overclock really well, but there's a good possibility that a good deal of boards do not. We know for a fact the chip is running pretty hot, and 325 MHz is a generous clock speed... if a user were to read that Anand got his R9700 to 400 MHz, then tried that on his own and completely melted his whole board/AGP slot... it could be really bad for ATi. LOL
 
It's bad enough that drivers need to be hacked for applications that do silly things (like crashing if you report too many texture formats... this is a famous one), but having to put bugs into the driver so the behavior matches another IHV's driver is ridiculous.

Ouch... I wonder who that could be talking about?? *cough*nvidia*cough*... Like that irritating issue when Tribes 2 first came out, where the only cards that correctly rendered the way fog covered buildings in the distance (very layman's terms, I know) were Nvidia's. Everyone else's cards had *bugs* that made the buildings blink and act all quirky as they got covered by the distance fogging. It turns out it was because of some quirky way that only Nvidia's OpenGL code handled the situation... and they coded *specifically* for that only..... :rolleyes:

I see what you mean.
 
Chalnoth said:
The way I see it, as Derek Smart so abrasively put it, sometimes such workarounds are necessary to get your game to work on video card X. Of course, at the very least, the program should most certainly check that it's a video card from the manufacturer who has the bug... that is, these workarounds should only affect the hardware that has the bug.
A big problem is lack of testing on the part of the ISVs. It used to be that everyone tested their stuff on 3dfx and that was that. That is how many of the "if texture format X isn't in the right slot, we'll get corruption" and "if too many texture formats are reported, we'll crash" bugs popped up. Now things are more subtle. Like, "let's use the last 32-bit format, whatever it is" bugs. DX8 changes this because you have to request a format and DX8 tells you if it is available, so at least we can avoid these simple problems :-?

Now, 3dfx is gone, but many ISVs are limiting their testing to another IHV's products, so we get more of the same. If a driver bug is encountered, ISVs assume the fault is theirs or just work around it to make things work. History has this way of repeating itself.
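
For reference, a minimal sketch of the DX8-style check mentioned above: the app asks up front whether a specific format is usable instead of guessing from an enumerated list. The helper name is invented; CheckDeviceFormat is the actual DX8 call:

Code:
#include <windows.h>
#include <d3d8.h>

// Ask DX8 whether a texture format is available on the default adapter,
// rather than picking "the last 32-bit format, whatever it is".
bool TextureFormatAvailable(IDirect3D8* d3d, D3DFORMAT displayFormat,
                            D3DFORMAT textureFormat)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                            D3DDEVTYPE_HAL,
                                            displayFormat,
                                            0,                 // Usage
                                            D3DRTYPE_TEXTURE,
                                            textureFormat));
}

// e.g. try D3DFMT_A8R8G8B8 first, then fall back to D3DFMT_A4R4G4B4.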
 
multigl2 said:
Well, clearly there are samples of the R9700 out there that overclock really well, but there's a good possibility that a good deal of boards do not. We know for a fact the chip is running pretty hot, and 325 MHz is a generous clock speed... if a user were to read that Anand got his R9700 to 400 MHz, then tried that on his own and completely melted his whole board/AGP slot... it could be really bad for ATi. LOL

Alternatively they want to hold something back for when they do the DDR-II versions of the board...
 
OpenGL guy said:
Now things are more subtle. Like, "let's use the last 32-bit format, whatever it is" bugs. DX8 changes this because you have to request a format and DX8 tells you if it is available, so at least we can avoid these simple problems :-?

I remember that one as well :)

K-
 
Hmm, in his rant DS is very definite that the ATI drivers regularly break basic behaviour like fog. Or is it a case of him thinking ATI drivers break it because he is used to coding in a certain way?

DS is a D3D programmer, right? And JC seems happy with ATI right now, but that would be OGL, and all the bugs listed for the 9700 seem to be in D3D games (maybe because there are more of them, I suppose). What can be read into that?
 
Typedef Enum said:
OpenGL guy...

I agree with you... and this is why ATI should implement a "CoolBits"-like feature... Don't enable it by default, but allow the "power" and "enthusiast" crowd an opportunity to enable it. In doing so, you won't necessarily have to worry about providing the feature to the avg. Joe, as they're not very likely to explore such an option...

The "power" and "enthusiast" crowd can already download Rage3D Tweak or Powerstrip, the first of which is free. We don't need ATi to add an overclocking tab to their drivers, and it's probably better if they don't (although you have to be pretty stupid to be able to kill your card via OC'ing...I mean you really have to push the limit and ignore the fact the card keeps locking and/or severe graphical glitches).

So, anyway, if I were them I just wouldn't bother. Leave it to the enthusiasts to take care of themselves.
 
multigl2 said:
Good point, Type, but I think in all likelihood ATi won't provide an overclocking "tab." Why?

Well, clearly there are samples of the R9700 out there that overclock really well, but there's a good possibility that a good deal of boards do not. We know for a fact the chip is running pretty hot, and 325 MHz is a generous clock speed... if a user were to read that Anand got his R9700 to 400 MHz, then tried that on his own and completely melted his whole board/AGP slot... it could be really bad for ATi. LOL

I disagree completely...if some cards can overclock to 400 MHz (and it seems to be more than one) then it's not true at all that 325 MHz is a generous speed. In fact I guarantee 325 MHz is conservative so that ATi can maximize yields while still getting stellar performance.

A 400 MHz R9700 would kill a Ti4600 all the more, but at what cost in yields? There's just no real point for them to push the envelope and sacrifice chips when Nvidia has nothing that really competes with them (how times change, LOL).
 
Randell said:
Hmm, in his rant DS is very definite that the ATI drivers regularly break basic behaviour like fog. Or is it a case of him thinking ATI drivers break it because he is used to coding in a certain way?

can't speak for the r200 / r300 drivers, but the r100 drivers (presumably) used to have problems with fog on a regular basis. i have recollections of having to come up with a special r100 path where vertex fog was used instead of the exponential fog used in the common path, as per-pixel fog behaved differently with each new r100 driver release (but overall incorrectly). eventually the title ended up looking a bit different on the r100 in the fog aspect, i.e. on r100 the fog was linear and on the rest of the cards exponential. admittedly, on some of the other cards vertex fog was broken, which, aside from the artistic reasons, was the main (technical) reason to prefer exponential fog in the common path.

at that time, i, too, asked myself how such basic features could be so broken (and not only r100's pixel fog, but other vendors' vertex fog too)
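
A sketch of the kind of split fog path darkblu describes, written against DX8-style render states (the title was presumably DX7-era, but the states are analogous); the fallback flag and the constants are placeholders:

Code:
#include <windows.h>
#include <d3d8.h>

// Common path: exponential table (per-pixel) fog.
// Fallback path: linear vertex fog for hardware where table fog misbehaves.
void SetupFog(IDirect3DDevice8* dev, bool useVertexFogFallback)
{
    dev->SetRenderState(D3DRS_FOGENABLE, TRUE);
    dev->SetRenderState(D3DRS_FOGCOLOR, 0x00808080);

    if (useVertexFogFallback) {
        float start = 100.0f, end = 1000.0f;   // placeholder distances
        dev->SetRenderState(D3DRS_FOGTABLEMODE,  D3DFOG_NONE);
        dev->SetRenderState(D3DRS_FOGVERTEXMODE, D3DFOG_LINEAR);
        dev->SetRenderState(D3DRS_FOGSTART, *(DWORD*)&start);
        dev->SetRenderState(D3DRS_FOGEND,   *(DWORD*)&end);
    } else {
        float density = 0.002f;                // placeholder density
        dev->SetRenderState(D3DRS_FOGVERTEXMODE, D3DFOG_NONE);
        dev->SetRenderState(D3DRS_FOGTABLEMODE,  D3DFOG_EXP);
        dev->SetRenderState(D3DRS_FOGDENSITY, *(DWORD*)&density);
    }
}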
 
I disagree completely...if some cards can overclock to 400 MHz (and it seems to be more than one) then it's not true at all that 325 MHz is a generous speed. In fact I guarantee 325 MHz is conservative so that ATi can maximize yields while still getting stellar performance.


Aside from yields and competition, I imagine there are heat and power issues.

Going from 325 MHz to 400 MHz would represent around a 23% increase in clock speed. Wouldn't that also mean roughly a 23% increase in heat output (correct me if I'm wrong on that one)?

If that is the case, the R300's current heat output of 50 watts would jump to about 61.5 watts, and if you increase the clock speed you really should increase the memory speed too, which would further increase power demands, right?
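
As a rough sanity check, assuming dynamic power scales about linearly with clock at a fixed voltage ($P \propto f V^2$; an assumption, not a measured figure), the arithmetic holds:

$$P_{400} \approx P_{325} \cdot \frac{400}{325} = 50\,\mathrm{W} \times 1.23 \approx 61.5\,\mathrm{W}$$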

Regardless of all that stuff I'm sure ATi will figure out a way to make things faster. :)
 
darkblu said:
at that time, i, too, asked myself how such basic features could be so broken (and not only r100's pixel fog, but other vendors' vertex fog too)

Bugs in the silicon, perhaps. I take it it's acceptable to suggest there are going to be minor bugs in the silicon of products :)
 
Randell said:
Bugs in the silicon, perhaps. I take it it's acceptable to suggest there are going to be minor bugs in the silicon of products :)

sure. but i'm also positive that even if the title publisher had put up with that, the title's customer wouldn't have been pleased if i had just waved and said 'aah, what the heck - fog's out of order, the customer expects such things in the silicon & drivers anyways.' :)
 
oops...

What happened is that I uninstalled the newer Det. drivers, but didn't do a full reboot to clear out the registry...So, when I reinstalled the drivers, it preserved the registry settings.

In doing so, it gave me the impression that the overclocking panel was there by default...But you're right, you still have to provide the CoolBits registry key/value in order to see it.
 
darkblu said:
the title's customer wouldn't have been pleased if i had just waved and said 'aah, what the heck - fog's out of order, the customer expects such things in the silicon & drivers anyways.' :)

Oh no, I meant acceptable on this forum; you know how touchy people can get ;)

Of course the customer wouldn't find it 'acceptable' :)
 