Driver stability - Carmack still gives edge to NVIDIA

Really, dig? With you it's so hard to tell... :p ;)

I'll steal a post I made elsewhere to sum up my feelings on their drivers at this point:

You'll be fine with either at this point. The main thing is to know how to cure problems that DO happen to come up, since you won't be able to predict them, and it's impossible to survey enough people and track enough data for an "overall view" of drivers across the broad swath of gaming you're likely to do. I've had issues go away after reinstalling the game, reinstalling Windows, updating my sound card drivers, updating my VIA 4-in-1s, adjusting a BIOS option, checking Option X, unchecking Option Y... And usually, any serious issues will happen an extreme minority of the time. Not really worth the worry. If you spend 90% of your time on a few games and want to REALLY know which cards handle them best, you can do a decent investigation, but predicting for all your gaming habits for the next few years just ain't gonna happen.
 
PaulS said:
I have absolutely no experience with ATi drivers, so this is really just an impression I've gotten from the various forums around, but it seems to me that the ATi drivers are finicky and very susceptible to hardware conflicts. But if you've got a system which avoids the particular troublesome hardware combinations, they're great - and probably better than their nVidia counterparts.

Again, this is all an impression. I may well be completely wrong.

That's the old ATI reputation. A lot of pro-Nvidia people still cling to it. Since the Catalyst program, ATI's drivers have come on in leaps and bounds, and are every bit as good as Nvidia's. In fact, at the moment, going by what developers are saying, ATI's drivers are much better. ATI's driver team is spending its time making better drivers, not concentrating on cheats to compensate for poor hardware like the Nvidia driver team is doing.

I've been using the Cats on a 9700 Pro for the last year, and I've had no problems or compatibility issues. They've been really, really good, much better than I was expecting, and they've only got better over time.
 
Bouncing Zabaglione Bros. said:
That's the old ATI reputation. A lot of pro-Nvidia people still cling to it. Since the Catalyst program, ATI's drivers have come on in leaps and bounds, and are every bit as good as Nvidia's. In fact, at the moment, going by what developers are saying, ATI's drivers are much better.

Maybe on the Direct3D side, but what about OpenGL? Carmack certainly isn't saying that ATI's drivers are not only as good as NVIDIA's but actually better...
 
StealthHawk said:
Maybe on the Direct3D side, but what about OpenGL? Carmack certainly isn't saying that ATI's drivers are not only as good as NVIDIA's but actually better...
Read the first 15 posts in this thread.
 
StealthHawk said:
Maybe on the Direct3D side, but what about OpenGL? Carmack certainly isn't saying that ATI's drivers are not only as good as NVIDIA's but actually better...

From *Carmack's* point of view as a developer (which is always biased towards whatever features he happens to want from his current engine), he still gives Nvidia's OpenGL drivers "the edge".

This is a *marked* difference from how things were a year or two ago. And ATI have just released a new OpenGL driver in their 3.10 hotfix drivers.

Even if you disregard the massive differences in hardware performance, and only consider the drivers (a bit of a silly thing to do really), once you look at OpenGL & DirectX drivers from both companies, you can't say that ATI's are worse - they are at least as good, and many people think they are better.

This "ATI's drivers are crap" is just an old rep that's been following ATI around. The Nvidia PR department cling desperately to this at same time as they are bastardising their own drivers to make up for their poor hardware.
 
Bouncing Zabaglione Bros. said:
Even if you disregard the massive differences in hardware performance, and only consider the drivers (a bit of a silly thing to do really), once you look at OpenGL & DirectX drivers from both companies, you can't say that ATI's are worse - they are at least as good, and many people think they are better

I am not saying whether I think ATI's drivers are better than, worse than, or at the same level as NVIDIA's. I am not in a position where I can (from a developer's view) or need to (from an end user's view) make such assessments. Both ATI's and NVIDIA's drivers are perfectly fine for me on my systems. I have had an equally good experience with both IHVs' drivers. But that hardly means I can extend my experience to be relevant to other people. The whole problem of induction.

ATI's drivers are in the same ballpark as NVIDIA's, and I think I've said as much in this thread. Some people will say they're better. Some people will say they're worse. Again, I reiterate that my personal belief on exactly where the drivers stand in relation to each other is pretty irrelevant, since I can only speak from my own experience, which doesn't necessarily represent anyone else's experience.

And I still don't see any evidence to support your claim that ATI has surpassed NVIDIA in OGL. So you say Carmack is biased. Fine, fair enough. How about offering some quotes from other OGL programmers who aren't biased, then? Obviously NVIDIA's drivers aren't/weren't as good on the D3D side as ATI's. I am not disputing that. That said, I really don't know how much the 52.xx drivers have improved with regard to DX9. I don't think I've heard any complaints about driver stability since they came out - which doesn't mean such problems don't exist, just that I haven't heard about them like I did with the old drivers. Again, clearly ATI's DX9 drivers support and expose more features than NVIDIA's DX9 drivers. But I was talking about OpenGL, not Direct3D.
 
Obviously NVIDIA's drivers aren't/weren't as good on the D3D side as ATI's. I am not disputing that.

I am.

Speaking from a graphics- and performance-obsessed person's view:

I've seen so many games that run and look much better under D3D than under OpenGL on my former NV20.

SS:SE is one game that runs better under D3D than under GL. I believe I made a post at nVnews about this; the difference on average was around 10fps. I forget which drivers I used, but they were the most recent drivers at the time.

Maybe SH could dig the thread up? Maybe it exists somewhere in the database.
 
How about offering some quotes from other OGL programmers who aren't biased then?
You mean someone else uses GL besides Carmack? ;)

And I'll respond to WaltC's giant post sometime. But to sum it up before I even write it: those were just my experiences. Is it the video card's fault? Surely, because (a) it worked with a 5200 and a Ti4600, and (b) it now works with a 5700 Ultra. That kinda narrows it down. But I'm absolutely certain it has a lot to do with system configuration; for example, I'm using the WinXP wireless stuff, and that will probably cause a bit of funkiness now and then.
 
StealthHawk said:
ATI's drivers are in the same ballpark as NVIDIA's, and I think I've said as much in this thread. Some people will say they're better. Some people will say they're worse. Again, I reiterate that my personal belief on exactly where the drivers stand in relation to each other is pretty irrelevant, since I can only speak from my own experience, which doesn't necessarily represent anyone else's experience.

The point I am trying to make here is that we have a thread with a broad sweeping title like "Driver stability - Carmack still gives edge to NVIDIA", without an explanation or context. Without the context or understanding of where Carmack is coming from, such a title is meaningless and sensationalist.

You have to know that Carmack is talking only about OpenGL, that he is talking from a developer's point of view, and that he is always biased towards whatever he needs in his current engine (as he has been historically).

The majority of people are interested in drivers from a user point of view, will also consider the DirectX performance/stability *essential*, and will not divorce hardware performance/stability from that of the drivers. From a user point of view, both are necessary components in the graphics system.

While what Carmack says may be correct from his own point of view, it is a very refined and rarefied distinction he is making here, and yet it's being picked up and waved around without an understanding of its context, as if it were some kind of massive win for Nvidia, when it is nothing of the sort.

Carmack likes Nvidia's OGL drivers slightly better for Doom 3 - so what? Does that make up for Nvidia having lower IQ, driver cheats, poor hardware, poor DX drivers, etc.? Does it change the fact that ATI have made up a phenomenal amount of ground in the last year when it comes to improving the quality of their drivers? Does it change all the other compromises you must make as a consumer when choosing Nvidia cards over ATI cards, just because Carmack likes to program Doom 3 on Nvidia drivers at the moment?

Without an understanding of all this, "Driver stability - Carmack still gives edge to NVIDIA" as a title is misleading at best, especially when others come in to repeat second-hand the same old out-of-date information about ATI having poor drivers.
 
Something else to bear in mind is that we are still only just coming out of the era where, at many development houses, games were primarily developed on the GF4. In terms of end user compatibility, the very latest games are the ones that are going to be telling as to how far ATI have come, because a fairly substantial amount of a game's development may well have occurred on ATI hardware now.
 
It is always interesting to read these types of threads. I will gladly give ATI credit for their Cat program. A "HUGE" improvement.

However, having cards from both vendors (ATI and Nvidia), I still give Nvidia the edge. I have three ATI cards - 9800 Pro, 9500 Pro, and 9100 Pro - and two Nvidia cards - 5700 Ultra and Ti4200. When I install the same game on all the machines, I've had fewer problems with Nvidia and more issues with ATI. All are new games using DX8 and DX9.

And if truth be told, this is what irritates me the most, because it takes time to install, then troubleshoot, read the readme files and the boards - then find out it is a driver issue. The whole time sending emails via the vendor's page, only to get a response that it is your system... /boggle.

Anyway - ATI has made some huge improvements. They deserve credit. But in my house and environment I can still see a difference between the two as far as issues and bugs with newly released games go.

Not a huge deal - nor one to prevent me from buying their products. But nevertheless, something I've noticed and why I'd say Nvidia is better. "In my opinion".
 
DaveBaumann said:
Something else to bear in mind is that we are still only just coming out of the era where, at many development houses, games were primarily developed on the GF4. In terms of end user compatibility, the very latest games are the ones that are going to be telling as to how far ATI have come, because a fairly substantial amount of a game's development may well have occurred on ATI hardware now.
The argument that games often run better on GF* cards because they were developed on GF*s always comes up, but it seems moot. If I were ATI, I'd see two options here:

1. get as many cards out to as many developers as quickly as possible, and have them release ATI patches
2. application detection for bugfixes (roughly the sort of thing sketched at the end of this post)

Yeah, so the playing field's not even, but an end user doesn't care about that. He or she just wants the damn thing to run well.
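
By "application detection" I mean something like the purely hypothetical C sketch below - not any vendor's actual code, and the executable names and quirk flags are made up - where the driver recognises the running game and flips a per-game workaround:

```c
/* Hypothetical sketch of per-application workarounds in a driver.
 * The executable names and quirk flags are invented for illustration. */
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

struct app_quirks {
    bool clamp_texture_lod;    /* game assumes GeForce LOD behaviour */
    bool force_24bit_zbuffer;  /* game breaks with a 16-bit Z-buffer */
};

static void detect_app_quirks(const char *exe_name, struct app_quirks *q)
{
    memset(q, 0, sizeof(*q));  /* default: no workarounds */

    if (strcmp(exe_name, "buggygame.exe") == 0)
        q->clamp_texture_lod = true;       /* known-broken title */
    else if (strcmp(exe_name, "oldsim.exe") == 0)
        q->force_24bit_zbuffer = true;
    /* ...one entry per title that needs a fix-up... */
}

int main(void)
{
    struct app_quirks q;
    detect_app_quirks("buggygame.exe", &q);
    printf("clamp LOD: %d, force 24-bit Z: %d\n",
           q.clamp_texture_lod, q.force_24bit_zbuffer);
    return 0;
}
```

It's ugly, but from the end user's seat it doesn't matter whether the fix lives in the game patch or the driver, as long as the game runs.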
 
K.I.L.E.R said:
Obviously NVIDIA's drivers aren't/weren't as good on the D3D side as ATI's. I am not disputing that.

I am.

Speaking from a graphics- and performance-obsessed person's view:

I've seen so many games that run and look much better under D3D than under OpenGL on my former NV20.

SS:SE is one game that runs better under D3D than under GL. I believe I made a post at nVnews about this; the difference on average was around 10fps. I forget which drivers I used, but they were the most recent drivers at the time.

Maybe SH could dig the thread up? Maybe it exists somewhere on the database.

Doesn't SS:SE in OpenGL have a lot more effects and stuff to toggle than Direct3D, and thus push the hardware harder? (Hey, at least it's not as bad as 3dfx anything versus Glide, though later 3dfx crippled Glide so much that D3D caught up.)
 
The Baron said:
The argument that games often run better on GF* cards because they were developed on GF*s always comes up, but it seems moot.

Baron, if you've been reading here closely for some time, you'll know there have been comments from people at PowerVR and ATI bemoaning previous WHQL non-compliances on the boards developers used most frequently, which caused both ATI and PowerVR issues because developers expected things to be done "the other way" even when that wasn't compliant. That's one issue, and there are numerous cases where games prefer things to be done "the GeForce way".

Yeah, so the playing field's not even, but an end user doesn't care about that. He or she just wants the damn thing to run well.

And these things take time to shake out. For instance, which board(s) do you expect are going to raise the fewest issues with HL2?
 
The Baron said:
Yeah, so the playing field's not even, but an end user doesn't care about that. He or she just wants the damn thing to run well.

But whose job is it to fix the code? The driver team at ATI, or the developers who wrote faulty code around the idiosyncrasies in the Nvidia drivers (think Bioware)?

I used to work with some *NIX developers, and when IBM was their primary development platform, it had a particularly forgiving compiler, and the OS would pick things up and paper over errors by automatically zeroing undefined variables and the like. When moving to other platforms, the same code would come back and bite you in the bum, because those platforms were stricter: they forced you to write code that complied with the standards, rather than being slack and having your errors caught for you.
Soon the developers shifted their main development platform to the one with the strictest compiler (Compaq Alpha, IIRC) to ensure that code worked the way it was supposed to.
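
To make the point concrete, here's a minimal made-up C example (not the actual *NIX code in question) of the kind of bug a forgiving platform hides:

```c
#include <stdio.h>

int main(void)
{
    int total;                  /* BUG: never initialised */

    for (int i = 1; i <= 10; i++)
        total += i;             /* undefined behaviour in standard C */

    /* On a platform that happens to hand you zeroed memory this
     * "works" and prints 55 every time; a stricter platform or
     * compiler exposes the bug with garbage output or a diagnostic.
     * The portable fix is to follow the standard: int total = 0; */
    printf("%d\n", total);
    return 0;
}
```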

Developers who code to the Nvidia standards (i.e. "do it our way, not the API way") are the ones who run into errors with their code in exactly the same way as above. It is these developers who should fix the errors. I think it is just as likely that the consumer reacts with "this game is a POS" rather than "my expensive hardware is a POS".

I think the idea of Nvidia being the primary development platform is now changing as ATI consolidate their position at the top of the DX9 performance level (and have been doing so for more than a year). With the problems that developers are reporting with Nvidia drivers, it seems likely that more and more games will be developed with ATI as the primary platform.

The only problem I can see is the disturbing trend of Nvidia buying off developers to code only for Nvidia hardware, or writing code for the developers and ensuring that it will not run correctly on competing hardware - which, again, is something to lay at the developer's door, not the ATI driver team's.
 
DaveBaumann said:
The Baron said:
The argument that games often run better on GF* cards because they were developed on GF*s always comes up, but it seems moot.

Baron, if you've been reading here closely for some time, you'll know there have been comments from people at PowerVR and ATI bemoaning previous WHQL non-compliances on the boards developers used most frequently, which caused both ATI and PowerVR issues because developers expected things to be done "the other way" even when that wasn't compliant. That's one issue, and there are numerous cases where games prefer things to be done "the GeForce way".

Heh, haven't been reading very closely for a while. Only started really reading it over the past month or so, so cut me a bit of slack :)

Yeah, so the playing field's not even, but an end user doesn't care about that. He or she just wants the damn thing to run well.

And these things take time to shake out. For instance, which board(s) do you expect are going to raise the fewest issues with HL2?

I hope you didn't expect me to say NVIDIA just because I work at NVN. If someone is buying a card primarily for HL2, they should buy an ATI card. Just the way it is.

I know it will take some time to work out; all I'm saying is that it hasn't fully been worked out yet.

As for the "whose responsibility is it to fix it" question--there's no way in hell I'm taking a real position on that one. I'll just stick with "both, to a degree" and call it a day.
 
I hope you didn't expect me to say NVIDIA just because I work at NVN. If someone is buying a card primarily for HL2, they should buy an ATI card. Just the way it is.

No, but I'm just making the point that you'd certainly expect there to be fewer issues with HL2 on ATI boards than on NVIDIA's, because we know the lion's share of development has occurred on ATI's boards. That, we know, is the case with HL2 and probably a number of other titles coming along - the same could not be said previously.
 
Bouncing Zabaglione Bros. said:
Without an understanding of all this, "Driver stability - Carmack still gives edge to NVIDIA" as a title is misleading at best, especially when others come in to repeat second hand the same old out-of-date information about ATI having poor drivers.

Clearly ATI has had some very nasty issues, like the 16-bit bug, which existed for quite a long time... and didn't happen that long ago. In a purist fashion such issues may be "out of date", but the reality is that this bug was running rampant just a few months ago.

This seems like a pretty major bug to me from an end user's POV. How such a bug got into the wild and stayed there for a while is puzzling. I have never heard about anything like this happening with NVIDIA cards.
 
I don't know if there are more effects under GL than under D3D in SS:SE. Both look alike to me.

Fox5 said:
K.I.L.E.R said:
Obviously NVIDIA's drivers aren't/weren't as good on the D3D side as ATI's. I am not disputing that.

I am.

Speaking from a graphics- and performance-obsessed person's view:

I've seen so many games that run and look much better under D3D than under OpenGL on my former NV20.

SS:SE is one game that runs better under D3D than under GL. I believe I made a post at nVnews about this; the difference on average was around 10fps. I forget which drivers I used, but they were the most recent drivers at the time.

Maybe SH could dig the thread up? Maybe it exists somewhere on the database.

Doesn't SS:SE in OpenGL have a lot more effects and stuff to toggle than Direct3D, and thus push the hardware harder? (Hey, at least it's not as bad as 3dfx anything versus Glide, though later 3dfx crippled Glide so much that D3D caught up.)
 
I would be surprised if Nvidia's drivers weren't Carmack's favorite for Doom 3, given the huge amount of work and PR weight they have put behind that one game (the benchmarks around the NV35 launch, as just one example). I would take his view purely as a subjective assessment of running Doom 3 - not of OpenGL in general, and even less of D3D.
 