Driver stability - Carmack still gives edge to NVIDIA

Reverend

John's musings at CGW

Carmack said:
.... but I've found the NVIDIA drivers to be more stable. ATI has gotten a lot better, though.

He's talking about stability only, of course. Not sure what the "gap" is between NV and ATI since he didn't mention specifics, but in my experience both sets of drivers are surprisingly fuss-free, but then again I probably don't stress the hardware as much as he does!

Carmack said:
First generation hardware really isn't for consumers, it's for the developers. I'm already thinking of the next engine and when I finally put DOOM 3 to bed, I'm going to start working on an engine that will take advantage of the hardware coming out now. By the time it's out, though, nobody will still have the GeForce FX or Radeon 9000 series cards.
Stating the obvious. Some will question his first sentence (what about improved AA and filtering? What about plain ole' improved performance with AA/filtering applied?), but I think we all know what he meant. As for his next engine, I think he meant that he'll be working with, and stressing, DX9-level (DX and GL) features (in a private email to me, while he didn't actually say this, he seemed to allude that stressing DX9 is probably what should be expected), but that the current generation from both NV and ATI will run his next engine's games much like the previous generation of hardware will run DOOM 3.

Any guesses what his next engine will stress?
 
pixel shaders. most certainly.

and yes, I agree with him about driver stability. I've had one crash so far with this 5700 Ultra versus 20 or 30 with a 9600 Pro. ATI has drastically improved performance and speed, but stability is not something that comes overnight (or even within a year). in another year or so, though, they should improve significantly and will probably equal or at least come a lot closer to NVIDIA stability.

and anyway, the whole VPU Recover thing seems ludicrous; it's like a band-aid to help a severed limb. "We know it'll crash, so when it does, it should automatically restart and hopefully not bring the entire system down with it?" or am I simply brain dead after too much turkey?
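For what it's worth, that sort of recovery is just a watchdog pattern: notice the hardware has stopped responding, then reset it instead of letting the whole box go down. A rough Python sketch of the idea (all names and timings are made up for illustration, nothing like ATI's actual implementation):

```python
import time

class Watchdog:
    """Toy timeout-and-reset loop, analogous to a driver-level recovery
    feature that detects a hung VPU and resets it rather than crashing
    the system. Purely illustrative names and numbers."""

    def __init__(self, timeout=1.0):
        self.timeout = timeout
        self.last_heartbeat = time.monotonic()
        self.resets = 0

    def heartbeat(self):
        # The worker (the "VPU") calls this whenever it makes progress.
        self.last_heartbeat = time.monotonic()

    def check(self):
        # The watchdog periodically asks: has the worker stalled?
        if time.monotonic() - self.last_heartbeat > self.timeout:
            self.reset()
            return True
        return False

    def reset(self):
        # Instead of taking everything down, restart the stalled unit.
        self.resets += 1
        self.last_heartbeat = time.monotonic()

wd = Watchdog(timeout=0.05)
wd.heartbeat()
assert wd.check() is False   # worker responsive: no reset needed
time.sleep(0.1)              # simulate a hang longer than the timeout
assert wd.check() is True    # hang detected: reset instead of crash
assert wd.resets == 1
```

Whether papering over hangs like that counts as a feature or an admission of guilt is, I suppose, the whole argument.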
 
The Baron said:
or am I simply brain dead after too much turkey?
Too much turkey. Go to the feedback lounge of nVnews and sleep it off. ;)

(You don't really think it's pixel shaders in his next engine, do you? :rolleyes: :LOL: )
 
digitalwanderer said:
The Baron said:
or am I simply brain dead after too much turkey?
Too much turkey. Go to the feedback lounge of nVnews and sleep it off. ;)

(You don't really think it's pixel shaders in his next engine, do you? :rolleyes: :LOL: )
No, I bet it's moving away from using triangles to using quads. So, the uber-Doom 4 card will be NV1.
 
The Baron said:
and anyway, the whole VPU Recover thing seems ludicrous; it's like a band-aid to help a severed limb. "We know it'll crash, so when it does, it should automatically restart and hopefully not bring the entire system down with it?" or am I simply brain dead after too much turkey?
actually, the VPU recovery thing will give ATI a huge leg up on nVidia when longhorn rolls around and the OS is hardware accelerated.
 
I was about to say... I don't think anyone would argue against the fact that nVidia are still streets ahead when it comes to the quality of their OpenGL driver.
 
Hanners said:
I was about to say... I don't think anyone would argue against the fact that nVidia are still streets ahead when it comes to the quality of their OpenGL driver.
I'd be one of those to argue. "streets ahead" would be giving NVIDIA too much credit... like John says, ATI is improving, and in my case NV and ATI are almost evenly matched (if you can forgive the odd extension here and there, which really isn't about quality, though you can argue about that over and over).
 
Reverend said:
I'd be one of those to argue. "streets ahead" would be giving NVIDIA too much credit... like John says, ATI is improving, and in my case NV and ATI are almost evenly matched (if you can forgive the odd extension here and there, which really isn't about quality, though you can argue about that over and over).

OK, maybe streets ahead would be pushing it... I think I just have the Call of Duty crashing problem still burned into my brain. ;)

Still, I can only speak from an end-user's experience, but nVidia's OpenGL stability and performance always seem to have had the edge by a fair margin. That isn't to take anything away from ATi, though; they have improved a lot in that area, in line with their overall driver quality.
 
When I look at how awesome Doom 3 looks, and I think that it is based on features of the GeForce 1, I simply cannot imagine how great his next engine will look if it is based on R300 minimum specs...
 
It's really sad VS 3.0 didn't make it into the NV30, because if it had, maybe Carmack would consider it (and thus texturing in the VS) as a standard for his next engine... Oh well, I'm sure some other developers will ;)

Still, I suspect JC would include some (primitive?) VS texturing & PPP support in his next-generation engine, since by that time the next-gen hardware will have pretty mature technology and excellent speed for these features. I do agree with Baron that it'll most likely be massive Pixel Shading goodness; on cards such as the R500 and NV50, stressing only a part of the pipeline is not as dramatic as on older generations of cards, AFAIK.
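For the curious, the big win of texturing in the VS is displacement mapping: sample a height map in the vertex stage and push each vertex out along its normal. A toy CPU-side Python sketch of the idea (data and names are purely illustrative, not any real shader API):

```python
# A tiny 2x2 "height texture" the hypothetical vertex stage can sample.
heights = [
    [0.0, 0.2],
    [0.5, 1.0],
]

def sample(u, v):
    # Nearest-neighbour lookup, standing in for a VS texture fetch.
    x = min(int(u * 2), 1)
    y = min(int(v * 2), 1)
    return heights[y][x]

def displace(pos, normal, uv, scale=1.0):
    # Push the vertex along its normal by the sampled height.
    h = sample(*uv)
    return tuple(p + n * h * scale for p, n in zip(pos, normal))

# A vertex at the origin with an up-facing normal, sampling the
# texel with height 0.2, moves up by 0.2:
assert displace((0, 0, 0), (0, 0, 1), (0.9, 0.1)) == (0.0, 0.0, 0.2)
```

That's the sort of thing you simply can't do without texture access in the vertex pipeline, which is why its absence from the NV30 stings.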

I'd also bet that JC is getting tired of being criticized for doing indoors-only games with rather uncompetitive engines (although they're certainly not bad at all even in these circumstances, looking at certain derived games!), so he'll focus a bit more on outdoors this time around; not forgetting his love for dark indoor walls, though! :LOL:


Speaking of Doom 3 though, I really, really hope the screenshots aren't too hand-chosen. Having played the DX2 demo, shadow volumes are, well, stupid in it. They eat performance, don't look too good, and they make the rest look awful. What can I say? Oh, yes... Meh!
I'm sure Our Saviour JC will have done an awful lot better than those "Deus" wannabees, though :p


Uttar
 
Tagrineth said:
The Baron said:
No, I bet it's moving away from using triangles to using quads. So, the uber-Doom 4 card will be NV1.

I wonder if a Saturn port is on the way, too? :D
Of course. If we're lucky, we might even get to play it on the Jaguar. Now THAT would be rockin.
 
The Baron said:
and yes, I agree with him about driver stability. I've had one crash so far with this 5700 Ultra versus 20 or 30 with a 9600 Pro. ATI has drastically improved performance and speed, but stability is not something that comes overnight (or even within a year). in another year or so, though, they should improve significantly and will probably equal or at least come a lot closer to NVIDIA stability.

I'm pretty sure Carmack was addressing stability from a developer's perspective and not from an end user's. The two are different.
 
I never had any crashes on my NV20 nor my R300 unless I forced my system to crash for testing reasons. Either I was bored or I tried to diagnose a problem for someone else.
 
StealthHawk said:
The Baron said:
and yes, I agree with him about driver stability. I've had one crash so far with this 5700 Ultra versus 20 or 30 with a 9600 Pro. ATI has drastically improved performance and speed, but stability is not something that comes overnight (or even within a year). in another year or so, though, they should improve significantly and will probably equal or at least come a lot closer to NVIDIA stability.

I'm pretty sure Carmack was addressing stability from a developer's perspective and not from an end user's. The two are different.
why must you always disagree with me? I mean, we agree like maybe one time out of seventeen-thousand. :? Why can't we just be snugglebuddies?
 
cthellis42 said:
The Baron said:
Why can't we just be snugglebuddies?

I think there are other forums for that. ;)
Hehe. Seriously, sometimes I think Stealth is me and I am Stealth, and we are simply polar opposites in our perspective. It's funny.

But I would still say that NV's drivers are more stable, even for an end-user. A year isn't exactly a long time for something to be around, and the monthly updates, with their constant feature additions and performance improvements, are bound to cause problems. I'd rather they take their driver updates, add 2-4 weeks to each one for beta testing, and release 6 or 8 drivers a year. To me, it seems like a lot of it (the whole Catalyst program, with CM appearing and the monthly driver updates) is PR rather than good driver decisions. Constant new drivers don't equal a better driver. People don't really understand that.
 
K.I.L.E.R said:
I never had any crashes on my NV20 nor my R300 unless I forced my system to crash for testing reasons. Either I was bored or I tried to diagnose a problem for someone else.

Yeah, for the longest time my only crashes have been due to sloppy programming. That is. . . er. . . my sloppy programming. . . :oops:
 