Coming from you, this is a very, very good post. So, I will respond in kind.
Bigus Dickus said:
Derek's main beef seems to be that ATI made a decision that broke his game, and regardless of whether we think it's his game code's fault or ATI's fault, the fact remains that his game is broken and it's a pain in the ass for him.
Correct
BUT, it's not that big a deal because my code does fall back to using a Z buffer.
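For anyone wondering, here's roughly what that fallback looks like - a minimal sketch in Direct3D 8 terms (illustrative only, not lifted from my actual engine code): check the raster caps for W buffer support and drop back to a plain Z buffer when it isn't there.

```cpp
// Minimal sketch of a W-buffer fallback under Direct3D 8 (illustrative only).
#include <d3d8.h>

void EnableDepthBuffering(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
    {
        // The hardware exposes a W buffer: depth values are distributed
        // linearly in eye space, which helps precision out at the far plane.
        device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW);
    }
    else
    {
        // No W buffer (e.g. the 9xxx series): fall back to a regular
        // Z buffer and live with the non-linear precision.
        device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    }
}
```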
The point is, why should I have to put up with blatant rendering artifacts in my game on a 9xxx series card that a gamer out there paid almost $400 for, when they could have bought an nVidia, Matrox or any other board and not had the problem?
Gamers rely on visual quality and speed. That's why graphics cards lead. There are games that ship and ignore rendering artifacts such as texture swimming, tearing, shearing, Z-fighting, 3D clipping errors, broken fog etc., and more often than not those games still sell - but you WILL see reviewers and/or gamers mention the problems.
When you fork out $400 for a board (I got my 9xxx series free from ATI btw. It cost me nothing), you expect it to (a) work out of the box, (b) have drivers that, while not perfect, don't BREAK games you are currently playing - games which ATI *do* have access to and *know* are broken - and (c) show little or NO rendering artifacts.
Look, we're not talking about a $100 card here. OK?
PLUS, you have these fanATIcs bitching, insulting, harassing me etc. just because I happen to hold firm on my opinions and beliefs. That's the kind of thing you find in an nVidia vs ATI flame thread or a console war thread.
This is NOT about which card is better. In fact, I try to steer away from that because that's NOT the intent. As I have said before, ALL graphics drivers have problems. But so far, MY GAMES and several others run FLAWLESSLY on ALL nVidia and Matrox boards. But when it comes to ATI's hardware, they either (a) run, (b) don't run at all, or (c) run, but with all these blatant driver bugs.
Bigus Dickus said:
Granted. He's sour, and I would be as well. The obvious question is what is a slander campaign against ATI supposed to accomplish?
Since when was this slander? Or a campaign, for that matter? Surely you jest. So all the gamers, devs, reviewers etc. who are bitching about the SORRY STATE OF ATI DRIVERS are on a slander campaign too, then?
Bigus Dickus said:
Further, a sticky point is that ATI didn't tell him they were dropping W-buffer support. I can't help but wonder just what difference this would make.
I'd have known about it in much the same way they chose to frigging pimp how many damn transistors the card has. What's the difference? Oh, I get it. They'd rather pimp the HW to sell it - and leave devs to pick up the pieces of broken drivers and/or missing support? Yeah, that's the ticket. In fact, that's EXACTLY what they've done, isn't it?
This is the SAME ATI that cooked up drivers in order to exceed Quake3 benchmarks - when they could have put all that time and effort into FIXING buggy drivers.
The SAME ATI that has removed W buffer support from the HW because, according to them, it would destabilize the driver and produce a speed impediment.
Let me ask you this: if nobody is using the W buffer, WHY should they care whether it's going to cause a speed degradation or not? It's BOLLOCKS. It's not like the W buffer just ups and switches itself on if the app isn't asking for it. And if it's driver de-stabilization they're worried about (as they pointed out to me), that's just laughable - considering the state of ATI drivers in general.
Bigus Dickus said:
Does ATI, NVIDIA, or anyone else tell developers in general hardware details about their future products (well, ignoring JC for the moment) long before the initial previews? One would be inclined to think not, since some features are probably dependent on driver development, and in general such complete disclosure would constitute a tremendous source of information leakage.
Wrong.
They DO tell. In fact, we're not talking about disclosures of unannounced card features here, so please stop comparing apples to oranges. We're talking about a feature (the W buffer) that is NOT a secret and should NOT have been removed to begin with. So they removed it. That's fine. Their decision. The fact is, we [devs] should have known about it so that we could plan ahead. THAT'S THE NORMAL THING TO DO.
Why else do you think I challenged them to come up with a shader solution? Because it is their responsibility to do so. If they hadn't gone and tinkered around with this, we wouldn't even be having this discussion, and I wouldn't be staring at rendering artifacts in a game I released in 2001, a game I'm releasing at the end of 2002 - or a game I'm releasing in 2003 and beyond.
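To be clear about what I mean by a shader solution - and this is just a rough sketch of the sort of thing I'm talking about, NOT anything ATI has proposed and NOT my shipping code - the idea is to have the pixel shader write a linearly distributed depth value itself, which is essentially what the W buffer gave us in hardware. The ps_2_0 target, the g_FarPlane constant and the TEXCOORD7 input below are all hypothetical:

```cpp
// Hypothetical sketch of a shader-based stand-in for the W buffer: the pixel
// shader writes linear eye-space depth to the depth output. HLSL source shown
// as text you might hand to D3DXCompileShader with a ps_2_0 profile.
const char* kLinearDepthPS =
    "float g_FarPlane;                      // distance to the far clip plane \n"
    "                                                                         \n"
    "float4 main(float4 color : COLOR0,                                       \n"
    "            float  eyeW  : TEXCOORD7,  // eye-space W from the VS        \n"
    "            out float depth : DEPTH) : COLOR                             \n"
    "{                                                                        \n"
    "    // Replace the usual non-linear z/w depth with a linearly            \n"
    "    // distributed value - essentially what W buffering provided.        \n"
    "    depth = saturate(eyeW / g_FarPlane);                                 \n"
    "    return color;                                                        \n"
    "}                                                                        \n";
```

The obvious catch is that writing depth from the pixel shader defeats the hardware's early-Z optimizations, which is precisely the kind of speed trade-off THEY should be working out, not me.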
Bigus Dickus said:
So, perhaps Derek thinks the W-buffer support issue is a special case... one worthy of a special "heads up" to developers? Apparently not, since it's an obscure feature used by a tiny fraction of games, and one for which there are other alternative ways of reaching the same end result.
No, that's NOT what I think.
And no, a W buffer is NOT an obscure feature. If obscurity were the test, a LOT of other rarely used features that are part of the HW would have been removed too.
Bigus Dickus said:
But, assuming for a moment that ATI should have given a "heads up," just when should that have occurred? During the initial conceptual design of the R300 core? Of course not. During initial driver development? During the verification process? Or, more likely, when the product specs and driver capabilities were fully known... in other words, very close to the launch date. So how much time would a "heads up" have saved in coding efforts? A week, two, three, a month? My oh my, what a crime indeed.
See above. This feature is not a trade secret, is not subject to NDA, and is nothing that should be kept hush-hush. As such, there was NO reason NOT to tell us about its removal. NONE.
The fact is, as usual, ATI DROPPED THE BALL on this. And, as always, they expect us [devs] to fend for ourselves - just as they expect us and gamers to do when it comes to their piss-poor drivers.