It's a little past 5:30am and I'm at work. I check my email to find notifications of several PMs in my inbox. I check the PMs. Check the threads.
....and run into the same shit. I don't even know WHERE to start.
Nevertheless, I was going to start with this post - taking it apart at the seams, as only I know how - until I got to this bit. Then I went
...screw this. No way I'm bothering with this git. So, I'm just going to post the bit and move on.
rubank said:
Derek uses the word "pharmacodynamic" without ever having taken the time to look it up in the dictionary.
This is yet another example of character assassination and fanATIcs deliberately taking my posts out of context, twisting my words etc etc. NOWHERE in this forum - or anywhere else - have I EVER, EVER used this word. What a damn lie.
Look, I don't mind taking on the world. I have a box of toothpicks. But if you're going to engage me, BE FAIR. I'm only ONE man, you know. As such, do you folks have to sink to the depths seen in this thread and others in order to get ahead of me (which, in itself, isn't bloody likely)?
noko said:
Well Fresh, from what I understand (not much), Derek seems to have used the right method for his game with the W-buffer: he has a rather large 3D space with objects throughout it, rather than a bunch of close-up objects. So accuracy for far objects, and not so much accuracy for a few near objects, seems to make sense given the nature of his game.
I didn't know that the W-buffer wasn't supported in DX6-capable hardware and above; I thought it was a hardware standard used by pretty much all recent graphics chips.
As for the bugs in his games, I wouldn't know, since I can't seem to find any of his games at the store here.
Do yourself a favor and just ignore Fresh. If you've read the other thread, you know why. He doesn't have a CLUE what he's talking about. And I clearly described what I'm doing, how I'm doing it, and why.
An example? He says that most devs use 1.0 ft as the nearZ, and asks why I should set it to 0.01m. That, right there, is the sign of someone who has no frigging clue what he's talking about - but likes to think he does.
Even though I painstakingly gave a description of why it's seeded at that value, he is the only one (surprise!) who chooses to ignore this and continues to post the usual drivel. He must think I'm developing a shooter, in which all the pickups in the world are the same size as a 6' human. For example, the TDU (Target Designator Unit), used to illuminate targets for precision munitions guidance and one of the smaller items in the game, is as tiny as 0.18m H x 0.12m W x 0.03m L.
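And since apparently I have to spell out the near-plane thing too, here's a quick back-of-the-envelope sketch of what pulling nearZ in does to 24-bit Z precision at range. The near/far values here are illustrative, NOT my actual engine numbers. This is exactly why a game with tiny objects AND a big world wants a W buffer:

// zprec.cpp - back-of-the-envelope Z-buffer precision check.
// All numbers here are illustrative; build with any C++ compiler.
#include <cstdio>

// Standard D3D-style perspective mapping: eye-space z -> [0,1] depth.
double zToDepth(double z, double n, double f) { return (f / (f - n)) * (1.0 - n / z); }

// ...and the inverse, depth -> eye-space z.
double depthToZ(double d, double n, double f) { return (f * n) / (f - d * (f - n)); }

// World-space distance covered by ONE step of a 24-bit Z buffer at eye depth z.
double zStepAt(double z, double n, double f)
{
    const double lsb = 1.0 / double(1 << 24);   // one 24-bit quantum
    double d = zToDepth(z, n, f);
    return depthToZ(d + lsb, n, f) - depthToZ(d, n, f);
}

int main()
{
    const double farZ = 20000.0;  // say, a 20km far plane
    printf("Z step at 5km, nearZ = 1.0m : %.1f m\n", zStepAt(5000.0, 1.0, farZ));
    printf("Z step at 5km, nearZ = 0.01m: %.1f m\n", zStepAt(5000.0, 0.01, farZ));
}

With nearZ at 1.0m you get steps of roughly a metre and a half at 5km. Drop nearZ to 0.01m - because your smallest items are centimetres across - and the steps balloon to around 150 metres. A W buffer sidesteps that. Which is the whole point.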
As for the W buffer issue: just TWO days ago, I was in an email thread with Mike over at dev rel about this very thing. To the extent that I even sent emails to some of my industry contacts (including JC) to see if they knew wtf was going on and whether they were in fact advised by ATI that they had (a) removed W buffer support from the 9xxx series, or (b) that the W buffer was in fact NOT in the HW at all.
Below is my email to the head of dev rel, in its entirety. His reply basically said that the W buffer is not (!) in the HW, that they had no intention of supporting it - and that one of the devs will come up with a sample app to do it via pixel/vertex shaders (I already tried it, and as I suspected, it does not bloody work!). That's when I told him that if they say it works, they can bloody well write the damn sample app, like nVidia would if I asked them to prove something to me. He agreed. And I'm in a holding pattern waiting for it.
Of course, note that this shader solution won't work for legacy apps (such as my previous games, which rely on a W buffer).
I mean, they took out the W buffer WITHOUT TELLING ANYONE - even though the card is supposed to be DX backward compatible and a W buffer is accounted for in DX6 and higher. The jury is still out on DX9 but, as far as I know, it is still available in Beta 2.
Mike,
I've been doing some investigations. And then some thinking.
I know that the 9xxx series cards still have W buffer support in the hardware.
However, there is no way to turn it on because it is disabled in the display properties and the
registry settings are being bypassed. The driver is doing this, obviously.
So, I mean, what's it going to hurt to put this back in the compatibility section (since that's
where such stuff should go) of the ATI control panel, leave it off and let the user/application
decide if they want to use it or not? I mean, according to your explanation below, I really
don't see why ATI should be concerned about speed if it's left to the developer to determine
its use.
This all comes down to this nVidia vs ATI speed thing again - and it's UNFAIR for you [ATI]
folks (not you personally, I mean) to do this - while breaking backward compatibility with games
(mostly sims) that make use of and/or require a W buffer. This is the kind of thing that really
irks most of us (though other developers aren't as vocal) and yet, the ATI bigwigs wonder why such
a high number of games don't work on the 9xxx series (driver problems aside).
I mean, here is an excerpt of a message that an ATI engineer suggested as a workaround. He forgot
one simple (!) thing - most legacy cards do NOT have shaders!!! In fact, some cards don't even
have a pixel shader; they only have a vertex shader.
You can use the pixel shaders to write a Z value out. Given this, a vertex shader could pass
W as a parameter to the pixel shader, which could then write this out as a W value.
The basic principle is to set up the W computations in the vertex shader, then pass that
to the pixel shader by using a texture address component, which is a 32b float. In the pixel
shader, the shader will receive a 32b interpolated float, which will be the per-pixel W value.
Writing that out, it will get converted to a 24b float, which can be Z buffered, if you make
sure that all W's written out are positive. Or, you could convert to integer, and write out
that value.
Hopefully you'll be able to talk some sense into the ATI folks who make these decisions. In my opinion,
and that of several devs I know, the decision to disable the W buffer in these cards is just plain
ludicrous and of NO benefit to anyone.
And before they go telling you otherwise: to the best of my knowledge (me being on the DX beta team),
there are NO plans to remove W buffer support from DX9, because it has backward compatibility going all
the way back to DX7.
cheers
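For those following along at home, here's roughly what that workaround amounts to as actual shader code. This is MY sketch of the technique the engineer described - NOT ATI's sample app (still waiting on that) - and it needs ps_2_0-class hardware to write depth at all, which is precisely why it's useless for legacy cards. The constant and function names are mine:

// W-as-depth workaround sketch (DX9 HLSL, ps_2_0 level).
float4x4 worldViewProj;   // app-supplied transform
float    farPlane;        // used to normalize W into [0,1]

struct VSOut
{
    float4 pos : POSITION;
    float  w   : TEXCOORD0;   // per-vertex W, interpolated as a 32b float
};

VSOut vs_main(float4 pos : POSITION)
{
    VSOut o;
    o.pos = mul(pos, worldViewProj);
    o.w   = o.pos.w;          // clip-space W = eye-space depth for a standard projection
    return o;
}

struct PSOut
{
    float4 color : COLOR0;
    float  depth : DEPTH;     // pixel shader depth output replaces the Z value
};

PSOut ps_main(float w : TEXCOORD0)
{
    PSOut o;
    o.color = float4(1, 1, 1, 1);       // whatever shading you normally do
    o.depth = saturate(w / farPlane);   // keep it positive and in [0,1], per the excerpt
    return o;
}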
LittlePenny said:
The intended point was to see how people felt about which drivers should be used in a review. The ones that come with the card? The latest official? Or the latest beta?
I see. I think it depends on the reviewer. I, for one, am of the opinion that the drivers that come on the CD are the ones that should be used, in the same way that I described off-the-shelf products. Bubba was quite right in using the 775 drivers.
LittlePenny said:
Just a question, I wasn't prodding.
heh, OK. I see you skirting the realms of prodding and joining in the fan club.
LittlePenny said:
Reviewing with a bias toward the sim genre is what I found to be different, that's all.
Ah, OK. I agree with you then. Besides, Bubba would be hearing from me if he had gone an' run Sacrifice, Morrowind or such - being a hardcore simmer and all, that would be like sacrilege.
Althornin said:
Nagorak said:
Can't we just keep the personal attacks to a minimum?
Yeah, it would be nice to see that from BOTH sides here...
If you're going to jump on the "Hey, look at me!! Look at me!! I have something interesting to say!! No, really!!!" bandwagon, at least get your facts right. Who else is making personal attacks apart from the Usual Suspects?
Reverend said:
Derek, post screenshots of all those games exhibiting flaws in ATI's drivers.
I think that there are enough to go around on the web. Go to the Rage3D forum and take your pick. I'm not in the mood.
Besides, I'm working on a research article about this fiasco for two hardware sites. By the time I'm done and that propagates across the Net and in print, we'll see who has the louder voice - me, or the bunch of goons making personal attacks. I'm going to bury them so deep, they'll need a spatula to get out of it.
It's obvious that on Net forums like this one, the signal-to-noise ratio is too low for anything meaningful to be derived from my postings. That's the plan, though - they feel that by distorting the thread and taking it downhill, the message is lost. Fair enough; I'll just pick a different battlefield. The more they try to stifle me, the louder I'm going to keep yelling.
You'll read about it soon. Be patient.
geo said:
Y'know, as an 8500 owner (and a RadLE and Rage Fury owner before that), I really don't mind at all having a Derek Smart pounding on ATI for better drivers. I *would* mind if the *only* thing he was doing was pissing and moaning, but it isn't. There have been quite a few examples in his various posts where he has quoted his various communications to ATI and it is clear (to me at least) that those communications are helpful, polite, and even good humored. Helpful in not just pointing out a problem, but in also (at least the ones I saw) suggesting where the problem may be.
Derek must be an honorary Israeli tho - he seems to believe in massive retaliation. The threads I've seen go to hell start with Derek making a strong, but not unreasoned, statement, then somebody makes it personal, and then there are smoking craters all over the place...
Well said. If you look at my postings - this thread in particular - what you describe is exactly what goes on. As soon as they want to stifle me, they start the personal attacks, character assassination, post distortion etc etc. To the extent that you'd think I was running for office or something and this was an election year.
They're WASTING THEIR TIME, because I'm going to keep voicing my opinions as I see fit. The day ATI cleans up their act, they'll get a clean bill of health from me, from devs and from gamers. But I'm not holding my breath. There are STILL bugs in the 7xxx and 8xxx driver series, yet we have a 9xxx series of boards on the shelves - with their own set of problems. ONLY in this industry will crap like that occur.
I, for one, won't be surprised to see the first ATI class action lawsuit. From what I know, it's coming. You'll see.
IT said:
Note: OpenGL has no W buffer, and plenty of apps get by just fine without it.
All good points. But show me ONE OpenGL program which has large outdoor scenes.
There IS a reason I chose to stick with D3D instead of adopting OGL several years ago when I migrated from DOS.
crystalcube said:
I think it has been highlighted earlier too... but...
W-buffer support was optional, and it seems it has been dropped in DX9, so ATI decided not to support it in newer hardware.
Every feature in DX is optional.
W buffer support has NOT been dropped from DX9.
crystalcube said:
As far as fallback & informing the developer: the DX API provides the option to query the availability of the W-buffer, so if a developer has indeed programmed according to the API, then there is no need to inform the developer.
Listen to me, closely. No, not that close - back off the monitor a bit...
EVERY line of code that tries to do something like this checks the card's CAPS to see if it's supported. OK?
Good
Now, if the program checks for a W buffer, doesn't find it, then checks for a Z buffer, finds it and uses it - that's normal. OK?
Good
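In fact, just so there's no confusion about what that check looks like, here's a minimal sketch in D3D8 terms (the function name is mine; the caps bit and render states are straight out of the API):

// Minimal W-buffer check with Z-buffer fallback (D3D8 API).
// Assumes 'caps' came from GetDeviceCaps() and 'dev' is your device.
#include <d3d8.h>

void SetupDepthBuffering(IDirect3DDevice8* dev, const D3DCAPS8& caps)
{
    if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
    {
        // The card reports W buffer support - use it.
        dev->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW);
    }
    else
    {
        // No W buffer - fall back to plain Z buffering.
        dev->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    }
}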
If the program RELIES on the W buffer (hence the check in the first place) to alleviate artifacts, and uses it as a preference, then the first time the dev (such as me) is going to know that the W buffer no longer exists is (a) when I notice artifacts (as I did in the case of the 9xxx series) or someone reports them, or (b) if I'd somehow been told beforehand.
NOBODY knew that W buffer was absent on the 9xxx series until the artifacts in their games started showing up. ATI never told anyone. It doesn't appear in ANY of the docs/bulletins on their dev site.
What? Do we now have to trace through every line of our code each time a new card comes out, in order to find out which CAPS check is failing?
You're kidding, right?
crystalcube said:
So I really fail to see how ATI broke any standards or API in their new hardware by not supporting a feature which was optional.