SimHQ - Bubba does the ATI 9700 Pro.......

Status
Not open for further replies.
MDolenc said:
Derek,

Where did this "w-buffer will be removed in DX9" come from in the first place?

Several folks, including Fresh I think, were saying this in this thread. Another example of the lengths they would go to in order to prolong the ludicrousness of this whole farce.

If I have time later today, I will dig up all references to it - assuming the site's search function works right.

Xmas said:
Actually you must be kidding, right? If you need to trace through every line of code for this, you should think about using a better concept for your engine, this is ridiculous.

I didn't mean that literally. If you are a developer, you should know better than to assume the worst. Then again, maybe you don't. In which case, think what you want. I don't care.

Randell said:
I believe that was one of those 'enter a name' and generate a generic psycho-babble attack things. I don't believe it was meant seriously.

Whatever. The fact remains, I never said it. Period.
 
Agreed, you never said it; and again, that posting really had no place here. Just trying to point it out to reduce the heat :)
 
DaveBaumann said:
I've seen lots of screenshots of issues with the likes of Morrowind and a number of other titles; nobody is saying they don't exist. But I've yet to see any issues with all of the games Bubba has been looking up, from either Rage3D, or any forum, or from yourself. You are talking about issues with an absence of fact; we are only asking you to back them up.

And I'm sure that my response was crystal clear. But since you don't bother reading my posts nor sticking to the premise, I'm not surprised at your response.

DaveBaumann said:
All good points. But show me one OpenGL program which has large outdoor scenes.

Tribes?

You're kidding, right?

Do you have ANY idea what the maximum viewing distance for Tribes/2 is? No? I didn't think so.

Do you have ANY idea what the size of the largest and the smallest entity in the Tribes/2 world is? No? I didn't think so.

Get back to me when you have something tangible to debate, ok?

Randell said:
Agreed, you never said it; and again, that posting really had no place here. Just trying to point it out to reduce the heat :)

Well then. While you're at it, why not point out ALL the other posts in this thread, which have no place here. That is if you are in fact interested in reducing the heat. :rolleyes:
 
Derek Smart [3000AD] said:
NOBODY knew that W buffer was absent on the 9xxx series until the artifacts in their games started showing up. ATI never told anyone. It doesn't appear in ANY of the docs/bulletins on their dev site.

If I look at it from the code's perspective, it's just another card which does not support the W buffer. Unless the code has a mind of its own and expects W buffer support for every card made by ATI.

From the MS site (msdn.microsoft.com/library/default.asp?url=/library/en-us/dx8_c/directx_cpp/Graphics/ProgrammersGuide/UsingDirect3D/Rendering/FogAlphaDepth/DepthBuffers/QueryDepthBufferSupport.asp):

Although depth buffers are supported for all software rasterizers, w-buffers are supported only by the reference rasterizer, which is not suited for use by real-world applications. Regardless of the type of device your application uses, verify support for w-buffers before you attempt to enable w-based depth buffering.

Derek Smart [3000AD] said:
What? Do we now have to trace through every line of our code each time a new card comes out, in order to find out which CAPS report is failing?

You're kidding, right?

According to the MS link above, you are supposed to check the caps before using it. So if the CAPS report is failing and you are still using it, then it would surely be considered a BUG.

Derek Smart [3000AD] said:

crystalcube said:
So I really fail to see how ATI broke any standards or API in their new hardware by not supporting a feature which was optional.

:eek:

That was in response to noko's post.

Well, the artifacts problem is one thing, but saying that the artifacts problem exists because ATI broke the API is completely different.

btw Derek Smart, you don't need to come so close to the monitor every time you read a post ;)
 
What do you mean by large scenes?

I think what Dave was saying was that, compared to other OpenGL games, Tribes has very large outdoor scenes. The problem with your question

"show me one OpenGL program which has large outdoor scenes."

is that you did not define large. Large compared to what? I mean, Tribes, Neverwinter Nights and Serious Sam 2 all have large outdoor scenes compared to other OpenGL games. One could even call your games tiny, as Elite 3 has 99% more systems and planets to visit than BattleCruiser 3000AD v2.0. Does that make your games small? No, it does not.

In short, please define what you consider a large outdoor scene and give some examples of D3D games?
 
Derek Smart [3000AD] said:
Well then. While you're at it, why not point out ALL the other posts in this thread, which have no place here. That is if you are in fact interested in reducing the heat. :rolleyes:

nah, that'd put John and Dave out of a job and probably me out of mine :)
 
crystalcube said:
If I look at it from the code's perspective, it's just another card which does not support the W buffer. Unless the code has a mind of its own and expects W buffer support for every card made by ATI.

Good thing you're not a coder then - but what you posted is just ludicrous.

My games have been supporting and using a W buffer going back to 1999.

If you're a coder, it's just scary that you would write that

crystalcube said:
So if the CAPS report is failing and you are still using it, then it would surely be considered a BUG.

WHERE did you derive this particular tidbit from? It sure as hell ain't in that excerpt - or anywhere else for that matter.

1. Supposed to check the CAPS before using it? Yep.

2. If the CAPS reports back that it doesn't support it, how EXACTLY did you deduce from my postings that the code would still use it [W buffer] even after the CAPS had reported that it's not there? Are you KIDDING me?

This is exactly why the Net is a problem. You folks can just grab stuff from anywhere, post it, distort it and try to pass it off as facts.

Again, as I mentioned above - if the CAPS reports no W buffer, it falls back to Z buffer (16 or 24 depending on the card). Hence the visual artifacts which are not evident with a W buffer. Got it? :rolleyes:

In fact, I just checked

1. That MSDN article is old and goes back to 96 I think

2. The DX6, 7, 8, 9 Beta2 SDK docs (I have all of them here), still have references, descriptions and usage of the W buffer

GET YOUR FACTS RIGHT!

crystalcube said:
Well, the artifacts problem is one thing, but saying that the artifacts problem exists because ATI broke the API is completely different.

wot?

btw Derek Smart, you don't need to come so close to the monitor every time you read a post ;)

hehe, stop mimicking me :D
 
And I'm sure that my response was crystal clear.

Honestly Derek, no it's not crystal clear, because I don't know which response you are talking about: the one that says "I can point out a plethora of driver BUGS in EVERY SINGLE GAME he used in that review (using the 775 drivers he used)", or the one that says "There ARE problems with the ATI drivers on those games and the only one that I'm certain of (because I play it more often) is F4".

Again, all I'm asking for is a little evidence to back up these statements. Facts, Derek; where are the facts? I'm not arguing that they may not be there, I just want to see them.

Do you have ANY idea what the maximum viewing distance for Tribes/2 is? No? I didn't think so.

Do you have ANY idea what the size of the largest and the smallest entity on the Tribes/2 world is? No? I didn't think so.

Do you? If so, tell us. Again, give us some facts. Your debate is intangible because it is so far devoid of any facts. And as Pottsey points out, apparently it's devoid of any parameters.
 
DaveBaumann said:
1. That MSDN article is old and goes back to 96 I think

His quoted MSDN page relates to DX8.1, it says so at the top of the page...

Yes and MSDN gets updated depending on the current release of the API. They don't go back and re-write all that crap up each year or with each release you know.

In fact, in the DX9 SDK docs, some pages have in Bold Red lettering that the section is subject to change etc.

That's the point I was making. A LOT of games and apps have used the W buffer, and that excerpt in NO way says that you shouldn't. If that were the case, it wouldn't be supported in the API nor the hardware.

Man, this is getting harder and harder. But don't worry, my tenacity is as peaked as ever. Keep it coming folks - I'm a fast typist.
 
Yes and MSDN gets updated depending on the current release of the API. They don't go back and re-write all that crap up each year or with each release you know.

Seems to me that if they were in the documentation for DX8 then, regardless of when it was written, it would be no less applicable to that release of the API.
 
It's also interesting that those "software rasterizers" mentioned on MSDN do not exist anymore. The only software rasterizer still included in DX8+ is the reference rasterizer, and even that is only available with the SDK.
Will bug someone when I get home ;).
 
DaveBaumann said:
Yes and MSDN gets updated depending on the current release of the API. They don't go back and re-write all that crap up each year or with each release you know.

Seems to me that if they were in the documentation for DX8 then, regardless of when it was written, it would be no less applicable to that release of the API.

Dave, since you prefer to post first before reading, please go back and READ that MSDN item again. I just did, because I didn't have a clue what you two were yammering about.

The fact is, the excerpt that he posted misleadingly refers to the reference rasterizer not being suited for use by real-world apps. That statement has NOTHING to do with the W buffer. At all.

Here it is in its entirety

Once you know that the driver supports depth buffers, you can verify w-buffer support. Although depth buffers are supported for all software rasterizers, w-buffers are supported only by the reference rasterizer, which is not suited for use by real-world applications. Regardless of the type of device your application uses, verify support for w-buffers before you attempt to enable w-based depth buffering.

And since DX6, we've moved away from sole use of reference rasterizers because of how cards and drivers are architected.

As such, the W buffer is clearly supported and can be used by real world apps IF the device supports it.

That statement in NO WAY discourages the use of a W buffer, as you and he would like to believe it does.

*sheesh*
 
And since DX6, we've moved away from sole use of reference rasterizers because of how cards and drivers are architected. As such, the W buffer is clearly supported and can be used by real world apps.

No, it's telling you to test before using it, because it's not clearly supported.

Regardless of the type of device your application uses, verify support for w-buffers before you attempt to enable w-based depth buffering
 
DaveBaumann said:
And since DX6, we've moved away from sole use of reference rasterizers because of how cards and drivers are architected. As such, the W buffer is clearly supported and can be used by real world apps.

No, it's telling you to test before using it, because it's not clearly supported.

WHAT?!?! It's not telling you to test before using it because it's not supported. ARE YOU MAD?!?!

It's telling you to test before TRYING to use it because that's the STANDARD procedure for such ops. You HAVE to test a card's CAPS before you make ANY op calls to it.

Ho Lee Cow, this is really getting out of hand
 
Derek Smart [3000AD] said:
WHAT?!?! It's not telling you to test before using it because it's not supported. ARE YOU MAD?!?!

<sigh> I'm saying that it is not clear that this feature (w-buffering) is supported in all cases which is why it should be tested for - and this is not limited to software rasterisers.

From the first paragraph:

As with any feature, the driver that your application uses might not support all types of depth buffering. Always check the driver's capabilities. Although most drivers support z-based depth buffering, not all will support w-based depth buffering. Drivers do not fail if you attempt to enable an unsupported scheme. They fall back on another depth buffering method instead, or sometimes disable depth buffering altogether, which can result in scenes rendered with extreme depth-sorting artifacts.

Again, this seems to me to be saying that it's not a given that w-buffering is supported.
 
Argh, the forum mods ought to create a dedicated dereks Forum where derek "smart" can conduct his ATi driver diatribes and use all the foul language he likes. I am sick of reading his nonsense posts and arguments, and I am sure that I am not the only one. Consistently he takes others' arguments out of context, exaggerates beyond all reality and insults most anyone who should disagree with his conclusions, while ramming his opinions down everyone else's throat. smart spends hours posting the longest posts of drivel that I have ever had the delight of reading.

The reviewer in question had no problems with ATi drivers. Even some MS staff have given the Catalyst drivers a qualification of "at least" Detonator driver quality. I have read dozens and dozens of other reviews where the reviewers have had no problems with Catalyst drivers. But smart seems to have endless problems with the drivers, and goes on and on about ATi not supporting the "w" buffer, when it seems there are not many other developers out there who are nearly as concerned. A mass of people contradict your claims, derek, and have few to no problems with ATi drivers. Riddle me this, mr derek: how is it that a whole industry has relatively little to no problem with ATi drivers, yet you do? How come the vast majority of Radeon 9700 owners rarely have an issue with the card's performance, and yet you do? (Do you even have a Radeon 9700?) The reason you see so many people whom you claim are FanATIcs replying to your garbage arguments/posts is because so many disagree with your conclusions. Does their disagreement automatically qualify them as ATi fanb@ys? No. derek, you attack anyone who suggests that ATI has good/solid/stable drivers with outright bullshit arguments and profanity.

Something you should ask yourself derek is "Is it everyone else, or is it just me?" then after you answer that question ..... dig a little deeper.

Derek Smart [3000AD] said:
It is a well written review, though I take exception to the particular bit of rubbish.

That, right there, pure and utter bollocks - unless I misunderstood the real meaning of that paragraph.

For one thing, the 775 drivers that ship with the card are as buggy as hell. A common trait for ALL ATI drivers that ship with boards. So, I have no clue wtf he's talking about. :rolleyes:

"Utter" Bullshit.

Derek Smart [3000AD] said:
Remember, you said it - I didn't. If the hat fits....

Speaking of which....

Oh, and they released the much-touted and equally as buggy 777 drivers. Here, take a peek

Your evidence is a link to a thread on a message board with a few trolls giving a link to an Nvidia fan site claiming that ATi has driver problems. Big friggin deal; since when is a thread claiming driver issues from someone anonymous on the internet proof positive of anything?


Derek Smart [3000AD] said:
While I agree with you to some extent, remember that the 775 drivers which shipped with the 9000/9700 had similar busted functionality, e.g. TnL and a string (want a list?) of other issues.

Yes and while you are at it I would like you to produce evidence of each case. BTW linking to a message board thread does not qualify as evidence IMHO.

Derek Smart [3000AD] said:
The 9xxx series of drivers - IMO - are just as buggy as the 8xxx series of drivers. There is no inconsistency in this regard. None whatsoever.

DaveBaumann disagrees and I will take his word over yours any day of the week. Not to mention the industry acclaim ATI has garnered over the past 6 months over its driver reform. You have no credibility... "None whatsoever".

Derek Smart [3000AD] said:
While others - including myself - are touting ATI's improved driver support, I still maintain that they have a bloody long way to go.

If they still have "a bloody long way to go", then what the hell are you touting them for? Do you "tout" them so that you can turn around and bash the hell out of them, simply to avoid the possibility that you would be labeled a total ATI basher? Do ATI a favor and don't "tout" them at all, because if that is what you are doing then you should reconsider the definition.

Derek Smart [3000AD] said:
I predict that, as with ALL previous ATI cards, these 9xxx series of drivers will remain in this sordid, sorry state... right up to the release of the next generation of ATI cards. It is a neverending cycle that they simply can't seem to shake.

"Utter" Bullshit... More "touting"?

Derek Smart [3000AD] said:
Speed is one thing. Compatibility is quite another. What's the fucking point of touting/selling a fast card, if the competition has an almost as fast card (Ti4600) but with approx 98% more stability?!?!

LOL, yeah what's the point; 98% more stability my ass. "Utter" bullshit. The GF Ti4600 is beaten by nearly 150% in some cases by the Radeon 9700... again, you post mostly trash. All you do is post crap arguments here, all of them.

Derek Smart [3000AD] said:
Yes, I did say that. And in comparison to PREVIOUS releases for cards. Hence my assertion that they [ATI driver dev] have improved.

Yes, I know - the general theme around here is to IGNORE the facts and just muddy posts with the usual nonsensical and distorted diatribe.

It took you a couple weeks of DIGGING to find some issues that simply DO NOT AFFECT EVERYONE.

RUBBISH.

Well then, I guess you should have made it clear that you were intending on bashing the hell out of ATI no matter what they do at the time you "touted" them. "The usual nonsensical and distorted diatribe"? OMFG, if there is anyone around here that does this on a regular basis it is you... look in the friggin mirror, moron. "RUBBISH"

Derek Smart [3000AD] said:
The documented problems in my games are still there. The documented problems in other games are - for the most part - still there. e.g. They ripped out W buffer support, fucked up the Z buffer royally. And even with these 777 drivers, Morrowind, my game and several others STILL have Z buffer artifacts.

Post some screenshots of your "artifacts". Funny, I have not seen any reviews with screenshots of this problem. Apparently you are part of a small minority of people with ATi driver issues... almost qualifies as an Nv_idiot troll...

Derek Smart [3000AD] said:
You must be reading a different review. How many people are still playing sims? How many people bought/are playing Morrowind, BF1942, Mafia, The Thing, GTA3 (you want a list?) etc etc. Did you see ANY indication in his review that he even played ANY of these games?

BFD... a few complaints about a few games. AFAIK ATi has already resolved a number of the issues that you are addressing here. BTW, have you seen how crappy Morrowind looks on an Xbox? Slow frame rates and other anomalies on my Xbox, besides the game sucking in a big way. IMHO it is a poorly developed game, just like your 3000AD is.

Derek Smart [3000AD] said:
The sales of mainstream games eclipse the sales of niche games such as sims (including mine), so of course a legacy game like F4, CFS2 etc won't exhibit problems that others will. But the fact remains, the casual (e.g. BF1942 et al) and RPG (e.g. Morrowind) markets are vastly larger than the sim market.

There are no mass reports of driver issues with the vast majority of games on the market as far as the Radeon 9700 is concerned. Again ATI released a patch not long ago that addresses many of the problems that you are harping about. Yawn.

Derek Smart [3000AD] said:
If 95% of the people can play 95% of all games without serious issues, most without ANY issues Then What the HELL IS THE PROBLEM.

The problem is, quite simply, with your (a) math and (b) lack of FACTUAL data. No, pulling numbers (95%) out of thin air doesn't count as being factual. What IS factual is stuff like this.

LOL, if this is not an example of the "pot calling the kettle black" I don't know what is. Again, your only evidence is a link to a troll thread on a message board where anonymous posters may post anything they like as if it were the truth. You claimed the GeForce Ti4600 is 98% more stable than the Radeon 9700... how's that for "lack of FACTUAL data" and "pulling numbers out of thin air"? Again you post a total bullshit argument.

Derek Smart [3000AD] said:
I couldn't care less. The issue is about ATI's piss-poor drivers. We're not discussing nVidia, Matrox or anyone else. Yes, they too have some driver issues - but you CANNOT compare the state of those boards' drivers to that of ATI's. That's just laughable.

Well, on that matter I could give you dozens of links to threads about how piss poor nvidia's drivers are. Your blinkered approach is so dogmatic it is pathetic.

Derek Smart [3000AD] said:
I am firmly neutral and don't pick sides. All I know is that nVidia and Matrox boards have traditionally worked out of the box.

Is that right? So the Geforce 3 worked out of the box when it was launched, then? More crap from you about claiming to be "neutral", I see. LOL, I read on to where you post emails as evidence of some sort. More garbage evidence posted in an effort to smother people who disagree with your conclusion about ATi drivers. "If you can't dazzle them with your brilliance, baffle them with your bullshit" must be your motto. You go on to disagree with an owner of a Radeon 9000 who has no problems and believes that the card has "excellent" drivers. More of your "touting", I see.

Derek Smart [3000AD] said:
Heathen said:
Yes, I'm with Squidlor. I'm getting sick and tired of reading this constant ranting, raving and expletives which seem to arrive whenever Derek Shows up.

PLEASE SHUT THIS DAMN THREAD DOWN.

Yes, that's what we need to do: shut down threads that some people don't like or agree with.

What's killing the thread are posts from YOU TWO calling for the thread to be shut down. If you can't JOIN in the topic, WHY THE HELL ARE YOU READING OR POSTING TO IT?!?!?!

I mean, what the hell is the matter with you people?

Or, better yet, rather than shut the thread down, smother discontented people who disagree with you with truckloads of garbage that equate to pages of rubbish until you have browbeaten them into submission, right Derek? (Fabianism?) We should allow crap such as yours to be posted as if it were some sort of truth, then, I suppose, and ruin the reputation of what used to be considered a place where people could go to get answers without being confused by crap such as you post.

Derek Smart [3000AD] said:
It's a little past 5.30am and I am at work. I check my email and get a notification of several PMs in my inbox. I check the PMs. Check the threads.

....and run into the same shit. I don't even know WHERE to start.

My sentiments exactly; you have posted mostly garbage arguments so far. I didn't know where to start personally, so I thought that the beginning would be a good start, but I am growing tired of this.

Derek Smart [3000AD] said:
Nevertheless, I was going to start with this post - taking it apart at the seams, as only I know how - until I got to this bit. Then I went... screw this. No way I'm bothering with this git. So, I'm just going to post the bit and move on.

Yeah, just continue to ignore all the arguments that people have put in front of you and move on, because you can't possibly fool them all into thinking you are right. More garbage.

Derek Smart [3000AD] said:
rubank said:
Derek uses the word "pharmacodynamic" without ever having taken the time to look it up in the dictionary.

This is yet another example of character assassination and fanATIcs deliberately taking my posts out of context, twisting my words, etc etc. NOWHERE in this forum - or anywhere else - have I EVER used this word. What a damn lie. :rolleyes:

You chose to reply to this because it is something that you can reply to. Never mind that it has nothing to do with ATI drivers. Just like most of the ranting you post. More trash. :rolleyes:

Derek Smart [3000AD] said:
Look, I don't mind taking on the world. I have a box of toothpicks. But if you're going to engage me, BE FAIR. I'm only ONE man, you know. As such, do you folks have to sink to the depths seen in this thread and others, in order to get ahead of me (which, in itself, isn't bloody likely)?

Hrm, someone could make some sort of psychoanalysis from derek's feelings of the world being against him. Go to the yellow pages and look up Psychiatrist, derek; you need help.

noko said:
As for the bugs in his games I wouldn't know since I can't seem to find any of his games at the store here.

Nor will you, AFAIK; the game sucks. Besides, you wouldn't want that buggy piece of trash in your PC. Further, you have to pay for the patch. What's worse is that possibly one of the absolute worst developers out there is actually able to get away with taking cheap shots at the best graphics hardware designing company in the world. Hrm, something is amiss IMO. His ranting goes beyond reality when you consider he is one of the only developers having severe problems with ATi drivers. Never mind the industry acclaim for ATI's efforts and achievements; just listen to derek smart. Yeah, that's it.

Derek Smart [3000AD] said:
Do yourself a favor and just ignore Fresh. If you've read the other thread, you'd know why. He doesn't have a CLUE what he's talking about. And I clearly described what I'm doing, how I'm doing it and why.

Hrm, I like fresh's posts; they are refreshing in comparison with your unsubstantiated arguments. Argh, I am getting tired of this, and so I will forget about replying to the rest of your posts. But as anyone can see, most everything derek posts is trash, ad nauseam. Truly, derek, you are awarded my "Bullshitter/Basher" of the year... I mean decade... no, wait, "Lifetime Achievement" award. [action]Hangs head in shame for not being able to resist replying to this massive heap of compost that derek smart posts.[/action]

Sabastian.

PS: sorry for the long diatribe but it seemed appropriate.... 8)
 
can someone please close this Thread or even better ban Mr. Smarties.

PLEASE!

I think I know why smarties is so upset with ATi (and, judging by his posts, why he will never stop).

As Smarties mentioned, he reported a bug to ATi and ATi didn't care. Then the same bug came up with GTA3 and ATi responded "over night" and made a patch available. So this was the first time he and his BIG personality were neglected.

Then, as OpenGL-guy mentioned, ATi altered the drivers so that a buggy demo ran correctly, because this demo was used by many review sites (I think this demo was the 921 build of UT2003). Due to this, his game had problems again. That was the second time ATi neglected him.

And so all this hatred began. As it seems, he cannot stand being on the backburner, and so he will never stop (even when this game runs OK on ATi hardware).
 
Sabastian said:
Argh, the forum mods ought to create a dedicated dereks Forum where derek "smart" can conduct his ATi driver diatribes and use all the foul language he likes. I am sick of reading his nonsense posts and arguments, and I am sure that I am not the only one. Consistently he takes others' arguments out of context, exaggerates beyond all reality and insults most anyone who should disagree with his conclusions, while ramming his opinions down everyone else's throat. smart spends hours posting the longest posts of drivel that I have ever had the delight of reading.

Please shut up, shut up, shut up. Just SHUT UP!!

It is CLEAR that you have either (a) NOT read my posts or (b) you HAVE read them but, in keeping with the standard fanATIcal response, choose to discard and ignore them. And then post the usual fanATIcal NONSENSE. That's a great club you ladies have going there. Too bad you are, well, NO MATCH for me. But keep trying though - you've got a LONG road ahead of you, if posting non-factual nonsense and supporting other fanATIcal posts is the general theme.

WHERE did I use foul language?

I'm not even going to reply to your post. And this will be my last reply to you. Join the ranks of the others whose posts I ignore. If you feel so bad about it, jump up and down on your chair.

mboeller said:
can someone please close this Thread or even better ban Mr. Smarties.

PLEASE!

I think I know why smarties is so upset with ATi (and, judging by his posts, why he will never stop).

As Smarties mentioned, he reported a bug to ATi and ATi didn't care. Then the same bug came up with GTA3 and ATi responded "over night" and made a patch available. So this was the first time he and his BIG personality were neglected.

Then, as OpenGL-guy mentioned, ATi altered the drivers so that a buggy demo ran correctly, because this demo was used by many review sites (I think this demo was the 921 build of UT2003). Due to this, his game had problems again. That was the second time ATi neglected him.

And so all this hatred began. As it seems, he cannot stand being on the backburner, and so he will never stop (even when this game runs OK on ATi hardware).

More blatantly FALSE rhetoric. Now THAT'S a surprise. Point out FACTS as they pertain to this statement, THEN you'd have a leg to stand on.

In fact, NOWHERE did OpenGL_Guy EVER state what you wrote. Why? Because he'd be LYING. And I know for a fact that he wouldn't stoop that low. Now you folks sure as hell would.
 
Derek Smart [3000AD] said:
crystalcube said:
If I look at it from the code's perspective, it's just another card which does not support the W buffer. Unless the code has a mind of its own and expects W buffer support for every card made by ATI.

Good thing you're not a coder then - but what you posted is just ludicrous.

My point simply is that detection of the W buffer is independent of the card, and in the case of the ATI 9xxx series that detection fails; unless, that is, the code does not check for the W buffer on ATI cards.

I hope that's clear now.

Derek Smart [3000AD] said:
My games have been supporting and using a W buffer going back to 1999

Even on cards which do not have any W buffer support? That would be a technological breakthrough.

Derek Smart [3000AD] said:
If you're a coder, it's just scary that you would write that

And the world is so much more beautiful when you code? :LOL:

Derek Smart [3000AD] said:
crystalcube said:
So if the CAPS report is failing and you are still using it, then it would surely be considered a BUG.

WHERE did you derive this particular tidbit from? It sure as hell ain't in that excerpt - or anywhere else for that matter.

from here
Derek Smart [3000AD] said:
NOBODY knew that W buffer was absent on the 9xxx series until the artifacts in their games started showing up. ATI never told anyone. It doesn't appear in ANY of the docs/bulletins on their dev site.

What? Do we now have to trace through every line of our code each time a new card comes out, in order to find out which CAPS report is failing?

You yourself are saying that you don't wish to look into your code for a failing CAPS report when a new card is released.
Doesn't that imply that you have not handled the condition of that particular CAPS report failing in the first place, although the API docs ask you to verify the presence of such a feature before using it?

Derek Smart [3000AD] said:
2. If the CAPS reports back that it doesn't support it, how EXACTLY did you deduce from my postings that the code would still use it [W buffer] even after the CAPS had reported that it's not there? Are you KIDDING me?

I understand perfectly that once W buffer detection fails you switch to the Z buffer, but then the need to scan the code for failing CAPS, in this case for the W buffer, should not arise.

Derek Smart [3000AD] said:
This is exactly why the Net is a problem. You folks can just grab stuff from anywhere, post it, distort it and try to pass it off as facts.

Really?
That excerpt is from the current MS site (the source of the DX API), and it is for DirectX 8.1. Considering DX9 is not yet officially released, it should be considered the current spec.

Derek Smart [3000AD] said:
1. That MSDN article is old and goes back to 96 I think

2. The DX6, 7, 8, 9 Beta2 SDK docs (I have all of them here), still have references, descriptions and usage of the W buffer

Yes it does, but it clearly tells you to verify the presence of the W buffer before using it.
 