SimHQ - Bubba does the ATI 9700 Pro.......

Derek, your time would be much better used if you spent more time altering the architecture of your engine to deal with 24-bit Z gracefully, and less time arguing about whether or not ATI is an evil entity for removing W-Buffering and not calling you up personally to notify you.

The fact is, you have to check device caps for w-buffering. If you don't gracefully fall back, your engine has a problem and is not future compatible. ATI doesn't actually have to notify anyone of their removal of a feature which is optional. The fact is, as a DirectX developer, you have to deal with a combinatorial explosion of different capabilities, both current and future. Microsoft created this problem, not ATI. Bitch about Microsoft's shoddy API decisions.
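For what it's worth, the whole check-and-fall-back is only a few lines. A rough sketch against the DX8 interfaces (this assumes the device has already been created; error handling and the rest of the engine are omitted):

Code:
    #include <d3d8.h>

    // Prefer w-buffering when the hardware reports it; otherwise fall back
    // to ordinary z-buffering so the engine still runs on parts like the 9700.
    void SelectDepthMode(IDirect3DDevice8* device)
    {
        D3DCAPS8 caps;
        if (FAILED(device->GetDeviceCaps(&caps)))
            return; // leave the current depth mode alone if the query fails

        if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW); // true w-buffer
        else
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE); // plain z-buffer
    }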


As for OpenGL and "large open worlds", you've got to be kidding? A vast array of flight simulators are built on OpenGL. In the GIS space, many analysis apps contain truly staggering datasets of large terrains. And then there's the CAD/CAM market, which demands high precision, where architects are designing buildings and Boeing 777s using GL apps, where the range can go from the threads on a bolt to the shell of the airframe.
 
DaveBaumann said:
<sigh> I'm saying that it is not clear that this feature (w-buffering) is supported in all cases which is why it should be tested for - and this is not limited to software rasterisers.

Dave,
As I said a few posts up, there are no more software rasterizers shipping with the DX8+ retail runtime (you can check here) and the reference rasterizer is only available with the SDK. Of course you don't know what you are running on (in code) if you don't query for caps. And you *should* query for everything you use (if we talk about depth buffers, it's also a VERY good idea to check for a depth-stencil/backbuffer match), but that's not the point. The point is that you simply CAN NOT bypass the missing w-buffer, and the Radeon 9700 is the first new card that dropped support for it (you still have it in all GeForce chips, all Radeon chips but the 9700, and very probably also on Matrox). Even if that trick with ps_2_0 shaders mentioned earlier would work, it would disable all early pixel rejection (since the hardware does not know what the final depth will be until execution of the shader is complete) and would cost a great deal of performance.
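For reference, the format check I mean looks something like this - a sketch against the DX8 interfaces, where the adapter, render-target and depth formats in the example call are just sample choices:

Code:
    #include <d3d8.h>

    // Confirm a depth/stencil format is supported on the adapter and is
    // compatible with the chosen render-target format before using it.
    bool DepthFormatUsable(IDirect3D8* d3d, D3DFORMAT adapterFmt,
                           D3DFORMAT renderTargetFmt, D3DFORMAT depthFmt)
    {
        // Is the depth format supported at all on this adapter?
        if (FAILED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                          adapterFmt, D3DUSAGE_DEPTHSTENCIL,
                                          D3DRTYPE_SURFACE, depthFmt)))
            return false;

        // Does it pair with the render-target/backbuffer format?
        return SUCCEEDED(d3d->CheckDepthStencilMatch(D3DADAPTER_DEFAULT,
                                                     D3DDEVTYPE_HAL, adapterFmt,
                                                     renderTargetFmt, depthFmt));
    }

    // Example: DepthFormatUsable(d3d, D3DFMT_X8R8G8B8, D3DFMT_X8R8G8B8, D3DFMT_D24S8)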
 
Derek Smart [3000AD] said:
A LOT of games and apps have used the W buffer and that excerpt in NO way says that you shouldn't.

I asked this once before, but I'll give it a shot again here. Just what games require W-buffer support for proper rendering?

The answer so far is the BC series, and perhaps Mafia (I believe that was the other one mentioned).

As these are just two out of thousands, I'm curious just what some of the other hundreds are that constitute a lot, so I can check to see if I have any of them for reference.
 
DemoCoder said:
Derek, your time would be much better used if you spent more time altering the architecture of your engine to deal with 24-bit Z gracefully, and less time arguing about whether or not ATI is an evil entity for removing W-Buffering and not calling you up personally to notify you.

My engine already takes this into account. And I know for a FACT that you know this - especially since I've posted over and over and over that if the W buffer is not there, it falls back to Z buffer. My engine has been doing this since 1996.

DemoCoder said:
The fact is, you have to check device caps for w-buffering. If you don't gracefully fall back, your engine has a problem and is not future compatible.

Rubbish.

Again, you KNOW that I DO check the CAPS.

And I don't have a frigging clue why you'd even write the above - apart from the obvious notion of just typing for the hell of it.


DemoCoder said:
ATI doesn't actually have to notify anyone of their removal of a feature which is optional.

Rubbish. They are REQUIRED to notify us. They have time to pimp all the useless rubbish in the cards, but it's too much trouble to notify us about an ARCHITECTURAL change? ARE YOU KIDDING ME?!?!?

And almost 90% of the features in DX are OPTIONAL. WHAT'S YOUR POINT?

DemoCoder said:
The fact is, as a DirectX developer, you have to deal with a combinatorial explosion of different capabilities, both current and future. Microsoft created this problem, not ATI. Bitch about Microsoft's shoddy API decisions.

No, ATI created this problem - NOT - Microsoft. That's just an utterly foolish thing to say. Of course, it's ALWAYS easier to pass the buck, right?

DemoCoder said:
As for OpenGL and "large open worlds", you've got to be kidding?

No, are you?

DemoCoder said:
A vast array of flight simulators are built on OpenGL. In the GIS space, many analysis apps contain truly staggering datasets of large terrains. And then there's the CAD/CAM market, which demands high precision, where architects are designing buildings and Boeing 777s using GL apps, where the range can go from the threads on a bolt to the shell of the airframe.

Yes, and that's why they DO NOT USE CONSUMER LEVEL GRAPHICS CARDS. Pah!
 
:cry: This is so sad..... to have one person so abuse so many here..... I'm truly embarrassed by Mr. Smart's antics. :oops: At what point do you just say enough is enough? If anyone else that frequents this forum had behaved as Mr. Smart has.... well, do you think they would still be allowed to present themselves in this abusive manner?

I can understand Beyond3D's desire to have as many developers and other knowledgeable people frequent here. I myself truly appreciate reading & learning all that I have seen here.... well, most of it! :rolleyes: However, at what point do we realize that Mr. Smart has NOTHING to add, except profane language, an abusive tone, and self-promotion? Has he even once given us any of his profound knowledge? NO! All he has given us is his profane inability to be reasonable about anything he deems proper to post about.

This isn't your normal Fanboi stuff...... even the most ardent fanATIc or nVidiot doesn't carry himself like this......

Mr. Smart is like bad drugs..... at what point do we "Just say NO!"?
 
Derek Smart [3000AD] said:
Please shut up, shut up, shut up. Just SHUT UP!!

LOL…. No you, moron.

Derek Smart [3000AD] said:
It is CLEAR that you have either (a) NOT read my posts or (b) you HAVE read them, but in keeping with the standard fanATIcal response, choose to discard and ignore them. And then post the usual fanATIcal NONSENSE. That's a great club you ladies have going there. Too bad you are, well, NO MATCH for me. But keep trying though - you've got a LONG road ahead of you, if posting non-factual nonsense and supporting other fanATIcal posts is the general theme.

Yes, I did read your posts, err umm, most of them, and replied to them. Most of which were pure trash on your behalf, which probably accounts for 90% of your posts in this thread.

Derek Smart [3000AD] said:
WHERE did I use foul language?

LOL, now Derek is claiming he has never used foul language. Wow, that's what I call selective memory.

Derek Smart [3000AD] said:
I'm not even going to reply to your post. And this will be my last reply to you. Join the ranks of the others whose posts I ignore. If you feel so bad about it, jump up and down on your chair.

Geesh, not more than 15 seconds ago I thought that I had a "long road ahead of me". You're a moron. :eek:
 
Derek's main beef seems to be that ATI made a decision that broke his game, and regardless of whether we think it's his game code's fault or ATI's fault, the fact remains that his game is broken and it's a pain in the ass for him.

Granted. He's sour, and I would be as well. The obvious question is what is a slander campaign against ATI supposed to accomplish?

Further, a sticky point is that ATI didn't tell him they were dropping W-buffer support. I can't help but wonder just what difference this would make. Does ATI, NVIDIA, or anyone else tell developers in general about the hardware details of their future products (well, ignoring JC for the moment) long before the initial previews? One would be inclined to think not, since some features are probably dependent on driver development, and in general such complete disclosure would constitute a tremendous source of information leakage.

So, perhaps Derek thinks the W-buffer support issue is a special case... one worthy of a special "heads up" to developers? Apparently not, since it's an obscure feature used by a tiny fraction of games, and one for which there are other alternative ways of reaching the same end result.

But, assuming for a moment that ATI should have given a "heads up," just when should that have occurred? During the initial conceptual design of the R300 core? Of course not. During initial driver development? During the verification process? Or, more likely, when the product specs and driver capabilities were fully known... in other words, very close to the launch date. So how much time would a "heads up" have saved in coding efforts? A week, two, three, a month? My oh my, what a crime indeed.
 
martrox said:
at what point do we "Just say NO!"?

The point at which you shut the hell up and STOP reading and posting?

What's this? Just because I have strong opinions that I am WILLING to debate, you fanATIcs want me banned? Yeah, that's gonna hurt like hell. :rolleyes:
 
Derek Smart [3000AD] said:

More blatantly FALSE rhetoric. Now THAT'S a surprise. Point out FACTS as they pertain to this statement, THEN you'd have a leg to stand on.

In fact, NOWHERE did OpenGL_Guy EVER state what you wrote. Why? Because he'd be LYING. And I know for a fact that he wouldn't stoop that low. Now you folks sure as hell would.

Thanks for calling me a liar. I appreciate that, really! I would feel worse if you called me a friend.

But nonetheless, here is the quote from OpenGL_Guy from the old thread:

Because there was a demo of a certain game that had a bug. The demo was enabling Z bias too much (i.e. Z bias on the walls and the decals instead of just the decals). Since this demo was being used by certain review sites, we wanted to make sure things didn't look incorrect. Now the demo has fixed this bug, and our driver has a better Z bias to boot.

So you see, ATi has more interest in a buggy demo than in your game.
 
As I said a few posts up, there are no more software rasterizers shipping with the DX8+ retail runtime (you can check here) and the reference rasterizer is only available with the SDK.

I understand that, but that wasn't my point; the article was not just talking about reference rasterisers.

The point is that you simply CAN NOT bypass the missing w-buffer, and the Radeon 9700 is the first new card that dropped support for it (you still have it in all GeForce chips, all Radeon chips but the 9700, and very probably also on Matrox).

You may not be able to bypass it, but the MS docs are saying that you should not be reliant on it being there, as it may not be.

Even if that trick with ps_2_0 shaders mentioned earlier would work, it would disable all early pixel rejection (since the hardware does not know what the final depth will be until execution of the shader is complete) and would cost a great deal of performance.

I know. It will also break FSAA on 9700.
 
Derek Smart [3000AD] said:
martrox said:
at what point do we "Just say NO!"?

The point at which you shut the hell up and STOP reading and posting?

What's this? Just because I have strong opinions that I am WILLING to debate, you fanATIcs want me banned? Yeah, that's gonna hurt like hell. :rolleyes:

Why? So you can continue on with your diatribe against ATI? On and on, posting truckloads of dung as if the more you post the more correct you are, even though the trash is mostly unsubstantiated. You have more than just "strong opinions". It is obvious from all the garbage that you post that you do, however, have a "strong bias", and you have no credibility as a result. Your "strong opinion" is as credible/good as your crappy, buggy game 3000AD. BTW, how much are you soaking your clients for that patch to your buggy game? I bet the game is still riddled with bugs even after the patch.
 
Derek Smart [3000AD] wrote:
martrox wrote:
at what point do we "Just say NO!"?


The point at which you shut the hell up and STOP reading and posting?

What's this? Just because I have strong opinions that I am WILLING to debate, you fanATIcs want me banned? Yeah, that's gonna hurt like hell.

Ummm...... seems to me that I started this thread....... What gives you the right to tell me not to post in my own thread?...... If anyone should stop, it's you, Mr. Smart.....
 
Coming from you, this is a very, very good post. So, I will respond in kind.

Bigus Dickus said:
Derek's main beef seems to be that ATI made a decision that broke his game, and regardless of whether we think it's his game code's fault or ATI's fault, the fact remains that his game is broken and it's a pain in the ass for him.

Correct.

BUT, it's not that big a deal because my code does fall back to using a Z buffer.

The point is, why should I have to put up with blatant rendering artifacts in my game for a gamer out there who paid almost $400 for a 9xxx series card to run it, when they could have bought an nVidia, Matrox or any other board and not had the problem?

Gamers rely on visual quality and speed. That's why graphics cards lead. There are games that ignore rendering artifacts such as texture swimming, tearing, shearing, Z-fighting, 3D clipping, broken fog, etc., and more often than not the games do sell, but you WILL see reviewers and/or gamers mention this.

When you fork out $400 for a board (I got my 9xxx series free from ATI, btw. It cost me nothing), you expect it to (a) work out of the box, (b) have drivers that, while not perfect, don't BREAK games you are currently playing - and which ATI *do* have access to and *know* are broken - and (c) show little or NO rendering artifacts.

Look, we're not talking about a $100 card here. OK?

PLUS, you have these fanATIcs bitching, insulting, harassing me etc. just because I happen to hold firm on my opinions and beliefs. That's the kind of thing you find in an nVidia vs ATI flame thread or a console war thread.

This is NOT about which card is better. In fact, I try to steer away from that because that's NOT the intent. As I have said before, ALL graphics drivers have problems. But so far, MY GAMES and several others run FLAWLESSLY on ALL nVidia and Matrox boards. But when it comes to ATI's hardware, they either (a) run, (b) don't run at all, or (c) run but with all these blatant driver bugs.

Bigus Dickus said:
Granted. He's sour, and I would be as well. The obvious question is what is a slander campaign against ATI supposed to accomplish?

Since when was this slander? Or a campaign, for that matter? Surely you jest. So all the gamers, devs, reviewers etc. who are bitching about the SORRY STATE OF ATI DRIVERS are on a slander campaign too, then?

Bigus Dickus said:
Further, a sticky point is that ATI didn't tell him they were dropping W-buffer support. I can't help but wonder just what difference this would make.

I'd expect to know about it in much the same way they chose to frigging pimp how many damn transistors the card has. What's the difference? Oh, I get it. They'd rather pimp the HW to sell it - and leave devs to pick up the pieces of broken drivers and/or missing support? Yeah, that's the ticket. In fact, that's EXACTLY what they've done, isn't it?

This is the SAME ATI that cooked up drivers in order to exceed Quake3 benchmarks - when they could have put all that time and effort into FIXING buggy drivers.

The SAME ATI that has removed W buffer support from the HW because, according to them, it would destabilize the driver and create a speed impediment.

Let me ask you this: if nobody is using the W buffer, WHY should they care whether it's going to cause a speed degradation or not? It's BOLLOCKS. It's not like the W buffer just ups and works if the app isn't telling it to. And if it's driver destabilization they're worried about (as they pointed out to me), that's just laughable - considering the state of ATI drivers in general.

Bigus Dickus said:
Does ATI, NVIDIA, or anyone else tell developers in general about the hardware details of their future products (well, ignoring JC for the moment) long before the initial previews? One would be inclined to think not, since some features are probably dependent on driver development, and in general such complete disclosure would constitute a tremendous source of information leakage.

Wrong.

They DO tell. In fact, we're not talking about disclosures of card features. So please stop comparing apples to oranges. We're talking about a feature (the W buffer) that is NOT a secret and should NOT have been removed to begin with. So they removed it. That's fine. Their decision. The fact is, we [devs] should know about it so that we can plan ahead. THAT'S THE NORMAL THING TO DO.

Why else do you think I challenged them to come up with a shader solution? Because it is their responsibility to do so. If they hadn't gone and tinkered around with this, we wouldn't even be having this discussion and I wouldn't be staring at rendering artifacts in a game I released in 2001, a game I'm releasing at the end of 2002 - or a game I'm releasing in 2003 and beyond.

Bigus Dickus said:
So, perhaps Derek thinks the W-buffer support issue is a special case... one worthy of a special "heads up" to developers? Apparently not, since it's an obscure feature used by a tiny fraction of games, and one for which there are other alternative ways of reaching the same end result.

No, that's NOT what I think.

And no, a W buffer is NOT an obscure feature. If it were, a LOT of other obscure features that are part of the HW would be removed.

Bigus Dickus said:
But, assuming for a moment that ATI should have given a "heads up," just when should that have occurred? During the initial conceptual design of the R300 core? Of course not. During initial driver development? During the verification process? Or, more likely, when the product specs and driver capabilities were fully known... in other words, very close to the launch date. So how much time would a "heads up" have saved in coding efforts? A week, two, three, a month? My oh my, what a crime indeed.

See above. This feature is not a trade secret, not subject to NDA, and is nothing that should be kept hush-hush. As such, there was NO reason NOT to tell us about its removal. NONE.

The fact is, as usual, ATI DROPPED THE BALL on this. And, as always, they expect us [devs] to fend for ourselves - as they expect us and gamers to when it comes to piss-poor drivers.
 
And just where are all these other developers......yoo hoo.....any other developers around that want to align themselves with Mr. Smart?
 
DaveBaumann said:
The point is that you simply CAN NOT bypass the missing w-buffer, and the Radeon 9700 is the first new card that dropped support for it (you still have it in all GeForce chips, all Radeon chips but the 9700, and very probably also on Matrox).

You may not be able to bypass it, but the MS docs are saying that you should not be reliant on it being there, as it may not be.

Please STOP going around in circles. You've clearly been caught out on a limb, so, I'll break it down for you again.

The MS docs are not saying ANYTHING different about checking for a W buffer before using it than they say about EVERY other feature in the DX API. Get it?

If you want to use it, check it first.

If it's there, use it.

This is NORMAL for EVERY API function call in existence - and not just DX.

DaveBaumann said:
Even if that trick with ps_2_0 shaders mentioned earlier would work, it would disable all early pixel rejection (since the hardware does not know what the final depth will be until execution of the shader is complete) and would cost a great deal of performance.

I know. It will also break FSAA on 9700.

And that's why I told them that it WILL NOT WORK. And if they insist, they can damn well prove it by writing the code snippet.
 
Derek Smart [3000AD] said:

DemoCoder said:
A vast array of flight simulators are built on OpenGL. In the GIS space, many analysis apps contain truly staggering datasets of large terrains. And then there's the CAD/CAM market, which demands high precision, where architects are designing buildings and Boeing 777s using GL apps, where the range can go from the threads on a bolt to the shell of the airframe.

Yes, and that's why they DO NOT USE CONSUMER LEVEL GRAPHICS CARDS. Pah!

And that is because they have higher Z precision than 24 bits?

Actually, I did a quick check on SGI and their cards seem to support only up to 24 bits for the Z buffer.

Maybe there are others which support higher, but it seems that even in the workstation arena most of them use only a 24-bit Z buffer.

And no matter how high the precision is, someone is going to run out of it someday, and they will have to deal with it.
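To put some rough numbers on that, here is a throwaway sketch (the 1 m near plane and 500 km far plane are just example values) comparing how many distinct 24-bit depth steps per metre you get from the usual non-linear z mapping versus a linear, w-style mapping:

Code:
    #include <cstdio>

    int main()
    {
        const double n = 1.0, f = 500000.0;   // example near/far planes, in metres
        const double steps = 16777215.0;      // 2^24 - 1 distinct depth values

        const double distances[] = { 10.0, 1000.0, 100000.0, 499000.0 };
        for (double z : distances) {
            // Non-linear z: d = f*(z - n) / (z*(f - n)), so dd/dz = f*n / ((f - n)*z*z)
            double zStepsPerMetre = steps * f * n / ((f - n) * z * z);
            // Linear, w-style mapping: w = (z - n)/(f - n), so dw/dz = 1/(f - n)
            double wStepsPerMetre = steps / (f - n);
            printf("at %8.0f m: %14.4f z steps/metre vs %8.4f linear steps/metre\n",
                   z, zStepsPerMetre, wStepsPerMetre);
        }
        return 0;
    }

The z-buffer packs nearly all of its resolution into the first few metres and has almost nothing left near the far plane, which is exactly where very deep outdoor scenes hurt.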

OpenGL.org FAQ (http://www.opengl.org/developers/faqs/technical/depthbuffer.htm) said:
12.080 There is no way that a standard-sized depth buffer will have enough precision for my astronomically large scene. What are my options?

The typical approach is to use a multipass technique. The application might divide the geometry database into regions that don't interfere with each other in Z. The geometry in each region is then rendered, starting at the furthest region, with a clear of the depth buffer before each region is rendered. This way the precision of the entire depth buffer is made available to each region.

In the end I would just like to echo DemoCoder: you will have to deal with a 24-bit Z buffer gracefully.
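For the curious, a minimal sketch of that multipass depth-partition technique (the slice boundaries and DrawRegion are placeholders for the app's own scene management; fixed-function GL assumed):

Code:
    #include <GL/gl.h>
    #include <GL/glu.h>

    // Placeholder: draws everything whose eye-space depth falls in [nearZ, farZ].
    void DrawRegion(double nearZ, double farZ);

    // Render the scene in depth slices, farthest first, clearing the depth
    // buffer between slices so each slice gets the buffer's full precision.
    void RenderPartitioned(double fovY, double aspect)
    {
        const double bounds[] = { 1.0, 1000.0, 1000000.0, 1000000000.0 }; // example slices
        const int numSlices = 3;

        glClear(GL_COLOR_BUFFER_BIT);

        for (int i = numSlices - 1; i >= 0; --i) {
            glClear(GL_DEPTH_BUFFER_BIT);            // fresh depth precision per slice
            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            gluPerspective(fovY, aspect, bounds[i], bounds[i + 1]);
            glMatrixMode(GL_MODELVIEW);
            DrawRegion(bounds[i], bounds[i + 1]);
        }
    }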
 
"And no, a W buffer is NOT an obscure feature."

So far the only game I know of that needs and uses a W buffer is your game. When one game out of thousands makes use of a feature, it can be considered obscure. You say lots of games use the W buffer; can you list them? So far I have not been able to find any other games.


I am still waiting for you to define what you consider a large outdoor scene in a game, so I can try and think of an OpenGL game of about the same size.
 
DaveBaumann said:
The point is that you simply CAN NOT bypass the missing w-buffer, and the Radeon 9700 is the first new card that dropped support for it (you still have it in all GeForce chips, all Radeon chips but the 9700, and very probably also on Matrox).

You may not be able to bypass it, but the MS docs are saying that you should not be reliant on it being there, as it may not be.

See, we are getting somewhere ;).
But what can you rely on in DX (or in OpenGL)? If something has been supported for two generations of hardware (Radeon 7x, Radeon 8x), why should we expect the Radeon 9x to lack the feature?
 