DirectX9 Beta3 released

Status
Not open for further replies.
sireric said:
If you want the same exact geometry, you need to send exactly the same vertices, in exactly the same order (which also implies the same primitive type). Otherwise, you might need to offset the next set of geometry, if you want it guaranteed "above or equal", or use multi-texture, or some other method.
sireric, stop making sense! Don't you know that is expressly forbidden? :D
 
Actually, I was thinking about it some more and it's not as trivial as I posted. I apologize for the "silly" comment.

In actuality, I do remember that there's a WHQL test that checks for rasterizer correctness in reaction to circular permutation of vertices (unclipped). The 9700 does pass that test and there's quite a bit of code to sort vertices per poly with a screen-based sorting algorithm (1/w first, then other criteria).
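In code, the idea might look something like this (an illustrative sketch only, not the actual driver code; the tie-breaker keys beyond 1/w are assumptions):

```cpp
#include <array>
#include <tuple>

struct Vtx { float x, y, invW; };

// Sort key: 1/w first, then screen position as tie-breakers
// (the criteria described above; the exact order is an assumption).
static std::tuple<float, float, float> sortKey(const Vtx& v) {
    return std::make_tuple(v.invW, v.y, v.x);
}

// Rotate (don't sort) so the vertex with the smallest key comes first.
// Rotation preserves winding, and every circular permutation of the
// same triangle collapses to one identical sequence, so the rasterizer
// performs bit-identical setup math for all of them.
std::array<Vtx, 3> canonicalize(std::array<Vtx, 3> t) {
    int lead = 0;
    for (int i = 1; i < 3; ++i)
        if (sortKey(t[i]) < sortKey(t[lead])) lead = i;
    return { t[lead], t[(lead + 1) % 3], t[(lead + 2) % 3] };
}
```

Any two circular permutations of the same triangle then produce the same canonical vertex sequence, which is what makes the rasterization order-independent.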

However, doing the same thing in clip space is much more difficult, and I'm not sure there's a correct general solution. For example, what sort criteria do you use to determine the "correct" vertex in clip space? You need multiple sets of criteria, in case the previous one fails (i.e. sort based on W, but what if one or more of the W's are equal?). Also, you must maintain the ordering of the vertices (to maintain the provoking vertex information), and so it would be harder to temporarily sort in clip space, clip, and then restore the original order, with the new vertices. Clip guardband certainly will help a lot here -- Though near/far clipping should still get you.

Also, consider cases such as V0=(X,Y,Z,W) and V1=(2X,2Y,2Z,2W), which are the same vertex after projection, yet different in clip space. How do you sort between those?
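That example is easy to make concrete: after the perspective divide, both vertices land on exactly the same projected point, so no sort key built from the raw clip-space components can distinguish them. A minimal sketch (types and names are hypothetical):

```cpp
struct Clip { float x, y, z, w; };
struct Ndc  { float x, y, z; };

// Perspective divide: clip space to normalized device coordinates.
// Scaling all four clip-space components by the same factor cancels
// out here, which is why (X,Y,Z,W) and (2X,2Y,2Z,2W) are the same
// vertex to the rasterizer but different data to a clip-space sort.
Ndc project(const Clip& v) {
    return { v.x / v.w, v.y / v.w, v.z / v.w };
}
```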

And, if you have a mesh of vertices, there are multiple triangle sets that will cover that mesh (with no overlap). All of those can lead to different solutions, and I see no way of stopping numerical errors from creeping in and causing slight differences between the sets.

In general, to obtain the same geometry through clipping, you really need to send exactly the same data and commands again. That's the only 100% sure way. In non-clipped cases, I think most circular permutations of vertices will lead to the same result, but I'm not 100% sure of all cases.
 
Pretty certain that ordering in terms of X,Y is sufficient; I guess there might be some oddness as triangles become degenerate, but I don't think that's a problem.

I think the problem here is really down to the APIs being a bit vague about the effect of submission order, this being coupled with one IHV deciding to work whatever....

John.
 
OpenGL guy said:
Crusher said:
Changing the primitive type shouldn't affect how the scene looks at all. I don't think you can blame developers for that one.
How the scene looks to the eye and how the scene looks to the hardware are completely different things.

Let me see if I get this straight.....

You bothered with the above missive, but didn't find it necessary to actually explain what you folks did in these drivers that fixed the problem so that the scene LOOKS LIKE IT BLOODY WELL SHOULD AND DOES ON **ALL** OTHER VIDEO CARDS?

It's this pathetic attempt at treating us like we're idiots that gets me the most.

As far as I'm aware, submitting different types of primitives for clipping of identical geometry should have NO effect on how the final scene is rendered - unless there is a bug in the HW or drivers. Obviously, since myself and everyone who had this problem on 9xxx cards didn't go out and get new HW, the problem was in the drivers.

Now, one might argue that the problem is in the HW and that the drivers were just revised in order to fix the problem. In which case, is the HW flawed? Is it one of those things where something was removed (e.g. W buffer) in the interest of speed, thereby crippling some aspects of the HW?

And let's not forget that these non-WHQL drivers were released in order to fix a problem in a game. What problem was that? Going by the description on the ATI page....

This issue affects RADEON 9700 PRO based products when running NHL 2003 by Electronic Arts.

During certain scenes in the game the system may lock up. Alternately, the game may exit back to the Windows desktop.

...this was supposed to fix a FATAL problem in yet another new game. In the case of my game, the problem did not cause lockups, freezes, exits to desktop, etc. This tells me that something else was fixed in these drivers beyond the problem it resolves in NHL2K3. WHAT?!?

Guardband my a$$. The scene looked just fine on every card so far, including 7500/8500 cards, until the 9xxx showed up. Here is what you folks said was the likely cause of the problem NOT showing up on other cards.

Why don't we see this problem on nvidia? One thing I see is that nvidia reports a very large guardband. It's quite possible that they are not clipping in this case so they don't have any problems here.

DaveBaumann said:
It strikes me that it could well be your code (as well as the code in other titles that have similar issues, thus needing this fix). However, ATI have had to patch it in the driver to meet the demands of gamers. Perhaps the most desired/optimal route as far as ATI are concerned would be to have the game code changed to better suit their hardware, but they have to do it the less optimal way and stick a change in the driver to stop titles exhibiting the issue.

Again, for the other titles that have been fixed, this just seems to be the old "GF3/4 is developers' primary development platform" issue.

You're kidding me, right?

sireric said:
Actually, I was thinking about it some more and it's not as trivial as I posted. I apologize for the "silly" comment.

heh, you think? ;)
 
Derek Smart [3000AD] said:
You're kidding me, right?

Errr why? I assume you read the posts after mine? You’ve just had four different people from two different IHVs explaining how these issues can occur; why would I be kidding?

John summed it up reasonably succinctly:

JohnH said:
I think the problem here is really down to the APIs being a bit vague about the effect of submission order, this being coupled with one IHV deciding to work whatever....
 
DaveBaumann said:
Derek Smart [3000AD] said:
You're kidding me, right?

Errr why? I assume you read the posts after mine? You’ve just had four different people from two different IHVs explaining how these issues can occur; why would I be kidding?

The posts are meaningless within the scope and context of the problem.

To wit:

  1. If it's in our [devs'] code, it's up to us to fix it. The same way we fix our own game bugs. I find it highly unlikely that ATI (*gasp* of all video card manufacturers) would revise their drivers around our [devs'] mistakes. That's so ludicrous it's not even funny. And that is pretty much what you are saying and what I excerpted from your post.
  2. I'm going to repeat this again, so that it sinks in, because you obviously missed it the first time I said it and would much rather lap up the rubbish that is otherwise being spewed in order to, once again, absolve ATI of all blame in yet another fiasco: As far as I'm aware, submitting different types of primitives for clipping of identical geometry should have NO effect on how the final scene is rendered - unless there is a bug in the HW or drivers.

    Obviously the API calls support it, there is NO mention anywhere (at least that I can find) that suggests otherwise, and I didn't start graphics and/or DX programming yesterday. Just because some faceless person on the board ups and speculates that it should be done one way, doesn't make it the right way.

    Apart from that, I am of the opinion that even if one went in and actually did make the change (suggested by ATI) and then re-installed the previous drivers, there is a very good chance that it wouldn't fix the problem. Which would lead me to believe that this was NOT the problem at all (as ATI driver devs assumed) but rather something else - now fixed - in the current drivers.

    Just for that, I'm going to see about that change sometime this week.

While you're posting your requisite fanboi rhetoric, why don't you ask your friends WHAT they fixed in these drivers that could have FIXED this problem. Your time would be better spent doing that, rather than running around in circles, similar to that whole thread we had about the definition of broken which is exactly what seems to be happening here.

It never fails. As soon as I find a problem, prove it had nothing to do with me, the fanbois are quick to rappel from the rafters with all manner of useless and pointless rhetoric. But you know what? I'm going to keep doing it because I like to watch myself laugh.
 
While you're posting your requisite fanboi rhetoric, why don't you ask your friends WHAT they fixed in these drivers that could have FIXED this problem. Your time would be better spent doing that, rather than running around in circles, similar to that whole thread we had about the definition of broken which is exactly what seems to be happening here.

Oh for god's sake Derek, calm down and be reasonable for a second – where is the ‘fanboi rhetoric’? I’m not attacking or defending anyone here; I’m just pointing out what is clearly happening.

What’s so difficult to comprehend about what John has mentioned?

Let's take the case where hardware X can handle something no matter how it's passed to it, the API makes no specific mention of rules for how this something should be handled, and hardware X also happens to be the developer's choice at the time. Seeing as this something has worked during their development, they haven't really cared about how it is handled. Now, what happens when hardware Y (and Z!), which does care about how this something is handled, attempts to render a scene that passes the data in a fashion it doesn't like? It's probably not going to display correctly. So what happens? Either they go back to the developer and ask for a patch, or they attempt to alter the drivers. Seeing as they have no control over how long the developer will actually take to correct the issue, and they have users screaming that they want their games to work, it's likely they will attempt to rectify the issue in the drivers (or do both: an interim fix in the drivers whilst chasing the developers to sort their game).

Now, where is the ‘fanboi rhetoric’? If anything, it's ‘Bravo’ to IHV X for making hardware that's ambivalent to such things.

If it's in our [devs'] code, it's up to us to fix it. The same way we fix our own game bugs. I find it highly unlikely that ATI (*gasp* of all video card manufacturers) would revise their drivers around our [devs'] mistakes. That's so ludicrous it's not even funny. And that is pretty much what you are saying and what I excerpted from your post.

Are you really serious? What on earth do you think driver revisions are actually for? Here’s an easy example (because they display it openly): if you have a KYRO board, stick it in the machine and do a search through the registry – you’ll see a ton of registry entries turning bits (driver paths / hardware elements) on or off just to make the games work.
 
I think Derek makes two points that are reasonable:

1. There is no reason to suppose (in advance) that submitting the same geometry in a different form would make it transform differently enough to cause Z-bias errors, especially when: a) previous T&L cards, including cards from the same vendor, did not have this issue b) the reference rasterizer does not have this issue, and c) there is a WHQL validation that checks that poly vertices sent in a different order are transformed the same way.

2. There is no way to tell if the drivers that fixed Derek's issue actually addressed the strip vs. fan issue at all. The fix the drivers are advertised to have was for a specific crash bug in NHL 2003. Presumably, there is no code in these drivers that specifically addresses Derek's game. Was the strip vs. fan issue actually fixed/worked around in the drivers? Was this the actual problem in Derek's game at all? Say the strip vs. fan issue was the original problem--if these drivers don't "officially" fix it, his game could break again with the next driver release. Or suppose that Derek's problem was something else entirely. Again, if the fix was incidental to the driver release, Derek has to face the possibility that the problem will show up again in a future release.

If Derek hacks at his game, he can rule in or rule out the strip vs. fan issue. If it turns out that this was indeed the problem, he can ask ATI if the fix that showed up in the latest drivers will continue to be supported. And if it turns out that wasn't the problem, he is totally in the dark.
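For reference, here is why strip vs. fan submission of the same quad reaches the clipper differently. A small index-only sketch of the usual decomposition rules (D3D-style winding for strips is assumed):

```cpp
#include <array>
#include <vector>

using Tri = std::array<int, 3>;

// Triangle strip over vertices 0..n-1: odd-numbered triangles swap
// their first two indices to keep a consistent facing (the usual
// D3D convention).
std::vector<Tri> fromStrip(int n) {
    std::vector<Tri> out;
    for (int i = 0; i + 2 < n; ++i)
        out.push_back(i % 2 == 0 ? Tri{i, i + 1, i + 2}
                                 : Tri{i + 1, i, i + 2});
    return out;
}

// Triangle fan over vertices 0..n-1: vertex 0 is in every triangle.
std::vector<Tri> fromFan(int n) {
    std::vector<Tri> out;
    for (int i = 1; i + 1 < n; ++i)
        out.push_back(Tri{0, i, i + 1});
    return out;
}
```

For a quad (n = 4) the fan yields (0,1,2),(0,2,3) while the strip yields (0,1,2),(2,1,3): the same pixels get covered, but different triangles with different vertex orders arrive at the clipper, which is exactly where a clipper that is not order-invariant can produce slightly different post-clip positions.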
 
Derek Smart [3000AD] said:
I'm going to repeat this again, so that it sinks in, because you obviously missed it the first time I said it and would much rather lap up the rubbish that is otherwise being spewed in order to, once again, absolve ATI of all blame in yet another fiasco: As far as I'm aware, submitting different types of primitives for clipping of identical geometry should have NO effect on how the final scene is rendered - unless there is a bug in the HW or drivers.

After a night's sleep, I agree with the above (a little reluctantly). I know that the rasterizer organizes vertices so that circular permutations of vertices are irrelevant. I was under the assumption that you could not guarantee that for clipping, but I realized that you need to guarantee it for clipping, otherwise you're likely to f* things up pretty bad. If you don't guarantee circular independence, then you could generate T-vertices in a clean triangle mesh, which is unacceptable. Assuming that we have circular independence, I'm sure it's also primitive independence.
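The T-vertex risk comes straight from edge clipping: computing the plane intersection from (a, b) versus (b, a) evaluates a different floating-point expression, and a last-bit disagreement on a shared edge opens a pixel crack. One common remedy, sketched here under assumed types (not any particular driver's code), is to canonicalize the edge direction before clipping:

```cpp
#include <utility>

struct P2 { float x, y; };

// Clip the edge (a -> b) against the plane x = 0. Evaluating this from
// (a, b) versus (b, a) is a different floating-point expression, so two
// triangles sharing an edge can disagree in the last bit: that is a
// T-vertex, and a crack in an otherwise clean mesh.
P2 clipEdge(P2 a, P2 b) {
    float t = a.x / (a.x - b.x);
    return { a.x + t * (b.x - a.x), a.y + t * (b.y - a.y) };
}

// Order-invariant variant: pick a canonical endpoint first (here, the
// lexicographically smaller one) so both triangles compute bit-identical
// results regardless of the direction they traverse the edge.
P2 clipEdgeInvariant(P2 a, P2 b) {
    if (b.x < a.x || (b.x == a.x && b.y < a.y)) std::swap(a, b);
    return clipEdge(a, b);
}
```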

Sorry. I'll shut up now :oops:
 
DaveBaumann said:
Oh for god's sake Derek, calm down and be reasonable for a second – where is the ‘fanboi rhetoric’? I’m not attacking or defending anyone here; I’m just pointing out what is clearly happening.

I'm calm and I'm fine. OK, so maybe the fanboi reference was a bit of a stretch, but I can't think of ANY other reason why you'd have this stance, given the FACTS as I have presented them. You're not notorious for taking such ludicrous stances - even though you ARE a fanboi. But this latest bit is just bordering on ridiculousness.

Look, I don't write benchmarks. I don't run benchmarks. I'm a developer with YEARS of experience. As such, I think I have more authority than most, in the subjects I engage in. I don't just go in blindly posting just because I feel the need to. This is why when I do post about stuff like this, I make sure that I know wtf I'm talking about - otherwise, I face the likelihood of being battered.

What’s so difficult to comprehend about what John has mentioned?

I don't have a clue what you're talking about

Althornin said:
DS, you started slipping again.

Wot? :eek:
 
No intention of being argumentative, but the WHQL test concerned only tests rasterisation, not clipping invariance. I believe even OGL is missing this type of test.

The reality is that this is quite an obscure problem, we only spotted the problem by chance when messing with user clip planes (we don't need to do screen space clipping).

This said, I can understand any ISV who gets fed up when they run into these sorts of problems, but don't forget that this works both ways. E.g. I've had to code a WHQL failure into a driver before, as an ISV had coded for another card that actually did things incorrectly, and the developer insisted that was the only card that behaved correctly.

Personally I think the batter pudding line of conversation was more amusing and certainly a lot less troublesome...

John.
 
Well, while I was typing up my last post, I got an email from ATI dev support. They confirm that they didn't fix anything in the 778 drivers which would have anything to do with my game.

In other words, even if they DID know why the 778 drivers fix this problem, they're probably not going to tell me.

While he didn't tell me exactly what these 778 drivers fixed, so that I can get an idea of what the problem was to begin with, he did say that they have [finally] capped the max ZBIAS value to 16 (nVidia, Matrox and everyone under the sun already do this. No cigar there) because my E1 demo (released on 03/17/01) was passing a value of 50. Yes, quite irrelevant, since the demo works fine. There aren't any ZBIAS values higher than 4 in any code base (I just checked my source control again to be sure). That 50 (as I mentioned before) was a typo and was supposed to be 5. But it still worked.

I have repeated my question to them, in that I'd like to know what was actually revised in these 778 drivers apart from the ZBIAS value capping.

Wish me luck in getting an answer. :rolleyes:

It is SO frustrating when they do this. I can send ANY video card dev an email saying, "....hey, that's worked!! What did you ladies fix?" and they'd tell me with no befuddling. But when you pose this question to ATI, you get stonewalled, silence and the like. It's like such a big frigging supa sekret to not tell devs - who DO NEED TO KNOW what their driver bug fixes actually FIX. I mean, WHAT IS THE BIG DEAL? Unless they were attempting to hide the obvious remarks that would be made at the expense of the driver devs. Well, I think someone needs to tell ATI that those remarks are still being made, regardless.
 
Derek, unfortunately the answer to your last question probably comes under the heading "marketing, marketing, marketing"; it's the same reason why drivers that fix HW 'issues' are almost invariably marketed as fixing a driver problem. Basically there seems to be this inability to admit that there was a HW deficiency and just state that the incredibly bright driver guys, with the help of the equally bright HW guys, have come up with a cool workaround that means it doesn't matter anyway :)

John.
 
Derek Smart [3000AD] said:
DaveBaumann said:
Oh for god's sake Derek, calm down and be reasonable for a second – where is the ‘fanboi rhetoric’? I’m not attacking or defending anyone here; I’m just pointing out what is clearly happening.

I'm calm and I'm fine. OK, so maybe the fanboi reference was a bit of a stretch, but I can't think of ANY other reason why you'd have this stance, given the FACTS as I have presented them. You're not notorious for taking such ludicrous stances - even though you ARE a fanboi. But this latest bit is just bordering on ridiculousness.

Look, I don't write benchmarks. I don't run benchmarks. I'm a developer with YEARS of experience. As such, I think I have more authority than most, in the subjects I engage in. I don't just go in blindly posting just because I feel the need to. This is why when I do post about stuff like this, I make sure that I know wtf I'm talking about - otherwise, I face the likelihood of being battered.

What’s so difficult to comprehend about what John has mentioned?

I don't have a clue what you're talking about

Derek - What does fanboi'ism have to do with anything here? I don't even care what the ATI drivers are doing in this case; they may be riddled with bugs for all I know or care. I'm trying to illustrate a point whereby something can work on one particular piece of hardware but not another without either the drivers being at fault, or necessarily the developer. You've demonstrated in a prior thread an inability to believe that this could possibly be a reason for an issue occurring, yet here we have people from various IHVs talking about such issues. Of course, this issue could also arise because of a hardware issue.

And cut the fanboi remarks; they are getting tiresome.
 
Derek Smart [3000AD] said:
If it's in our [devs'] code, it's up to us to fix it. The same way we fix our own game bugs. I find it highly unlikely that ATI (*gasp* of all video card manufacturers) would revise their drivers around our [devs'] mistakes. That's so ludicrous it's not even funny. And that is pretty much what you are saying and what I excerpted from your post.
Doesn't this contradict your statement:
he did say that they have [finally] capped the max ZBIAS value to 16 (nVidia, Matrox and everyone under the sun already do this. No cigar there) because my E1 demo (released on 03/17/01) was passing a value of 50. Yes, quite irrelevant, since the demo works fine. There aren't any ZBIAS values higher than 4 in any code base (I just checked my source control again to be sure). That 50 (as I mentioned before) was a typo and was supposed to be 5.
Why was a cap of 16 put in? Because a game used values outside the legal range (0 through 16). So because a game used invalid values, now there is an extra check in the driver just for poorly written applications.
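A driver-side guard like the one described could be as simple as a clamp. This is a sketch of the idea, not ATI's actual code; the 0-through-16 legal range for D3DRS_ZBIAS is as stated above:

```cpp
#include <algorithm>

// D3DRS_ZBIAS accepts integer values 0 through 16. Clamping out-of-range
// requests means an app passing 50 (like the E1 demo typo mentioned
// earlier) still renders predictably instead of hitting undefined
// behaviour in the hardware.
unsigned clampZBias(unsigned requested) {
    return std::min(requested, 16u);
}
```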
As far as I'm aware, submitting different types of primitives for clipping of identical geometry should have NO effect on how the final scene is rendered - unless there is a bug in the HW or drivers
I put the bold where it should be.
Apart from that, I am of the opinion that even if one went in and actually did make the change (suggested by ATI) and then re-installed the previous drivers, there is a very good chance that it wouldn't fix the problem. Which would lead me to believe that this was NOT the problem at all
Your opinion is more valid than someone else's for what reason? Oh I know what your answer will be: It's your years of experience that make you an authority on whatever you deign to comment on.

-FUDie
 
Two questions that no one has asked yet ...
1) Why do you even need to use both triangle lists and triangle strips for the same geometry? I can't come up with an example in my head where this would be beneficial.
2) Why not spend 10 minutes to change the app to use the same types in both cases? Then it's guaranteed by the API to work and will fix the problem. Sounds like 10 minutes of work.
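For what it's worth, suggestion 2 really is a small amount of code if the strip indices are available. A hedged sketch of a strip-to-list conversion (assuming the common D3D odd-triangle winding swap; function name is made up):

```cpp
#include <cstddef>
#include <vector>

// Convert triangle-strip indices to an equivalent triangle list,
// swapping two indices on odd triangles to preserve winding (the
// common D3D strip convention). Submitting everything as lists
// sidesteps any strip-vs-list differences at the clipper.
std::vector<int> stripToList(const std::vector<int>& strip) {
    std::vector<int> list;
    for (std::size_t i = 0; i + 2 < strip.size(); ++i) {
        if (i % 2 == 0) {
            list.push_back(strip[i]);
            list.push_back(strip[i + 1]);
        } else {
            list.push_back(strip[i + 1]);
            list.push_back(strip[i]);
        }
        list.push_back(strip[i + 2]);
    }
    return list;
}
```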
 
JohnH said:
Derek, unfortunatley the answer to you last question probably comes under the heading "marketing maketing marketing", its the same reason why drivers that fix HW 'issues' are almost invariably marketed as fixing a driver problem. Basically there seems to be this inability to admit that there was a HW deficiency and just state that the incredibly bright driver guys, with the help of the equally bright HW guys, have come up with a cool work around that means it doesn't matter anyway :)

John.

hehe, u said it, I didn't :D

Humus said:
Two questions that no one has asked yet ...
1) Why do you even need to use both triangle lists and triangle strips for the same geometry? I can't come up with an example in my head where this would be beneficial.
2) Why not spend 10 minutes to change the app to use the same types in both cases? Then it's guaranteed by the API to work and will fix the problem. Sounds like 10 minutes of work.

1. Why ask why? Obviously, if you go back and read my posts - particularly the one with the excerpt from ATI driver devs - you'd have your answer.

2. No, it's not 10 mins of work. Gimme a break. Do you think that if it was as easy as 10 mins of work, I wouldn't have found 10 mins in the past two weeks to actually bother with it? Fact is, the code should work, it works, and I have no intention of changing it.
 
Derek Smart [3000AD] said:
2. No, it's not 10 mins of work. Gimme a break. Do you think that if it was as easy as 10 mins of work, I wouldn't have found 10 mins in the past two weeks to actually bother with it? Fact is, the code should work, it works, and I have no intention of changing it.

I don't mean to beat this into the ground, but I haven't found the answer, so here it is: Why should the code work? Is there some specification of the APIs somewhere that dictates it must? Is there some basic theory of graphics that dictates it must work? Or is it just that it works with all other boards, thus it must be valid? I just want to know why it should work.

Thanks,
--|BRiT|
 
BRiT said:
Derek Smart [3000AD] said:
2. No, it's not 10 mins of work. Gimme a break. Do you think that if it was as easy as 10 mins of work, I wouldn't have found 10 mins in the past two weeks to actually bother with it? Fact is, the code should work, it works, and I have no intention of changing it.

I don't mean to beat this into the ground, but I haven't found the answer, so here it is: Why should the code work? Is there some specification of the APIs somewhere that dictates it must? Is there some basic theory of graphics that dictates it must work? Or is it just that it works with all other boards, thus it must be valid? I just want to know why it should work.

Thanks,
--|BRiT|

My previous statement which I typed in bold is a clear indication why it should and does work. Please read it.
 