DirectX9 Beta3 released

The statement being:
As far as I'm aware, submitting different types of primitives for clipping of identical geometry should have NO effect on how the final scene is rendered - unless there is a bug in the HW or drivers.

I assume that's the statement you're referring to?

Anyways, it's a little disheartening to not have ATI be straight up with developers as to what the issue was, or even produce a distinct and accurate list of changes. :-? As a software engineer, I've owned up to my mistakes even when it was something as silly as being sloppy -- lack of proper testing, or a cut-and-paste error ("whoops, should be < and not =").

--|BRiT|
 
Correct. I was going to go cut 'n paste it for you, but you beat me to it. You're not a lazy bastard after all. :p

As for the ATI bit, that's what irks me the most. Like I said, I seriously doubt that they'd tell me what was fixed and how. But they figured that I would, like a complete idiot, go off and modify my code to fix some imaginary bug, when in fact (a) I don't/shouldn't have to, and (b) it should/does work.

Go figure.
 
Derek Smart [3000AD] said:
1. Why ask why? Obviously, if you go back and read my post - particularly the one with the excerpt from the ATI driver devs - you'd have your answer.

2. No, it's not 10 mins of work. Gimme a break. Do you think that if it were as easy as 10 mins of work, I wouldn't have found 10 mins in the past two weeks to actually bother with it? Fact is, the code should work, it does work, and I have no intention of changing it.

1) Why ask why ... well, because it sounds quite odd. If you use a triangle list the first pass but triangle strips the second pass, I wonder why you don't use triangle strips the first pass. I haven't found what you're trying to do by reading your previous posts; "step 5" and "step 6" don't tell me a whole lot.

2) Because I figured that if your code were decently organized, there would be only one or possibly a few places where changes are needed. The change would in most cases be trivial, afaict.

I wouldn't just assume that it should work, unless you can back it up with some API spec text, which probably doesn't exist.
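
Purely as an illustration of the scenario under discussion (none of these buffer or count names come from the thread), here is a minimal D3D9-style sketch of the same mesh submitted once as a triangle list and once as an indexed triangle strip; in principle both passes should clip and render identically:

#include <d3d9.h>

void DrawBothWays(IDirect3DDevice9* device,
                  IDirect3DVertexBuffer9* vb, IDirect3DIndexBuffer9* stripIb,
                  UINT vertexCount, UINT listTriCount, UINT stripTriCount)
{
    device->SetStreamSource(0, vb, 0, sizeof(float) * 5);   // XYZ + one set of UVs
    device->SetFVF(D3DFVF_XYZ | D3DFVF_TEX1);

    // Pass 1: the mesh submitted as a plain triangle list.
    device->DrawPrimitive(D3DPT_TRIANGLELIST, 0, listTriCount);

    // Pass 2: the identical geometry submitted as an indexed triangle strip.
    // However the driver clips either path, the visible result should match pass 1.
    device->SetIndices(stripIb);
    device->DrawIndexedPrimitive(D3DPT_TRIANGLESTRIP, 0, 0, vertexCount, 0, stripTriCount);
}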
 
Derek Smart [3000AD said:
]
My previous statement, which I typed in bold, is a clear indication of why it should and does work. Please read it.
Your previous statement in bold leads off with "AS FAR AS I AM AWARE".
This begs the question: what if there is something of which you ARE NOT aware?
And hence BRiT's question.

As to my remark about your slipping, it was in reference to your rudeness and overreacting and spewing out "fanboi" labels.
 
Derek Smart [3000AD] said:
Oh, did I mention that I put in shadows and it works fine on every card but is flat out busted on the 9700PRO? No? Go see Mafia for an idea. Naturally, I can't do anything about it until the previous problem is solved, because they might be related. At this point, I don't have a clue.
It's very amusing to hear you say that shadows don't work because there appear to be problems in one application (Mafia). Last time I checked, shadows weren't part of the Direct3D or OpenGL specs, so implementations are likely to differ. Personally, I think you have no clue how to implement shadows. Shadows are working just fine for me in applications such as Morrowind, Giants, Quake 3, UT2003, etc. etc. etc., so I can just as easily say that shadows work fine.

In other words, I call your bluff: Show us shadows in your engine.
 
DS

If it's in our [devs'] code, it's up to us to fix it. The same way we fix our own game bugs. I find it highly unlikely that ATI (*gasp* of all video card manufacturers) would revise their drivers around our [devs'] mistakes. That's so ludicrous it's not even funny. And that is pretty much what you are saying and what I excerpted from your post.

This is just not true, and it's one of the reasons your one-sided posts really irritate me. Ever heard of the NVIDIA GL_CLAMP bug??? No, because you don't use OpenGL. Well, it's well known and well documented. Just go to openGL.org and you can read REAMS about it. Here is the gist: by default, with NVIDIA drivers, GL_CLAMP behaves like GL_CLAMP_TO_EDGE, which is totally incorrect. Other video cards implement GL_CLAMP correctly, which often kicks out undesirable results (border-colored seams the app never expected). WHY??? Because people like YOU sit around and code on ONE IHV, including their *bugs* which they tell you are not *bugs*. Thus companies like ATI have to code their drivers around it, or they get accused by people like YOU of having buggy drivers. When in reality it was NVIDIA'S PROBLEM THE ENTIRE TIME.
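
To make the difference concrete, here is a minimal sketch (the texture handle and function name are assumptions, and it assumes headers that expose GL_CLAMP_TO_EDGE, which is core as of OpenGL 1.2):

#include <GL/gl.h>

void SetClampMode(GLuint textureId, bool edgeClamp)
{
    glBindTexture(GL_TEXTURE_2D, textureId);

    // GL_CLAMP (spec-correct): filtering at the edge may blend in the constant
    // border color, producing a visible seam on hardware that implements it properly.
    // GL_CLAMP_TO_EDGE: the edge texel is repeated instead - what most games actually
    // wanted, and what NVIDIA's GL_CLAMP silently behaved like.
    GLint mode = edgeClamp ? GL_CLAMP_TO_EDGE : GL_CLAMP;
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, mode);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, mode);
}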

Here is a quote from openGL.org from the middle of one of these discussions...

On NVIDIA hardware correct clamping needs to be enabled in the registry. It's kind of strange but GL_CLAMP_TO_EDGE does not work correctly unless you set OGL_TexClampBehavior to 1 in the registry.
There was a topic on this some time ago and the NVIDIA guys said that enabling this by default would break a lot of applications.

That is, they said that correct GL_CLAMP behaviour could not be enabled by default but why GL_CLAMP_TO_EDGE isn't handled correctly is strange.

Thus NVIDIA is doing EXACTLY what YOU are accusing ATI of doing. They can't change their code to work correctly, because it would BREAK a bunch of games made by people LIKE YOU who code to NVIDIA hardware and drivers as the *norm*. No, instead people who own other cards get SCREWED, because they think it's ATI's drivers, or Matrox's drivers, etc., when it is NOT.

I am only bringing this up because I am SICK of you acting like NVIDIA are saints and ATI are the spawn of hell. No, this is not about D3D, but it illustrates a VERY valid point. NVIDIA has driver issues TOO, to this DAY, and they CAN'T fix them because it would break a bunch of apps. The very thing you say "WTF is that kind of crap" about ATI.
 
Devs have had to put up with bug-for-bug compatibility forever. It's almost irrelevant what the spec says, since it is the implementations that determine the real standard. Not just in 3D programming, but also when it comes to relational database APIs, HTML/CSS, or even Microsoft's own APIs.

You can never depend on the spec/documentation.
 
since it is the implementations that determine the real standard

Not when devs are implementing it based on faulty information given by the one IHV they are coding on. Please tell me you are not actually trying to say that the design specs don't matter, but if 80% of the people do it the way NVIDIA does it *incorrectly*, then that's the correct way???? EVERYONE else's hardware works correctly except NVIDIA's, but NVIDIA's way is correct anyway? :rolleyes:

*NO*. NVIDIA had a *bug* in their drivers for YEARS that developers who coded on NVIDIA hardware worked around, while everyone else whose drivers are NOT DESIGNED THAT WAY suffers. Tribes 2 exhibited this exact behavior for several months, and the developer's response was: ATI and other vendors do not correctly support GL_CLAMP. Which is very far from the truth. NVIDIA is doing it incorrectly, PERIOD. Everyone else is made to suffer and thinks they have buggy drivers.
 
I have to agree with Hellbinder here, and it did not just start with NVIDIA; before the NVIDIA era your drivers had to behave like 3dfx drivers... most likely it will now move to ATI "dialect" apps. Ideally, developers should use the reference rasterizer on any bug they encounter: if the ref rast is fine, then it's a driver/hardware bug that should be reported (don't work around it unless you have a deadline and can't wait for the driver fix - but do make sure that you can disable the hack/workaround in an ini file when a new driver comes out!); if the reference rasterizer fails as well, then it's a program or API bug. Do not assume that if something works on NVIDIA's or ATI's or whoever's hardware that it's correct.
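
A minimal sketch of the "check it against the reference rasterizer" advice, assuming a D3D9-style app where the window handle and present parameters already exist (the function and variable names are made up for illustration):

#include <d3d9.h>

void CreateHalAndRefDevices(HWND hWnd, D3DPRESENT_PARAMETERS& pp,
                            IDirect3DDevice9** halDevice, IDirect3DDevice9** refDevice)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    // The normal hardware device - what the game actually ships with.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, halDevice);

    // The reference rasterizer - painfully slow, but it is the spec in executable form.
    // If the artifact shows up here too, suspect the app or the API; if it only shows
    // up on the HAL device, suspect the driver or the hardware.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, refDevice);
}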

K-
 
I'm not saying that it should be this way, I'm just saying that the reality of the situation is that the market leader's implementation is the de facto standard. It's just that simple. Developers target their apps at existing hardware and drivers. Time to market matters. If they report a problem to an IHV and it doesn't get fixed (or can't be, if it is a hardware bug), they have to work around it. They simply cannot wait for the next driver patch or DirectX runtime update, or whatever. Once they do this, and games get shipped with the "bug" workaround, or a dependency on a certain API/driver version, that "bug" effectively gets locked in as a de facto standard.


Is it the ideal situation? No. I sit on two international standards committees, and it is a constant source of worry that one of the large consortium vendors will race out of the gates and ship an incompatible implementation (the other big worry is some vendor coming out of the blue with IP claims). For example, if Microsoft is a member of working group X, and they ship a non-standard implementation of X and call it "X" in their PR, developers' first realistic experience will be developing with "MS X", and if "MS X" has bugs or diverges from the standard, they will come to rely on these "features". Microsoft has consistently done this in almost every area they participate in.


Another source of problems is incomplete specifications that allow vendors too much leeway in terms of implementation. A given spec may have feature Y, but it might not fully specify how Y is implemented, calculated, or how it interacts with other features. Therefore, two different implementations of this spec may use different algorithms, which give developers differing results depending on how much they depend on the given implementation's way of doing things. I certainly didn't make up the phrase "bug for bug compatibility".


Real world interoperability is hard. There isn't really any sinister motive on the part of 3dfx, NVidia, or ATI with respect to these things. Sometimes it is an honest mistake, and sometimes it is a poorly worded or incomplete spec. And sometimes a vendor is just in a hurry to ship something and doesn't want to delay for months of validation.


The only way to ensure real world interoperability is to run very detailed unit tests on every part of the spec.

You think this problem is bad now? As the APIs get more and more complex, I expect the problem to get worse before it gets better.
 
DemoCoder said:
You think this problem is bad now? As the APIs get more and more complex, I expect the problem to get worse before it gets better.
Actually I don't think the problem is quite as bad as it used to be - particularly on the D3D side. This is down to:
1. more focus on meeting strict OpenGL conformance
2. Microsoft providing better specs, and particularly RefRast
3. the sheer size and depth of WHQL nowadays
4. the convergence of the different hardware vendors on common methodologies
5. a large base of applications (at least half of which are actually within spec :D )

That's not to say things are perfect, but it's a far cry from the early days when drivers were absolutely rammed full of 'make this work like 3dfx does it' code.

Shaders improve things significantly in one respect - because they are atomic instructions with tight specifications as to the expected results, they enforce convergence.

But there is one big issue coming up which EVERY game developer needs to be very aware of. There will be (are?) differences in internal precision, and developers MUST pay close attention to them. Something will work fine on hardware X, which always uses 'high precision', and not on hardware Y, which uses 'less precision'. If the developers aren't testing on hardware Y, then boom, instant broken app... and that's not the driver's or the hardware's fault!
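
As a hedged illustration of the sort of trap being described (the shader and variable names below are invented, not from the thread), consider a D3D9-era HLSL fragment kept as source text in the app: 'half' only guarantees roughly fp16 as a minimum, so arithmetic that looks fine on a part that runs everything at fp24/fp32 can visibly band on a part that honours the lower precision.

// Assumed ps_2_0-level HLSL, embedded as an in-source string purely for illustration.
const char* g_precisionTrapShader =
    "sampler2D lookupMap;                                       \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                 \n"
    "{                                                          \n"
    "    // Large tiling factor: fine at fp24/fp32, visibly     \n"
    "    // steppy if the hardware evaluates 'half' at fp16.    \n"
    "    half2 scaledUV = uv * 512.0;                           \n"
    "    return tex2D(lookupMap, scaledUV);                     \n"
    "}                                                          \n";
// Moral: test on at least one lower-precision part before shipping - the compiler won't warn you.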
 
Althornin said:
Derek Smart [3000AD] said:
My previous statement, which I typed in bold, is a clear indication of why it should and does work. Please read it.
Your previous statement in bold leads off with "AS FAR AS I AM AWARE".
This begs the question: what if there is something of which you ARE NOT aware?
And hence BRiT's question.

As to my remark about your slipping, it was in reference to your rudeness and overreacting and spewing out "fanboi" labels.

If you are reading-comprehension impaired, find someone to help you with that problem.

The terminology in AS FAR AS I'M AWARE is commonly used within the context of there not being any regulated (or otherwise) and/or alternative explanation related to the subject matter therein. Ergo: As far as I'm aware, you're a frigging idiot. See how it's used? Another example: As far as I can tell, you're just a frigging idiot who doesn't have a clue what he's talking about. See that usage?

Now, given the above, hopefully you've learned something. With that, go back and read my post, within the same context as it was intended and written.

Hellbinder[CE] said:
DS

I am only bringing this up because I am SICK of you acting like NVIDIA are saints and ATI are the spawn of hell. No, this is not about D3D, but it illustrates a VERY valid point. NVIDIA has driver issues TOO, to this DAY, and they CAN'T fix them because it would break a bunch of apps. The very thing you say "WTF is that kind of crap" about ATI.

No, you're bringing it up as you (and the rest of the ATI fanATIcs) have done in the past whenever ATI is back on the ropes. In this case, it is a clear issue of what I've always said.

Look, I don't give a toss about OGL. I don't give a toss about most manufacturers' drivers, because they WORK with MY games. I'm not the torch bearer for anyone. When ATI's drivers are botched and cause me aggravation and work, that's why I talk about it.

Yes, I know, you and your friends would rather it were all hush hush, swept under the carpet, etc. And it totally chafes your collective behinds when I post conclusive facts and evidence of the issues I bring up. You'd rather it were all fuzzy. And that's exactly why you folks try to distort, taint and maim the discussion in a pitiful attempt at steering it out of focus with a bunch of bollocks. To wit: this crap about GL_CLAMP.

Even with that, it boils down to the same thing, doesn't it? If devs were loud and vigilant enough and totally took nVidia to the carpet, what makes you think it wouldn't have been fixed? Are you saying that even JC didn't take it up with nVidia - since he's the quintessential OGL-for-games torch bearer? Even then, if devs can bother to develop around HW and API deficiencies, why can't they jam in a single registry key, with one instruction, to address the GL_CLAMP issue?

Do you have ANY idea how much stuff has been implemented and/or fixed in nVidia's and ATI's drivers as a result of my incessant bitching, numerous follow-ups, etc.? This ZBIAS clamping is just one small example. Do you think other devs didn't know about it?

Look, unlike you and your party friends, I don't sit on a forum and make noise just for the hell of it. I do a LOT behind the scenes which benefits a LOT of people everywhere. Your claim to fame is the same ludicrous, unproductive rubbish, not likely to cause even a casual glance on anyone's radar.

Yet people wonder why I've just resigned myself to not posting to most people in these forums - even though they incessantly continue to post addressed to me, knowing full well that I'm going to ignore it. Talk about low self-esteem laced with a need for attention. Pah!
 
Since I'm being accused of favoritism, let me give you a sampling of how my issues with nVidia are handled.

I recently discovered flickering in my game's main menu when AA is turned on. After hours of debugging it, I came to the conclusion that it was a driver problem.

I sent nVidia devs an email describing the problem. By the end of the day, I got several replies back - some with alternate suggestions which I categorically rejected because as long as I'm developing within DX specs, I have no incentive to work around anyone's driver glitches.

This flickering does not occur on any other card, not even ATI's (which, in itself, is quite shocking really).

Anyway, here is the last response I got from them, after I told them that I had NO intentions of modifying my code to work around driver bugs and that they have to fix it.

Derek,

Have you considered doing these copies to a texture, and then mapping that onto a quad on-screen? Copying directly to the back buffer is not particularly well defined when AA is enabled, and certainly less robust.

This doesn't change the fact that this should work in our drivers, and we must fix it.

So there you have it.
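
For what it's worth, the workaround the NVIDIA dev is suggesting would look roughly like the following D3D9 sketch (every name here is an assumption, not code from the thread): copy the source surface into a render-target texture and draw that as a screen-aligned quad, instead of blitting into the multisampled back buffer directly.

#include <d3d9.h>

void CopyViaTexture(IDirect3DDevice9* device, IDirect3DSurface9* srcSurface,
                    UINT width, UINT height, IDirect3DTexture9** outTex)
{
    // A render-target texture to receive the copy.
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, outTex, NULL);

    IDirect3DSurface9* texSurface = NULL;
    (*outTex)->GetSurfaceLevel(0, &texSurface);

    // Copy into the texture instead of poking the (possibly multisampled) back buffer.
    device->StretchRect(srcSurface, NULL, texSurface, NULL, D3DTEXF_NONE);
    texSurface->Release();

    // ...then draw a full-screen quad textured with *outTex as part of the
    // normal, AA-resolved scene.
}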
 
I see.

All that really matters then is that the hardware vendor "admits they are wrong and you are right." That should have been easy enough to predict. :rolleyes:

I'm curious, Derek. I believe that you posted (correct me if I'm wrong) that your personal code does not have any bugs by the time it makes it to the public. Any bugs that actually made it into public releases would be due to "your staff." (Though at least you still take responsibility for those bugs, whether they are 'yours' or not.)

If that is the case...and this is an honest question, was there ever a time when you e-mailed a vendor convinced that something was a driver bug, and then found out it was your code?
 
All that really matters then is that the hardware vendor "admits they are wrong and you are right." That should have been easy enough to predict.

I have no first-hand experience here, so correct me if I am wrong, but...

Well, I could imagine, behind the scenes, big corporate chipmakers like ATI and NVIDIA sort of pushing around the smaller game developers by "encouraging" them to work around their driver bugs so they don't have to devote the time and resources to fixing them themselves.

Why can they do this / why do they do this? Because it's much more important to the game developer's survival that their game works properly than it is to the chipmaker...

So, IMO it's important that at least some game developers stand up to chipmakers in situations like these. Derek may be a bit over the edge, but if I were in his position I wouldn't appreciate companies pushing me around when they are the ones who are responsible for the issue.
 
Joe DeFuria said:
I see.

All that really matters then is that the hardware vendor "admits they are wrong and you are right." That should have been easy enough to predict. :rolleyes:

Rubbish. But coming from you, why am I not surprised?

I'm curious, Derek. I believe that you posted (correct me if I'm wrong) that your personal code does not have any bugs by the time it makes it to the public. Any bugs that actually made it into public releases would be due to "your staff." (Though at least you still take responsibility for those bugs, whether they are 'yours' or not.)

You believed incorrectly because I never said that. Go back and read my post and stop misquoting me. If you're not sure of what I said, don't make references to it until you are sure or at the very least, have an accurate excerpt to that effect.

If that is the case...and this is an honest question, was there ever a time when you e-mailed a vendor convinced that something was a driver bug, and then found out it was your code?

No. Because knowing the down time it would cause me, I'm usually damn sure before I start writing emails. Unlike the shoot from the hip attitude that most here have, I don't have that particular character flaw - most especially when it comes to development.

GetStuff said:
All that really matters then is that the hardware vendor "admits they are wrong and you are right." That should have been easy enough to predict.

I have no first-hand experience here, so correct me if I am wrong, but...

Well, I could imagine, behind the scenes, big corporate chipmakers like ATI and NVIDIA sort of pushing around the smaller game developers by "encouraging" them to work around their driver bugs so they don't have to devote the time and resources to fixing them themselves.

Why can they do this / why do they do this? Because it's much more important to the game developer's survival that their game works properly than it is to the chipmaker...

So, IMO it's important that at least some game developers stand up to chipmakers in situations like these. Derek may be a bit over the edge, but if I were in his position I wouldn't appreciate companies pushing me around when they are the ones who are responsible for the issue.

That notion is basically what separates leaders from followers. I'm a leader, so I lead. And with that leadership comes the responsibility of sticking to your guns, making the right decisions, etc.

Followers don't usually have the stomach, balls (nor the opportunity to lead, perhaps) that the responsibility of leadership brings. And yet some don't have a choice. I have a choice. So I choose to lead.
 
Well, I could imagine, behind the scenes, big corporate chipmakers like ATI and NVIDIA sort of pushing around the smaller game developers by "encouraging" them to work around their driver bugs so they don't have to devote the time and resources to fixing them themselves.

I agree.

I could also imagine, behind the scenes in small development houses that don't have top-notch coding skills, that they may blame the chipmakers for problems that they perceive to be in the drivers, when in reality they're not.

I'm not accusing Derek of this in this case.

I'm simply saying that this goes both ways.
 
Joe DeFuria said:
I could also imagine, behind the scenes in small development houses that don't have top-notch coding skills, that they may blame the chipmakers for problems that they perceive to be in the drivers, when in reality they're not.

I'm not accusing Derek of this in this case.

I'm simply saying that this goes both ways.

Yes, it does go both ways, and I'm sure that there have been instances of such, but I haven't encountered them. Most of it comes from inexperience, really. And that's why I think most of the small dev houses without big-name devs in the frontline don't get heard much. Sad but true.

However, there was that one time back when I was doing DOS Glide development on 3Dfx hardware, when I happened upon a problem in a Hercules card which used a new 2D/3D unified 3Dfx architecture (I've forgotten what that 3Dfx chipset was called now). The game would run on some 3Dfx cards just fine, but would crash when a 2D op (e.g. the menu) was done. I traced it directly into the Glide driver and presented it to them. In the end, I was just using the wrong Glide API version. :D If anyone here was in the Glide dev program, they'd know just how much of a mess that was. There were so many API releases, revisions etc. that navigating the download site was a chore in itself.

EDIT:

Being the anal-retentive git that I am, I went into my VC software and dug up the particular problem above. The chipset in question was the Voodoo Rush. Below are the entries used in the game's bat file which alleviated this problem, even when I was using the proper API. Man, talk about memories. :D

@echo off

REM /d1 = debug mode (if enabled in distribution)
REM /n = do not play intro
REM /v? = display list of 2D video cards supported in native mode
REM /vn = use specific 2D video card where n=card num displayed in /v?
REM use /v1 for VESA mode if card not supported directly
REM /g = run 3DFX Glide version (v1.07C or higher)

SET USE_MFS=1

REM Global
SET FX_GLIDE_NO_SPLASH=0
SET FX_GLIDE_SWAPINTERVAL=0
SET FX_GLIDE_TMU_MEMSIZE=2

REM Voodoo/Voodoo Rush
SET SST_TMUMEM_SIZE=2
SET SST96_TMUMEM_SIZE=2
SET SST_SWAP_EN_WAIT_ON_VSYNC=0

REM Voodoo2
SET SSTV2_TMU_MEMSIZE=2
SET SSTV2_SWAP_EN_WAIT_ON_VSYNC=0

REM Banshee
SET FX_GLIDE_TMU_MEMSIZE=2
SET SSTH3_TMU_MEMSIZE=2
SET FX_GLIDE_EMUL_RUSH=1
SET SSTH3_SWAPINTERVAL=1

bc3000ad /g /d1 /n
 
Yes, I also agree that it's a two-way street with these things...

It doesn't appear to me that Derek is unfairly biased against ATi. I would bet money that if NVIDIA did the same things that Derek has claimed ATi is guilty of, he wouldn't hesitate to be just as vocal about his problems.


If you cut past all the attitude in Derek's posts, I see some real issues he is having as a developer with ATi's drivers. Does that mean EVERY game development house has the same problems with ATi? No, it doesn't, and I don't think anybody here is claiming that...

I see that a lot of the opposition Derek has met here is nothing more than fanboy retaliation, and as a professional in any field I would be pulling my hair out if I had to deal with fanatics getting in the way of me discussing issues that affect my business.
 
Derek Smart [3000AD] said:
No. Because knowing the down time it would cause me, I'm usually damn sure before I start writing emails. Unlike the shoot-from-the-hip attitude that most here have, I don't have that particular character flaw - most especially when it comes to development.
Rubbish. Then please explain why you contradict yourself constantly. Like your comment about IHVs not working around game bugs, after you stated that ATi had put in a Zbias clamp because of your program.

The D3D spec states that Zbias is a value from 0 (no bias) to 16. There shouldn't be a need for IHVs to put checks in their code to make sure applications are using values in that range. But then you twist things and act like ATi should have had that check all along. What a hypocrite.
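
As a hedged sketch of the range being argued about (the function and variable names are mine, and this uses the DX8-style render state, since D3DRS_ZBIAS was replaced by D3DRS_DEPTHBIAS in D3D9): keeping the value within the documented 0..16 range on the app side is a one-line check, whether or not the driver also clamps it.

#include <d3d8.h>

void SetZBias(IDirect3DDevice8* device, DWORD bias)
{
    // Per the DX8 docs, legal D3DRS_ZBIAS values run from 0 (no bias) to 16.
    if (bias > 16)
        bias = 16;   // clamp on the app side rather than relying on the driver to do it
    device->SetRenderState(D3DRS_ZBIAS, bias);
}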

-FUDie
 