More ATI Driver News from Derek Smart

Show ME where I posted that that driver had anything to do with D3D

Hrm? Since when was this about anything other than D3D? Now, maybe you meant to say "ati3d2ag.dll", but we wouldn't want to misquote you or take you out of context lest we suffer your wrath, now would we?

REGARDLESS OF D3D DRIVER IMPLEMENTATION, I NEVER, EVER, EVER, EVER HAD TO MAKE ANY ZBIAS CHANGES IN MY CODE FOR ANY BOARD TO GET IT TO WORK. FOR THE ATI BOARDS, I HAD TO

And how exactly does that contradict anything that openGL_guy said about ZBIAS being an ill-defined feature in D3D? Perhaps you can point to the ZBIAS specification that shows where ATI is doing it incorrectly as opposed to just doing it differently than other IHVs?

Maybe you'll understand his point if I put this another, related way:

REGARDLESS OF GAME IMPLEMENTATION, ATI NEVER, EVER, EVER, EVER HAD TO MAKE ANY MULTITEXTURING CHANGES IN THE 9700 DRIVERS TO GET IT TO WORK FOR ANY GAME. FOR YOUR GAME, THEY HAVE TO.

You're kidding me, right? Doesn't a knowledge of hardware architecture lend credence to board design and hence software development for said hardware?

Well gee, I guess we can extend that all the way to marketing folks.... Unless you elaborate on exactly "what" you did for PCB design (what product and in what capacity), then we really have no clue. One would NOT expect that PCB design leads to any intrinsic knowledge of software drivers for a chip on that PCB.

If you look at my post to Mike, that's what my first thought was, re my inference that I'd figure something out. Doh!!

Yes, we're all sure that using a pixel shader was your next step. And even if true, that's not what you said, and we wouldn't want to misquote you or take you out of context, or otherwise assume anything lest we suffer your wrath, now would we.

The point is, what about legacy games, e.g. my 2001 game and so many before it, that do NOT have pixel shader support? Huh?

Maybe if some consumers actually complained about it, it would have gotten even more immediate attention. Where are all the complaining consumers? I don't recall anyone with a 9700 at Rage3d complaining about your "2001" game....

Perhaps because it has such a low popularity? Perhaps that's why it wasn't initially caught, because they can't test EVERYTHING, but god forbid they don't test "last year's Derek Smart game"?
 
andypski said:
Finally, why the hell would JC want W buffer support in drivers? The fact that you even use this example, just goes to show how clueless you are. He's NOT doing ANYTHING that even REMOTELY requires the benefits of a W buffer. *sheesh*. And do you think that even HE has the power to influence the architecture of a card? Even if he wanted to (which I know for a fact that he's not even likely to get involved in such)?

Hmmm... that's a bit of a puzzle...

You see, I know for a fact that JC has plenty of power to influence how cards are architected. And while he might not get directly involved, I think you will find that IHVs make some effort to get his opinions while they are architecting their cards.

Believe it.

So, uhm, aren't we both saying the same thing? Though you seem to have taken the bus, while I'm firmly seated on the train? What'd I miss? ;)
 
I've tried hard not to join this 'debate', but after reading the recent posts I can't help myself...

"Thanks for showing up. I am going to, quite literally, rip you to shreds for all the aggravation you bastards in driver development have caused me."

Ok, setting DS's issues with drivers aside, this one quote just made me think one thing - what an as*hole. Often the message is not as important as the way it is presented, and this guy just takes the friggin' cake. It doesn't matter if he thinks he is doing the world a favour by bringing these issues to light; presenting them this way just makes him look like a dick and someone who deserves to be ignored.

Why should anyone listen to anything he has to say? The only reason anyone knows who DS is, is through his own self promotion and delusions of 'God-dom'. DS's products certainly aren't worth talking about.
 
Derek Smart [3000AD] said:
Thanks for showing up. I am going to, quite literally, rip you to shreds for all the aggravation you bastards in driver development have caused me.
If it makes you feel better, go ahead.
THAT and due to the context and tone of your post. Trust me, methinks you've bitten off more than you can quite possibly render.
Oh please don't hurt me. :rolleyes:
Here, pucker up....

OpenGL guy said:
Maybe you aren't so clever after all. The D3D driver for the Radeon 9700 is not ati3d2ag.dll :rolleyes:

Yeah. You're so clever Mr ATI guy that not only did you foolishly (a) misquote me, but (b) the misquote you pasted bore NO relevance to the attack you were making.

Show ME where I posted that that driver had anything to do with D3D
Did you not mention Z bias? Are you unaware that this is a D3D feature?
Good job. You're doing just as good as your driver development.
Much better than your game, too.
Zbias is an ill defined feature of D3D. For one thing, the refrast doesn't support it so you have no idea how much to bias the Z values for each bias level. How can you claim that ATi's implementation is broken when there's no reference to compare it to? Oh wait, you were probably comparing it to another IHV's implementation. :rolleyes: The initial Radeon 9700 code used a similar calculation for Z bias as the Radeon 8500 because it worked well in most apps. I found one app where it didn't work well but I couldn't get a fix in that would work for this particular app and the other apps we were looking at before the release of the card. The 7.76 driver should have a better Z bias calculation.
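For illustration, this is roughly all there is to Z bias from the application's side under D3D8. The cap and render state below are the standard ones, but the bias value of 4 is an arbitrary pick for this sketch, precisely because the spec never says how much actual depth offset one bias level is worth on any given chip:

// Minimal sketch of D3D8 Z bias usage. The bias value is illustrative only;
// D3D defines the legal range (0..16) but not what each unit means in depth terms.
#include <d3d8.h>

void DrawDecalWithZBias(IDirect3DDevice8* dev)
{
    D3DCAPS8 caps;
    dev->GetDeviceCaps(&caps);

    if (caps.RasterCaps & D3DPRASTERCAPS_ZBIAS)
    {
        dev->SetRenderState(D3DRS_ZBIAS, 4); // push the coplanar decal toward the viewer
        // ... draw the decal geometry here ...
        dev->SetRenderState(D3DRS_ZBIAS, 0); // restore the default
    }
}

How far that value of 4 actually moves the decal is left entirely to the IHV, which is the whole point.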

Again. Your arguments have NO basis in reality and you're doing a BAD job of it. Even the trolls are doing a better job at posting nonsensical RUBBISH than you are.
My arguments are based on fact. I guess facts don't fit into your reality.
Here, let me break it down for you....
No, allow me.
REGARDLESS OF D3D DRIVER IMPLEMENTATION, I NEVER, EVER, EVER, EVER HAD TO MAKE ANY ZBIAS CHANGES IN MY CODE FOR ANY BOARD TO GET IT TO WORK. FOR THE ATI BOARDS, I HAD TO

There, get it now? Good.
And your point is? As I stated above, Z bias is NOT WELL DEFINED. THE SPEC DOESN'T TELL YOU HOW MUCH YOU SHOULD BIAS THE Z VALUES BY. Got it? Good.
Let's move along with this farce, shall we?
Sure, it reminds me of your game.
AND, if your implementation of this was ok, why'd you have to go back and piss around with it in the 7.76 driver?
Because there was a demo of a certain game that had a bug. The demo was enabling Z bias too much (i.e. Z bias on the walls and the decals instead of just the decals). Since this demo was being used by certain review sites, we wanted to make sure things didn't look incorrect. Now the demo has fixed this bug, and our driver has a better Z bias to boot.
What does designing PCBs have to do with D3D drivers?
You're kidding me, right? Doesn't a knowledge of hardware architecture lend credence to board design and hence software development for said hardware?
No, it does not. How does figuring out where traces go on a PCB (that's PRINTED CIRCUIT BOARD, in case you didn't know) have anything whatsoever to do with driver development? You didn't answer my question.
You know, if you really want W buffer, you can do it in the pixel shader. You're clever, you can figure it out.

If you look at my post to Mike, thats what my first thought was, re my inference that I'd figure something out. Doh!!
That's how things go. Some people are afraid to solve problems, others bitch about it, and others just solve them.
The point is what about legacy games e.g. my 2001 game and so many before it, that do NOT have pixel shader support? huh?
W buffering is an optional feature of D3D. Optional means you don't have to support it. We don't support it. QED.
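To put "optional" in concrete terms: a D3D8 app is supposed to check the raster cap before ever asking for a W buffer, along these lines (standard D3D8 caps and render states; the plain-Z fallback is just the obvious choice for this sketch):

// Sketch of the cap check D3D expects applications to make. If
// D3DPRASTERCAPS_WBUFFER isn't exposed, fall back to ordinary Z buffering.
#include <d3d8.h>

void SelectDepthMode(IDirect3DDevice8* dev)
{
    D3DCAPS8 caps;
    dev->GetDeviceCaps(&caps);

    if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
        dev->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW); // true W buffering
    else
        dev->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE); // plain Z buffering
}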
 
cellarboy said:
I've tried hard not to join into this 'debate', but after reading the recent posts I can't help myself..

"Thanks for showing up. I am going to, quite literally, rip you to shreds for all the aggravation you bastards in driver development have caused me."

You must have quoted that while I was editing my post. Either that or you didn't refresh your browser. I have made two corrections to that post, one of which was some spelling errors and mismatched quote blocks. Please go back and refresh your browser and stop posting out of context.

As you can see, even OpenGL_guy quoted the whole thing correctly. And with a bit of humor in it. Gotta luv that. ;)

Thanks for your pointless participation. Do come again next time.
 
Derek Smart [3000AD] said:
So, uhm, aren't we both saying the same thing? Though you seem to have taken the bus, while I'm firmly seated on the train? What'd I miss? ;)

I was saying that JC is actively involved in influencing upcoming hardware architectures, while the way that I originally understood your post you seemed to be claiming that he either didn't or couldn't. Maybe I got the wrong end of the stick? ;)

Of course, I'm sure he tries to influence all architectures, and not just any particular vendor.

- Andy.
 
andypski said:
Derek Smart [3000AD] said:
So, uhm, aren't we both saying the same thing? Though you seem to have taken the bus, while I'm firmly seated on the train? What'd I miss? ;)

I was saying that JC is actively involved in influencing upcoming hardware architectures, while the way that I originally understood your post you seemed to be claiming that he either didn't or couldn't. Maybe I got the wrong end of the stick? ;)

Of course, I'm sure he tries to influence all architectures, and not just any particular vendor.

- Andy.

No, the fuzzification is probably in my badly worded rhetoric. Dunno. But we're saying the same thing. The point about W buffer that I was trying to get across was that he [JC] isn't going to get up and yell at ATI to implement some hardware/software component and expect them to just up and do it - as the original poster would imply.

Hopefully that's clearer now. Damn, this is hard work. :D
 
The point about W buffer that I was trying to get across was that he [JC] isn't going to get up and yell at ATI to implement some hardware/software component and expect them to just up and do it - as the original poster would imply.

As the original poster, I must say stop making assumptions about my implications. Your assumptions are wrong, as any idiot would know, so stop quoting me out of context. (That's the way you would reply, isn't it?)

No, IHVs would not "just jump up and do it." IHVs have transistor budgets, and their own timetables to manage.

However, IHVs would in fact jump up and seriously evaluate something Carmack has to say about future architecture needs. The same cannot be said of most other single developers, including, believe it or not, you.
 
Derek Smart [3000AD] said:
You must have quoted that while I was editing my post. Either that or you didn't refresh your browser. I have made two corrections to that post, one of which was some spelling errors and mismatched quote blocks. Please go back and refresh your browser and stop posting out of context.

I don't give a crap if you edited it or not - you did post it.

[edited - DB]
 
Doomtrooper said:
Actually Serious Sam: SE uses both DirectX and OGL... and runs fine in both modes...

Actually, in point of fact, SS runs like shit in DX, in my experience (and based on benchmarks).
 
Derek...

You seem to be a smart enough man.

Undertaking the coding of a game on your own is a nice little challenge. (understatement)

Unfortunately, how you come across is rather "assholic" TM

Maybe it's the loads of inept voters here in Florida that we have to deal with, or all the blue hairs driving 20mph slower than the speed limit, whatever.


You found something that doesn't work with your implementation of multitexturing and ATI drivers. Are you following the spec for multitexturing, or are you following another video card's driver setup and trying to do a little reuse of code? Seems to me the reason behind DX and OpenGL is to allow developers to code for one thing, and it's up to the video card manufacturers to match that.

Mind showing us the snippet where MT follows the DX spec on your coding side versus ATI's driver implementation?
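Something along these lines is what I mean - a bare-bones two-stage modulate done by the book, with ValidateDevice asked whether the driver actually accepts the setup. (The textures and stage ops here are placeholders for the sake of the sketch, not anyone's actual game code.)

// Hypothetical by-the-book D3D8 two-stage multitexture setup, nothing exotic.
// ValidateDevice asks the driver whether it can render this setup in one pass.
#include <d3d8.h>

bool SetupTwoStageModulate(IDirect3DDevice8* dev,
                           IDirect3DTexture8* base, IDirect3DTexture8* detail)
{
    dev->SetTexture(0, base);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    dev->SetTexture(1, detail);
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    dev->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE); // end of chain

    DWORD passes = 0;
    return dev->ValidateDevice(&passes) == D3D_OK && passes == 1;
}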

Reminds me of the goings-on between Intel and Sun.
The Pentium 4 does not like JRE 1.1x.
Fingers point both ways, yet JRE 1.1x works on everything else out there.
Sounds innocent enough, other than the fact that tons of companies still use products like Oracle, which uses older versions of the JRE.

So Derek, stop ranting, raving, etc. Just run the poor little old lady over next time she gets in your way to the beach.

Show where the MT is broken according to whatever standards you are following. I did the same with the JRE and P4 and pointed my finger at Intel. Can you do the same with proof?
 
Why is it when Derek Smart calls a spade a spade some of you spaz out? Even John Carmack has said he considers Nvidia's drivers (at least in OpenGL) the gold standard.
 
Joe DeFuria said:
Ati dropped w-buffer support. They don't support 32 bit z buffer. End of story. (For all the power and influence you purport to have on this industry, ATI dared to drop w-buffer support or 32 bit z? Gah!)

FYI (I've not followed all of the end of this thread yet so I don't know if it's been said) W-Buffer support has been dropped from DX9 - this is just further evidence that R300 is a 'truly' DX9 card. Presumably DX8 W-Buffer support must still exist in the runtime for legacy applications, so ideally something should be there, but depending on how 'DX9 specific' other future boards are, they may well drop native (read: hardwired) support of it as well.

As OpenGL guy has pointed out, the job of the W-buffer can just be achieved through the pixel shader anyway - I think this is where the 'software' support could come in; ATI could enable the DX8 cap by 'emulating' a W-buffer, getting the drivers to do the job over the shaders. However, any DX9 developer will have to do this themselves as there will be no DX9 cap for W-Buffer to support.
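Purely as a sketch of what that could look like for a developer doing it themselves on DX9-class hardware (the shader below is my own illustration, not anything ATI has described): the vertex shader forwards a linear eye-space depth in a texcoord, and the pixel shader writes it back out through the DEPTH output, so the depth buffer ends up linear rather than the usual z/w.

// Illustrative only: a pixel shader that writes a linear depth value,
// approximating what a W buffer provides. It assumes a matching vertex shader
// puts eye-space z divided by the far-plane distance into TEXCOORD0; it would
// be compiled with D3DXCompileShader and bound like any other pixel shader.
const char g_linearDepthPS[] =
    "float4 main(float linZ : TEXCOORD0,             \n"
    "            out float oDepth : DEPTH) : COLOR   \n"
    "{                                               \n"
    "    oDepth = linZ;   // linear depth, not z/w   \n"
    "    return float4(1.0f, 1.0f, 1.0f, 1.0f);      \n"
    "}                                               \n";

Worth noting that writing depth from the shader typically defeats early Z rejection on this class of hardware, which is one reason a driver probably wouldn't want to do it silently behind every legacy app's back.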
 
Re: Just ask yourself

caywen said:
What would JC do?

Find a workaround. Just do it.

Indeed. And I HAVE done that, haven't I? There comes a point where you CANNOT find a workaround. And at those times, you have to rely on the driver devs to come up with a fix - while fielding all the abuse that comes from people who play your games, clamouring for a solution.

Look, I'm not everyone's favorite poster child for good behaviour and I don't give a damn. The fact is, no matter what, at the end of the day, it is a win-win situation for ALL involved if ATI driver dev stopped asking first grade students to write drivers :D

OK, that was a jab at OpenGL guy. Hopefully he'll find the humor in it. As both of us go, we've both made our points and I do not wish to continue the engaging rhetoric, as it serves no purpose other than to feed trolls and people willing to go the extra mile to take things out of context.

Several people here have already sent me email telling me that this really is a nice place (though they make no apologies for the likes of Joe :D). So, we'll see. I'll stick around and come in from time to time. But for now, I've disabled the ability to be notified of new posts in this thread, because it's too time consuming. Never mind the fact that I took the day off in memory of our fallen friends; the sheer number of posts in this thread has been mind-numbing for me.

And Dave, you bastard!! The check had better be in the mail!!! All those banners you've been running in this thread. :D :D :D

DaveBaumann said:
FYI (I've not followed all of the end of this thread yet so I don't know if it's been said) W-Buffer support has been dropped from DX9 - this is just further evidence that R300 is a 'truly' DX9 card. Presumably DX8 W-Buffer support must still exist in the runtime for legacy applications, so ideally something should be there, but depending on how 'DX9 specific' other future boards are, they may well drop native (read: hardwired) support of it as well.

As OpenGL guy has pointed out, the job of the W-buffer can just be achieved through the pixel shader anyway - I think this is where the 'software' support could come in; ATI could enable the DX8 cap by 'emulating' a W-buffer, getting the drivers to do the job over the shaders. However, any DX9 developer will have to do this themselves as there will be no DX9 cap for W-Buffer to support.

100% correct

I am on the DX9 Beta team and I know that W buffer has been dropped. BUT there is still backward compatibility for legacy apps that need it - IF - the driver supports it.

I'm not concerned about DX9 and beyond. I'm concerned about current apps and legacy ones which DO rely on W buffer but which do not have a pixel shader.

In fact, I can write and fully implement pixel shader W buffer support in an afternoon if I put my mind to it. Assuming that it is in fact possible, as OpenGL_Guy says. Until I actually see it working, I'll hold on to my doubts.
 
It does seem kind of weird that ATi just dropped W-buffer support without any sort of software emulation or anything.
 
Nagorak said:
Doomtrooper said:
Actually Serious Sam: SE uses both DirectX and OGL... and runs fine in both modes...

Actually, in point of fact, SS runs like shit in DX, in my experience (and based on benchmarks).

No graphical glitches related to multitexturing... which is what the thread was kind of about at one time.
Since it was ported over for the Xbox, I would assume it would not run as smoothly :rolleyes:
 