SimHQ - Bubba does the ATI 9700 Pro.......

geo said:
Derek must be an honorary Israeli tho --he seems to believe in massive retaliation. The threads I've seen go to hell start with Derek making a strong, but not unreasoned statement, then somebody makes it personal, and then there are smoking craters all over the place. . .

So true... And then things just continue to spiral out of control from there. :(
 
Derek Smart [3000AD] said:
Speed is one thing. Compatibility is quite another. What's the fucking point of touting/selling a fast card if the competition has an almost-as-fast card (Ti4600) but with approx. 98% more stability?!?!

The problem is, quite simply, with your (a) math and (b) lack of FACTUAL data - no, pulling numbers (95%) out of thin air doesn't count as being factual. What IS factual is stuff like this.

So it's OK for you to pull numbers out of thin air, then?


:rolleyes:

Come on, Derek, argue better than that.

:)
 
noko said:
I am no developer, nor am I a programmer; just thought I'd get that out. Anyway, I think Derek brought up some excellent points, such as no W-buffer and a 24-bit Z-buffer on the Radeon 9700. Now I wish someone would explain - maybe somebody already did (the signal-to-noise ratio was low) - what games need these features and what happens when they are not available, how it affects the clip plane from near to far, what the hell the W-buffer is used for, and if it was ever used. Alright, maybe I have nothing to contribute, but those questions are bugging the $hit out of me o_O


Here are some long-winded points that might help you out and explain why some developers find it a pain in the a** to develop games on PCs:

1) A W-buffer can benefit vast open scenes where the viewing distance is huge, because more precision is maintained at a greater distance from the viewer.

2) D3D apps using the W-buffer must query for hardware support and, if there is none, use standard Z-buffering... too bad so sad :devilish: . (See the sketch after this list.)

3) An example of Z-fighting occurs when a game renders a shadow to a texture and then places the shadow texture _right on the ground_ AND does a Z compare. This, unfortunately, was about the only way to hack shadows before stencil shadows and shadow-map hardware were available (which they are now). The "place shadow texture on ground" method fails to look good at all times because the shadow essentially has the same Z value as the ground on which it is cast.

4) To alleviate the problem in #3, D3D has a ZBIAS renderstate that can be set to help the video card decide whether the ground or the shadow texture has priority, and will hopefully display the shadow on top of the ground at all times (if it is set to have a higher priority). Unfortunately, ZBIAS is ill-defined in the D3D standard (i.e. there is no standard way of implementing it and its consistency cannot be guaranteed across different video cards) and should not be relied upon.

5) Some games do decal texturing in a stupid fashion by placing a texture-mapped polygon right on top of (on the same plane as) other polygons to, for example, put a skull picture on a wall. This is the same problem as the example in #3 and is always destined to fail... _even_ if the skull picture is placed slightly out from the wall. The correct way of doing this is to produce a "hole" in the wall and put the skull picture in the "hole", and it is up to the programmers to relay this information to the level designers.

6) One should never rely on ZBIAS in D3D, the same way one should never rely on colorkeying (colorkeying is also ill-defined in D3D, and different cards produce different results). Colorkeying should be replaced with texture alpha operations instead. Btw, no modern game uses colorkeying anymore... if they do, they haven't done their research.

7) As a previous poster said, higher accuracy close to the viewer is more important than accuracy 100 miles from the viewer _in most apps_, and that is why Z buffering is sufficient most of the time. Note: OpenGL has no W buffer, and plenty of apps get by just fine without it.
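
For the caps dance in point 2 (plus the ZBIAS and alpha-test states from points 4 and 6), here is a minimal DX8-style sketch. The function name and specific values are mine, and error handling is omitted:

    #include <d3d8.h>

    // Sketch: use W-buffering if the hardware reports it, else plain Z.
    // 'device' is an already-created IDirect3DDevice8*.
    void SetupDepthState(IDirect3DDevice8* device)
    {
        D3DCAPS8 caps;
        device->GetDeviceCaps(&caps);

        if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW); // W-buffer supported
        else
            device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE); // fall back to Z... too bad so sad

        // Point 4: nudge coplanar decals/shadows toward the viewer.
        // Legal values are 0..16, but the visual result is ill-defined across cards.
        device->SetRenderState(D3DRS_ZBIAS, 1);

        // Point 6: texture alpha instead of colorkeying - alpha-test away
        // the "transparent" texels against a fixed reference value.
        device->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
        device->SetRenderState(D3DRS_ALPHAREF, 0x7F);
        device->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL);
    }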

Hope this helps cause some more confusion :)
 
Don't feed the Derek Smart... errm, I mean, trolls :LOL:

Umm.... Battlecruiser 3000 AD hurray forever or something. Hurray! LMFAO
 
noko said:
So what did ATI do:

• Dropped W-buffer support in hardware, without an easy workaround available.
• Dropped the Z-buffer down to only 24 bits of precision.
• Did not warn or inform the developers of the changes made, nor the reasons why, until after the hardware was launched.

Well, Derek, games require hardware to work upon.
The games were developed around known hardware standards and API standards, which for the most part are supposed to be standardized and not break later on when the API or hardware is updated (at least not in the foreseeable future).

I think it has been highlighted earlier too... but...
W-buffer support was optional, and it seems it has been dropped in DX9, so ATI decided not to support it in newer hardware.

As far as fallback and informing developers go, the DX API provides the option to query the availability of the W-buffer, so if a developer has indeed programmed according to the API then there is no need to inform them.

So I really fail to see how ATI broke any standards or APIs in their new hardware by not supporting a feature which was optional.
 
Bambers said:
Derek Smart [3000AD] said:
I am firmly neutral and don't pick sides. All I know is that nVidia and Matrox boards have traditionally worked out of the box.

The G400 is a prime example. :rolleyes:

Well, in D3D, which is all DS cares about (I think), yes, it was a prime example - it was the best card for purely D3D games at its launch time. Admittedly, a slow OGL ICD must have been a bit of an oversight ;)
 
Derek Smart [3000AD] said:
It is a well-written review, though I take exception to this particular bit of rubbish.

This is the first time I have ever known an ATI product to launch with such a complete coordination of driver and hardware support working and functioning together as a total package.

That, right there, is pure and utter bollocks - unless I misunderstood the real meaning of that paragraph.
He didn't have any problems with the drivers, so what?
For one thing, the 775 drivers that ship with the card are as buggy as hell - a common trait of ALL ATI drivers that ship with boards. So, I have no clue wtf he's talking about. :rolleyes:
This is your opinion, stated as fact as usual.
Oh, and they released the much-touted, and equally buggy, 777 drivers.
That's funny, as things are working better. Games like "The Thing" and "Battlefield 1942" are no longer having problems. There are always going to be some people who encounter problems no one else does; this is almost always due to user error.
While I agree with you to some extent, remember that the 775 drivers which shipped with the 9000/9700 had similarly busted functionality, e.g. TnL, and a string (want a list?) of other issues.
A driver bug does not constitute "busted functionality". As Dave and others have posted, TnL on the 775 drivers was working fine with the vast majority of applications.
I predict that, as with ALL previous ATI cards, these 9xxx-series drivers will remain in this sordid, sorry state... right up to the release of the next generation of ATI cards.
Maybe you should replace the "Ph.D." after your name with "Seer" as it's just as valid.
They ripped out W buffer support
The W buffer was never required for Direct3D. It's not even well-defined. Direct3D supports CAPS bits so that programs can see what features are available. If a certain feature, like the W buffer, is not available, then the application has to deal with it. If the application can't deal with it, then the application is broken.
All I know is that nVidia and Matrox boards have traditionally worked out of the box.
All I know is that the vast majority of games I have bought from companies such as id, Valve, Raven, Red Storm, Bungie, Activision, Blizzard, Sierra, etc. didn't need major patches to make them playable or to add basic functionality. Why can't I say the same about your games, Derek? You've released countless patches for your games, yet they are still rubbish.
1. The drivers that shipped with your board are the 775 set

2. There's NOTHING excellent about the 775 drivers. AT ALL

3. Please list this dozen games and I'll pull them off my shelf right now (I have 4000+ games in my library - so think before you piss up this tree) and tell you EXACTLY what's wrong with them.
You just can't stand it when someone has a difference of opinion, can you?
the 9xxx series HAS A MYRIAD OF PROBLEMS ACROSS A BUNCH OF MODERN-DAY GAMES. As such, running benchmarks on LEGACY games does not change the fact that THE DRIVERS SUCKED THEN, THEY SUCK NOW, AND THEY WILL SUCK FOR MONTHS TO COME. Did I leave any variation of SUCK out?
You keep saying "9xxx series". Don't you know that the 9000 and the 9700 aren't even related? The 9000 (aka RV250) is a derivative of the 8500, which is the R200. The 9700 is the R300. And you can say "suck" all you want, but it's still just your opinion.
I can tell you RIGHT NOW that running the 9700 Pro on the 775-776 drivers (haven't tried it on 777 yet) shows a LOT of Z-buffer artifacts, multi-texturing problems,
I have yet to see one example of a multitexturing problem.

Z-fighting implies multipass rendering (i.e. coplanar geometry); this is not multitexturing unless at least one of the passes uses two or more texture stages. And even if one of the passes were using multitexturing, that still wouldn't necessarily make it a multitexturing bug.
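
(For the terminology: multitexturing means feeding multiple texture stages in a single pass. A hypothetical DX8-style sketch, with made-up function and variable names:)

    // One pass, two texture stages: a lightmap modulated over the base map.
    // This is multitexturing. Drawing the same geometry twice with blending
    // (multipass) is what produces coplanar Z-fighting instead.
    void DrawLightmapped(IDirect3DDevice8* dev,
                         IDirect3DTexture8* base,
                         IDirect3DTexture8* lightmap)
    {
        dev->SetTexture(0, base);
        dev->SetTexture(1, lightmap);
        dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
        dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
        dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
        // ... set streams and DrawPrimitive() for the geometry here ...
    }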
The 777 drivers don't seem to do a damn thing differently from the 776 drivers - apart from the fact that they now include the driver patch that came after 775 and before 776. They may have fixed other things, but so far, I haven't seen any such indication.
Let's take this one piece by piece.
The 777 drivers don't seem to do a damn thing differently from the 776 drivers - apart from the fact that they now include the driver patch that came after 775 and before 776.
They may have fixed other things, but so far, I haven't seen any such indication.
So, because you haven't noticed any changes, that means the drivers aren't improved?

You keep pulling links out of the air in a vain attempt to prove your case. Of course you neglect the links that show where people aren't having problems, even with the same games that others are. You neglect the links where people get their problems sorted out due to a configuration error. You neglect links where people are happy with their new card. It appears that, in your mind, one person complaining about driver problems, whether confirmed or not, means that the driver "sucks".

You know, the world doesn't revolve around Derek Smart. Kicking, screaming and throwing temper tantrums won't change that.

-FUDie
 
Why wasn't this thread closed back on page 2?

I'm sure we can use Beyond3D's space and bandwidth for far more interesting discussion than this huge bashing and these 10-year-old-boy comments.

If Derek Smart really wants to say something, he can make a new thread with meaningful, precise information, including screenshots and proof.

So what's the point in leaving this thread open?

[I'm not against arguing; I'm against bashing and useless comments.]
 
Vote that up. Mr. Smart can be fun for a couple of pages, but I think there was enough of that here with the previous 20-page thread. Unless B3D wants to introduce a psychological disorders section in the forum.
 
Is venting a good reason for leaving it open?


-DC: PhD, MD, MPH, JD, Rear Admiral, and RN (Registered Nurse)
Project Lead, Product Manager, CEO, Senior Vice President of Engineering, Receptionist, and Mail Room Boy for DemoCoder Industries, creators of the FlameCruiser2003 series.

The next patch for FlameCruiser will include the ability to use new words like Discombobulate and Eviscerate, and MORON synonyms like ament, cretin, feeb, half-wit, idiot, imbecile, simpleton, dullard, dullhead, dumbbell, dummkopf, dummy, and ignoramus.
 
Randell said:
Well, in D3D, which is all DS cares about (I think), yes, it was a prime example - it was the best card for purely D3D games at its launch time. Admittedly, a slow OGL ICD must have been a bit of an oversight ;)

With the first drivers, it had major stability issues (esp. at high AGP rates) on quite a few systems.
 
Aye, perhaps time provides rose-tinted specs - I always liked the G400, and the friends who had them did as well :)
 
It's a little past 5:30am and I'm at work. I check my email to find notifications of several PMs in my inbox. I check the PMs. Check the threads.

....and run into the same shit. I don't even know WHERE to start.

Nevertheless, I was going to start with this post - taking it apart at the seams, as only I know how - until I got to this bit. Then I went... screw this. No way I'm bothering with this git. So, I'm just going to post the bit and move on.

rubank said:
Derek uses the word "pharmacodynamic" without ever having taken the time to look it up in the dictionary.

This is yet another example of character assassination and of fanATIcs deliberately taking my posts out of context, twisting my words, etc. NOWHERE in this forum - or anywhere else - have I EVER, EVER used this word. What a damn lie. :rolleyes:

Look, I don't mind taking on the world. I have a box of toothpicks. But if you're going to engage me, BE FAIR. I'm only ONE man, you know. As such, do you folks have to sink to the depths seen in this thread and others in order to get ahead of me (which, in itself, isn't bloody likely)?

noko said:
Well, Fresh, from what I understand (not much), Derek seems to have used the right method for his game with the W-buffer, since he has a rather large 3D space with objects throughout it, versus a bunch of close-up objects. So accuracy for far objects, at the cost of some accuracy for a few near objects, seems to make sense given the nature of his game.
I didn't know that the W-buffer wasn't supported in all DX6-capable hardware and above; I thought it was a hardware standard that was pretty much implemented by all recent graphics chips.

As for the bugs in his games, I wouldn't know, since I can't seem to find any of his games at the store here.

Do yourself a favor and just ignore Fresh. If you've read the other thread, you'd know why. He doesn't have a CLUE what he's talking about. And I clearly described what I'm doing, how I'm doing it and why.

An example? He says that most devs use 1.0 ft as the nearZ, and asks why I should set it to 0.01m. That, right there, is the sign of someone who has no frigging clue what he's talking about - but likes to think he does.

Even though I painstakingly gave a description of why it's seeded at that value, he is the only one (surprise!) who chooses to ignore this and continues to post the usual drivel. He must think I'm developing a shooter, in which all the pickups in the world are the same size as a 6' human. For example, the TDU (Target Designator Unit) used to illuminate targets for precision munitions guidance, one of the smaller items in the game, is as tiny as 0.18m H x 0.12m W x 0.03m L.
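
(To put rough numbers on the near-plane argument - a quick sketch; the far plane and distances are made up for illustration, not figures from this thread:)

    #include <cstdio>

    // Eye-space size of one 24-bit Z-buffer step at distance z, for near
    // plane n and far plane f. Standard projective depth gives
    // z_ndc = f/(f-n) * (1 - n/z), so dz_ndc/dz = f*n / ((f-n)*z*z).
    double ZStepAt(double z, double n, double f)
    {
        return (z * z * (f - n)) / (f * n * 16777216.0); // 2^24 depth steps
    }

    int main()
    {
        const double f = 50000.0; // hypothetical 50 km far plane
        printf("nearZ = 1.00 m: %.1f m per step at 10 km\n", ZStepAt(10000.0, 1.00, f)); // ~6 m
        printf("nearZ = 0.01 m: %.1f m per step at 10 km\n", ZStepAt(10000.0, 0.01, f)); // ~596 m
        return 0;
    }

A linear W buffer, by contrast, spends its steps evenly - (f - n)/2^24, roughly 3 mm everywhere for the same far plane - which is why a sim with tiny objects up close AND a huge viewing distance leans on it.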

As for the W-buffer issue: just TWO days ago, I was in an email trail with Mike over at dev rel about this very thing - to the extent that I even sent emails to some of my industry contacts (including JC) to see if they knew wtf was going on, and whether they were in fact advised by ATI that (a) they had removed W-buffer support from the 9xxx series and (b) the W buffer was, in fact, NOT in the HW at all.

Below is my email to the head of dev rel, in its entirety. His reply basically said that the W buffer is not (!) in the HW, that they had no intentions of supporting it, and that one of the devs will come up with a sample app to do it via pixel/vertex shaders (I already tried it and, as I suspected, it does not bloody work!). That's when I told him that if they say it works, they can bloody well write the damn sample app, like nVidia would if I asked them to prove something to me. He agreed. And I'm in a holding pattern waiting for it.

Of course, note that this shader solution won't work for legacy apps (such as my previous games, which rely on a W buffer).

I mean, they took out the W buffer WITHOUT TELLING ANYONE - even though the card is supposed to be DX backward compatible, and a W buffer is accounted for in DX6 and higher. The jury is still out on DX9 but, as far as I know, it is still available in Beta 2.

Mike,

I've been doing some investigations. And then some thinking.

I know that the 9xxx series cards still have W buffer support in the hardware.

However, there is no way to turn it on because it is disabled in the display properties and the
registry settings are being bypassed. The driver is doing this, obviously.

So, I mean, what's it going to hurt to put this back in the compatibility section (since that's
where such stuff should go) of the ATI control panel, leave it off, and let the user/application
decide if they want to use it or not? I mean, according to your explanation below, I really
don't see why ATI should be concerned about speed if its use is left to the developer.

This all comes down to this nVidia vs ATI speed thing again - and it's UNFAIR for you [ATI]
folks (not you personally, I mean) to do this while breaking backward compatibility with games
(mostly sims) that make use of and/or require a W buffer. This is the kind of thing that really
irks most of us (though other developers aren't as vocal), and yet the ATI bigwigs wonder why
such a high number of games don't work on the 9xxx series (driver problems aside).

I mean, here is an excerpt of a message that an ATI engineer suggested as a workaround. He forgot
one simple (!) thing - most legacy cards do NOT have shaders!!! In fact, some cards don't even
have a pixel shader; they only have a vertex shader.

You can use the pixel shaders to write a Z value out. Given this, a vertex shader could pass
W as a parameter to the pixel shader, which could then write this out as a W value.

The basic principle is to set up the W computations in the vertex shader, then pass that
to the pixel shader by using a texture address component, which is a 32b float. In the pixel
shader, the shader will receive a 32b interpolated float, which will be the per-pixel W value.
Writing that out, it will get converted to a 24b float, which can be Z-buffered, if you make
sure that all Ws written out are positive. Or, you could convert to integer and write out
that value.


Hopefully you'll be able to talk some sense into the ATI folks who make these decisions. In my opinion,
and that of several devs I know, the decision to disable the W buffer in these cards is just plain
ludicrous and of NO benefit to anyone.

And before they go telling you otherwise: to the best of my knowledge (me being on the DX beta team),
there are NO plans to remove W buffer support from DX9, because it has backward compatibility going all
the way back to DX7.

cheers
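
For reference, here is roughly what that suggested scheme looks like written down - my own reconstruction, untested (and, per the above, it does not behave like a true W buffer); the register assignments and constant layout are mine:

    // Sketch of the suggested W-via-shaders scheme (shader asm from memory).
    // Assumes c0-c3 hold the combined world*view*proj matrix and c4.x = 1/far.
    const char* kWVertexShader =
        "vs_1_1\n"
        "dcl_position v0\n"
        "m4x4 r0, v0, c0\n"        // transform to clip space
        "mov oPos, r0\n"
        "mul oT0.x, r0.w, c4.x\n"; // pass eye-space W, scaled to [0,1], via a texcoord

    const char* kWPixelShader =
        "ps_2_0\n"
        "dcl t0.x\n"
        "mov r1.x, t0.x\n"
        "mov oDepth, r1.x\n"       // replace the pixel's depth with the interpolated W
        "mov oC0, c0\n";           // flat placeholder colour; real shading goes here

And even if this worked perfectly, it requires vs_1_1/ps_2_0-class hardware, which is exactly why it is useless for the legacy titles mentioned above.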


LittlePenny said:
The intended point was to see how people felt as to which drivers should be used in a review: the ones that ship with the card, the latest official, or the latest beta?

I see. I think it depends on the reviewer. I, for one, am of the opinion that the drivers that come on the CD are those that should be used, in the same way that I described off-the-shelf products. Bubba was quite right in using the 775 drivers.

LittlePenny said:
Just a question, I wasn't prodding. :)

heh, OK. I see you skirting the realms of prodding and joining in the fan club. :D

LittlePenny said:
Reviewing with bias on the genre of sims is what I found to be different, that's all.

Ah, OK. I agree with you then. Besides, Bubba would be hearing from me if he had gone an' run Sacrifice, Morrowind or such - being a hardcore simmer and all, that would be like sacrilege. :D

Althornin said:
Nagorak said:
Can't we just keep the personal attacks to a minimum? :eek:
Yeah, it would be nice to see that from BOTH sides here...

If you're going to jump on the "Hey, look at me!! Look at me!! I have something interesting to say!! No, really!!!" bandwagon, at least get your facts right. Who else is making personal attacks, apart from the Usual Suspects?

Reverend said:
Derek, post screenshots of all those games exhibiting flaws in ATI's drivers.

I think that there are enough to go around on the web. Go to the Rage3D forum and take your pick. I'm not in the mood.

Besides, I'm working on a research article about this fiasco for two hardware sites. By the time I'm done and that propagates across the Net and in print, we'll see who has a louder voice - me or the bunch of goons making personal attacks. I'm going to bury them so deep, they'd need a spatula to get out of it.

It's obvious that on forums like this, the signal-to-noise ratio is too low for anything meaningful to be derived from my postings. That's the plan though - they feel that by distorting the thread and taking it downhill, the message is lost. Fair enough; I'll just pick a different battlefield. The more they try to stifle me, the louder I'm going to keep yelling.

You'll read about it soon. Be patient.

geo said:
Y'know, as an 8500 owner (and a RadLE and Rage Fury owner before that), I really don't mind at all having a Derek Smart pounding on ATI for better drivers. I *would* mind if the *only* thing he was doing was pissing and moaning, but it isn't. There have been quite a few examples in his various posts where he has quoted his various communications to ATI and it is clear (to me at least) that those communications are helpful, polite, and even good humored. Helpful in not just pointing out a problem, but in also (at least the ones I saw) suggesting where the problem may be.

Derek must be an honorary Israeli tho --he seems to believe in massive retaliation. The threads I've seen go to hell start with Derek making a strong, but not unreasoned statement, then somebody makes it personal, and then there are smoking craters all over the place. . .

Well said. If you look at my postings, this thread in particular, what you describe is exactly what goes on. As soon as they want to stifle me, they start the personal attacks, character assassination, post distortion, etc. To the extent that you'd think I was running for office or something and this was an election year.

They're WASTING THEIR TIME, because I'm going to keep voicing my opinions as I see fit. The day ATI cleans up their act, they'll get a clean bill of health from me, from devs and from gamers. But I'm not holding my breath. There are STILL bugs in the 7xxx and 8xxx driver series, yet we have a 9xxx series of boards on the shelves - with their own set of problems. ONLY in an industry like this would crap like that occur.

I, for one, won't be surprised to see the first ATI class-action lawsuit. From what I know, it's coming. You'll see.

IT said:
Note: OpenGL has no W buffer, and plenty of apps get by just fine without it.

All good points. But show me one OpenGL program which has large outdoor scenes. :rolleyes:

There IS a reason I chose to stick with D3D instead of adopting OGL several years ago when I migrated from DOS.

crystalcube said:
I think it has been highlighted earlier too... but...
W-buffer support was optional, and it seems it has been dropped in DX9, so ATI decided not to support it in newer hardware.

Every feature in DX is optional.

W-buffer support has not been dropped from DX9.

crystalcube said:
As far as fallback and informing developers go, the DX API provides the option to query the availability of the W-buffer, so if a developer has indeed programmed according to the API then there is no need to inform them.

Listen to me, closely. No, not that close - back off the monitor a bit....

EVERY line of code that tries to do something such as this checks the card's CAPS to see if it is supported. OK?

Good

Now, if the program checks for a W buffer, doesn't find it, then checks for a Z buffer, finds it and uses it - that's normal. OK?

Good

If the program RELIES on the W buffer (hence the check in the first place) to alleviate artifacts, and uses it as a preference, then the first time the dev (such as me) is going to know that the W buffer no longer exists is (a) when I notice artifacts (as I did in the case of the 9xxx series) or someone reports them, or (b) if I knew about it beforehand.

NOBODY knew that the W buffer was absent on the 9xxx series until the artifacts in their games started showing up. ATI never told anyone. It doesn't appear in ANY of the docs/bulletins on their dev site.

What? Do we now have to trace through every line of our code each time a new card comes out, in order to find out which CAPS report is failing?

You're kidding, right?

crystalcube said:
So I really fail to see how ATI broke any standards or APIs in their new hardware by not supporting a feature which was optional.

:eek:
 
Derek Smart [3000AD] said:
If the program RELIES on the W buffer (hence the check in the first place) to alleviate artifacts, and uses it as a preference, then the first time the dev (such as me) is going to know that the W buffer no longer exists is (a) when I notice artifacts (as I did in the case of the 9xxx series) or someone reports them, or (b) if I knew about it beforehand.

NOBODY knew that the W buffer was absent on the 9xxx series until the artifacts in their games started showing up. ATI never told anyone. It doesn't appear in ANY of the docs/bulletins on their dev site.

What? Do we now have to trace through every line of our code each time a new card comes out, in order to find out which CAPS report is failing?

You're kidding, right?
Actually, you must be kidding, right? If you need to trace through every line of code for this, you should think about using a better concept for your engine; this is ridiculous.
 
Derek Smart [3000AD] said:
This is yet another example of character assassination and of fanATIcs deliberately taking my posts out of context, twisting my words, etc. NOWHERE in this forum - or anywhere else - have I EVER, EVER used this word. What a damn lie. :rolleyes:

I believe that was one of those "enter a name and generate a generic psycho-babble attack" things. I don't believe it was meant seriously.
 
I think that there are enough to go around on the web. Go to the Rage3D forum and take your pick. I'm not in the mood.

I've seen lots of screenshots of issues with the likes of Morrowind and a number of other titles; nobody is saying they don't exist. But I've yet to see any issues with the games Bubba has been looking at - from Rage3D, from any forum, or from yourself. You are talking about issues in the absence of fact; we are only asking you to back them up.

All good points. But show me one OpenGL program which has large outdoor scenes.

Tribes?
 