Pixel shaders... Nv specific titles on the way?

I think his comment is more like "We have limited resources for supporting multiple cards, so unless ATI helps us with developer relations resources and responds to our issues, bug reports, and help requests in a timely manner, we will not code specifically for their card's feature extensions."


It's much easier to support a card if you can send your source code to their driver developers and ask "why isn't this working?" or "can you help us out with this?" If ATI isn't helping them enough and NVidia is, this will be the result.

It's not as simple and rosy as Humus makes it out to be. You just can't code to the OGL specs or DX9 specs and expect everything to just work out of the box on every card. If only the world were like that.
 
Anyone who is so stupid they'd sell their R8500 for a GF3 just to play such an obviously poorly coded game is a customer ATi doesn't need anyway...

wow :eek: , so ATI only chooses l337 people as customers? And they only need 'smart' customers?

I thought they were just a company that sells their products to customers regardless of the intelligence level of those customers...
but that's just me...
 
Crusher said:
:LOL:

Is that ATI's new driver marketing campaign?

"We know our drivers are broken, so what? Go ahead and buy another card, see if we care. We didn't want people like you buying our products anyway."
The point is, the drivers weren't broken. The software devs were LYING.
They didn't want to admit that they had used nVidia proprietary OGL extensions...
 
Althornin said:
The point is, the drivers weren't broken. The software devs were LYING.
They didn't want to admit that they had used nVidia proprietary OGL extensions...

First of all, there's nothing wrong with using proprietary NVIDIA extensions, or proprietary ATI extensions, or proprietary Matrox extensions, etc., especially when there are no standard OpenGL extensions available to perform the same task. What other way is there?
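
For what it's worth, the kind of vendor-extension check in question looks roughly like this (a minimal sketch, not Bioware's actual code; HasExtension and SetupWaterEffect are made-up names, and a robust version would match whole extension tokens rather than substrings):

#include <GL/gl.h>
#include <cstring>

// Returns true if 'name' appears in the driver's extension string.
// Assumes a current GL context.
bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != 0 && std::strstr(ext, name) != 0;
}

void SetupWaterEffect()
{
    if (HasExtension("GL_NV_register_combiners"))
    {
        // NVIDIA path: set up register combiners for the reflective water (details omitted).
    }
    else if (HasExtension("GL_ATI_fragment_shader"))
    {
        // ATI path: the same visual effect via ATI's fragment shader extension (details omitted).
    }
    else
    {
        // No suitable extension: fall back to plain, non-shiny water.
    }
}

That is the whole problem in a nutshell: on this class of hardware there is no vendor-neutral OpenGL extension that expresses the effect, so you either write one branch per vendor or you drop the effect.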

Second, I don't see any place where a software developer at Bioware specifically claimed it was not working due to a bug in the ATI drivers... the only thing I see is someone stating that they had tried for a week to implement the effect on Radeon cards without success, and they had received no help from ATI on the issue, so they had given up on it until everyone with a Radeon card started bitching that they were getting the shaft on their shiny water.

Third, my comment wasn't really concerned with the specific issue of Neverwinter Nights, but rather the humorous idea that ATI wouldn't want customers simply because they expect their products to work (regardless of whose fault it is that it's not working).

Note: I have no stock in this argument, since I do not own Neverwinter Nights, nor do I own a video card that would be capable of rendering the shiny water in question.
 
Well, I borrowed it from my friend to take screenshots of it when I first got my video card, and I spent days on their forums to find that it is an nVidia-specific feature and it's an OpenGL game. You do the math... to help you out, look in the forums.
 
jjayb said:
Just ran the demo for the game on my 9700. Looks like the first screenshot with the ice. Very beautiful game indeed. May have to pick this one up.

Thanks, that's what I was waiting to hear. When I last looked, there was no demo available. Thanks for the heads up.

Off to buy the game.
 
Crusher said:
Althornin said:
The point is, the drivers weren't broken. The software devs were LYING.
They didn't want to admit that they had used nVidia proprietary OGL extensions...

First of all, there's nothing wrong with using proprietary NVIDIA extensions, or proprietary ATI extensions, or proprietary Matrox extensions, etc., especially when there are no standard OpenGL extensions available to perform the same task. What other way is there?

We are back to the proprietary versus vendor specific argument again. Which ATI extensions are proprietary? Maybe some of their R300 extensions are? In any case, I'll let you have that discussion with Humus.

Second, I don't see any place where a software developer at Bioware specifically claimed it was not working due to a bug in the ATI drivers... the only thing I see is someone stating that they had tried for a week to implement the effect on Radeon cards without success, and they had received no help from ATI on the issue, so they had given up on it until everyone with a Radeon card started bitching that they were getting the shaft on their shiny water.

Well, you are both correct and incorrect, speaking as someone who followed the issue to its current point:

1) "Bio Mods" at the site are NOT Bioware employees. It was a Bio Mod who made the OpenGL 1.2 comment and I believe some other similar rather silly statements that many users took as the truth.

2) One Bioware employee did respond in a way directly supporting this statement. This statement was later corrected and possibly removed.

3) What they ended up stating was that they had it working, and upon release it suddenly stopped working. They further clarified that they were working with ATI to implement it and gave a specific quote that they (Bioware) had spent something like a week trying with no success to get it working...(I think you are paraphrasing that comment).

4) Given other incidents with their patching and game update progress over the same (long) interval...umm...how to put this...it seems likely that the problem may not have been "simply" ATI's fault. We'd have to see the 1.27 code for shiny water to really tell whether it was.

My own guess is that their ATI codepath was(?) fubared (for example, the game crashes your system if you try to save the game by specifying a save game name if you have truform turned on...this bug used to happen sometimes for any graphics card, but was patched away for everything but ATI cards using truform to my knowledge).

Let's just say that this game is not likely to be the cleanest example of coding out there. Sounds really harsh, I know, but it is very ambitious and feature-rich, and I think it got rushed out early by the publisher; playing catch-up with bug hunting over-burdened the reduced staff dedicated to patch coding (again, this is supported by other factors about the patches in the same time interval...).

Third, my comment wasn't really concerned with the specific issue of Neverwinter Nights, but rather the humorous idea that ATI wouldn't want customers simply because they expect their products to work (regardless of whose fault it is that it's not working).

Yeah, I'm pretty sure ATI doesn't want just the technically savvy as customers. I find the "they are stupid so they deserved it" idea to be popular among people who are knowledgeable about whatever that person was stupid about. Sort of like how a physically strong person might be prone to the idea that if they beat up a physically weak person, that person deserved it for being weak. Human nature, I guess.

Note: I have no stock in this argument, since I do not own Neverwinter Nights, nor do I own a video card that would be capable of rendering the shiny water in question.

Well, I do own both. ;) I've been especially frustrated by how some toolset crashes are categorized as an ATI driver bug when 1) the toolset crashes in similar fashion on other types of cards and 2) the toolset didn't crash for me under the circumstances where this bug was supposed to manifest. Yes, yes, it doesn't prove it wasn't an ATI bug, but collectively it is frustrating to see a developer manage to do all of these things at once in a game.
 
jvd said:
Well, I borrowed it from my friend to take screenshots of it when I first got my video card, and I spent days on their forums to find that it is an nVidia-specific feature and it's an OpenGL game. You do the math... to help you out, look in the forums.

I did look on the forums, where they say it's supported on Radeon cards now. I also see where they said the reason it was an NVIDIA specific feature before was because they spent a week trying to get it to work on ATI cards and couldn't. I don't doubt that they used NVIDIA proprietary extensions to implement it when they were developing the game, I just don't see how you can blame them for doing that when they also tried (unsuccessfully) to implement it on ATI cards as well. Should game companies be forced to implement every feature on every card, regardless of how much time and effort they have to spend in doing so? Should I demand that they find a way to make their shiny water work on my GeForce 2, even if it takes them 4 months of development time to do so?

It seems like sometimes the people on this forum forget that games are made up of a lot more than a shiny water effect, and that it takes time to implement other things as well. Compromises must be made in development, and I don't agree with labeling products as promoting one card over another, or being tailored for one card over another, if you're talking about one special effect which makes up 0.000001% of the game code, especially if they put a reasonable amount of time into attempting to implement it on both cards.
 
Crusher said:
Humus said:
He's saying, "we don't want to code for anyone else because we're lazy and only want to take the path that's easy for us".

This coming from the guy who makes demos that only work on one video card, and only with a specific driver version for that card... ;)

First of all, drop this BS about me only supporting one card, bla bla. If you look closer you'll see that I tend to always use ARB extensions as long as I can. Take a look at my site; I have the required extensions listed for all demos. I had to scroll back 8 demos before I found the first that required an ATI extension. That's the dither demo. Oh, let's see, it requires either GL_ATI_fragment_shader or GL_NV_register_combiners. Yes, I put special effort into supporting nVidia there. So I go to the next page... there I find the laplace demo. This demo will not run on any NV hardware no matter what I do. I can continue, and the first demo I find that's really ATi-only is the Sea demo, which could possibly have been implemented with less than PS1.4.
Even if some demos do have a little better ATi support than other vendors, it's quite a difference compared to a developer who has access to all cards and is working on a commercial game.
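Roughly, the preference order I'm talking about boils down to something like this (just an illustrative sketch, not my actual framework code; the enum and the path names are made up here):

#include <GL/gl.h>
#include <cstring>

enum FragmentPath { PATH_ARB, PATH_ATI, PATH_NV, PATH_NONE };

// Pick the most portable fragment path the driver exposes.
FragmentPath ChooseFragmentPath()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext) return PATH_NONE;

    if (std::strstr(ext, "GL_ARB_fragment_program"))  return PATH_ARB;  // vendor-neutral, preferred
    if (std::strstr(ext, "GL_ATI_fragment_shader"))   return PATH_ATI;  // e.g. the dither demo's path
    if (std::strstr(ext, "GL_NV_register_combiners")) return PATH_NV;   // the equivalent NV path
    return PATH_NONE;  // effect disabled on this hardware
}

The ARB path comes first whenever it can express the effect; vendor paths are only there for hardware that can't do it any other way.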
 
Crusher - well, it seems that they don't like to keep posts on the boards; check for posts by jvd and you won't find any, yet I have 72 posts in my info? Funny, that.
 
I will also add that in my experience, ATi's developer support is way beyond nVidia's. I have yet to have a positive experience dealing with nVidia. They just don't care about "insignificant" developers like me. I'm certain they are very responsive to guys like Carmack and Sweeney, but for hobbyist guys like me they don't even bother to reply to emails. I've heard similar experiences from other people. You report a bug and it takes months before it gets fixed.

I had a real letdown the other day. There's this bug that has plagued nVidia's drivers for months: WGL_ARB_render_texture performing nowhere near its potential. It should never be slower than doing a simple glCopyTexSubImage2D(), but it was way slower. I had contributed to a topic on OpenGL.org about it, and some nVidia guy stated that they were working on it. Later I got an email from him; he had tried to run my "Shadows that don't suck" demo to see if their newer driver would work better with it, but he couldn't make it compile (I had done some changes to the framework). So I sent him the updated version and asked him to tell me if it turned out to fix the problem. Well, he didn't; I never heard any more from him. One would think that in a case like that, when I was doing him a favor at his request, he would at least reply... :rolleyes:
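For reference, the copy-based baseline I'm comparing against is roughly this (a minimal sketch with my own naming, not the demo's actual code; texture allocation, pbuffer/context setup, and the scene rendering itself are omitted):

#include <GL/gl.h>

// Render the scene into the framebuffer, then copy the result into an
// already-allocated texture. WGL_ARB_render_texture exists precisely to
// render straight into the texture and skip this copy, so it should never
// end up slower than this.
void UpdateRenderTexture(GLuint tex, int width, int height)
{
    // ... draw the scene that should end up in the texture ...

    glBindTexture(GL_TEXTURE_2D, tex);
    // Copy the lower-left width x height block of the framebuffer into
    // mip level 0 of the texture, at offset (0,0).
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, width, height);
}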
ATi on the other hand... I've had 6 emails from them in the last two days. They send me beta drivers fixing my bugs without me even requesting it. They provide instant feedback on pretty much everything.
 
Randell said:
it's happened already in a limited way

Neverwinter Nights only used nVidia extensions for PS effects for the GF3/4, and the AA slider only works on GF3/4's as well. Due to the outcry, and maybe due to bugged ATI drivers during coding revisions, this is now being addressed in a patch.


That's totally wrong. There was no "bug" whatever in the ATI drivers. The problem was (as I understand it) that NWN is a game written and optimized for nVidia's chips and which makes use of nVidia's OpenGL extensions (which optimization seems borne out by my experience with it). I have the game and it's kind of funny to work the internal FSAA sliders in the game and see "Quincunx" come up as a possibility--when I'm running it on my 9700 Pro. They use an nVidia extension for "shiny water" and of course ATI does not use the same extension for that effect--but has its own. What they are doing in the patch is fixing it so that it runs optimally on cards other than GF3/4's, and that's about it.

However, all other effects that I can see aside from shiny water seem to work fine--at least according to comparisons with screenshots I looked at. The FSAA thing is a dead giveaway. I'm not so mad as all that with Black Isle for doing this, because for one thing I owned a GF4 Ti4600 before replacing it with a 9700 Pro (and put the GF4 in my wife's machine), and I can understand that at the time game development was going on, the GF4 cards were the optimal 3D cards (I think) to own. I'm just glad they are fixing the game now that the 9700 has (in my opinion) completely usurped the title from the 4600. Using nVidia extensions in this case is, I think, forgivable. Now, if they didn't intend to fix it... why, that would be another matter (I miss nothing from the fact that water doesn't shine in the game, but it will be nice to see it if I haven't finished it by then, all the same).

Oh, yeah--the FSAA slider works just fine with the 9700 Pro--of course I don't get "Quincunx"--but then again that was bad enough as it was--and with the Ti4600 my preference for FSAA was OFF. I haven't enjoyed FSAA as much since the V5 as I enjoy it with the 9700 Pro these days.
 
Just have to be pedantic here. Black Isle (a division of Interplay) and BioWare are 2 different 'companies'. Black Isle had nothing to do with Neverwinter Nights. NWN was made 100% by Bioware.
 
In the case of NV specific features / titles I think that it's gradually going to be a case of "get used to it". AFAIK NVIDIA hired someone from Sony Computer Entertainment to head up their dev rel and he's running it the same as he would for the PlayStation: score exclusives (where possible). Why do you think there have been so many tie-ins with Square and Sony recently?

I'd heard this had caused some friction within some quarters of NV's dev rel and given Huddy's comments since he's been working with ATI I wouldn't be surprised if this was a major factor in his leaving.

So, I think this is just going to happen more and more - ATI are doing it as well and they've started to directly target and court developers.
 
NWN was 100% Bioware's fault you mean :)

It is a good thing we have m$ to at least slightly keep this market together.
 
DaveBaumann said:
So, I think this is just going to happen more and more - ATI are doing it as well and they've started to directly target and court developers.

What!? Are you suggesting that a lot more game developers will favor either ATI or nVidia hardware in the future? Or are you saying that more are going to optimize for several different vendors' hardware while at the same time supporting specific features (like TruForm)?
 
Let's not forget that most ISV's would just as soon only have to develop for one video card. When it looked like nVidia was going to be dominant forever, I think they increasingly resented having to accommodate other video cards at all. The same thing happened (on a smaller scale) when 3dfx was dominant. I think the ISV's felt that it was in their long-term interest (lower development and support costs) to (subtly) encourage their customers to adopt only one video card architecture.

That's one of the reasons it was so important for ATI to have a real hit in this generation; if they didn't, they would be increasingly discounted by ISV's, and the bad game support would make their cards increasingly difficult to market. With hard-core, vocal gamers buying 9xxx cards in large numbers, ATI has ensured continued attention from ISV's.
 
Humus said:
You report a bug and it takes months before it gets fixed.

>Sent: Thursday, October 10, 2002 1:54 AM
> To: CGSupport
> Subject: Next Release of Cg ?
>
> Hi,
> When can we expect the next release of Cg ?
>


>Received: Wednesday November 20th
>Hi Rob,
>By the end of the year.
>Regards,


A month and 10 days - thank god I didn't ask a complex question :)
 
DaveBaumann said:
In the case of NV specific features / titles I think that it's gradually going to be a case of "get used to it". AFAIK NVIDIA hired someone from Sony Computer Entertainment to head up their dev rel and he's running it the same as he would for the PlayStation: score exclusives (where possible). Why do you think there have been so many tie-ins with Square and Sony recently?

I disagree. I think it generally backfired for Bioware. There is a GREAT deal of dissatisfaction among ATI owners (the non-sheep variety, who realized how silly it was to "have" to buy a new card to get support for features their card already had) as a result of this, and unless nVidia actually paid them gobs of money they got a very poor return on their decision to use the extension. Even if nVidia actively encouraged them not to implement the functionality for ATI hardware, I don't think they have the tools (another proprietary shader extension) to manage it going forward.

Of course, I view Cg as possibly such a tool (i.e., nVidia would like it to be), but I think DX 9 HLSL and OpenGL 2.0 will work to prevent its use as such, as well as prevent other "proprietary" functionality in future.

I think the "exclusive" concept as from Sony is more like the UT2k3 "logo", and I don't think that type of thing has much of a life span (but I'm "allergic" to hype of that degree, maybe most consumers have a higher tolerance).

I'd heard this had caused some friction within some quarters of NV's dev rel and given Huddy's comments since he's been working with ATI I wouldn't be surprised if this was a major factor in his leaving.

That is very interesting. Integrity? Who'd have thunk it?

So, I think this is just going to happen more and more - ATI are doing it as well and they've started to directly target and court developers.

If Huddy disapproved of it, why would he start exactly the same thing at ATI? I certainly think he would focus on countering it, but that does not necessitate doing exactly the same thing. Note, my interpretation of what you mean by "it" is encouraging developers to implement a feature exclusive to the IHV's hardware with the exclusion of other IHVs' hardware with the same functionality.
 