Pixel shaders... Nv specific titles on the way?

Fuz

Having a look at the launch titles for the NV30 at the Nvidia site got me a little worried.
Have a look at this picture.
large03.jpg

Notice the ice, and the use of pixel shaders... looks fantastic.

Now check this out: the same game running on different hardware.
10.jpg

Looks bland in comparison. No use of pixel shaders, it seems.

This got me thinking. Will people miss out on eye candy in this title for not having a GeForce FX? It's like back in the days when games were "3dfx enhanced". Of course, in the Voodoo 1 days there was a good reason for that, but today the features of competing hardware are so similar you'd expect games to look the same on different hardware of the same generation.

I know other hardware on the market today can render this title with the same detail as the GeForce FX, but I would hate to see game developers enabling features on just one company's hardware, for whatever reason, even though they can be rendered on other hardware.

Anyone have this game running with pixel shader effects like the first pic I posted? If so, can you please post some screenshots, because all the reviews/previews I have read on this game show screenshots that look like the second pic I posted.

I guess I am concerned that we will start seeing games coming out with features that can only be enabled on a certain brand of hardware, even though it could be rendered on others..... I sure hope things don't pan out that way.

Are we going back to the old ways... brand specific game titles on the way?
 
It's happened already, in a limited way.

Neverwinter Nights only used nVidia extensions for pixel shader effects on the GF3/4, and the AA slider only works on GF3/4s as well. Due to outcry, and maybe due to bugged ATI drivers during coding revisions, this is now being addressed in a patch.

Giants had a GF3-enhanced version, and pre-patch some bump-mapping effects only worked on nVidia cards even if other hardware (e.g. the Kyro) supported them, IIRC.

I don't see it happening too much with DX though - people will code to the DX standard. With OpenGL it depends on what can be done via open extensions versus proprietary ones, I suppose.
 
ARB extensions can be used, and vendor-specific extensions aren't really vendor-specific; they have just been designed by a vendor, and other vendors can still use them.

RadeOn 8500 Catalyst 2.4:

GL_HP_occlusion_test
GL_KTX_buffer_region
GL_NV_texgen_reflection
GL_NV_blend_square
GL_SGI_texture_edge_clamp
GL_SGIS_texture_border_clamp
GL_SGIS_texture_lod
GL_SGIS_generate_mipmap
GL_SGIS_multitexture
GL_SUN_multi_draw_arrays

They aren't ATI extensions, but they are supported.

However, I recognise that the ARB members should work together more often, to create as few extensions as possible, and preferably EXT or ARB extensions.
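As a rough illustration (a minimal sketch only, not code from any of the games mentioned here, and it assumes a current GL context): an application can test for any of those extensions by name at run time from the GL_EXTENSIONS string, regardless of which vendor designed them.

#include <GL/gl.h>
#include <string.h>

/* Returns 1 if `name` appears as a complete token in the GL_EXTENSIONS
   string of the current context, 0 otherwise. */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    const char *p = all;
    size_t len = strlen(name);

    while (p && (p = strstr(p, name)) != NULL) {
        /* Avoid matching a prefix of a longer extension name. */
        if ((p == all || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}

/* On the Radeon 8500 above, has_extension("GL_NV_blend_square")
   would return 1 even though the card isn't an nVidia card. */

So whether an app actually uses GL_NV_blend_square on a Radeon comes down to the developer checking for the string, not to who wrote the spec.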
 
So was it ATI's decision not to support the nVidia extension that Bioware used in NWN, as opposed to a cost or other reason?
 
I was under the impression that, unlike other manufacturers' extensions, which are open and anyone can use, most of nVidia's are proprietary, which means that you have to pay to use them.

The pixel shader extension used in NWN is an example, although apparently they're supporting an ATI extension in a patch.
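Just to illustrate the amount of work involved, here's a hypothetical sketch of how a patch could pick a per-vendor fragment path instead of hard-coding one vendor. The extension names are real, but the enum, the function, and the path selection are made up for this example (and it assumes a has_extension() helper like the one sketched a few posts up); it is not taken from NWN's actual code.

/* Hypothetical "shiny water" path selection: prefer whichever
   fragment-processing extension the driver exposes, otherwise fall
   back to the plain water renderer. */
int has_extension(const char *name);  /* see the earlier sketch */

enum water_path { WATER_PLAIN, WATER_NV_COMBINERS, WATER_ATI_FRAGSHADER };

static enum water_path pick_water_path(void)
{
    if (has_extension("GL_ATI_fragment_shader"))
        return WATER_ATI_FRAGSHADER;   /* Radeon 8500 class hardware */
    if (has_extension("GL_NV_register_combiners"))
        return WATER_NV_COMBINERS;     /* GF3/GF4 class hardware */
    return WATER_PLAIN;                /* everything else */
}

Supporting the second vendor is essentially one more branch plus the shader setup behind it, which is exactly the extra work some developers apparently don't want to schedule.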
 
Odd, Tribunal is in there too, and that's only got the same PS 1.1 water as Morrowind, hasn't it?

Maybe they just picked random games that use pixel shaders, although that doesn't explain NOLF2 :-?
 
Bambers said:
Odd, Tribunal is in there too, and that's only got the same PS 1.1 water as Morrowind, hasn't it?

Maybe they just picked random games that use pixel shaders, although that doesn't explain NOLF2 :-?

Kinda like the way that Diablo 2 used Hardware T&L. :LOL:
 
Bambers said:
I was under the impression that, unlike other manufacturers' extensions, which are open and anyone can use, most of nVidia's are proprietary, which means that you have to pay to use them.

The pixel shader extension used in NWN is an example, although apparently they're supporting an ATI extension in a patch.

That's true for most of the GL_NV extensions. The NV extensions listed above are some of the few that don't contain this in the spec:

"IP Status

NVIDIA Proprietary."
 
I've been in the NwN forums a while. Between the shiny water discussion (which included comments early on by Bioware forum moderators saying that ATI's drivers weren't "OpenGL 1.2 compliant" and hence no shiny water) and stability issues (the game has had, and continues to have, many stability problems, but one particular problem out of the many, which for some unknown reason I didn't experience myself, was attributed to an ATI driver bug), many people who had enthusiastically waited for that game have sold Radeon cards (including 8500s) and bought GeForce cards... I'm sure nVidia is happy about that result. :( How many of those people who bought GeForce cards to get improved visuals ended up buying GeForce 4 MX cards :( ... well, that's a discussion we can have elsewhere, but you can do a search at the Neverwinter Bioware site and see for yourself in the meantime.

Bleh, in any case, more directly on topic: I'm not too worried about this type of thing continuing. I do still have concerns about Cg, but I currently don't think Cg matters going forward (at least outside of being an nv-optimized DX HLSL), and I don't think there will be an increase in "nv specific" situations like Neverwinter Nights. Hmm... well, perhaps nv-specific performance enhancements like in some benchmarks (was it Vulpine?) might be around a bit longer.
 
Trawler said:
Bambers said:
Odd, Tribunal is in there too, and that's only got the same PS 1.1 water as Morrowind, hasn't it?

Maybe they just picked random games that use pixel shaders, although that doesn't explain NOLF2 :-?

Kinda like the way that Diablo 2 used Hardware T&L. :LOL:

I was under the impression that NOLF 2 uses shaders on all hardware to enhance some visual effects and performance. I just think the "simplified" nature of the configuration settings hides this from the user... that, and the developers are very quiet about their work. I guess we'd have to compare visuals in detail to find out for sure.
 
Yeah, Vulpine GLMark only supported the NV vertex array extension and not one for any other cards, giving GeForces a huge boost in the first outdoor part.
 
I guess I am concerned that we will start seeing games coming out with features that can only be enabled on a certain brand of hardware, even though it could be rendered on others..... I sure hope things don't pan out that way.

One title seems to be IL-2 Sturmovik: Forgotten Battles. This is what the developer Oleg Maddox has said on the subject.

(Q): Are you using the proprietary NVidia OGL extensions to create the water effects?
Or did you include a program path that can display these effects on ATI gfx cards, too (Radeon8500 and higher) ?
(A): If you'll read my post at dev. update with more attention you'll get the answer:
"It works on GF3 and GF4. Probably will works on some other cards finally. But old 3D engine is included in FB also, so old render also works, but water there is also better. And not only water…"
Now I will just add one notice. With other cards if not in release then in add-on.
Problem that manufacturers went in different standards for the same hardware effect and that is a problem for developers. Developers always will select the way where it is more useful and more easy to implement with their current code. Also developers always will select the way that has not contradiction in other parts of their code.
10/19/02 03:50PM
(A): I'm sorry to say that NVidia cards are really more developer friendly in our case.
Someone here think that support of DX is panacea .... that is wrong opinion.
We are working at first with these that are friendly to our code. Then with special new code with others.
Priorited in development that is limited by time, etc is a must.
The water isn't the main feature of FB. This is also the main fact that should be considered.
10/21/02 10:
 
glappkaeft said:
One title seems to be IL-2 Sturmovik: Forgotten Battles. This is what the developer Oleg Maddox has said on the subject.

(Q): Are you using the proprietary NVidia OGL extensions to create the water effects?
Or did you include a program path that can display these effects on ATI gfx cards, too (Radeon8500 and higher) ?
(A): If you'll read my post at dev. update with more attention you'll get the answer:
"It works on GF3 and GF4. Probably will works on some other cards finally. But old 3D engine is included in FB also, so old render also works, but water there is also better. And not only water…"
Now I will just add one notice. With other cards if not in release then in add-on.
Problem that manufacturers went in different standards for the same hardware effect and that is a problem for developers. Developers always will select the way where it is more useful and more easy to implement with their current code. Also developers always will select the way that has not contradiction in other parts of their code.
10/19/02 03:50PM
(A): I'm sorry to say that NVidia cards are really more developer friendly in our case.
Someone here think that support of DX is panacea .... that is wrong opinion.
We are working at first with these that are friendly to our code. Then with special new code with others.
Priorited in development that is limited by time, etc is a must.
The water isn't the main feature of FB. This is also the main fact that should be considered.
10/21/02 10:

Can someone translate this to English? What is he trying to say?
 
He's saying, "we don't want to code for anyone else because we're lazy and only want to take the path that's easy for us".
 
Just ran the demo for the game on my 9700. Looks like the first screenshot with the ice. Very beautiful game indeed. May have to pick this one up.
 
Humus said:
He's saying, "we don't want to code for anyone else because we're lazy and only want to take the path that's easy for us".

This coming from the guy who makes demos that only work on one video card, and only with a specific driver version for that card... ;)
 
demalion said:
I've been in the NwN forums a while. Between the shiny water discussion (which included comments early on by Bioware forum moderators saying that ATI's drivers weren't "OpenGL 1.2 compliant" and hence no shiny water) and stability issues (the game has had, and continues to have, many stability problems, but one particular problem out of the many, which for some unknown reason I didn't experience myself, was attributed to an ATI driver bug), many people who had enthusiastically waited for that game have sold Radeon cards (including 8500s) and bought GeForce cards... I'm sure nVidia is happy about that result. :( How many of those people who bought GeForce cards to get improved visuals ended up buying GeForce 4 MX cards :( ... well, that's a discussion we can have elsewhere, but you can do a search at the Neverwinter Bioware site and see for yourself in the meantime.

Anyone who is so stupid they'd sell their R8500 for a GF3 just to play such an obviously poorly coded game is a customer ATi doesn't need anyway...
 
gkar1 said:
<mean mode>Make a demo yourself and stop bitching. </mean mode>

I wasn't bitching, I was just amused at the irony of his comment (notice the wink smiley).

Nagorak said:
Anyone who is so stupid they'd sell their R8500 for a GF3 just to play such an obviously poorly coded game is a customer ATi doesn't need anyway...

:LOL:

Is that ATI's new driver marketing campaign?

"We know our drivers are broken, so what? Go ahead and buy another card, see if we care. We didn't want people like you buying our products anyway."
 