ATI Demos on nvidia cards?

Well, can you play around with ATI demos (DX9) on NV3x cards?

I was also thinking of downloading Codecreatures but can't find any link that works. Anyone know?
 
Hmm... I think the answer to the first question you asked, as I understand it, is: "you should be able to, but can't at the moment due to the lack of required DX9 feature-set exposure in nVidia drivers". Perhaps recent driver sets fix this problem, or the rumored "50.xx" drivers that are promised to be "re-written" for the FX cards will.
 
Yes, ATI has nothing in their demos from the 8500 ones onwards that stops other cards from running them. Provided a card has the needed feature set, it should run. Apparently the 8500 island demos work on FX cards.
 
Yeah right, ATI is really fair. It doesn't literally say that my card is not ATi. It says that my card doesn't support Pixel Shader 1.4, Truform, Smart Shader or whatever. Yeah right :rolleyes:
 
All of the ATI demos start on the FX. (At least the ones I tried.)

When a feature is missing you see missing objects, though.

You can usually still see something like the wheels and seats from the car paint demo.

Not that missing things are unusual with nVidia; many of their drivers can't render their own Wolfman demo correctly...
 
embargiel said:
Yeah right, ATI is really fair. It doesn't literally say that my card is not ATi. It says that my card doesn't support Pixel Shader 1.4, Truform, Smart Shader or whatever. Yeah right :rolleyes:

rofl?
 
embargiel said:
Yeah right, ATI is really fair. It doesn't literally say that my card is not ATi. It says that my card doesn't support Pixel Shader 1.4, Truform, Smart Shader or whatever. Yeah right :rolleyes:
Well, I'm pretty sure the only cards to use hardware TruForm are based on the R200 core. Also, the nVidia drivers you are using may not allow for PS 1.4; you may be limited to 1.3 on your FX card, or if you have a GeForce 3 or 4 you never had PS 1.4 to start with.
 
R200 demos work on the FX just fine.

R3x0 demos work with geometry artifacts. It looks like the FX doesn't support something like displacement mapping, or I don't know...
 
DegustatoR said:
R200 demos work on the FX just fine.

R3x0 demos work with geometry artifacts. It looks like the FX doesn't support something like displacement mapping, or I don't know...

I believe the problem that screws up most R3xx demos is that NV3x doesn't (yet) support multiple render targets. (Which are--someone correct me if I'm wrong--an optional feature of DX9? Or are they required?) Dunno if support is missing in hardware or just not in the drivers yet.
 
I thought it was required??? Maybe my mind is playing tricks on me. Just shows where their driver focus is...
 
vrecan said:
I thought it was required??? Maybe my mind is playing tricks on me. Just shows where their driver focus is...

Actually, I think it is required, but I'm not sure whether the spec is specific in terms of what formats must be supported.

As for your dig at Nvidia's driver priorities, I think most consumers would rather they concentrate on (legitimately) improving performance on current games before enabling DX9 features that won't be supported in games for some time. Of course concentrating on cheating is another thing entirely, but a case can be made that MRT wasn't the most important focus right after NV3x release.

(Of course not having MRT working by now... Eh, ok, trying to defend Nvidia's driver decisions is pointless. ;) )
 
Dave H said:
(Of course not having MRT working by now... Eh, ok, trying to defend Nvidia's driver decisions is pointless. ;) )
Especially when you consider that ATI supported MRTs from the very first DX9 driver release. And developers are using MRTs.
 
OpenGL guy said:
Dave H said:
(Of course not having MRT working by now... Eh, ok, trying to defend Nvidia's driver decisions is pointless. ;) )
Especially when you consider that ATI supported MRTs from the very first DX9 driver release. And developers are using MRTs.

Other than Mr Carmack and Mr Sweeney?
 
K.I.L.E.R said:
OpenGL guy said:
And developers are using MRTs.
Other than Mr Carmack and Mr Sweeney?
I won't speak for either Carmack or Sweeney, but I know for a fact that some developers are using the features as I've seen demos that use them.
 
The concept of MRT is to save calculation work, right?

I just pray that when games come out using MRT, the devs follow this concept and don't end up making the game slower with it enabled, as some features in the past that were meant to save work ended up doing more work.
 
An MRT algorithm can be ported to a card without MRT support, at the cost of multiple passes. (One pass per render target.)

It's a problem for the FX (efficiency), but not the biggest problem. The biggest problem is that it has no high-precision render target support.

Of the R9700 demos, only hatching requires MRT, but you'll see visual errors in all of the others because of missing texture/RT formats.
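
To make that concrete, here's a minimal C++/D3D9 sketch of the fallback (the function and variable names are hypothetical placeholders, not anything from the ATI demos or nVidia drivers): with MRT you bind every surface and draw the scene once; without it you repeat the geometry pass once per render target.

Code:
#include <d3d9.h>

// Callback that draws the scene; outputIndex selects which value the shader
// writes when only a single render target is available (-1 = write them all).
typedef void (*DrawSceneFn)(IDirect3DDevice9* dev, int outputIndex);

void RenderOutputs(IDirect3DDevice9* dev, IDirect3DSurface9* const targets[],
                   int count, bool hasMRT, DrawSceneFn drawScene)
{
    if (hasMRT) {
        for (int i = 0; i < count; ++i)
            dev->SetRenderTarget(i, targets[i]);   // bind all targets at once
        drawScene(dev, -1);                        // one pass fills every target
    } else {
        for (int i = 0; i < count; ++i) {
            dev->SetRenderTarget(0, targets[i]);   // only slot 0 is usable
            drawScene(dev, i);                     // full extra pass per target
        }
    }
}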
 
K.I.L.E.R said:
The concept of MRT is to save calculation work, right?

I just pray that when games come out using MRT, the devs follow this concept and don't end up making the game slower with it enabled, as some features in the past that were meant to save work ended up doing more work.
The concept of MRT is not really new, but the implementation is a lot more general than in the past. For example, OpenGL supports rendering to the front and back buffers at the same time. What this means is that a single Z test is done (if enabled) per pixel, but you can get two colors written. With MRTs, you can render to any surfaces you choose (up to 4 at once), but there's still only a single Z test per pixel. The great thing about MRTs is that you can output different values to each surface; that wasn't possible with the OpenGL implementation.

Imagine binding two floating point render targets at once. You could output x,y,z,w to one surface and diffuse, specular, u and v to the other and you've just created a vertex.
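
In plain D3D9 terms that would look something like the sketch below (names and sizes are made up for illustration, and error handling is skipped): two 128-bit float surfaces are bound as simultaneous render targets, so a single pixel shader pass can write x,y,z,w to COLOR0 and diffuse/specular/u/v to COLOR1 (oC0/oC1 in ps_2_0 assembly), with just one Z test per pixel.

Code:
#include <d3d9.h>

void BindVertexCreationTargets(IDirect3DDevice9* dev, UINT width, UINT height)
{
    IDirect3DSurface9* posRT  = 0;  // will receive x, y, z, w
    IDirect3DSurface9* attrRT = 0;  // will receive diffuse, specular, u, v

    dev->CreateRenderTarget(width, height, D3DFMT_A32B32G32R32F,
                            D3DMULTISAMPLE_NONE, 0, FALSE, &posRT, NULL);
    dev->CreateRenderTarget(width, height, D3DFMT_A32B32G32R32F,
                            D3DMULTISAMPLE_NONE, 0, FALSE, &attrRT, NULL);

    // Up to four render targets can be bound at once in D3D9.
    dev->SetRenderTarget(0, posRT);
    dev->SetRenderTarget(1, attrRT);
}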
 
I thought NVidia was not going to support MRT but MET (multi-element textures, basically the same but with a few restrictions and different usage) instead.
 