Catalyst 3.0 available from ATi + New DirectX 9 demos

I got the first four when RC0 was released. Of these, "Rendering with Natural Light" was the most impressive visually, and the Animusic demo was the most fun to watch. The car and bear demos are fairly static--and I thought the car looked quite good with just the DX8-precision normal maps.
 
BTW, you must download the demos again (Ver. 1.0), as the previous versions won't run under this new DX9 release...
 
Just downloaded the API and drivers.

Also downloaded some demos.

They actually fixed a hell of a lot in these drivers, so I can't wait to find some annoying bugs in them with the games I have :)

edit: typo
 
my favourite DX9 screensaver...

[attached image: bacteria.JPG]
 
Joe DeFuria said:
Oddly enough, nvidia now has a "Cg" (DX9?) game demo on their site:

http://www.nvidia.com/view.asp?IO=game_gunmetal

I've heard of one person who tried this on a Radeon 9700 so far, and claims it won't run (surprise)... could be an isolated incident... anyone else care to try?


Actually, it's pretty good on my GF3. I was watching the rolling demo thinking, "Hey, this is just like something from Rage (the UK dev studio)," and then I hopped over to www.yetistudios.com--sure enough, it's ex-Rage developers.
 
Joe DeFuria said:
Oddly enough, nvidia now has a "Cg" (DX9?) game demo on their site:

http://www.nvidia.com/view.asp?IO=game_gunmetal

I've heard of one person who tried this on a Radeon 9700 so far, and claims it won't run (surprise)... could be an isolated incident... anyone else care to try?

It's NOT DX9 - quoted from the readme:

DirectX

Gun Metal requires DirectX version 8.1 or later. When the Gun Metal installation completes, you will be given the option to install DirectX 8.1. If you decline to install DirectX 8.1 at this point, but find later that you need it, please reinstall Gun Metal.


Does not run on my R9700 Pro - here is the log (times are in GMT :)):

Code:
Time: 21/12/2002  00:14:53
No supported 3D card found
Time: 21/12/2002  00:15:27
No supported 3D card found
 
Gun Metal, the Cg version, was specifically listed as utilizing DX 9 features such as floating point precision (I think they said "128-bit float") when I looked at the nVidia page for it earlier. That doesn't contradict it being able to run on DX 8.1 or higher...

The thing is it requires Cg to utilize this functionality, it seems. :-?

Maybe the DX 9 HLSL isn't implemented or fully functional yet. *shrug* It doesn't seem to make much sense to require Cg for that functionality with the nv30 not even released, but it does lend itself to the perception of ATI being "incompatible" by the uninformed... I guess it's the next best thing to actually having a DX 9 capable part.

Who knows, Yeti themselves should easily be able to release a DX 9 "port" of the functionality in short order if they decide to.
 
demalion said:
The thing is it requires Cg to utilize this functionality, it seems. :-?

but it does lend itself to the perception of ATI being "incompatible" by the uninformed

Cg doesn't have any shader functionality AT ALL beyond HLSL; in fact it has rather less.

There's of course no technology in that demo that a Radeon 9700 couldn't run. Even if it was a strict DX9 game, the 9700 could still handle all those features, albeit the fp rendertarget wouldn't be the same depth--but that's a quality argument, not a feature one.

So I'm pretty pissed that NVIDIA and that company stooped to the level of ignoring other vendors. The 9700 is a complete superset of the GF3 features, so there's no reason in the world it should reject that card, except for a hard-coded line that reads:

if (deviceStr.find("GeForce") == std::string::npos)
    Error("No supported 3D card found");
 
So I'm pretty pissed that NVIDIA and that company stooped to the level of ignoring other vendors. The 9700 is a complete superset of the GF3 features, so there's no reason in the world it should reject that card, except for a hard-coded line that reads:

I can't figure out why anyone should be angry with this design choice from Yeti/NVidia.

As it's underscored and touted as "Cg" all over the place, and runs on GF3s/4s yet fails to run on 9700 Pros, this can only cost Cg perceptual value for NVIDIA, who have already spent so much time trying to dispel the single-platform myths around it.

It likely has absolutely zero to do with Cg itself, but the image this tactic imposes is going to be "use Cg, limit your target to one IHV." Not a very smart move on NVidia's part.
 
Try reading through this....

http://www.bluesnews.com/cgi-bin/board.pl?action=viewthread&threadid=39214

With nice comments like...

Why would you ATI guys think nVidia would get a demo for ATI cards and put it on their site? The text on the page screams "CG" at you, come on!

it's not a dx9 demo (not fully anyway, I guess it might take advantage of it if it finds it), it's a Cg demo. Cg is nvidia only I believe, and as it's a demo released by nvidia that states it's to show off Cg, really, what did you ATI guys expect?

There is more where that came from..

sharkfood said:

As it's underscored and touted as "Cg" all over the place, and runs on GF3s/4s yet fails to run on 9700 Pros, this can only cost Cg perceptual value for NVIDIA, who have already spent so much time trying to dispel the single-platform myths around it.

It seems to me that exactly the OPPOSITE of what you are saying will happen. It makes the general public think that Nvidia is *better*, and that you should own *Nvidia* hardware. Which is EXACTLY what I said would happen when the big arguments over this took place.

You are right though.. whether Nvidia thought this through or not, this little incident screams *CG IS NVIDIA PROPRIETARY*.. even if it has zippo to do with the actual reasons it won't run on ATI cards..

However.. it also makes it look like ATI has screwed up its DX9 hardware.. to the uninformed..

Which is REALLY underhanded imo. Thus Nvidia has done nothing but prove me right again about the total LACK of character in their entire company.
 
790 said:
The 9700 is a complete superset of the GF3 features, so there's no reason in the world it should reject that card, except for a hard-coded line that reads:

if (deviceStr.find("GeForce") == std::string::npos)
    Error("No supported 3D card found");

Really?
What about:

Code:
if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
      Error("No supported 3D card found");
 
Hyp-X said:
Really?
What about:

Code:
if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
      Error("No supported 3D card found");

How about...

Code:
if (FAILED(m_pd3dDevice->CreateTexture(size, size, 1,
        D3DUSAGE_RENDERTARGET, D3DFMT_A32B32G32R32F, D3DPOOL_DEFAULT, &renderMap, NULL)))
    Error("Sorry, your GeForce1/2/3/4 card doesn't support fp targets or any other DX9 features. Hold on 3 months while we get out the GeForceFX (at $450 it's a steal), or try a Radeon 9700/9500...");

m_pd3dDevice->CreateDepthStencilSurface(size, size, D3DFMT_D24X8, D3DMULTISAMPLE_NONE, 0, TRUE, &renderMapZ, NULL);
 
Hyp-X said:
790 said:
The 9700 is a complete superset of the GF3 features, so there's no reason in the world it should reject that card, except for a hard-coded line that reads:

if (deviceStr.find("GeForce") == std::string::npos)
    Error("No supported 3D card found");

Really?
What about:

Code:
if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
      Error("No supported 3D card found");

Yup - that call does not currently succeed on R9700 (not due to any lack of hardware support for rendering to Z buffers, obviously).

On the other hand I believe it also fails on the reference rasterizer (I need to check, but I don't believe that D3DFMT_D24X8 appears in the adapter format caps for D3DRTYPE_TEXTURE in refrast)...

Ok... I've checked, and this code doesn't run correctly on refrast

...which naturally raises very major question marks as to whether this is legal or not (refrast is the spec in this regard - it is supposed to implement all of what is legal). In fact, I would say that this conclusively makes this illegal from the specification point of view.
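
(Concretely, the quick legality test is to rerun the same query against the reference device - a trivial sketch:)

Code:
// The same format query, but against refrast (D3DDEVTYPE_REF).
// If it fails there, the combination is outside the DX specification.
BOOL isStandard = SUCCEEDED(pD3D->CheckDeviceFormat(
    D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, D3DFMT_X8R8G8B8,
    D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_D24X8));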

It is therefore extremely questionable if this construct should ever be used at all, unless you specifically want to target non-standard hardware, which effectively changes this line back to what we had before -

Code:
if(!only_vendor_I_want_to_support)
   FAIL;

Not very good when you are coding for an industry standard API - you might as well use Glide if you're going to do something like this...

- Andy.
 
Hyp-X said:
Code:
if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
      Error("No supported 3D card found");

The above code is not valid on nVidia hardware either - there's a typo.
Here's a (hopefully) "correct" one.

Code:
if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_TEXTURE, D3DFMT_D24X8)))
      Error("No supported 3D card found");

And yes, this is an nVidia-only feature that isn't supported by the reference rasterizer.
Its primary use is for shadow buffers.
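
For reference, this is roughly how that feature gets used - a sketch of the nVidia-style shadow buffer path in D3D9 (variable names are mine, nothing to do with Gun Metal's actual code):

Code:
// Create a depth texture, render the scene from the light into it,
// then bind it as a texture - the hardware does the depth compare
// on lookup.
IDirect3DTexture9* shadowMap = NULL;
IDirect3DSurface9* shadowMapZ = NULL;

m_pd3dDevice->CreateTexture(512, 512, 1, D3DUSAGE_DEPTHSTENCIL,
    D3DFMT_D24X8, D3DPOOL_DEFAULT, &shadowMap, NULL);
shadowMap->GetSurfaceLevel(0, &shadowMapZ);

// Pass 1: depth-only render from the light's point of view.
m_pd3dDevice->SetDepthStencilSurface(shadowMapZ);
// ... render occluders ...

// Pass 2: bind the depth texture; projected lookups return the
// shadow comparison result.
m_pd3dDevice->SetTexture(0, shadowMap);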
 
andypski said:
Ok... I've checked, and this code doesn't run correctly on refrast

I corrected the code - but no, it doesn't run on refrast.

...which naturally raises very major question marks as to whether this is legal or not (refrast is the spec in this regard - it is supposed to implement all of what is legal). In fact, I would say that this conclusively makes this illegal from the specification point of view.

ZBias is illegal too?
It's implemented by both nVidia and ATi, yet it has no support in refrast.

It is therefore extremely questionable if this construct should ever be used at all, unless you specifically want to target non-standard hardware, which effectively changes this line back to what we had before -

Code:
if(!only_vendor_I_want_to_support)
   FAIL;

Not very good when you are coding for an industry standard API - you might as well use Glide if you're going to do something like this...

I agree that it's not good at all.

But not because it uses a vendor specific feature.

If it was a technical demo made by nVidia or a third party it's not a problem if it uses and requires vendor specific features.

It is still legal for a game to use vendor specific features - optionally.

What is not good is a game that requires a vendor specific feature. It should have fallback options at least, or a warning in the worst case.

Note however that I did not deny that what they actually did was exclude non-nVidia hardware. I don't even know if they use shadow buffers at all.

Of course it is possible to do shadow buffering in standard DX9 using fp buffers and PS2.0.
Actually it is possible on R8500/R9000 using PS1.4 as well, with reduced precision.
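
Something like this, as a rough sketch (illustrative only - the size and format choices are mine):

Code:
// Standard-DX9 fallback: write light-space depth to a float render
// target and do the compare in a ps_2_0 shader. This path also runs
// on refrast.
IDirect3DTexture9* shadowMap = NULL;
if (SUCCEEDED(m_pd3dDevice->CreateTexture(512, 512, 1,
        D3DUSAGE_RENDERTARGET, D3DFMT_R32F,
        D3DPOOL_DEFAULT, &shadowMap, NULL)))
{
    // Pass 1: render depth-from-light into the R32F target.
    // Pass 2: sample it and compare against the receiver's
    // light-space depth in the pixel shader.
}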


It's strange that you say Glide - weren't you supposed to say OpenGL extensions? :rolleyes:
 
ZBias is illegal too?
It's implemented by both nVidia and ATi, yet it has no support in refrast.

D3DPRASTERCAPS_DEPTHBIAS is present in the reference rasterizer, as is D3DPRASTERCAPS_SLOPESCALEDEPTHBIAS, so I'm not sure what the question is here. Is it that it doesn't operate the way you expect, or that it doesn't work at all?
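
For what it's worth, using the DX9 path looks something like this (a minimal sketch - the bias value is arbitrary):

Code:
D3DCAPS9 caps;
pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

if (caps.RasterCaps & D3DPRASTERCAPS_DEPTHBIAS)
{
    // D3DRS_DEPTHBIAS takes a float passed as its raw DWORD bits.
    float bias = 0.0001f;
    m_pd3dDevice->SetRenderState(D3DRS_DEPTHBIAS, *(DWORD*)&bias);
}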

I agree that it's not good at all.
But not because it uses a vendor specific feature.
If it was a technical demo made by nVidia or a third party it's not a problem if it uses and requires vendor specific features.

It is bad for this to be exposed at all - it is a fragmentation of the specification, and an obvious one since not even the reference rasteriser can run the code.

If this path exists then someone will use it, and not necessarily even check the reference rasterizer for legality or behaviour. Then, after they've gone miles down this (incorrect) route, they finally find that they have coded using a feature that doesn't even exist in the specification. This messes up everyone - I see at least 3 possible scenarios -

- Other hardware vendors have to expose this same (incorrect) behaviour (assuming, of course, that they can).

- The application writer has to rewrite their graphics engine to work around the lack of this feature (might not be easy if they've got a long way with coding the application).

- The writer releases a title that doesn't work on other vendors' hardware, but doesn't realise it because they haven't bothered to test on that hardware (or even on the reference). The other hardware vendor gets blamed for not supporting the feature - 'Oh, but it works on vendor X, and they're only a DX8 part - how come your DX8.1/DX9/DX10 part can't do it?'. Complaints by vendor Y that the feature is non-standard get short shrift from the developer and the consumer, with the developer shifting the blame to vendor Y at every opportunity, the consumer wondering why the DirectX game doesn't work on their card, and vendor X's apologists claiming that it must be down to vendor Y's 'bad drivers'.

Everyone loses (except potentially vendor X, who sits back and laughs, especially if the third option above is what happens).

It is still legal for a game to use vendor specific features - optionally.

Yes - they can use caps bits, but then they should not be advertising their application as DirectX compatible (if they stray outside of refrast).

I agree that for this nVidia-specific demo they are ok, but there is a more insidious problem here, and one that can affect real applications. Did you just assume that your code above would run on refrast because it worked on nVidia hardware?

It's strange that you say Glide - weren't you supposed to say OpenGL extensions? :rolleyes:

Ok - we can use OpenGL extensions as the example then if it makes you feel better about it :rolleyes:

The same problem exists with GL extensions as well, but at least there the developer should be aware that they are extensions and may therefore not work. You can't (or shouldn't) claim that an application is OpenGL 1.2 compatible if it requires a vendor-specific extension and won't run on any hardware that lacks that extension, can you? It might just as well be Glide.
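
(The GL equivalent of checking caps bits - a minimal sketch, with the extension name purely as an example:)

Code:
#include <GL/gl.h>
#include <string.h>

// Test for a vendor extension before relying on it (crude substring
// match - fine for a sketch, assuming a valid GL context exists).
int HasExtension(const char* name)
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

// e.g. if (!HasExtension("GL_NV_register_combiners")) take the generic path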

Admittedly this should be clear to people checking caps bits in DirectX as well - if a cap bit is not in refrast then it is not DirectX - but sadly some developers don't even bother, and then...

[edit] - clear up glide->OpenGL thing...
 
Hellbinder said:
It seems to me that exactly the OPPOSITE of what you are saying will happen. It makes the general public think that Nvidia is *better*, and that you should own *Nvidia* hardware. Which is EXACTLY what I said would happen when the big arguments over this took place.

You are right though.. whether Nvidia thought this through or not, this little incident screams *CG IS NVIDIA PROPRIETARY*.. even if it has zippo to do with the actual reasons it won't run on ATI cards..

However.. it also makes it look like ATI has screwed up its DX9 hardware.. to the uninformed..

Which is REALLY underhanded imo. Thus Nvidia has done nothing but prove me right again about the total LACK of character in their entire company.


If the "general public" thinks "nVidia is better" because of a single *nVidia product demo* on nVidia's web site (for DX 8.1, yet), then, my goodness, just think how much "better than nVidia" will Mr. General Public think ATI is when he goes to ATI's site and discovers all these cool DX9 demos that won't run on current nVidia cards?....*chuckle*...

As far as the "CG" portion of this goes--good grief--the "general public" wouldn't know a "cg" from an "aa", so that's not even relevant.

So just forget about Gun Metal--it doesn't even count--and it isn't going to convince anybody that "nVidia is better"...

And listen... about Cg... if nVidia starts cramming it full of nVidia-specific stuff, developers will abandon it like the plague! Software developers aren't in the business of selling graphics hardware for anybody--they want to sell their software, and a very big part of that is making sure it runs on the widest possible variety of hardware. I think you can relax about this--it doesn't mean anything as far as I can see.
 