First DX9 compliant GPU?

Which of these companies will produce the first (full) DX9 compliant GPU?

  • Bitboys

    Votes: 0 0.0%
  • Matrox

    Votes: 0 0.0%
  • nVidia

    Votes: 0 0.0%
  • PowerVR Technologies

    Votes: 0 0.0%
  • Other

    Votes: 0 0.0%
  • Creative/3DLabs

    Votes: 0 0.0%

  • Total voters
    116
All right..I'll give ya that :LOL:

I did hack the poll though and voted 24 times ..then hit Kyle's famous next-gen card poll at [H] another 1000 times :oops: :D

<edit>
Joke just in case the flame throwers come out.
 
Love reading some of those posts. Especially the ones pointing out that Huang said the NV30 would not be called a GeForceXX.

As it turns out, you only need to change one of those X's... GeForceFX.

:D
 
Bigus Dickus said:
Love reading some of those posts. Especially the ones pointing out that Huang said the NV30 would not be called a GeForceXX.

As it turns out, you only need to change one of those X's... GeForceFX.

:D

'Coz IIRC Huang said that approx. one year ago, Jan/Feb of 2002.
 
Microsoft + Intel/AMD did it already!
Look at the reference rasterizer: it has all the features (true DM, N-patches, RT-patches, PS 3.0, VS 3.0, ...).

;)

Thomas
 
If you want to see a GPU that will support every single feature supported in the reference rasterizer for DX9, it's probably never going to happen. I don't think that this has happened with any previous DX versions yet.
 
Chalnoth said:
If you want to see a GPU that will support every single feature supported in the reference rasterizer for DX9, it's probably never going to happen. I don't think that this has happened with any previous DX versions yet.

What's in DX5/6/7 that the R300 or NV30 doesn't support? :D
 
DaveBaumann said:
What's in DX5/6/7 that the R300 or NV30 doesn't support? :D

(Quintic) RTPatches... some funky filter modes that MS added at some point, can't remember the name but nobody ever implemented them. There is probably more in terms of bizarre things, I think there is even a cap for order-independent translucency which R300 and NV30 cannot support :LOL: (Neon250 could though IIRC)
 
Not sure about NV30, but GF4 doesn't support the MIRROR_ONCE texture address mode that has been around for quite a while, at least since DX6.
 
Obviously, we were all wrong. The correct answer is Imagine.

According to this:
http://cva.stanford.edu/imagine/project/im_impl.html
they got working silicon back from the fab in 04/2002. Did ATI have the R300 in metal then?
OK, OK, I know we aren't talking about a consumer product, but it's damn interesting IMO. Strange how there has been almost no talk about it on the B3D forums.
Obviously both OGL and Reyes have been successfully run on simulators, and probably on the chip as well. Basically, with appropriate programming and drivers, even the prototype board should be able to function as a PCI graphics card.
Wonder what the real-world performance would be.
 
DaveBaumann said:
Can be done in the shader... :D

But does it support the DX5/6/7 interfaces for it? Regardless, it is of little consequence, as others gave examples that are probably better than this one.
 
What strikes me as funny is that it looks like some people went ahead and voted for the R300 (edit: ATI) after this thread was resurrected.
 
Chalnoth said:
But does it support the DX5/6/7 interfaces for it? Regardless, it is of little consequence, as others gave examples that are probably better than this one.

That's just a case of writing a driver to do it. The driver may actually execute a shader program, but as far as the developer is concerned he'd just be making a standard call.
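For context on why this is emulatable: MIRROR_ONCE takes the absolute value of the texture coordinate (mirroring it once around zero) and then clamps it, so a driver could rewrite the coordinate with a little shader arithmetic before an ordinary clamped fetch. A rough sketch of that coordinate rewrite (plain Python for illustration, not actual driver code):

```python
def mirror_once(u, max_coord=1.0):
    """Emulate the MIRROR_ONCE address mode for one texture coordinate:
    mirror around zero via abs(), then clamp to the addressable range."""
    return min(abs(u), max_coord)

# A driver could insert the equivalent abs/clamp instructions into the
# shader, then sample the texture with plain CLAMP addressing.
print(mirror_once(-0.3))  # mirrored to 0.3
print(mirror_once(1.7))   # clamped to 1.0
```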
 
I didn't see an option for "None of the above," which is where my vote goes.

I'd like to think that the next 8 to 14 months will see bigger breakthroughs in innovation, new technologies and non-"standard" methods of doing 3D graphics, to the point where DirectX 9.0 will have outlived its usefulness and a whole new revision will be required.

After all, chasing a goal of hitting some arbitrary ceiling of maximum shader instructions or programmability just doesn't seem all that important. Of more importance are new, unusual and clever ways to do the same and push the envelope toward that "photo realism" ahead of the technology gap.
 
DaveBaumann said:
That's just a case of writing a driver to do it. The driver may actually execute a shader program, but as far as the developer is concerned he'd just be making a standard call.

That won't work in DX8, and will significantly hurt performance.
 