ATi unveil R420

Ostsol said:
Diplo said:
When Nvidia unveiled T&L with the GeForce 256 did any games support it? Nope. I remember 3DFX fans at the time saying how it would never catch on and laughing that no games supported it (I bought my GeForce 256 2nd hand off someone who sold it to buy a Voodoo3 - hah!). Now how many games don't support T&L?
I think this was only true for Direct3D games, where the programmer explicitly has to choose whether or not to use HT&L. In OpenGL the difference is entirely in the drivers. There's no difference in the coding.

No, a lot of DX5/6 games (and OpenGL games of the time) did the transform and lighting in their own code and then just passed pre-transformed and lit verts to the driver.

The argument was that the OpenGL implementation (and the DX one) was too slow, and by coding it in software the dev could take advantage of a lot of shortcuts. It's one of the reasons it took a while for hardware T&L to take off: to use it, devs had to commit to the slower software paths for their legacy support.
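
For anyone unfamiliar with what "doing the T&L in their own code" looked like, here is a minimal C sketch of the idea (illustrative only, not taken from any real DX5/6 engine): the CPU transforms each vertex and computes a simple diffuse term itself, so only finished, pre-transformed and pre-lit verts ever reach the driver and a hardware T&L unit has nothing left to do.

Code:
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Apply a 4x4 row-major matrix on the CPU and do the perspective divide. */
static Vec3 transform(const float m[16], Vec3 v)
{
    float w = m[12] * v.x + m[13] * v.y + m[14] * v.z + m[15];
    Vec3 r;
    r.x = (m[0] * v.x + m[1] * v.y + m[2]  * v.z + m[3])  / w;
    r.y = (m[4] * v.x + m[5] * v.y + m[6]  * v.z + m[7])  / w;
    r.z = (m[8] * v.x + m[9] * v.y + m[10] * v.z + m[11]) / w;
    return r;
}

/* Simple directional diffuse term, also computed on the CPU. */
static float diffuse(Vec3 n, Vec3 to_light)
{
    float d = n.x * to_light.x + n.y * to_light.y + n.z * to_light.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void)
{
    const float identity[16] = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 };
    Vec3 pos      = { 1.0f, 2.0f, 3.0f };
    Vec3 normal   = { 0.0f, 0.0f, 1.0f };
    Vec3 to_light = { 0.0f, 0.0f, 1.0f };

    Vec3  p = transform(identity, pos);   /* "pre-transformed" */
    float l = diffuse(normal, to_light);  /* "pre-lit"         */

    /* A DX5/6-era engine would now submit p plus a baked vertex colour
     * to the API, bypassing any hardware T&L stage entirely. */
    printf("pretransformed vert: %.1f %.1f %.1f, diffuse %.2f\n",
           p.x, p.y, p.z, l);
    return 0;
}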
 
Ardrid said:
Well, if I remember correctly, 3Dc is an open standard, meaning NVIDIA could support it. Obviously it wouldn't be supported in NV40, but the opportunity is there.
I predict 3Dc will receive the same support from nVidia that Cg received from ATi.

nVidia will be providing its increased visual quality/performance from within the specs of DX9, ATi won't.
 
nVidia will be providing its increased visual quality/performance from within the specs of DX9, ATi won't.

Kinda like they did with the NV30, right?

Right...

Nvidia relying totally on an API's specs.. :rolleyes: That'll be the day.
 
Ardrid said:
Well, if I remember correctly, 3Dc is an open standard, meaning NVIDIA could support it. Obviously it wouldn't be supported in NV40, but the opportunity is there.
FXTC was an open standard too.

3Dc had better have significant advantages over the current compression formats if it wants to gain traction.
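
For context on where that advantage would have to come from: 3Dc is aimed specifically at normal maps, storing only the X and Y components of a unit tangent-space normal and leaving the shader to rebuild Z, instead of treating the map as colour data the way DXT does. A rough CPU-side sketch of that reconstruction step, purely illustrative and not ATI's actual block decoder:

Code:
#include <math.h>
#include <stdio.h>

/* Map an 8-bit stored channel back to the [-1, 1] range. */
static float expand(unsigned char c)
{
    return (c / 255.0f) * 2.0f - 1.0f;
}

/* Rebuild the full normal from the two stored components:
 * z = sqrt(1 - x^2 - y^2). This is the idea, not the real block format. */
static void reconstruct_normal(unsigned char cx, unsigned char cy, float out[3])
{
    float x  = expand(cx);
    float y  = expand(cy);
    float zz = 1.0f - x * x - y * y;

    out[0] = x;
    out[1] = y;
    out[2] = zz > 0.0f ? sqrtf(zz) : 0.0f; /* tangent-space normals face outward */
}

int main(void)
{
    float n[3];
    reconstruct_normal(200, 140, n);
    printf("normal: %.3f %.3f %.3f\n", n[0], n[1], n[2]);
    return 0;
}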
 
my post covered more than just Cg...

Does not change the fact that one IHV will play by the spec this round and one won't. I guess if you do want to drag NV3x and Cg into it, it's fairly clear what happens to the non-spec player.
 
radar1200gs said:
I wasn't aware that we were discussing NV3x here...

We're discussing Nvidia (since topics always digress from the original topic), and since you say they will get all their visual quality/performance straight from the DX9 specs, I feel inclined to make an example of where they completely failed, and in fact, REFUSED to follow the DX9 specs...

It's not the first time they haven't stuck to an API's specs.. Who says they have changed their ways? Because you say they will only strictly reap their visual quality/performance from DX9?
 
radar1200gs said:
I guess if you do want to drag NV3x and Cg into it, it's fairly clear what happens to the non-spec player.

Yeah because we all know NV3x's performance problems translate peeeerfectly into this situation...
 
Nick Spolec said:
radar1200gs said:
I wasn't aware that we were discussing NV3x here...

We're discussing Nvidia (since topics always digress from the original topic), and since you say they will get all their visual quality/performance straight from the DX9 specs, I feel inclined to make an example of where they completely failed, and in fact, REFUSED to follow the DX9 specs...

It's not the first time they haven't stuck to an API's specs.. Who says they have changed their ways? Because you say they will only strictly reap their visual quality/performance from DX9?
Only one problem with your argument: you are the one who dragged NV3x into this, trying to link it to Cg, which was developed and used BEFORE NV3x

While we are on the subject of non-standard extensions, what shall we say about the success (or lack thereof) of Truform?
 
radar1200gs said:
Only one problem with your argument: you are the one who dragged NV3x into this, trying to link it to Cg, which was developed and used BEFORE NV3x

While we are on the subject of non-standard extensions, what shall we say about the success (or lack thereof) of Truform?

Who brought up Cg? Oh ya, that was you.
 
AlphaWolf said:
radar1200gs said:
Only one problem with your argument: you are the one who dragged NV3x into this, trying to link it to Cg, which was developed and used BEFORE NV3x

While we are on the subject of non-standard extensions, what shall we say about the success (or lack thereof) of Truform?

Who brought up Cg? Oh ya, that was you.
And that was in direct response to the assertion that nVidia would support 3Dc. I could just as easily have used Truform as an example...
 
Only one problem with your argument: you are the one who dragged NV3x into this, trying to link it to Cg, which was developed and used BEFORE NV3x

Heh.

When I made a comment about Nvidia not sticking to DX9 with the NV30, I didn't mean it was because of Cg...

I meant it as the NV30 being a DX9 card but "Not supporting all DX9 features" and "Not even using DX9 features".


While we are on the subject of non-standard extensions, what shall we say about the success (or lack thereof) of Truform?

How about Nvidia's "Per pixel shading" with the GeForce2 series? Giga Texel Shader? WTF IS THAT? Shadow Buffers from GeForce3? How many more useless features can I list that Nvidia has pushed and promoted, yet that rarely ever got used?
 
radar1200gs said:
Ardrid said:
Well, if I remember correctly, 3Dc is an open standard, meaning NVIDIA could support it. Obviously it wouldn't be supported in NV40, but the opportunity is there.
I predict 3Dc will receive the same support from nVidia that Cg received from ATi.

nVidia will be providing its increased visual quality/performance from within the specs of DX9, ATi won't.
:?:
And that was in direct response to the assertion that nVidia would support 3Dc.
 
Nick Spolec said:
Only one problem with your argument: you are the one who dragged NV3x into this, trying to link it to Cg, which was developed and used BEFORE NV3x

Heh.

When I made a comment about Nvidia not sticking to DX9 with the NV30, I didn't mean it was because of Cg...

I meant it as the NV30 being a DX9 card but "Not supporting all DX9 features" and "Not even using DX9 features".


While we are on the subject of non-standard extensions, what shall we say about the success (or lack thereof) of Truform?

How about Nvidia's "Per pixel shading" with the GeForce2 series? Giga Texel Shader? WTF IS THAT? Shadow Buffers from GeForce3? How many more useless features can I list that Nvidia has pushed and promoted, yet that rarely ever got used?
But they were all over and above the spec that existed at the time, not a desperate attempt to make up for missing portions of the spec.
 
Register combiners formed the foundation of DX8, and shadow buffers do get used in a few games (such as Splinter Cell), more heavily on the Xbox. If ATI had supported them, you'd see a lot more usage. They're gonna be in OpenGL 2.0.
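
For anyone who hasn't met the term, a shadow buffer is basically a two-pass depth comparison: render depth from the light's point of view, then test each shaded pixel's light-space depth against it. The hardware versions accelerate that compare in the texture units; the sketch below is just a CPU-side illustration of the test, nothing tied to NVIDIA's or the Xbox's implementation.

Code:
#include <stdio.h>

#define SHADOW_RES 4  /* tiny buffer, just for the example */

/* Pass 1 result: closest occluder depth per texel, seen from the light. */
static float shadow_buffer[SHADOW_RES][SHADOW_RES];

/* Returns 1 if the point is lit, 0 if in shadow. The small bias
 * keeps surfaces from shadowing themselves ("shadow acne"). */
static int shadow_test(int u, int v, float depth_in_light_space)
{
    const float bias = 0.005f;
    return depth_in_light_space <= shadow_buffer[v][u] + bias;
}

int main(void)
{
    /* Pretend pass 1 rendered an occluder at depth 0.5 everywhere. */
    for (int v = 0; v < SHADOW_RES; ++v)
        for (int u = 0; u < SHADOW_RES; ++u)
            shadow_buffer[v][u] = 0.5f;

    /* Pass 2: a point in front of the occluder is lit, one behind it isn't. */
    printf("depth 0.4 -> %s\n", shadow_test(1, 1, 0.4f) ? "lit" : "shadowed");
    printf("depth 0.9 -> %s\n", shadow_test(1, 1, 0.9f) ? "lit" : "shadowed");
    return 0;
}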
 