Original Radeon and DirectX 8 - clarification needed

Laa-Yosh

I've had a small debate with a few guys today about the whole DX8 and Radeon issue. Basically, the other party has the following theory:
- MS has been consulting IHVs about DirectX 8 specifications
- specs are decided, IHVs start building chips based on it
- ATI completes Radeon chip, releases to market
- NV gets late because of shooting far beyond the spec
- NV lobbies the hell out of MS to change specs and succeeds
- GF3 enters market as "first" DX8 compliant part
- MS and NV sign Xbox XGPU and XMPC deal.

He also quoted an interview with the ATI CTO from Anand:

- Will the next RADEON revision be a fully DX8 compliant part, or is that offering coming at a later date?
- The currently available RADEON™, which has been shipping since last summer, already supports most key DirectX® 8.0 features including keyframe interpolation, vertex shaders, and pixel shaders with the latest drivers. We are confident that ATI products will support any DirectX® 8.0 features that make it into released games.


Now, I disagree with him for technical reasons. The Radeon256 didn't have enough of an edge over the GF2: it had a more capable fixed-function T&L unit, and a register combiner with different abilities (like being able to do EMBM and use three textures). This, IMHO, is not advanced enough to justify a new version of DirectX.
In fact, as far as I understand, Carmack said that the GF2 was more capable under Doom3 than the Radeon (better speculars vs. fewer passes thanks to the 3rd TMU), although this functionality (combiners) was not exposed under DX.

So, I know that evidence is pretty hard to come by, but I wonder if anyone could offer a clarification...
 
Hardware design typically precedes software design. It just takes longer. DirectX is based upon the hardware designs that IHVs have already begun working on.
 
Laa-Yosh said:
I've had a small debate with a few guys today about the whole DX8 and Radeon issue. Basically, the other party has the following theory:
- MS has been consulting IHVs about DirectX 8 specifications
- specs are decided, IHVs start building chips based on it
- ATI completes Radeon chip, releases to market
- NV gets late because of shooting far beyond the spec
- NV lobbies the hell out of MS to change specs and succeeds
- GF3 enters market as "first" DX8 compliant part
- MS and NV sign Xbox XGPU and XMPC deal.
I think this theory is rather silly.
The DirectX spec is based on chip specifications, not the other way round.
The original Radeon was announced around the same time (April 2000) as the GeForce2, which is also when the first details about DX8 surfaced - including PS and VS, as well as N-Patches. GeForce3 wasn't late, and the Xbox deal was signed many months before the NV20 came out.

Just think for a moment about when Microsoft asked the IHVs for their input. And let's not forget about 3dfx! Rampage was planned to hit the shelves at the same time as NV20, in spring 2001. So you can assume their design periods weren't too far apart (well, maybe the redesign period for Rampage).

Now imagine MS asked the IHVs at a point when the R100 design phase was over, but the NV20/Rampage design phase wasn't. Both NVidia and 3dfx would have told them to wait just a bit until they had their designs ready. Would Microsoft have ignored them and instead started working on an "R100 DX8"? Certainly not! Especially considering that the Radeon256 pixel pipeline is fully exposed under DX7 (or if it offers more than that, ATI never mentioned it), and only the geometry features extend beyond that level.
 
I don't see why it's less silly than the original one.
Yours IS based on nothing, just like that.
Sorry, I made a mistake - the original one has some timeline, at least.

First of all, AFAIK neither Rampage nor GF3 was originally planned for spring - and Rampage missed the shelves forever.

Anybody has any hard facts? I'd be convinced immediately. :)

Edit: Does anybody know the RadeOcean demo? "You don't need GF3 to see a pixel shaded ocean..." ;)
 
Xmas said:
The DirectX spec is based on chip specifications, not the other way round.

I'm pretty sure this is true.
But it's possible that the spec changed because nVidia told MS "Hey wait for us, put this in there too!".
I've never seen a DX8 beta, but with DX9 this was what happened between beta versions (they kept on putting more and more stuff).

On the other hand I believe the pixel shader concept is MS's attempt at standardizing pixel processing, more than nVidia's push.
I believe nVidia would have been much happier if MS had extended the TSS model to include more, but they refused.
The result is that neither NV1x nor NV2x is properly exposed under D3D.
 
Hyp-X said:
Xmas said:
The DirectX spec is based on chip specifications, not the other way round.

I'm pretty sure this is true.
But it's possible that the spec changed because nVidia told MS "Hey wait for us, put this in there too!".
I've never seen a DX8 beta, but with DX9 this was what happened between beta versions (they kept on putting more and more stuff).

On the other hand I believe the pixel shader concept is MS's attempt at standardizing pixel processing, more than nVidia's push.
I believe nVidia would have been much happier if MS had extended the TSS model to include more, but they refused.
The result is that neither NV1x nor NV2x is properly exposed under D3D.

Hey, you're a coder, aren't you? ;) (just kidding.. :D)

What about that demo, Hyp-X? Did you see that? What do you think about Orton's blurb? :)
 
Just a couple of things.

I have heard old comments from ATI about the spec being moved - I can't remember where. It said something to the effect that the spec got moved and R100 became PS0.5 - Ichy might have a better recollection of that. GF3 was initially slated for the fall; it took 5 revs, though, and the GF2 Ultra was the contingency.
 
T2k said:
I don't see why it's less silly than the original one.
Yours IS based on nothing, just like that.
Sorry, I made a mistake - the original one has some timeline, at least.
I'm not sure I understand what you mean. I pointed out things that are IMO clearly wrong in the theory presented by Laa-Yosh's friend.

First of all, AFAIK not the Rampage nor GF3 was originally planned for Spring - and Rampage missed the shelves for ever.
Well, if they were late, and originally targeted at fall 2000, that brings their design phase even more in line with the original Radeon. That makes it even more unlikely that MS would ignore those much more capable parts and make the R100 the baseline for DX8.

Edit: Does anybody know the RadeOcean demo? "You don't need GF3 to see a pixel shaded ocean..." ;)
Yeah, a Matrox G400 can do EMBM, too ;)
 
Hyp-X said:
But it's possible that the spec changed because nVidia told MS "Hey wait for us, put this in there too!".
I've never seen a DX8 beta, but with DX9 this was what happened between beta versions (they kept on putting more and more stuff).
Surely the spec wasn't set in stone from day one. But the thought of "NV lobbying the hell out of MS to change the specs to make R100 non-DX8-compliant" is really ridiculous.

On the other hand I believe the pixel shader concept is MS's attempt at standardizing pixel processing, more than nVidia's push.
I believe nVidia would have been much happier if MS had extended the TSS model to include more, but they refused.
The result is that neither NV1x nor NV2x is properly exposed under D3D.
The main problem with exposing the register combiners is that the max number of ops depends on the operations themselves. NV2x can do 16 pairwise-independent MUL or DP3, but only 8 ADD or MAD. It's hard to communicate that through a unified interface.
 
Xmas said:
The main problem with exposing the register combiners is that the max number of ops depends on the operations themselves. NV2x can do 16 pairwise-independent MUL or DP3, but only 8 ADD or MAD. It's hard to communicate that through a unified interface.

That's not a real problem, since it's already the case. GeForce2 supports practically all 2-stage operations it has caps for, limited 3-stage functionality, and even one 8-stage combination. There may be other combinations, but unfortunately NVIDIA doesn't document this. IMO it should have been possible to extend the stages to support all combiner functionality with the normal ValidateDevice caveat. The concept of validation could have been extended to shaders, too. We have to write different stages and shaders for different hardware anyway, with almost each chip having different stage capabilities or shader formats.
 
T2k said:
What do you think about Orton's blurb? :)

It wasn't the last time ATI claimed non-existent features of their hardware...

R200: programmable AA sample patterns...
R300: adaptive tessellation... (or hw tessellation of any kind for that matter)
 
Xmas said:
Hyp-X said:
But it's possible that the spec changed because nVidia told MS "Hey wait for us, put this in there too!".
I've never seen a DX8 beta, but with DX9 this was what happened between beta versions (they kept on putting more and more stuff).
Surely the spec wasn't set in stone from day one. But the thought of "NV lobbying the hell out of MS to change the specs to make R100 non-DX8-compliant" is really ridiculous.

I agree, that's ridiculous.
But I also think nobody said that - the theory only says "NV lobbies the hell out of MS to change specs and succeeds" - hmm? :)

 
Hyp-X said:
T2k said:
What do you think about Orton's blurb? :)

It wasn't the last time ATI claimed non-existent features of their hardware...

R200: programmable AA sample patterns...
R300: adaptive tessellation... (or hw tessellation of any kind for that matter)
Well, the sample patterns are programmable -- just in drivers only.
 
T2k said:
I agree, that's ridiculous.
But I also think nobody said that: NV lobbies the hell out of MS to change specs and succeeds - hmm? :)
But that implies that there was already a spec set that didn't include features from both Rampage and NV20, and also that MS was not willing to include those features in DX8. Which is highly unlikely IMO.
 
from Dave B. in the DX thread

Quote:
NVIDIA Licenses Breakthrough 3D Technology to Microsoft

Strategic Agreement Enables Key Innovations and Advanced Features in Microsoft DirectX 8.0

SANTA CLARA, Calif.--(BUSINESS WIRE)--Nov. 28, 2000--In a continuing effort to drive the evolution of 3D graphics technology, NVIDIA® Corporation (Nasdaq: NVDA) today announced it developed and licensed enabling 3D features to Microsoft Corp. for their new DirectX 8.0 3D application program interface (API). NVIDIA contributions were in the areas of programmable vertex shaders, programmable pixel shaders, and rect/tri-patch support for high order surfaces. These features allow software content creators to use more elaborate artwork and flexible 3D rendering techniques in their applications and games.
http://www.nvnews.net/cgi-bin/archives.cgi?category=1&view=11-00
 
Ostsol said:
Well, the sample patterns are programmable -- just in drivers only.

Yes in the R300 they are.
IIRC that was a broken feature in the R200. They only got it working for 2xAA but even that had serious issues and it got removed.

Btw, I'm still hesitant about adding "R350: F-buffering" to my list.
They are at the stage where they start to convince developers that "it's not that useful anyway, so don't ask for support"...
 
DaveBaumann said:
Just a couple of things.

I have heard old comments from ATI about the spec being moved - I can't remember where. It said something to the effect that the spec got moved and R100 became PS0.5 - Ichy might have a better recollection of that. GF3 was initially slated for the fall; it took 5 revs, though, and the GF2 Ultra was the contingency.
That's about right, I think. My recollection is that the R100 was designed and released based on a beta version of DX8 (or what became known as a beta version). After the chip was finished MS changed the spec (the reason why they changed the spec can be left up to speculation), and what was PS1.0 in the beta version became PS0.5 in the final release.
 
Ratchet said:
That's about right, I think. My recollection is that the R100 was designed and released based on a beta version of DX8 (or what became known as a beta version). After the chip was finished MS changed the spec (the reason why they changed the spec can be left up to speculation), and what was PS1.0 in the beta version became PS0.5 in the final release.
There is no PS0.5 in the final release, though the number might be an accurate representation when compared to PS1.x. By the time the Radeon came out, even N-Patches were already in DX8.
 
Hyp-X said:
Ostsol said:
Well, the sample patterns are programmable -- just in drivers only.
Yes in the R300 they are.
Yep, in the Linux R300 drivers, there is an option to manually set the sample positions independently. But, unfortunately, I still can't get hardware acceleration to work because even though I got agpgart to load (finally), the system crashes every time I try to load a 3D program (unless I turn off acceleration, which isn't a very good option).

Anyway, if I ever do get hardware acceleration under Linux to work, I'll have to try to compare the R300's FSAA with gamma correction and without in a real game, or compare it vs. an nVidia card with the same sample positions, to see how much of the improvement of the R3xx's AA comes from sample positions and how much from gamma correction.
 