DirectX 3/5/6 difference?

vlask

Newcomer
Working on a new version of my vgamuseum.info site. That means a lot more info and more research into old chipsets. But I still don't know if there's any difference in DirectX 3/5/6 support. DX7 is clear - the T&L unit. But I don't know the requirements of earlier versions. Is there anywhere a list of HW features required by a certain DX version, or a software tool (Win9x-friendly) that would tell me about a card's DX compatibility?

Historic sources are not clear - most companies didn't care about DX compatibility. They usually state only that a chip has DX support, that's all. I'm talking about early chips like the Voodoo 1/Rush/2, 3D Rage II/Pro, ViRGE and similar single-pipe stuff from around 1996-97.
 
From what I know:
DirectX 3 : horrible.
DirectX 5 : usable; some games used it.
DirectX 6 : now you get all sorts of high-tech things, like 32-bit color, texture compression and some bump-mapping modes (virtually unused), but these things aren't required. Maybe you needed stuff like working alpha blending and miscellaneous little aspects.
 

That's nice, but I don't need to know which version I should make a game in - I need to know whether there's any difference in HW requirements at all between DX3 and DX5. If there is, which ones, so I can check datasheets and start searching. Same for DX6.

On Wikipedia there's nothing useful. Only that programming got better in DX5 - does that mean there's no difference between DX3 and DX5 (only better software)? So all DX3 cards = DX5 capable?

For DX6 there's something about multitexturing and stencil buffers. So must a DX6 card have these features? Or are they optional? Somewhere there must be some info or a way to validate DX compatibility.
 
New features in DirectX 5.0 include these:

DrawPrimitive services for Direct3D, providing developers with the flexibility to pass polygon information directly to the hardware rather than using execute buffers
Progressive meshes and enhanced animations
DirectDraw® API support for the accelerated graphics port (AGP), new low-resolution modes and MMX optimizations
DirectInput® API support for force-feedback devices and a new extensible game controller control panel
The DirectPlay® API with Windows NT security, client/server support and lobby client API
DirectSound® Capture and Notify APIs, to simplify use of audio streams
DirectSound3D support for 3-D audio hardware acceleration

DirectX 6.0 introduced bump mapping, SSE and 3DNow! support, S3TC texture compression, range-based fog and single-pass multitexturing.

Some comparison pics
https://developer.valvesoftware.com/wiki/DirectX_Versions
 
For DX6 there's something about multitexturing and stencil buffers. So must a DX6 card have these features? Or are they optional? Somewhere there must be some info or a way to validate DX compatibility.
Up until DX10, most D3D features were optional capabilities for the IHVs.

D3D6's most usable/famous feature was multi-texturing, AFAIK.
Bump mapping, at that time, was still either a mostly proprietary implementation or simply too slow for production work (DOT3).
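The DOT3 approach mentioned above is just a dot product done in the texture combiners: the hardware stored normals and light vectors as biased RGB bytes (mapping [-1, 1] to [0, 255]) and output saturate(N . L) as the lighting intensity. A minimal plain-C model of that math (not actual Direct3D code - the values and helper names are my own):

```c
/* DOT3 bump-mapping sketch: D3D6/7-era combiners stored normals and
 * light vectors as RGB bytes, mapping the [-1, 1] range to [0, 255],
 * and computed intensity = saturate(N . L). Plain-C model only. */

/* Expand a biased byte (0..255) back to a float in [-1, 1]. */
static float expand(unsigned char c)
{
    return (c / 255.0f) * 2.0f - 1.0f;
}

/* Per-pixel intensity from a normal-map texel and a light vector,
 * each given as biased RGB bytes; result clamped to [0, 1]. */
float dot3_intensity(const unsigned char n[3], const unsigned char l[3])
{
    float d = expand(n[0]) * expand(l[0])
            + expand(n[1]) * expand(l[1])
            + expand(n[2]) * expand(l[2]);
    if (d < 0.0f) d = 0.0f;
    if (d > 1.0f) d = 1.0f;
    return d;
}
```

A flat normal-map texel (128, 128, 255) lit straight along +Z gives full intensity; a light pointing the opposite way clamps to zero, which is exactly the fixed-function saturate behaviour.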
 
Up until DX10, most of the D3D features were optional capabilities for the IHVs.

The D3D6 most usable/famous feature was multi-texturing, AFAIK.
Bump-mapping, at that time, was still either mostly proprietary implementation, or simply too slow for production work (DOT3).

I remember Matrox was first with EMBM support in their G400 card. I wanted to play DM2 with EMBM enabled, but none of the nVidia or ATi cards supported this technique at that time, and the G400 was too expensive and too glitchy with most games; besides, it was slow.
 
If I remember correctly, DX6 introduced vertex buffers and DX7 index buffers (index buffers came later than vertex buffers). Previously you only had "DrawPrimitiveUP"-style functionality. DX7 had both EMBM and DOT3 texture combiners (EMBM allowed a dependent texture read). DX6 had multitexturing (you had a separate flag to load textures into the Voodoo 2's second texture unit memory, as the Voodoo 2 had split memory: frame buffer + two texture unit memories). DX8 introduced shaders (the pixel shader max instruction limit was very short - IIRC only 8 instructions).
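The win from index buffers over the "user pointer" style is simply less vertex data shipped per draw: a quad drawn as a raw triangle list needs 6 vertices with the shared corners duplicated, while an indexed draw needs 4 unique vertices plus 6 small indices. A sketch of that arithmetic (the vertex layout is illustrative, roughly a position + one UV set):

```c
#include <stddef.h>

/* Why index buffers beat raw DrawPrimitiveUP-style triangle lists:
 * shared vertices are stored once and referenced by 16-bit indices.
 * The struct below stands in for a simple FVF vertex (pos + 1 UV). */

struct vertex { float x, y, z; float u, v; };  /* 5 floats = 20 bytes */

/* Bytes submitted for one quad as a raw triangle list:
 * two triangles, three vertices each, shared corners duplicated. */
size_t quad_bytes_up(void)
{
    return 6 * sizeof(struct vertex);
}

/* Bytes for the same quad via vertex buffer + 16-bit index buffer:
 * four unique vertices, six indices. */
size_t quad_bytes_indexed(void)
{
    return 4 * sizeof(struct vertex) + 6 * sizeof(unsigned short);
}
```

For real meshes with long triangle strips or fans of shared vertices, the ratio gets far better than this single-quad case, which is why the API grew the feature.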

I had a Matrox G400 too. Got it free from them because my old indie games supported EMBM :)
 
Yeah, I think a D3D3 card was capable of 5. Not all companies bothered with support, though, because of how fast hardware went obsolete back then.
 
I still keep some binary samples from the DX3 and DX5 SDKs, and all of them still run on the latest Windows and D3D11 hardware, with full HW acceleration. Compatibility is one thing where DX definitely excels. ;)
 
Ok, so let's say that the only difference between DX3 and DX5 support is in the drivers. I'm then removing the DX3 tag and using DX5 for all 3D cards that had official DX support.

Now the main question is: what's the main difference between DX5 and DX6 hardware? Is it multitexturing? (Would the Voodoo 2 then be a DX6 card?)

EMBM was optional (edit: maybe not), so does anyone know any required function for DX6? Maybe DXT (renamed S3TC from the Savage3D) - then the Voodoo would be only DX5.

Edit: the wiki says that a card should have bump mapping + DXTC to be certified. So these are the main features required?
 
Everything was optional. For example, the TNT2 is your stereotypical DX6 card, and it doesn't have DXTC or EMBM support. IIRC at least one of the TNT2 variants didn't even support multitexturing (and I know the Savage 3D didn't).
 
What is defined as multitexturing support? E.g. the Voodoo 1 and Banshee happily support it despite having only one TMU; it's done in two passes.
Single-pass multitexturing is of course a great feature (and done by a Voodoo2 running old GLQuake), but my understanding is that e.g. the GeForce 256, being a 4x1 pipeline configuration, doesn't do it.

That way, the ATi Rage Pro and Voodoo 1/Rush support it; the S3 ViRGE maybe doesn't.

Even the Voodoo3 went to a 2x1 configuration (vs 1x2 for the Voodoo2 and 1x1 for the Banshee), and my understanding is the TNT/TNT2 are 2x1. Ditto the VSA-100.
The GeForce2 MX (later updated to the 4 MX) was a 2x2 configuration with T&L, and was über powerful relatively speaking, so to me it spells the end of the era of proto-GPUs and early GPUs.
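That "pipes x TMUs" shorthand multiplies out to a theoretical texel fillrate once you add the core clock. A tiny sketch of the arithmetic - the clock figures in the usage below are rough launch specs from memory, so treat them as assumptions:

```c
/* Theoretical texel fillrate in megatexels/s from the "pipes x TMUs"
 * shorthand used above: pipelines * TMUs per pipeline * core MHz. */
unsigned fillrate_mtexels(unsigned pipes, unsigned tmus_per_pipe, unsigned mhz)
{
    return pipes * tmus_per_pipe * mhz;
}
```

E.g. a 1x2 Voodoo2 at ~90 MHz and a 2x1 part at the same clock both come out to 180 MTexels/s - the difference is that the 2x1 design needs two passes for dual texturing where the 1x2 does it in one, while the 2x1 is twice as fast on single-textured pixels.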
 
Direct3D 9 supported DX6 cards fine, by the way.
UT2003 ran on a Banshee in a fully recognisable form :), with merely rather low-res textures and 16-bit rendering. Though I don't know if that game uses DX8, or DX9 with a DX 8.x feature set at most.


Direct3D 8 and up did not support the Voodoo2 anymore; it only supported full 2D+3D cards, and that was my pet grief.
 

Not creating official D3D 8.x-compatible drivers for the Voodoo2 might have been a business decision and not a technical one. Of course it wouldn't take advantage of any of the new feature sets, but I'm just talking compatibility.

http://simhq.com/forum/ubbthreads.php/topics/687595/Voodoo2_with_DX8
http://www.rage3d.com/board/showthread.php?t=33653062
 
Everything was optional. For example, the TNT2 is your stereotypical DX6 card, and it doesn't have DXTC or EMBM support. IIRC at least one of the TNT2 variants didn't even support multitexturing (and I know the Savage 3D didn't).

I think it's even beyond nVidia to make a TNT-brand card without multitexturing - I mean, it stood for TwiN Texel.
 
Yeah, AFAIK every RIVA TNT (and Vanta) was 2x1.

Maybe Ryan is thinking of the Voodoo3 that Compaq shipped, which was 1x1. They eventually branded it as the 3dfx Velocity 100.
 
The Riva 128 was single-texturing and the Riva TNT was dual-texturing. The Voodoo 2 was basically a Voodoo 1 with dual texturing (split physical memory for each texture unit) and some other tweaks. I had all of these cards (and programmed for them). I started with DirectX 5, so I don't know anything about DirectX 3, except the general opinion about it (it was hard to use).

Both EMBM and DOT3 combiners were optional. Matrox (G400) and ATI supported EMBM, while NVIDIA (GeForce) supported DOT3 (the TNT didn't). The Voodoo 1/2/3 didn't support any of these. I never owned a Voodoo 4. EMBM required triple-texturing support (base texture + UV bias texture + secondary texture). Both ATI and Matrox supported triple texturing (even while not doing EMBM). NVIDIA and 3dfx only supported dual texturing.
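The dependent texture read in EMBM means one stage's sample decides where the next stage samples: a signed (du, dv) pair from the bump texture, scaled by a 2x2 bump matrix (the D3DTSS_BUMPENVMAT values), perturbs the coordinates of the environment-map lookup. A toy software model, assuming a tiny integer-addressed environment map with wrap addressing (all names and sizes are mine, not SDK ones):

```c
/* EMBM sketch: stage 1 reads signed (du, dv) deltas from the bump
 * texture and perturbs the texture coordinates of the environment-map
 * lookup in the next stage - the "dependent texture read".
 * Textures are modeled as small integer-addressed arrays. */

#define ENV_W 4

/* Sample one texel of an ENV_W x ENV_W environment map after
 * perturbing integer coords (u, v) by bump deltas (du, dv) scaled
 * through the 2x2 bump matrix (m00..m11). Coords wrap around. */
unsigned char embm_sample(unsigned char env[ENV_W][ENV_W],
                          int u, int v, int du, int dv,
                          float m00, float m01, float m10, float m11)
{
    int pu = u + (int)(m00 * du + m01 * dv);
    int pv = v + (int)(m10 * du + m11 * dv);
    pu = ((pu % ENV_W) + ENV_W) % ENV_W;   /* wrap addressing */
    pv = ((pv % ENV_W) + ENV_W) % ENV_W;
    return env[pv][pu];
}
```

With an identity bump matrix and zero deltas this degenerates to a plain environment-map lookup, which is why a flat bump texture leaves the reflection undisturbed.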

You had multiple caps fields to check GPU-supported features. If I remember correctly, DX7 still supported the original Voodoo just fine (I assume vertex and index buffers were emulated by the driver).
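That caps checking boiled down to ANDing bit flags in the device description the driver returned and picking a code path. The real structure and flag names live in the DX SDK headers (D3DDEVICEDESC, D3DTEXOPCAPS_* and friends); the sketch below uses invented mock flags purely to show the pattern:

```c
/* Mock of the DirectX caps-bits pattern: the driver fills a caps
 * struct, the game tests feature flags and picks a render path.
 * Flag names and values here are invented for illustration; the
 * real ones are in the DX SDK headers. */

#define CAP_MULTITEXTURE  0x01u
#define CAP_DXTC          0x02u
#define CAP_EMBM          0x04u
#define CAP_DOT3          0x08u

struct mock_caps {
    unsigned flags;
    unsigned max_simultaneous_textures;
};

/* Pick a bump-mapping path the way a late-90s engine would:
 * prefer EMBM, fall back to DOT3, else no bump mapping.
 * Returns 2 for EMBM, 1 for DOT3, 0 for none. */
int choose_bump_path(const struct mock_caps *caps)
{
    if (caps->flags & CAP_EMBM) return 2;
    if (caps->flags & CAP_DOT3) return 1;
    return 0;
}
```

This is also why "DX6 card" never meant a fixed feature list: two cards could both expose a D3D6 driver while answering these caps queries completely differently.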
 
Yeah, AFAIK every RIVA TNT (and Vanta) was 2x1.

Maybe Ryan is thinking of the Voodoo3 that Compaq shipped, which was 1x1. They eventually branded it as the 3dfx Velocity 100.
Ahh yes. I was probably confusing a Vanta with a Voodoo (or a ViRGE or a Verité; so many V's in the 1990s...).
 