DirectX 3/5/6 difference?

The TNT(2) and the G400 were both 2x1 cards and yet supported multitexturing.

Multitexturing is possible even with a single TMU/pipeline as long as you have two texture ports and loop-back capability. The Savage4, for example, retained the 1x1 configuration of the Savage3D but added a second texture port. The G400 has three texture ports for EMBM.
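
For anyone who never touched the old API: below is a minimal sketch of what single-pass multitexturing looked like from the application side with the DX6/7 fixed-function texture stages (a base texture modulated by a lightmap). The device and texture pointers are assumed to already exist; whether the card actually does it in one pass (2x1, or 1x1 with loop-back) is the driver's problem.

#include <d3d.h>   // DirectX 7 interfaces

// Sketch only: set up two texture stages so the base texture is modulated
// by a lightmap. "dev", "baseTex" and "lightmapTex" are assumed to have been
// created elsewhere; error handling omitted.
void SetupLightmapPass(IDirect3DDevice7* dev,
                       IDirectDrawSurface7* baseTex,
                       IDirectDrawSurface7* lightmapTex)
{
    // Stage 0: base texture * vertex diffuse
    dev->SetTexture(0, baseTex);
    dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: result of stage 0 * lightmap
    dev->SetTexture(1, lightmapTex);
    dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    // A real engine would check wMaxSimultaneousTextures in the device caps
    // (and ValidateDevice) and fall back to two passes if this isn't supported.
}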

Loop-back was (is?) also used for trilinear filtering. In general, I remember that writing some 3D code was a real mess because different cards almost inevitably yielded different results.

My memory of that period is a bit fuzzy though. I still have a collection of several hardware/register manuals for many GPUs of the period somewhere for those interested.
 
My memory of that period is a bit fuzzy though. I still have a collection of several hardware/register manuals for many GPUs of the period somewhere for those interested.

Stiletto is collecting them. We merged our archives and he's also uploading them to my site. I don't have all of them sorted yet.

I'm still interested in any datasheets or other info. I can make you an FTP account for uploading, if you're willing to share them.
 
Sure, no problem in sharing them with others. Most of what I have collected was found scouting obscure FTPs around the world so it has been available on the net one way or another.

It will take me some time, though. I remember last year I posted the Savage4 hardware manual on this very board... or was it the Savage2000?

Feel free to contact me via PM if you want me to upload all this stuff somewhere.
 
Btw... DirectX 6 bump mapping wasn't really referring to EMBM or dot3. Almost no cards could do those at the time. It usually referred to emboss bump map texture layers. I think Arx Fatalis uses the technique, but I'm not sure what else does. It tends to look blurry, a bit like detail textures.
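
For reference, a rough sketch of how emboss was usually done with the same fixed-function texture stages as above: the result is just the height map minus a copy of itself shifted toward the light, re-biased around 0.5. "dev" and "heightTex" are assumed to exist, the shifted UV set has to be computed per frame on the CPU, and support for ADDSIGNED/COMPLEMENT varied by card, so treat this as an illustration rather than gospel.

// Emboss sketch: result = 0.5 + (height(uv0) - height(uv1)), where uv1 is
// uv0 nudged slightly toward the light in texture space (done on the CPU
// when filling the vertex buffer).

// Stage 0: the height map as-is.
dev->SetTexture(0, heightTex);
dev->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
dev->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

// Stage 1: add the complement of the shifted copy with a -0.5 bias:
// current + (1 - shifted) - 0.5  ==  0.5 + (current - shifted)
dev->SetTexture(1, heightTex);
dev->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_ADDSIGNED);
dev->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE | D3DTA_COMPLEMENT);
dev->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
dev->SetTextureStageState(1, D3DTSS_TEXCOORDINDEX, 1);   // the shifted UV set

// The base texture is then modulated in (a third stage or a second pass),
// which is where the blurry, detail-texture-like look comes from.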
 
Evolva uses emboss and dot3, IIRC.
Edit: diffuse bump mapping, according to the readme.

[Evolva screenshots attached]
 
Sure, no problem in sharing them with others. Most of what I have collected was found scouting obscure FTPs around the world so it has been available on the net one way or another.

It will take me some time, though. I remember last year I posted the Savage4 hardware manual on this very board... or was it the Savage2000?

Feel free to contact me via PM if you want me to upload all this stuff somewhere.

If you have any hardware manual vlask does not have, I will be interested as well (and surprised!). Let us know. :)
 
From what I know
DirectX 3 : horrible
DirectX 5 : usable, some games used it
DirectX 6 : now you get all sorts of high-tech things, like 32-bit color, texture compression and some bump mapping modes (virtually unused), but these things aren't required. Maybe you needed stuff like working alpha blending and other little details to behave (see the sketch below).
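
To illustrate the "not required" part: a DX6 game couldn't just assume something like texture compression was there, it had to ask the device first. Here's a hedged sketch of the usual check, assuming an already-created IDirect3DDevice3 named "dev":

// Enumerate the device's texture pixel formats and look for the DXT1 FourCC.
// If it isn't there, the game simply loads uncompressed textures instead.
static HRESULT CALLBACK FindDXT1(DDPIXELFORMAT* fmt, void* ctx)
{
    if ((fmt->dwFlags & DDPF_FOURCC) &&
        fmt->dwFourCC == MAKEFOURCC('D', 'X', 'T', '1'))
    {
        *static_cast<bool*>(ctx) = true;
        return D3DENUMRET_CANCEL;   // found it, stop enumerating
    }
    return D3DENUMRET_OK;
}

bool DeviceSupportsDXT1(IDirect3DDevice3* dev)
{
    bool found = false;
    dev->EnumTextureFormats(FindDXT1, &found);
    return found;
}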

The difference between D3D 3 and 5 was not features but the API itself, and it's funny that D3D12 vs. D3D11 will be almost the same case, but the other way around...

It's funny because the way you render stuff in Direct3D 3 is much more similar to the way D3D12 will work, and back in the day people hated it because it was too complex compared to OpenGL. So the main improvement of D3D 5.0 (vs 3.0) was the introduction of the DrawPrimitive commands, which made the API much more similar to OpenGL.

Previously, Direct3D 3.0 used execute buffers: they were built and then executed, and you never drew directly to the device, similar in spirit to how new modern APIs like Mantle or D3D12 are designed (build command buffers, then execute them).
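
Roughly, and with the caveat that the names below are from memory of the DX5/7-era headers rather than anything authoritative, the contrast looked like this ("dev" and the vertex data are assumed):

// Direct3D 5.0+ immediate style: hand the device some vertices and it draws.
D3DTLVERTEX tri[3] = { /* three pre-transformed, screen-space vertices */ };
dev->DrawPrimitive(D3DPT_TRIANGLELIST, D3DFVF_TLVERTEX, tri, 3, 0);

// Direct3D 3.0 execute-buffer style (paraphrased, the real structs are much
// fussier). Nothing is drawn until the recorded buffer is handed back:
//   1. CreateExecuteBuffer()  - allocate room for vertices + opcodes
//   2. Lock(), copy the vertices in, append D3DOP_PROCESSVERTICES and
//      D3DOP_TRIANGLE instructions, Unlock()
//   3. SetExecuteData()       - say where the vertices end and opcodes begin
//   4. device->Execute(buffer, viewport, D3DEXECUTE_CLIPPED)
// Conceptually the same "record commands, then submit" model that Mantle and
// D3D12 command lists bring back.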

Of course, in the 90s the CPU did a lot of the work in the pre-T&L era (since vertex processing was done on the CPU), so the cost of API and driver overhead was minimal. Also, you didn't need to use more than one core/thread, since there were no multicore CPUs.
 
I remember Matrox was first with EMBM support in their G400 card. I wanted to play DM2 with EMBM enabled, but none of the nVidia or ATi cards supported this technique at the time, and the G400 was too expensive and too glitchy with most games; besides, it was slow.
The G400 Max was neither slow nor glitchy, and it had much better image processing quality as well as better output circuitry quality than the competition. In fact, I would rather have had a G400 Max than a Geforce 256 DDR.

The G400 was way ahead of its time.
 
Yeah, the Geforce 256 had some troubles. Mainly that many had cheaply designed boards that were blurry, and they had a DXTC bug that caused image quality issues. Faster than the G400 Max without a doubt, though, and with superior driver support.

Actually, in retrospect I think the Voodoo3/4/5 were underappreciated. Glide was still quite useful back then; UT, Deus Ex, etc. ran best with it. The Voodoo3's 16-bit limit wasn't much of an issue in reality back then, and the Voodoo3 through 5 had excellent signal quality and a super fast GUI.
 
The Voodoo3 was a good value for gamers, but it was holding the industry back, mostly with its small texture size limit, despite what 3dfx was promising developers.
 
At that time (late '99) Quake 3 was freshly released and everybody was quick to jump on the new performance bandwagon, and the GF256 DDR just happened to be pretty much the only way to cross the 60fps barrier at 1024x768 in the latest id hit title, even in 32-bit color. This had a huge marketing effect, certainly more than the UT performance metrics.
 
Yeah, the Geforce 256 had some troubles. Mainly that many had cheaply designed boards that were blurry, and they had a DXTC bug that caused image quality issues. Faster than the G400 Max without a doubt, though, and with superior driver support.

Actually, in retrospect I think the Voodoo3/4/5 were underappreciated. Glide was still quite useful back then; UT, Deus Ex, etc. ran best with it. The Voodoo3's 16-bit limit wasn't much of an issue in reality back then, and the Voodoo3 through 5 had excellent signal quality and a super fast GUI.
I agree about the Voodoo5 being underrated, but the GeForce 256DDR was overrated in my opinion because it had bad filtering quality and the DXTC bug in addition to not having superb signal quality like 3dfx and Matrox did. Nvidia's first good architecture was G80 in my opinion.
 
G80 was quite the shake up no doubt. But there were some great NV moments before that with clear advantages over the competition. Radeon DDR and Radeon 8500 were not exactly amazing from a hardware or software standpoint.
 
At least you guys weren't one of the poor saps who bought the GeForce SDR boards, only to find out three months later that the DDR boards came out and greatly increased performance :-[ I cut so many yards for that card.
 
G80 was quite the shake up no doubt. But there were some great NV moments before that with clear advantages over the competition. Radeon DDR and Radeon 8500 were not exactly amazing from a hardware or software standpoint.

I loved my Radeon 8500 DDR! The hardware was good to very good, offering more advanced shaders, dual-monitor support, 'tessellation' and a few more features, but the drivers were bad to mediocre most of the time.
 
I loved my Radeon 8500 DDR! The hardware was good to very good, offering more advanced shaders, dual-monitor support, 'tessellation' and a few more features, but the drivers were bad to mediocre most of the time.

Yeah, but it had problems. Its AF quality was atrocious, in addition to being super duper angle dependent. It didn't work right with some KT266-KT333 boards. It had PS1.4, sure, but that turned out to be pointless because the card wasn't efficient at it (still blown away by the GF4 in Doom 3), and most games used PS 1.1-1.3 anyway. Truform tessellation was almost totally useless. The DVI barely worked. The Geforce4 stomped the Radeon 8500, and that's why I got an 8500 for $90. lol

And then there were the drivers... With the driver issues that card had, I wonder how broken the GPU was. I remember the ugly'n'slow antialiasing would get broken, fixed, broken, on and on.
 
Yeah, but it had problems. Its AF quality was atrocious, in addition to being super duper angle dependent. It didn't work right with some KT266-KT333 boards. It had PS1.4, sure, but that turned out to be pointless because the card wasn't efficient at it (still blown away by the GF4 in Doom 3), and most games used PS 1.1-1.3 anyway. Truform tessellation was almost totally useless. The DVI barely worked. The Geforce4 stomped the Radeon 8500, and that's why I got an 8500 for $90. lol

And then there were the drivers... With the driver issues that card had, I wonder how broken the GPU was. I remember the ugly'n'slow antialiasing would get broken, fixed, broken, on and on.

The bolded part is the reason why this card stayed in my memory as a good product! I moved on to the R200 before the Gf4Ti era, and having had the pleasure of owning a Gf3Ti200 before it, the card impressed me in quite a few titles. I was also lucky to own the card at a time when most of the major performance improvements came from drivers; OpenGL especially went from very poor to good and usable, with great speed in the Quake games. Obviously nVidia had the upper hand in the Gf4Ti generation, but then we all know what happened (hint: NV30 and R300).
 