Help me remember all pre-Voodoo PC 3D accelerators.

To be fair, all the cards of that era (initially) only had their own proprietary APIs, so it was really a matter of how much effort the ISV (and/or IHV) put in.

Well that's clear. The thing is, (AFAIK) Glide was the first API that both had the functionality needed for games back then AND was easy to program for. I remember that the PVR stuff offered better image quality, as did the Verite, but they were not as easy to program for, and in the case of the Verite still very slow in comparison. Well, PVR had the disadvantage of using a different rendering approach, but that's another can of worms.

Heh, I remember discussions back then about which API was better/would survive, and we were all so sure that OGL would prevail and Glide and D3D would have to go. Ahh, the old times...
 
Sure, but what did these accelerate besides 2-3 proprietary titles, if anything?

OpenGL!

The workstation cards back then had multiple-GPU setups with separate memory for each chip.
ELSA's GLoria range, for instance.
 
Well that's clear. The thing is, (AFAIK) Glide was the first API that both had the functionality needed for games back then AND was easy to program for.
Not at all. The S3, ATI and Matrox toolkits were all pretty much identical to Glide in terms of functionality offered, and none of them were particularly tricky to code for. There were random missing bits in each bit of hardware, mind (Z buffers, blend modes, even blending itself).

The Warp 5 was an awkward beast because of the deferred rendering clashing with the prevailing paradigm. Not sure about PowerVR, they never contracted for a driver :).
 
Well that's clear. The thing is, (AFAIK) Glide was the first API that both had the functionality needed for games back then AND was easy to program for.

I'm sure Simon would step forth here, but from what I recall, PVR had their own API which, IIRC, was more or less on par with Glide.
 
I'm sure Simon would step forth here, but from what I recall, PVR had their own API which, IIRC, was more or less on par with Glide.

There were actually 2 levels of the "SGL" API.

One was a low-level API which was, indeed, on par with Glide (i.e. a wrapper would have been pretty easy, but 3Dfx allegedly got quite upset when people did that sort of thing). It accepted batches of transformed/projected triangles.

The second was a high-level display list/tree API with lighting, instancing, projection etc. This took either mesh models or, preferably, objects built out of convex pieces, as the API would also generate shadow volumes directly from those if desired. This API was actually implemented first, which was great for demos etc., but it didn't really "bolt on" to the low-level output of games.
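
A rough sketch of the shape of those two levels as seen from the application (all names and structs below are invented for illustration; they are not the real SGL entry points):

```c
/* Rough, invented sketch of the two API levels described above.
 * None of these names are the real SGL entry points. */
#include <stdio.h>
#include <stddef.h>

/* Level 1: low level, Glide-like.  The application transforms and projects
 * the geometry itself and submits batches of screen-space triangles. */
typedef struct { float x, y, z, u, v; unsigned long argb; } ScreenVertex;

static void submit_triangle_batch(const ScreenVertex *verts, size_t tri_count)
{
    /* A real driver would hand these to the chip; here we just report. */
    printf("low-level: %lu pre-transformed triangle(s) submitted\n",
           (unsigned long)tri_count);
    (void)verts;
}

/* Level 2: high level, display list / scene tree.  The application describes
 * objects (ideally convex pieces) and a tree of instances; the driver handles
 * transform, lighting and, if asked, shadow volumes built from the pieces. */
typedef struct Node {
    const char  *name;
    struct Node *first_child, *next_sibling;
} Node;

static Node *make_convex_object(const char *name)
{
    static Node pool[16];
    static int used = 0;
    Node *n = &pool[used++];          /* toy allocator, 16 nodes max */
    n->name = name;
    n->first_child = n->next_sibling = NULL;
    return n;
}

static void add_child(Node *parent, Node *child)
{
    child->next_sibling = parent->first_child;
    parent->first_child = child;
}

static void render_tree(const Node *n, int want_shadow_volumes, int depth)
{
    for (; n != NULL; n = n->next_sibling) {
        printf("high-level: %*srender '%s'%s\n", depth * 2, "", n->name,
               want_shadow_volumes ? " (+shadow volume)" : "");
        render_tree(n->first_child, want_shadow_volumes, depth + 1);
    }
}

int main(void)
{
    /* Low-level path: one already-transformed triangle. */
    ScreenVertex tri[3] = {
        {100.0f, 100.0f, 0.5f, 0.0f, 0.0f, 0xffffffff},
        {200.0f, 100.0f, 0.5f, 1.0f, 0.0f, 0xffffffff},
        {150.0f, 200.0f, 0.5f, 0.0f, 1.0f, 0xffffffff},
    };
    submit_triangle_batch(tri, 1);

    /* High-level path: a tiny scene tree of convex pieces. */
    Node *scene = make_convex_object("scene");
    add_child(scene, make_convex_object("room"));
    add_child(scene, make_convex_object("column"));
    render_tree(scene, 1 /* want shadow volumes */, 0);
    return 0;
}
```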

Simon
 
3DLabs Glint 300SX (4/94) / GiGi (95)
3DLabs Permedia (introduced: 95, released: 97)
ATi 3D Rage (aka Rage I, Rage 3D, spring 96)
ATi Rage II (end of 96)
CirrusLogic Laguna 3D (announced: 96)
Chromatic Mpact! (announced: 95)
Matrox Athena R1/R2 (94) - Gouraud shading, Z-Buffering, double-buffering
Matrox Millennium I (spring 95)
Matrox Mystique (summer 96)
Number9 Imagine 2 (aka Imagine 128 series 2, 96)
nVidia NV1 (95)
PowerVR PCX (aka PCX1, ??)
Rendition Verite 1000 (H2 96)
S3 Virge (aka Virge 325, 95/96)
S3 Virge VX (Q4 96)

and some other chips (I don't know their release dates; some of them were not released at all):

Tseng ET6000 (does it really support 3D?)
OAK Warp 5 (never released)
Stellar PixelSquirt (scan-line rendering, probably a somewhat later chip)
Silicon Motion Lynx 3D
NeoMagic MagicMedia 256AV (96 or 97)
Western Digital Tasmania 3D
Philips Trimedia TM1 (VLIW architecture)
Silicon Reality Taz 3D
TriTech Pyramid 3D (engineering samples available in summer 1996)
Ark Logic Tiger 3D (aka Ark 8100, delayed, promised for the end of 98)

some professional solutions (96 or older):

Lockheed Martin R3D/100 (Q1 96, $2,800)
ARTIST Graphics 2000 Series (96)
AccelGraphics AG300 (95)
Evans & Sutherland Freedom Graphics (announced: 95)
Oki TrianGL
Intergraph GLZ Series
Division's VPX Image Generator
Dynamic Pictures V192
APD Free Dimension
Arcobel Imagine
Synthetic Images Reality Blaster / Reality Blazer RB-1000PC
 
http://www.byte.com/art/9412/sec4/art8.htm
Stop forgetting Matrox Impression! :) I have an issue of CGW where they show a screen of 47-Tek's Sento, some accelerated Gouraud shaded 3D fighter running on the board.

Matrox Athena R1/R2 (94) - Gouraud shading, Z-Buffering, double-buffering (Impression's chip methinks)
Rendition Verite 1000 (H2 96) (definitely worthy)
Tseng ET6000 (does it really support 3D?) (no, there was supposedly a ET6100 with 3D though)
NeoMagic MagicMedia 256AV (96 or 97) (Neomagic was full of BS on this one)
 
with the exception of the Vérité which I believe came out very shortly before Voodoo, and was a very good chip, just not as fast as the Voodoo.
The Matrox Mystique wasn't a decelerator either: while it was very limited in its capabilities (no bilinear texture filtering, for instance), it completely blew the true decelerators like the original S3 ViRGE away in terms of performance (though the ViRGE was more full-featured).
 
ViRGE has substantially better image quality than Voodoo Graphics. I built a retro rig a year ago to mess with both and ran the Tomb Raider patches for both. The ViRGE has ok speed (yup!) and the color/shadowing contrast in the game is far better. Much sharper filtering than Voodoo too.

I'm pretty sure tho that ViRGE would be very bad in D3D. Games that use the S3 API run ok. I also tried Descent 2 and Terminal Velocity, both are S3D games.
 
and ran the Tomb Raider patches for both. The ViRGE has ok speed (yup!) and the color/shadowing contrast in the game is far better. Much sharper filtering than Voodoo too.
I should have done the S3 patch with TR's depth-sort renderer really. It would have been much faster at high res, since the Z buffer hurt performance too much, but it would have been harder to get working and it was done in a hurry.

I'm guessing you were using a DX or a GX though? They were a lot more efficient for perspective correction.
 
That was only 1997. I'd already been doing hardware drivers for BRender for a year or so by that point (I was still trying to get back into games after making the huge mistake of working with no contract for six months on a game). That's how I've got all these weird bits of hardware like the SMOS and the Warp 5, and I'd done another driver entirely on an emulator for a piece of hardware that I don't think made it (SPHW).

After/during the BRender work I did several bits of hardware porting and tuning on contract with S3 (and ATI; when I mentioned I knew both bits of hardware, the guys at Core suggested us to ATI for the Rage Pro port too) before going on board full time with S3 in 1998.

10 years in this industry now. I reckoned I only had five years in it when I first started. I'm still saying the same thing now...

I spent a couple of evenings making the old ATI code run on D3D for a laugh last year. Didn't get all the UI bugs out, and couldn't make the FMV work (DirectDraw 1 is a little too dead to work), but the game was still playable. Couldn't use aniso or mipmapping, unfortunately, because of the texture pages; it would have been quite a lot more work to unpage them all, so it didn't actually look that different from how it did then. Except at 1600x1200 of course. And many apologies, but I certainly can't let that code out. Not a chance.
 
ViRGE has substantially better image quality than Voodoo Graphics. I built a retro rig a year ago to mess with both and ran the Tomb Raider patches for both. The ViRGE has ok speed (yup!) and the color/shadowing contrast in the game is far better. Much sharper filtering than Voodoo too.

I'm pretty sure tho that ViRGE would be very bad in D3D. Games that use the S3 API run ok. I also tried Descent 2 and Terminal Velocity, both are S3D games.
Virge 325 and VX supported trilinear, Voodoo didn't. But Voodoo had much better 16-bit rendering due to its 24-bit rendering -> 16-bit dithering -> 4x1 postprocessing pipeline. I don't think Voodoo (Glide) had worse IQ than Virge (S3 API). Anyway, 640x480 (Voodoo) vs 320x240 (Virge)... that's such a big IQ difference that "minor" differences like texture filtering can't change the fact that Voodoo offered better IQ in most situations.
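
Roughly what that pipeline buys you: blend at higher precision internally, dither the result down to 16 bit in the framebuffer, then the 4x1 filter on the video output averages neighbouring pixels so the dither noise is smoothed back towards the original shade. A tiny sketch of the principle (dither matrix, bit depths and the sample shade are just illustrative numbers, not the real Voodoo hardware values):

```c
/* Toy illustration of "render high precision -> dither to 16 bit ->
 * 4x1 filter on output".  Dither matrix, 5-bit channel and expansion
 * are illustrative only, not the actual Voodoo hardware values. */
#include <stdio.h>

/* 2x2 ordered-dither offsets for truncating 8 bits down to 5
 * (green would get 6 bits on real RGB565 hardware). */
static const int kDither[2][2] = { {0, 4}, {6, 2} };

static unsigned char to_5bit(int c8, int x, int y)
{
    int v = c8 + kDither[y & 1][x & 1];
    if (v > 255) v = 255;
    return (unsigned char)(v >> 3);       /* keep the top 5 bits */
}

static int to_8bit(unsigned char c5)
{
    return c5 << 3;                       /* simple expansion for the demo */
}

int main(void)
{
    const int shade = 133;                /* an 8-bit level 16 bit can't hit */
    unsigned char scanline[8];
    int x;

    /* Framebuffer write: store the dithered 5-bit values (row y = 0). */
    for (x = 0; x < 8; x++)
        scanline[x] = to_5bit(shade, x, 0);

    /* Scanout: a 4x1 box filter averages neighbouring pixels, smoothing the
     * dither pattern back towards the original shade. */
    for (x = 0; x + 4 <= 8; x++) {
        int sum = 0, i;
        for (i = 0; i < 4; i++)
            sum += to_8bit(scanline[x + i]);
        printf("pixel %d: stored %d, filtered %d, original %d\n",
               x, to_8bit(scanline[x]), sum / 4, shade);
    }
    return 0;
}
```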
 
Virge 325 and VX supported trilinear, Voodoo didn't. But Voodoo had much better 16-bit rendering due to its 24-bit rendering -> 16-bit dithering -> 4x1 postprocessing pipeline. I don't think Voodoo (Glide) had worse IQ than Virge (S3 API). Anyway, 640x480 (Voodoo) vs 320x240 (Virge)... that's such a big IQ difference that "minor" differences like texture filtering can't change the fact that Voodoo offered better IQ in most situations.
Yes, but 16-bit rendering on ViRGE was no faster than 32-bit rendering, so you only used 16-bit if you were short of texture RAM. I'm not at the right machine to check now but I think the TR port always used 32-bit.

I don't think 3dfx cards up to and including Voodoo3 supported more than a 16-bit framebuffer (I remember having a laugh at nvidia's shirts at the Long Beach GDC: "Yes Chris, we know you do 16 million colours, did you have to put all of them on one shirt?")
 
Get Voodoo 2 SLI, start UnrealTournament with the envvar SSTV2_VIDEO_24BPP set, and you'll see ;)

It's a pity that the difference cannot be captured in a screenshot; the only way is a photo. Sorry for the bad quality, but I think the difference is clearly visible (the second picture shows no vertical banding artifacts).

sstv2_video_24bpp=0 (photo: sstv2_video_24bpp0.jpg)

sstv2_video_24bpp=1 (photo: sstv2_video_24bpp1.jpg)
 