Digit Life 3D Chronicles (1995-1997) article

The only thing missing would be something like the 3D Blaster VLB... heh. Unless you want them to cover stuff like the Matrox Impression Plus. It had hardware Gouraud shading!

They did get RIVA 128 wrong tho. It's not 2x1.
 
I had a 128ZX 8MB AGP card back in the day. It was my first 3D card :) The Voodoo 3 3000 16MB that replaced it was such a huge upgrade.
 
Anyone know anything about this:
"What concerns Rendition, this company suggested using a 3D accelerator for some operations (transforming and lighting) instead of a CPU. That was why some transistors of the Verite 1000 were used for an additional unit."
 
Back then, there was a lot of chatter about how Verite V1000 had triangle setup while Voodoo 1 did not. It didn't really matter at all in practice, because Voodoo 1 obliterated the V1000 almost always. No contest. But it was said occasionally that V1000 was better for low-end CPUs because it offloaded more. Voodoo 1 needed like a P2-300 to really be saturated. Of course, you can look at it from an angle of peak performance too: Voodoo1 needed more CPU simply because it could do more faster, while V1000 reached its peak rather quickly. Huge fillrate difference between the two.
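For anyone who's forgotten what "triangle setup" actually covers: it's the per-triangle math that turns three screen-space vertices into the edge slopes and parameter gradients the rasterizer walks with. Here's a minimal sketch of the gradient part for a single interpolated value; the names and structure are mine for illustration, not from any actual driver:

```c
/* Minimal sketch of screen-space triangle setup: computing the
 * constant gradients of one interpolated parameter p (e.g. a
 * color channel or 1/w) across a triangle. On Voodoo 1 the host
 * CPU did this per triangle; the Verite V1000 had an on-chip
 * unit for it. Illustrative names, not from any real driver. */
#include <stdio.h>

typedef struct { float x, y, p; } Vertex;

/* Solve the plane equation through the three vertices to get
 * dp/dx and dp/dy. Returns 0 for a degenerate triangle. */
static int setup_gradients(const Vertex *a, const Vertex *b, const Vertex *c,
                           float *dpdx, float *dpdy)
{
    /* Twice the signed area of the triangle. */
    float area2 = (b->x - a->x) * (c->y - a->y)
                - (c->x - a->x) * (b->y - a->y);
    if (area2 == 0.0f)
        return 0;

    *dpdx = ((b->p - a->p) * (c->y - a->y)
           - (c->p - a->p) * (b->y - a->y)) / area2;
    *dpdy = ((c->p - a->p) * (b->x - a->x)
           - (b->p - a->p) * (c->x - a->x)) / area2;
    return 1;
}

int main(void)
{
    Vertex a = { 10, 10, 0.0f }, b = { 110, 10, 1.0f }, c = { 10, 110, 0.0f };
    float dpdx, dpdy;
    if (setup_gradients(&a, &b, &c, &dpdx, &dpdy))
        printf("dp/dx = %f, dp/dy = %f\n", dpdx, dpdy);
    return 0;
}
```

A real setup pass does that for every interpolant (z, 1/w, the perspective-corrected texture coordinates, each color channel), for every triangle, which is exactly the per-triangle CPU load the V1000 took on-chip and the Voodoo 1 left to the host.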

Here's what the man behind the T-Buffer had to say about his Voodoo 1 chip and its geometry acceleration:
http://groups.google.com/group/comp...aphics+triangle+setup&rnum=1#89f2787ea8519e8e
(hey look, it's our own Dave Glue in there too :cool:)

Heh. You don't see engineers from IHVs chatting about their hardware on newsgroups anymore... ah well! There are lots of threads like this about the hardware of those days.

I have one about Verite V1000/V2x00 and VQuake development bookmarked:
http://groups.google.com/group/comp...0&hl=en&lr=&c2coff=1&client=firefox-a&rnum=56

The Verite chips were kinda interesting to me because they were built around a RISC design, I believe. I had a fascinating in-depth datasheet a while back.

What's really crazy though is if you look at NVIDIA's execution with hindsight. They go from the NV1 bomb to NV3 RIVA 128 and beat Voodoo Graphics. RIVA TNT is faster than Voodoo2. It was supposed to beat V2SLI, but the clock rate didn't make it there. TNT2 is faster per clock than Voodoo3, while also having serious feature advantages. GeForce 256 kills off 3dfx, basically. Not just hardware, either. Their drivers quickly became excellent, and they were the only company to take OpenGL seriously for years. They were on the fast track in a big way. ATI looked like crap in those days, really until Radeon 8500 IMO. Heck, ATI wasn't really on par until R300, when they finally got their drivers to be very decent.
 
swaaye: Voodoo 2 was at least comparable to TNT performance-wise. I'd say TNT was much closer to the Banshee. Yes, many reviews showed Q2 results, which were slightly better for TNT, but in many other games TNT was slower. I remember a review at HardwareCentral: in Q2, both cards managed over 25 FPS at all resolutions (640x480, 800x600, 1024x768), but TNT was faster. In Unreal, TNT was slower and didn't reach 25 FPS even at 640x480.
 
I had a Voodoo2, a TNT, and a Voodoo Banshee. The Voodoo2 was comparable to the TNT, but the Banshee had a lot of issues with multitexturing games, which really hurt it in some cases.

Chris
 
They did get RIVA 128 wrong tho. It's not 2x1.
Yep -- the RIVA 128 is a single-pixel/single-texel device. Sadly, it lacked sub-pixel precision, but it had the raw power. ;)

[image: NVIDIA chip block diagram]

Beautiful, isn't it!
The "3D Pixel Processor" should be the TMU.

As for S3's ViRGE, a moderate P-II would be faster in software rendering (the MMX-optimized path) in most cases. :LOL:
I miss those times...
 
I remember previews of TNT talking about the chip being clocked around 125 MHz or even higher. If it had managed to launch at that instead of 90 MHz, it would've beaten everything except V2 SLI. The chip just couldn't get there with the process tech they had; they revised the specs a few times and ended up at the 90 MHz it launched at.
http://groups.google.com/group/fido...st&q=riva+tnt+125+mhz&rnum=5#be87bad0a001914a

Remember too though, compared to Voodoo2, you could run higher than 800x600 and use 32-bit color. This was when 1024x768 started to become the norm. Matrox G200 and Banshee could run higher resolutions too but TNT had a lot more speed.
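For what it's worth, the clock difference translates almost directly into peak fillrate, since TNT's 2x1 layout stayed the same between the previews and the launch. A quick sketch, treating fillrate as pipes times clock:

```c
/* Rough linear scaling of TNT's peak fillrate with clock.
 * TNT has two pixel pipes with one TMU each (2x1); the clocks
 * are the launch clock and the figure from the early previews. */
#include <stdio.h>

int main(void)
{
    const int pipes = 2;
    const double shipped_mhz = 90.0, previewed_mhz = 125.0;

    printf("TNT @ %3.0f MHz: %.0f Mpixel/s\n",
           shipped_mhz, pipes * shipped_mhz);
    printf("TNT @ %3.0f MHz: %.0f Mpixel/s\n",
           previewed_mhz, pipes * previewed_mhz);
    return 0;
}
```

That's 180 vs 250 Mpixel/s peak, which is why the previewed part looked like a V2 SLI killer on paper.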

In Unreal, TNT was slower and didn't reach 25 FPS even at 640x480.

Well, just remember that Unreal had a really sad D3D renderer for a long time (forever?). The engine was practically built for Glide. Unreal was a sort of company breaker, because there were a bunch of cards that couldn't run the game at all, such as the V2200 and ViRGE cards. The RIVA 128 can't run it either, interestingly.
 
Well, I can say that Unreal runs very well under Glide emulation on modern hardware. ;)
 
Remember too though, compared to Voodoo2, you could run higher than 800x600 and use 32-bit color. This was when 1024x768 started to become the norm. Matrox G200 and Banshee could run higher resolutions too but TNT had a lot more speed.

Also remember that the performance hit from enabling 32-bit color was too much for the card to handle at high res, which back in those days meant 1024x768 and up.
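Some rough arithmetic on why: framebuffer traffic roughly doubles going from 16-bit to 32-bit, since the Z buffer usually widens along with the color buffer on chips of that era. A simplified sketch (one color write plus a Z read and Z write per drawn pixel, no overdraw; real traffic is higher):

```c
/* Simplified framebuffer-traffic estimate for 16- vs 32-bit color.
 * Assumes each drawn pixel costs a color write plus a Z read and
 * Z write, with Z the same width as color (a simplification --
 * e.g. TNT pairs 16-bit color with 16-bit Z and 32-bit color with
 * a 24-bit Z / 8-bit stencil buffer), and no overdraw. */
#include <stdio.h>

static double mb_per_frame(int w, int h, int color_bytes, int z_bytes)
{
    /* color write + z read + z write, per pixel */
    double bytes = (double)w * h * (color_bytes + 2 * z_bytes);
    return bytes / (1024.0 * 1024.0);
}

int main(void)
{
    int w = 1024, h = 768;
    double fps = 60.0;
    double m16 = mb_per_frame(w, h, 2, 2);  /* 16-bit color, 16-bit Z */
    double m32 = mb_per_frame(w, h, 4, 4);  /* 32-bit color, 32-bit Z */

    printf("16-bit: %.1f MB/frame, %.0f MB/s at %.0f fps\n", m16, m16 * fps, fps);
    printf("32-bit: %.1f MB/frame, %.0f MB/s at %.0f fps\n", m32, m32 * fps, fps);
    return 0;
}
```

Texture reads come on top of all that, so at 1024x768 in 32-bit the memory bus was the wall long before the pixel pipes were.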
 
The Voodoo2 was comparable to the TNT, but the Banshee had a lot of issues with multitexturing games, which really hurt it in some cases.

Chris

The Banshee only had a single TMU, unlike the V2.

PS: I had a RIVA 128ZX and the image quality wasn't all that good IMHO, certainly not as good as the V1 (Quake looked grainy, for lack of a better description).
And the Mystique was even worse, mainly because to make it fast they made sure it didn't do much besides rendering acceleration -- no bilinear filtering or practically any other feature ;) so games looked like they were running in software, apart from Tomb Raider, which looked damned good.
 
Bilinear quality on the G200 was terrible -- a coarse approximation, I guess.
But hey, that card would go all the way up to 150 MHz. :D
 
Also remember that the performance hit from enabling 32-bit color was too much for the card to handle at high res, which back in those days meant 1024x768 and up.

Yeah, I remember that. It was a nice feature nonetheless.

Didn't TNT support much higher-resolution textures too? Voodoo 1-3 were limited to 256x256. Heh. That probably didn't matter in TNT's time, however.
 
Bilinear quality on the G200 was terrible -- a coarse approximation, I guess.
But hey, that card would go all the way up to 150 MHz. :D

It had more problems than that. I actually don't remember the filtering being that bad on it (RIVA 128 has horrific filtering, and Voodoo Graphics is pretty blurry), but I think it had sub-pixel accuracy issues. The ground in Unreal under D3D will jitter and slide around. Very nasty. In UT, the lighting is all wrong unless you edit the INI file so multitexturing is off. That could be the game's fault, though.

The card had nice, clean output, though. Dithering was quite good on it. They bragged about it and called it Vibrant Color Quality.
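On the "coarse approximation" idea: one plausible way hardware bilinear ends up looking banded is the fractional blend weights being truncated to a few bits. A toy comparison of full-precision vs quantized weights -- purely illustrative, not a claim about the G200's actual datapath:

```c
/* Illustration of full-precision bilinear filtering vs. a coarse
 * version that truncates the fractional blend weights to a few
 * bits before blending. Quantized weights are one plausible cause
 * of banded-looking filtering; this is illustrative only, not a
 * description of any particular chip. */
#include <stdio.h>

static float bilerp(float t00, float t10, float t01, float t11,
                    float fx, float fy)
{
    float top = t00 + (t10 - t00) * fx;
    float bot = t01 + (t11 - t01) * fx;
    return top + (bot - top) * fy;
}

/* Snap a 0..1 fraction to n bits of precision. */
static float quantize(float f, int bits)
{
    int steps = 1 << bits;
    return (float)((int)(f * steps)) / steps;
}

int main(void)
{
    /* Four neighboring texel intensities and a sub-texel position. */
    float t00 = 0.0f, t10 = 1.0f, t01 = 0.0f, t11 = 1.0f;
    float fx = 0.37f, fy = 0.80f;

    printf("full precision: %.4f\n", bilerp(t00, t10, t01, t11, fx, fy));
    printf("2-bit weights : %.4f\n",
           bilerp(t00, t10, t01, t11, quantize(fx, 2), quantize(fy, 2)));
    return 0;
}
```

With only a few weight bits, a smooth gradient across a texel collapses into a handful of discrete steps, which reads as blocky or banded filtering on screen.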
 
The Banshee only had a single TMU, unlike the V2.

PS: I had a RIVA 128ZX and the image quality wasn't all that good IMHO, certainly not as good as the V1 (Quake looked grainy, for lack of a better description).
And the Mystique was even worse, mainly because to make it fast they made sure it didn't do much besides rendering acceleration -- no bilinear filtering or practically any other feature ;) so games looked like they were running in software, apart from Tomb Raider, which looked damned good.

I ran a Mystique 220 alongside a Voodoo1. Sometimes I actually preferred the Mystique (or at least it was an interesting contrast). Without bilinear filtering it makes a very sharp picture, whereas the Voodoo's aggressive filtering is quite blurry.
 
I don't remember bad bilinear on the G200 either. The G200 had pretty good IQ for its time, in my opinion. Only the drivers were pretty buggy... And it was the first chip equipped with the WARP engine :D
 
You had to look a bit closer...closer...CLOSER! Haha, scare ya? :D

Regarding G200 driver support, I still remember hanging around their site waiting for the first "holy" ICD driver release date, but that wouldn't happen for another year and a half.

Then I moved to Banshee. ;)
 
Yeah, there wasn't any real OpenGL driver for the G200 until well into the G400's lifetime, heh.
 