Digit Life 3D Chronicles (1995-1997) article

swaaye: Voodoo 2 was at least comparable to TNT performance-wise. I'd say TNT was much closer to Banshee. Yes, many reviews showed Quake 2 results, which were slightly better for TNT, but in many other games TNT was slower. I remember a review at HardwareCentral: in Quake 2, both cards performed over 25 FPS at all resolutions (640x480, 800x600, 1024x768), but TNT was faster. In Unreal, TNT was slower and didn't reach 25 FPS even at 640x480.

Don't forget one other very important point back then. UT had something like 60 patches before they got D3D working properly with any card, and OpenGL took almost as many patches. So those comparisons really aren't fair in my mind. UT in Glide not only worked flawlessly but also offered the best frame rate.

I remember previews of TNT talking about the chip being clocked around 125 MHz or even higher. If it had managed to launch at that, instead of 90 MHz, it would've beaten everything except V2SLI. The thing just couldn't get there though with the process tech they had. They revised the specs a few times and ended at the launched 90 MHz.

Yeah, I remember that, though. I took a HSF from a P1, cleaned it up, removed the heatsink on the TNT, cleaned the chip, superglued the P1 HSF to it, and ran it no problem at 140 MHz for 18 months before I replaced it with a TNT2 Ultra and did the same thing to it, pumping it up 30 MHz from its standard clock.
 
Remember too though, compared to Voodoo2, you could run higher than 800x600 and use 32-bit color. This was when 1024x768 started to become the norm. Matrox G200 and Banshee could run higher resolutions too but TNT had a lot more speed.

32-bit color and similar speed to Voodoo2 was the main reason I got a Riva TNT. Unfortunately, 32-bit color looked washed out and was unplayable in virtually all games. And as for speed, it felt "laggy," for lack of a better word, in FPS games (Quake, Quake 2, UT, Half-Life, etc.) compared to the V2. So I only used it for about a week before I tossed it into a drawer, where it's been collecting dust.

Also, V2 could do up to 1024x768 in SLI.

Regards,
SB
 
Silent_Buddha, that's how I remember it too.

The TNT was slower / felt laggy and looked washed out in comparison to the V2, and the 32-bit color mode was too slow to be usable in games. The TNT2 was a bit more comparable to V2-SLI in terms of color and speed.

What was nice was upgrading from V2-SLI to a V3-2000 running at 3000 speeds. It offered better picture clarity than the V2-SLI and higher resolution, while remaining almost fully backwards compatible with the immense library of Glide games.
 
I wonder if you guys are remembering some sort of gamma issues. I actually still have an old TNT card around, and it really outputs a pretty decent picture. Its 2D is even sharp. (Yeah, unbelievable!) You may have also been running the TNT through the Voodoo pass-thru, and that would hurt quality too...

But I think I do remember G200 having better image quality than the TNT I had way back then. Maybe I'll set up the TNT I have in the archive drawer and run some stuff on it again.
 
It most likely was gamma issues or different color settings. Some games didn't provide any means of adjusting gamma. It was a royal PITA to recalibrate the monitor when trying to use the TNT.

As for the TNT 2D output, it was incredibly fuzzy at 1600x1200, a little less so at 1280x1024, even on a top-of-the-line ViewSonic 21" CRT. It couldn't hold a candle to the Matrox G100.
 
There were huge differences in 2D quality between the manufacturers back then. Some used cheaper DACs and such, so the quality was much worse.
 

Don't know what issue you were having, but I never experienced any of that with my TNT card.
 
You must have somehow managed to get the best manufacturer on your TNT card then, as every TNT card I tried from the top 4 vendors was the same. The text was fuzzy and not as crisp. This extended into the TNT2 and GeForce cards too. You might not even know there was an issue unless you had a top-notch card to compare it with, like one from Matrox.

The only website I could find mentioning the effect is Google's cache of this page: http://www.maxuk.net/241mp/geforce-image-quality.html.

The issue even carried forward to the GeForce FX product line in 2004 -- http://www.beyond3d.com/content/interviews/25/2 :
In the past, 2D and filter/circuitry quality varied greatly depending on the manufacturer of the card and how far they strayed from reference component levels. Has anything been done with GeForce FX to better ensure consistent quality?

Yes, GeForce FX includes a new generation of Digital Vibrance Control (DVC) that features the ability to sharpen photographs, colored text and many elements of the GUI. This will help to maximize image quality across all GeForce FX boards. However, there will still be variation based on the quality of output filter components. NVIDIA recommends you choose a board supplier that meets your personal preferences.
 

My TNT and TNT2U were both by Creative, and neither had fuzzy text at 1600x1200. My GF2 GTS from Hercules (the one that was factory overclocked and had faster memory, which made NVIDIA mad) didn't have the issue either. Maybe I got lucky, or maybe the monitors I used were not as finicky about the signal from the cards I had.
 
Yeah, I have a Creative Labs Graphics Blaster TNT PCI. It really is not blurry. I'm only running it at 1152x864x32, but at 100 Hz I think. I was blown away, to be honest. I've been through the GeForce 256 and GeForce2 cards that put out NVblur.

I actually got this TNT brand new not too long ago. I stumbled on an eBay auction for it and a Graphics Blaster 3D (the weird Cirrus Logic thing with RAMBUS). Both were brand new, shrinkwrapped in their original boxes, for like $20 or something. Heh.

I even photographed the cards for Wikipedia. I waste way too much time on there. :)
 
Yeah, it wasn't the DAC, it was the board quality. The analog signal path was not properly designed on some cards (i.e. they made it ultra cheap).
 

Wow, can't believe someone mentioned 2D quality and TNT in the same sentence. :) Between me and the people I LAN gamed with (some preferred the TNT over the V2), not a single one out of about 35-ish Riva TNTs had anything even resembling halfway decent 2D quality.

Most of the LAN gamers in our area paired up the V2 with ATI or Matrox cards. Although a few like me used Tseng based cards and a few used S3 cards.

You must have gotten a card from an exotic card maker or something. Of all the NVIDIA cards I used, none of them had decent 2D quality until the GeForce4 series.

Regards,
SB
 
I kid you not! :) I know where you're coming from there. I remember the NVblur from the GF256 and GF2. I had an STB TNT back in their day. I went from a G200 to the TNT, and I do remember finding it not to have quite the same quality, but it was so much faster I didn't care. After that I got a Matrox G400.

The TNT I have now is a Creative Graphics Blaster TNT. There's a photo of the card above. I'm also not running thru the evil Voodoo passthru.

I photographed it! Ha! (Ignore the scan line... tried to get it at a good moment.) Yeah, so that obviously didn't work that well. But that's 1600x1200x32 at 72 Hz, my monitor's max. It's really quite impressively clear and very sharp, probably better than my G200 (really!). I'd be willing to say it's as good as a G400. The G400 is a lot faster at 2D though; the TNT was visibly redrawing at this res.


This is from my beastly 440BX rig with its fabulous 1400 MHz Pentium III-S, Vortex 2, Voodoo1 (not hooked up right now), and the TNT. Notice 3DCC running in the bottom? Who remembers that?
 
Damn, brings back memories. :) Although I skipped the P3. Went right from dual Celery 300s at some ridiculous overclock that I can't remember, on a BX-based dual-socket board, to an Athlon and the Golden Finger overclocking devices. :D

Hell, I remember lapping pennies to make sure the L2 cache modules would be in contact with the heat spreader. :D

Ah...those were the days.

Regards,
SB
 