32-bit colours on Voodoos

Doom, let me help you with graph reading. There is no original TNT in those charts. They're the TNT2 and TNT2 Ultra.

And yes, many claimed that the TNT was too slow to run in 32-bit color.

I always ran at 800x600x32 with my TNT 16MB, when it was available. It was slow, and sometimes hard to play, but I couldn't stand to play at a lower resolution/color depth. Btw, I didn't play Quake3 (mostly played Unreal Tournament at the time).

You cannot convince me in any way, shape, or form, that the 32-bit color of the TNT2 was unusable. I used it on the original TNT.

As for games that supported 32-bit color while I owned my TNT (fyi, I purchased a GeForce DDR shortly after release...so this is in the span of about a year...), I'll see if I can remember a few:

Unreal, Unreal Tournament, Quake2, Quake3, Half-Life (not all that easy to enable, but it's there...), GLQuake, Alien vs. Predator, Freespace 2, Descent 3.

Ones that I can remember playing that didn't support 32-bit:

Mechwarrior 2: Mercenaries (had poor 3D support period, so I didn't play this one much...I think it was DX3).
Mechwarrior 3 (overall poor game, unfortunately)
Final Fantasy 7 (Didn't matter much for this game...not much 3D)

In short, I remember playing far more games in 32-bit color with my TNT than I remember playing games in 16-bit. Again, there's no way you can convince me that there were more games supporting only 16-bit. If you had more, then it was because you owned a lot of games from the pre-TNT era.
 
Chalnoth, you were probably some of my target practice online then :LOL:
...I do have a good idea, since I ran my company in that era and did lots of testing on my end.

I disagree.
 
In Unreal Tournament? Well, I was in a clan with the original Unreal, playing with a TNT. We weren't much of a clan, but we had fun. I managed to win approximately 75%-90% of the games I played online against random players. I purchased Unreal Tournament the day it was available, and had Xan online before most anybody else.

Regardless, if you played against me, I played by the same name, and I still won most of the matches I played in UT...though the hardest thing for me was the fact that I was on a 56k modem connection. I did best with fewer players and slower games, where strategy played more of a factor than reflexes. Right now, despite having a much faster CPU and video card, I don't think I'm nearly as good when playing online (mostly because I don't play online much anymore...).
 
Colourless said:
I've made some screenshots that show what corrected sampling positions do for the Voodoo 5. I took the shots in Tribes (1).

Example 1
Normal Settings
Corrected Settings

Example 2
Normal Settings
Corrected Settings

It boggles my mind that 3dfx themselves didn't correct the sampling positions.

-Colourless

Nice improvement, Colourless 8) . Did you achieve this with the registry edit you mentioned earlier, or through your new version of GlideXP? ( http://pub43.ezboard.com/fx3dfxfrm1.showMessage?topicID=11929.topic ). If by registry edit, what lines did you add?
 
Chalnoth said:
In Unreal Tournament?

Nope, I meant other online games like Quake 2 :) but if it's UT then sure.

Regardless, if you played against me, I played by the same name, and I still won most of the matches I played in UT...

Well, I'm not going to brag here, but I do know what a winning combination is when playing online, and the more frames the better.
Occasionally I was ranked #1 in the world on NGstats out of 20,000 players back when I was playing a lot; some of those stats are old and I haven't played UT in a while... I feel for you if you were playing on a 56k modem, though you wouldn't notice the frame rate issue on a modem since your frames are capped @ 28...
My average frame rate in UT is 90-130 fps... IMO the sweet spot.
 
I did it with GlideXP.

But it's still possible to do with the normal drivers. The problem, though, is that you've got to modify 12 registry settings to do it, and each one is different. That is just a tad awkward IMO. I modified GlideXP to automate things, so now you only need to edit 1 registry setting to do it.

Here's the list of registry settings that need modifying if you want to do it manually:

FX_GLIDE_AA2_OFFSET_X0
FX_GLIDE_AA2_OFFSET_X1
FX_GLIDE_AA2_OFFSET_Y0
FX_GLIDE_AA2_OFFSET_Y1
FX_GLIDE_AA4_OFFSET_X0
FX_GLIDE_AA4_OFFSET_X1
FX_GLIDE_AA4_OFFSET_X2
FX_GLIDE_AA4_OFFSET_X3
FX_GLIDE_AA4_OFFSET_Y0
FX_GLIDE_AA4_OFFSET_Y1
FX_GLIDE_AA4_OFFSET_Y2
FX_GLIDE_AA4_OFFSET_Y3

I'm not going to give you the actual values required. That will just take me too much time to work out.
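
(For anyone who wants to script the manual route, a minimal sketch is below -- purely illustrative: it just loops over the twelve names above. The registry key path and the values are placeholders you'd have to supply yourself; no corrected offsets are given here.)

# Illustrative sketch only: batch-writes the twelve FX_GLIDE_AA*_OFFSET_* settings.
# GLIDE_KEY is a placeholder, not the real key your Glide driver reads, and the
# values you pass in are your own -- no corrected offsets are provided.
import winreg

GLIDE_KEY = r"SOFTWARE\PlaceholderGlideKey"  # placeholder: substitute the actual key

OFFSET_NAMES = [
    "FX_GLIDE_AA2_OFFSET_X0", "FX_GLIDE_AA2_OFFSET_X1",
    "FX_GLIDE_AA2_OFFSET_Y0", "FX_GLIDE_AA2_OFFSET_Y1",
    "FX_GLIDE_AA4_OFFSET_X0", "FX_GLIDE_AA4_OFFSET_X1",
    "FX_GLIDE_AA4_OFFSET_X2", "FX_GLIDE_AA4_OFFSET_X3",
    "FX_GLIDE_AA4_OFFSET_Y0", "FX_GLIDE_AA4_OFFSET_Y1",
    "FX_GLIDE_AA4_OFFSET_Y2", "FX_GLIDE_AA4_OFFSET_Y3",
]

def write_offsets(offsets):
    """offsets: dict mapping each of the twelve names to the string value you chose."""
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, GLIDE_KEY) as key:
        for name in OFFSET_NAMES:
            winreg.SetValueEx(key, name, 0, winreg.REG_SZ, offsets[name])

# Example with dummy values only: write_offsets({name: "0.0" for name in OFFSET_NAMES})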

Also, I am only involved with the drivers to the extent that I'm the GlideXP programmer. I don't have anything to do with any other aspect of the drivers.
 
Chalnoth said:
The GeForce2 GTS, which was actually available before the Voodoo5 5500, if I remember correctly, did 1280x1024x32 in many games without too much problem.

The first GTS that hit the shelves prior to the V5 in my neck of the woods was the Elsa Gladiac, which was a big steaming pile from an RMA standpoint (I know, I RMA'd three of them personally to try and get one that worked). The CLAP2 came right around the time of the V5 and was only able to muster about 82 fps in Q3 @ 1024x768x32; 1280x1024x32 wasn't a realistic option at <40 fps in most everything tried at the time. Drivers helped immensely about 2-3 months afterwards, though, with massive improvements.

And the Voodoo5 was a fair bit slower than the GF2 GTS, FSAA or no. The GTS could do 640x480x32 with FSAA just fine, sometimes 800x600x32 with FSAA. While the Voodoo5 might have been good for older games with very low fillrate requirements at the time (in particular, flight sims), I have a hard time believing it was any good for newer games, where playable framerates were only available at 640x480x32.

You should have actually bought a V5 and tried one. This would have easily dispelled the myths you are propagating here. I provided a whole slew of shots of games at 60 fps at 10x7 2xFSAA. The OGSS on the GTS wasn't even a usable feature in most titles (either by not functioning at all, bugs with overlay, or extremely poor performance) for another 4 months after the driver hack that was introduced around the 6.xx -> 7.xx transition.

The GeForce, feature-wise, was quite a bit ahead of even the Voodoo5.

So, you rushed out and got an R200 since it was quite a bit ahead of the GF3/4 "featurewise" right? :)

I won't go into detail about how humorous your V3->TNT comparisons were, as they speak for themselves. When your choice is 33 fps at 32-bit, or 70 fps in horribly banded 16-bit, versus the same or higher in substantially better 16/22-bit, I think the choice was pretty clear.

As a quick note, which video card do you think will be able to play DOOM3 at all?

The card whose maker purchased all the IP of the competing product to thwart DX8->9 driver development, that's which. The only factor that will prevent a V5 from playing Doom3 is the well-executed driver-development thwarting NVIDIA has put into place.

As far as plain technology goes, nVidia was ahead with the release of the original TNT. It supported 32-bit color, two pixels per clock (or one pixel with two textures), true trilinear filtering (only when multitexturing was disabled...with multitexturing it used the ugly MIP map dithering...), and even FSAA. FSAA was later disabled in the drivers as it was just far too slow, and I don't believe it was ever available to be forced through the drivers...only games that supported FSAA could turn it on.

It really comes down to what consumers truly want: a stack of "box side" checklists, or features/value for their games. There is something to be said about introducing a feature to help mold the future market or advocate a featureset that isn't "ready for primetime", but making consumers pay for this is a different debate altogether.

As it stands right now, the TNT with its "true trilinear" and "32 bit" were effectively useless features. A GeForce256 can't even muster 42 fps in Quake3 at 10x7x32, and this is 2 generations ahead of the feature inclusion. It starts to become debatable at the GTS level, but again is limited to certain titles and certain resolutions. The funny thing is, most people that ride the NVIDIA horse of 32-bit are self-conflicting with the FSAA-horse, as 40 fps with 32-bit is somehow a playable framerate, yet 50 fps with FSAA is ghastly and unacceptable. Consistency is key here, as self-conflicting arguments of performance->release period speak more truth than anything else.

By comparison, here's what the Voodoo5 had new to offer:
1. 32-bit color (A full year and a half late...)
2. FSAA (Very good FSAA, but it seriously screwed up textures with default settings...leading many to prefer the GF2's FSAA)
3. T-buffer (A subset of the accumulation buffer available even in the TNT)

This is another knee slapper. V5 seriously improved textures to remove moire, shimmering and aliasing like no other. 32-bit color was "right on time" as the fillrate/bandwidth could finally make it useful in the >40 fps range (Q3 was about 76fps at 10x7x32) and you could finally count the number of games with 32-bit textures or "real" support for 32-bit on more than one hand.

In general, I find your "recollection" of the past rather colorful and without any collection of facts. It goes against all published literature (with the exception of NVNEWS and ReactorCritical, go figure... the other 98% disagree), and it contains statements that could surely never have come from someone who ever used the hardware in question. So, the truth is you never owned a V5? Is that the correct assumption based on your "findings"? hehe. It totally reeks of NVNEWS forum-speak amongst a bunch of GTS owners that never even tried the card... much like what you see today concerning the Parhelia or 8500..

Cheers,
-Shark
 
Colourless, thanks for the screenshots and registry settings.

To everyone else who has hijacked this thread and turned it into yet another Nvidia -vs- 3dfx(AnyoneElse) pissing match... FUCK OFF already!

--|BRiT|
 
Sharkfood said:
You should have actually bought a V5 and tried one. This would have easily dispelled the myths you are propagating here. I provided a whole slew of shots of games at 60 fps at 10x7 2xFSAA. The OGSS on the GTS wasn't even a usable feature in most titles (either by not functioning at all, bugs with overlay, or extremely poor performance) for another 4 months after the driver hack that was introduced around the 6.xx -> 7.xx transition.

Btw, I was only considering 4x modes when I made the previous statement, sorry. Still, I doubt many games at the time with higher fillrate requirements (ex. Quake3) could do 1024x768x32 w/ 2x FSAA at >60 fps. And screenshots don't mean anything when judging performance...

The GeForce, feature-wise, was quite a bit ahead of even the Voodoo5.

So, you rushed out and got an R200 since it was quite a bit ahead of the GF3/4 "featurewise" right? :)

There's more than just features to consider when purchasing a card. I was trying to emphasize 3dfx's main problem of the era: a lack of technological advancement. They didn't do too bad in speed, but just didn't do enough in technology (though that wasn't really what buried them...poor management did that well enough...).

I won't go into detail about how humorous your V3->TNT comparisons were, as they speak for themselves. When your choice is 33 fps at 32-bit, or 70 fps in horribly banded 16-bit, versus the same or higher in substantially better 16/22-bit, I think the choice was pretty clear.

Let me just say that I actually did see a Voodoo3 in action, and didn't like the dithered 16-bit quality at all. Call it whatever you want, but since I got my TNT, I never played in 16-bit in any game that supported 32-bit.

The card whose maker purchased all the IP of the competing product to thwart DX8->9 driver development, that's which. The only factor that will prevent a V5 from playing Doom3 is the well-executed driver-development thwarting NVIDIA has put into place.

Try the lack of DOT3. DOOM3, according to JC, supports no method of rendering without bump mapping.

As it stands right now, the TNT with its "true trilinear" and "32 bit" were effectively useless features. A GeForce256 can't even muster 42 fps in Quake3 at 10x7x32, and this is 2 generations ahead of the feature inclusion.

That's a GeForce SDR, which wasn't much faster than the TNT2 Ultra...the DDR, which I had, did much better (~60 fps at launch, closer to 70-75 fps with later drivers).

The funny thing is, most people that ride the NVIDIA horse of 32-bit are self-conflicting with the FSAA-horse, as 40 fps with 32-bit is somehow a playable framerate, yet 50 fps with FSAA is ghastly and unacceptable. Consistency is key here, as self-conflicting arguments of performance->release period speak more truth than anything else.

Let me explain my situation for a moment. First of all, as far as 3D games go, back when I had my TNT I mostly played Unreal Tournament (I didn't like Quake3 at all). Unreal Tournament had the good fortune of both a more consistent framerate than Quake3 and less of a fillrate dependence. Because of this, I could play UT just fine with my TNT at 800x600x32. I was mostly CPU-limited at the time. 60+ fps just was not an option, period.

Now I have a much faster CPU, and a much faster video card, and 60+ fps is expected (for action games, anyway...). In fact, it's pretty much been this way since I got my Athlon and GeForce DDR (I now have a Ti 4200). I found that because of the performance hit, I couldn't stand to enable 4x FSAA at above 640x480x32, or 2x FSAA at above 800x600x32 on my GeForce DDR.

From my perspective, now that I could play well at 1024x768x32 in pretty much any game, I did not find it worth it to enable FSAA. The only reason I could see to enable FSAA would be for games that specifically benefitted from it, in particular racing games, flight sims, and the like. I have always said that for people who enjoy these types of games, a Voodoo5 was a good choice (not any more, but back when it was new...).

This is another knee slapper. V5 seriously improved textures to remove moire, shimmering and aliasing like no other. 32-bit color was "right on time" as the fillrate/bandwidth could finally make it useful in the >40 fps range (Q3 was about 76fps at 10x7x32) and you could finally count the number of games with 32-bit textures or "real" support for 32-bit on more than one hand.

The fact that the V5 "seriously improved textures" was a direct result of the supersampling FSAA implementation. Any video card that implements SSAA can do the same. The Voodoo5 just didn't improve the LOD to compensate by default, making the difference in aliasing more pronounced.
 
Chalnoth only goes to underline the lack of validity of commenting on the 3dfx cards without having lived with them. Having had a few, I am with Doomtrooper and Sharkfood.
3dfx's FSAA looked substantially different from the competition's. Texture aliasing did not come right back when lowering the LOD bias, and I have yet to hear from anyone who used 4xFSAA without lowering it. And the 16/22-bit color was perfectly suited to its time. You see, people like me know these things because we kept the cards.

Other than this, the performance figures quoted through this thread for that generation of cards are spuriously inflated. You wouldn't be quoting figures from more recent setups, would you?
 
I can understand Chalnoth's point now that he has clarified this.

His statement-
Very good FSAA, but it seriously screwed up textures with default settings...

further clarifies to:
The Voodoo5 just didn't improve the LOD to compensate by default, making the difference in aliasing more pronounced.

As 3dfx exposed complete and full user adjustment of LOD Bias for both Direct3D and OpenGL, they left the *user* the ability to adjust to their own tastes.. from which some form of negative shift needed to be applied when adding FSAA.

I totally prefer this method as opposed to "automagic" LOD shifts. This point has come up a number of times with Smoothvision + ATI: in early drivers, applying SV would create an LOD shift, then applying AF would also apply an LOD shift. The two together wound up with an overly aggressive LOD bias. When they finally decided to only apply an automatic shift with SV/AA, people complained that AF wasn't as "sharp" when used alone, or vice-versa... so it's lose-lose however you try to provide this. It's best to put LOD bias in the control of the USER and let them adjust to -0.75 for AA, or maybe -0.50 for AA + AF, or whatever they prefer best... as I know some sick puppies that preferred -1.25 with 2xAA on the V5, where I couldn't stand the texture aliasing at this aggressive a shift. 3dfx had the right idea: let the users do it themselves :)
 
Atari Jaguar owned SNES... but... _who cares_??
Voodoos have their place in history, and so do TNTs and GF256s. All of them had their pros and cons, and they've been argued over like a million times.
What's the point of bringing it all back once again?
 
Chalnoth said:
Sharkfood said:
The card whose maker purchased all the IP of the competing product to thwart DX8->9 driver development, that's which. The only factor that will prevent a V5 from playing Doom3 is the well-executed driver-development thwarting NVIDIA has put into place.

Try the lack of DOT3. DOOM3, according to JC, supports no method of rendering without bump mapping.

Of course the only factor that prevented V5 from supporting DOT3 was the well-executed hardware development thwarting NVIDIA has put into place. ;) Did I mention that they killed JFK as well?
 
Of course the only factor that prevented V5 from supporting DOT3 was the well-executed hardware development thwarting NVIDIA has put into place. Did I mention that they killed JFK as well?

It was a well-known fact that, for whatever reasons they may have had, nVidia was asked by MS for the 3DFX driver source so they could include a default Windows driver, and nV said no. nV also canned the real x-3dfx and wicked3d driver work as well. Now, whether that was right or wrong, or if nV's reasons were just, it's not my call to make....
 
Sharkfood said:
As 3dfx exposed complete and full user adjustment of LOD Bias for both Direct3D and OpenGL, they left the *user* the ability to adjust to their own tastes.. from which some form of negative shift needed to be applied when adding FSAA.

I totally prefer this method as opposed to "automagic" LOD shifts..

I just have to say that I prefer automatic LOD shifts, but it is definitely nice to also have custom LOD settings available (I can access custom LOD with nVidia's products through tweak programs...).

In particular, for every setting that adjusts texture filtering quality (SSAA, anisotropic), there is also a nice mathematical formula that describes how the LOD should change.
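
(To make that concrete with one common rule of thumb -- my own sketch, not a formula from nVidia's or 3dfx's drivers: with N-sample supersampling each axis is sampled roughly sqrt(N) times as densely, so the LOD can drop by about 0.5*log2(N), i.e. -0.5 for 2x and -1.0 for 4x. The -0.75 and -1.25 figures mentioned earlier in the thread are just user preferences pushed past that baseline.)

import math

def ssaa_lod_bias(samples_per_pixel, base_bias=0.0):
    # Rule-of-thumb shift for N-sample supersampling: each axis gets ~sqrt(N)
    # times the sampling density, so the mip level can drop by 0.5 * log2(N).
    return base_bias - 0.5 * math.log2(samples_per_pixel)

print(ssaa_lod_bias(2))  # -0.5 for 2x FSAA
print(ssaa_lod_bias(4))  # -1.0 for 4x FSAA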

As a quick example, the GeForce4's LOD doesn't just change when anisotropic is enabled, but the MIP boundaries actually change in appearance (not just distance).

While I truly don't know whether or not the GF4's change in MIP boundary shape is the proper selection for MIP maps under their anisotropic method, it does bring up an interesting point: that more can change than just the LOD when the texture filtering method changes.

But yes, for quirky software, custom LOD should always be available (if only to power users...).
 
jb said:
It was a well-known fact that, for whatever reasons they may have had, nVidia was asked by MS for the 3DFX driver source so they could include a default Windows driver, and nV said no. nV also canned the real x-3dfx and wicked3d driver work as well. Now, whether that was right or wrong, or if nV's reasons were just, it's not my call to make....

How does that translate into Nvidia being responsible for the V5's inability to run Doom3 when the card does not have the necessary hardware features?
 
Just to expand on what Colourless said on jitter control: all D3D display registry settings are under

\HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\Display\<adapter_number>\D3D

(for Win98, that is), and the 2x FSAA controls are as follows:

SSTH3_PRIBUFVTXOFFX_2SMPL -- for controlling the 1st sub-sample offset along the x-axis
SSTH3_PRIBUFVTXOFFY_2SMPL -- for controlling the 1st sub-sample offset along the y-axis
SSTH3_SECBUFVTXOFFX_2SMPL -- for controlling the 2nd sub-sample offset along the x-axis
SSTH3_SECBUFVTXOFFY_2SMPL -- for controlling the 2nd sub-sample offset along the y-axis

The above entries are all of string type, and take arguments of the form "<integer_part>.<fractional_part>", which control the x- or y-axis offset (i.e. 'jitter') of the respective sub-sample with respect to the macro-sample center. The valid range for an offset covers at least the [-3.0, 3.0] interval, with an effective granularity of 1/16 (i.e. 0.0625). The magnitude of the value passed corresponds to a factor of a pixel unit, i.e. a value of 1.0 denotes an offset by one pixel unit along the respective axis.

Hope the above is of some help.
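
(If it helps, here's a small illustrative sketch of writing those four 2x settings programmatically, snapping to the 1/16 granularity and [-3.0, 3.0] range described above. The adapter number and the example offsets are placeholders rather than recommended values, and Python is obviously just for illustration here.)

import winreg

# Win98 D3D key described above; the adapter number must be filled in per system.
D3D_KEY = r"System\CurrentControlSet\Services\Class\Display\{adapter}\D3D"

def snap(offset):
    # Clamp to the documented [-3.0, 3.0] interval and snap to 1/16 granularity.
    offset = max(-3.0, min(3.0, offset))
    return round(offset * 16) / 16.0

def write_2x_jitter(adapter, offsets):
    # offsets: dict of the four SSTH3_*_2SMPL names -> float jitter in pixel units.
    key_path = D3D_KEY.format(adapter=adapter)
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
        for name, value in offsets.items():
            # Values are strings of the form "<integer_part>.<fractional_part>".
            winreg.SetValueEx(key, name, 0, winreg.REG_SZ, "%.4f" % snap(value))

# Placeholder example (NOT the corrected sampling positions, just dummy numbers):
# write_2x_jitter("0000", {
#     "SSTH3_PRIBUFVTXOFFX_2SMPL": 0.0,
#     "SSTH3_PRIBUFVTXOFFY_2SMPL": 0.0,
#     "SSTH3_SECBUFVTXOFFX_2SMPL": 0.5,
#     "SSTH3_SECBUFVTXOFFY_2SMPL": 0.5,
# })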
 