32-bit colours on Voodoos

The drivers contain a number of registry settings that allow you to change the sample positions for FSAA. These need to be used to adjust the sample position in Glide and OpenGL.

However, it can be done automatically with just a single registry setting in Direct3D. The drivers have a registry setting to change the centre of the pixel. By default, the sample point for a pixel in Direct3D (with the 3dfx drivers at least) is assumed to be the exact centre of the pixel, (0.5,0.5). The registry setting allows you to set the centre of the pixel to (0,0). Because of the additional (0.5,0.5) bias caused by FSAA, setting the pixel centre to (0,0) reduces the blur substantially.
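
To put some rough numbers on that, here is a minimal sketch (in Python) of how the two settings interact. The (0.5,0.5) FSAA bias and the simple additive way it combines with the pixel centre are taken from the description above, not from the actual driver internals, so treat this as an illustration only:

Code:
# Toy model of the claim above: the point where a pixel's texture gets
# sampled is (pixel centre assumed by Direct3D) + (extra bias from FSAA).
# Both values and the additive model are assumptions from the post, not
# measured driver behaviour.

FSAA_BIAS = (0.5, 0.5)           # extra shift the post attributes to FSAA
IDEAL_SAMPLE_POINT = (0.5, 0.5)  # texture filtering expects the pixel centre

def effective_sample_point(pixel_centre):
    """Where sampling ends up, in pixel-local coordinates."""
    return (pixel_centre[0] + FSAA_BIAS[0], pixel_centre[1] + FSAA_BIAS[1])

def blur_offset(pixel_centre):
    """How far (in pixels) the sample lands from the ideal point."""
    ex, ey = effective_sample_point(pixel_centre)
    return (ex - IDEAL_SAMPLE_POINT[0], ey - IDEAL_SAMPLE_POINT[1])

print(blur_offset((0.5, 0.5)))  # default centre -> (0.5, 0.5): half a pixel off
print(blur_offset((0.0, 0.0)))  # registry tweak -> (0.0, 0.0): bias cancelled

With the default centre the samples land half a pixel away from where the texture filtering expects them, which is exactly the kind of uniform blur people complain about; moving the centre to (0,0) cancels it.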

I should probably create some screenshots showing the difference. UT is a good example where it helps.
 
Colourless said:
I should probably create some screenshots showing the difference. UT is a good example where it helps.

Yes. Please do. Please create a page/post detailing the settings that need to be changed and the visual impact they have.

--|BRiT|
 
Colourless:
I assume you're talking about places where texels are aligned to pixels (like text or maybe HUDs).
I don't see how it could affect blur in other places.
 
I disagree, I could definitely tell the difference when I bought a GF2 to replace my Voodoo3. The GF2's video was blurry and washed-out looking.
I had to do the filter mod to make it look acceptable. Filters make a big difference. I'm not trying to insult you, but if you can't hear the difference between a tube amp and a solid state (digital) amp, then I suggest a hearing check :)

DemoCoder said:
Uh, no. The voodoo cards had a weird gamma setting. 90% of this "brighter and more saturated color" stuff comes from that. You can easily see it by putting a V5 and GF2 together and using a tweak-tool to set the GF2's gamma to be the same as the V5's. (Some of the powerstrip tools actually had a "voodoo gamma" setting.)

Most of the 2D arguments going around nowadays are purely subjective. It's like CD vs SACD vs LP, or audiophile arguments. Unless you put an oscilloscope on the output, you're not likely to see the effects of these so-called "cheap filters". Nor will you hear the difference between a tube amp and a digital amp on some newfangled audio card (but people will attest to it).

Yes, I'm sure if you put different cards side by side, they look different. But before you start talking about filters, you'd better equalize the driver settings. If you guys saw someone comparing a card with -2 LOD bias vs a card with 0, or 16x aniso vs 2x aniso, you'd say it was unfair, but very few of the people who talk about 2D deal with the gamma issue or the color profile issue. They happily post screenshots from different cards and assume they are an accurate comparison everywhere.


All of these IQ comparisons should use equal settings and double-blind testing scenarios.
 
Hmm, my only direct comparison on the same monitor came from a clan mate who took out his V5 and put in a GF2 Pro and hated the IQ of the GF2 Pro, but stuck with it for speed reasons. He is not in any way, shape or form a 3D techie, just a gamer, but his initial reaction on seeing the GF2 in action was "ugh".

It may have been the gamma settings, I don't know.
 
I agree, my Radeon's colors were much deeper and richer than my GF3's. My GF3 always has that slightly faded, washed-out look no matter how you adjust the contrast, brightness and gamma settings. The newer drivers are even worse on the fadedness factor. My two cents.
 
If you are looking at screenshots from another card on a PC and think the colors look much brighter and more vibrant than what you are using, you can match it (for anything 2D). If your vid card looks "faded" then you should calibrate both your monitor and your vid card. I actually think that my Gainward GF2 (listing which nV chip is on your graphics card is useless when comparing them; you need to list the manufacturer) is too bright by default (not as bright as a Radeon, however), and I utilize a gamma setting of 0.60 and then adjust the contrast, brightness and color channels to get it right where I like it. The ATi Radeon boards suffer from quite a bit of color bleeding out of the box and a too-bright display IMO, although with calibration I can make them look just how I want. 3dfx's boards, all of them that they made, are horrendously too high on the gamma settings IMO. They can be calibrated back down to match the others, however.

I like to use the AV guideline: when you are looking at a black screen you shouldn't be able to tell if your display is on or off. Overly bright settings are a marketing trick that has been used by TV manufacturers for years; they correctly think that a typical consumer will be attracted to a brighter picture, no matter how bad it is in terms of color separation/saturation etc. Many vid card manufacturers use the same trick.
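
For anyone curious what a gamma setting like that 0.60 actually does to the output, here's a minimal sketch of the kind of lookup table a driver gamma slider builds. The output = input ** (1/gamma) convention is an assumption (it's the common one), not a statement about any particular driver:

Code:
# Minimal sketch of a driver-style gamma lookup table, assuming the common
# convention output = input ** (1 / gamma).  Whether any given driver uses
# exactly this formula is an assumption, not a verified fact.

def gamma_lut(gamma, size=256):
    """Build an 8-bit lookup table for the given gamma setting."""
    return [round(255 * (i / 255) ** (1.0 / gamma)) for i in range(size)]

neutral = gamma_lut(1.0)   # identity: values pass through unchanged
darker  = gamma_lut(0.60)  # gamma below 1.0 pulls the midtones down

# Mid-grey (128) stays at 128 with gamma 1.0 but drops to roughly 81 at 0.60,
# which is why a lower gamma setting tames an overly bright default picture.
print(neutral[128], darker[128])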
 
No, the clueless are impressed by high-contrast, high-saturation images. Bright alone is not enough.
I have never had a problem with color saturation, but have found the need to calibrate colors.
 
Forget it, I found the problem... it's my monitor. I switched monitors and it's less vibrant now, but it still looked better than the GeForce card. :)
 
Colourless said:
The drivers contain a number of registry settings that allow you to change the sample positions for FSAA. These need to be used to adjust the sample position in Glide and OpenGL.

However, it can be done automatically with just a single registry setting in Direct3D. The drivers have a registry setting to change the centre of the pixel. By default, the sample point for a pixel in Direct3D (with the 3dfx drivers at least) is assumed to be the exact centre of the pixel, (0.5,0.5). The registry setting allows you to set the centre of the pixel to (0,0). Because of the additional (0.5,0.5) bias caused by FSAA, setting the pixel centre to (0,0) reduces the blur substantially.

I should probably create some screenshots showing the difference. UT is a good example where it helps.

Screenshots would be nice ;).

The x3dfx-community drivers (1.08.04) have a toggle for this in the 3dfx tools: FSAA Jitter Control. I get a bit confused when comparing what you said with what the x3dfx-tool help-text says:

x3dfx-tools said:
FSAA Jitter Control - When either X2 FSAA or X4 FSAA are selected, you can use FSAA Jitter Control to further change and alter the level of Anti-Aliasing that is applied to Full-Scene Anti-Aliasing. This is achieved by the Jitter Control function toggling the secondary Anti-Alias buffer according to either the X and/or Y coordinates. This is an extremely useful function, as now users can change their level of Anti-Aliasing to meet the visual needs of their games. In some instances, with the correct settings, a user can have X2 FSAA selected, and almost achieve the same effect as X4 FSAA without the performance hit that would occur with X4 FSAA.
-X +X - Setting this value to a negative number will toggle the X buffer left of the default (0.0) plane. Setting this value to a Positive number will toggle the X buffer right of the default (0.0) plane.
-Y +Y - Setting this value to a negative number will toggle the Y buffer left of the default (0.0) plane. Setting this value to a Positive number will toggle the Y buffer right of the default (0.0) plane.

NOTE: Once toggled, the default settings for jitter control must be manually set to the values below to resort to the default FSAA values:
X2 FSAA X-Axis - (-0.50)
X2 FSAA Y-Axis - (-0.50)
X4 FSAA X-Axis - (0.0)
X4 FSAA Y-Axis - (-0.50)

But maybe the x3dfx-community lads have changed the default from 0.5 to 0.0 already. I'm not sure if the help-text is from some 3dfx documentation the community has access to, or if they have written it themselves. I get confused easily ;).
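
Reading the quoted help text literally (which may well not be what the driver actually does), the jitter values just offset where the secondary AA buffer samples relative to the primary one. Here's a tiny sketch of that reading, with the quoted defaults plugged in, purely as an illustration:

Code:
# Literal reading of the quoted help text: the primary AA buffer samples at
# the pixel origin and the secondary buffer is offset by the jitter values.
# This is only my interpretation of the help text, not verified driver
# behaviour; the defaults below are the ones quoted in the NOTE above.

def sample_positions(jitter_x, jitter_y):
    """Primary and secondary buffer sample offsets, in pixel units."""
    primary = (0.0, 0.0)
    secondary = (jitter_x, jitter_y)
    return primary, secondary

# Quoted 2x FSAA defaults: secondary buffer half a pixel off in both X and Y.
print(sample_positions(-0.50, -0.50))
# Quoted 4x FSAA defaults: no X offset, half a pixel offset in Y.
print(sample_positions(0.0, -0.50))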
 
Avoid all community drivers. They do not have the slightest clue as to what they are actually doing when they mess with all those registry settings. Most of the time their settings are not what they say they are, or the effect of the settings is the exact opposite to what they say.
 
So if I were to change the registry settings manually, which setting should I change and what should I add? (I browsed through the registry but didn't find anything obvious to do with jitter or with defining 0.0 values.) I'm using the 1.04.01b drivers right now.

thanx / Fafner
 
Galilee said:
Both my GF3 and GF4 have/had better image quality than my old Voodoo5. No question about it. Sharper and better colors.
Every time I think about my V5 I only get pictures of dithering and 16-bit color in my head :( What a lousy card. Besides good AA, it didn't really have anything.

That's interesting, since the V5 supported 16-bit, 16/22-bit, and 32-bit 3D rendering modes.... ;) And it had by far the best FSAA of any product at the time, among those products which had any FSAA, of course, which were very few. It also has to be remembered that when the V5 was released GPU and CPU power was fairly low compared to present standards, and 640x480, 800x600, and 1024x768 were the common gaming resolutions for 3D, emphasis on the lower two. And at these resolutions FSAA was far more dramatic than it is at 1280x1024 or 1600x1200--which wasn't practical for anybody back then. As well, anisotropic filtering didn't exist to any degree yet.

Now, the V3 was 16-bit and 16/22-bit only, which maybe is what you're thinking of. But when I bought my V3 I was using the current nVidia-based card at the time the V3 was released--a TNT, believe it or not (an original TNT). The V3 actually blew it away completely--no comparison. Oh, I remember well arguing with some people who said that 10 fps at 32 bits was *great* and who took umbrage when I said I didn't believe them. I still don't believe that any 3D gamer at the time would have preferred a TNT to a V3--heck, at the time I was using my TNT I was still using my V2 alongside it, as I didn't judge the TNT to even rate replacing my V2. The V3 was a *great* card for the time it was released--far, far better than the TNT.

However, as is well known, 3dfx's fatal mistake was that it took far too long for them to go from the V3 to the V5, and in that time frame nVidia almost caught up, and offered 32-bit 3D to boot (TNT2 and then GF1). The V4 of course didn't ship until after the V5 (oddly enough). No doubt about it, had 3dfx been well managed enough to ship the V5 6-8 months earlier than it did, 3dfx would still be around and be very much a player these days.

What's currently happening with the nv30 and the R300 is kind of interesting. When 3dfx announced it would use a .25 micron process for the VSA-100, nVidia of course aggressively pushed for the newer process at the time--.18 microns. 3dfx said it decided to stick with .25 because it was concerned about the reliability of the .18 micron process at the time and was therefore concerned about yields. nVidia's .18 micron gamble paid off with a successful GF1 launch.

Now, though, it appears that nVidia's aggressive stance with regard to designing chips for manufacturing processes that have yet to be perfected may have (and I stress *may*) backfired on them with respect to the nv30 and its ability to compete with the ATI R300. In this case ATI chose the conservative 3dfx approach to process and it seems thus far to have very much worked in their favor, whereas it's possible that an immature .13 micron TSMC process is holding nVidia back now with the nv30. If that's the way it is for nVidia it's nothing if not ironic.
 
Colourless said:
Avoid all community drivers. They do not have the slightest clue as to what they are actually doing when they mess with all those registry settings. Most of the time their settings are not what they say they are, or the effect of the settings is the exact opposite to what they say.

Unfortunately, you are correct. Their driver sets aren't well assembled... Weren't you part of their project, rolling in your WinXP Glide ports?
 
WaltC said:
That's interesting, since the V5 supported 16-bit, 16/22-bit, and 32-bit 3D rendering modes.... ;) And it had by far the best FSAA of any product at the time, among those products which had any FSAA, of course, which were very few. It also has to be remembered that when the V5 was released GPU and CPU power was fairly low compared to present standards, and 640x480, 800x600, and 1024x768 were the common gaming resolutions for 3D, emphasis on the lower two. And at these resolutions FSAA was far more dramatic than it is at 1280x1024 or 1600x1200--which wasn't practical for anybody back then. As well, anisotropic filtering didn't exist to any degree yet.

The GeForce2 GTS, which was actually available before the Voodoo5 5500, if I remember correctly, did 1280x1024x32 in many games without too much problem.

And the video cards that supported FSAA (The Radeon wasn't yet released...):
1. GeForce SDR (32MB)
2. GeForce DDR (32/64MB)
3. GeForce2 MX (32MB)
4. GeForce2 GTS (32/64MB)

And the Voodoo5 was a fair bit slower than the GF2 GTS, FSAA or no. The GTS could do 640x480x32 with FSAA just fine, sometimes 800x600x32 with FSAA. While the Voodoo5 might have been good for older games with very low fillrate requirements at the time (in particular, flight sims), I have a hard time believing it was any good for newer games, where playable framerates were only available at 640x480x32.

However, as is well known, 3dfx's fatal mistake was that it took far too long for them to go from the V3 to the V5, and in that time frame nVidia almost caught up, and offered 32-bit 3D to boot (TNT2 and then GF1). The V4 of course didn't ship until after the V5 (oddly enough). No doubt about it, had 3dfx been well managed enough to ship the V5 6-8 months earlier than it did, 3dfx would still be around and be very much a player these days.

Almost caught up? The TNT2, except in Glide games, was quite a lot faster, and better-looking, than the Voodoo3. The GeForce, feature-wise, was quite a bit ahead of even the Voodoo5. As a quick note, which video card do you think will be able to play DOOM3 at all?

And don't forget, if 3dfx had shipped the V5 6-8 months earlier, it would have been beat by the GeForce DDR. After all, if it had shipped that much earlier, it would most certainly have been slower, and possibly even had fewer features.

3dfx said it decided to stick with .25 because it was concerned about the reliability of the .18 micron process at the time and was therefore concerned about yields. nVidia's .18 micron gamble paid off with a successful GF1 launch.

GeForce2 launch. GeForce1 was on the .22 micron process.

Now, though, it appears that nVidia's aggressive stance with regard to designing chips for manufacturing processes that have yet to be perfected may have (and I stress *may*) backfired on them with respect to the nv30 and its ability to compete with the ATI R300. In this case ATI chose the conservative 3dfx approach to process and it seems thus far to have very much worked in their favor, whereas it's possible that an immature .13 micron TSMC process is holding nVidia back now with the nv30. If that's the way it is for nVidia it's nothing if not ironic.

Every disadvantage can be turned around. For one, I don't believe for a moment that nVidia's engineers haven't used the extra time made available to enhance the design of their NV30 product. That is, what you're going to see is going to be better than what would have come out if the NV30 was to be shipped this month (as nVidia usually does with their fall product). In other words, nVidia has a few more months of refining their design to make sure that it's as good as it can be. In still more words, nVidia has no excuse to lose out to ATI's R300, in any area.

If you ask me, 3dfx only put out two truly good products, the Voodoo1 and Voodoo2. The original Voodoo brought 3D gaming truly alive. It was far faster, and often better-looking, than its competition at the time. 3dfx really ushered in a new paradigm in 3D rendering: one clock, one pixel. The Voodoo2 brought that technology a step further with support for two textures per pixel, and one pixel per clock.

As far as plain technology goes, nVidia was ahead with the release of the original TNT. It supported 32-bit color, two pixels per clock (or one pixel with two textures), true trilinear filtering (only when multitexturing was disabled...with multitexturing it used the ugly MIP map dithering...), and even FSAA. FSAA was later disabled in the drivers as it was just far too slow, and I don't believe it was ever available to be forced through the drivers...only games that supported FSAA could turn it on.

Then, with the release of the TNT2, nVidia received the speed crown. The TNT2 was just plain better-looking and faster than anything else out there, particularly in most newer games that were starting to support Direct3D and OpenGL over Glide.

Once the GeForce came out, nVidia firmly cemented their leadership in the 3D market, leadership that they had kept until just a couple of months ago. The competition to the GeForce at the time was the Voodoo3 from 3dfx, the Rage Fury MAXX from ATI, and the Savage2000 from S3.

Just in terms of new features, here's what the GeForce brought to the table:
1. True trilinear filtering without a significant performance hit.
2. DOT3 bump mapping
3. Register combiners (weren't exposed in drivers until GeForce2 launch)
4. Hardware Transform & Lighting
5. Cubic environment mapping
6. Anisotropic filtering (Again, wasn't exposed in drivers until later...and only 2-degree aniso was supported).
7. S3TC texture compression (not available in OpenGL until around the December-January after the GF's release).
8. Industry's first FSAA forceable by the driver (Enabled after 3dfx announced their form of FSAA. As you know, nVidia had FSAA support in the hardware for a while; they just apparently didn't think about forcing it through the drivers until 3dfx made their announcement).

By comparison, here's what the Voodoo5 had new to offer:
1. 32-bit color (A full year and a half late...)
2. FSAA (Very good FSAA, but it seriously screwed up textures with default settings...leading many to prefer the GF2's FSAA)
3. T-buffer (A subset of the accumulation buffer available even in the TNT)

The Voodoo3 was outclassed by every new video card out that fall, and was far behind the times. The dual-chip Rage Fury MAXX didn't have the performance that was promised, was plagued by poor drivers, and suffered from a chronically unstable framerate. As a quick note, it was ATI's first and only multi-chip product for the consumer market. The Savage2000 from S3 was promised to be a "GeForce killer." I don't believe it was actually released until quite a bit after the GeForce, and it only performed well in one game: Unreal Tournament. Other than that, it was plagued by poor drivers and a generally poor hardware design. In particular, support for hardware T&L was promised but never delivered (it wasn't active at launch...but was promised with later drivers, something that never occurred).

Quite simply, while 3dfx turned the industry on its ear back with the Voodoo1 and Voodoo2, nVidia did it again with their sheer magnitude of progress in later years. Once the other companies saw the amazing speed at which nVidia was advancing the industry, all but ATI pulled out for a little while. ATI stuck in there, and has finally caught up with nVidia, and has had to be faster at advancing new features to do it (a 1-year new-architecture cycle, as opposed to nVidia's current 18-month new-architecture cycle). It seems like all of the larger graphics companies of years past are coming back for more this year. Unfortunately, Matrox has pretty much failed with their Parhelia, for 3D gaming anyway...too bad, as it has some neat technology, too.

Personally, I don't believe that we will ever again have one product that is just hugely better than other products available. Since the programmability of these processors is quickly coming to a head, there's not much left to do in terms of new features on the programming side of things. From now on, it will pretty much be a fight closer to the Intel vs. AMD drama that's been going on since AMD stopped simply cloning Intel's processors (About the time of the K5, I believe?).

Anyway, I think I sort of flew off on a tangent there. Time to stop.
 
Chalnoth said:
Then, with the release of the TNT2, nVidia received the speed crown. The TNT2 was just plain better-looking and faster than anything else out there, particularly in most newer games that were starting to support Direct3D and OpenGL over Glide.

I don't think so, Chalnoth... and 16-bit on the TNT2s was brutal; since most games at that time didn't even support 32-bit, it wasn't a big deal at all.

D3D game benchmark: [attached graph image014.gif]


32-bit Trilinear is getting a whole 40 FPS on Quake 3 :LOL:

[attached benchmark graphs: image024.gif, image020.gif]


It was not a viable feature, Chalnoth. The TNT was getting 30 FPS with 32-bit... Yippee, can you say slideshow in a real gaming environment?
You are doing a good job using nVidia's old marketing crap from that era though... ahh, the good old days.
 
Doomtrooper said:
It was not a viable feature, Chalnoth. The TNT was getting 30 FPS with 32-bit... Yippee, can you say slideshow in a real gaming environment?
You are doing a good job using nVidia's old marketing crap from that era though... ahh, the good old days.

You just showed it at 40 fps.

And I'd beg to differ on most games not supporting 32-bit at the time. I'm pretty certain that the popular games supported 32-bit just fine. I only remember owning a very few games that didn't support 32-bit color...of which two come to mind, Final Fantasy 7 (which didn't matter as very little was actually 3D anyway), and Mechwarrior 3.

And all of the benchmarked games certainly supported 32-bit color, including: Quake2, Quake3, Unreal Tournament, Expendable, Descent 3.
 
Let me help with some graph reading: you will notice the TNT is getting 30 fps and the Ultra is getting 40 fps.

Chalnoth, yes, please list all the games supporting 32-bit in that era; that should keep you busy for a while.

The TNTs had to run 16-bit just like the Voodoos, Chalnoth, in most real game environments; the performance hit was just too great.
Except the Voodoo's post-filter made the TNT's 16-bit look like MUD. The only time the IQ was better was in 32-bit, which wasn't usable... as proven above.
 