WaltC said:
That's interesting, since the V5 supported 16-bit, 16/22-bit, and 32-bit 3D rendering modes....
And it had by far the best FSAA of any product at the time, among those products which had any FSAA, of course, which were very few. It also has to be remembered that when the V5 was released GPU and CPU power was fairly low compared to present standards and 640x480, 800x600, and 1024x768 resolutions were the common gaming resolutions for 3D, emphasis on the lower two. And at these resolutions FSAA was far more dramatic than it is at 1280x1024 or 1600x1200--which wasn't practical for anybody back then. As well, anisotropic filtering didn't exist to any degree yet.
The GeForce2 GTS, which was actually available before the Voodoo5 5500, if I remember correctly, did 1280x1024x32 in many games without too much problem.
And the video cards that supported FSAA (The Radeon wasn't yet released...):
1. GeForce SDR (32MB)
2. GeForce DDR (32/64MB)
3. GeForce2 MX (32MB)
4. GeForce2 GTS (32/64MB)
And the Voodoo5 was a fair bit slower than the GF2 GTS, FSAA or no. The GTS could do 640x480x32 with FSAA just fine, sometimes 800x600x32 with FSAA. While the Voodoo5 might have been good for older games with very low fillrate requirements at the time (in particular, flight sims), I have a hard time believing it was any good for newer games, where playable framerates were only available at 640x480x32.
However, as is well known, 3dfx's fatal mistake was that it took far too long for them to go from the V3 to the V5, and in that time frame nVidia almost caught up, and offered 32-bit 3D to boot (TNT2 and then GF1.) The V4 of course didn't ship until after the V5 (oddly enough.) No doubt about it, had 3dfx been well managed enough to ship the V5 6-8 months earlier than it did, 3dfx would still be around and be very much a player these days.
Almost caught up? The TNT2, except in Glide games, was quite a lot faster, and better-looking, than the Voodoo3. The GeForce, feature-wise, was quite a bit ahead of even the Voodoo5. As a quick note, which video card do you think will be able to play DOOM3 at all?
And don't forget, if 3dfx had shipped the V5 6-8 months earlier, it would have been beat by the GeForce DDR. After all, if it had shipped that much earlier, it would most certainly have been slower, and possibly even had fewer features.
3dfx said it decided to stick with .25 because it was concerned about the reliability of the .18 micron process at the time and was therefore concerned about yields. nVidia's .18 micron gamble paid off with a successful GF1 launch.
GeForce2 launch. GeForce1 was on the .22 micron process.
Now, though, it appears that nVidia's aggressive stance with regard to designing chips for manufacturing processes that have yet to be perfected may have (and I stress *may*) backfired on them with respect to the nv30 and its ability to compete with the ATI R300. In this case ATI chose the conservative 3dfx approach to process and it seems thus far to have very much worked in their favor, whereas it's possible that an immature .13 micron TSMC process is holding nVidia back now with the nv30. If that's the way it is for nVidia it's nothing if not ironic.
Every disadvantage can be turned around. For one, I don't believe for a moment that nVidia's engineers haven't used the extra time to enhance the design of their NV30 product. That is, what you're going to see is going to be better than what would have come out had the NV30 shipped this month (as nVidia usually does with its fall products). In other words, nVidia has a few more months to refine the design and make sure it's as good as it can be. In still more words, nVidia has no excuse to lose out to ATI's R300, in any area.
If you ask me, 3dfx only put out two truly good products: the Voodoo1 and Voodoo2. The original Voodoo brought 3D gaming truly alive. It was far faster, and often better-looking, than its competition at the time. 3dfx really ushered in a new paradigm in 3D rendering: one clock, one pixel. The Voodoo2 took that technology a step further with support for two textures per pixel, still at one pixel per clock.
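Just to put rough numbers on that one-pixel-per-clock model, here's a quick back-of-the-envelope calculation in C. The clock speeds are assumptions on my part (roughly 50 MHz for the original Voodoo and 90 MHz for the Voodoo2), so treat the output as ballpark figures only:

```c
/* Back-of-the-envelope fill-rate math for the "one pixel per clock" model.
 * The clock speeds below are assumptions (~50 MHz Voodoo Graphics, ~90 MHz
 * Voodoo2), so the results are ballpark figures, not exact specs. */
#include <stdio.h>

int main(void)
{
    const double voodoo1_mhz = 50.0;   /* assumed core clock */
    const double voodoo2_mhz = 90.0;   /* assumed core clock */

    /* One pixel per clock: the pixel fill rate equals the core clock. */
    printf("Voodoo1: ~%.0f Mpixels/s, ~%.0f Mtexels/s (1 TMU)\n",
           voodoo1_mhz, voodoo1_mhz * 1.0);

    /* The Voodoo2 still writes one pixel per clock, but its two TMUs can
     * apply two textures to that pixel in the same clock. */
    printf("Voodoo2: ~%.0f Mpixels/s, ~%.0f Mtexels/s (2 TMUs)\n",
           voodoo2_mhz, voodoo2_mhz * 2.0);

    return 0;
}
```

The point being that the second TMU doubled texel throughput without raising the pixel rate, which is why multitextured games saw the biggest gains on the Voodoo2.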
As far as plain technology goes, nVidia was ahead with the release of the original TNT. It supported 32-bit color, two pixels per clock (or one pixel with two textures), true trilinear filtering (only when multitexturing was disabled...with multitexturing it used the ugly MIP map dithering...), and even FSAA. FSAA was later disabled in the drivers as it was just far too slow, and I don't believe it was ever available to be forced through the drivers...only games that supported FSAA could turn it on.
Then, with the release of the TNT2, nVidia received the speed crown. The TNT2 was just plain better-looking and faster than anything else out there, particularly in most newer games that were starting to support Direct3D and OpenGL over Glide.
Once the GeForce came out, nVidia firmly cemented their leadership in the 3D market, leadership that they had kept until just a couple of months ago. The competition to the GeForce at the time was the Voodoo3 from 3dfx, the Rage Fury MAXX from ATI, and the Savage2000 from S3.
Just in terms of new features, here's what the GeForce brought to the table:
1. True trilinear filtering without a significant performance hit.
2. DOT3 bump mapping
3. Register combiners (weren't exposed in drivers until GeForce2 launch)
4. Hardware Transform & Lighting
5. Cubic environment mapping
6. Anisotropic filtering (Again, wasn't exposed in the drivers until later...and only 2-degree aniso was supported; see the sketch after this list).
7. S3TC texture compression (not available in OpenGL until around December/January after the GF's release).
8. Industry's first FSAA forceable by the driver (Enabled after 3dfx announced their form of FSAA. As you know, nVidia's had FSAA support in hardware for a while; they just apparently didn't think about forcing it through the drivers until 3dfx made their announcement).
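For the curious, here's roughly what a couple of those features look like from the application side in OpenGL. This is only a minimal sketch under some assumptions: it uses the later core texture_env_combine/DOT3 tokens and the EXT_texture_filter_anisotropic extension (on the GeForce itself, DOT3 was originally exposed through NV_register_combiners), and setup_geforce_features() is just a hypothetical helper name:

```c
/* Minimal sketch: DOT3 bump mapping via the texture_env_combine/DOT3 path
 * plus 2x anisotropic filtering, roughly as an application would request
 * them.  setup_geforce_features() is a hypothetical helper for illustration. */
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_MAX_ANISOTROPY_EXT, combine tokens */

void setup_geforce_features(GLuint normal_map)
{
    /* DOT3 bump mapping: dot the normal map with the per-vertex light
     * vector packed into the primary color, giving N.L per pixel. */
    glBindTexture(GL_TEXTURE_2D, normal_map);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_DOT3_RGB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB, GL_PRIMARY_COLOR);

    /* Anisotropic filtering: ask for 2x, the most the GeForce supported. */
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 2.0f);
}
```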
By comparison, here's what the Voodoo5 had new to offer:
1. 32-bit color (A full year and a half late...)
2. FSAA (Very good FSAA, but it seriously screwed up textures with default settings...leading many to prefer the GF2's FSAA)
3. T-buffer (A subset of the accumulation buffer available even in the TNT; see the sketch after this list)
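For reference, the accumulation-buffer technique that the T-buffer generalizes looks roughly like this in plain OpenGL: draw the scene several times with a tiny sub-pixel jitter and average the passes. This is a sketch only; draw_scene_jittered() and the jitter table are hypothetical stand-ins:

```c
/* Minimal sketch of accumulation-buffer supersampling, the technique the
 * T-buffer generalizes: draw the scene several times with a sub-pixel
 * jitter and average the passes.  draw_scene_jittered() and the jitter
 * table are hypothetical placeholders. */
#include <GL/gl.h>

extern void draw_scene_jittered(float dx, float dy);  /* hypothetical */

void render_antialiased(int samples, const float (*jitter)[2])
{
    glClear(GL_ACCUM_BUFFER_BIT);
    for (int i = 0; i < samples; ++i) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* offset the projection by a fraction of a pixel */
        draw_scene_jittered(jitter[i][0], jitter[i][1]);
        /* add 1/samples of this pass into the accumulation buffer */
        glAccum(GL_ACCUM, 1.0f / (float)samples);
    }
    glAccum(GL_RETURN, 1.0f);  /* write the averaged image back */
}
```

The T-buffer did essentially the same averaging in hardware every frame, which is also why 3dfx could pitch it for motion blur and depth-of-field effects, not just FSAA.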
The Voodoo3 was outclassed by every new video card out that fall, and was far behind the times. The dual-chip Rage Fury MAXX didn't have the performance that was promised, was plagued by poor drivers, and suffered from a chronically-unstable framerate. As a quick note, it was ATI's first and only multi-chip product for the consumer market. The Savage2000 from S3 was promised to be a "GeForce killer." I don't believe it was actually released until quite a bit after the GeForce, and it only performed well in one game: Unreal Tournament. Other than that, it was plagued by poor drivers and a generally poor hardware design. In particular, support for hardware T&L was promised, but never delivered (it wasn't active at launch...it was promised for later drivers, something that never happened).
Quite simply, while 3dfx turned the industry on its ear back with the Voodoo1 and Voodoo2, nVidia did it again with the sheer pace of their progress in later years. Once the other companies saw the amazing speed at which nVidia was advancing the industry, all but ATI pulled out for a little while. ATI stuck in there, has finally caught up with nVidia, and has had to be faster at advancing new features to do it (a 1-year new architecture cycle, as opposed to nVidia's current 18-month cycle). It seems like all of the larger graphics companies of years past are coming back for more this year. Unfortunately, Matrox has pretty much failed with their Parhelia, for 3D gaming anyway...too bad, as it has some neat technology, too.
Personally, I don't believe that we will ever again have one product that is just hugely better than other products available. Since the programmability of these processors is quickly coming to a head, there's not much left to do in terms of new features on the programming side of things. From now on, it will pretty much be a fight closer to the Intel vs. AMD drama that's been going on since AMD stopped simply cloning Intel's processors (About the time of the K5, I believe?).
Anyway, I think I sort of flew off on a tangent there. Time to stop.