What did you think of 3dfx?

  • Pretty good chips, but too slow to react to a rapidly changing market (Voodoo3 16-bit, anyone?)
  • Last decent product was Voodoo2 and Savage3D was faster. Nice antialiasing, though.
  • Lucky enough to get the first decent 3D board out, never knew where they were going
  • Who?

  Total voters: 182
Considering that Rampage is probably == NV30... I wouldn't be so optimistic about it.

:cry:


-----------------------
Revenge is Comic
 
Funny how the extreme 3dfx fanatics are always raving about a chip that never even made it to the market and whose technology has still never seen the light of day.

As for my own vote, I never owned a 3dfx product (almost, but wound up buying a TNT2 Ultra instead) so it'd be pointless for me to vote.
 
Randell said:
No, but with a 128-bit memory bus and decent core speed - 166/166 rather than 143/125 - it would have competed very well, as its IQ was excellent (fast 32-bit, fast trilinear, and the original S3TC+Metal combo for Unreal/UT of course)

Free trilinear (well, about 1% performance hit to be accurate) - that's pretty good for the time.

As for Metal on UT... I still think we made it so it was the best way you could possibly run the game at the time, but since I helped to write it I'm biased (as usual)

- Andy.

Bias Meter: Arsenal[x----|-----]Chelsea (Sorry, Randell ;) )
 
Funny how the extreme 3dfx fanatics are always raving about a chip that never even made it to the market and whose technology has still never seen the light of day.

And that is why I hope to God that nVidia never goes out of business. If they do, then in the year 2100 we will still be talking about the "secret 3dfx-Gigapixel-nVidia fusion NV50 that never came out".
 
I frankly hesitated between options A and B, but I finally decided to vote B.

My real answer would be something in between, though:
"3DFX had a very good engineering team, but a bad (not horrible) management team and too much self-confidence (that is, until the VSA-100 became a disaster), from both an engineering and a management POV."

BTW, I *insist again* that the NV40 is when we're gonna see GigaPixel stuff (as well as a little more 3DFX stuff, maybe, not sure).
I'd kill anyone, anytime, to get all the NV40 specs...
I know (mostly from rumors, some from nearly certain speculation) that it has 150M transistors, is manufactured on IBM 0.13, is to be released in late H1 2004, and that it'll use at least some GP tech. Besides that, well... nothing :(

Let me say that again: I'd do *anything* to get the full NV40 spec sheet. I'd even say in every single forum that the NV30 is the greatest card ever, that CineFX is bug-free, and that the NV35 can easily beat a 256-chip R390. Okay, so maybe I wouldn't do that, but you get the point...


Uttar
 
andypski said:
As for Metal on UT... I still think we made it so it was the best way you could possibly run the game at the time, but since I helped to write it I'm biased (as usual)

Having played UT extensively in all modes (software renderer on my dad's PC, Metal on my S4, Glide on my V3/V5, D3D & OpenGL on my 8500) I agree. Pity it locked up online in Metal though. I ended up trading it in for a V3 - more stable and still fast, shame about the loss of IQ.

andypski said:
Bias Meter: Arsenal[x----|-----]Chelsea (Sorry, Randell ;) )

grrr lucky lucky S.O.B's
 
Randell said:
Having played UT extensively in all modes (software renderer on my dad's PC, Metal on my S4, Glide on my V3/V5, D3D & OpenGL on my 8500) I agree. Pity it locked up online in Metal though.
I thought that got ironed out eventually?
 
Naw.. forget all that 3dfx stuff... ;)

I was totally ripped and excited for the Rendition V3000.. better known by its codename, Redline.

T&L, 4 pixel pipelines, embedded RAM, etc. etc. etc... all would have been released in 1998, had Micron not bought them out. They wanted to add geometry coprocessing to the V2200 - an external coprocessor to accelerate OpenGL games. As I recall they were even working closely with John Carmack on a proper solution.

Now those guys WOULD have taken over the world and WERE ahead of their time.. but alas... :cry:
 
OpenGL guy said:
Randell said:
Having played UT extensively in all modes (software renderer on my dad's PC, Metal on my S4, Glide on my V3/V5, D3D & OpenGL on my 8500) I agree. Pity it locked up online in Metal though.
I thought that got ironed out eventually?
We never 100% nailed it down. Since we couldn't play online from our 'office' it was hard to test :)

Playing it hard there on the LAN (we had 10 machines once, I think, the room bulged at the seams!) I never had a crash that was conclusively 'not random' - at the time my expectations of PC reliability were a lot lower than they are now (mostly because of Windows 2000, I guess).

Savage2000 was a little more temperamental than Savage4, which indicated the video card was part of the problem. My eventual conclusion was that under pathological load cases either the video or the network could miss something and hang the PCI bus - when they went like that, even SoftICE couldn't get back in. I did notice that swapping the default 3Com drivers for our network cards (they had debug msgs in them!) for more recent ones helped a lot, and those machines tended to crash less. The only other correlation was that there was miles less trouble on BX chipsets than on everything else.

But it was pretty much impossible to fault-find, we just couldn't get a reproducer.
 
Dio said:
Savage2000 was a little more temperamental than Savage4, which indicated the video card was part of the problem. My eventual conclusion was that under pathological load cases either the video or the network could miss something and hang the PCI bus - when they went like that, even SoftICE couldn't get back in.
Yeah, bus hangs are a pain!
Dio said:
I did notice that swapping the default 3Com drivers for our network cards (they had debug msgs in them!) for more recent ones helped a lot, and those machines tended to crash less. The only other correlation was that there was miles less trouble on BX chipsets than on everything else.
I had noted the same thing about the 3Com drivers when I first joined S3! I was flabbergasted that I was hitting int3's on (apparently) every packet! I quickly swapped that card into my build box to avoid troubles.
Dio said:
But it was pretty much impossible to fault-find, we just couldn't get a reproducer.
OK, I wasn't aware that the problem was a bus hang. I had worked a long time on a hang with the SavageNB that I thought might be related, but that wasn't a bus hang, so I guess not.
 
Well, all I can say is that every one of us S4 owners who played UT online at SDN locked up when using Metal. You even released a BIOS to see if that would fix it. By May 2000 I had had enough and decided to save up for a better card, and swapped the S4 for a V3 to tide me over :)
 
Well, when I saw the first Voodoo1 card I said something along the lines of: "OMG, so many chips - there's no way they'll be competitive with one-chip solutions."
I mostly based my judgement on that, and I was surprised they even lasted that long.

Edit: typo
 
Option 2.

The original Voodoo Graphics was their best product IMO. Really fast for its time, coupled with excellent IQ (again, for its time). Those first moments with GLQuake and the patched Tomb Raider are still highlights for me. By the time of the Voodoo 2, only the sheer speed (SLI, anyone? 8) ) was a reason to upgrade from a Voodoo 1.
After the introduction of the Voodoo 3 it became clear 3dfx was stagnating. At that time I was very much into the demoscene (just as a fan :) ) and several programmers were bitching about its limited featureset compared to its competitors. No doubt it was a great card to play games on, but boring to experiment with from a programmer's POV.

The Voodoo 4/5 was unfortunately late and the rest is history.

It would still be great to have 3dfx around just for the competition, but I guess most of their staff still works in this business and continues to deliver what we all need most desperately :D :
Powerful 3D cards

The progress that realtime 3D graphics has made over the last couple of years is nothing short of amazing, and I guess it will stay that way for some time to come.
 
Well, I don't think the Voodoo2 was a bad product. At the time, nobody could really compete with it, and particularly not against the Voodoo2 SLI. I don't really see why the high resolutions available at the time couldn't make up for the lack of 32-bit rendering (if I recall correctly, there were some 32-bit cards available, but they were very, very slow).

I'd say that the V2 was nearly on par with the original Voodoo in terms of how good the product was. However, it certainly wasn't as innovative.

As for the Rampage, it really is pointless to talk about it. Since the product was never finished, nobody has any idea just how good it eventually would have become. All that anybody can use to accurately judge 3dfx are the products that were actually released.

Remember that since Rampage was never finished, features could have been added or removed, and there could have been hardware bugs that prevented certain features from working, or from working properly. There could have been driver issues or CPU usage issues that prevented features from being usable. The list goes on.
 
Chalnoth, Rampage was finished, for all intents and purposes. It had taped out and was IIRC on its second hard revision. I think. Don't quote me on that.
 
Even if that is true, it still leaves open all of the problems I posted about counting on unreleased hardware, except that features may be intentionally cut or added.

How many times have we seen features that were announced, but not implemented at launch, or were never implemented problem free, or were never implemented at all?

Some quick examples off the top of my head:
1. Palettized textures on the GeForce 256 (they were supported eventually, but it took a while)
2. Hardware T&L on Savage 2000
3. Programmable FSAA on Radeon 8500 (never worked flawlessly)
4. 3D textures on the GeForce3 (apparently no support in the first chip revision, which didn't achieve wide circulation).
5. HOS on the GeForce3/4 (worked well enough, but apparently required far too much CPU power to be useful).

Beyond that, you also have to consider things such as core and memory clock speed changes before launch. One classic example is the original TNT, which was marketed as a "Voodoo2 SLI killer." This, of course, never materialized, as nVidia was forced to dramatically lower the clock speed, and the core ended up not being as efficient as nVidia originally envisioned.
 