Why I still use 3dfx, my own take on the 3D market.

UT2 will be out in six months and will have a simplified version of the Unreal Performance Test 2002.

Many people will be really sorry for buying this ugly GF4MX :rolleyes:
 
THANKS for this summary. It "proves" my decision to go with a GF3 Ti200 instead of waiting for a possibly faster MX 460 (the MX 460 is more expensive here than the GF3 Ti200!!).

You're welcome, but I'm not sure I "proved" much of anything. The two cards are pretty equal everywhere except the Unreal2 test, and I don't know what to draw from that. As I said, the Ti 500 has a 43% faster core than the 200 and 25% faster memory, and the framerate difference is 23%. The MX460 is 30-35% faster than the MX440, and it has a 35% memory speed advantage but only a 9% GPU speed advantage. That sounds like memory bandwidth is the issue. On the other hand, the difference between the Ti 4400 and 4600 is 10-12%, and the 4600 has a 9% faster core and 18% faster memory, so that looks a bit more like a GPU issue. Or is that really a CPU limitation being felt on the fastest cards, and would a faster CPU push that up to around 18%?
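For anyone who wants to sanity-check those ratios, here is the arithmetic spelled out (a quick Python sketch; the clock figures are the ones I'm assuming above, and retail boards may be clocked slightly differently):

# Core / effective memory clocks (MHz) as assumed in this post; retail boards may vary.
cards = {
    "GF3 Ti 200":  (175, 400),
    "GF3 Ti 500":  (250, 500),
    "GF4 Ti 4400": (275, 550),
    "GF4 Ti 4600": (300, 650),
}

def pct_faster(a, b):
    # Percentage by which card a's core and memory clocks exceed card b's.
    (ca, ma), (cb, mb) = cards[a], cards[b]
    return (ca / cb - 1) * 100, (ma / mb - 1) * 100

for a, b in [("GF3 Ti 500", "GF3 Ti 200"), ("GF4 Ti 4600", "GF4 Ti 4400")]:
    core, mem = pct_faster(a, b)
    print(f"{a} vs {b}: core +{core:.0f}%, memory +{mem:.0f}%")
# -> core +43%, memory +25% for the GF3 pair; core +9%, memory +18% for the GF4 Ti pair.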

I guess I don't know what the Unreal2 test shows. UT was such a poor benchmark because of its CPU dependence. That engine didn't utilize hardware T&L, and the new one apparently doesn't do much with the DX8 stuff, so it's hard to draw much in the way of conclusions from such a divergent approach. Or maybe it only seems divergent when compared to the oft-cited Quake engines.

In any case, I don't know why the Ti 200 has such a massive advantage over the MX460 in this test, 55-65%. The core has only a 17% fillrate advantage, and it doesn't seem its LMA could offset the 35% memory speed advantage of the MX and add anything close to that kind of margin. Maybe U2 has an awful lot of 32-bit chunks passing in and out of memory - if its LMA created a doubling of memory bandwidth efficiency (not realistic, of course), that would create something like a 55% advantage, which when coupled with the fillrate edge might explain this. This one perplexes me, but maybe someone else has an idea what's happening here.
 
Mark said:
THANKS for this summary. It "proves" my decision to go with a GF3 Ti200 instead of waiting for a possibly faster MX 460 (the MX 460 is more expensive here than the GF3 Ti200!!).

You're welcome, but I'm not sure I "proved" much of anything. The two cards are pretty equal everywhere except the Unreal2 test, and I don't know what to draw from that.


It's not about speed alone.

I feared that the GF3 Ti200 would be slower than an MX 460 in most games, but you "showed/proved" otherwise (I wanted a future-proof DX8 card, not a DX7 card, so I was willing to live with lower speed).

On top of that:

- the GF3 Ti200 is cheaper

- it has DX8 hardware

- it is highly overclockable (at least on most boards)

So in summary it is by far the better package compared to the MX 460; no wonder Nvidia has killed it.
 
The advantage is 71% at 1024x768x32 (build 856).
Probably the main reason is the higher average number of textures per pixel, around 3. Unreal 2 will require anywhere from 2 textures up to 50 passes for some particle effects (the flamethrower).

My guess is that the GF3 Ti200's quad-texture pipeline is a big advantage (see the rough pass arithmetic below the list).
Please, people, break this paradigm:
- older hardware
- older name (GF3 < GF4)
- lower peak fillrate
- lower memory bandwidth
- lower core MHz
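Here is the rough pass arithmetic (Python, purely illustrative; it assumes the GF3 really can lay down four textures in one pass via loopback and ignores bandwidth and setup costs):

from math import ceil

def passes_needed(avg_textures_per_pixel, textures_per_pass):
    # Each rendering pass can apply at most `textures_per_pass` textures,
    # so extra texture layers force the pixel through the pipeline again.
    return ceil(avg_textures_per_pixel / textures_per_pass)

avg_textures = 3  # the ~3 textures per pixel figure quoted above
print("quad-texture pipeline:", passes_needed(avg_textures, 4), "pass")    # 1
print("dual-texture pipeline:", passes_needed(avg_textures, 2), "passes")  # 2
# All else being equal, needing two passes instead of one roughly doubles
# the fillrate and framebuffer traffic spent on each pixel.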
 
Probably the main reason is the higher average number of textures per pixel, around 3. Unreal 2 will require anywhere from 2 textures up to 50 passes for some particle effects (the flamethrower). My guess is that the GF3 Ti200's quad-texture pipeline is a big advantage.

I'm not sure that the numbers bear that out. As I said, the GF4 MX and GF3 lines seem to show that within those families it's memory bandwidth that makes the difference. The GF2 shows the same thing in Anand's original article on the engine - at 1280x1024 the Ultra and Ti, with the same core speed but a 15% memory speed difference, show a 12.5% framerate difference, and the Ti and Pro, with the same memory speed but a 25% core speed difference, show a 5% framerate difference.

That doesn't rule out rendering-capability differences between families of cards, but if you look at the GF2s vs. the GF3s, both with the quad-pipeline arrangement, what do you see? The GF3 Ti 200 and the GF2 Ti both have 200MHz memory but the GF2 Ti has a 43% core speed edge, yet the Ti 200 is 50-60% faster. The GF2 Ultra and the GF3 Ti 500 have the same 250MHz core speed and the Ti 500's memory is only 9% faster, yet the Ti 500 is 80-90% faster.
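The raw paper specs make the point (a quick Python sketch, assuming the commonly quoted configurations of both parts, 4 pipes x 2 TMUs on a 128-bit DDR bus; treat the exact clocks as my assumption):

def peak_specs(core_mhz, mem_mhz_effective, pipes=4, bus_bits=128):
    fillrate_mpix = core_mhz * pipes                         # peak Mpixels/s
    bandwidth_gbs = mem_mhz_effective * bus_bits / 8 / 1000  # peak GB/s
    return fillrate_mpix, bandwidth_gbs

print("GF2 Ti:    ", peak_specs(250, 400))  # (1000 Mpix/s, 6.4 GB/s)
print("GF3 Ti 200:", peak_specs(175, 400))  # (700 Mpix/s, 6.4 GB/s)
# Identical paper bandwidth, and the GF2 Ti actually holds the fillrate edge,
# yet the Ti 200 wins by 50-60% here; the gap has to come from efficiency
# features (crossbar memory controller, Z occlusion, quad texturing per pass),
# not from raw clocks.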

Anand's conclusions include, "If these results are any indication, moving forward, GPU clock will actually play a much more important role than it has in the past. A delicate balance between GPU clock and memory clock, such as what was made possible on the GeForce3, will be ideal to obtain... There are still some questions that remain unanswered, including how effective hardware T&L actually is on slower systems." And his comment on the GF4 MXs was, "A significant disappointment is the GeForce4 MX which fails to even outperform the GeForce2 Ti 200. This is exactly why we recommend either going for the GeForce3 Ti 200, the Radeon 8500LE or waiting for the GeForce4 Ti 4200".

But this really sheds very little light on the issue. Both the brute-force GF2s and the GF4 MX's finesse fail to come close to even the dog-slow Ti 200, let alone the GF4 Ti's blistering pace. Maybe this game utilizes the single-pass quad-texturing capability of the GF3's pipelines, and maybe its version of the crossbar is much more effective here than the MX's. I suppose Epic could have designed the engine's optimal quality settings to be achieved specifically through the use of the high-end hardware available during its development. But the margin just seems too large to be explained by that alone.

And the Radeon 8500's margin over the 7500 is nearly as great as the Ti 500's over the GF2 Ti or GF4 MX460, even though the approach is different. HyperZ might well be better than the Z-handling in the GF3s, but there is no crossbar, and the 8500 can do more textures per pass than the GF3. The 7500-to-8500 comparison is even less like the GF2 case. Compared to the 7500, the 8500 has memory that is 20% faster, the 7500 has a 5% faster core clock, the 8500 has four twin-texture pipelines where the 7500 has two triple-texture pipelines, and unlike the GF2/GF3/GF4 MX comparisons, the basic memory architecture is identical. The 8500 does have an improved version of HyperZ, though. The numbers show the 8500 84% faster at 12x10 and 55% faster at 16x12, with the gap falling as resolution increases rather than rising as it does with all the nVidia cards. Perhaps that's in part a driver issue, though it could also be a memory bandwidth limitation at 16x12. It's hard to say why the difference is so pronounced, but fillrate must be part of it.

But I drift OT here. I still don't see why the GF4 MX's performance in U2 is so different from that of the Ti 200. A blend of all the things I've mentioned and some I haven't considered, I suppose. :-?
 
Ahh... I should have come back and defended my post earlier, but real life got in the way. But I will cover some points here. :LOL:
And I will try to make it short.
And did I say Suxor in my first post? Guess that is what happens when one reads VE forums too often. But I don't think I would stoop that low to say that...

And before I start, it amazes me how it passes over you guys' heads that most of the opinions I posted were rehashings of YOUR own opinions, stated all over this forum.

Dumb question: do you have any idea what the future (never-released) products from the late 3dfx included?
It does come down to what the majority really wants, doesn't it?

Sure, I am well aware of what the late 3dfx was up to and what their then-in-development products were capable of, and yeah, that tech is long old and long gone. All I ask from Nvidia is to integrate the 3dfx FSAA into their products; they now have the speed, it seems, and I don't see one detractor in the gaming community regarding them adding this feature. In fact, I believe the most-wanted feature out of the 3dfx takeover was quick integration of their FSAA routines.

And the majority? Seems the majority is stuck getting TNT2s in their systems still to this day, if not Rage chips. That was/is my main gripe. Correct me if I am wrong, but in fact, what the majority wants is for the top manufacturers to stop selling 4-5 year old tech to OEMs.

DKSuiko - Mang, I gots respect for what you post here, but you totally took my words out of context. Is it cos I put 3dfx into the topic?? ;)
Once nVidia, ATI, or any other company stops concentrating on adding features and on how fast they can push their cards, then there will be NO 3D market.

Wrong. If ATI announced that they were coming out with a new board that would rock the house and be the most compatible to date with everything, and at the same time they were going to develop perfect drivers for it the entire way, but at the cost of a year or more of true dev time, think of the hype it would cause. Sure, the Nvidia fans would start their jeering, but in the end, if ATI delivered, what a boost it would be to the market, and to them, period.
They and Nvidia will not fail because they still supply thousands of cheap TNTs etc. to the market; that is their bread and butter, not the cards that push the envelope, which are the hype they live by. They hype the latest and greatest, but still, they make their money off the cheapest they can provide. If either camp stated tomorrow that you would not see a new card until this time next year, they would NOT fail; they both have too many sellable products on the market already. That was 3dfx's failure: one product to sell while developing for 2 years.

Are you saying that had 3dfx not gone down, the PC market would not be touched by console competition and gaming would be great?
Nope, didn't say that at all; I was only using that event as a basis to start a timeline.

That long diatribe sounds like you're just trying to convince yourself over everything you've said.

Nope, only trying to convince myself over everything I have READ on here and on other gaming/hardware sites. The problem here is that you guys took me too seriously, thinking I was taking myself seriously ;)
'Cos why in the hell would I put 3dfx in the subject line? I know what happens here when one does that. No respect. ;)
And I am never dead set on my opinions :) In this post I am being dead set on most of YOUR opinions. :)

BTW, the rest of the posts after the leet-speak Suxor ones are the type of thing I was trying to incite here. Sharkfood especially brings up the best points out of what I was trying to say; I would love to see a lengthened product cycle to produce the card Sharkfood speaks of.
As well as Mark's points. Founts of information :)
But as it stands, right now, you are paying for an overpriced product that will never deliver what is promised from it. And I suspect most of you here wouldn't wait for the DEV cycle to catch up with it before buying into the hype again.

As far as it goes, hell yeah, I still enjoy my V5. If there are going to be new drivers for it, all the better, since it will make a great second-box card. Because by the time the games hit that will render the V5 obsolete, I hope to have a better card in this system. I am quickly approaching a decision on what to buy, but as Sharkfood stated, I know that as soon as I plop it into my system I will wish I had waited a little longer for something a little better, but that's the bleeding-edge game we play. I am just glad to say I was able to hold onto the V5 for as long as I did, because yes, I am one of the fools who paid over 300 bucks for the damn thing, and I don't believe in spending the price of my whole system each year on video cards alone. But for the mileage I got out of it, don't slight me one bit. :rolleyes:
 
I don't think that the V5's FSAA tech will be used by nVidia.

It's supersampling, which is slow. Multisampling is far more efficient.

They may also skip straight to a pseudo-random FSAA pattern instead of RG.
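To spell out the supersampling-vs-multisampling point, here is a toy cost model (Python; purely illustrative, not how any real chip is wired):

# Toy model of 4x AA cost on a frame of `pixels` pixels, where "shading" is the
# expensive texture/combiner work per sample.
def supersampling_shading_cost(pixels, samples=4):
    # every sub-sample gets its own full shading pass
    return pixels * samples

def multisampling_shading_cost(pixels, samples=4):
    # shade once per pixel and replicate the result to the coverage samples;
    # only depth/coverage work scales with the sample count
    return pixels

frame = 1024 * 768
print(supersampling_shading_cost(frame) / multisampling_shading_cost(frame))  # 4.0
# Both schemes still touch more framebuffer memory, but multisampling avoids
# paying the full shading cost for every sample.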
 
Stop blaming Nvidia, ATI, ImgTech et al

Why are you blaming Nvidia et al. for the fact that OEMs choose to integrate the cheapest card no matter what the intended market for that card is? Are you suggesting Nvidia et al. withdraw their cheapest cards because the OEMs should be putting a better card into gaming machines but aren't? Maybe you also blame the RAM makers for the fact that OEMs are putting out Windows XP PCs with 64MB of RAM? They should only sell 256MB sticks; at least that way OEMs would always put at least 256MB of RAM in a Windows XP PC (the bare minimum for Windows XP).

Or maybe you think that Nvidia et al. should just screw the average computer user (who plays about one 3D game in two years and has never even used their TNT2 to its fullest potential) and also screw all the people in China etc. who would have to spend half their yearly wage to get a GeForce4 MX440 instead of a TNT2 for their comps. I would agree that the prices for the GeForce2, 3 and 4 should and could be lower (while keeping the company heavily in the black), and the same for all the other cards, including the TNT2s. The fact is that if Nvidia didn't have their TNT2s, their other cards wouldn't be much cheaper, because they're mean bastards who want to make a whole lot of money. If they weren't so bastard-like, their other cards would be cheaper, but so would the TNT2s, which would be cheaper still yet very popular, especially in countries like China.

What you have here is a much more complex problem, related to a lot more than Nvidia etc. being bad companies... Solving it would require all the countries in the world and most of the companies in the world to do a lot of things they will never do. Of course, if you did solve it, you would probably solve world hunger and poverty at the same time, so there is a positive side.
 
Second reply

Correct me if I am wrong, but in fact, what the majority wants is for the top manufacturers to stop selling 4-5 year old tech to OEMs.
The majority? What do you mean by the majority? If you mean all the people who own expensive cards, then you're probably right. If you mean all the computer users in this world, or the whole population, or even all the people who play 3D games, then you're wrong. The majority of computer users don't give a damn, or want Nvidia (and others) to continue selling the cheapest cards because that's the only thing they can reasonably afford!

That was 3dfx's failure: one product to sell while developing for 2 years.
I think there were a lot more things than this, including the fact that their cards were probably the worst case of being hyped but failing to truly deliver, IMHO. However, you're right. Concentrating on a niche market tends to be a bad idea...

But as it stands, right now, you are paying for an overpriced product that will never deliver what is promised from it. And I suspect most of you here wouldn't wait for the DEV cycle to catch up with it before buying into the hype again.
Well, if you're talking about most of the people who buy the absolute best cards, you may be right. However, I reckon that if you talk about people who buy the cheaper, lower-end cards like the GF3 Ti200, Radeon 8500LE etc., you're dead wrong. Many people will be using these cards for at least 1.5 years IMHO, probably longer. Just as there are a lot more poor (and middle-class) computer users than rich ones, there are a lot more poor (and middle-class) 3D gamers than rich ones, and they will indeed buy a cheaper high-end card with the intention of using it for a long time on 3D games. It often makes more sense (is cheaper and better) than buying a series of (or two) low-end products in the same time period. Of course, there are many people who buy low-to-mid-range products, or even real low-end products, with the intention of using them in 3D games in the years to come, including many who know what it will eventually be like for them but do it because it's the most they can reasonably afford.

I'm not saying I agree with the current product cycles, but I am saying that there are a lot of people who WILL be using their card's potential even though it's fairly cutting-edge (although maybe not the best). In fact, I reckon (although not so strongly) that the majority of people who buy even the very best cards, e.g. the Ti 4600, will in fact use a lot of their potential. There are some who evaluate this [get the real best and keep it for a real long time] as their best option for buying video cards (although I tend to disagree).
 
Seems the majority is stuck getting TNT2s in their systems still to this day, if not Rage chips. That was/is my main gripe. Correct me if I am wrong, but in fact, what the majority wants is for the top manufacturers to stop selling 4-5 year old tech to OEMs.

I don't think this reflects reality. Yes, some OEMs still use TNTs and Rage 128s and even Rage Pros, but which OEMs are they? I see them mostly in ads by mom-n-pop shops, and always in cheesy $600 boxes. No one is going to do any sort of serious gaming on these things, so who cares? The big OEMs give you video card choices, and you can upgrade way beyond that sort of outdated stuff. The problem now, if there is a problem, is the GF MX cards which litter the OEM scene; some people get them thinking they are getting a card with serious gaming capability, because the specs say GeForce2 with 64MB of memory (or soon GeForce4).

They hype the latest and greatest, but still, they make their money off the cheapest they can provide. If either camp stated tomorrow that you would not see a new card until this time next year, they would NOT fail; they both have too many sellable products on the market already. That was 3dfx's failure.

Don't think so. nVidia was able to move ATi out of the mainstream/low-end OEM leadership position because of their reputation as the technology leader in the high end and the selling power that reputation brought to their name. ATi bungled through that Rage Fury/Pro/Maxx period without much applause from the press, and so lost ground. 3dfx was (unfairly, I believe) being roasted by the press at the same time, so their cards designed for the beginnings of OEM penetration never really got off the ground.

The reality is that high-end technology opinion drives the low-end market, like it or not. And nVidia discovered that at the high end it matters not whether the newest, flashiest features can be used at all in games on the market; the press and the people salivating over the new stuff will eat it up anyway. That was mostly the case during the TNT2/Voodoo3 period, when 3dfx put out a card one year after the all-conquering Voodoo2 that ran as fast as a $600 Voodoo2 SLI combo, added good 2D, and sold for $129, and that card was mostly panned by the press as old technology. Instead they went wild over a $250 card that supported 32-bit color at reasonably playable framerates when games were designed in 16-bit color and most people were playing them that way, included 32MB of expensive memory that was only required for that 32-bit color, supported 2048x2048 textures when games topped out at 256x256, supported AGP texturing which was of no practical value, and was AGP 4X compliant when that standard hadn't yet shown up on the market, but was no faster than the $179 Voodoo3 3k in 16-bit and had poorer IQ. Then it was on to hardware T&L on $300 cards, programmable shaders on $400 cards, etc. And in the meantime OEM boxes began to be filled with TNT2s and GF2 MXs.

Whether or not it was intentional, 3dfx made cards at that time that allowed you to play the games that were actually on the market (all of them, including Glide games) at playable framerate and with good image quality (including the V5's FSAA), all at reasonable prices. And that wasn't as good as nVidia's "future protection" featuresets, high prices, and six-month product cycle. That in part contributed to their demise, and taught ATi that it was now nVidia's game and had to be played by their rules. I do think nVidia has gotten a bit better now, with their attention to raw speed, memory bandwidth enhancement, IQ, and broadening of the non-3D featureset shown in the GF4 line, but the damned things still cost an arm and a leg. At least they've moved beyond $300+ GeForce256s with unutilized T&L and slow SDR memory.
 
The Demise of 3dfx

I have to tell you, I made some wrong calls myself. :oops:

A long while back (it sat on the internet for over two years, but I believe it's finally gone) I made a comment, somewhere... I don't remember where. It was right after the Voodoo 5 6000 plans were sold to Quantum 3D.

What I said was along the lines of:

'3dfx and Quantum 3D have been LONG TIME partners, so this should be no big scary shock to anyone. This, and the recent sale of the plant in Juarez, Mexico, appear to be signs of renewal. It looks like 3dfx has a plan, and I like what I'm seeing.'

Essentially, that's what I said.

About four months later, there was no more 3dfx. :eek:

Their product development was probably their downfall: pushing back release dates and press conferences, not unveiling products at certain times, or faking their products at trade shows...

So, I made a bad decision. But why do I still use 3dfx hardware? I enjoy the community feeling. Even places that fight, amongst themselves, site to site... it's still a diverse community.

Also, 3dfx was always an underdog. Everyone knew the nVidia cards could crank out frames... once nV made it onto the scene, it seemed as though it was all over for 3dfx. But the benchies weren't what I looked at. Screenshots were worthless to me as well. I had to experience it myself. I enjoyed Voodoo2s back in the day, but the 3D Prophet (GeForce 1) was extremely tempting. After finally tinkering with both, and finding that for my Quake 2 needs 3dfx played the part, I went with the Voodoo 3.
- On that old machine, at the time, the required PCI V3 2000 was less expensive than the 3D Prophet, and I preferred it over the GF1, so that was another determining factor.

But the question remains; why do I still use 3dfx TODAY?

Glide? Perhaps. But I have plans to keep using Glide well after I purchase a newer card, be it a GeForce, Omen, Radeon, or anything else out there.

Cost? It's a big factor. I've got an extremely low income... I'm a student!! I can't even afford a new CD-ROM drive right now! I've had the plan, all along, to try and make my Voodoo 5 last until fall/winter 2002. It looks like it's going to make it.

There are more and more compatibility issues, which we all know will not be resolved without official driver support or developer support. But all the games that were supposedly going to be the death of my Voodoo 5, such as Max Payne and Return to Castle Wolfenstein, run smooth as silk, with image quality to knock your socks off. Even the newest games, while slightly buggy, have no real PROBLEMS. I'm enjoying Warcraft 3 and Serious Sam 2 (without being forced to run in 3dfx compatibility mode).

I'm a quality buff, not a speed buff... and I simply find the image quality of the Voodoo 5 still superior to anything out there. Granted, the others can look just as good, or even better. But the performance-to-quality ratio provided by those cards is too poor. Even if the image quality is better, and FASTER than my Voodoo 5's, when you look at the numbers, getting a newer card to look like a V5 means taking more of a hit than you did with a Voodoo.

Another factor: I'm a fan of the 3dfx Tools. Call me old-fashioned, but I don't even like Windows XP, all for the same reason: I'm not an idiot, so don't dumb things down. For instance, ATi's old RAGE3D control panel, with the huge blue buttons... an "IMAGE QUALITY" check box or a "SPEED" check box. 3dfx Tools allows you to customize and have MUCH more control over the features and performance your card can utilize. I'm glad 3dfx didn't think their owners were fools.

I will be moving on soon, and keeping a Glide-based video card just for kicks... and there is no one simple answer to why I still have 3dfx hardware. I guess all I can say is that I love it!

NuAngel
 
Sure, I am well aware of what the late 3dfx was up to and what their then-in-development products were capable of, and yeah, that tech is long old and long gone. All I ask from Nvidia is to integrate the 3dfx FSAA into their products; they now have the speed, it seems, and I don't see one detractor in the gaming community regarding them adding this feature. In fact, I believe the most-wanted feature out of the 3dfx takeover was quick integration of their FSAA routines.

I'm not sure to what extent you are "aware" of what 3dfx actually had in their plans even before the VSA-100 release. To the best of my knowledge, the initial plan was to introduce hardware full-scene antialiasing through the M-buffer; the T-buffer and its Rotated Grid Supersampling were rather an "afterthought", an alternative implemented in the VSA-100 line when the Rampage core got more delays, feature creep, redesigns, whatever...

Anything M-buffer based, or by extension Spectre, was to have Rotated Grid Multisampling, which was to be combined with up to 128-tap anisotropic filtering, and that even in the product planned beyond that one. Apart from the NV20's inability to use RGMS with 4 samples, could you possibly point out where and why NVIDIA proposed something "that" different from what 3dfx had planned for their competing DX8-compliant product?


3dfx Tools allows you to customize and have MUCH more control over the features and performance your card can utilize.

Nu,

You really should take a closer look at the driver control panels, for both 2D and 3D, and for all vendors.

Two years on from 2000, and leaving RGSS quality aside, there's hardly anything recent cards are lacking in terms of image quality, options, configurations, etc. Rather the contrary, but then again that's pretty normal.
 
My only on-topic comment :

The only reasons someone would hang on to a Voodoo5 would be that they are addicted to its 4xAA (of which I have to say, as a matter of fact, that nothing has come close to it since), or they are out-and-out 3dfx fans, or they have no money to upgrade, or the games they continually play (and continually buy) are so severely CPU-limited that the newer generations of cards since the V5 5.5k afford them nothing next to their absolute longing for the V5 5.5k's 4xAA, in which case upgrading to a faster CPU may be the better/best choice. We still don't have any really good games with BM or PS or VS or some such.

Off-topic :

I just sold (more like got rid of) a butt-load of hardware today. I sold a P3-650, an AMD T-Bird 800, two mobos, four PC133 RAM sticks, an SB Live! sound card, a GeForce2 GTS, a plain-vanilla GeForce3, and two V5 5500s.

I kept one V5 5500. I'll probably never use it again, but I simply had to have one for keeps. Nostalgia, y'know :). The fact that it is the longest add-on card I own, and has two chips with individual fans, also influenced my decision to keep one :).
 
My son has my old V5 in his box. He is only seven, so the only 3D game he wants to play is Jedi Knight II, and it handles it adequately for his needs on a 15" monitor at 800x600.
 
Hmm... according to some sources

There was a group of 3dfx fans that was going to produce a new driver set, not just modded INFs and reg keys like 1.08,
but they were going to call it 1.09... for whatever reasons, which I won't go into here. They found that 3dfx had started on anisotropic filtering, which relates to the filtering comment above...
 
Maybe in the unified driver set they were getting ready for Rampage aniso. However, these claims that they've enabled aniso on the V5 are laughable.

On the old hardware front, was aniso ever enabled for the S4?
 