Why I still use 3dfx: my own take on the 3D market.

Riptides

Newcomer
Okay, this is just a layman's take on the whole mess with the market today. Please take this as mostly fun, so poke at it as you will, correct me where I am wrong, and hopefully my points are well made and thought out. Thanks :rolleyes:
===============================================

Most V5 users realize that the shelf life/usability of the Voodoo series is quickly approaching its end, but if you are like me, this is still one of the very best cards I have ever purchased in my time of buying video cards. After upgrading beyond the 1 GHz range, I have found the V5 scaled rather nicely, much more than I expected. This was one of the selling points of the card when it was released: that it would continue to be a viable card when faster processors came out, and even more so if it had any sort of driver support. :D

Even though there will NEVER be any NEW drivers for the card (optimized and system-specific changes are all that's left to be made to the now-aged x3dfx final), I still feel that I will never RETIRE my card. Even when I remove it from my main system, it will still live on in my secondary, for my TNT2 Ultra is getting very long in the tooth.

As far as Nvidia goes, kudos to them for keeping their release schedules and making some of the fastest cards available, but until they use the 3dfx tech they acquired, namely the FSAA routines, I do not feel that they have in any way EARNED my business.
Not with their monkey-assed driver-swap routine that one must constantly play depending on the game one wants to play. Yes, I had a GeForce GTS before I got the V5, and I just got PLAIN tired of it, and I still see this going on with their drivers to this day.

Until Nvidia stops concentrating on adding features that won't be used until 2 years from now, and on how fast they can push their cards, and starts concentrating on stability and compatibility, they have in NO way EARNED my business. Even though they have worked more towards this, I still see Nvidia as overHYPED and OVERpriced for what it is they offer. Why is it that almost all computer components continue to drop in price as new tech comes along, but Nvidia continually increases PRICE through their releases?

But my biggest, and it's ONE BIG :devilish:, gripe with Nvidia these days is how they are pumping out newer and faster cards than the market (namely game devs) can even begin to keep up with, yet at the same time they still manufacture the TNT lines for OEM products. This continually assures that game makers will still produce games to the lowest common denominator, and we will NOT see the new features they tout in the new products until 2-4 years down the road. Nvidia is slowly tightening their own noose by doing this, constantly making their own tech obsolete every 6 months while at the same time selling more of their old tech as OEM.

Anybody who is anybody, who keeps up with video gaming news, has to realize by now that ATI has come leaps and bounds with their driver development. While still not perfected, they are getting close. Shame on all of you who thought ATI should have perfect drivers right off the bat. These days one cannot argue that ATI creates SuXoR drivers anymore, and when comparing, one must realize that ATI is almost on par with Nvidia as far as drivers go. And by offering a cheaper product with the same speed, they are well poised to take over a lot of gamers' business. If they keep this up, ATI will soon be my next choice in video cards, but they still need to get things worked out decently enough for me to want to support them. *Stop disabling features in your drivers, dammit :cry:* But if my V5 fails me tomorrow and I am forced to make a NEW purchase, I feel that ATI has done more to earn my support and business than Nvidia has, just by coming out honest with that one statement that they KNEW their drivers were lacking and that they were working on a viable FIX; almost a year later, it seems to be paying off.

But then again *sighs* :cry:, ATI is still trying to do the same as Nvidia: continually release new products, making their old ones obsolete, while the whole time pumping their oldest chipsets (*ATI RAGE*) into the OEM market. Until this shitty trend ends, no one is going to see the uber graphics that we should be seeing, considering the range of cards available from BOTH Nvidia and ATI.

Lastly, I am not one to hold out hope for phantom products. Right now it seems that the Kyro's continuation is in question (*well, development as it stands*), but it will probably work out for them. Still, how long until another product? Long enough to lose more customers to a fan-boi stance? Maybe.
And how will it fare considering the product cycle that Nvidia created and now ATI follows as well?

Bitboys? Jeez, I think they were being laughed about back in the Voodoo2 days. Sorry, but farting in a small room :eek: and calling it a great idea does not a video card make. :oops:
And until I see something more than a backwater story regarding their ALMOST card, I just will not entertain the idea that such a thing exists.

SiS (shitty integrated shit?): these guys could do something great, but probably won't. Reading their specs, it seems they are just creating another fair-to-middling product, nothing to blow anyone's skirt up. Swapping certain rendering processes off to the main processor is, well, redundant compared with what's available out there today.

Matrox, the little card that almost does, if you listen to the rumor mill. It seems like they are always ready to release the NEXT-gen 3D accelerator, but only if you listen to the fans, which I do not blame Matrox for. In the end it does hurt the company a bit, but also, what does this tell you about everyone's feelings regarding there being only 2 main players in the 3D biz?

It's not that I want any of these companies to fail, but this is where the reality lies: they (the 2 main players) are slowly choking the market to death with their products and how they implement them these days. The rest are either happy in their niche markets (*Matrox*), struggling along (*SiS and Kyro*), or just plain do not exist (*Bitboys*).
In my opinion, I want the market to settle down. I want companies to spend more time working on and perfecting next-gen releases, PHASE out all chipsets older than 3 years, and create some sort of standard. Until then, I feel the market will continue as it has been since the demise of 3dfx, which is slowly stagnating. Look how much the console market has pulled developers away from making games for the PC market over the past 2 years.
How good was last year for gaming? Not very.
And even though there are some good contenders this year, it still seems nothing OUTSTANDING has been released or will be released this year that takes advantage of new 3D tech in any way. ;)
 
Dumb question: do you have any idea what the future (never released) products from the late 3dfx included?

Those kinds of conversations are really getting old. Although NVIDIA will most likely make use of their tech/patents some time in the future, they don't "rely" on it, and it doesn't determine their future either. They would still be market leaders, 3dfx tech or not.

What really makes a huge difference in this business is execution. Any company that sat back right now and deliberately slowed things down would automatically be committing suicide. As to where and why, I leave it to you to figure out. Some sales statistics from the past few years could help here.

It does come down to what the majority really wants, doesn't it?
 
SiS (shitty integrated shit?): these guys could do something great, but probably won't. Reading their specs, it seems they are just creating another fair-to-middling product, nothing to blow anyone's skirt up. Swapping certain rendering processes off to the main processor is, well, redundant compared with what's available out there today.

At least they seem to be smarter than Nvidia. The SiS 330 is the chip the GF4 MX series should have been. Including pixel shaders and leaving out vertex shaders is IMO by far smarter than vice versa.
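
(A rough back-of-the-envelope sketch of why that reasoning holds, in Python. The vertex count, resolution and overdraw figures below are assumptions chosen for illustration, not taken from any benchmark; the point is only that per-frame pixel work dwarfs per-frame vertex work, so a CPU fallback is plausible for missing vertex shaders but not for missing pixel shaders.)

```python
# Rough comparison of per-frame vertex work vs per-frame pixel work.
# All numbers below are illustrative assumptions, not measurements.

scene_vertices = 50_000        # assumed vertices transformed per frame
screen_pixels  = 1024 * 768    # pixels at a typical resolution of the time
overdraw       = 2.5           # assumed average overdraw factor
target_fps     = 60

vertex_ops_per_sec = scene_vertices * target_fps
pixel_ops_per_sec  = screen_pixels * overdraw * target_fps

print(f"vertex shader invocations/s: {vertex_ops_per_sec:,}")     # ~3 million
print(f"pixel shader invocations/s:  {pixel_ops_per_sec:,.0f}")   # ~118 million
print(f"ratio: {pixel_ops_per_sec / vertex_ops_per_sec:.0f}x")    # roughly 40x

# The pixel workload dwarfs the vertex workload, which is why a driver/CPU
# fallback for missing vertex shaders is plausible while a CPU fallback for
# missing pixel shaders is not.
```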
 

Rip,

Until Nvidia stops concentrating on adding features that won't be used until 2 years from now, and on how fast they can push their cards, and starts concentrating on stability and compatibility, they have in NO way EARNED my business.

Once nVidia, ATI, or any other company stops concentrating on adding features and on how fast they can push their cards, there will be NO 3D market. Stability and compatibility? They're not concentrating on that? The goal is to push new features, push the speed, and at the same time push stability and compatibility. It's not one or the other; the goal is to get them all.

Until then, I feel the market will continue as it has been since the demise of 3dfx, which is slowly stagnating. Look how much the console market has pulled developers away from making games for the PC market over the past 2 years. How good was last year for gaming? Not very.

Are you saying that had 3Dfx not gone down, the PC market would not be touched by console competition and gaming would be great? Hardly. 3Dfx, nVidia, ATI, or anyone else's video cards will never make or break gaming. That responsibility lies in the developers' hands alone. So if you're going to rant about gaming, video card companies are not the target you should be aiming at.

And even though there are some good contenders this year, it still seems nothing OUTSTANDING has been released or will be released this year that takes advantage of new 3D tech in any way.

Well, if the 3D market does what you want it to, settles down and stops concentrating on new features and better performance, then there will never be a new and outstanding product.

-dksuiko
 
Even though there will NEVER be any NEW drivers for the card (optimized and system-specific changes are all that's left to be made to the now-aged x3dfx final), I still feel that I will never RETIRE my card. Even when I remove it from my main system, it will still live on in my secondary, for my TNT2 Ultra is getting very long in the tooth.

Wrong! 3Dfx Underground is building a new driver, 1.09.00, with a totally re-written OpenGL ICD (1.2 compliant).

http://3dfxunderground.cjb.net
 
That long diatribe sounds like you're just trying to convince yourself of everything you've said.

Just get whatever fits your needs and budget and play games. That's what it's all about, not dissing a company you don't like because you think they're blah, blah, and blah.

I'm not going to even argue with any of the points you brought up, because it looks like you're dead set on your opinions. dksuiko said it best anyways. Enjoy your V5. :)

Edit: That whole "PC gaming is declining" bit is total BS. I've heard it every year for the past 5 years, always the naysayers and pessimists just dying to cry out the doom of gaming. Sorry, bub: it's not dying or declining, and consoles have nothing to do with PC gaming. Perhaps you're just jaded and need to take a deep breath, relax and take a vacation, or maybe stop gaming altogether.
 
One question: what does SuXoR stand for? I see it quite a bit and I still don't have a clue what it means.
 
Say what you will about nVidia's business tactics, but it seems to me they have been a benefit to the online hardware enthusiast community. They started releasing beta drivers when they saw we liked the leaks, their PR department answers numerous emails from every website, big or small, and if they follow through with that internet server patent, I think nV will have a very solid base of loyal customers using their gaming servers.
 
Matt said:
That whole "PC gaming is declining" bit is total BS. I've heard it every year for the past 5 years, always the naysayers and pessimists just dying to cry out the doom of gaming.

Exactly. We're still a long way from rendering Star Wars Episode II or Lord of the Rings in real time on our cards, but getting there (albeit incrementally) with new features and generations is what people here enjoy. Odd, but true. :p

Sure, I might never see much greatness from the old T&L on my old GeForce DDR, but the card has been good to me anyway thanks to its greater performance (over my Voodoo2). I paid for progress that I will not see before Unreal 2 (which, BTW, I will enjoy on my next card). And I guess some GF3 owners will have shifted to an R300 or NV30 before a DX8 game is out.

So you pay for progress, but the return just comes later than you expect[ed]. That's okay with me, because new cards not only deliver new features but also greater performance in the games that are out now.

Regards, LeStoffer
 
This isn't really in line with the topic... But on a related note, how many people who bought the original GF3 can honestly say that they got a lot of mileage out of it, despite the lack of DX8 titles?

Seriously... I realize the Ti 4600 can beat the crap out of it in several meaningful ways, but (for me, almost 12 months later) there isn't one single title out that makes this card feel old or outdated...

In fact, based on the early numbers we've seen from the Unreal2 Performance Test (and recalling the lowest-common-denominator factor), there won't be a title to stress this thing out for some time to come...

It's funny, because even 12 months ago, I was very hesitant about getting the card, knowing full well just how long it would take for *any* title to take any significant advantage of the hardware...

But despite it all, the performance of the thing will wind up carrying you a good 2 years, as I don't actually expect any software to make you think twice until Doom3...

It's just kinda funny when you think back to this time last year...

So, even when these guys try their best to get developers to jump on the <fill in the blank> bandwagon, and it takes a couple of generations to happen (if it happens at all), you can still say it was a good investment, provided that the sheer performance of the thing, along with any other advancements (i.e. FSAA, advanced filtering, etc.), gave you a better 3D experience.
 
3DFX

I have to agree with Rip. I'm still using my Voodoo4. It runs all the games I play just fine. The main game I play is WWIIO, and I get equivalent frames, sometimes slightly more or less, than similar systems running any GeForce MX or GeForce 2.

I've never been a bleeding-edger; I don't have the spare dough. I do have 2 kids, a wife, a mortgage, life insurance, etc... but I am a hobbyist.

What I really want is good value, good price, good performance and compatibility, with a whole bunch of tweaking options.

GeForce 4 is listing at $600 here in Canada
GeForce 4 MX: $200
Radeon 8500: $269
Radeon 7500: $179

Obviously the Radeons are the better deal, but in the game I play (i.e. WWIIO) a 7500 is no better or faster than my Voodoo4, and Radeons have serious issues with the game to boot!!

If I had to buy today it would be a Kyro II at $150; it will eat up any MX. But I'll wait, hoping that Creative and 3Dlabs can pull something off...

Slaterat
 
...how many people who bought the original GF3 can honestly say that they got a lot of mileage out of it, despite the lack of DX8 titles? ...the performance of the thing will wind up carrying you a good 2 years, as I don't actually expect any software to make you think twice until Doom3... ...you can still say it was a good investment, provided that the sheer performance of the thing, along with any other advancements (i.e. FSAA, advanced filtering, etc.), gave you a better 3D experience.

I think it's ironic that the game you cite as the one that might finally make the GF3 seem inadequate is likely the first real DX8 title, and what was the big selling point of the GF3 in the first place? You see people saying, "get a GF3 Ti 200 over an MX460, no question, because of the DX8 stuff," but does it really make sense to buy the slowest, least feature-laden DX8 card at the end of that generation's run at the top, just because of a DX8 capability that might not do you much good when the good, DX8-optimized games actually appear? Seems like that same old nVidia 32-bit color / large texture / T&L future-proofing argument of the 3dfx era to me.

What probably turns out to have been the most meaningful new feature of the GF3 was Lightspeed, which allowed that card, with a 200MHz core and 230MHz DDR memory, to achieve notably higher framerates than a GF2 Ultra clocked at 250/230, particularly at higher resolutions. But did it make sense to pay around $400 for that, when you could have paid half that or less by waiting six months and buying a Ti 200 with memory probably capable of running at the same speed? Unless you were stuck with a GF256 SDR card or something slower for that period, you didn't lose too much, even with games like Giants, B&W, and Max Payne. And the drivers for that new card had a chance to mature.
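
(To see why Lightspeed rather than raw memory speed has to be the explanation, here is a minimal sketch of the peak-bandwidth arithmetic, assuming both boards use a 128-bit DDR bus and the 230MHz memory clocks quoted above.)

```python
# Minimal sketch of the raw-bandwidth arithmetic behind the GF3 vs GF2 Ultra
# comparison; assumes a 128-bit DDR bus on both boards, with the memory clocks
# quoted in the post.

def raw_bandwidth_gbs(mem_clock_mhz: float, bus_bits: int = 128, ddr: bool = True) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    transfers_per_sec = mem_clock_mhz * 1e6 * (2 if ddr else 1)
    return transfers_per_sec * (bus_bits / 8) / 1e9

gf2_ultra = raw_bandwidth_gbs(230)   # 250MHz core, 230MHz DDR memory
gf3       = raw_bandwidth_gbs(230)   # 200MHz core, 230MHz DDR memory

print(f"GF2 Ultra: {gf2_ultra:.1f} GB/s, GF3: {gf3:.1f} GB/s")   # both ~7.4 GB/s
# The raw peak bandwidth is identical, so any high-resolution lead the GF3 shows
# has to come from using that bandwidth more efficiently (crossbar memory
# controller, Z-compression, occlusion culling), not from faster memory.
```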

And those savings mean you could potentially make the next upgrade earlier - when the GF4 4400 drops to a reasonable price or maybe even when the 4200 comes out - instead of hanging onto that GF3 for another generation and then maybe blowing another $400. I just don't think buying the fastest new-gen card from nVidia at the $400 they charge ever makes much sense, no matter how you put it.
 
Usually I say get a GF3 Ti 200 (or 8500LE) because of the quad-stage texture pipeline and the higher sustained polygon throughput of the T&L pipeline.
It will help a lot with Doom3, Unreal 2 and UT2 in six months :)

Then, to simplify the recommendation, people say "get a DX8 card," but not necessarily because of the nice DX8 features.
 
But on a related note, how many people who bought the original GF3 can honestly say that they got a lot of mileage out of it, despite the lack of DX8 titles?

This is where, it seems, I part company with a lot of folks. It simply didn't have the 'mileage' potential I was hoping for from the moment the antistatic bag was removed and the card slammed into an AGP slot, which meant an immediate greedy desire for the next product batch.

Same goes for the 8500 (albeit a notch better), and I'm looking into the 330MHz-core GF4s now in hopes of one of them getting me there.

What do I want? I want the best IQ, playable and fantastic framerates, compatibility and stability for *all* of my games (not just a select few OGL and/or D3D titles) and future support in the form of drivers to maintain this balance with the same static set of titles.

Basically, I want the V5's FSAA, the GF3's anisotropy, and framerates that meet or exceed 60 fps in almost all games. Flight sims and non-action-based adventure crawlers can be half that, but you get the idea.

I'm less interested in what will run Doom3 or Unreal 2 a year from now than in what will run the stack of CDs I have sitting on my desk right here and now. When I throw on anisotropy and 4x AA and games stagger and sputter below 20 fps, it's a little discouraging.

Also, when I throw on anisotropy and 4x AA, if I need to take screenshots, import them into Photoshop and zoom to 900% to see the improvement, I'm equally discouraged.

It's also important that adding anisotropy and AA not be a simple placebo setting in the display properties. I'm more interested in saying "wow, this AA looks great" than "yeah, I'm running 4x AA!", which seems to be the trend these days. If you can't *visually* tell whether you have AA or anisotropy enabled, you're missing the point, and the performance hit such methods take is meaningless.

The number of "samples" or "taps" a particular image-enhancement feature is advertised to use, and its related performance hit, should never take precedence over its overall impact on visual quality. This concept is heresy given the current round of benchmarks, user commentary and flame wars seen around the web.
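
(A toy sketch of that point, assuming illustrative ordered-grid and rotated-grid 4-sample patterns rather than the exact layouts of any real card: the same advertised sample count can deliver very different visible edge gradation, which is exactly the gap between a spec-sheet number and delivered image quality.)

```python
# Toy sketch: why "4x AA" is not automatically the same 4x AA everywhere.
# The sample positions below are illustrative ordered-grid vs rotated-grid
# patterns, not the exact layouts used by any particular card.

ordered_grid = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_grid = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(samples, steps=100):
    """Distinct coverage fractions a near-horizontal edge can produce in one pixel."""
    levels = set()
    for i in range(steps + 1):
        edge_y = i / steps                              # edge sweeps through the pixel
        covered = sum(1 for _, sy in samples if sy < edge_y)
        levels.add(covered / len(samples))
    return sorted(levels)

print("ordered grid:", coverage_levels(ordered_grid))   # [0.0, 0.5, 1.0]
print("rotated grid:", coverage_levels(rotated_grid))   # [0.0, 0.25, 0.5, 0.75, 1.0]

# Same sample count and roughly the same fill-rate cost, but the rotated pattern
# yields more intermediate shades on near-horizontal (and near-vertical) edges,
# i.e. visible smoothing rather than a spec-sheet number.
```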

I believe it's this very "placebo specs" versus "delivered visual impact" mindset that has been driving future product lines. Matt said it best: play your games. Let that control the purchasing and upgrading of hardware, as well as praise for product lines or new hardware.
 
Usually I say get a GF3 Ti 200 (or 8500LE) because of the quad-stage texture pipeline and the higher sustained polygon throughput of the T&L pipeline.

Let's look at that. The Ti 200 is quad-pipelined at a core speed of 175MHz, and the MX460 has a dual-pipeline architecture running at 300MHz, so it seems to me the Ti has a theoretical 17% fill-rate advantage. But this card, like most, is ultimately memory-bandwidth-limited in the real world, so probably less than that in practice. The MX has a 35% memory speed advantage, 270MHz DDR to 200MHz, and that is a real-world advantage. The Ti counters with Lightspeed I: 4x32-bit memory controllers plus the Z-buffer stuff. The MX has Lightspeed II MX, which features 2x64-bit controllers but (I assume) adds the upgraded Z-occlusion culling of the GF4 Ti's that the GF3 Ti's don't have. So what do the benchmarks say?
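
(For anyone who wants to check those percentages, a quick sketch using only the clock and pipeline figures quoted above:)

```python
# Quick check of the fill-rate and memory-clock percentages quoted above
# (a sketch using only the numbers from the post, not new measurements).

def fillrate_mpix(pipes: int, core_mhz: int) -> int:
    """Theoretical pixel fill rate in Mpixels/s (pipelines x core clock)."""
    return pipes * core_mhz

ti200 = {"pipes": 4, "core_mhz": 175, "mem_mhz": 200}   # GF3 Ti 200
mx460 = {"pipes": 2, "core_mhz": 300, "mem_mhz": 270}   # GF4 MX460

fill_ti = fillrate_mpix(ti200["pipes"], ti200["core_mhz"])   # 700 Mpix/s
fill_mx = fillrate_mpix(mx460["pipes"], mx460["core_mhz"])   # 600 Mpix/s

print(f"fill-rate edge, Ti 200 over MX460:    {fill_ti / fill_mx - 1:.0%}")                      # ~17%
print(f"memory-clock edge, MX460 over Ti 200: {mx460['mem_mhz'] / ti200['mem_mhz'] - 1:.0%}")    # 35%
```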

TH at 12x10 and 16x12 shows the Ti up 10% in Giants, about even in Max Payne, and a 10% edge to the MX in Q3A. Anandtech shows that Q3A edge down to 5%, a 10-15% advantage to the Ti in SerSam2, a very small edge to the MX in RTCW, and a huge 55-65% advantage to the Ti in the Unreal2 test. So other than Unreal2 it's really a wash. It seems the advantage the GF3 has is its version of Lightspeed, which must make up that 35% memory speed disadvantage.

So we look at the GF2 Ti: the same 200MHz memory as the Ti 200 but without Lightspeed at all, and with a 250MHz quad-pipeline core, a 43% core-clock advantage over the Ti 200. Giants shows the GF3 Ti with a 30-40% advantage, Max Payne a 55% advantage, Q3 a 35-45% advantage, and the GF2 runs at the same speed as the MX in the Unreal2 test, far behind the Ti 200. That seems to verify the advantage Lightspeed gives the GF3, but it also casts serious doubt on what LMA II MX really does for the GF4 MX cards.

In any case, I'm not sure how much of an advantage the Ti 200 has over the MX460 that will show itself in this area in near-future games. The Ti does a bit better in newer current games, but much better in the polygon-intensive Unreal2 test. What does this test say about the MX, which can't beat the GF2 Ti even with memory 35% faster and with LMA? Is this game so GPU-intensive that memory bandwidth is no longer the issue? If so, why does a Ti 500 have only a 23% framerate advantage when it has a 43% core speed edge but only a 25% memory speed edge? And is this test a real indication of future game stresses?
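
(One hedged way to frame that last question: a purely illustrative "bottleneck mix" model, using only the 43% and 25% edges quoted above, showing what range of framerate gains those edges could plausibly produce.)

```python
# A purely illustrative "bottleneck mix" sketch, not a claim about the actual
# benchmark: if some fraction of the frame time scales with memory clock and the
# rest with core clock, the observed gain lands somewhere between the two edges.

core_edge   = 0.43    # core-clock advantage quoted above
memory_edge = 0.25    # memory-clock advantage quoted above

def blended_gain(mem_bound_fraction: float) -> float:
    """Observed speedup if mem_bound_fraction of frame time scales with memory clock."""
    new_time = (mem_bound_fraction / (1 + memory_edge)
                + (1 - mem_bound_fraction) / (1 + core_edge))
    return 1.0 / new_time - 1.0

for frac in (0.0, 0.5, 1.0):
    print(f"{frac:.0%} memory-bound -> {blended_gain(frac):.0%} faster")
# 0% -> 43%, 50% -> ~33%, 100% -> 25%.
# An observed ~23% gain sits below even the fully memory-bound end of this range,
# which hints at some other limit (CPU, AGP bus) or at the quoted edges being off.
```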

If so, nVidia has put out a new MX card that probably represents the low end for them for at least the next year, a card that will be included in many OEM PCs, and not only does it not include programmable shaders, but it either can't crank out the polys fast enough to provide higher framerates than a GF2 on the new stuff, or its LMA MX is vastly inferior to that in the GF4 Ti's. That doesn't sound good. I still have my doubts about this Unreal test being indicative of the future, though.
 
Mark said:
Let's look at that. The Ti 200 is quad-pipelined...

THANKS for this summary. It "proves" my decision to go with a GF3 Ti 200 instead of waiting for a maybe-faster MX460 (the MX460 is more expensive here than the GF3 Ti 200!!).
 