VSA-100?

demalion said:
Well, to be fair, I think the conflict between nVidia and VisionTek cannot simply be ignored in all this.

Yikes, I wonder how much validity there is to their side of the story...sounds pretty ugly.

What story? Where's the scoop? I'm always up for gossip.
 
Based on the Inquirer generally not mangling things that can be confirmed by looking at court cases and such, I provide this link.

I thought it had been mentioned here already, sorry.
 
Thanks, interesting info. No doubt fuel for the fires of, and proof positive of, some of nVidia's "dirty business tactics".
 
I would really like to know if anybody still disagrees with the VSA-100 comparison :)

Geez, it's now so close to it, it's downright scary. At least 3dfx had enough guts to simply cancel the V5-6000, as opposed to sending them out to reviewers en masse, having people look at the numbers, and then cancelling the thing.
 
Hah

I still say it's not too late for NVidia to release a 16-way VSA-100 solution. True, it would need 4 slots and engine coolant from Pep Boys, but it would kick ass! :devilish: :devilish: :devilish: :devilish:
 
Typedef Enum said:
I would really like to know if anybody still disagrees with the VSA-100 comparison :)

Geez, it's now so close to it, it's downright scary. At least 3dfx had enough guts to simply cancel the V5-6000, as opposed to sending them out to reviewers en masse, having people look at the numbers, and then cancelling the thing.

It has reminded me of the 3dfx-nVidia struggles for a long time now--I've stated out here before that I believe the parallels are eerily similar--only this time nVidia is sitting squarely in 3dfx's shoes.

All of this actually makes me pine for the V56K....;) (Sort of.) But there are some major differences. There you had something that was at least interesting--4 GPUs running in parallel, sharing some onboard memory, each GPU using some of its own dedicated RAM--no overclocking, no overheating, no giant Dustbusters in sight. You had the T-Buffer and wonderful FSAA, as well as 8 pixels per clock, which has only hit the market in the last six months. Power supplies then were not as robust as they are currently, so 3dfx had to create the costly "VoodooVolts" thing...it's all pretty hazy now.

But basically the GF FX Ultra comes off as a crude kludge comparatively, because all it is is a massively overclocked standard nv30 with a giant, noisy fan--that's it in a nutshell. The V5 6K had some much more interesting technology behind it, IMO.

It's really too bad 3dfx made all those horrible choices with regard to their business model and STB. Their engineering prowess for 3D architecture was obviously unmatched at the time. The V5 5.5K was an excellent product, a product I enjoyed immensely, and had it but *shipped on time* it would have made mincemeat out of the GF1. But really it wasn't being late with the V5 5.5K which killed them--it was the STB fiasco which sucked their coffers dry and distracted them to such a degree that they lost track of their core business. All of this is of course IMO...;)
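For anyone who never saw it in action, the T-Buffer idea is easy to sketch: each chip renders the scene with its sample positions jittered by a sub-pixel amount, and the buffers are averaged on output. Here's a rough toy version of the concept in Python--the jitter offsets and the "renderer" are made up for illustration, not 3dfx's actual sample pattern or pipeline:

[code]
# Toy sketch of T-Buffer-style FSAA: render the scene once per jitter
# offset (one per VSA-100 chip on a V5 6000) and average the frames.
# Offsets and the "renderer" are illustrative, not 3dfx's real design.

def render(width, height, jx, jy):
    """Toy renderer: a hard diagonal edge, nudged by a sub-pixel jitter."""
    return [[1.0 if (x + jx) > (y + jy) else 0.0 for x in range(width)]
            for y in range(height)]

def tbuffer_fsaa(width, height, offsets):
    """Average one jittered frame per offset into a single AA image."""
    accum = [[0.0] * width for _ in range(height)]
    for jx, jy in offsets:
        frame = render(width, height, jx, jy)
        for y in range(height):
            for x in range(width):
                accum[y][x] += frame[y][x]
    n = len(offsets)
    return [[v / n for v in row] for row in accum]

# Four jittered sample positions, one per chip (made-up values).
offsets = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]
print(tbuffer_fsaa(6, 6, offsets))  # edge pixels land on grey levels
[/code]

The point is that edge pixels come out at intermediate coverage values instead of hard 0/1 steps--which is exactly why the V5's edges looked so smooth.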
 
WaltC said:
It has reminded me of the 3dfx-nVidia struggles for a long time now--I've stated out here before that I believe the parallels are eerily similar--only this time nVidia is sitting squarely in 3dfx's shoes.

Not even close. 3dfx = rapidly dwindling market share, no OEM contracts worth mentioning, lots of overhead with a board plant that didn't even begin to come close to its production capacity because no one was buying the boards it produced, last product released was built on a core that was basically four years old even then.

Yes, it's monumental that a competitor finally surpassed Nvidia, but wasn't it bound to happen sooner or later? What isn't monumental is that ATi's only been in the driver's seat for one product cycle, at least so far. Let them maintain their performance lead throughout this year and I'll share more of your enthusiasm over this situation.

All of this actually makes me pine for the V56K....;) (Sort of.) But there are some major differences. There you had something that was at least interesting--4 GPUs running in parallel, sharing some onboard memory, each GPU using some of its own dedicated RAM--no overclocking, no overheating, no giant Dustbusters in sight. You had the T-Buffer and wonderful FSAA, as well as 8 pixels per clock, which has only hit the market in the last six months. Power supplies then were not as robust as they are currently, so 3dfx had to create the costly "VoodooVolts" thing...it's all pretty hazy now.

But basically the GF FX Ultra comes off as a crude kludge comparatively, because all it is is a massively overclocked standard nv30 with a giant, noisy fan--that's it in a nutshell. The V5 6K had some much more interesting technology behind it, IMO.

It's really too bad 3dfx made all those horrible choices with regard to their business model and STB. Their engineering prowess for 3D architecture was obviously unmatched at the time. The V5 5.5K was an excellent product, a product I enjoyed immensely, and had it but *shipped on time* it would have made mincemeat out of the GF1. But really it wasn't being late with the V5 5.5K which killed them--it was the STB fiasco which sucked their coffers dry and distracted them to such a degree that they lost track of their core business. All of this is of course IMO...;)

Yes, the V56K would've had amazing AA for its time (probably better than the 9700's 6x for edge quality), but I'm not sure I would've paid $600 USD in 2000 for one. Regardless, comparing its technology to that of the GF FX is amazingly inane. The latter is generation upon generation more advanced than the VSA-100 chip, and probably offers a feature list longer than that of every 3dfx product ever released combined (it doesn't help 3dfx that they milked the same core for nigh a half-decade). Beyond just chip specs, yes, the FX Ultra strikes me as a clumsy piece of engineering with its board size, heat, and noise levels, but most of this is a result of trying to ratchet up clock speeds to better compete against the 9700 Pro. A 128-bit memory bus has proven to be anything but overkill.
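For what it's worth, the back-of-the-envelope bandwidth math makes the point plainly: peak bandwidth is just bus width times the effective (DDR) memory clock. A quick sketch using the commonly cited specs for the two cards (my numbers, not vendor-verified):

[code]
# Rough peak memory bandwidth: bytes per transfer x effective clock.
# Clock/bus figures below are the commonly cited specs for each card.

def bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=2):
    bytes_per_transfer = bus_bits / 8
    return bytes_per_transfer * mem_clock_mhz * transfers_per_clock * 1e6 / 1e9

print(bandwidth_gb_s(128, 500))  # GeForce FX 5800 Ultra: 16.0 GB/s
print(bandwidth_gb_s(256, 310))  # Radeon 9700 Pro: ~19.8 GB/s
[/code]

Even with its memory ratcheted to 500 MHz, the Ultra's 128-bit bus still comes up a few GB/s short of the 9700 Pro's 256-bit bus at a far lower clock.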
 
John Reynolds said:
Not even close.

Going to have to say listen to John on this one. AFAIK, nVidia's internal situation is nothing like the one 3dfx was in. Beyond the external things Mr. Reynolds said, in hindsight it seems like 3dfx had Ebola and all its internal supports and systems were just liquefying and dissolving, and by the time the situation was being rectified (i.e., "cured") it was too late.

-Tell me if I'm wrong, but nVidia still has significant human resources with a good retention rate and is still attracting new talent from the pool.
-nVidia still has positive cash flow and earnings.
-nVidia still has dominant developer support.
-nVidia still has the initiative.

In fact, the question is: what does ATI have? And the answer, as John said, is one cycle where they executed and nVidia didn't....

...but the important question is why nVidia didn't execute. And it wasn't an internal issue--people need to digest this before declaring ATI the neo-messiah of the 3D world. Just as embracing the bleeding edge of lithography was once a leading factor propelling nVidia into the position it achieved... it will do so again.
 
I'm with John and Vince on this one.

nVidia fumbled with the GeForce FX partly because their aggressive use of the latest state-of-the-art process is a risk that was bound to cost them sooner or later. nVidia is still the main player in terms of market share and revenue, and they are a very aggressive firm.

The fumble gave ATI the opportunity to gain some ground, and I think that ATI will grow stronger because of this, while I really don't see nVidia getting any weaker.

It's also important to note that ATI has delivered on their new business plan of competing with nVidia for the most advanced high-end card, whose architecture they then reuse in cheaper parts. It has worked great for nVidia - and now ATI has executed on the same plan.

:arrow: Both of these companies are here to stay, big time.
 
John Reynolds said:
Not even close. 3dfx = rapidly dwindling market share, no OEM contracts worth mentioning, lots of overhead with a board plant that didn't even begin to come close to its production capacity because no one was buying the boards it produced, last product released was built on a core that was basically four years old even then.

Yes, it's monumental that a competitor finally surpassed Nvidia, but wasn't it bound to happen sooner or later? What isn't monumental is that ATi's only been in the driver's seat for one product cycle, at least so far. Let them maintain their performance lead throughout this year and I'll share more of your enthusiasm over this situation.

Here's how it is close in my view:

(1) nVidia is late--a "product cycle" late, just as 3dfx was late. (However, the *only reason* 3dfx's market share was "rapidly dwindling" prior to the V5 was because it was so late with the V5. The last numbers I saw, ATI is picking up market share versus nVidia, which is not surprising at all given the product spread offered by the two companies. On the high end, now that the GF FX Ultra is cancelled, nVidia is working on being two cycles behind. But the R350 would have seen to that anyway.)

(2) nVidia was considered "number 1" in 3D performance before the R300, just as the V3 was clearly "number 1" in 3D performance versus nVidia's TNT (the TNT2 didn't ship until 4-5 months after 3dfx shipped the V3). I bought the V3 the first month it shipped and had a TNT at the time which I'd been using with my V2s. The V3 demolished TNT in every respect--there was no comparison. It was much faster than the TNT2 as well when it shipped a few months later, but 3dfx's market share was very strong prior to the STB fiasco and the tardy V5. Had 3dfx shipped the V5 on time it would have maintained its market share easily over nVidia, as the GF1 looked punk in comparison. As it was, the V5 ended up facing the GF2, where things were not so clearly defined. At this point 3dfx was distracted by the STB mess and just lost sight of its core business.

(3) nVidia has been distracted by xBox and its core-logic chipset business, among other things. It's not the same thing as the STB distraction, of course, but these other avenues have served to take the company's focus away from its core 3D chipset business.

(4) Management hubris. 3dfx felt so secure in its position that it even ran ads and stated in articles that it felt a "shakeup" of the industry was coming and that it was attempting to change its business model to survive. 3dfx was really--literally--pissed when nVidia shipped the GF1. I often got the impression they felt nVidia was robbing them of something they were entitled to. (Which may have been true according to the lawsuit which got buried in the nVidia buyout.) But nonetheless, nVidia was just as lax in its attitude in designing nv30, as the company was obviously judging its future products by its former products with no thought whatever as to what might come from a competitor.

(5) As 3dfx struggled to ship the V5, the publicity became increasingly anti-3dfx from virtually every Internet wannabe. You remember--we used to roast 'em over the coals a lot over on 3dfx.com, remember? *chuckle* (Ah, but it was fun!) The publicity over the past couple of months, especially over the last month, has been more negative than I have ever seen it for nVidia. And it's due to the nv30 being late, and then to the nv30 not being what people expected versus the R300 (which IMO mirrors the 3dfx V5 publicity cycle to a T). nVidia hasn't gotten it as badly as 3dfx got it, I don't think, but it's certainly very close. The GF FX Ultra has been the butt of Internet jokes around the world--just as the V5 6K was in its "day."



Sorry, John, but I just think the irony here is too obvious to be ignored. To me, it's kind of sweet, actually...;) The shoe is definitely on the other foot right now for nVidia--but nVidia has been an underdog before, so they know what it's like. I think the biggest hurdle nVidia has ahead of it is ATI itself--this is no bumbling, pushover company mismanaged by engineers, like 3dfx seemed to be...;) I think ATi has at last found its 3D "sea legs," so to speak, and will from now on, barring something completely unforeseeable, be a far tougher competitor to nVidia than 3dfx ever was.


Yes, the V56K would've had amazing AA for its time (probably better than the 9700's 6x for edge quality), but I'm not sure I would've paid $600 USD in 2000 for one. Regardless, comparing its technology to that of the GF FX is amazingly inane. The latter is generation upon generation more advanced than the VSA-100 chip, and probably offers a feature list longer than that of every 3dfx product ever released combined (it doesn't help 3dfx that they milked the same core for nigh a half-decade). Beyond just chip specs, yes, the FX Ultra strikes me as a clumsy piece of engineering with its board size, heat, and noise levels, but most of this is a result of trying to ratchet up clock speeds to better compete against the 9700 Pro. A 128-bit memory bus has proven to be anything but overkill.

I agree that you can't, of course, compare the VSA architecture to nv30--that'd be silly...;) What I was talking about, and which I think you got a sense of, was the ideas and concepts behind the V5 6K. As a complete product, I feel the V5 6K was much more sophisticated and complex in an elegant way than the GF FX Ultra, which was just brute overclocking and overvolting. In fact, I don't recall anyone else doing a 3D product on the same principles as the GF FX Ultra. You might find something like it featured on a modder's web site *chuckle*, but I doubt you'd ever have seen 3dfx make something like it, or ATI, for that matter. I think it would have been much more profitable for nVidia to have realized these things back in September, as opposed to getting this far with it just to kill it. I hope, I really hope, that nVidia doesn't believe that anyone will buy the 5800 thinking it's an Ultra...*chuckle* Overall, I have found nVidia's handling of this situation with the 9700P to be about as poorly managed as 3dfx's handling of nVidia's upcoming competitive products at the time. We've heard the same kind of PR double-talk--but no competitive product appears. That strikes me as hauntingly reminiscent of 3dfx...;)
 
The funniest thing that has happened during this delay was nVidia's attempt to lure gamers with the "The way it was meant to be played" crapola.

All the while, everybody and their brother knew that if you really wanted to do a game the "way it was meant to be played," you would have no choice but get yourself an ATI 9700.
 
I'm telling you all, it's the Curse of the Rampage. Remember, the NV30 is the first nVidia chip with 3dfx tech, so one would assume it would contain some Rampage-derived functionality. The Curse might not kill nVidia, but then again, maybe it will...
 
WaltC said:
The V3 demolished TNT in every respect--there was no comparison. It was much faster than the TNT2 as well when it shipped a few months later, but 3dfx's market share was very strong prior to the STB fiasco and the tardy V5.
Only in Glide games. Not in Direct3D or OpenGL the majority of the time.

(3) nVidia has been distracted by xBox and its core-logic chipset business, among other things. It's not the same thing as the STB distraction, of course, but these other avenues have served to take the company's focus away from its core 3D chipset business.
I don't think so. There hasn't been much work done recently in either case. The nForce2 really isn't much more than a tweaked nForce1. The NV2A used in the X-Box has been done for quite some time now. I only see two possible reasons for the current state of affairs:

1. Possible delays because nVidia felt they had the time to wait, so as not to jeopardize their current product line.
2. Process problems at TSMC. While in the past (the TNT) this didn't prevent them from executing with some chip, I think the main problem here is that, again, they were a little bit too secure in their position.

But there is a huge problem with all of these parallels with 3dfx's downfall. That is simply that 3dfx's downfall did not begin with the Voodoo5; it ended with the Voodoo5. nVidia started gaining market share back in the days of the TNT2. That is, nVidia released the TNT2 M64 around that time, which made its way into many OEM machines. This sector of the market still makes up the core of nVidia's business, and is a sector that 3dfx never succeeded in penetrating.

In fact, most of nVidia's market share wasn't taken from 3dfx, but from ATI. When 3dfx was in its heyday, ATI commanded most of the OEM market. This simple fact should make it obvious that, as far as the health of the company is concerned, the high-end doesn't mean squat.
 
Chalnoth said:
WaltC said:
The V3 demolished TNT in every respect--there was no comparison. It was much faster than the TNT2 as well when it shipped a few months later, but 3dfx's market share was very strong prior to the STB fiasco and the tardy V5.
Only in Glide games. Not in Direct3D or OpenGL the majority of the time.

The V3 actually shipped a few weeks before the TnT2. As for performance, the ZD/CGW GameGauge benchmark suite (8 games) showed only the TnT2 Ultra being faster than the 3000. And at the time five of those eight games supported Glide, yet it wasn't used during testing. The Ultra also cost 40% more than the 3000. Yeah, a lot of hoopla was made over 32-bit rendering, which I never understood (I didn't enter the world of 32-bit gaming until Nov. of '01). Oh well.

But to write that it "demolished" the TnT is a bit much, especially since they belonged to different product launches (the TnT competed against Banshee, and did so quite well).
 