Two new (conflicting) Rumors

If the GF TnL feature was no good, it was only because it was something that wasn't being used at the time... that doesn't make it useless, though.

I've shown modern T&L games where the TNT2 Ultra with no T&L outscores the GF SDR, and yet you claim the T&L just wasn't used?

That is definitive proof in my eyes that the T&L on the Geforce SDR was in practical terms useless.

I'd be interested to hear exactly why you think otherwise.
 
It was a building block for others to improve upon.

Without that first step, where would today's TnL be, or gfx in general?

That's all.
 
What I find odd is that folks tend to claim that we'd be better off if 3dfx were still around to challenge nvidia. The pseudo-logic being that the competition would generate better features and performance at a lower cost.

To be honest, I think ATI is doing a fine job of challenging nvidia's throne. Please see the demise of the GF4MX460 for a great example of this effect in action.

Additionally, since most of those 3dfx engineers now work for nvidia, wouldn't they be more capable of generating better features and performance now that they work in collaboration rather than in market-driven competition? Despite everyone's self-centered desire to have more and better and cheaper now, I for one do not like the trend of rushed-to-market products.
 
flf said:
To be honest, I think ATI is doing a fine job of challenging nvidia's throne. Please see the demise of the GF4MX460 for a great example of this effect in action.

Yes, now ATI are doing a fine job. But it took them 8-9 months to challenge the Gf3, whereas 3dfx 'should/could/would*' have been in there at the same time.

*delete whichever words you want.
 
For backing up your point you use the 'T&L vs RGAA' case. This is simply misleading; you can't generalize from this sole case to draw your '3dfx feature-righteousness' conclusion.

Hey...

I'm not trying to say anything about anyone's features being more "righteous" than anyone else's. I fully understand the "need" at some point to implement new features, even if "practically useless" in the initial products, in order to start gaining market acceptance...and drive the industry forward to bigger and better things. (The chicken-and-egg situation.) So surely, you can argue that GeForceSDR T&L is "righteous" in that respect.

However, I am merely looking at two products on the shelf, and evaluating them SOLELY on what value they give to the gamer when they were being sold. From that perspective ("which product do I buy")....considering features that won't see much (any?) use during the card's life span adds ZERO value to that purchase....again...from a product perspective.

Back in the day....I don't know how many times I've "congratulated" nVidia for introducing T&L, etc. HOWEVER, at the same time I never once recommended an nVidia card for purchase because it had T&L.

Understand?
 
Randell said:
Yes, now ATI are doing a fine job. But it took them 8-9 months to challenge the Gf3, whereas 3dfx 'should/could/would*' have been in there at the same time.

And that 8-9 months was an interminable wait?

Look, no-one has a monopoly here, so what's the problem? You may be personally put out that your favorite feature isn't getting all the attention you want, but technology certainly isn't standing still.

See, here's the deal:

Everyone clamours about how great competition is, such that new features and faster products are delivered in shorter time frames at a reduced cost. This is lovely.

On the flipside, everyone clamours about how feature sets aren't complete, don't have complete driver support, are partially or totally broken, or are simply useless. This isn't so lovely, but it is a direct result of the same fierce competition that you fawned over when it came to speed and price.

Competition isn't "good" or "bad", it just is. It's a market force. If someone gets too monolithic and complacent and doesn't keep their prices in line with what's reasonable (Intel), along comes some upstart and smacks them around with a competing product that is sold at a very attractive price (AMD).

I don't see anyone complaining that Cyrix isn't in the game anymore, although they were a contender several years back. (At least on paper.)

3dfx is dead, however their technology lives on in the coming products from nvidia. It's a picture-perfect success story of how competition thinned the market and made the resulting companies stronger. If it had been nvidia who was fiscally irresponsible and 3dfx the paragon of the bottom line, then perhaps 3dfx would have hired all of the engineers from nvidia.

And what then? Would 3dfx be weaker for having lost its chief rival?

Your arguments make no sense. Progress is progress, no matter what form it takes.
 
3DFX made a lot of bad business moves. Probably all that TV marketing was a bad move... The 3D gaming market at that time (and still somewhat today) probably wasn't big enough to waste money on such broad exposure. They reached a few gamers and a lot of people who didn't know the first thing about computers. Those gamers would also be the type most inclined to ignore marketing and go with whatever is faster, so I think you can consider the $12 million 3DFX spent as basically flushed down the tubes.

Then the move to produce their own boards...just overall a big mistake. At the time they thought they were on top of the world. They didn't see any competition coming, apparently. They wasted a lot of money moving over to a marketing scheme that turned out to be less effective, and effectively gave Nvidia all the outlets they'd ever need!

Then VSA was late...that was just the last straw. If they hadn't wasted so much money making bad business decisions and leaving the door wide open for Nvidia to walk in (by unlocking all the AIBs), then they would probably have survived.
 
flf said:
Your arguments make no sense. Progress is progress, no matter what form it takes.
What? What arguments? Competition is good, no competition is bad? They don't make sense?

Yes. In those 12 months when the Radeon was OK but not brilliant and 3dfx were gone, nVidia released the Gf2Ultra and the Gf3, all at over £300 in the UK. They also released the Gf2MX, which was a semi-good thing price-wise (but a Gf1DDR was actually better at higher resolutions, as was a V5).

For the best part of a year there was only 1 DX8 card and it was over £300 until ATI released the 8500.

How can 3dfx going be a good thing in that context?
 
flf said:
Look, no-one has a monopoly here, so what's the problem? You may be personally put out that your favorite feature isn't getting all the attention you want, but technology certainly isn't standing still.

You don't know me, and don't put words in my mouth.

flf said:
See, here's the deal

Don't patronise me.
 
128-bit bus for Nv30 = disappointing if true.

Parhelia, P10 and R300 all have a 256-bit bus. Nvidia using a 128-bit data path would not make any sense for an all-out performance chip and would represent a huge compromise. Not that 128 bits makes NV30 a bad chip, just that it's a compromise.

I would most likely wait until NV35 before I took the plunge on any NV3X-based board, even if NV30 had a 256-bit bus, but if NV30 has only a 128-bit bus, that would be a non-starter for me, regardless of the efficiency of nvidia's design, even if it used much faster memory than R9700. 256 bits is 256 bits.
 
megadrive0088 said:
but if NV30 has only a 128-bit bus, that would be a non-starter for me, regardless of the efficiency of nvidia's design, even if it used much faster memory than R9700. 256 bits is 256 bits.
Not true at all.
You are basically saying here that you wouldn't buy the NV30 if it were cheaper, faster, and had more features than the R300, JUST because it only has a 128-bit bus?
I'd buy it in a heartbeat IF the above were true.
In fact, IMO, the only things that matter are performance, features and cost.
Who cares what the bus width is as far as making a purchasing decision?
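For what it's worth, peak memory bandwidth scales with both bus width and memory clock, so a narrower bus can in principle be offset by faster DRAM. A rough back-of-the-envelope sketch (the clock figures below are hypothetical, purely to show the arithmetic, not actual NV30 or R9700 specs):

```python
# Peak theoretical bandwidth = (bus width in bytes) x (effective/DDR clock).
# The clock figures used here are hypothetical examples, not real chip specs.

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s for a given bus width and effective memory clock."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(bandwidth_gb_s(256, 620))   # ~19.8 GB/s on a 256-bit bus at 620 MHz effective
print(bandwidth_gb_s(128, 1240))  # the same ~19.8 GB/s needs ~1240 MHz on a 128-bit bus
```

Peak numbers say nothing about how efficiently a chip uses them, but they do show why bus width alone isn't the whole story.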
 
DaveBaumann said:
What if the 8 pixel pipes were designed at 64bit, rather than 128bit?

Perhaps someone with a better understanding of the CineFX PDFs could answer this. Will the ability of an application to use this 'performance mode' (64 vs 128) need to be coded for each game, or be dynamically enabled on chip?
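For reference, the 64-bit vs 128-bit figures in that quote are about floating-point colour precision: four channels at 16 bits each versus four at 32 bits. A tiny sketch of the per-pixel cost (my own illustration, not taken from the CineFX PDFs):

```python
# Per-pixel storage for four-channel (RGBA) floating-point colour.
# Shows why a 64-bit "half precision" mode moves half the data per pixel
# compared with full 128-bit precision.

CHANNELS = 4  # R, G, B, A

def bits_per_pixel(bits_per_channel):
    return CHANNELS * bits_per_channel

print(bits_per_pixel(16))  # 64 bits  (FP16 per channel) -> 8 bytes per pixel
print(bits_per_pixel(32))  # 128 bits (FP32 per channel) -> 16 bytes per pixel
```

Halving the per-channel precision halves the data shuffled around per pixel, which is presumably where the 'performance mode' idea comes from.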
 
AH : The intention of the six month product cycle is that we are in line with the PC cycle, namely the Spring and Back-to-School periods
So if they missed the Back-to-School period... Certainly seems interesting, since he dodged the Christmas question. But I don't think it would be till Spring '03 (or Fall '03 if you're Pascal ;) ) that we see NV30.

AH : I'm a firm believer in 'keeping it real' - as in realism in real-time
Somewhere in the US, the black man who coined "keeping it real" just cringed.
 
randell said:
Don't patronise me.

Why not? You certainly seem to need coddling.

What? What arguments? Competition is good, no competition is bad? They don't make sense?

Good for whom? Obviously competition wasn't good for 3dfx... it squeezed them out of the game, no? Without competition their dreadful mismanagement may have been able to survive, even come to prosper.

But, no. It's that same competition you keep droning on about as so wonderful that nailed their coffin shut.
 
Jima13 said:
I don't think that's correct; I picked up a V5-5500 in June '00 and a Radeon64 vivo in July '00.......unless I totally misunderstood:)

Hmm... You ARE correct; I was thinking of the Radeon Pro series being their first cards, mainly because at that time I don't think anyone really put much stock in ATi actually bringing something great to the market. So forgive my ignorance, but at the time I just didn't take them seriously :) I also remember that when I was shopping for a card that fall, I never did see the Radeons on the shelves until the next spring. If I had, I might have picked one up. I went through trying 3 GeForces, which were lackluster IMO, before I fell in love with the V5.

Btw, what Nagorak says is so true it hurts... I remember the news at the time about 3dfx's failed ad campaign, and how stupid it was of them to buy that plant in Mexico for, what was it, 55 or 95 million? And then leasing the other plant overseas to be able to have int'l distribution. Bad, bad, bad business mistakes. I think something about putting all your eggs into one basket fits here.

And truthfully, I still fully believe that it wasn't competition that killed 3dfx; the blame lies in all their business decisions since releasing the V3. But in the end, I also believe that 3dfx, if they were still here today, would still be playing catch-up to Nvidia and ATI.

Edited for ignorance :)
Continue On..
 
DaveBaumann said:

But pass that quote through the marketing->technical translator, and it sounds like, "the nv30 is designed to do 64-bit, and has to do 2 passes to handle 128-bit, whereas r300 is designed to do 128-bit." Which might also have been part of the reason why Nvidia wasn't happy about the way DX9 was being specced, and wasn't as big a driving force in it as they had hoped.

128-bit memory interface? Designed for 64-bit color? So far nv30 isn't sounding too revolutionary...
 