Two new (conflicting) Rumors

duncan36 said:
You're simply wrong; you can't compare a GF2 MX to a TnT2 Ultra and call the GF2 MX a Geforce SDR.

Here's the link to the actual benchmarks, which despite your waffling are valid, and show the actual card I'm talking about.

http://www.reactorcritical.com/review-savage2000/review-savage2000_9.shtml

Look at the benchmarks; the Geforce SDR is listed on them. I wasn't comparing it to the MX or anything...it's on the graph (and it's generally the same speed as the MX, but anyway...). I couldn't find a review of the original SDR, otherwise I would have posted that.

I don't have any vested interest in this discussion, I'm just pointing out that the card was faster. Maybe not by a lot, maybe not worth the money, kind of like the R9700 isn't really worth your money if you already own a Ti4600...but it was faster.
 
duncan36 said:
What a ludicrous statement.

I correctly stated that Tom's review showed the Geforce SDR outperforming the TnT2 Ultra in 3 benchmarks and the TnT2 Ultra outperforming the Geforce SDR in 2 benchmarks.

I also quoted the Reactor Critical review, which showed the TnT2 Ultra outperforming a Geforce SDR in Quake 3, and Anand's review was quoted showing the Geforce SDR outperforming the TnT2 Ultra in Quake 3.

Typical net knuckleheads arguing for the sake of arguing.

After seeing this, I'm pretty sure that you're arguing just for the sake of arguing. Because if not, well, I'm not going to go down that route.

Anyway, yep, you correctly stated that it (TNT2 Ultra vs GF 256 SDR) outperforms it in certain benchmarks and vice versa in others.
It's just that 2% faster in one benchmark and 50% slower in another doesn't make the cards equal.
 
The problem with T&L isn't the hardware, it's using the hardware optimally

I find humour in that statement. You hear that, Geforce SDR owners? Any day now that killer T&L game is going to come out, because you see, the developers are just now figuring out T&L, and that extra $250 you spent upgrading from that TnT2 Ultra is finally going to pay off. Yippee!


What is your proposal? That NVidia should never have introduced T&L on any graphics card and simply waited for developers to learn how to use it on, what, a simulator?!?

No, my proposal is that Nvidia should not have intentionally misled the buying public, in the face of lacklustre benchmarks, into believing that T&L was ever going to be useful on that horribly bandwidth-limited card.
However much you know about how hard T&L is to implement in games already in progress, Nvidia knew more, which makes it even worse, because Nvidia specifically stated that 1) you would need T&L to play the new games coming out fairly soon, and 2) there were hundreds of T&L games just around the corner.

NVidia underestimated the complexity of educating developers and getting them to support new features. The same learning curve will apply to DX9.

So are you saying that developers just never figured out T&L on the Geforce SDR, and that's why nothing ever came of it a full two and a half years after the card was released? You don't expect me to take you seriously, do you?

I don't wish any harm on Nvidia, but I do take note when a company blatantly rips off the buying public, as Nvidia has done several times - most notably with the Geforce SDR, and later with the ridiculously priced Geforce 2 64MB. Plus, in general I don't like their tactics; they're too manipulative for my tastes.
 
Anyway, yep, you correctly stated that it (TNT2 Ultra vs GF 256 SDR) outperforms it in certain benchmarks and vice versa in others.
It's just that 2% faster in one benchmark and 50% slower in another doesn't make the cards equal.

Well, explain to me how Tom's Demo 001 gives the TnT2 Ultra half of what Reactor Critical gives in the exact same Demo 001, and I'll let you parade your 50% number around. Obviously it's either an install problem on Tom's part, or the later Nvidia drivers are crippling the TnT2 Ultra in Quake 3.

Will you ignore the obvious to try to make a point?
 
duncan36 said:
I don't wish any harm on Nvidia, but I do take note when a company blatantly rips off the buying public, as Nvidia has done several times - most notably with the Geforce SDR, and later with the ridiculously priced Geforce 2 64MB.

Good for you in the sense that you have now learned a valuable lesson: Don't believe all the hype... ;)

There is really only one point to make of this: it's a simple matter of fact that you won't be playing any games that take full advantage of a given DX generation's hardware within, let's say, two years. (Just look at UT2003 and Doom III: neither requires DX8 features to run, but they do require DX8- or DX9-level hardware to play fast enough.)

But this should be changing for the better starting with the DX9 generation: one could argue that with programmability we are fairly close to getting the basic features we need. Now it's about getting the hardware to perform as many operations per cycle as possible.
 
Duncan,
For any given card, no matter the feature set, it will be 2-3 years before any games support it in a mature way; those are just the facts. Hardware comes first, then it takes 18 months to write a game for that hardware, then you learn your lessons, then another 18 months for the second revision.

Moreover, in the first 18 months, no developer is going to design a game for bleeding-edge $150+ video cards that the vast majority of their customers simply do not have. It is a slow evolutionary process, and you will not see games that require DX9 as a *MINIMUM* for at least 2-3 years.


Even on consoles like Xbox and PS2, it takes a good 2-3 generations of game titles before developers learn how to code for the platform.

Please spare me the NVidia-is-evil-lying-scum stories. NVidia sold T&L hardware. On some games it performed well, on others it didn't, and it would take at least a good 36 months before developers began to really take advantage of it. It also took many driver revisions before NVidia engineers themselves learned to squeeze performance out of their T&L engine.


What NVidia was selling was potential, the same as Matrox and their EMBM hype and game lists, and ATI and the truform hype. All of those features, including DOT3 bump mapping, have remained hype until very recently.

In fact, only Doom3 seems poised to finally show what per-pixel lighting (e.g. DOT3) can do, and that's years after it was introduced!


I'm sorry if you bought a GF1SDR thinking some third generation T&L game like UT2k3 was imminent, but that's your own naivete.

If you were a heavy early adopter, you'd know not to buy things as soon as they come out. First, you pay a premium price. Second, nothing uses it to the full extent of its capability.

This goes for all consumer electronics: the Xbox, PS2, GameCube, DVD players (when they first came out), HDTV, PDAs (Newton, anyone?), etc.


Don't buy bleeding edge cards if you can't take it.
 
The source of the 128-bit bus rumor was the Inquirer - what more needs to be said about their predictive powers on this forum?

A good friend once asked me if used toilet paper is a more accurate read than the articles they often publish about the future. Tough call...
 
I'll quickly jump in, then hopefully jump out.

What NVidia was selling was potential, the same as Matrox and their EMBM hype and game lists, and ATI and the truform hype. All of those features, including DOT3 bump mapping, have remained hype until very recently.

Interestingly, the one company that wasn't selling "potential" was 3dfx. They were trying to sell "here and now." Look where it got them. It is just so hard to sell "real" things like AA, when the competition is releasing tech demos and making "sexy" promises about highly detailed worlds, etc.

I'm sorry if you bought a GF1SDR thinking some third generation T&L game like UT2k3 was imminent, but that's your own naivete.

Actually, I can sympathize with Duncan. I can almost bet that Duncan wasn't naive...he was probably among others (like myself) trying to combat nVidia "hype" about T&L...by saying the same things you are saying right now...and was probably called anti-nVidia or a 3dfx fanboy for his trouble.

I can't tell you how many times I was called a 3dfx "fanboy" for telling people to forget about the SDR, that T&L was "useless" as far as the consumer was concerned.

At least the DDR version offered a considerable raw performance increase in fill-rate.

Incidentally, the same goes for the ATI 9700. The selling point for consumers really isn't DX9 support...as we won't really see that until this card is a couple of years old. It's that it runs today's games that much better with AA and aniso. DX9 is an added bonus...but will probably be relegated to running tech demos (cool as they may be ;)).
 
Sure, I love my PS 1.4 tech demos - I will never see PS 1.4 utilised to its full potential in games.
 
Joe DeFuria said:
Interestingly, the one company that wasn't selling "potential" was 3dfx. They were trying to sell "here and now."

Emphasis on trying. They weren't actually selling anything, since their V5 architecture was way delayed, ditto for Rampage.

Instead of focusing on NVidia "T&L" GF1SDR hype as killing 3dfx, let's look at the real problem: failure to deliver and bad management decisions.

When the V5 finally arrived, it was already facing the GF1DDR and GF2GTS. Performance was on par in many titles, and IQ was better with AA (but was it enough for consumers to notice, and did 3dfx do a good job of marketing this to consumers!?), but NVidia had oodles of OEMs blanketing the market with cards. If you went to a store, they'd have 10 different brands of NVidia cards and one retail card from 3dfx, and 3dfx was late.

Moreover, the V5-6000 was a no show, so they never had the elusive performance crown, and the poor old V5-5500 ended up facing refreshed GTS Pro/Ultra and Radeon.

You simply cannot blame the downfall of 3dfx on NVidia's T&L push any more than you can blame PowerVR's failure to take off on it.

3dfx wasn't successful in taking their message to consumers or developers, screwed their business model by manufacturing their own cards, and was far behind in delivering their next-gen stuff.

Ultimately, the GF1DDR and GF2GTS delivered *better performance* on *old games* for consumers in addition to new features, and NVidia delivered a steady stream of driver updates that boosted this even higher and added FSAA.

That's why people bought GF1DDR's and GF2GTS's, and why they bought PROs and Ultras.

And they simply were never contested by the V5-6000 that was supposed to crush them.
 
You simply cannot blame the downfall of 3dfx on NVidia's T&L push any more than you can blame PowerVR's failure to take off on it.

Don't overreact.

I'm not blaming 3dfx's downfall heavily on nVidia's T&L push. There were lots of things that contributed to 3dfx's demise, some of which you mentioned. However, marketing does play a part.

(but was it enough for consumers to notice, and did 3dfx do a good job of marketing this to consumers!?)

Well, in 3dfx's defense, they tried really hard to evangelize AA. However, as you know, AA is really just something that needs to be seen in motion to be appreciated. 3dfx probably could have done a bit better, but the fact remains that a feature like AA is very difficult to "market."

That's precisely why, imo, they even bothered to market the other T-Buffer effects. AA was certainly the most useful...but things like motion blur, depth of field, etc., are more "tangible" for the consumer.

That's why people bought GF1DDR's and GF2GTS's, and why they bought PROs and Ultras.

Why repeat what I already said? The DDR variants provided a notable raw performance increase. That's a valid reason for buying them (assuming you had enough money).

My point is that people were evangelizing T&L as a "reason" to choose GeForce over Voodoo. And people who said "T&L is really a non-issue...don't choose a Geforce because of THAT" usually got told:

1) You're a 3dfx fanboy. (Blah...)
2) Look at these tech demos...THE FUTURE! ("When is the future?")
3) nVidia told us T&L by X-mas...see their list of titles! ("When is the future?")
4) Would you rather have nVidia NOT introduce T&L? (Irrelevant to the point.)
 
DemoCoder said:
Moreover, the V5-6000 was a no show, so they never had the elusive performance crown, and the poor old V5-5500 ended up facing refreshed GTS Pro/Ultra and Radeon.

If I remember correctly, there weren't even rumors of a Radeon yet when 3dfx announced they were folding that November. So it was wholly them facing the Geforce series.

DemoCoder said:
3dfx wasn't successful in taking their message to consumers or developers, screwed their business model by manufacturing their own cards, and was far behind in delivering their next-gen stuff.

And that's a shame, considering they still hold the record for the most money spent advertising a computer accessory. I believe they spent more than $12 million on advertising in less than a year's time.

But can we not draw a parallel at this point between 3dfx and Nvidia?
Now Nvidia is faced with being behind the competition in their released tech. By the time the NV-30 comes to the shelves, I am sure ATi is going to have something *NEW* to release right along with it.

Maybe it's the 3dfx tech they are incorporating in the NV-30 that has caused such delays? Oh, the irony :rolleyes: :D
 
Joe DeFuria said:
Well, in 3dfx's defense, they tried really hard to evangelize AA. However, as you know, AA is really just something that needs to be seen in motion to be appreciated. 3dfx probably could have done a bit better, but the fact remains that a feature like AA is very difficult to "market."

Joe, I may be wrong, but I get the impression you're trying to show how 3dfx chose 'viable yet difficult to market features' vs. NV, who chose the opposite - 'marketing buzz over usability'.

To back up your point you use the 'T&L vs. RGAA' case. This is simply misleading - you can't generalize from this sole case to draw your '3dfx feature-righteousness' conclusion.

The problem was that the VSA-100 was doing anything but pushing the tech envelope, even when taking into account the praised RGAA. Why would it need to push anything, you could ask? Because the tech it was built on was really old - the VSA-100 was yet another re-iteration of the SST1, albeit the final and most refined one. This part's existence within the timeframe it actually occurred was hardly reasonable (aside from 3dfx having trouble delivering) - the SST should have ended with the V3, not later. Why:

  • The VSA-100 did not have a single new texturing mode over the SST - no EMBM, no DOT3, no cube maps, no texture render targets - nada, zilch (3dfx texture compression was introduced with the V2). Overall, the programming model the VSA presented developers with was the same old SST derivative, but with true color and >256x256 texture support (finally!).
  • The 2-way SLI (i.e. the 5500 model) was nothing extraordinary in terms of performance outside Glide-based titles. And yet, the 5500 was priced at the upper range.
  • Despite the VSA-100's FSAA being user-tangible, turning on FSAA on the VSA automatically halved performance (if, say, 2x FSAA was chosen, [ed: with fill-rate-limited titles; see the rough arithmetic sketch after this list]). This, combined with the 5500's non-existent performance edge outside Glide, made the VSA-100's FSAA almost as good for real-world, non-Glide titles as NV10's T&L was for non-T&L-optimized titles - i.e. just a check-box feature.
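[ed: to make the fill-rate point concrete, here is a rough back-of-the-envelope sketch. The fill rate, resolution, and overdraw figures are illustrative assumptions, not measurements of any particular card; the point is simply that 2x supersampling doubles the samples filled per frame, which roughly halves a purely fill-rate-limited frame rate.]

```python
# Rough fill-rate arithmetic for supersampling FSAA (illustrative numbers only).
# In a purely fill-rate-limited scene, frame time scales with the number of
# samples filled, so 2x supersampling roughly doubles frame time and halves fps.

def fps_fill_limited(fill_rate_mpix_per_s, width, height, overdraw, ss_factor):
    """Approximate fps ceiling when the card is purely fill-rate limited."""
    samples_per_frame = width * height * overdraw * ss_factor
    return fill_rate_mpix_per_s * 1_000_000 / samples_per_frame

# Hypothetical VSA-100-class figures: ~667 Mpixel/s combined fill rate,
# 1024x768 resolution, average overdraw of 3 (all assumed for illustration).
no_aa  = fps_fill_limited(667, 1024, 768, 3.0, 1)  # ~283 fps theoretical ceiling
fsaa2x = fps_fill_limited(667, 1024, 768, 3.0, 2)  # ~141 fps, i.e. about half

print(f"no AA: {no_aa:.0f} fps, 2x FSAA: {fsaa2x:.0f} fps")
```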

So, summing it up: was the VSA-100 a fine legacy (read: Glide) card? Sure, the best there had ever been. OTOH, had the 3D game industry outgrown its Glide infancy? Definitely yes. Was the VSA-100 helping in any way the advancement of viable new features (e.g. dependent texture reads, per-pixel lighting models)? Well, that's hard to say - it supported accumulation buffers (aka the T-buffer) and had >256x256, >16-bit texture support, but that's about all. Think about it - if for some reason the VSA-100 had gained market dominance at the time it was introduced, then the most advanced visual experience we'd have now and in the next, say, 3 years would be Q3-level technology.

My whole point being - don't say 3dfx did the right thing by trying to sell the VSA-100 to consumers - it's not clear at all whether this would have eventually been beneficial for them.

My point is that people were evangelizing T&L as a "reason" to choose GeForce over Voodoo. And people who said "T&L is really a non-issue...don't choose a Geforce because of THAT" usually got told:

1) You're a 3dfx fanboy. (Blah...)
2) Look at these tech demos...THE FUTURE! ("When is the future?")
3) nVidia told us T&L by X-mas...see their list of titles! ("When is the future?")
4) Would you rather have nVidia NOT introduce T&L? (Irrelevant to the point.)

#4 as stated your way does sound irrelevant indeed, but that's because you made it sound that particular way. I, for one, would have put it as:

4) Would you rather have had 3dfx introduce something to help the 3D game industry advance at the time the VSA-100 was introduced, and moreover with a similar price tag to the VSA-100 (as it was overpriced)?
 
Joe isn't trying to make any claim about 3dfx; he's merely pointing out the factual case that Nvidia marketed the GF SDR's T&L as a feature that would 'unlock the power of the card' down the road, and that games coming out shortly would require T&L.
They used this as a weapon against their competitors even though they knew it to be totally false.
And they used it to sell GF SDR cards to TnT2 Ultra users. I was around back then; many, many TnT2 Ultra users bought GF SDR cards out of 1) fear of all these T&L-only games coming out, and 2) the belief that the card was going to become much more powerful down the road because of T&L.

They did the same thing with their TnT card and 32-bit graphics: they pushed a feature heavily despite the fact that the card wasn't architecturally capable of using it at the resolutions gamers want to play in.

Basically, all I'm looking for - and I think Joe is too - is the admission that Nvidia knew T&L on the GF SDR was never really going to be useful in games, yet marketed it to the contrary. Bluntly, they deceived to increase sales.
 
darkblu said:
• Despite the VSA-100's FSAA being user-tangible, turning on FSAA on the VSA automatically halved performance (if, say, 2x FSAA was chosen, [ed: with fill-rate-limited titles]). This, combined with the 5500's non-existent performance edge outside Glide, made the VSA-100's FSAA almost as good for real-world, non-Glide titles as NV10's T&L was for non-T&L-optimized titles - i.e. just a check-box feature.

Oh please, that old argument. Anyway, that's been done to death on these boards.

If the V5 had achieved more sales and kept 3dfx in business, then the consumer would have been better off from 2001 onwards in the shape of GF3/DX8 competition in the form of Rampage. Whilst I agree 3dfx did nothing to help 3D hardware development over mid-1999/2000, their survival would have been in the best interest of the industry.
 
The problem is that we need both. We need new technology to push the envelope (T&L, DX7/8/9 T&L, etc.) as well as technology that the end user can use right away (FSAA, AF, Surround Gaming). Saying one company was better because they chose to cater to one side is silly, just as saying the other company made the "right" choice is. I think everyone here understands that new features usually take a long time to implement correctly and thus are "needed" to help push development. But we also enjoy the fact that some of that tech can be used before the card is, for all intents, too old/slow to run a new game with said feature. 3dfx may not have pushed the limits in their last days, but at least they left us with FSAA, which, like it or not, has become almost a standard today....
 
"If i remember correctly there wasn't even rumors yet of a Radeon when 3dfx announced their folding that November. So it was wholly them facing the Geforce series"

I don't think that's correct; I picked up a V5-5500 in June '00 and a Radeon64 VIVO in July '00...unless I totally misunderstood. :)
 
duncan36 said:
Joe isn't trying to make any claim about 3dfx; he's merely pointing out the factual case that Nvidia marketed the GF SDR's T&L as a feature that would 'unlock the power of the card' down the road, and that games coming out shortly would require T&L.
They used this as a weapon against their competitors even though they knew it to be totally false.
And they used it to sell GF SDR cards to TnT2 Ultra users. I was around back then; many, many TnT2 Ultra users bought GF SDR cards out of 1) fear of all these T&L-only games coming out, and 2) the belief that the card was going to become much more powerful down the road because of T&L.

They did the same thing with their TnT card and 32-bit graphics: they pushed a feature heavily despite the fact that the card wasn't architecturally capable of using it at the resolutions gamers want to play in.

Basically, all I'm looking for - and I think Joe is too - is the admission that Nvidia knew T&L on the GF SDR was never really going to be useful in games, yet marketed it to the contrary. Bluntly, they deceived to increase sales.

Cool, let's say Nvidia probably knew that T&L on the NV10 was a consumer-baiting feature. Still, using 3dfx's VSA-100 as a counter-example of 'honest marketing' (oxymoron?) is, well, not quite valid, since 3dfx, too, knew their card was eventually the last specimen of a phasing-out architecture and would be essentially useful only for legacy titles, and I don't recall 3dfx openly advertising the VSA-100 for what it was worth. See, at the end of the day, with or without T&L, the GF1 is Doom3-compliant. The VSA-100 is not.
 
since 3dfx, too, knew their card was eventually the last specimen of a phasing-out architecture and would be essentially useful only for legacy titles

Up until quite recently the Voodoo 5 was capable of running all games released, so I'm not sure I get your point.

The fact of the matter was that hardware FSAA on the V5 was a useful feature and T&L as implemented on the Geforce SDR was not.
 
If the GF's T&L feature was no good, it was only because it was something that was not used at that time...that doesn't make it useless, though.

Was the 32bit colour feature of the TNT useless? No!

If you don't build it...they won't come!

So I don't see the point or direction of this argument... anyone?
 