3dfx/3dhq Revenge superchip.

Crusher said:
It was announced, demonstrated, and the only reason it wasn't released was because it kept running into problems and eventually the company went bankrupt before it could work them out. I think that counts as a failed product. You could also count the Voodoo 3 2000 and 3000 as one product. The 4500 wasn't terribly more advanced than a V3 either, but enough so that it deserves to be listed separately, I think.

The point is, every product since the original Voodoo was just an evolution of the previous generation, with the addition of the T-buffer at the end. And quite a few of them failed miserably. It seems unlikely that this same company would have had such amazing things coming to fruition at the same time they were going bankrupt and couldn't secure loans or investment. I think what they probably had were a lot of ideas that seemed really good in theory. But even if those ideas had been given the opportunity to develop into products, I think it would have taken a lot longer than some people suggest, and it wouldn't have ended up being as spectacular as the idea made it out to be (see T-Buffer).

So take a company like NVIDIA: there was the Riva 128, after that there was the TNT, then the TNT2, then the GeForce, which was basically two TNTs plus added features, and then they just let the GeForce evolve until the NV30, which seems to me like just an evolved GeForce 4. The NV30 seems to be failing in the market. The difference between 3dfx and NVIDIA, it seems, is that 3dfx only released two cards while it had basically no rivals, whereas NVIDIA has had a good three or four with no real rival. Not until the R8500 can we say ATI was on par with them. Is this accurate, or am I off course?
 
I have no disagreement with those comments. Obviously NVIDIA's and ATi's products have all been more evolutionary than revolutionary as well, and NVIDIA had a good run for about 2 years at the top, which has come to an end. The difference being, people aren't claiming that either of those companies had radically advanced technology in prototype form years ago.

If Rampage had actually been released, I somehow get the feeling that its launch would be virtually identical to the Parhelia's. Fancy new AA that doesn't work quite right all the time, a few features that everyone else is going to have in a few months anyway, but lacking the ability to use them adequately, and maybe a couple of other specialized features that only about 0.05% of the market would ever care about.
 
If Rampage had actually been released, I somehow get the feeling that its launch would be virtually identical to the Parhelia's. Fancy new AA that doesn't work quite right all the time, a few features that everyone else is going to have in a few months anyway, but lacking the ability to use them adequately, and maybe a couple of other specialized features that only about 0.05% of the market would ever care about.

That's the other extreme, which I can't believe either. Since T-buffer AA had very few compatibility problems (if any), I see little reason why it wouldn't have been the same with the M-buffer. In fact, it was my understanding that the M-buffer was present very early in the Rampage designs and Tarolli just switched it to the T-buffer for the VSA-100 line.

Spectre's featureset had quite a few rudimentary parts and would have seen the same market acceptance most of the NV20 features saw back then, too. All IHVs "experiment", in relative terms, with new features; some succeed after quite some time (often on the presupposition that other IHVs adopt a similar technique too), and some just don't make it. In that respect it isn't much different at ATI/NV or any other IHV.

What I haven't seen stated in this thread so far is that Spectre would have had, for a long time after the V2, a featureset that was more or less in line with what the rest of the market or the competition had. It's no coincidence either that the initial Rampage design was aimed to follow up the V2, and it signified part of the company's management flops.

***edit: to avoid misunderstandings: the first-ever Rampage design was quite a bit different from the last one. AFAIK the Rampage debated in this thread carried an "R4" codename; whether that means four re-designs or not is another chapter.

As a rather less relevant sidenote: the way I see it, ATI would have had an even bigger success with the R200 generation had it had MSAA all along.
 
Ailuros said:
That's it I'm convinced.

I repeat. I could send you a 3dfx Rampage powerpoint which even specifically mentions 16x FSAA. Want it? PM your e-mail address and I'll send it this evening.

Getting nervous already?

No, frustrated. It tends to happen. And it's why I don't normally like to argue, even when I know I'm right.

I'm also cursed by being able to see both sides of almost any argument, which never helps. I can see your side very clearly, and I do understand your reluctance to trust the somewhat... excessive performance claims.

Thanks at least for that verification, which had gone unanswered so far no matter how often I asked. It was all too obvious where what is nowadays called "performance" anisotropic on NV30 could have originated from.

Uh... I said it looks [i]smoother[/i] on average, i.e. it's a little better looking on average than NV30's Performance AF, i.e. it is a different implementation. Hell, I can prove that. NV30's P-AF works without AA, right? And gets really good performance without AA? Rampage's performance AF doesn't even work without some form of AA going on, and the degree is limited by the number of AA samples (you can't get 16x AF from 2 AA samples, but it does work with 4, and works even better with 8).
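(Purely as illustration: one way to read that constraint is that the usable AF degree scales with the square of the AA sample count. That's my guess at the relationship, not a documented Rampage spec; a minimal sketch in Python:)

[code]
def effective_af(requested_af: int, aa_samples: int) -> int:
    # Hypothetical clamp consistent with the description above:
    # 16x AF fails with 2 AA samples but works with 4 and improves
    # further with 8 -- as if the usable degree were samples squared.
    return min(requested_af, aa_samples ** 2)

for samples in (2, 4, 8):
    print(samples, "AA samples -> at most", effective_af(16, samples), "x AF")
[/code]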

Cheat AF, fillrate-free AA with piles of bandwidth, and 30-50% efficient HSR. WHY is it so impossible to attain 180fps?
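(As a rough sanity check, a back-of-envelope fill-rate sketch; the clock, pipe count, and overdraw figures here are assumptions based on period rumours, not confirmed specs:)

[code]
# Could a dual Rampage plausibly fill 1280x1024 at 180fps in Q3A?
WIDTH, HEIGHT, FPS = 1280, 1024, 180
OVERDRAW = 3.0   # assumed average Q3A overdraw
HSR_EFF = 0.4    # midpoint of the claimed 30-50% HSR efficiency

visible = WIDTH * HEIGHT          # ~1.31M pixels on screen
raw = visible * OVERDRAW          # pixels touched including overdraw
hidden = raw - visible            # occluded pixels HSR could cull
work = raw - hidden * HSR_EFF     # pixels actually rendered per frame
needed = work * FPS               # required fill rate, pixels/sec

# Assumed dual-chip fill rate: 2 chips x 4 pipes x 200MHz
available = 2 * 4 * 200e6

print(f"needed:    {needed / 1e6:.0f} Mpix/s")     # ~519 Mpix/s
print(f"available: {available / 1e6:.0f} Mpix/s")  # 1600 Mpix/s
[/code]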

See R300 three whole years later. See CPUs/memory/platforms of 2000 versus today, etc.

Tell me that someone found two Rampage cores in the garbage can and built a board out of them and I'll have another kneeslapper. I've seen worse claimed from the infamous 3dhq gang so far; nothing surprises me anymore.

It was one of "today's" CPUs that the Rampage in question was tested on. With AA and AF off, what does a Radeon 9700 Pro score in Q3A at 1280x1024x32 on a 3.06GHz P4?

At a predicted $500 price tag for the dual-chip board. As for the murder, it happens more often than not in the vaporware wet dreams of some people out there.

It's only logical. Rampage is supposed to compete with NV20. 3dfx got ridiculed big-time for needing the giant Voodoo5 5500 with two cores to even begin to compare to GeForce2 GTS's one chip, and the 6k size jokes would've easily matched the FlowFX mania had the card actually been released. There were already a few of them, though. Hell, the 6k needs external power!

3dfx wouldn't be stupid enough to repeat that debacle; they knew they had to compete chip vs. chip... which would mean, logically, that a two-chip solution would be really frickin' fast by comparison. Correct?

I'm puzzled whether I am actually having an argument with a female here or a truck-driver. ;)

Blame it on a bad week, coming out of bad PMS, coupled with my tendency to get frustrated while arguing over anything at all (I could be arguing about peanut butter and I'd start shouting at some point :oops: )...

I'm sure you counted them one by one, haven't you?

I don't think I want to comment further on that one.

By the way, you have a PM. It's sort of an 'ace up the sleeve'... I'd rather you didn't tell anyone what's in it.

You seem to be the only one resisting so adamantly. Where's Vince? Even HE isn't beating me with his Anti-3dfx™ Stick.

I don't think that could even stand as an argument here. I don't see where 3rd parties are relevant.

Mood was bad when I wrote that, apologies, point conceded.

Sure, plenty of engineers went to nVidia. But not all the engineers were working on Rampage. A good number of the core Rampage staff... did not go to nVidia.

Which were how many? I asked a perfectly logical question and expect a reasonable answer to it. When you "guesstimate" 10%, then I suppose you have a rough idea how many engineers they had in total and how many were assigned to Rampage.

The guesstimate was based on knowing that the Rampage team wasn't very big to begin with, and knowing roughly how many engineers went to the mystery company... (eerie music plays) I don't actually have any concrete numbers, other than about how many went to our beloved MC. And I know a chunk of them went to ATi and Matrox. The guesstimate was just that, a guesstimate.


Althornin said:
Because it's patently obvious to most of us that you are totally irrational WRT this 3dfx thing, and as such, it's almost not worth our time to "discuss" it with you.

I'm trying to rationalise things, get explanations in, and discuss exactly how it's possible that a dual Rampage could pull off such a seemingly ridiculous frame rate...

Why do YOU find it so impossible to believe that 3dfx tech wasn't "all that and a bag of chips"?
I cannot believe some of the moronic stuff you 3dfx fanpeople post. If the tech was that good, it would have been used.
I also like your constant insistence that 3dfx tech IS good enough to be used, and the accompanying "proof" that it is used in the 3D part of the NV25...
If the only good part was 2D, why do you keep harping on the 3D performance?
Why must you people paint 3dfx in such a holy glowing light?

3dfx's Rampage, as a whole, was very fast. I'm only praising a few bits of tech to all hell because some of them were way ahead of their time, and others (like their performance AF) really were ingenious. The individual technology in Rampage, and most of its parts, were obsolescent by the time everything went to nVidia anyway. If nVidia had released Rampage, they would have A) scaled back features from their NV20, something they knew would have been hypocritical, and B) admitted they were inferior to their longtime bitter rival.

The main part of what I'm discussing, really, is how Rampage could've been so damned fast. But it's already been brought up that Rampage/SAGE didn't quite support enough to be PS1.1/VS1.1 compatible... it's a bit backward, ne? On top of every other good reason not to use Rampage, why would nVidia want to release a PS1.0/VS1.0-only vidcard after already having released the GeForce3, which supports, IIRC, up to PS1.3?


Geeforcer said:
On the contrary, it never ceases to amaze me how some people are willing to believe almost anything, no matter how improbable, without a shred of hard evidence. Let's save the blind faith for Sunday mass, shall we?

We can't give any hard evidence. Technically the boards aren't even supposed to exist. If there were proof they exist, jobs would be lost. Unfortunate, isn't it? But the information that has been released has already been cleared with the, ah... owners of the said boards.


jvd said:
At the end of the day, in 2000 or 2001 when it was set to be released, it would have been one of the top-of-the-line cards, if not the top-of-the-line card. But compared to the new cards, I believe it would lose. Tech doesn't stand still. But then again, there are some things that the Rampage may still have done better. Nothing's perfect.

Indeed. It would still - in speed only - compare today, in a select few situations. Other than that, though, it wouldn't be too desirable at this point, due to lack of advanced features. THIS is why 3dfx tech isn't used. It was great at the time, and resulted in a core that was incredibly fast, and offered great IQ for its time.


Crusher said:
3dfx failed products:

Voodoo Rush
Voodoo Banshee
Voodoo 3 3500
Voodoo 5 6000

I wouldn't exactly say the 3500TV failed... and the 6k was perfected just fine. Quantum3D has all of the 6Ks produced, and they work just fine in the low-end AAlchemy systems they're sold in. :) 3dfx just didn't have the money to afford putting out such a low-margin product anymore.

If Rampage had actually been released, I somehow get the feeling that its launch would be virtually identical to the Parhelia's. Fancy new AA that doesn't work quite right all the time, a few features that everyone else is going to have in a few months anyway, but lacking the ability to use them adequately, and maybe a couple of other specialized features that only about 0.05% of the market would ever care about.

Why would Rampage's AA not work quite right all the time? It's nearly the same implementation as VSA-100's T-buffer, except it could use multisampling as well as supersampling. Last I checked, VSA-100 can use FSAA, playable or not, in every game out there except Diablo II (well, maybe in D3D, but not in Glide).
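(For anyone unclear on the distinction, here's a minimal sketch of why multisampling is so much cheaper in shading work than supersampling; this is the general technique, not anything Rampage-specific:)

[code]
def shading_ops(pixels: int, samples: int, multisample: bool) -> int:
    # MSAA: colour is shaded once per pixel; only coverage/depth are
    # stored per sample. SSAA: everything is shaded once per sample.
    return pixels if multisample else pixels * samples

px = 1280 * 1024
print("4x MSAA shades:", shading_ops(px, 4, True))   # ~1.3M ops/frame
print("4x SSAA shades:", shading_ops(px, 4, False))  # ~5.2M ops/frame
[/code]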

The primary features, like the texture computer, would be surpassed soon after (NV20 and PS1.1+) in most ways, but of course, it's like NV30: it doesn't support PS3.0, but it does support a lot of what's in it, and a lot of things that aren't.

Other than that, some of the other features, like its performance AF, were for speed, and application-ignorant. :)

The main thing about Rampage that IS impressive, is pretty much just the speed 3dfx were able to achieve with it.


Ailuros said:
Spectre's featureset had quite a few rudimentary parts and would have seen the same market acceptance most of the NV20 features saw back then, too. All IHVs "experiment", in relative terms, with new features; some succeed after quite some time (often on the presupposition that other IHVs adopt a similar technique too), and some just don't make it. In that respect it isn't much different at ATI/NV or any other IHV.

Yep. Since no video card was ever released that didn't support PS1.1 or better, no game has ever used, or will ever use, PS1.0 only. However, IF Rampage had been released, I'd put money down that there would be PS1.0 fallbacks in 3DMark03, and that 3DMark2001 would've used PS1.0... and other apps using PS would surely have some kind of 1.0 fallback. And I'm sure there would be a few niche apps with Rampage "PS1.0 Extended" support.
 
I have doubts this will ever come to an end.......


Uh... I said it looks [i]smoother[/i] on average, i.e. it's a little better looking on average than NV30's Performance AF, i.e. it is a different implementation. Hell, I can prove that. NV30's P-AF works without AA, right? And gets really good performance without AA? Rampage's performance AF doesn't even work without some form of AA going on, and the degree is limited by the number of AA samples (you can't get 16x AF from 2 AA samples, but it does work with 4, and works even better with 8).

A little better or a little worse won't work for me. If I had an NV3x-whatever today, I doubt I would prefer 8x "performance" over 2x "application", for obvious reasons.

I'm not even going to discuss how someone can count samples for anisotropic; what I care about is the final output.

It was one of "today's" CPUs that the Rampage in question was tested on. With AA and AF off, what does a Radeon 9700 Pro score in Q3A at 1280x1024x32 on a 3.06GHz P4?

No, I'll twist your question: what would the aforementioned Spectre configuration cost? Now take that cost level and get a dual R300 design instead, and then we'll see comparable numbers; R300 multi-chip designs, by the way, are real, albeit not aimed at the consumer market.

How does just under 40GB/sec of bandwidth sound there? Give me one good reason why such a hypothetical board would cost more than what the supposed Rampage setup is claimed to cost.
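(Presumably the arithmetic behind that figure, assuming two 9700 Pro-class memory subsystems; a quick sketch:)

[code]
# Two 256-bit DDR interfaces at 310MHz (9700 Pro memory clock).
BUS_BITS = 256
MEM_MHZ = 310
per_chip = BUS_BITS / 8 * MEM_MHZ * 2 * 1e6 / 1e9   # bytes/s -> GB/s
print(f"per chip: {per_chip:.1f} GB/s, dual: {2 * per_chip:.1f} GB/s")
# per chip: 19.8 GB/s, dual: 39.7 GB/s -- "just under 40GB/sec"
[/code]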

There are more factors to consider in this cut-throat market, and you're very well aware of it. Cost is a major consideration, and that's the exact reason why the late 3dfx planned to move completely to multi-chip solutions further along its roadmap.

If ATI felt today that there was enough demand for a dual-chip board, you can bet anything you want that they'd release it.

3dfx wouldn't be stupid enough to repeat that debacle; they knew they had to compete chip vs. chip... which would mean, logically, that a two-chip solution would be really frickin' fast by comparison. Correct?

Nope. It conflicts with the rumour of Fear being a single-chip-only design, and not that long after Spectre's release. The dual-chip Spectre wasn't aimed at mass quantities, for obvious reasons.

The guesstimate was just that, a guesstimate.

Sounds better already ;)

On top of every other good reason not to use Rampage, why would nVidia want to release a PS1.0/VS1.0-only vidcard after already having released the GeForce3, which supports, IIRC, up to PS1.3?

Errrrr nope....

NV20: PS 1.1/VS 1.1
NV25: PS 1.3/VS 1.1

We can't give any hard evidence. Technically the boards aren't even supposed to exist. If there were proof they exist, jobs would be lost. Unfortunate, isn't it? But the information that has been released has already been cleared with the, ah... owners of the said boards.

Patent infringements and possible lawsuits ignored, obviously.

It was great at the time, and resulted in a core that was incredibly fast, and offered great IQ for its time.

Highly debatable in light of an overly performance-optimised AF algorithm. With supersampling I can, in relative terms, ignore AF; with multisampling, though, I personally prefer high texture quality.

Yep. Since no video card was ever released that didn't support PS1.1 or better, no game has ever used, or will ever use, PS1.0 only. However, IF Rampage had been released, I'd put money down that there would be PS1.0 fallbacks in 3DMark03, and that 3DMark2001 would've used PS1.0... and other apps using PS would surely have some kind of 1.0 fallback. And I'm sure there would be a few niche apps with Rampage "PS1.0 Extended" support.

That's another weird guesstimate; it's like ignoring ATI throughout the whole hypothetical process too. No one can guarantee that, had 3dfx survived, Radeons would have sold significantly less than they actually did.
 
Ailuros said:
I have doubts this will ever come to an end.......

Probably not. We should probably just call a stalemate some time soon...

A little better or a little worse won't work for me. If I had an NV3x-whatever today, I doubt I would prefer 8x "performance" over 2x "application", for obvious reasons.

I'm not even going to discuss how someone can count samples for anisotropic; what I care about is the final output.

Well, wrt performance AF + MSAA, the actual base filter quality is maintained (unlike nVidia falling back to mostly-bilinear)... it's basically just a 'cleaning up', and it does remove texture shimmering. Rather than call it AF, one should probably call it an LOD trick with texture shimmering almost completely removed; it gets mostly the same effect.
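(Roughly, the general principle such a trick would lean on: N samples per pixel raise the effective sampling rate, so the mip LOD can be biased sharper by about 0.5*log2(N) before shimmering comes back. A minimal sketch of that textbook relationship, not the actual Rampage algorithm:)

[code]
import math

def max_safe_lod_bias(aa_samples: int) -> float:
    # N samples per pixel raise the sampling rate per axis by sqrt(N),
    # so mip LOD can be biased down ~0.5*log2(N) without shimmering.
    return -0.5 * math.log2(aa_samples)

for n in (2, 4, 8):
    print(f"{n} samples -> safe LOD bias {max_safe_lod_bias(n):+.1f}")
[/code]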

No, I'll twist your question: what would the aforementioned Spectre configuration cost? Now take that cost level and get a dual R300 design instead, and then we'll see comparable numbers; R300 multi-chip designs, by the way, are real, albeit not aimed at the consumer market.

How does just under 40GB/sec of bandwidth sound there? Give me one good reason why such a hypothetical board would cost more than what the supposed Rampage setup is claimed to cost.

There are more factors to consider in this cut-throat market, and you're very well aware of it. Cost is a major consideration, and that's the exact reason why the late 3dfx planned to move completely to multi-chip solutions further along its roadmap.

If ATI felt today that there was enough demand for a dual-chip board, you can bet anything you want that they'd release it.

Price isn't an issue as far as I'm concerned... and besides, as I've already said, the Rampage boards in question were 2xR, 2xS, which was never planned for retail anyway :)

Nope. It conflicts with the rumour of Fear being a single-chip-only design, and not that long after Spectre's release. The dual-chip Spectre wasn't aimed at mass quantities, for obvious reasons.

Sorta like the FX5800 Ultra?

The guesstimate was just that, a guesstimate.

Sounds better already ;)

Indeed.

Errrrr nope....

NV20: PS 1.1/VS 1.1
NV25: PS 1.3/VS 1.1

I stand corrected...

Patent infringements and possible lawsuits ignored, obviously.

That too. :)

It was great at the time, and resulted in a core that was incredibly fast, and offered great IQ for its time.

Highly debatable in light of an overly performance-optimised AF algorithm. With supersampling I can, in relative terms, ignore AF; with multisampling, though, I personally prefer high texture quality.

Well, consider that GeForce3 dies a horrible performance death with AF...

That's another weird guesstimate; it's like ignoring ATI throughout the whole hypothetical process too. No one can guarantee that, had 3dfx survived, Radeons would have sold significantly less than they actually did.

I guess. But the Radeon R6s don't even support PS1.0, do they?
 
Well, wrt performance AF + MSAA, the actual base filter quality is maintained (unlike nVidia falling back to mostly-bilinear)... it's basically just a 'cleaning up', and it does remove texture shimmering. Rather than call it AF, one should probably call it an LOD trick with texture shimmering almost completely removed; it gets mostly the same effect.

Stop that. It sounds scarier with every sentence I hear about it :oops:

:D

Price isn't an issue as far as I'm concerned... and besides, as I've already said, the Rampage boards in question were 2xR, 2xS, which was never planned for retail anyway.

I just wonder what all the long-lasting fuss about it was actually for, then... :rolleyes:

Sorta like the FX5800 Ultra?

It's loud, it's apparently consuming way too much power, and it failed to impress against the competition, etc. etc., but psssst... it's real ;)

Well, consider that GeForce3 dies a horrible performance death with AF...

That's the price you pay for high quality, even more so when it concerns a board that's relatively outdated by today's standards. Besides, I wasn't concentrating on the NV20; there are far more capable VPUs out there that don't shy away from high-performance, high-quality anisotropic.

I guess. But the Radeon R6s don't even support PS1.0, do they?

R100 was a V5 competitor, albeit released slightly later. I meant the R200, which supports up to PS1.4.
 
Ailuros said:
Well, wrt performance AF + MSAA, the actual base filter quality is maintained (unlike nVidia falling back to mostly-bilinear)... it's basically just a 'cleaning up', and it does remove texture shimmering. Rather than call it AF, one should probably call it an LOD trick with texture shimmering almost completely removed; it gets mostly the same effect.

Stop that. It sounds scarier with every sentence I hear about it :oops:

:D

It's fun to play with 3dfx's T-buffers. They're more programmable than you'd think. :)

I just wonder what all the long-lasting fuss about it was actually for, then... :rolleyes:

Huh? :?:

It's loud, it's apparently consuming way too much power, and it failed to impress against the competition, etc. etc., but psssst... it's real ;)

Silly ;P Rampage is real too, but 3dfx had to go and do something stupid (i.e. die) before they could release it... ;)

That's the price you pay for high quality, even more so when it concerns a board that's relatively outdated by today's standards. Besides, I wasn't concentrating on the NV20; there are far more capable VPUs out there that don't shy away from high-performance, high-quality anisotropic.

True. But by now, Rampage would've been replaced by two more generations of chips from 3dfx, which would've focused more on precise, 'real' AF.

R100 was a V5 competitor, albeit released slightly later. I meant the R200, which supports up to PS1.4.

Yeah, true. Heh, would've been interesting... I wonder, had Rampage been released, would 3DMark03 use PS1.4 with a PS1.1 *and* a PS1.0 fallback, or...?
 
R100 was a V5 competitor, albeit released slightly later. I meant the R200, which supports up to PS1.4.

Yeah, true. Heh, would've been interesting... I wonder, had Rampage been released, would 3DMark03 use PS1.4 with a PS1.1 *and* a PS1.0 fallback, or...?

I thought that PS1.0 is only pre-DX8 (like MS upped the requirements to 1.1?). Therefore, Rampage would only score around 300 marks. Also, the original Radeon was 1.0 compliant, wasn't it?
 
Radea said:
Yeah, true. Heh, would've been interesting... I wonder, had Rampage been released, would 3DMark03 use PS1.4 with a PS1.1 *and* a PS1.0 fallback, or...?
I thought that PS1.0 is only pre-DX8 (like MS upped the requirements to 1.1?). Therefore, Rampage would only score around 300 marks. Also, the original Radeon was 1.0 compliant, wasn't it?

R6 was something like 'PS0.5'.
 
I thought that PS1.0 is only pre-DX8 (like MS upped the requirements to 1.1?). Therefore, Rampage would only score around 300 marks. Also, the original Radeon was 1.0 compliant, wasn't it?

Spectre:

PS1.1
VS1.0

Radeon8500:

PS1.4
VS1.1

The original Radeon was released the same year as the V5.
 
Rekrul said:
During the tests in question, what drivers were used with this custom Rampage card?

Custom drivers, of course.

3dfx did complete some alpha drivers for Rampage that were capable of most of D3D and OGL (with some core fixes).

But those alphas weren't used.
 
Custom drivers, of course.

Do you know, or can you approximate:

a) Development time for these drivers?

b) number of people (programmers) involved developing them?

c) Any info about driver performance (not necessarily FPS) in games other than Quake 3?

Also, am I correct in my understanding that 3dfx was testing a single chip Rampage card when R&D stopped on 12/15/2000?
 
Rekrul said:
Do you know, or can you approximate:

a) Development time for these drivers?

b) number of people (programmers) involved developing them?

c) Any info about driver performance (not necessarily FPS) in games other than Quake 3?

Also, am I correct in my understanding that 3dfx was testing a single chip Rampage card when R&D stopped on 12/15/2000?

a, no idea. b, at most 4. c, nope.

D, yes, I'm pretty sure that's correct. They probably had a 2xR test board, at the very least, and probably at least one board which included SAGE. But I can't say for sure without asking more questions. :)
 
There are more factors to consider in this cut-throat market, and you're very well aware of it. Cost is a major consideration, and that's the exact reason why the late 3dfx planned to move completely to multi-chip solutions further along its roadmap.
Well, Rampage was only about 30 million transistors and SAGE was about 25 million (and at that time, on .18 micron). They cost a lot less per chip than an NV25 or NV30, but with multiple Rampages and a SAGE, the board was not much more expensive than NVIDIA or ATI single-chip ones.
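(A toy Poisson yield model shows why two small dies can come out cheaper than one big one; the defect density and die areas below are placeholders, not real figures for these chips:)

[code]
import math

def cost_per_good_die(area_cm2: float, d0: float = 0.5) -> float:
    # Poisson yield: fraction of dies that come out defect-free.
    good = math.exp(-d0 * area_cm2)
    # Silicon cost of one *good* die scales as area / yield.
    return area_cm2 / good

small, big = 0.8, 1.6   # placeholder die areas in cm^2
print(f"two small dies: {2 * cost_per_good_die(small):.2f}")  # ~2.39
print(f"one big die:    {cost_per_good_die(big):.2f}")        # ~3.56
[/code]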

Sorta like the FX5800 Ultra?

It's loud, it's apparently consuming way too much power, and it failed to impress against the competition, etc. etc., but psssst... it's real
So was Spectre, but it was never released. Edit: and it wasn't fully finished, but it worked.

Tag: those 250fps numbers came from Aqueel. He's the one who originally said that figure.
 