A rambling history of AMD and nVidia's rise to the top

Leovinus

What follows is a recollection of events as I experienced them growing up. The graphics wars were brutal. 3DFX had risen from the early competition with the likes of Rendition's Verité chip, S3 had their Trio, and Matrox had made the transition from 2D to 3D. But lording over them all was 3DFX. As my tale begins, 3DFX was riding high on its success after the Voodoo 2, a card I had myself, and Glide was the de facto performance API. But soon terrible things would unfold. Horrible events that left AMD and nVidia as the only contenders standing to the present day. A day in which Intel is vying to enter. This is based on my recollection, so the timing might not be entirely correct. But I believe the spirit is in keeping with what happened, if nothing else.

The final battle that led to the duopoly of today began, for me, in the late 1990s. Gathered at the urinal was a small gang of companies, all of them growing into adulthood and vying for attention. But this day would be different. To me the gauntlet was thrown in earnest with the Riva TNT2, as nVidia unzipped its trousers to flaunt a well-packed pair of briefs while locking eyes with 3DFX, which had been gleefully displaying a sizeable bulge under its pants for some time now, to the envy of the rest. However, as it turned out, it was a piss boner that would disappear as quickly as it had grown to prominence.

3DFX took the challenge in stride. The Voodoo3 3000 was beginning to look long in the tooth, so it had been tinkering on a new graphics core with 32-bit rendering and a view to implementing hardware FSAA, which sounded promising. All eyes were on 3DFX. But in its haste to one-up the new competition it unzipped without looking and somehow managed to catch its dick in the zipper. In a moment of greed 3DFX had decided to take card production in-house. Suddenly chip development took a back seat as angry vendors, annoyed retailers, and production difficulties reared their heads. Bleeding and in pain, 3DFX put on a brave face, but the blood-mixed urine that trickled out from its damaged downstairs region was not confidence-inspiring.

Meanwhile nVidia had gracefully undressed and let loose an impressive stream of hits before the first blood-infused droplets left 3DFX's now miserably disfigured undercarriage. The GeForce was launched, and as a belated response 3DFX finally squirted out its Voodoo 4, huffing down the pain to splatter out the Voodoo 5 in a painful volley a while later. It held two GPUs in SLI mode and came with its own power brick in the tricked-out 5500 version. Impressively, it almost kept up with nVidia by brute-forcing its way around missing features like hardware TnL. nVidia's torrent just wouldn't cease, however, and with a grunt 3DFX keeled over in pain. By this time doctors had been called, who pronounced 3DFX dead at the scene from internal bleeding. Apparently it had been trying to pass something called the Voodoo 5 6000 or some such... a beast of a card with 4 GPUs. Presumably with two power bricks and a fire extinguisher attached. Through its ruined urethra.

By this time Matrox, which had never really developed after it hit puberty and was content with releasing warmed-over versions of its G200, hurriedly unzipped, trickled out its Parhelia card, zipped up, and left. But it tripped on 3DFX's corpse and couldn't quite get up again. All the while no one had really noticed that the weird kid in the corner had hit puberty with impressive results while minding its own business. At the end of the trough was ATi, merrily trying to score in that little football-into-the-cup game you find at certain urinals.

At this point nVidia was ready to stop. With the GeForce 4000 series the stream had started to peter out a bit, but it decided to give one last confident splash. Sadly it slipped in its own piss and launched the GeForce FX. Meanwhile ATi got the ball into the cup with the Radeon 8700 to cheers and high-fives, dabbed off, and was about to go home. So was nVidia, which got up and was about to dab off with the GeForce 6000 series when a strange boy entered the urinals.

Just as ATi and nVidia were leaving, XGI Technology strode up, unzipped in a powerful motion, looked both of them in the eyes, and sharted just as the first drops started spraying wildly. The brown stain that came of this was called the Volari GPU. A pitiful thing that never got a proper driver to go with it. How capable the hardware was remains a mystery. Red-faced, the boy did up its pants and scurried out, never to be seen again.

And so it was that AMD and nVidia were the only ones to leave the urinal with some semblance of dignity. But someone else had been staring at them from the window. Intel was seeing its chance...
 
I think a lot of people bought into hardware T&L (me included) and 3dfx's lack of it really hurt them.

Yeah, I see it as possibly the biggest flaw in their last chip. As I said, it was impressive that they managed to compete, sort of, with that disadvantage. But I think everyone could see that the future meant hardware TnL. Ages ago I did see a review of the mammoth Voodoo 5 6000, and apparently it held its own incredibly well. It was a test part obviously, so how indicative it was of a potential final product is less than certain. It was impressive though. It compared favourably to the follow-up generation of early GeForce cards, if I remember correctly. The GTS maybe? But considering the BOM on that part I scarcely see how it'd ever have become a profitable product.

What I find fascinating, looking back, is that most of the companies that failed seemed to go for the "quality over frame rate" argument on their last generation. S3 did it with the Savage card, highlighting their S3TC texture compression and texture filtering prowess. Matrox did it with the Parhelia card, highlighting its strikingly good FSAA and general texture quality. That it couldn't play Quake 3 at anything but modest frame rates at lower resolutions was sort of sidelined. Because it could also support three screens! Look at that. And 3DFX of course tried their hand at the quality argument with the Voodoo 4 and 5 and FSAA. All of them, when performance was lacking, turned to image quality. I believe nVidia did the same with the GeForce FX. Actually, I don't have to believe, I know. Because I bought it on that premise. Stupid as I was back then.

The GeForce FX saga is really an epic unto itself. Maybe I'll write up a wall of text on that as well...
 
Those were 2D chips.
The rest is just nasty.

Oh right, it was the ViRGE that followed the Trio as their first 3D chip.

I remember I had an old Toshiba laptop with their DeltaChrome onboard graphics solution though. Made after S3 left the discrete GPU business. According to a cursory Wikipedia check S3 still exists, but I can't see much about what they have produced as of late. Apparently something called the Chrome 640/645 with DX 11 support is the last entry.
 
GeForce DDR was the first time I really felt power. Man that shit was expensive. Dad was super pissed I spent $450 on a video card.
 
Sadly it slipped in its own piss and launched the GeForce FX. Meanwhile ATi got the ball into the cup with the Radeon 8700 to cheers and high-fives, dabbed off, and was about to go home.

I don't think the Radeon 8700 ever existed. The Radeon 9700 was the, uhh... Okay, I'm not going to continue stretching that genital/urinal metaphor.
 
I wish I had gotten a 9000 series card from ATi back when. But things turned out differently.

My second gaming computer was a prebuilt sporting an Intel Willamette-based Pentium 4 with a blazing fast (no) 1.7GHz clock speed and a GeForce 3. It was a passively cooled little thing. As the time came to upgrade, nVidia was in full swing with the sales pitch for their new GeForce FX. Their graphical quality powerhouse. Since the card was expensive and I was young, my father asked his more technical friends about the choice between nVidia and ATi, and for some reason my dad's friend felt that nVidia's cards offloaded more from the CPU (beats me why he thought that, though). And since my CPU was so great (again, no), the FX was the one to go for. Besides, it was marketed with games like Doom 3 and movies like Final Fantasy: The Spirits Within. As an aside - the then-nVidia mascot Dawn was created by a crew that had worked on that film. A film which, coincidentally, is still a guilty pleasure of mine.

I had just bought it when I realised how gimped the thing was. And. The. Noise. nVidia actually made a big deal of their "FlowFX" cooling solution. The only thing that could tame such a professional-grade rendering monster! My ears and the GPU temps begged to differ, however. But it did come with Morrowind in the box, a game that I've since fallen in love with. Though as another aside - I actually quit and uninstalled it in disgust within the first couple of minutes. The first quest directed me to a gloomy, rainy village. The atmosphere was awful. And I hated it. But while I'd read about it, it hadn't quite registered with me that the game had dynamic time and weather. I was, clearly, an idiot back then. Thankfully curiosity and a reinstall later corrected my mistake.

By this time I rationalised the choice of the FX, after the fact, by telling myself I had bought it to play Doom 3. A game which I knew id's John Carmack was working hard to make compatible with the FX's "curious" shader architecture, optimised for 16-bit shader precision with the capability of going 32-bit in a pinch. Annoyingly, I liked Half-Life 2 more. And at that point I had become well aware of the politics that had driven nVidia to make the FX the way they had to begin with. A belief that, as an industry juggernaut, they could set the standard, choosing 16-bit with the occasional 32-bit shader precision for others to follow in their footsteps. It was difficult to reach the requisite doublethink to think of nVidia as an underdog after that. But I managed.

Too bad that nVidia royally screwed Microsoft on the Xbox. 16-bit didn't even make it into the initial spec of DX9. Microsoft decided to set the bar higher. With 24- and 32-bit. Which meant the FX defaulted to 32-bit when spec-compliant games asked for 24, forcing it to run at a disadvantage for no discernible quality benefit. Cue nVidia's rampant touting of its "cinematic computing" prowess, highlighting the quality advantage of 32-bit precision. The very worst mode for the card to run in architecturally. ATi, after all, could only do 24-bit in their 9000 series! Which, nVidia was quick to remind, was less than 32. Touché. That frame rates were so horrendous that nVidia was reduced to driver optimisations with severe IQ downgrades to even seem remotely competitive didn't help. Both ATi and nVidia were caught with their pants down in that regard, though we all knew who benefited the most and why by that point. I seem to recall 16-bit becoming part of the DirectX 9 standard with a later point release, but if it did it was far too late by that point to salvage the FX.

What I think galled me the most about the entire thing at the time is that Valve had made a Half-Life 2 version internally that ran at a mixture of 16- and 32-bit with a great performance uplift for the FX, but decided some time into that project to cancel it for being too much of an effort to placate one hardware vendor. Though I certainly understood why on a rational level, emotionally it felt like a cruel prank after seeing the benchmarks.

Man, it's fun to think back like this. Looking back, it meant so much which games and hardware vendors you surrounded yourself with. And it's amazing how many purely emotional self-delusions you concocted to rationalise your circumstances. To be honest I still have a soft spot for the GeForce FX. That frankly amazing name, the marketing, the underdog status it got in my mind as it struggled along. It all still makes my heart flutter a little thinking back on those emotional times. Reminiscing is quite the drug.
 
Funny, I don't seem to remember much about the times I had R300-based cards with Athlon CPUs. I think I might have had a 9600 at some point. I definitely did have a 9800 Pro, which fried after a couple of years, and then I bought another one to replace it. I have a feeling that they just worked (disregarding that one fried card), and I suppose as a result they generated fewer memories.

That's pretty much the only period in my life I did not have an 'underdog' system. I've had Spectravideo's 8-bit machines when the C64 ruled, Atari ST when Amiga ruled; then moving to PC hardware, Gravis Ultrasound MAX when SB AWE ruled, ATI Rage II / Rendition Verite 2000 when 3dfx ruled, Savage4 / Radeon 8500 when GeForces ruled. After the R300 phase I've stuck with AMD stuff both on CPUs and GPUs (even one AMD APU laptop), meaning a slide into underdog mode again until these recent days...
 
Well, stability will do that. I guess we remember chaos better. More for the brain to latch on to, memory-wise. And especially if there is an emotional component, an investment. Back when I was younger I guess I felt my investments reflected more on me as a person. And that's why I so easily slid into cognitive dissonance. Though I don't think for a second that that's unique. If anything it seems to be a common behaviour with everyone. The only difference for me as a grown-up is that I'm honest with myself. I'm comfortable now to, say, buy a lower-performing product because I just like something about the company or whatnot. But then again I can afford to now, and couldn't back then. And I don't care what others think the way I did back in those formative years.

Life in tech was more interesting back then, in a sense, though. I'm hoping Intel brings some of that excitement back. I was genuinely enthralled when XGI showed up from nowhere with the Volari GPUs. Fresh blood and some competition. I think the top-end card even had a dual-GPU thing going on, which brought back memories of the Voodoo 5 5500. I rooted for them to the last. But it seems they gave up themselves within a year or so. I don't recall them ever releasing a proper driver that fixed all of the issues the cards suffered, and I similarly don't recall a lot of review sites taking the card seriously whatsoever and retesting it with better drivers. Which was disappointing.
 
I started with a TNT2 Ultra from Diamond. I may still have it. Then it was boring until the ATI 9500 that was flashable, then the X800 Pro that was flashable. Things got less exciting again; I guess I was too lazy to deal with fuses.
 
I started with a TNT2 Ultra from Diamond. I may still have it. Then it was boring until the ATI 9500 that was flashable, then the X800 Pro that was flashable. Things got less exciting again; I guess I was too lazy to deal with fuses.

I had an ATi X850 XT for a brief period on loan from a friend. The noise reminded me of the GeForce FX... The horror... The horror...
 