The startup that saved ATI

http://www.eetimes.com/story/OEG20030421S0028


By Rick Merritt

EE Times
April 21, 2003 (11:06 a.m. ET)


No one expects that ATI Technologies Inc. (Markham, Ontario) will drive graphics king Nvidia Corp. completely off its pedestal. But the scrappy No. 2 player in 3-D chips for PCs is staging a comeback of sorts, at a time when computer graphics are making an architectural shift to programmability and a market shift to integrated parts.

Over the past couple of years, ATI has overhauled itself with management and engineering prowess from its acquisition, ArtX. Now, most observers expect it will re-emerge as Nvidia's equal, helping set the pace in both PCs and game consoles as a maturing graphics industry heads into what many say will be uncharted waters.

ATI is poised to launch at Computex next month what could be the first integrated chip set to run Microsoft's DirectX 8.1 API, use dual-channel 400-MHz double-data-rate memory and link to the 800-MHz bus for Intel's next-generation Pentium, dubbed Prescott. The chips could leapfrog anything chip set makers Intel, Silicon Integrated Systems and Via can offer in graphics while also beating Nvidia, which makes chip sets for AMD CPUs but has no Intel processor bus license.

The launch is especially sweet for ATI, which lost its perch atop PC graphics after Intel Corp. rolled out a new category of chip sets with integrated graphics in late 1999. Afterward, the erstwhile market leader saw its share of desktop sockets nearly halved.

But before it faded into graphics history, ATI beat out offers from competitors S3 Corp. and Nvidia to acquire ArtX (Palo Alto, Calif.), seen by many as the last hot startup in the maturing PC graphics industry. ATI bought ArtX in February 2000 for about $400 million in stock and options.

Since that time, the 70-person startup has been reinvigorating ATI with efforts that are now starting to bear fruit on several fronts. "The center of gravity for ATI has definitely shifted from Canada to California," said one observer who asked not to be named.

The story begins in late 1997, when a handful of top engineers and managers from Silicon Graphics Inc., many of whom had helped design the Nintendo 64 console, got an idea for a startup. They would cram high-end graphics into a PC chip set and leapfrog the giants of the mainstream desktop world by leveraging what they learned from designing a high-performance, low-cost game box.

The result was ArtX, which got initial funding from Taiwanese PC maker Acer Inc. About nine months later, old contacts from Japan came seeking a partner for their next-generation console, which later became the Nintendo GameCube.

"They had given up on SGI. The last of the people they trusted were gone, and they went looking for the people. It's not a company-to-company thing for them; it's a person-to-person thing," said Greg Buchner, at that time the head of ArtX.

The visit sparked a debate at the small startup. "We said we really didn't want to divert ourselves, but Nintendo can make a pretty compelling argument and it was a pretty huge opportunity, so we decided to go ahead in mid-'98," said Tim Van Hook, chief designer for the Nintendo 64 and a founder of ArtX. So ArtX forged a deal to develop the Flipper chip for the console code-named Dolphin in return for royalties. "We knew we couldn't take on the [chip] manufacturing. That would require as many more people as we had in the whole company at that time," said Joe Macri, another SGI veteran who became the 23rd person to join ArtX. He is now a director of technology at ATI.

At Comdex/Fall in 1999, the startup launched its ArtX1 PC chip set with some fanfare. By that time, the company had hired as its president David Orton, a hard-charging former manager of SGI's advanced-graphics division, who was keen to take ArtX public. However, an IPO looked risky. As it turned out, the Comdex splash brought the company lots of attention, along with acquisition offers from ATI, Nvidia and S3.

It wasn't hard sorting out those bids. S3 was already in trouble and would break up in April 2000. "We could see the initial signs of that," said Buchner. As for Nvidia, "we didn't think it would work culturally or from a valuation perspective."

ATI was the clear fit. It was trying to get its own integrated-graphics program off the ground to catch up with Intel, which was wreaking havoc with the market. ATI had an Intel bus license, but it had no presence in the console space, no office in Silicon Valley and was badly in need of a makeover. Indeed, ArtX and ATI managers separately described ATI at the time of the acquisition as "a sea" or "a blob" of engineers without clear lines of responsibility. "They were a startup with one big organization," said Buchner.

In what turned out to be a case of the tail wagging the dog, ArtX's Orton was named president and chief operating officer of the merged entity from the outset. He reorganized ATI into separate business units and three major design teams under a handpicked set of managers who shared his drive to compete.

"He is someone who loves a good fight and he loves to win it," said Buchner, now one of two chief technology officers and four vice presidents of engineering at ATI.

Leveraging the ArtX team in Palo Alto, Orton created a Silicon Valley base for ATI just a mile down the road from Nvidia's sprawling green-marble headquarters in Santa Clara. Engineers at the ATI site finished the GameCube graphics chip, then led the design for the R300 graphics core, ATI's first to execute Microsoft's DirectX 9 application programming interface.

The DirectX spec was driving a new architectural direction in PC graphics. Rather than delivering fixed functions based on approximations using integer math and a graphics pipeline pioneered by SGI, DirectX 8.1 had taken a new course: toward more general-purpose programmable vector processors based on more-exacting floating-point calculations.

Ultimately, it is thought that the DirectX evolution will lead chip makers to create devices based on dozens of computing elements that can calculate polygon vertices and run pixel-shading programs for a variety of graphics and video applications. Sony, IBM and Toshiba apparently share this vision. Their Cell architecture, announced in March 2001 though not yet released, could someday use hundreds of cores in a parallel array to power future PlayStation consoles and a wide variety of other broadband multimedia products.
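
To make the new direction concrete, here is a minimal C sketch of the fixed-function versus programmable split. Every name in it is invented for illustration; this is not Microsoft's or any GPU vendor's actual interface. A fixed-function part hard-wires one blend operation, while a programmable part runs whatever per-pixel routine the application supplies, in floating point.

    /* Illustrative sketch only: contrasts the fixed-function model with the
     * programmable model DirectX 8/9 moved toward. All names are invented;
     * this is not Microsoft's or any GPU vendor's actual interface. */
    #include <stdio.h>

    typedef struct { float r, g, b; } Color;

    /* Fixed function: the hardware offers one hard-wired operation,
     * here a simple modulate (texture times vertex color). */
    static Color fixed_modulate(Color tex, Color vtx) {
        Color out = { tex.r * vtx.r, tex.g * vtx.g, tex.b * vtx.b };
        return out;
    }

    /* Programmable: the application supplies an arbitrary routine the chip
     * runs for every pixel; per-pixel floating-point math is what makes
     * effects like this toy "toon" quantization possible. */
    typedef Color (*PixelShader)(Color tex, Color vtx);

    static Color toon_shader(Color tex, Color vtx) {
        float lum = 0.30f * tex.r + 0.59f * tex.g + 0.11f * tex.b;
        float band = lum > 0.66f ? 1.0f : (lum > 0.33f ? 0.6f : 0.2f);
        Color out = { band * vtx.r, band * vtx.g, band * vtx.b };
        return out;
    }

    int main(void) {
        Color tex = { 0.8f, 0.5f, 0.2f }, vtx = { 1.0f, 1.0f, 1.0f };
        PixelShader shader = toon_shader;  /* any program can be swapped in */
        Color a = fixed_modulate(tex, vtx);
        Color b = shader(tex, vtx);
        printf("fixed: %.2f %.2f %.2f  programmable: %.2f %.2f %.2f\n",
               a.r, a.g, a.b, b.r, b.g, b.b);
        return 0;
    }

The question Glaskowsky raises below is exactly this: how long and how complex that per-pixel routine is allowed to be.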

"It's all about programmability now. That's the new battleground," said Peter Glaskowsky, editor of the Microprocessor Report. "These chips are not distinguished by the number of parallel pipelines or clock rates anymore. The key issue is how much can you do to each pixel you draw, how many programmable instructions you can run per pixel."

In this environment, ATI tacked into the wind. Unlike past cores that aimed for good-enough graphics, the R300 represented an effort to match or beat the best desktop chip Nvidia might offer. "We didn't say get the best performance at 10 x 10-mm die size. We just said get the best performance. We had to go out and capture the flag," said Orton.

And last August, ATI did just that, launching its Radeon 9700 six months before Nvidia shipped its GeForce FX part for DirectX 9. That lead, a rarity in PC graphics where new cores ship every 12 to 16 months, was as much a triumph of execution for the reorganized ATI as the result of stumbles at Nvidia.

According to Macri, the tale of the tape came down to two strategic decisions. Nvidia opted for a 128-bit memory bus linked to next-generation GDDR-2 memory and made in the latest 130-nanometer copper process. ATI chose a 256-bit-wide memory bus geared for fast transfers over existing GDDR memory and made in a 150-nm technology, effectively the last generation of aluminum interconnects.
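
The bandwidth arithmetic behind those choices is easy to check. Here is a minimal C sketch, assuming the commonly cited memory clocks of the shipping parts (about 310 MHz DDR on the Radeon 9700 Pro, about 500 MHz GDDR-2 on the GeForce FX 5800 Ultra; neither figure appears in this article):

    /* Peak memory bandwidth = (bus width in bytes) x (memory clock)
     * x (transfers per clock); DDR and GDDR-2 both transfer twice per
     * clock. The clocks below are commonly cited specs, used here only
     * to illustrate the bus-width tradeoff. */
    #include <stdio.h>

    static double gbytes_per_sec(int bus_bits, double mem_mhz, int xfers_per_clk) {
        return (bus_bits / 8.0) * mem_mhz * 1e6 * xfers_per_clk / 1e9;
    }

    int main(void) {
        printf("R300, 256-bit @ 310 MHz DDR:     %.1f GB/s\n",
               gbytes_per_sec(256, 310.0, 2));  /* ~19.8 GB/s */
        printf("NV30, 128-bit @ 500 MHz GDDR-2:  %.1f GB/s\n",
               gbytes_per_sec(128, 500.0, 2));  /* ~16.0 GB/s */
        return 0;
    }

On those assumptions the wider, slower bus comes out ahead on paper, one way to read how a 150-nm part outran a 130-nm one.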

"You can blame me for the 150-nm decision," said Buchner, who leads on silicon issues in his CTO role. "It was one of the biggest unending arguments in the company. It was not a question of 130 nm not being ready. It was more about trying to hit the ground running and asking how many risks we want to take."

For its part, Nvidia attributed its delays on the GeForce FX to "getting the chips to yield at speed," said Dan Vivoli, vice president of marketing. Nvidia recently signed IBM as a fab partner in addition to its existing one, Taiwan Semiconductor Manufacturing Co. Some think that move could give it an edge getting to 90-nm chips.

While the high-end graphics chips get much of the attention, chip sets with integrated graphics have swept across the market, becoming in some ways more strategic. Market watcher Mercury Research (Scottsdale, Ariz.) estimates that by the start of 2003, as much as half of all desktops used integrated graphics, a category that didn't exist before 1999. In this sector, ATI's progress has come more slowly.

ATI combined its internal teams with those from ArtX and another acquisition, Chromatic Research. After misfiring on a couple of generations, ATI aimed at its core notebook users and hit the jackpot with the U1, which shipped last summer. ATI's integrated chips now sell at about a million units per quarter, a notch above Nvidia's integrated parts, said Dean McCarron, principal at Mercury Research.

The company will pitch the integrated chip set that is to debut in May for low-cost, high-performance consumer systems. That positioning might be a sop to soften competition with technology partner Intel, which could continue to command the chip set space for business desktops that don't require heavy graphics.

Also next month, Microsoft Corp. will make a move that will likewise strengthen ATI's hand in chip sets. At the Windows Hardware Engineering Conference, Microsoft will detail plans to use a 3-D interface on its next version of Windows, dubbed Longhorn and slated for 2005. Such a move could bolster ATI as one of the few chip makers capable of easily rolling out a DX 9 chip set in time for Longhorn's release. A higher profile for 3-D could ultimately boost the fortunes of all the graphics companies and set the stage for the PC, not the console, to command the most attention among software developers.

"When the OS has 3-D as part of the environment, that's the point when 3-D moves into everybody's world," said Dave Rolston, who heads ATI's 175-person Silicon Valley office.

In this expanding environment, the ATI vs. Nvidia battle has lots of legs. Nvidia will launch within days its next-generation core, the NV-35, which is expected to sport a 256-bit memory bus and other major enhancements. "It will be unambiguously the best," said Vivoli.

Meanwhile, both companies have hit the market with versions of their latest DX 9 parts aimed for mainstream desktops, where most of the money in PC graphics lies. "The design wins are happening right now," said analyst McCarron.

Beyond that the two are set to slug it out all over again with their next-generation cores built for the new PCI Express interconnect. ATI is also challenging Nvidia for the graphics socket in the next-generation Microsoft Xbox and is charging into consumer applications such as cell phones and set-top boxes. "In 2004 ATI will become a visual-computing company beyond the PC. We've got to get into a faster-growing part of the market," Orton said.
 
It actually sounds like ATi doesn't exist any more... sounds more like ArtX completely took over what was ATi. ^_^ Neat!
 
Nvidia's "equal"? Don't know where these guys have been, but with the 9700 Pro and the 9800 Pro, ATI is KING.. at least for the time being.
 
CaptainHowdy said:
Nvidia's "equal"? Don't know where these guys have been, but with the 9700 Pro and the 9800 Pro, ATI is KING.. at least for the time being.

I think he meant in terms of industry leverage/OEM contracts/mindshare.
 
I also have to laugh.. 3Dfx gives NVIDIA their cursed legacy, and ArtX gives ATI startup-fervor success :p

Never been a 3Dfx fan, but I won't hesitate to say 'told you so'! The 'FX' in GeForce FX is supposed to represent their contribution.. WOOPS :p
 
zurich said:
I also have to laugh.. 3Dfx gives NVIDIA their cursed legacy, and ArtX gives ATI startup-fervor success :p

Never been a 3Dfx fan, but I won't hesitate to say 'told you so'! The 'FX' in GeForce FX is supposed to represent their contribution.. WOOPS :p


:LOL: next thing you know, Sega will contribute to making the XBOX2 and it will only sell 10 million units.... oh wait, they don't need Sega for that :LOL: :LOL:
sorry, couldn't resist
 
zurich said:
I also have to laugh.. 3Dfx gives NVIDIA their cursed legacy, and ArtX gives ATI startup-fervor success :p

Never been a 3Dfx fan, but I won't hesitate to say 'told you so'! The 'FX' in GeForce FX is supposed to represent their contribution.. WOOPS :p


If Nvidia had used what they gained from the 3dfx buyout, they would be king, but they have been cocky. The buyout was only to wipe out the competition; they wasted amazing technology, and this is why they suck now.
 
CaptainHowdy said:
zurich said:
I also have to laugh.. 3Dfx gives NVIDIA their cursed legacy, and ArtX gives ATI startup-fervor success :p

Never been a 3Dfx fan, but I won't hesitate to say 'told you so'! The 'FX' in GeForce FX is supposed to represent their contribution.. WOOPS :p


If Nvidia had used what they gained from the 3dfx buyout, they would be king, but they have been cocky. The buyout was only to wipe out the competition; they wasted amazing technology, and this is why they suck now.


Um, no, they suck because they absorbed over 100 engineers in one sitting without the proper management infrastructure to support it. Not to mention that those engineers may or may not have sucked, judging from past 3dfx products.

Ever since the TNT days NVIDIA's been a juggernaut, then suddenly over 100 slackers join up and things start to go sour. Gee, it must have been the core NVIDIA employees who had 6 generations of product wins under their belt! Riiight.. :rolleyes:

What tech did 3dfx have that NVIDIA should have used? T-Buffer? :LOL: Chip development is a gradual, planned process; you can't just throw out 6 generations of HW for some crazy Gigapixel tech or whatever.

Anyways, I'm not going to comment any more on this, because it looks like it'll turn into a frothing-at-the-mouth BLEEPboy thread.
 
3dfx was the first to bring out anti-aliasing; everyone scoffed, but look at it now. It's debatable that they SUCKED. The TNT SUCKED, as did the TNT2, and the GeForce was the first card they semi got right. The Voodoo was slower, but its image quality was far better than the Nvidia lineup's.

Nvidia did not absorb 3dfx to put them to use; they wanted to destroy the only real competition they had at the time. It wasn't about what 3dfx had released, it's about what they had in the works.. 3dfx put so much into their unreleased card that it could never have hit market, but if Nvidia had run with it, they would have come out on top: 52-bit color (yes, it's an unusual number, but that was the official spec), the tile rendering technology 3dfx had purchased a bit before. All of that could have added up to some major leaps for Nvidia; they just let it all go to waste.
 
3dfx was the first to bring out anti-aliasing

Actually that was PowerVR; the Neon250 had FSAA. But 3DFX were the first to bring out a card with FSAA as a major selling point. As in, the card was fast enough for its FSAA to be usable at a good framerate in most, or even all, games.
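
For anyone who hasn't seen it spelled out, here is a minimal C sketch of what FSAA-by-supersampling amounts to: shade at a multiple of the target resolution, then box-filter down. This is the generic ordered-grid textbook form, not 3dfx's actual rotated-grid/T-buffer scheme.

    /* Toy full-scene anti-aliasing by 2x2 ordered-grid supersampling:
     * shade four subpixel samples per output pixel and average them. */
    #include <stdio.h>

    #define W 4  /* output width  */
    #define H 4  /* output height */
    #define S 2  /* samples per axis: 2x2 = 4 samples per pixel */

    /* Stand-in for the renderer: a hard-edged checkerboard at sample
     * resolution plays the role of the scene being drawn. */
    static float shade(int x, int y) {
        return ((x + y) % 2) ? 1.0f : 0.0f;
    }

    int main(void) {
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++) {
                float sum = 0.0f;
                /* average the S*S high-resolution samples under this pixel */
                for (int sy = 0; sy < S; sy++)
                    for (int sx = 0; sx < S; sx++)
                        sum += shade(x * S + sx, y * S + sy);
                printf("%.2f ", sum / (S * S));  /* filtered output pixel */
            }
            printf("\n");
        }
        return 0;
    }

Hard edges at sample resolution come out as intermediate values in the output; that smoothing, applied across the whole frame at playable speed, is the selling point being discussed.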
 
Teasy said:
Actually that was PowerVR; the Neon250 had FSAA. But 3DFX were the first to bring out a card with FSAA as a major selling point. As in, the card was fast enough to use FSAA in most, or even all, games.

Actually I believe the Verite had (hardware-assisted) edge anti-aliasing. Never mind the fact that AA had been invented years prior, just not in the consumer space.
 
Ty said:
Teasy said:
Actually that was PowerVR; the Neon250 had FSAA. But 3DFX were the first to bring out a card with FSAA as a major selling point. As in, the card was fast enough to use FSAA in most, or even all, games.

Actually I believe the Verite had (hardware-assisted) edge anti-aliasing. Never mind the fact that AA had been invented years prior, just not in the consumer space.
And the Voodoo2 too, but it wasn't really used.
Maybe Edge AA was used in a Tomb Raider game.
 
Stefan Payne said:
Ty said:
Teasy said:
Actually that was PowerVR; the Neon250 had FSAA. But 3DFX were the first to bring out a card with FSAA as a major selling point. As in, the card was fast enough to use FSAA in most, or even all, games.

Actually I believe the Verite had (hardware-assisted) edge anti-aliasing. Never mind the fact that AA had been invented years prior, just not in the consumer space.
And the Voodoo2 too, but it wasn't really used.
Maybe Edge AA was used in a Tomb Raider game.

VQuake was actually quite nice looking.. a hell of a lot sharper than GLQuake on the V1.
 
zurich said:
I also have to laugh.. 3Dfx gives NVIDIA their cursed legacy, and ArtX gives ATI startup-fervor success :p

Never been a 3Dfx fan, but I won't hesitate to say 'told you so'! The 'FX' in GeForce FX is supposed to represent their contribution.. WOOPS :p

Yeah, I pretty much agree with this. How else do you explain that previously invincible nVidia goes to the crapper as soon as they start advertising "3dfx technology", while previously mediocre ATi becomes godlike once they acquire ArtX?

Bit more than raw coincidence, I think.

Besides, just look at the whole GeForceFX strategy. The entire process reeks of 3dfx.

- Incredibly cool pre-launch marketing hype.
- Paper launch of a vaporware product.
- Takes many months to arrive, and still doesn't exist in any significant quantity.
- Low-end version is too slow; high-end version is too big, too hot, too noisy, and too expensive.
- Drivers are much worse than they should be.
- New corporate strategy depends on making exclusivity deals with game developers. I wonder if they'll still call it GLide.
- Nonexistent videocard based on hyperadvanced technology is supposed to save the day - when it arrives!


PS: If nVidia actually manages to deliver the Rampa... I mean NV35, I'll forgive them. Of course, a paper launch within days probably means it'll be available around December.
 
Stefan Payne said:
And the Voodoo2 too, but it wasn't really used.
Maybe Edge AA was used in a Tomb Raider game.

That wasn't hardware assisted though. Well, ok maybe the lines are getting blurred now. Perhaps Simon would care to step in (once again on this topic) and set up some parameters.
 
Nvidia rested on one chip (GeForce) basically the whole time. It was only a matter of time before someone caught or passed them. ATI basically caught them with the R200 and surpassed them with the R300.
 
Beyond that the two are set to slug it out all over again with their next-generation cores built for the new PCI Express interconnect. ATI is also challenging Nvidia for the graphics socket in the next-generation Microsoft Xbox and is charging into consumer applications such as cell phones and set-top boxes. "In 2004 ATI will become a visual-computing company beyond the PC. We've got to get into a faster-growing part of the market," Orton said.

:?:
 
jvd said:
Nvidia rested on one chip (GeForce) basically the whole time. It was only a matter of time before someone caught or passed them. ATI basically caught them with the R200 and surpassed them with the R300.

Not really, the NV2x was quite different from the NV1x... and besides, many a benchmark has shown the R2x0 architecture getting smacked around by the NV2x. The NV3x is the first NVIDIA part since the Riva128 to suck feature-wise and speed-wise.. even if the original TNT wasn't at the top of the charts, it had ace compatibility and many things that 3Dfx wouldn't put in till the VSA-100 (32-bit colour, 8-bit stencil buffer, 32-bit Z, etc etc).

Right now ATI may shine in the enthusiast market, but there are many other places where no one will touch their products... (namely workstations). Try loading up Maya with a Radeon or FireGL (the ATI-branded cards).. not pretty. Even SoftImage's shaders won't work on ATI parts, not because of CG but because their OpenGL ICD is more or less focused on games, leaving many apps broken.

So while ATI may have gotten one market right, they're neglecting another.. and I won't hold my breath to see if they can fix one thing without screwing up another. For the computer guy who does everything from games to CAD to 3D animation, NVIDIA is really the only choice (and I'm not talking Quadros; even the lowly GF2mx is more than capable of putting in a day's work in 3DS/Maya, then logging some RTCW fun).

However, ATI scores 10s on their campaign to regain mindshare. The best thing they could possibly do is get the XB2 contract from MS. NVIDIA's already gained boatloads of support by leveraging their NV2A in the XB with PC devs for ports of XB games and such. Two ATI-based consoles and an ATI-influenced PC market is definitely their CEO's wet dream.
 