ZenThought
Newcomer
http://online.wsj.com/article/0,,SB108190075599182040,00.html?mod=yahoo_hs&ru=yahoo
If you don't have access to the WSJ, here is the content:
----------------------------------------------
SANTA CLARA, Calif. -- Companies in Silicon Valley often say their technology is sexy. Jen-Hsun Huang, chief executive of Nvidia Corp. here, is showing off something that really is.
It is a computer-generated mermaid named Nalu, with a cloud of golden tresses that realistically seem to reflect dappled light and flow with the water. Nalu has rosy, unusually lifelike skin, and she is displaying generous quantities of it with a flirtatious wiggle.
More than 2,000 miles away, in a suburb of Toronto, executives at rival ATI Technologies Inc. are preparing a coming-out party for Ruby, another computerized creation with a skin texture unusually detailed for a videogame character, a shock of red hair and a pneumatic chest.
The two characters are unlikely soldiers in a fast-moving technology battle helping to shape the evolution of digital entertainment. Nvidia and ATI, the two leading providers of chips that control graphics on personal computers and other gadgets, developed the animated figures to demonstrate the power of the improved technology each company is unveiling this month. Animation grows more realistic as images are built from a greater number of geometric building blocks, called polygons. Nalu is composed of 300,000 polygons; Ruby has 80,000 -- both far in excess of what most video images today have.
This is why voluptuous Nalu and Ruby are more than just eye candy: Human skin and hair have been among the hardest textures to simulate convincingly, so the technology behind them represents a breakthrough.
Nvidia, which had a prior computerized model dubbed Dawn, says its creations are realistic enough to draw offers to appear in TV commercials. "We are getting calls from Hollywood agencies," says Mark Daly, a vice president in charge of digital content. "It's pretty weird."
Nevertheless, both companies are still a long way from the industry's ultimate goal -- artificial worlds that are indistinguishable from reality. But in the hands of skilled programmers, the chips will help bring a new level of realism and emotional force to games by creating characters that are more convincing when they move, talk, laugh and cry. Over time, such chips are likely to inspire richer forms of entertainment that appeal to broader audiences, where story lines and character development are as important as action.
"The videogame market is now mainly targeted at young men," Mr. Huang says. Soon, he adds, "you could imagine interactive soap operas."
Nvidia and ATI, which both have sales of about $2 billion and market capitalizations topping $4 billion, are longtime rivals in the graphics-chips business. Although ATI was a graphics pioneer, Nvidia outstripped it several years ago to dominate the sector. Lately ATI has been regaining market share. Microsoft Corp. picked ATI to supply chips for its next videogame system, succeeding Nvidia, which makes chips for the current Xbox system.
(Photo caption: Dueling demos: Nvidia's Nalu and ATI's Ruby, right, illustrate the lifelike imagery made possible by the new graphics chips.)
Still, Nvidia accounted for 58% of the 23 million graphics chips sold for desktop PCs in the fourth quarter of 2003, compared with 38% for ATI, says industry watcher Mercury Research.
Mr. Huang vows to take the lead with the new chip the company is introducing today at a company-sponsored event expected to draw hundreds of gamers to San Francisco. ATI is also confident about Ruby's prospects in the graphics beauty pageant. "I don't think there is any doubt that we will win this round," says Rick Bergman, an ATI senior vice president in charge of its desktop products.
The competition between the two makers has caused the power of graphics chips to double every year. By comparison, Intel Corp.'s microprocessors typically double in performance every 18 months or so. Nvidia's new GeForce 6 chips, for example, have a whopping 222 million transistors -- nearly twice the number in Intel's most powerful Pentium 4 microprocessor.
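(Aside from me: that gap compounds quickly. Here is a minimal sketch of the arithmetic; nothing in it is specific to any actual chip.)
[code]
#include <cmath>
#include <cstdio>

// Back-of-the-envelope comparison of the two doubling rates cited
// above: graphics chips doubling every 12 months versus CPUs
// doubling roughly every 18 months.
int main() {
    for (int years = 1; years <= 4; ++years) {
        double gpu = std::pow(2.0, years);        // doubles every year
        double cpu = std::pow(2.0, years / 1.5);  // doubles every 18 months
        std::printf("after %d year(s): graphics chip x%.1f, CPU x%.1f\n",
                    years, gpu, cpu);
    }
}
[/code]
(After three years, yearly doubling gives 8x while 18-month doubling gives only 4x.)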
And while Intel's chips have circuitry for one electronic brain handling calculations, graphics chips have multiple processors for specialized jobs performed in parallel. Nvidia's latest chip has the equivalent of 32 specialized brains; ATI will disclose details of its new chip at its own event this month.
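(Another aside: "specialized brains working in parallel" just means the per-pixel work is split across many units at once. Below is a toy C++ sketch in which worker threads stand in for the chip's pixel pipelines; the frame size and pipeline count are invented for illustration and say nothing about how the real hardware is wired.)
[code]
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

// Toy model of parallel pixel pipelines: each worker thread shades
// one horizontal band of the frame. All numbers are made up.
constexpr int kWidth = 640, kHeight = 480, kPipes = 32;

void shade_band(std::vector<unsigned char>& frame, int y0, int y1) {
    for (int y = y0; y < y1; ++y)
        for (int x = 0; x < kWidth; ++x)
            frame[y * kWidth + x] = static_cast<unsigned char>((x ^ y) & 0xFF);
}

int main() {
    std::vector<unsigned char> frame(kWidth * kHeight);
    std::vector<std::thread> pipes;
    const int band = kHeight / kPipes;  // 480 / 32 = 15 rows per worker
    for (int p = 0; p < kPipes; ++p)
        pipes.emplace_back(shade_band, std::ref(frame),
                           p * band, (p + 1) * band);
    for (std::thread& t : pipes) t.join();
    std::printf("shaded %d pixels across %d parallel workers\n",
                kWidth * kHeight, kPipes);
}
[/code]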
Such chips, built into PCs or sold on accessory graphics cards, are the source of most of the images on a PC screen. As such, they are increasingly important to the design decisions of engineers, movie studios, advertising agencies and Web developers, says Jon Peddie, an industry analyst in Tiburon, Calif. The standard microprocessor, in many instances, acts as a mere "coprocessor" to the graphics chips, he says.
When creating a game, designers define geometric models of objects and characters and then determine how they will move. They later define visual textures, such as simulated skin, cloth, wood or metal, and arrange artificial light sources that determine colors and shadows. A computing step called rendering creates the images that users see. The more powerful the graphics chip, the greater the degree of detail and range of choices a designer has at his disposal.
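(Aside: to make that pipeline concrete, here is a toy version of its last step, rendering: projecting a model's 3-D vertices down to screen pixels. All coordinates and the screen size are made up.)
[code]
#include <cstdio>

// Toy render step: take one triangle's 3-D vertices and project them
// onto a 640-pixel-wide screen. Values are invented for illustration.
struct Vec3 { float x, y, z; };

int main() {
    const Vec3 tri[3] = {{ 0.0f,  1.0f, 2.0f},   // the geometric model:
                         {-1.0f, -1.0f, 2.0f},   // three vertices forming
                         { 1.0f, -1.0f, 2.0f}};  // a single polygon
    const float kHalf = 320.0f;  // half of a 640-pixel screen width
    for (const Vec3& v : tri) {
        // Perspective projection: divide by depth, then scale to pixels.
        float sx = kHalf + kHalf * (v.x / v.z);
        float sy = kHalf - kHalf * (v.y / v.z);
        std::printf("vertex -> pixel (%.0f, %.0f)\n", sx, sy);
    }
}
[/code]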
In animated movies, with a set number of scripted scenes, studios can throw hundreds of computers into a final rendering process that creates ultrarealistic images. Computer games typically have lacked the realism of movies because they offer nearly infinite possibilities for action and have to render scenes as users play them.
These latest Nvidia and ATI chips are narrowing the gap in image quality between games and movies. A key reason is a set of programming conventions, defined by Microsoft, that assigns a tiny piece of software to define the light and shade on each of the hundreds of thousands of picture elements, or pixels, that make up a display screen.
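(Aside: that "tiny piece of software" per pixel is what is now called a pixel shader. Real ones are written in a GPU shading language such as Microsoft's HLSL and run on the graphics chip; this C++ stand-in only shows the idea of computing light and shade one pixel at a time, with invented colors and light angles.)
[code]
#include <algorithm>
#include <cstdio>
#include <initializer_list>

// C++ stand-in for a pixel shader: one small function, run per pixel,
// that decides that pixel's light and shade.
struct Color { float r, g, b; };

Color pixel_shader(Color base, float n_dot_l) {
    // Diffuse lighting plus a small ambient floor, clamped so that
    // surfaces facing away from the light receive no diffuse light.
    float lit = 0.1f + 0.9f * std::max(n_dot_l, 0.0f);
    return {base.r * lit, base.g * lit, base.b * lit};
}

int main() {
    const Color skin = {0.9f, 0.7f, 0.6f};      // made-up "skin" base color
    for (float n_dot_l : {1.0f, 0.5f, 0.0f}) {  // facing the light ... edge-on
        Color c = pixel_shader(skin, n_dot_l);
        std::printf("n.l = %.1f -> rgb(%.2f, %.2f, %.2f)\n",
                    n_dot_l, c.r, c.g, c.b);
    }
}
[/code]
(The same function runs for every pixel, which is exactly why the parallel hardware described above pays off.)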
As graphics chips become more powerful, the hardware movie studios and game makers use eventually could become the same, allowing them to swap scenes and characters. "It's going to become completely possible to have the graphics engines used for gaming also used in film rendering," says John Carmack, co-founder and technical director of id Software, which is finishing a long-awaited game called Doom III that took four years to produce.
Game makers already are moving toward film-quality images. Another high-profile game sequel called Half-Life 2 is expected to produce unusually realistic people, even on existing graphics chips. Valve, a closely held software company in Kirkland, Wash., has developed a system that simulates 40 muscle movements in human speech -- one of the most difficult actions to mimic. The movements will be used by protagonist Gordon Freeman, a virtual character who already is a star from the first Half-Life game.
Graphics chips also have helped games become a spectator sport, attracting onlookers who watch the action at tournaments on big display screens. "The nicer the games look, and the more realistic they look, the more appealing it is for people to watch," says Craig Levine, manager of Team 3D, a gaming team whose sponsors include Nvidia.