Tech-Report R300/NV30/DX9 preview

Just to clarify some things
RenderMan renderers perform shading computations at 32-bit floating-point precision per channel (3 channels for color and 3 for opacity, 192 bits total). Usually the final image is saved as a 64-bit TIFF.
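To put rough numbers on that, here's a quick back-of-the-envelope sketch. The 2048x1556 frame size and the RGBA reading of "64-bit TIFF" are just my assumptions for illustration:

```python
# Back-of-the-envelope sketch of the figures above. The 2048x1556 "2K film"
# resolution is an assumed example, and "64-bit TIFF" is read as 16-bit RGBA.
width, height = 2048, 1556

shading_channels, shading_bits = 6, 32    # 3 color + 3 opacity at 32-bit float
tiff_channels, tiff_bits = 4, 16          # 16 bits/channel RGBA -> 64 bits/pixel

bits_per_sample = shading_channels * shading_bits   # 192 bits while shading
bits_per_pixel_out = tiff_channels * tiff_bits      # 64 bits in the saved file

print(f"{bits_per_sample} bits/sample shaded, {bits_per_pixel_out} bits/pixel stored")
print(f"one frame: {width * height * bits_per_sample / 8 / 2**20:.0f} MB shaded, "
      f"{width * height * bits_per_pixel_out / 8 / 2**20:.0f} MB on disk")
```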

Certainly OpenGL 2/DX9 class hardware can accelerate some portion of the shading, but rendering production-quality frames in near-real-time is only a dream.
Actually, I can’t see how offline and real time rendering can eventually converge.
Sure, they will use the same hardware, but since offline rendering doesn’t care much about rendering time, big animation houses will continue to use huge renderfarms, and they will use the extra power of the new cards to accelerate previously impractical algorithms, like true area lights and Global Illumination. So, offline rendering will always have better quality.
 
Yupp, Blinn's law will prevail! :LOL: However, if gfx cards gain the accuracy and functionality necessary to accelerate offline rendering, we might see a sudden big leap in the quality of special effects on the big screen. The one thing that is often missing in scenes that look CG is accurate lighting. Hopefully hardware acceleration can help make GI more viable for SFX; I am only aware of a few cases where it is used in motion pictures today...

Regards / ushac
 
Pavlos said:
Actually, I can’t see how offline and real time rendering can eventually converge.
Sure, they will use the same hardware, but since offline rendering doesn’t care much about rendering time, big animation houses will continue to use huge renderfarms, and they will use the extra power of the new cards to accelerate previously impractical algorithms, like true area lights and Global Illumination. So, offline rendering will always have better quality.

I bristle when I hear words like "never" and "always" in discussions of computer technology. Do you mean that my quantum GPU running at 500 terahertz in the year 2024 won't be able to do true area lights and global illumination in real time? Are you quite certain about that?

Remember 20 years ago, when we were using 4.77 MHz CPUs and 320x200 monochrome video? Today's PCs with GPUs are literally many thousands of times more powerful in graphical computation than the PCs back then. It would have been inconceivable to a computer graphics artist in 1980 that something like Final Fantasy could be rendered on computers without taking hundreds of years.

And I don't think there will *always* be another better algorithm/effect that will require the use of powerful renderfarms. There is a practical limit to how much computational power you need to render "reality" in real time - primarily the limits of human perception. Once you have methods to make any computer generated scene appear absolutely "real" to a viewer, there is no such thing as making it "more real". And we are not many years off from achieving this goal of computer generated reality using render farms. Then as computational power continues to grow, we'll eventually reach the point where we have enough power to render this in real time.

I'd bet within 10 years we will see fully computer-generated features that are indistinguishable from film. Complete with famous "actors" and "actresses" that are really just figments of artists' imaginations. Once we can do it with a render farm, it's just a matter of waiting for the technology to do it in real time.
 
OK, so it's a bit strong to say "always", but what was probably meant was "for the foreseeable future". Famous computer graphics pioneer James Blinn made the observation that "rendering time is constant": as more processing power becomes available, artists will add more detail, crank up rendering quality etc. until rendering time is as high as they can afford, whether that's a few minutes or an hour. Until we have the processing power to render ANY scene, in real time, with quality indistinguishable from real footage, real-time quality WILL lag motion picture sfx quality. Granted, as quality edges closer to that goal (which I still say is quite some time away), the lag will probably grow smaller and smaller.

Regards / ushac
 
Yup. I guess I'm just pointing out that while Blinn's Law is enforceable right now, eventually the law will have to be repealed. ;)
 
The thing with the 96 vs 128 bit FP debate: if the number of passes an R300 can do efficiently is lower than the number it takes for color artifacts to start showing, I think 96 bits is quite sufficient from an engineering viewpoint.
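To see why the pass count matters, here's a toy sketch of how rounding error in a running color total grows with the number of blended passes at different mantissa widths (10, 16 and 23 bits, roughly fp16, fp24 and fp32). The step size and pass count are made up purely for illustration, not measured from any card:

```python
import math

def round_to_mantissa(x, mantissa_bits):
    """Crude emulation of rounding to a float with the given mantissa width."""
    if x == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(x)))
    scale = 2.0 ** (mantissa_bits - exp)
    return round(x * scale) / scale

def accumulate(passes, mantissa_bits, step=0.001):
    """Add a small contribution per pass, rounding the running total each time."""
    color = 0.0
    for _ in range(passes):
        color = round_to_mantissa(color + step, mantissa_bits)
    return color

exact = 64 * 0.001
for bits in (10, 16, 23):   # roughly fp16, fp24 and fp32 mantissa widths
    print(f"{bits:2d}-bit mantissa, 64 passes: {accumulate(64, bits):.6f} (exact {exact:.6f})")
```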
 
Pete said:
Wavey, you can "redeem" yourself with your R300/NV30 reviews. No pressure. :D

Well, one thing's for sure, whatever I may have said about ATi PR before, scratch it...
 
SteveG said:
Pavlos said:
I'd bet within 10 years we will see fully computer-generated features that are indistinguishable from film. Complete with famous "actors" and "actresses" that are really just figments of artists' imaginations. Once we can do it with a render farm, it's just a matter of waiting for the technology to do it in real time.


The implications of this on the entertainment industry are mind-boggling (not the least of which: what would this do to screen actors, especially the 'pretty' ones like Brad Pitt/Halle Berry et al.).


Well, looks like another disruptive technology is coming down the pipe. If only someone would do this for the oil/energy industry... sheesh... I can dream, can't I? :)
 
Yes, they’ve been very communicative recently. They ended up sending more than replies though! (and that is about as much as I can say on that)
 
Good to hear. OpenGL guy start bustin' heads in the PR department for you?

Looking forward to the 9000 (and whatever else they sent) review(s).
 
KnightBreed said:
Good to hear. OpenGL guy start bustin' heads in the PR department for you?

Looking forward to the 9000 (and whatever else they sent) review(s).

I don't think it's a 9000. I am pretty sure it's a 9700, or both, otherwise Dave wouldn't be this excited!

I don't think he would post something like
Well, one thing's for sure, whatever I may have said about ATi PR before, scratch it...
and in particular this
Yes, they’ve been very communicative recently. They ended up sending more than replies though! (and that is about as much as I can say on that)
if ATI were only sending him a 9000.

Fuz
 
Sooner or later, computer graphics are going to surpass what TVs and monitors can display. At that point, graphics development will grind to a near-halt; the development of display systems like HDTV and Plasma Screens is extremely slow and I don't expect anyone to plug their brains into "The Matrix" or step into a Holo-Deck anytime soon.

If that seems far-fetched, keep this in mind:

20 years ago, the most advanced offline-raytraced 3d graphics (mostly cartoon animals) were far lower quality than computer games today. Real-time rendered 3d graphics ("Battlezone") were wireframe cubes and pyramids.

10 years ago, the best offline-rendered CG effects ("Terminator 2") were horribly expensive, and not much better than realtime graphics today. Realtime rendered 3d graphics ("DOOM") were done at 320x200 using 2d sprites on a sector-based 2.5d map.

Nowadays, offline CG looks incredibly life-like (check out Medivh in the Warcraft 3 cinematics); in 5-10 years CG people will be nearly indistinguishable from actors, although still horribly expensive and difficult to produce. Once this happens, rendering time will no longer be constant: after you have photo-realistic quality, faster video capabilities will simply mean that rendering goes faster; there will be no more effects to add. It's going to happen much faster than you'd think. A decade after the 3d tech demo "The Mind's Eye", we have the infinitely more impressive 3d tech demo movie "Final Fantasy: The Spirits Within". Think of what another 10 years can do for CG; with visual quality like that, why would anyone need to waste hundreds of thousands of offline hours?

As for real-time rendered stuff, it will take more time, but is equally inevitable. No one will ever accuse a DOOM3 scientist of being photorealistic. However, given the difference in visual quality from DOOM to DOOM3, does anyone doubt that in 2012 the PlayStation6 will be rendering frames that surpass "Final Fantasy: The Spirits Within" in visual fidelity? By then, there will be little need to have more powerful video cards. After all, there is not much difference between rendering at 4x jittered antialiasing and 9x.

At this rate of progress, the visual quality of computer games will become indistinguishable from that in movies by the year 2025... perhaps the videocards used will be the same as well.
 
oh boy!

boy oh boy, you guys are missing a huuuuuuuuuuuuuuuuuuuge part of realtime 'stuff'.

i agree, at some time in the future when film cg will be nearly indistinguishable from real life there will be very little room for 'improvement'.

so the gap in terms of cg visuals will get closer and closer.

but you got to realize, film cg is about looking real, not BEING real. film cg is linear and set in stone. you need a fiery explosion in scene 132? no problemo! artists will model, animate, and fine tune it until it looks like the real thing.

real-time, interactive cg has to do more than look real, it has to BE real. what good is a photorealistic room in doom 999 if your character punches a wall and nothing happens? it's not as if an artist can model and animate each and every particle of drywall and paint and so forth :).

i come from an artistic background, so this recent development where technology enables - and therefore expects - much more from artists is extremely interesting to me. so you have pixar-level rendering ability; what good will that be if you don't have the artists to create the content to back it up?

it's crazy enough with linear entertainment like film, where you have hundreds of artists working on two measly hours of footage.

it gets super-duper crazy when it comes to real-time stuff. realistic visuals are going to DEMAND realistic behavior, and the content-creation requirements will skyrocket well beyond film cg.

the only thing that makes sense to me is a system that more or less recreates the physical world. on a set a director doesn't need to build from the ground up things like lights, props, actors, etc. he just uses existing things. i think it will be the same on the 'digital set'. software will simulate the world as we know it so when the doomguy from doom999 punches the wall, the software will take into account the physics and the materials of both fist of doomguy and wall of doomwall :) and all the other million variables that would make it believable.

it'll be like the holodeck without the holograms :).

i know we are all excited about the possibilities of shaders defining this and that surface, but beneath the layers of polygons that we see will be a whole lot of other stuff to tackle. it will be the TRUE third dimension, and not this mock-up that we call 3d today.

who knows? maybe some day it won't be just about gpus and cpus. we'll all be talking about the latest ppus (physics processing units) and aipus (ai processing units) on a beyond3d/physics/ai forum :))).
 
jvd said:
but wouldn't they take advantage of the 128bits and 96bits if it could be done at the same speed they are doing 64bits now ? and wouldn't that make for more realistic cgi ?

1. There won't be any visible difference between 128-bit and 64-bit for most color ops that are done in a single pass, provided the DAC is no more than 10-bit.

2. Where did you get that 128-bit and 96-bit would be the same speed? There will certainly be a performance hit, and a significant one at that.

128-bit and 96-bit will most likely be relegated to special situations, such as, potentially, normal maps or other non-color calculations, or situations where you want to do many passes or output at higher than 10 bits per channel.
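To illustrate point 1, here's a toy sketch: a single-pass modulate computed at two very different internal precisions usually snaps to the same code once it hits a 10-bit DAC. The mantissa widths and color values are arbitrary illustration numbers, not anything measured:

```python
import math

def round_to_mantissa(x, mantissa_bits):
    """Crude emulation of rounding to a float with the given mantissa width."""
    if x == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(x)))
    scale = 2.0 ** (mantissa_bits - exp)
    return round(x * scale) / scale

def to_dac(x, dac_bits=10):
    """Snap a [0,1] color value to the nearest code of a dac_bits-deep DAC."""
    levels = (1 << dac_bits) - 1
    return round(max(0.0, min(1.0, x)) * levels)

# One single-pass operation: modulate a texel by a light color.
texel, light = 0.3137, 0.8421
low  = round_to_mantissa(round_to_mantissa(texel, 10) * round_to_mantissa(light, 10), 10)
high = texel * light   # "full" precision
print(to_dac(low), to_dac(high))   # almost always the same 10-bit code
```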
 
Most of the interesting points about the convergence of realtime and offline rendering have been mentioned already, but still there are some things to add.

I'd emphasize the importance of 2D calculations to get certain effects by post-processing rendered images, though; the recent Doom3 screenshots show a painful lack of these (glow/bloom filters, sharpen/blur filters, color correction, contrast adjustments etc.). Compositing is a very important part of the movie effects studios' pipelines, and thankfully there are some good uses of such things in recent games (mostly on console, though).
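For anyone curious what such a post-processing pass boils down to, here's a minimal glow/bloom sketch on a grayscale float image (threshold the highlights, blur them, add them back). The threshold, radius and gain values are arbitrary and purely for illustration:

```python
import numpy as np

def bloom(img, threshold=0.8, radius=4, gain=0.5):
    # 1. keep only the bright parts of the frame
    bright = np.clip(img - threshold, 0.0, None)
    # 2. cheap separable box blur to spread the highlights
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, bright)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)
    # 3. add the glow back onto the original frame
    return np.clip(img + gain * blurred, 0.0, 1.0)

frame = np.random.rand(64, 64)   # stand-in for a rendered frame in [0, 1]
glowing = bloom(frame)
```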

Several kinds of simulations are pretty important, too. Physics for rigid and soft bodies, cloth, crowds, hair/fur/feathers and muscles are just a few of the areas under heavy development in today's 3D applications, and most of them require lots of computing power to calculate even for 1-2 minute long shots. They also take a lot of iterations to get right, which there will be no opportunity for when everything has to run in realtime. This area will get very important soon...
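As a rough illustration of why these simulations are so hungry, here's a toy mass-spring sketch: even a trivial 1D chain of 100 points needs hundreds of tiny substeps for a 2-second shot, and production cloth/hair rigs have orders of magnitude more points and constraints. All constants here are invented for the example:

```python
def simulate_chain(points=100, frames=48, substeps=20, dt=1.0 / (24 * 20)):
    """Verlet-integrate a 1D chain of unit-mass points joined by springs."""
    pos = [float(i) for i in range(points)]   # chain laid out at rest length
    prev = pos[:]
    rest, stiffness, gravity = 1.0, 400.0, -9.8
    for _ in range(frames * substeps):
        # spring forces between neighbours, plus gravity on every point
        forces = [gravity] * points
        for i in range(points - 1):
            stretch = (pos[i + 1] - pos[i]) - rest
            f = stiffness * stretch
            forces[i] += f
            forces[i + 1] -= f
        # Verlet position update; point 0 stays pinned
        for i in range(1, points):
            new = 2 * pos[i] - prev[i] + forces[i] * dt * dt
            prev[i], pos[i] = pos[i], new
    return pos

simulate_chain()   # nearly 100k spring evaluations for 2 seconds of a tiny chain
```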

I've also been thinking about the incredible demands in manpower for such next-generation realtime 3D applications. Building just one hero character like the cave troll in LOTR requires many months of work from a whole group of people - and this monster is still very far from a human being. New kinds of companies will need to arise, for example studios might develop digital actors and lend them for game projects... Just imagine a computer game with the PR babble 'New FPS starring Joe Digital!' :))
 
A blast from the past, in reply to a comment made near the top of this thread.

Yes, I remember the PC industry of 20 years ago very clearly: CPUs ran at 4 MHz (8088 and 8086), the advanced PC model had 256 KB of RAM, and you got edlin as a productivity tool (shudder - so you quickly wrote your own visual text editor, with no compiler or graphics interface manuals, on a PC whose two 8-inch floppy drives sounded like a metal lathe). I remember DOS 1 and CP/M very well, and I remember waiting over 6 months for the C compiler to arrive - even though the salesman said it was all written and available - when my brother and I bought our first NEC APC PC in 1982 for AUD $4,500. Adjusted for inflation (a dollar then had about 3 times the purchasing power of a dollar 20 years later) and the exchange rate (about 1.25 USD per AUD then versus about 0.56 today - sigh), that is roughly USD $18,000 in today's money - and that was special student pricing, with no tax.

John Carmack said rendering farms will be for the most part obsolete by the end of 2003. Blame him for this view if you don't like it. Yes, do the artwork in Maya or whatever - but, the article says, compile the output to OpenGL or DirectX and send it to a GPU.

So animation firms win big time in a lot of production areas - and sell your stock in SGI soon, huh?

The entry-level 3D graphics platform in a year or two's time will be absolutely amazing. By 2004 I guess most of what we'd like to have today will be at our fingertips; what will we reach for then, I wonder?
 
BoddoZerg said:
Sooner or later, computer graphics are going to surpass what TVs and monitors can display.

LOL, not to be annoying, but computer graphics have already surpassed what a shitty TV can display. You realize that TVs have a resolution of less than 640x480 (and it's not exact either). Honestly, the quality of TV is horrible, and VHS video is even worse, but anyway...

What we need is more HDTVs that don't all cost $3000+ (I'd never consider spending anywhere near that amount for a TV, LOL). Seriously, have the prices on HDTVs dropped at all? I guess the TV manufacturers are content with just pumping out cheap, crappy looking TVs.

TVs are also still running interlaced at 29.97 fps, using a standard from back in the 1950s. It's pretty pathetic, really; TVs are just so far behind the times.
 
If video games become nearly indistinguishable from reality, and a violent, realistic game (say Soldier of Fortune 50) is released, then to me that brings up some huge issues related to violence in video games and how kids perceive reality vs. virtual reality - the difference between what's good in reality (not killing everything that moves, not jumping off 50-foot cliffs, not taking "damage") and what's good in a virtual reality (killing everything that moves, etc...). Yes, what I'm saying presumes something more than just "real" visuals, with the gamer sitting in front of a screen and interacting with a keyboard/mouse/game controller.

just trying to start a discussion, my opinion only -

Serge
 