The Future of 3D Graphics

alexsok

Regular
It has become something of a guessing game around here as to whether something has already been posted, so without trepidation or anguish at possibly repeating myself, I hereby present to you ExtremeTech's new article:
http://www.extremetech.com/article2/0,3973,1091392,00.asp

Here's a quote from the article, giving you a rough idea of what to expect:
The image above left is radiosity-generated, and the image above right is ray-tracing-generated. Every pixel in each of the pictures is rendered by simulating the inter-reflection of light rays within the scene. Triangles or pixels can't be drawn by themselves -- other triangles that can be "seen" from each pixel in the triangle must be taken into account, including light bouncing around the other triangles. Ultimately, a large number of light rays incident on each pixel must be calculated. In the radiosity image on the left, most of the room is lit indirectly. The light from the ceiling, for instance, is bouncing around and reflecting off various surfaces. The processing power required to render the scene is orders of magnitude more complicated than just doing simple triangle shading.
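
To make the quoted paragraph a bit more concrete, here is a tiny Python sketch of per-pixel indirect lighting - my own toy illustration, not the article's renderer; "scene.direct_light" and "scene.trace" are assumed, made-up helpers, and colour is a single float for brevity:

    import math
    import random

    def sample_hemisphere(normal):
        # Pick a uniformly distributed direction on the hemisphere around 'normal'.
        while True:
            d = [random.uniform(-1.0, 1.0) for _ in range(3)]
            if 0.0 < sum(x * x for x in d) <= 1.0:
                break
        if sum(a * b for a, b in zip(d, normal)) < 0.0:
            d = [-x for x in d]  # flip into the hemisphere facing 'normal'
        length = math.sqrt(sum(x * x for x in d))
        return [x / length for x in d]

    def shade(point, normal, scene, depth, rays_per_bounce=16):
        # Direct lighting alone is roughly what plain triangle shading gives you.
        colour = scene.direct_light(point, normal)        # assumed helper
        if depth == 0:
            return colour
        # Indirect lighting: average what many bounced rays "see" in the scene,
        # recursing for further bounces - hence the explosion in work per pixel.
        indirect = 0.0
        for _ in range(rays_per_bounce):
            hit = scene.trace(point, sample_hemisphere(normal))   # assumed helper
            if hit is not None:
                indirect += shade(hit.point, hit.normal, scene, depth - 1, rays_per_bounce)
        return colour + indirect / rays_per_bounce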

P.S.
Hope I didn't get too emotional! ;)
 
I give nVidia no credit, and that goes for David K as well. He is NOT a John Carmack... David should go work for another company, as nVidia is in some serious trouble right now... maybe ATi should hire him?
edit

He's nVidia's chief scientist, right? If I were him, I would spend less time making products that suck in the max IQ vs. speed department and concentrate on making a product that is a LOT faster without all of the hacks.

A prime example: ATi's 6x FSAA looks a hell of a lot better than nVidia's 8x.

ATi got it right in the aniso department as well: they found a solution to the problem that produces excellent IQ without killing performance...

3dfx was right way back in the day when they stated that 32-bit color was not an easy thing to do, and they never compromised IQ for speed.
The same can be said for 128-bit precision: it is not an easy thing to do, and ATi was smart to just follow the DX9 spec until they find a way to make 128-bit really fast, as no one likes the 50% performance penalty that the nV30 took.

Maybe David K should stick to the DX9 spec and drop the cinematic computing, as it definitely looks like ATi is beating them at their own game...
 
YeuEmMaiMai said:
... <snipped emotionally driven comments, which is understandable given the current hot topic>
I found a lot of interesting things mentioned in this article personally, which makes it a good article to read in my books.

YeuEmMaiMai, I think the focus of the article goes beyond the usual out-of-the-box features like AA and AF (not that I'm downplaying their importance in the grand scheme of things).
 
I find it extremely funny that this guy wants to lead 3D graphics while he is currently working for a company that is doing everything it can to

a. kill competition
b. cheat
c. produce products with crap IQ at default settings

How can you respect someone who is not leading the 3D industry but instead stifling it? After all, he is nVidia's chief scientist and has first-hand knowledge of all the crap that is going on over there.

You mean he did not honestly know that the nV30 sucked at the very things they were promoting? FSAA, aniso, 128-bit precision, shader ops, vertex processing. You are giving the guy too much credit... cinematic computing means excellent image quality... you know, like Toy Story and such...
 
YeuEmMaiMai said:
I find it extremely funny that this guy wants to lead 3D graphics while he is currently working for a company that is doing everything it can to

a. kill competition
b. cheat
c. produce products with crap IQ at default settings

How can you respect someone who is not leading the 3D industry but instead stifling it? After all, he is nVidia's chief scientist and has first-hand knowledge of all the crap that is going on over there.

You mean he did not honestly know that the nV30 sucked at the very things they were promoting? FSAA, aniso, 128-bit precision, shader ops, vertex processing. You are giving the guy too much credit... cinematic computing means excellent image quality... you know, like Toy Story and such...

I've been extremely critical of a lot of David Kirk's claims in interviews over the years, but like it or not, Nvidia is the market-share leader, and as the company's CTO, people are going to listen to Kirk when he speaks. It's an article well worth reading.
 
I read it, and while it seems interesting, not a lot of people are going to place faith in a man who works for a company that does what I have mentioned above...


With more and more AIBs starting to use ATi products, I do not see nVidia keeping their market share...
 
Josiah said:
Is this really the direction NVIDIA is going in? Real-time radiosity?
No. I think the point is that GPU horsepower is increasing, and so is flexibility, so in the future real-time radiosity and ray tracing will even be possible. The method, though, will be up to the software developer. The GPU should still be faster at traditional hardware rasterization.
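
Some rough back-of-the-envelope numbers on that gap (all of the figures below are my own assumptions, not from the article or this thread):

    # Assumed resolution, refresh rate and sampling quality, for illustration only.
    width, height, fps = 1024, 768, 60
    rays_per_pixel, bounces = 16, 2

    rays_per_frame = width * height * rays_per_pixel * (bounces + 1)
    print(f"rays per second:      {rays_per_frame * fps:,}")       # ~2.3 billion

    triangles_per_frame = 500_000            # assumed scene complexity
    print(f"triangles per second: {triangles_per_frame * fps:,}")  # 30 million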
 
oh sweet.

I went to the first half of David Kirk's presentation. Had to jet as my plane was leaving in 1.5hrs! Now I can read about the rest.
 
YeuEmMaiMai said:
I find it extremely funny that this guy wants to lead 3D graphics while he is currently working for a company that is doing everything it can to

a. kill competition
b. cheat
c. produce products with crap IQ at default settings

How can you respect someone who is not leading the 3D industry but instead stifling it? After all, he is nVidia's chief scientist and has first-hand knowledge of all the crap that is going on over there.

You mean he did not honestly know that the nV30 sucked at the very things they were promoting? FSAA, aniso, 128-bit precision, shader ops, vertex processing. You are giving the guy too much credit... cinematic computing means excellent image quality... you know, like Toy Story and such...
Did the article mention whether David was interviewed before, during or after the NV30 made its physical debut? I didn't catch that.

How can you respect someone who is not leading the 3D industry but instead stifling it?
No one person "leads" the 3D industry, regardless of which company that person works for. I fail to see a single instance in this article where we can draw a conclusion on whether David Kirk is "leading"( or proposes to lead) the 3D industry. A "Chief Scientist" is the the same as "Chief Janitor" -- they are head of a department of a (perhaps) a large corporation and he does what he is supposed to do, and perhaps express his knowledge of a topic being asked. Ignore his position and focus on what he has to say.

You are giving the guy too much credit
There is a difference between interviews and private correspondence. In interviews, whether the person being interviewed likes it or not, they probably have to live by/follow certain company guidelines... interviews rarely contain the exact thoughts the interviewee wants to express... that's where PR steps in. No matter what, as long as you work for a company, you may never like the answers in print, because they may never be the same as the answers you really and originally wanted to give.

I have 30 private emails from David Kirk that never went through PR, and his emails to me tell me he has far, far more knowledge about 3D than me (and maybe a few others here). Those tell you about a person, not interviews that are forced to go through PR.

Sometimes what you really want to say never comes out in print. I'm not saying David never really said what you read, but I know the business. For all I know, David doesn't approve of a lot of things at NVIDIA, but until he quits NVIDIA he will probably never get to say what he really wants to say while being tagged as an NVIDIA employee. I'm not defending David, just expressing my "journalist" experience.

David is a very, very smart guy.
 
YeuEmMaiMai said:
I read it, and while it seems interesting, not a lot of people are going to place faith in a man who works for a company that does what I have mentioned above...

So if the CTO of Microsoft gave a speech about where he thought computers would be in ten years, you'd ignore him? And you'd expect everybody else to ignore him too?
 
I give nVidia no credit, and that goes for David K as well. He is NOT a John Carmack... David should go work for another company, as nVidia is in some serious trouble right now... maybe ATi should hire him?

Oh, ATI is in some serious trouble right now too. Heck, nVidia missed about one product cycle (I might even say nearly 2) with the NV30 - but ATI is in good shape to miss about 3 product cycles, hehe.
In case you don't get what I'm talking about, I'm referring to the whole Loci/R400 issue. Oh, sure, they're going to release Loci, and it's going to be godly fast, but still, nothing went as planned.

Also, I firmly believe nVidia got everything back on track.
Seeing how bad the NV30 was, and that the NV35 taped out just 6 months after it, I must say nVidia did an amazing job with the NV35. Oh, sure, it doesn't win by a lot, and sometimes even loses (although rarely), but comparing it to the NV30 and seeing how little time nVidia had to create it, I must say it's an impressive achievement.

Also, I believe the NV40 is going to put nVidia back on track. Heck, we know nothing about it - but what we DO know is that nVidia's R&D budget is increasing fast, very fast indeed. Heck, AFAIK, they barely had any tape-outs in Q1 (NV36 is Q2, as is NV40), and they hit a record-high R&D cost!

Also, while rumors are rare about the NV40, they're always impressive. Strange, eh? :)


Uttar
 
I wonder how one could shift hardware support from rasterisation to raytracing/radiosity. If 'hardwired' raytracing/radiosity is not on the chip, then the 'pool of computation resources' model would have to be exploited. (?)
 
Uttar said:
I give nVidia no credit, and that goes for David K as well. He is NOT a John Carmack... David should go work for another company, as nVidia is in some serious trouble right now... maybe ATi should hire him?

Oh, ATI is in some serious trouble right now too. Heck, nVidia missed about one product cycle (I might even say nearly 2) with the NV30 - but ATI is in good shape to miss about 3 product cycles, hehe.
In case you don't get what I'm talking about, I'm referring to the whole Loci/R400 issue. Oh, sure, they're going to release Loci, and it's going to be godly fast, but still, nothing went as planned.

Also, I firmly believe nVidia got everything back on track.
Seeing how bad the NV30 was, and that the NV35 taped out just 6 months after it, I must say nVidia did an amazing job with the NV35. Oh, sure, it doesn't win by a lot, and sometimes even loses (although rarely), but comparing it to the NV30 and seeing how little time nVidia had to create it, I must say it's an impressive achievement.

Also, I believe the NV40 is going to put nVidia back on track. Heck, we know nothing about it - but what we DO know is that nVidia's R&D budget is increasing fast, very fast indeed. Heck, AFAIK, they barely had any tape-outs in Q1 (NV36 is Q2, as is NV40), and they hit a record-high R&D cost!

Also, while rumors are rare about the NV40, they're always impressive. Strange, eh? :)


Uttar

I seriously doubt that ATi is going to miss one product cycle, let alone three. The R300 should be a very nice clue as to where ATi is going.


I never really heard anything about the R400, but then again, unlike nVidia, ATi does not hype the crap out of a product. The only thing I know is that the R400 was cancelled because the technology needed to make it does NOT exist yet, let alone in mature form.
 
YeuEmMaiMai said:
I find it extremely funny that this guy wants to lead 3D graphics while he is currently working for a company that is doing everything it can to

a. kill competition
b. cheat
c. produce products with crap IQ at default settings

How can you respect someone who is not leading the 3D industry but instead stifling it?

I find it funny that people usually take shots at the company that has the most advanced products in terms of what they can do. NVIDIA currently has a full range of DX9 hardware that is cheaper than ATI's low-end DX9 hardware and has more features at the high end.

Granted, it's slower (since the NV35 isn't out yet), and it has some driver issues, but how is that hardware stifling the industry?

The same has always been said of ATI, which has often had the most advanced 3D hardware, but not the fastest. I find it really strange.
 
In all fairness, UMMA is saying that NVIDIA's reputation at this moment in time is extremely low with some people, so why would anyone listen to one of their highest-ranking employees?

Now, the 'most advanced' part is up for debate, to my mind... the NV30 is not as advanced as the R300 for several reasons, and whilst the NV35 is a step in the right direction, it is NVIDIA that is still playing catch-up here ;0)
 
Oh, ATI is in some serious trouble right now too. Heck, nVidia missed about one product cycle (I might even say nearly 2) with the NV30 - but ATI is in good shape to miss about 3 product cycles, hehe.
In case you don't get what I'm talking about, I'm referring to the whole Loci/R400 issue. Oh, sure, they're going to release Loci, and it's going to be godly fast, but still, nothing went as planned.

Unlike Nvidia with the NV30, ATi is smart enough to realize the technology isn't there yet and to work on something else. They won't be missing a product cycle, unlike Nvidia. They won't be promising things that they can't deliver. They went to work on something they know they can produce rather than overhyping something they know they can't deliver at this time. Tell me again how that is serious trouble? Methinks Nvidia's ego is what got them into the trouble they got into. Must be all those hallucinogens.
 
I personally do not think that hardware-accelerated radiosity or raytracing will be the future...

Yes, you could get realistic lighting for a scene; yes, you could get physically correct reflections and refractions and other effects like caustics, light dispersion and so on. But I don't think that these would provide a final rendering solution, just as they have failed to do so for the movie effects industry. This is because the goal of 3D graphics is not to recreate the real world with 100% realism, just as most movies are usually unreal and larger than life as well. Reality would be boring, reality would not serve the goals of entertainment, reality would not really be accepted as 'real'. 3D games are already using lens flares, for example, which you only get to see through a camera - but gamers have accepted them and liked them. The same goes for the general look of any game - it has to be stylized, it has to somehow serve the gameplay and the mood in the end.

Every scene of a movie is lit artificially, with many different lights, bounce cards and other trickery, in order to deliver the final image that the director wants. Lighting technicians would gladly drop real-life lighting, as most of the time their job is to circumvent it. So why introduce the problems of radiosity into a scene, only to have the added work of removing them in the end? You can stop objects from receiving or casting shadows, exclude them from the effect of any light - you can do whatever you want with a CG scene very easily. The same goes for reflections and refractions, with the added observation that most people cannot tell whether an object reflects its environment correctly or not. The evidence is that RenderMan, used for 95% of movie effects, does not support raytracing, yet almost no one has ever complained about it so far (okay, there are scenes in Star Wars Episode I where the queen's ship does not reflect the people around it, but most people don't notice that either).
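
As a small aside, here is roughly what the "faked" reflections mentioned above look like in code - a toy Python sketch of my own (the lat/long environment-map layout and the names are assumptions, not any particular renderer's API):

    import math

    def reflect(view, normal):
        # r = v - 2 (v . n) n, with 'view' and 'normal' as unit 3-vectors.
        d = 2.0 * sum(v * n for v, n in zip(view, normal))
        return [v - d * n for v, n in zip(view, normal)]

    def lookup_latlong(env_map, direction):
        # Map a unit direction onto a latitude/longitude environment texture
        # (env_map is just a list of rows of colour values).
        x, y, z = direction
        u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)
        v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
        rows, cols = len(env_map), len(env_map[0])
        return env_map[int(v * (rows - 1))][int(u * (cols - 1))]

    # A "mirror" pixel: reflect the eye ray and read the environment map
    # instead of tracing the reflection back into the scene.
    dummy_sky = [[(0.2, 0.4, 0.9)] * 8 for _ in range(4)]   # 8x4 placeholder texture
    print(lookup_latlong(dummy_sky, reflect([0.0, 0.0, -1.0], [0.0, 0.0, 1.0])))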

There are other arguments besides the mostly artistic ones above. For example, raytracing (which is actually used for most radiosity or, more precisely, global illumination algorithms) is very slow and hard to optimize for. You must take into account objects in the scene that your camera doesn't even see; you have to hold practically the whole scene in memory during rendering. You also have to use supersampling to avoid aliased reflections and other artifacting, which makes it all the more costly. No matter how good your hardware and algorithms are, you'll always be slower with raytracing than without it. And you can always do so many other things with your computing capacity that would get you more in the end - increase scene complexity, geometry detail, shader complexity, texture resolution and so on. This is also one of the reasons why offline rendering has not moved to raytracing in the 15 years since The Abyss, and why rendering times are still about 30 minutes per frame. Artists have simply put in more stuff as computers got faster.
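
And a tiny sketch of the supersampling cost mentioned above (my own toy example - "shade_fn" stands in for one full ray-traced shading call):

    import random

    def supersample(shade_fn, px, py, samples=4):
        # Anti-aliasing by averaging several jittered sub-pixel samples:
        # every other cost in the ray tracer is multiplied by 'samples'.
        total = 0.0
        for _ in range(samples):
            total += shade_fn(px + random.random(), py + random.random())
        return total / samples

    # With a dummy shader, 4x supersampling simply means 4x the shading calls.
    print(supersample(lambda x, y: (x + y) % 1.0, 10, 20, samples=4))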

The other reason is that there are lots of techniques to fake radiosity, reflections and so on. Environment maps, compositing tricks and talented lighting artists have been there since the beginning, and a lot of research concentrates on this field today. ILM, for example, has developed reflection occlusion and ambient occlusion; the first is used to enhance environment-mapped reflections, the second is a sort of baking of global illumination into textures (here's a good link explaining ambient occlusion: http://www.andrew-whitehurst.net/amb_occlude.html )
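
For the curious, a minimal toy version of the ambient-occlusion idea from that link (my own simplification, nothing to do with ILM's actual code; "scene.trace" is an assumed helper returning a hit with a .distance, or None):

    import math
    import random

    def random_hemisphere_dir(normal):
        # Uniform direction on the hemisphere around 'normal' (rejection sampling).
        while True:
            d = [random.uniform(-1.0, 1.0) for _ in range(3)]
            if 0.0 < sum(x * x for x in d) <= 1.0:
                break
        if sum(a * b for a, b in zip(d, normal)) < 0.0:
            d = [-x for x in d]
        n = math.sqrt(sum(x * x for x in d))
        return [x / n for x in d]

    def ambient_occlusion(point, normal, scene, samples=64, max_dist=10.0):
        # Fraction of short hemisphere rays that escape without hitting anything:
        # 1.0 = fully open, 0.0 = completely buried. The result can be baked into
        # a texture once and reused at render time.
        unoccluded = 0
        for _ in range(samples):
            hit = scene.trace(point, random_hemisphere_dir(normal))
            if hit is None or hit.distance > max_dist:
                unoccluded += 1
        return unoccluded / samples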

So, in my opinion, hardware support for raytracing is not the final goal, and other things should be developed instead. Micropolygon displacement mapping, hair rendering, advanced skinning technologies, cloth simulation and such are technologies that would do far more to achieve realism or, rather, pleasing images. Also, support for image post-processing with a floating-point framebuffer is another thing to concentrate on, because 2D compositing is a very important part of movie effects. And any additional capacity can be spent on more detail, instead of raytracing the whole scene.
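
As a small illustration of that floating-point compositing point, here is a toy Porter-Duff "over" on float RGBA values (my own example, not tied to any particular package):

    def over(fg, bg):
        # Porter-Duff "over" with premultiplied-alpha float colours (r, g, b, a).
        k = 1.0 - fg[3]
        return tuple(f + b * k for f, b in zip(fg, bg))

    # A bright HDR highlight over a dim background: values above 1.0 survive
    # for later grading/tone mapping instead of being clipped at 255.
    print(over((2.5, 2.5, 2.5, 0.5), (0.1, 0.1, 0.1, 1.0)))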

Just my 2 cents as a 3D artist.
 