What's the current status of "real-time Pixar graphics"?

Dio said:
Hmmm... I'm not understanding this industry then. Are you saying that because it takes more staffing and more time to implement a bump map, most game companies aren't going to implement it? That can't be true!
Unless the bump mapping will sell more games, it won't get done. On the current PC generation, it's hard to argue that adding bump mapping (or, generalising, complex shaders) will help sales significantly. There has to be a cost/benefit analysis somewhere.

Hmm.. I disagree with this. If we never went above and beyond, then we'd be stagnant from here on out. There must be a threshold that, once passed, will introduce the newest features into games. Considering most games can't even present a compelling story, it would behoove most developers to make their games more appealing by keeping their engines close to current technology.

As I see it, companies wait for Carmack to do something that will make people buy his games and then other smaller companies follow suit with his engine or a derivative thereof. It seems as if Valve and a few other companies are the exception to the rules.

Whether we want to believe it or not, graphics sell games; otherwise we'd still be playing vector Star Wars-type games in our homes.

-M
 
Mr. Blue said:
Hmm.. I disagree with this.
If we never went above and beyond, then we'd be stagnant from here on out.
I would argue that right now, we largely are. There's been no key reason for games companies to expend money driving graphics forward on the PC, especially given that over the last 2-3 years the budget crunches have hit hard. Most have done the best they could, and some have produced good results, but it's not really 'moving forward' much right now.

As I see it, companies wait for Carmack to do something that will make people buy his games and then other smaller companies follow suit with his engine or a derivative thereof. It seems as if Valve and a few other companies are the exception to the rules.
My guess would be that this is because the FPS genre is the only area of gaming where the PC is the leading edge - and the majority of FPS games are based off a Carmack, Unreal or Valve engine. There's therefore a limited drive to innovate (major surgery on the engine?) on the graphics side when it's already expensive enough getting the rest done.

Other game types (except possibly RTS, where graphics is a lesser factor anyway) are only being driven forward on the console side, and console capabilities are over a full generation behind the PC (1.1 shaders is the best it gets!). For games with multi-format releases, they are (rightly) more concerned with DOWN porting the X-box variant to run on the multitude of GF4MXes than they are with UP porting it for R3xx-class chips.

My prognosis would be for advances in the PC segment, but only in the FPS field, and not much other graphics innovation before the next-generation consoles launch. This is a pretty easy cold reading given the impending arrival of Doom3 and Half-Life 2. Nostradamus, Mother Shipton, etc. I ain't.

We're trying to help - there are a lot of great devrel guys at ATI, and tools like RenderMonkey are lowering the cost base of advanced shader games.
 
About movie 2K resolution:
It's 2048 * 1536 or so. Pixels aren't square (it's an anamorphic resolution). The top and bottom of the image also get cut off in the theater - so the effects have to be rendered at the full resolution, but no detail needs to go into the obscured areas.


About bump mapping:
Let's make a distinction between grayscale heightmaps (traditional bumps) and normal maps.

Bumps are used in the artistic sense to break up shading and specular highlights on an otherwise smooth surface. The reasons to go with a texture are that (1) detailed geometry is more expensive to render, and (2) textures are painted, which is faster and more intuitive than modeling.
Bumps are used in the technical sense to fake real-world geometric detail that's usually quite small on the final rendered image.
Bumps are used in the economical sense because fast and robust micropolygon displacement mapping is not available in most renderers today. There are also a few possible problems with displacement (usually with high displacement bounds, i.e. vertices being moved too much) that can result in rendering artifacts.
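
To make the heightmap vs. normal map distinction a bit more concrete, here's a rough numpy sketch of how a grayscale heightmap can be turned into a tangent-space normal map with finite differences. The `bump_scale` parameter and the RGB packing convention are just assumptions for the example, not anything from a specific tool:

```python
import numpy as np

def height_to_normal_map(height, bump_scale=1.0):
    """Convert a grayscale heightmap (HxW floats in [0,1]) into a
    tangent-space normal map packed into [0,255] RGB."""
    # Finite-difference slopes along y (rows) and x (columns).
    dy, dx = np.gradient(height.astype(np.float32) * bump_scale)

    # Per-pixel normal = normalize(-dx, -dy, 1).
    normals = np.dstack((-dx, -dy, np.ones_like(height, dtype=np.float32)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Pack the [-1, 1] components into [0, 255] RGB (the usual convention).
    return ((normals * 0.5 + 0.5) * 255.0).astype(np.uint8)

# Example: a synthetic ripple heightmap, just to have something to feed in.
h = np.indices((256, 256)).sum(axis=0) / 512.0
normal_map = height_to_normal_map(np.sin(h * 40.0) * 0.5 + 0.5, bump_scale=4.0)
```

A heightmap stores one scalar per texel and the renderer derives the perturbation; a normal map stores the perturbed direction itself, which is why the mid-frequency detail discussed below can be baked into it directly.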

Bump maps can be generated relatively fast by a skilled texture artist, as they've been used for more than a decade in offline rendering. Some possible methods:
- If the texture is fully painted, start with the bump map and work with layers in Photoshop. Copy individual bump layers to the color map as masks for different details like skin pores, wrinkles, rust, dirt, etc.
- If using a photo texture, a possible shortcut is to desaturate the color map, run a high-pass filter on it to remove low- to mid-frequency detail, then adjust levels and brightness/contrast (a rough sketch of this follows below). This can give quite convincing results.
In our experience with prerendered cinematics, one of these two methods has always sufficed. For example, I was able to paint a 2K set of bump/color/spec textures for a high-res Tiger tank in about 2 days (modeling and UV mapping took considerably more time, BTW).
This might be new ground for most game artists, though, who don't have an extensive background in content creation for cinematics and the like. So they just need to be trained.
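
For the photo-texture shortcut mentioned above, the whole thing is only a few lines if you script it. A rough sketch using numpy and scipy's Gaussian blur; the blur radius and contrast values are just guesses you'd tune per texture:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def photo_to_bump(rgb, blur_sigma=8.0, contrast=2.0):
    """Rough bump map from a photo texture (HxWx3 floats in [0,1]):
    desaturate, high-pass filter, then stretch the contrast."""
    # 1. Desaturate with simple luminance weights.
    gray = rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

    # 2. High-pass: subtract a blurred copy to kill low/mid frequencies.
    #    blur_sigma is an assumed value - tune it per texture.
    highpass = gray - gaussian_filter(gray, sigma=blur_sigma)

    # 3. Levels / brightness-contrast: re-center around mid-gray and stretch.
    return np.clip(0.5 + highpass * contrast, 0.0, 1.0)
```

You'd load the photo with something like Pillow, pass the array through this, and save the result as the bump map; the artist then only has to touch up areas where the photo's lighting fakes out the filter.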


Normal mapping is a different beast. Id is using it not just for high-frequency details (the small stuff), but also for mid-frequency things like facial features, muscle groups, and small bits of equipment and accessories. Their problem is that all of these details have to be manually modeled in 3D, which takes a lot of time - especially for Raven with their cyber-stuff in Quake4; see the leaked material, the detail is insane and close to the level of movie VFX content.
Id is partly forced into this method by the polygon count limits that stencil shadows impose on them. However, the issue will remain relevant even if their next engine uses a different shadowing system and higher-resolution models - and even once video cards support displacement mapping.
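
Just to illustrate what the normal map buys them (this is not id's actual shader, only the basic idea): per-pixel lighting reads the stored normal and does the N·L term, so all that painted or baked detail reacts to light without any extra geometry. The packing convention and the light direction here are assumptions:

```python
import numpy as np

def diffuse_from_normal_map(normal_map_rgb, light_dir):
    """Per-pixel Lambert term from an RGB-packed normal map.

    normal_map_rgb: HxWx3 uint8, normals packed as (n * 0.5 + 0.5) * 255.
    light_dir: 3-vector pointing towards the light, in the same space
               as the stored normals (assumed here for simplicity).
    """
    # Unpack [0, 255] back to [-1, 1] and renormalize.
    n = normal_map_rgb.astype(np.float32) / 255.0 * 2.0 - 1.0
    n /= np.linalg.norm(n, axis=2, keepdims=True)

    l = np.asarray(light_dir, dtype=np.float32)
    l /= np.linalg.norm(l)

    # Classic N.L diffuse term, clamped at zero.
    return np.clip(n @ l, 0.0, None)
```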

The VFX studio Weta Digital, working on Lord of the Rings, has made extensive use of a similar technique: laser-scan a very detailed clay maquette (over 7 feet high for the 4-foot Gollum!), create an optimized mesh with less than 1/100 of the detail, and generate displacement maps from the high-res scanned geometry. They do this to speed up the skinning (weighting or muscle simulation) of the characters, and also to keep scene sizes small. Both are perfectly reasonable for games as well, where the vertex shader would only have to transform and skin the low-res meshes, which would then get diced up and displaced at a later stage.
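
The displacement step at the end is conceptually very simple. A toy sketch of it (sampling per-vertex instead of per-micropolygon, which is a simplification on my part; all names are made up):

```python
import numpy as np

def displace_vertices(positions, normals, uvs, disp_map, scale=1.0):
    """Push each vertex along its normal by a displacement sampled from a map.

    positions, normals: (N, 3) arrays for the low-res (already skinned) mesh.
    uvs:                (N, 2) array in [0, 1].
    disp_map:           (H, W) float array, 0.5 meaning no displacement.
    """
    h, w = disp_map.shape
    # Nearest-texel sampling - a real renderer would dice the surface first
    # and filter the map; this only shows the idea.
    px = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    d = (disp_map[py, px] - 0.5) * scale

    return positions + normals * d[:, None]
```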

I think it is a safe bet that real-time hardware displacement mapping is on the horizon, so the content creation problem id has run into with Doom3 will remain with us.

I'd like to add that the VFX industry has not found a real solution for modeling very highly detailed objects yet either. Their only advantage is that they usually have to do just a few detailed models per production - no one expects 40-50 monsters, bosses, vehicles and items, plus levels. Practically every full-CGI feature has had a very stylized look that allowed them to simplify their models; the only exception was Final Fantasy, and we all know how big that budget was (over $100 million).
For most of the detailed models, hand-painted displacement maps are used. Here's another exception: Draco, the dragon in Dragonheart, was perhaps the most detailed CG character yet (because of all the scales), and he was modeled over 5 months by 5 modelers - more than 2 man-years of work.

The solution to this problem obviously lies in the content creation phase rather than in the rendering technology. New and better tools are needed, and there is quite some research on the topic. See this forum (run by one of the Weta guys):
http://cube.phlatt.net/forums/spiraloid/viewtopic.php?TopicID=9

For those who don't want to read through, one of the 'new' tools is ZBrush, which is basically a 3D painting app; it lets you directly manipulate the polygonal surface with brushes. The new version can handle up to 1-2 million polygons, so a modeler can create a relatively simple 3D model, subdivide it to get enough polys, and paint the details instead of modeling them. It is more practical for organics, though - painting all the cyber stuff in Doom/Quake doesn't seem to be possible yet.
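
The "subdivide it to get enough polys" part is the easy bit, by the way. A toy sketch of one level of 1-to-4 midpoint subdivision (no smoothing rules, nothing like what ZBrush actually does internally) just to show where the extra polygons come from:

```python
import numpy as np

def subdivide_once(vertices, triangles):
    """One level of 1-to-4 midpoint subdivision of a triangle mesh.

    vertices:  (V, 3) float array.
    triangles: (T, 3) int array of vertex indices.
    Returns new (vertices, triangles); every triangle becomes four.
    """
    vertices = list(map(tuple, np.asarray(vertices, dtype=float)))
    midpoint_cache = {}  # shared edge midpoints, so neighbours stay welded

    def midpoint(a, b):
        key = (min(a, b), max(a, b))
        if key not in midpoint_cache:
            m = tuple((np.array(vertices[a]) + np.array(vertices[b])) / 2.0)
            midpoint_cache[key] = len(vertices)
            vertices.append(m)
        return midpoint_cache[key]

    new_tris = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        new_tris += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

    return np.array(vertices), np.array(new_tris)

# Four levels turn a 1,000-triangle base mesh into ~256,000 triangles,
# which is the kind of density you then sculpt/paint detail onto.
```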

Just my 2 cents, anyway... ;)
 
Dio said:
Mr. Blue said:
Hmm.. I disagree with this.
If we never went above and beyond, then we'd be stagnant from here on out.
I would argue that right now, we largely are. There's been no key reason for games companies to expend money driving graphics forward on the PC, especially given that over the last 2-3 years the budget crunches have hit hard. Most have done the best they could, and some have produced good results, but it's not really 'moving forward' much right now.

Hmm...interesting that you admit this. I concur.

As I see it, companies wait for Carmack to do something that will make people buy his games and then other smaller companies follow suit with his engine or a derivative thereof. It seems as if Valve and a few other companies are the exception to the rules.
My guess would be that this is because the FPS genre is the only area of gaming where the PC is the leading edge - and the majority of FPS games are based off a Carmack, Unreal or Valve engine. There's therefore a limited drive to innovate (major surgery on the engine?) on the graphics side when it's already expensive enough getting the rest done.

But this will oversaturate the PC market with FPS games (I know I'm tired of them) and will eventually lead to the PC's downfall in gaming. I already have 2 consoles because of the variety of gaming I get from the two, and I'm currently only playing one PC game. Guess what? It's an FPS. How many more derivatives of the FPS can we get? Not many. Only the ones made by Valve and Monolith have had relatively good stories and content. Tron 2.0 was a welcome change from the redundancy. Take Epic for instance - UT2004??? What the heck??? How many are they going to make until they decide to come up with a different game altogether?? Sheesh!


-M
 
It's back to Greg's presentation. It's rare anyone is allowed to do anything particularly original at the moment because unless there are guaranteed sales there's not the money to fund it.

I should perhaps clarify that there are a few games developers who are pushing things forward - mostly those with large financial reserves, such as Valve, id, Epic and Lionhead, and others determined to do so are making some progress too. Things WILL get better, but my point is that it's not until the capabilities of the 'standard' console platforms catch up with the PC leading edge that ALL games will start specifically targeting the higher capabilities of shader cards.

Myself, I don't own a console at the moment because I'm mostly a strategy player, although a GameCube (largely for Monkey Flight!) will seem very tempting if the price comes down any further :)
 
Currently the markets dictate that games are designed for graphics hardware that is around, let's say, 2 years old.

It would make absolutely no fiscal sense to design a game that would *only* run on hardware that currently <10% of the possible customers own.

Bleeding edge doesn't really have a foothold in the gaming industry.

Rendering, on the other hand, is pretty much only bleeding edge stuff.

So you really cannot use the state of current game technology as a yardstick for what is and isn't possible.

Even the demo coders have begun to fall behind on the rather advanced field of different kinds of 3D rendering technologies. In the old days, demo coders were the pioneers; now, it's not that simple anymore.
 
Dio said:
It's back to Greg's presentation. It's rare anyone is allowed to do anything particularly original at the moment because unless there are guaranteed sales there's not the money to fund it.

Too bad this presentation can't be viewed in .pdf format. Also, isn't this a catch-22? If you make a game that will guarantee sales, those sales may only be marginal; but if you spend the development time to make a really cool technological game, you risk overspending your budget and thus still only breaking even.

I should perhaps clarify that there are a few games developers who are pushing things forward - mostly those with large financial reserves, such as Valve, id, Epic and Lionhead, and others determined to do so are making some progress too. Things WILL get better, but my point is that it's not until the capabilities of the 'standard' console platforms catch up with the PC leading edge that ALL games will start specifically targeting the higher capabilities of shader cards.

I agree here.

-M
 
Daliden said:
It would make absolutely no fiscal sense to design a game that would *only* run on hardware that currently <10% of the possible customers own.

I don't think it's this simple. 90% of the PC owners out there have at least DirectX 7.0 or DirectX 8.0.

If the API is there, then use it. That's my opinion. Games are still only using a subset of DirectX 7.0 functionality. Clearly way behind the curve.

-M
 
Mr. Blue said:
I don't think it's this simple. 90% of the PC owners out there have at least DirectX 7.0 or DirectX 8.0.

If the API is there, then use it. That's my opinion. Games are still only using a subset of DirectX 7.0 functionality. Clearly way behind the curve.

-M
90% of the PC owners out there have SLOW DX7 class cards.
They can't make use of all the features; it would run too slow. What res do you think Doom3 will run at on a TNT2 M64?
 
Althornin said:
90% of the PC owners out there have SLOW DX7 class cards.
They can't make use of all the features; it would run too slow. What res do you think Doom3 will run at on a TNT2 M64?

Hmm, TNT2 is a DX6 card that won't run Doom3 at all...
You mean the GF2MX right?
 
Mr. Blue said:
Too bad this presentation can't be viewed in .pdf format.
This piece on his blog has a brief summary; the ppt is a much better read, though.
http://www.costik.com/weblog/2003_05_01_blogchive.html

Also, isn't this a catch-22? If you make a game that will guarantee sales, those sales may only be marginal; but if you spend the development time to make a really cool technological game, you risk overspending your budget and thus still only breaking even.
Absolutely it is. So the publishers - which still control the budgets for 90% of games - are somewhat technology-averse; they don't see a cost/benefit analysis saying 'this is a must-have feature'.
 
I have been looking for something that shows what a bleeding-edge card like a 9800 can REALLY do, but no luck. It's really all simple stuff. Does anyone know of some demo that shows off what can be done at the moment? I would really want to see that.
 
Hyp-X said:
Althornin said:
90% of the PC owners out there have SLOW DX7 class cards.
They can't make use of all the features; it would run too slow. What res do you think Doom3 will run at on a TNT2 M64?

Hmm, TNT2 is a DX6 card that won't run Doom3 at all...
You mean the GF2MX right?
No, I meant TNT2 M64.
The fact that it will not run it at all only highlights the problem. Tons of Dell etc. systems shipped with these, and even newer, faster Dells have a GF2/4MX (less than a year old).
My point is that the vast majority of cards are too slow.
 
DiGuru said:
I have been looking for something that shows what a bleeding-edge card like a 9800 can REALLY do, but no luck. It's really all simple stuff. Does anyone know of some demo that shows off what can be done at the moment? I would really want to see that.
Have a look on ATI's web site in the Developer section. Both RenderMonkey and Ashli are really powerful, and I've seen them running some great technical demos.
 
Dio, the GameCube is coming down in price next week to about £70.
That's cheap as chips :)

Technical demos are great and can show off graphical prowess, but normally a tech demo has no AI or physics etc. to worry about. It is normally just a fixed-camera rolling demo.

Case in point: the Island demos available for the Radeon 8500 - those graphics have not been surpassed in any 3D game since they were released. I think HL2 etc. will finally bridge that gap.
 
Yes, once I can get GC + Monkeyball under £100 it's dangerously into impulse buy zone...

And yes, HL2 should do very nicely as a Radeon 9800 demo!
 
To everybody:

This is one of the most interesting and civil threads I've read in a long time.
Thanks to everybody for making Beyond3D what it currently is. :)

Mr. Blue:
I've sent you the Greg Costikyan presentation as a .pdf to the e-mail address in your signature.
It is quite an interesting read (and quite blunt, as others have already noted).


I'm now going back to lurk mode. ;)
 
Half-Life 2 will probably be truly nice, as far as games go. But it does not seem to tax a 9800 very much, or use even a fair sample of all the possibilities that are there. Sure, it goes beyond anything that is currently available, but it has to run on older hardware as well. It is better, but it is not cutting edge.

Then again, is there a market segment that makes truly awesome demos that need at least a 9800 to run? Even ATi doesn't do that. The 9800 demos run fine on a 9600 as well. Sure, it is DX9, but is the software anywhere near the best that chip can do? No.
 