SPU usage in games

What other games (besides RoboBlitz and probably Spore) use procedural textures? Is this the same thing as "procedural synthesis"?
 

People use the two terms loosely; my understanding is that procedural synthesis is anything generated by the CPU, while procedural textures refers specifically to textures.

I don't know of many games that use procedural textures, but there are more and more games that use procedural synthesis for artwork. For example, all the trees in Oblivion were generated offline from 5-6 different source models by a computer.
 

Yeah, it makes sense for foliage (isn't that something called SpeedTree?). Since we are in the technical area, I would be more interested in doing it in real time (e.g., destructible environments?). How realistic is that kind of thing today?
 
Bit of an unknown, as the mechanics of these systems often aren't public. The best stuff so far that I know of is probably Digital Molecular Matter, used exclusively by LucasArts. That's for procedurally generating broken objects from artist-created sources, though.
 
There are different types of procedural synthesis. One is to record the artist's actions in a graphics application as a super-macro, and then repeat the same graphics functions at runtime to generate the texture, e.g. record the points of a Bézier curve and the fill-gradient settings set by the artist, and recompose them into a texture.
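The "super-macro" idea above can be sketched as recording operations as data and replaying them at load time. This is a toy illustration, not any shipping system; the operation names and parameters are invented.

```python
# Replay a recorded list of (operation, parameters) pairs to rebuild a
# greyscale texture. The macro is far smaller to store than the bitmap
# it reproduces. All operation names here are hypothetical.

def apply_fill_gradient(tex, top, bottom):
    """Vertical gradient fill, linearly interpolated row by row."""
    h = len(tex)
    for y in range(h):
        t = y / (h - 1) if h > 1 else 0.0
        tex[y] = [top + (bottom - top) * t] * len(tex[y])
    return tex

def apply_brightness(tex, delta):
    """Brighten every texel, clamped to [0, 1]."""
    return [[max(0.0, min(1.0, v + delta)) for v in row] for row in tex]

OPS = {"fill_gradient": apply_fill_gradient, "brightness": apply_brightness}

# The recorded "macro": what the artist did, as data.
MACRO = [
    ("fill_gradient", {"top": 0.2, "bottom": 0.8}),
    ("brightness", {"delta": 0.1}),
]

def replay(macro, size):
    """Regenerate the texture at runtime by replaying the macro."""
    tex = [[0.0] * size for _ in range(size)]
    for name, params in macro:
        tex = OPS[name](tex, **params)
    return tex
```

The point is the storage trade: two operation records versus a full bitmap, at the cost of CPU time at load.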

Wow, that sure sounds like sci-fi to me; are there commercial systems that work like that? I've seen artists perform on the order of thousands of operations to create a 1024x1024 texture for an object, using tens of layers to achieve the final result. This would be extremely slow to recreate, would take a non-trivial amount of memory to store, and would require reimplementing a significant amount of Photoshop functionality inside your procedural texture generator.

Um, AFAIK in RoboBlitz the artist makes the texture, then it gets thrown into some kind of compilator thingy that generates a macro that will look very close to the original, but generated on the fly (thus it will be slightly different each time).

Nope, nothing like that. You can give it a try; there's a ProFX evaluation available on request. The Farbrausch guys (famous for their 64 KB demos) also sell their own procedural texture system, with the authoring tool freely available. Both work by allowing you to manually chain together a series of mathematical operations such as "threshold", "scale", "Perlin noise" etc. by stringing together small blocks in a Visio-like editor. A decent looking *material* texture (as opposed to "object texture") in the ProFX samples takes on the order of 70-100 operations. You probably can train a mathematically inclined texture artist to be productive with it after a steep learning curve, but it's a far cry from either "record their actions in Photoshop" or "give a compilator thingy the final texture and let it come up with a way to generate it".
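The block-chaining described above can be sketched as a short pipeline of image operations. This is a toy version, not ProFX or Farbrausch's tool; the noise is a simple hash-based stand-in for Perlin noise, and all constants are invented.

```python
# A tiny operator chain in the spirit of the Visio-like editors described
# above: noise -> scale -> threshold, each block mapping texture to texture.

def noise_block(size, seed):
    """Deterministic hash-based value noise in [0, 1) per texel."""
    def h(x, y):
        n = (x * 374761393 + y * 668265263 + seed * 1013904223) & 0xFFFFFFFF
        n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
        return (n & 0xFFFF) / 65536.0
    return [[h(x, y) for x in range(size)] for y in range(size)]

def scale_block(tex, factor):
    """Multiply every texel, clamping to 1.0."""
    return [[min(1.0, v * factor) for v in row] for row in tex]

def threshold_block(tex, cutoff):
    """Binarize the texture at the given cutoff."""
    return [[1.0 if v >= cutoff else 0.0 for v in row] for row in tex]

def evaluate(size, seed):
    """Run the whole chain; a real tool would evaluate a 70-100 node graph."""
    tex = noise_block(size, seed)
    tex = scale_block(tex, 1.5)
    tex = threshold_block(tex, 0.5)
    return tex
```

Because every block is deterministic given the seed, the same graph always regenerates the same texture, which is what makes it usable as an asset format.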
 
Wow, that sure sounds like sci-fi to me; are there commercial systems that work like that? I've seen artists perform on the order of thousands of operations to create a 1024x1024 texture for an object, using tens of layers to achieve the final result. This would be extremely slow to recreate, would take a non-trivial amount of memory to store, and would require reimplementing a significant amount of Photoshop functionality inside your procedural texture generator.
There's going to be limits ;) I'm sure recreating a face texture would be a major chore, but you'd only use procedural synthesis for the stuff it can handle well. In particular you could apply modifications to existing assets, changing a model's size or an image's colours, to introduce variation.

As for commercial systems, I don't know of any. I don't know if anyone's really trying for procedural synthesis at the moment. When you have lots of GBs of disc storage space, that sort of synthesis is limited IMO to wanting to add variety, and at the moment developers are more concerned about getting actual game engines up and running. ProFX is offering itself as an environment enrichment system, for people who have engines and are looking to add some... procedural decay. If you think about that, how much automated decay does a title need? Unless you're building a game with time travel, you can use static resources.

The procedural system could look a lot better, but if the devs are happy with what they've got through traditional means, they won't look into procedural synthesis. E.g. KZ2 has tatty walls and a clearly repeating texture. The game would look better with procedural dirt, but that'll take up processing resources. Guerrilla are happy with the look, so they won't look into ProFX.

On the other hand GT5 should definitely get ProFX so it can lose the repeating grass textures!
 
As for commercial systems, I don't know of any. I don't know if anyone's really trying for procedural synthesis at the moment. When you have lots of GBs of disc storage space, that sort of synthesis is limited IMO to wanting to add variety, and at the moment developers are more concerned about getting actual game engines up and running.

I think ProFX's business plan crumbled the moment Microsoft raised the XBLA limit from 50 to 150 MB. We've done preliminary analysis of storage budget for XBLA titles before and after the raise, and while EVERYTHING is a problem at 50 MB, at 150 MB you have enough breathing room for textures, meshes, and level data, and you start to think about having "real" audio. Traditional and advanced image compression techniques like HD Photo (VERY good, can easily achieve 30:1 at artist-tolerable degradation and maybe 100:1 at programmer-tolerable degradation) are good enough for textures. And, to keep the mods happy, are a good match for SPUs to decompress :)
 
Well, exactly. To date synthesis has been thought of as much as anything as a form of compression. Procedural synthesis as an artistic choice seems to be on the back-burner. The original talk of "procedural synthesis" creating rich, diverse worlds seems to have just been PR talk so far. Anyone remember that organic world demo on Cell, with all the plants growing? No-one's including that in titles. Enemies are all clones of root models, without procedural variation. Scenery is all fixed textures, repeated. I don't know how much of that is devs wanting to spend limited resources elsewhere, and how much is devs being uninterested in it.
 
I doubt procedural textures will replace a good artist any time soon as far as diffuse textures are concerned. Maybe on the detail textures, like tweaking gloss maps and the like at load time just to give a different look each time you play, but that'd be it. The big problem is that no matter how good the algorithm/idea is, you would inevitably start seeing a certain "sameness" over time across different games. SpeedTree was brought up and is a good example. Trees generated with SpeedTree looked cool the first time I saw them. Now I can pick them out in any game that uses it due to their "sameness", and hence they've lost their coolness.
 
Regarding the textures, what I'm curious about is why games don't seem to be using some deterministically seeded pseudo-random noise (say, for example, on the normal map) to hide the obvious texture tiling.
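The idea above can be sketched as hashing the world-space texel coordinates into a small perturbation: every repeat of the base tile then looks slightly different, while staying identical from frame to frame because the hash is stateless. This is an illustrative sketch; the hash constants and strength value are invented.

```python
# Break up visible tiling with a deterministic per-texel jitter.

def hash01(x, y, seed=0):
    """Stateless hash of world-space texel coords -> [0, 1)."""
    n = (x * 73856093 ^ y * 19349663 ^ seed * 83492791) & 0xFFFFFFFF
    n = ((n ^ (n >> 15)) * 2654435761) & 0xFFFFFFFF
    return (n & 0xFFFF) / 65536.0

def sample_detiled(base_tile, x, y, strength=0.1):
    """Sample a repeating tile, perturbed by world-coordinate noise.

    Same (x, y) always yields the same value, so the result is stable
    across frames without storing any extra texture data.
    """
    size = len(base_tile)
    v = base_tile[y % size][x % size]
    jitter = (hash01(x, y) - 0.5) * 2.0 * strength
    return max(0.0, min(1.0, v + jitter))
```

In practice you would apply this to the normal map rather than the diffuse, as the post suggests, so the lighting breaks up the repetition without visibly recolouring the surface.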
 
Lair and progressive mesh

Lair is full of pop-up-looking geometry changes, but I was having difficulty differentiating progressive mesh from a blended regular LoD switch, maybe partially due to lighting kicking in late or something.

However, the latest Gamersyde video does clearly show progressive meshes at work (individual vertices being moved before new vertices are added, i.e. morphing).
It is especially visible for the earth geometry, which grows bigger as the player gets closer to the city (from 1:40 to 1:50). Unintentional, I assume, but still interesting to see the technique in a game.

For those wondering what progressive mesh is:
ftp://ftp.research.microsoft.com/users/hhoppe/videos/pm.mpg

(stolen from inefficient's post in Lair thread)
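The vertex morphing visible in the video (vertices sliding to their refined positions before new ones appear) amounts to linear interpolation between a vertex's coarse and fine positions, driven by a 0..1 factor usually derived from viewer distance. This is a toy geomorph sketch, not Hoppe's full progressive-mesh machinery.

```python
# Geomorph: blend each shared vertex from its coarse-LoD position to its
# fine-LoD position. t = 0 shows the coarse mesh, t = 1 the refined one.

def geomorph(coarse_positions, fine_positions, t):
    """Linearly interpolate per-vertex positions; t in [0, 1]."""
    return [
        tuple(c + (f - c) * t for c, f in zip(cp, fp))
        for cp, fp in zip(coarse_positions, fine_positions)
    ]
```

Driving `t` from distance is what makes the transition look continuous instead of a popping LoD switch, and it is exactly the sliding motion that gives the technique away on screen.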
 
A decent looking *material* texture (as opposed to "object texture") in the ProFX samples takes on the order of 70-100 operations.

Which is, by the way, obviously too slow for realtime rendering, so the only thing you can save with it is background storage (DVD/Blu-ray space). It'll use the same amount of memory as a hand-painted, and usually better-looking, bitmap texture.
 
id Tech 5 populates the scene with procedural content first, before the artists dive in and place stuff.

More like: procedurally lay down tiling bitmap textures, where the type (grass, dirt, rock, etc.) depends on height and slope, and add some randomization within the different tilesets.
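The height-and-slope rule described above can be sketched as a small classifier per terrain sample. The thresholds and the randomization term here are invented purely for illustration.

```python
# Pick which tiling bitmap texture to lay down at a terrain sample,
# based on normalized height and slope, with a small seeded nudge so
# band edges vary instead of forming a hard contour line.

def pick_tileset(height, slope, seed_hash=0.5):
    """Return the tileset name for this sample.

    height: 0..1 normalized terrain height.
    slope:  0..1, where 1 is vertical.
    seed_hash: deterministic per-sample random value in [0, 1).
    """
    h = height + (seed_hash - 0.5) * 0.05   # randomization within the bands
    if slope > 0.7:
        return "rock"       # steep faces get rock regardless of height
    if h > 0.6:
        return "rock"
    if h > 0.3:
        return "dirt"
    return "grass"
```

The artists then override or paint on top of this first pass, which matches the workflow described in the post.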

Procedural textures on their own look artificial, sometimes even ugly. They usually have very little relation to the underlying forms (where dirt, scratches, wear etc. should go), and absolutely no artistic sense (where it would look good).

For example our artists can add such details in three ways:
- generate it using procedurals
- use photo textures of real dirt, grime, etc. and blend them in using Photoshop layers and such
- manually paint it using a tablet and a 3D texture painting app

We almost exclusively rely on methods two and three, preferring the hand-painted stuff most of the time.

There was one case, with the Armies of Exigo intro, where a large number of armour pieces were generated within Photoshop using actions. They used procedural edge and dirt masks generated from the geometry, and a very large set of various bitmap textures that were combined together using these masks. It looked good, but not as cool as the fully hand-painted stuff for Mark of Chaos.
 
And again, the general dilemma with textures is this:
- procedurals simple enough to generate in real time are far too simple and ugly
- pre-generated procedurals will also look inferior to bitmaps, but will consume the same amount of memory and take a similar amount of work to build.
 
3 years after we started this thread, a few developers have finally gotten their hands deep in Cell programming:
http://www.nowgamer.com/features/682/developers-shift-to-leading-on-ps3?o=0#listing

3 years from now, we will see where stereoscopic 3D gaming is.

Also, has anyone in the press found out the name of the Saboteur developer who implemented the amazing AA?

It isn't AA, to be honest. The name was reported some time ago in this same topic, if I remember correctly.
 
I suppose it could be argued whether the definition of AA is increasing information density or just decreasing visible aliasing, but I think for all intents and purposes the latter counts. Besides, MLAA is a form of information reconstruction anyway.
 