Future application of fractal maths in computer...imaging (I think) *spawn

That's not quite true, as I understand it (though I may misunderstand). Aren't kkrieger and others mostly built on the premise of rebuilding a list of artist actions rather than creating content via algorithms? That'd make it a macro-generation system. Things like SpeedTree already use fractals for procedural content. Terrain generation is a classic. There are a few texture creators where you plug in variables and get a procedural texture out the other end, but I don't know of any realtime/in-game procedural texture generation. I think fractals will be too processor-intensive to have merit for realtime content. Other, cheaper noise and pattern algorithms are more serviceable, I think.
 
You're probably right.
It doesn't really fit the standard definition of procedural content, but it does sort of qualify.
 
Perlin noise has been used to generate procedural content in realtime since 2002 or something.
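For anyone curious what that involves, here's a minimal Perlin-style gradient noise sampler in pure Python. It's a sketch, not Perlin's actual reference implementation (which hashes into a fixed table of 256 precomputed gradients); the angle-based gradient scheme here is a simplified stand-in:

```python
import math
import random

def make_perlin2d(seed=0, grid=256):
    """Build a 2D gradient-noise (Perlin-style) sampler.

    Returns a function noise(x, y) -> value in roughly [-1, 1].
    """
    rng = random.Random(seed)
    # Shuffled permutation table gives each lattice point a pseudo-random hash.
    perm = list(range(grid))
    rng.shuffle(perm)
    perm += perm  # doubled for wrap-around lookups

    def grad(ix, iy, dx, dy):
        # Hash the lattice point to a unit gradient, dot with the offset.
        h = perm[(perm[ix % grid] + iy) % grid]
        angle = (h / grid) * 2.0 * math.pi
        return math.cos(angle) * dx + math.sin(angle) * dy

    def fade(t):
        # Perlin's quintic smoothing curve: 6t^5 - 15t^4 + 10t^3
        return t * t * t * (t * (t * 6 - 15) + 10)

    def lerp(a, b, t):
        return a + t * (b - a)

    def noise(x, y):
        ix, iy = math.floor(x), math.floor(y)
        fx, fy = x - ix, y - iy
        u, v = fade(fx), fade(fy)
        # Dot products with the gradients at the four surrounding corners.
        n00 = grad(ix,     iy,     fx,     fy)
        n10 = grad(ix + 1, iy,     fx - 1, fy)
        n01 = grad(ix,     iy + 1, fx,     fy - 1)
        n11 = grad(ix + 1, iy + 1, fx - 1, fy - 1)
        return lerp(lerp(n00, n10, u), lerp(n01, n11, u), v)

    return noise
```

The whole thing is a handful of multiplies and adds per sample, which is why it's been viable in realtime for so long (and trivially so on GPUs).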
 

Unreal used procedural textures:

http://wiki.beyondunreal.com/Legacy:FractalTextureFactory

Ignore the word "fractal" though; I think it was called the fire engine.

We are going back some yeaaars though :)

I might try and put in a fractal shader in the next commercial game I'm working on :)
probably a 1-bit Mandelbrot though :)
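A 1-bit Mandelbrot really is about as cheap as fractals get: one escape-time loop and a single boolean per pixel. A CPU-side sketch in Python (a shader version would be the same loop run per fragment; the resolution and iteration count are arbitrary):

```python
def mandelbrot_1bit(width, height, max_iter=32,
                    re_min=-2.0, re_max=1.0, im_min=-1.2, im_max=1.2):
    """Render a 1-bit Mandelbrot: each pixel is True (in set) or False."""
    rows = []
    for j in range(height):
        im = im_min + (im_max - im_min) * j / (height - 1)
        row = []
        for i in range(width):
            re = re_min + (re_max - re_min) * i / (width - 1)
            z = 0j
            c = complex(re, im)
            inside = True
            for _ in range(max_iter):
                z = z * z + c
                # Escape test: |z| > 2 means the orbit diverges.
                if (z.real * z.real + z.imag * z.imag) > 4.0:
                    inside = False
                    break
            row.append(inside)
        rows.append(row)
    return rows
```

The cost scales with iteration count, so clamping max_iter low is the usual trick for keeping it realtime-friendly.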
 
Frontier: First Encounters had full planets generated procedurally at runtime in 1995, and it certainly wasn't the first game to use procedural generation for meshes (Midwinter, 1989, and many others).
 
Correct me if I'm wrong, but isn't part of the RAGE (id Tech 5) pipeline oriented around collecting artist actions, akin to kkrieger?

I always thought that was a bit of a paradox. You are collecting artist actions yet baking the end result to a large unified data set?

I probably misread that or something got lost in the chain of transmission. :LOL:

This is really the crux of why MegaTexture seems problematic compared to the standard tiled-texture paradigm. The reuse of texture tiles is, of itself, a form of compression. I wonder if any engines are more amenable to a diversity of very, very small textures, such that you are essentially using textures like different brushes in a paint app rather than the traditional (larger) photo-sourced approach.
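Some back-of-envelope numbers make the point about tiling-as-compression. All the figures below are made up for illustration (a hypothetical 1 km x 1 km surface at 1 texel/cm, DXT1 at roughly 4 bits per texel, an arbitrary 256-tile palette); nothing here is taken from the actual Rage pipeline:

```python
def storage_bytes(texels, bpt=0.5):
    # DXT1 stores roughly 4 bits (0.5 bytes) per texel.
    return texels * bpt

# Hypothetical 1 km x 1 km surface sampled at 1 texel per cm.
side = 100_000
unique = storage_bytes(side * side)   # every texel stored once, MegaTexture-style

# Tiled alternative: a palette of 256 tiles of 256x256 texels each,
# plus a 1-byte index per 256x256 cell of the surface.
tile = 256
palette = storage_bytes(256 * tile * tile)
index_map = (side // tile) ** 2 * 1
tiled = palette + index_map
```

With these (invented) numbers the unique-texel version is around 5 GB while the tiled version fits in under 10 MB; the gap is the "compression" you give up by making every texel unique.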
 

I don't know, but it could be that authoring the large data set for something like Rage is the problem. I would imagine some system of painting layers: yes, you can have totally unique texels, but if you have to cover, say, 1 km and more of textures, what are you going to fill it with as an artist? You would want a way to paint it, undo, change and replace textures. If this is true, then you are sort of correct anyway with Rage (if you assume "very small textures" can be any size you like). But they don't compress it that way (I assume). Combine that with the fact you might be painting more than one layer, say normals/spec/diffuse (do they do that in Rage?) - you couldn't do each layer by hand, it would take too long.

As far as reuse for compression goes, if you think about it, that is kind of what traditional 3D games do, but with a lot of manual work from the artist.

You could of course paint the layers, record it, and then repaint them into a megatexture to decompress them - but the reason not to do that might be that they want the data to be in a compressed format, such as DXT, so it would take too long to do all the work in the background.
 
Back in the Win95 days I used to use an editor called Satori (because it came free on a magazine coverdisk and it normally cost about 4 grand).
Anyway, its claim to fame was massive image sizes and unlimited, selective undo, because every brushstroke, effect, blur etc. was an object in its own right.
For example:
you could apply a Gaussian blur to an image, then a radial blur, then another Gaussian blur, and you could undo just the radial blur if you wanted.
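That "every edit is an object" idea maps onto a pretty simple structure: the image is never stored mutated, only recomputed by replaying an ordered list of operation objects, so removing one from the middle is a valid undo. A toy sketch (integers stand in for image buffers; a real editor would replay each op against pixel data):

```python
class OpStack:
    """Each edit is an object; the image is the composition of all ops.

    Selective undo = drop one op from the middle and re-render the rest.
    """

    def __init__(self, base):
        self.base = base
        self.ops = []  # list of (name, fn) pairs, in apply order

    def apply(self, name, fn):
        self.ops.append((name, fn))

    def undo(self, name):
        # Remove only the named op; every op applied after it is kept.
        self.ops = [op for op in self.ops if op[0] != name]

    def render(self):
        # Rebuild the image from scratch by replaying the surviving ops.
        state = self.base
        for _, fn in self.ops:
            state = fn(state)
        return state

canvas = OpStack(10)
canvas.apply("gaussian_1", lambda x: x + 5)
canvas.apply("radial",     lambda x: x * 2)
canvas.apply("gaussian_2", lambda x: x + 1)
canvas.undo("radial")  # undo just the middle op; gaussian_2 survives
```

The obvious cost is that rendering means replaying everything, which is presumably why the approach suited a high-end package with big iron behind it rather than a realtime one.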
 