hardware displacement mapping and morphing

mboeller

Will it be possible to use the displacement mapping hardware for realtime morphing of geometry (landscapes or characters, for example)?

I have read the Italian article about the use of CLUT lightmaps to simulate Dot3 and EMBM. The main processor only manipulates the CLUT table, and the bumpmapping is done with a simple multitexturing operation (base texture + lightmap). Even an old Pentium 300 was able to do this in realtime.

Would it be possible to do the same here?

The displacement map is nothing more than a sort of lightmap, so would it be possible to use a CLUT displacement map and manipulate only the CLUT table to get a morphing effect?

If this is possible I would think that the effect would look quite cool :)
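To make the CLUT idea concrete, here is a minimal CPU-side sketch (all names and values are mine, purely illustrative): the indexed map never changes, and morphing only rewrites the 256-entry lookup table, which is why it was cheap even on an old CPU.

```python
# Sketch of the CLUT-morph idea: the per-vertex map stores static 8-bit
# indices; each frame only the 256-entry table is rewritten, so the CPU
# touches 256 values instead of the whole displacement map.

def make_clut(t, lo=0.0, hi=64.0):
    """Build a 256-entry table mapping index -> height, blended by t in [0, 1]."""
    # Two example "shapes": a flat table and a ramp; a real morph would
    # blend between two authored tables instead.
    flat = [lo] * 256
    ramp = [lo + (hi - lo) * i / 255.0 for i in range(256)]
    return [(1.0 - t) * f + t * r for f, r in zip(flat, ramp)]

def displace(index_map, clut):
    """Resolve the static index map through the current CLUT."""
    return [clut[i] for i in index_map]

index_map = [0, 128, 255]                            # static indexed "displacement map"
heights_start = displace(index_map, make_clut(0.0))  # fully flat
heights_end   = displace(index_map, make_clut(1.0))  # fully ramped
```

Animating `t` from 0 to 1 morphs every vertex height while the uploaded index map stays untouched.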
 
I think you'll have to do the blend on the CPU by slowly changing one displacement map into another. I don't think the hardware allows sampling more than one heightmap; maybe you could put multiple channels (compressed into half of the bits each, for example) into the same displacement map and somehow extract them. That would allow you to do the "morph" blend in the vertex shader...

K-
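The packing idea above can be sketched on the CPU (the functions and bit layout here are my own assumptions, not anything the hardware is known to support): two 4-bit heightfields share one 8-bit map, and the blend unpacks both halves per sample.

```python
# Sketch of the bit-packing blend: squeeze two 4-bit heightfields into one
# 8-bit map, then unpack and lerp per sample, the way a vertex shader
# might if it could do the bit extraction.

def pack(h_a, h_b):
    """Pack two 4-bit heights (0..15 each) into one byte: high nibble A, low nibble B."""
    return [(a << 4) | b for a, b in zip(h_a, h_b)]

def morph(packed, t):
    """Unpack both nibbles and linearly blend with factor t in [0, 1]."""
    return [(p >> 4) * (1.0 - t) + (p & 0x0F) * t for p in packed]

packed = pack([0, 8, 15], [15, 8, 0])
# t = 0 reproduces map A, t = 1 reproduces map B, t = 0.5 the midpoint.
```

The obvious cost is precision: each heightfield drops from 256 levels to 16.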
 
Kristof said:
I think you'll have to do the blend on the CPU by slowly changing one displacement map into another. I don't think the hardware allows sampling more than one heightmap; maybe you could put multiple channels (compressed into half of the bits each, for example) into the same displacement map and somehow extract them. That would allow you to do the "morph" blend in the vertex shader...

K-

But what if you just make a series of displacement maps beforehand? They could then make up a morphing effect as one displacement map is updated to the next. It would have to be re-computed for each new map, of course, but if you need the effect it can be finely tuned by the game developer in advance.
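A quick sketch of that precomputed-series approach (function and frame count are illustrative assumptions): bake the intermediate maps offline, so at runtime you only upload one map per frame instead of blending.

```python
# Sketch of the keyframe idea: precompute N intermediate displacement maps
# blending map A into map B, then just upload frame k at runtime.

def bake_morph_frames(map_a, map_b, n_frames):
    """Precompute n_frames maps linearly blending map_a -> map_b (8-bit heights)."""
    frames = []
    for k in range(n_frames):
        t = k / (n_frames - 1)
        frames.append([round(a * (1 - t) + b * t) for a, b in zip(map_a, map_b)])
    return frames

frames = bake_morph_frames([0, 100, 200], [200, 100, 0], 5)
# frames[0] is map A, frames[-1] is map B; the middle frames are the morph.
```

The trade-off is storage and upload bandwidth for all the baked frames, which is exactly Kristof's objection below.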
 
Yes, as I said, on the CPU. I don't think you want to upload and store all the morphs in advance; better to calculate them on the fly and upload them, though you could precompute. Although currently I prefer the system where you compress things (maybe the displacement mapping even supports multiple components; I would have to check the DX9 docs to know what formats are possible for the displacement map).

K-
 
You people have no imagination ;).

That's how I'd do it in OGL (given that Matrox introduces some nice extensions):
Take your two displacement maps, allocate an equally sized pbuffer and render the displacement maps as textures on buffer-filling quads, adjusting alpha as necessary. Then use the pbuffer texture as the displacement map on your object. BAM! Landscape/model/etc. blending without any CPU or vertex shaders involved.
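The arithmetic that pbuffer trick relies on is just the standard alpha-blend equation in 8-bit fixed point; this CPU model of it (the rounding convention is my assumption, real hardware varies) shows the blended buffer is the morphed map:

```python
# Model of the pbuffer blend: render map A opaque, then map B over it with
# alpha t; the blend unit computes dst = src*t + dst*(1 - t) in 8-bit
# fixed point. This emulates that arithmetic on the CPU.

def blend_over(dst, src, alpha):
    """One alpha-blended 'quad' pass: alpha and all channels are 0..255."""
    return [(s * alpha + d * (255 - alpha) + 127) // 255 for s, d in zip(src, dst)]

map_a = [0, 128, 255]
map_b = [255, 128, 0]
pbuffer = list(map_a)                        # first pass: A written opaque
pbuffer = blend_over(pbuffer, map_b, 128)    # second pass: B at ~50% alpha
```

At alpha 255 the buffer holds map B exactly, at alpha 0 map A, and anything between is the morph, with no CPU touch of the map data itself.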

I can't wait to get my hands on this card :).
 
PeterT, this does assume that these "displacement maps" are valid render targets, which I currently assume they are not... it depends on how the hardware is constructed. I do agree that allowing them to be render targets would allow for many cool effects... not just morphing.

Btw, during the OpenGL 2.0 GDC presentation I was told that it would not support displacement maps, and 3DLabs hinted that Matrox was reluctant to implement an extension for it...
 
Kristof said:
PeterT, this does assume that these "displacement maps" are valid render targets, which I currently assume they are not... it depends on how the hardware is constructed. I do agree that allowing them to be render targets would allow for many cool effects... not just morphing.
It would really suck if they are not valid render targets; after all, they are just two-dimensional arrays of data :(. If that's really the case then you couldn't use the glCopy functions either... So much for my dreams of dynamically rendering displacement maps and the associated demo effects.
Btw, during the OpenGL 2.0 GDC presentation I was told that it would not support displacement maps, and 3DLabs hinted that Matrox was reluctant to implement an extension for it...
Hmpf, I guess I'll have to get a P10-based board if I want good OpenGL support, and since I dislike DX that's a must for me. And I thought that Matrox had learned from its mistakes in the G200/400 era...
 
I think it's a pretty safe bet that renderable textures will be usable, and if not, then a glCopyTexSubImage2D() should always work.
 
Well, now we have assessments from two venerable members of this forum which point in different directions. Why do you believe that it will be supported, Humus?
 
The least I expect to be supported is a simple copy from system memory into the displacement texture, so uploading things from the CPU should be possible (else there would be no way to get it to work).

Blitting is basically a memory copy, so this might work as a kind of hacky way.

Direct support would mean that render core and displacement sampler can access the same data which might actually cause problems in parallelism (render core updating and vertex core already sampling, eek).
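The usual answer to that read-while-write hazard is double buffering, sketched here (the class and its names are my own illustration, not anything the hardware exposes): the render core writes one copy while the vertex unit samples the other, and the roles swap each frame.

```python
# Sketch of double buffering a displacement map: write map (1 - front)
# while sampling map front, then flip, so the two units never touch the
# same memory in the same frame.

class DoubleBufferedMap:
    def __init__(self, size):
        self.maps = [[0] * size, [0] * size]
        self.front = 0                       # the map the vertex unit samples

    def back_buffer(self):
        return self.maps[1 - self.front]     # the map the render core writes

    def flip(self):
        self.front = 1 - self.front

db = DoubleBufferedMap(4)
db.back_buffer()[0] = 42    # "render" into the back map
db.flip()                   # next frame samples the freshly written map
sampled = db.maps[db.front][0]
```

The cost is a second copy of the map in memory and one frame of latency on updates.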

I'd have to check the DX9 docs to see if these formats are valid render targets. And even then it's not certain; possibly this is just listed in the caps under supported render targets. Yes (just checked), there is a caps list in DX8 that lists the render targets supported by the render core; I guess if they list the displacement formats there it should be possible to render to them. But it will be wait and see for the caps structure of the DX9 drivers.

Say the displacement format is 8 bits pure (like the A8 texture format): you'd have to support the 332 (RGB) render format... hmm, I don't think anyone supports that currently. If the format is 16 bits then there is a more likely chance of this being supported.
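To see why a 332 target is a poor stand-in for an 8-bit height map, here is a small illustration (the packing and bit-replication helpers are mine, not any real API): writing a height as grey through RGB 332 leaves only 3 usable bits in the widest channel.

```python
# Sketch of the 332 precision problem: quantise an 8-bit grey to 3-3-2
# bits, then read the red channel back as a height; only 8 distinct
# height levels survive the round trip.

def to_rgb332(grey):
    """Quantise an 8-bit grey to 3-3-2 bits and pack into one byte."""
    r, g, b = grey >> 5, grey >> 5, grey >> 6
    return (r << 5) | (g << 2) | b

def red_channel_as_height(packed):
    """Read back the 3-bit red channel and expand to 8 bits (bit replication)."""
    r = (packed >> 5) & 0x07
    return (r << 5) | (r << 2) | (r >> 1)

h2 = red_channel_as_height(to_rgb332(200))
# h2 != 200: the height has been snapped to one of 8 levels.
```

With a 16-bit displacement format a standard 565 or 4444 target would keep far more of the height precision, which matches Kristof's point above.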

I wonder... IF this were possible, I would have shown it in my demos: it's an obvious super kewl effect to implement, dynamic displacement maps... I guess we'll have to place this on the B3D Interview Question List for Matrox... I'd hate to render to a format, then lock, copy it to system memory and then translate/copy it to a displacement format; this would zap performance and kill parallelism.

K
 
Kristof said:
I wonder... IF this were possible, I would have shown it in my demos: it's an obvious super kewl effect to implement, dynamic displacement maps... I guess we'll have to place this on the B3D Interview Question List for Matrox...

Exactly what I thought when I first read about displacement mapping :). I REALLY REALLY hope that it will be possible - with or without hacks - to do it without resorting to the dreaded AGP bus.
 