A burning, long-unanswered question about smoke effects in games !!!!

CamRaiD

Newcomer
Hi,

I know little about the terminology in general, but I would love it if someone could answer this for me: why is it that games still have the annoying tendency for smoke and explosions to be cut off in an ugly way when their flatness passes through a wall ???

I would have thought that our technology would have found a way to stop that (clipping?) effect, which looks sorta bad...

Are there ways to stop that ??

Why don't programmers use those ways ??

We can do so much with graphics, but we can't stop the smoke from getting cut off in the floor !! :D
 
Depends on how the 'smoke' was generated.

If it's done with flat textures projected onto flat 2D planes in 3D space, you'll get clipping where that 2D plane intersects another object (wall, rock, tree, another actor, another explosion, etc.)

The most readily available answers would be either a particle system to "emit" the smoke, or a volumetric texture. Particles will most likely look more realistic, given a proper implementation.
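
Very roughly, a CPU-side particle emitter might look something like the sketch below. All the names here (Particle, SmokeEmitter, etc.) are made up for illustration, not taken from any particular engine: each frame you spawn a few puffs at the emitter, drift them upward, fade them out, and let the renderer draw each one as a small camera-facing quad.

```cpp
// Rough sketch of a CPU-side smoke emitter; purely illustrative.
#include <cstdlib>
#include <vector>

struct Vec3 { float x, y, z; };

struct Particle {
    Vec3  pos;    // world-space centre of the billboard
    Vec3  vel;    // drift velocity (mostly upward for smoke)
    float life;   // seconds remaining
    float alpha;  // current opacity
};

class SmokeEmitter {
public:
    explicit SmokeEmitter(Vec3 origin) : origin_(origin) {}

    void update(float dt) {
        // Spawn a handful of new puffs each frame.
        for (int i = 0; i < 4; ++i) {
            Particle p;
            p.pos   = origin_;
            p.vel   = { rnd(-0.2f, 0.2f), rnd(0.8f, 1.5f), rnd(-0.2f, 0.2f) };
            p.life  = rnd(1.0f, 2.5f);
            p.alpha = 1.0f;
            particles_.push_back(p);
        }
        // Advance and fade existing puffs, dropping dead ones.
        for (size_t i = 0; i < particles_.size(); ) {
            Particle& p = particles_[i];
            p.pos.x += p.vel.x * dt;
            p.pos.y += p.vel.y * dt;
            p.pos.z += p.vel.z * dt;
            p.life  -= dt;
            p.alpha  = p.life > 0.0f ? p.life / 2.5f : 0.0f;
            if (p.life <= 0.0f) {
                particles_[i] = particles_.back();  // swap-remove dead particle
                particles_.pop_back();
            } else {
                ++i;
            }
        }
    }

    const std::vector<Particle>& particles() const { return particles_; }

private:
    static float rnd(float lo, float hi) {
        return lo + (hi - lo) * (static_cast<float>(std::rand()) / RAND_MAX);
    }
    Vec3 origin_;
    std::vector<Particle> particles_;
};
```

Because each quad is small, an intersection with a wall or the floor only cuts off a sliver of the effect rather than slicing one huge flat plane in half.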
 
Hi,

I know little about the terminology in general, but I would love it if someone could answer this for me: why is it that games still have the annoying tendency for smoke and explosions to be cut off in an ugly way when their flatness passes through a wall ???

I would have thought that our technology would have found a way to stop that (clipping?) effect, which looks sorta bad...

Are there ways to stop that ??

Why don't programmers use those ways ??

We can do so much with graphics, but we can't stop the smoke from getting cut off in the floor !! :D

IIRC one of DX10's features was the ability to read Z and implement "soft particles" blending to, among other things, avoid exactly what you describe.
 
If you want to just stop the part where an explosion/smoke object is clipped against the wall, one possible approach would be to use an occlusion query: one query for each object, then render in their entirety any objects that pass the test. This doesn't necessarily work all that well, though; if the occlusion-query test region is occluded - e.g. the explosion in question happens just behind an enemy - you end up not drawing the explosion at all when you should have drawn it, so the enemy looks as if it is being blown up by nothing at all.
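
Roughly, with D3D9-style occlusion queries the decision might look something like this sketch. The two draw helpers are just placeholders for whatever the engine actually uses to submit geometry, and a real implementation would use last frame's query result rather than stalling:

```cpp
// Sketch of the per-effect occlusion-query idea using D3D9's D3DQUERYTYPE_OCCLUSION.
// drawProxyQuad() and drawExplosionBillboard() are placeholders, not real API calls.
#include <d3d9.h>

void drawProxyQuad(IDirect3DDevice9* dev);          // tiny quad at the effect's centre
void drawExplosionBillboard(IDirect3DDevice9* dev); // the full 2D explosion sprite

void drawExplosionIfVisible(IDirect3DDevice9* dev)
{
    IDirect3DQuery9* query = nullptr;
    if (FAILED(dev->CreateQuery(D3DQUERYTYPE_OCCLUSION, &query)))
        return; // occlusion queries not supported; fall back to normal drawing

    // Count how many pixels of a small proxy at the effect's centre pass the Z test,
    // with colour and depth writes disabled so the proxy itself stays invisible.
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
    query->Issue(D3DISSUE_BEGIN);
    drawProxyQuad(dev);
    query->Issue(D3DISSUE_END);
    dev->SetRenderState(D3DRS_COLORWRITEENABLE, 0x0F);
    dev->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);

    DWORD visiblePixels = 0;
    while (query->GetData(&visiblePixels, sizeof(visiblePixels), D3DGETDATA_FLUSH) == S_FALSE)
        ; // busy-waiting stalls the pipeline; shown here only for simplicity

    if (visiblePixels > 0)
        drawExplosionBillboard(dev); // draw the whole sprite, typically with the Z test disabled
    query->Release();
}
```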

A problem here is that the explosion effect - at least if it is drawn as a flat 2D object - presumably should be drawn either decidedly in front of or decidedly behind each object in the scene, with the decision made on a per-object basis. Such decisions cannot always be resolved sensibly; e.g. in the case of an explosion happening behind an enemy, you may decide that the explosion should be drawn in front of the ground the enemy stands on. This doesn't work all that well either; you now get a big chunk of explosion that violates depth perspective.

Using a particle cloud partially overcomes these problems; you can still get individual particles clipped, but this is likely much less egregious than having a big flat 2D object clipped. This may require a fair amount of GPU horsepower to look good, though.

A more elaborate method that uses the depth buffer to perform soft blending (instead of making hard ordering decisions) is documented on the Nvidia developer site, at http://developer.download.nvidia.com/SDK/10/direct3d/samples.html#SoftParticles; this looks much better than any of the approaches above, providing a good solution to the whole clipped-explosions problem. It apparently requires DirectX10 features, though - which rules out its use in cross-platform games.
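
For reference, the core of that trick is just a per-pixel alpha fade based on how far the particle fragment sits in front of whatever is already in the depth buffer. Here is the maths as a plain C++ sketch; in the real technique this runs in the particle's pixel shader, with the scene depth sampled from a depth texture, and the constant names here are made up for illustration:

```cpp
// Soft-particle fade factor: purely illustrative CPU-side maths.
#include <algorithm>
#include <cstdio>

float softParticleFade(float sceneDepth,    // view-space depth of the opaque scene at this pixel
                       float particleDepth, // view-space depth of the particle fragment
                       float fadeDistance)  // over how many units the fade happens
{
    // How far the particle fragment is in front of the wall/floor behind it.
    float diff = sceneDepth - particleDepth;
    // 0 at the intersection, 1 once the particle is well clear of the geometry.
    return std::clamp(diff / fadeDistance, 0.0f, 1.0f);
}

int main()
{
    // A particle fragment 0.05 units in front of a floor, with a 0.5 unit fade range:
    // the fade factor is ~0.1, so the hard intersection line dissolves into a soft gradient.
    std::printf("fade = %.2f\n", softParticleFade(10.05f, 10.0f, 0.5f));
}
```

The particle's normal alpha is simply multiplied by this factor before blending, so it reaches zero exactly where it would otherwise have produced a hard clipped edge.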
 
IIRC one of DX10's features was the ability to read Z and implement "soft particles" blending to, among other things, avoid exactly what you describe.

Hmm, Team Fortress 2 had soft particles present during the beta, which they soon removed for performance reasons. Similar effect?
 
A more elaborate method that uses the depth buffer to perform soft blending (instead of making hard ordering decisions) is documented on the Nvidia developer site, at http://developer.download.nvidia.com/SDK/10/direct3d/samples.html#SoftParticles; this looks much better than any of the approaches above, providing a good solution to the whole clipped-explosions problem. It apparently requires DirectX10 features, though - which rules out its use in cross-platform games.
I think Incoming Forces used feathered/soft sprites on DX8, and many games did after that, including the CoD series.
 
Hmm, Team Fortress 2 had soft particles present during the beta, which they soon removed for performance reasons. Similar effect?

I really don't know; I've never played TF2. I'll answer anyway - the illuminati (pun intended) are free to correct me if I spew too much drivel. This is from my recollection of old conversations; I've never written a single line of DX code.

I assume TF2 has, at best, a DX9 path. In DX9 there is no way to read the Z buffer. But you are free to implement your own Z buffer as an MRT, with the big caveat that you will bypass any fixed-function Z-acceleration hardware.
Soft blending as described above uses the distance between the semi-transparent primitive's depth and the opaque Z depth to modify the alpha blending factor, making the primitive fully transparent as it crosses the floor (opaque Z depth = primitive depth).
On naive DX9 you go from "accelerated Z writes and ROPs" to "non-accelerated Z writes plus Z reads", incurring a substantial performance penalty.
On DX10 you keep the accelerated Z writes and just pay for an extra Z texture read and some ALU work, versus a normal alpha-blended "smoke" wall with horrible lines at the intersections.
"Soft particle blending" was a bullet point in the DX10 PR campaign, with plenty of screenshots on the usual websites.
 
I really don't know; I've never played TF2. I'll answer anyway - the illuminati (pun intended) are free to correct me if I spew too much drivel. This is from my recollection of old conversations; I've never written a single line of DX code.

I assume TF2 has, at best, a DX9 path. In DX9 there is no way to read the Z buffer. But you are free to implement your own Z buffer as an MRT, with the big caveat that you will bypass any fixed-function Z-acceleration hardware.
Soft blending as described above uses the distance between the semi-transparent primitive's depth and the opaque Z depth to modify the alpha blending factor, making the primitive fully transparent as it crosses the floor (opaque Z depth = primitive depth).
On naive DX9 you go from "accelerated Z writes and ROPs" to "non-accelerated Z writes plus Z reads", incurring a substantial performance penalty.
On DX10 you keep the accelerated Z writes and just pay for an extra Z texture read and some ALU work, versus a normal alpha-blended "smoke" wall with horrible lines at the intersections.
"Soft particle blending" was a bullet point in the DX10 PR campaign, with plenty of screenshots on the usual websites.
What you can do with DX9 and earlier is write depth values into another MRT.
This way you avoid the strange problems you would otherwise have with the Z buffer.

After a basic implementation, one might also want to give the particles depth information about their own shape.
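
One way to read that last suggestion (just a sketch of the idea, not taken from any particular game): treat each billboard particle as if it were a sphere, derive a per-pixel thickness from the billboard UVs, and use that thickness as the fade range, so the thin edges of a puff dissolve earlier than its thick centre.

```cpp
// Sketch: give a billboard particle per-pixel "shape" depth by pretending it is a sphere.
// uv is the billboard texture coordinate in [0,1]^2; radius is the particle's world-space radius.
#include <algorithm>
#include <cmath>

// Half-thickness of an imaginary sphere at this billboard pixel (0 at the edge, radius at the centre).
float sphereHalfThickness(float u, float v, float radius)
{
    float dx = 2.0f * u - 1.0f;              // -1..1 across the billboard
    float dy = 2.0f * v - 1.0f;
    float r2 = dx * dx + dy * dy;            // squared distance from the centre
    return r2 >= 1.0f ? 0.0f : radius * std::sqrt(1.0f - r2);
}

// Fade against the opaque scene using the particle's own thickness as the fade range.
float shapedSoftFade(float sceneDepth, float particleDepth, float u, float v, float radius)
{
    float half = sphereHalfThickness(u, v, radius);
    if (half <= 0.0f) return 0.0f;           // outside the sphere footprint: fully transparent
    float diff = sceneDepth - (particleDepth - half); // distance from the particle's front surface
    return std::clamp(diff / (2.0f * half), 0.0f, 1.0f);
}
```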
 
We have soft particles in Just Cause 2. We use them for clouds too. They come with a performance penalty, but on any decent video card I recommend having them enabled.
 
I first remember seeing 'soft' smoke edges in COD 2 and thought that we'd never see hard edges in smoke again. FEAR really surprised me with the hard edges there. It was the one graphical aspect of that game that really let it down for me.

I'm surprised to see it in any game nowadays. COD 2 was a long time ago.
 