Problems with Deferred Shading

Wysicon

Newcomer
Hi folks, I've been messing around with deferred shading (I'm using D3D10), and everything pretty much works up through the G-buffer filling pass and reconstruction. The reconstructed world-space position is fine, the normals look fine, the light vector looks fine, etc. (I'm comparing against the forward rendering approach for reference). But somehow, the diffuse and specular terms don't turn out right. There seems to be a glitch in the normals, though it somehow doesn't appear that way... (There's no problem with the position, because I still get the same problem when I use a directional light.)

Here's my RT config:
Depth - R32F
Normals - R11G11B10F
Ambient + Occlusion - R8G8B8A8
Diffuse + MatID - R8G8B8A8
Specular + Emissive Term - R8G8B8A8
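For reference, the layout corresponds to an MRT pixel shader output struct along these lines (just a sketch; the struct and member names are illustrative, not the actual code):

Code:
// Hypothetical output struct matching the render target layout above.
struct PS_GBUFFER_OUT
{
    float4 Depth     : SV_Target0; // R32F: depth
    float4 Normal    : SV_Target1; // R11G11B10F: normal
    float4 AmbOcc    : SV_Target2; // R8G8B8A8: ambient rgb, occlusion in a
    float4 DiffMatID : SV_Target3; // R8G8B8A8: diffuse rgb, material ID in a
    float4 SpecEms   : SV_Target4; // R8G8B8A8: specular rgb, emissive in a
};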

I've tried storing the normals in an R8G8B8A8 buffer, and storing just the xy components in an R16G16F/R32G32F buffer and recomputing z, but I still get the same problem. Can anybody give me some insight into what's going on? Thanks a lot.

Here's the Deferred Rendering code (from the lighting pass):

Code:
// Fetch the depth map and compute world space position
float fDepth = g_txRTDepth.Sample(g_sampPointClamp, In.TexCoord).r; // Sample() returns float4; take the red channel
float4 cPos = float4(In.DepthCoord, fDepth, 1.0f); // Depth coord is (Tex.x * 2 - 1, 1 - 2 * Tex.y) and is computed in the vertex shader
float4 tPos = mul(cPos, g_mViewProjI);
float3 wPos = tPos.xyz / tPos.w;

// Fetch the normal map and compute the normals
float3 vNormal = g_txRTNormal.Sample(g_sampPointClamp, In.TexCoord).xyz;

// Fetch the ambient map and occlusion term
float4 vAmbOcc = g_txRTAmbOcc.Sample(g_sampPointClamp, In.TexCoord);
float3 vAmbient = vAmbOcc.rgb * vAmbOcc.a;

// Fetch the diffuse map and material ID
float4 vDiffMatID = g_txRTDiffMatID.Sample(g_sampPointClamp, In.TexCoord);
float3 vDiffuse = vDiffMatID.rgb;
int nMatID = vDiffMatID.a * 255;

// Fetch the specular map and emissive term
float4 vSpecEms = g_txRTSpecEms.Sample(g_sampPointClamp, In.TexCoord);
float3 vSpecular = vSpecEms.rgb;
float3 vEmissive = vDiffuse * vSpecEms.a;

// Compute the light vector and attenuation
float3 vLightDir = g_vLightPos - wPos;
float fLightDist = length(vLightDir);
float fAttenuation = 1 / (g_vLightAtten.x + fLightDist * g_vLightAtten.y + fLightDist * fLightDist * g_vLightAtten.z);
vLightDir = normalize(vLightDir);

// Compute the view direction
float3 vViewDir = normalize(wPos - g_vCamPos);

// Compute the diffuse and specular lighting terms
float NdotL = max(dot(vNormal, vLightDir), 0);
float RdotV = max(dot(normalize(reflect(vLightDir, vNormal)), vViewDir), 0);

// Compute the final lighting
float3 vLighting = vAmbient + (vDiffuse * NdotL + vSpecular * pow(RdotV, 50)) * g_vLightColor * fAttenuation;

return float4(vLightDir, 1.0f); // debug output: visualizing the light vector (the final result would be float4(vLighting, 1.0f))

And here's the Forward Rendering code:

Code:
// Compute the normals
float3 vN = g_txNormal.Sample(g_sampLinearWrap, In.TexCoord).xyz * 2 - 1;
float3 vBinormal = normalize(cross(In.Normal, In.Tangent)); // Should be done in the vertex shader
float3x3 mBTN = {vBinormal, In.Tangent, In.Normal};
float3 vNormal = normalize(mul(vN, mBTN));

// Compute the ambient color
float fOcc = g_txOcclusion.Sample(g_sampLinearWrap, In.TexCoord).r;
float3 vAmbient = g_vAmbientColor * fOcc;

// Fetch the diffuse map
float3 vDiffuse = g_txDiffuse.Sample(g_sampLinearWrap, In.TexCoord).rgb;

// Fetch the specular and emissive maps
float3 vSpecular = g_txSpecular.Sample(g_sampLinearWrap, In.TexCoord).rgb;

// Compute the light vector and attenuation
float3 vLightDir = g_vLightPos - In.WPos;
float fLightDist = length(vLightDir);
float fAttenuation = 1 / (g_vLightAtten.x + fLightDist * g_vLightAtten.y + fLightDist * fLightDist * g_vLightAtten.z);
vLightDir = normalize(vLightDir);

// Compute the view direction
float3 vViewDir = normalize(In.WPos - g_vCamPos);

// Compute the diffuse and specular lighting terms
float NdotL = max(dot(vNormal, vLightDir), 0);
float RdotV = max(dot(normalize(reflect(vLightDir, vNormal)), vViewDir), 0);

// Compute the final lighting
float3 vLighting = vAmbient + (vDiffuse * NdotL + vSpecular * pow(RdotV, 50)) * g_vLightColor * fAttenuation;

return float4(vLightDir, 1.0f); // debug output: visualizing the light vector (the final result would be float4(vLighting, 1.0f))


Here are some images...

Normals - Deferred Rendering
normalsdeferredtj0.png


Normals - Forward Rendering
normalsforwardou0.png


Light Vector - Deferred Rendering
lightvecdeferredog4.png


Light Vector - Forward Rendering
lightvecforwardum9.png


Lighting - Deferred Rendering
lightingdeferredps3.png


Lighting - Forward Rendering
lightingforwardzc5.png
 
Solved! :D I just had to scale and bias the normal into the 0-1 range :smile: Works perfectly now! Strange that even a floating-point format like R11G11B10F can't store negative values accurately :rolleyes:
 
Strange that even a floating-point format like R11G11B10F can't store negative values accurately :rolleyes:

Not strange at all. R11G11B10F doesn't have a sign bit, so you can only store positive numbers in it. It's not an ideal format for normals anyway, both because of this and because normals are limited to the [-1, 1] range, so a signed fixed-point format is much more suitable for this kind of data. I'd suggest R10G10B10A2_UNORM, or, if you don't want to do the scale-bias, you can use R8G8B8A8_SNORM. Or even R8G8_SNORM, if you store eye-space normals and reconstruct z in the shader.
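For example, a minimal sketch of the scale-bias encode/decode (function names are illustrative), plus the z reconstruction for the two-channel eye-space variant:

Code:
// Encode a [-1, 1] normal for an unsigned format such as R10G10B10A2_UNORM.
float4 EncodeNormal(float3 vNormal)
{
    return float4(vNormal * 0.5f + 0.5f, 0.0f); // [-1, 1] -> [0, 1]
}

// Decode back to [-1, 1] in the lighting pass.
float3 DecodeNormal(float3 vEncoded)
{
    return vEncoded * 2.0f - 1.0f;              // [0, 1] -> [-1, 1]
}

// For R8G8_SNORM: store the eye-space xy, reconstruct z in the shader.
// Assumes a unit-length normal facing the camera, so z is taken as positive.
float3 DecodeNormalXY(float2 vXY)
{
    return float3(vXY, sqrt(saturate(1.0f - dot(vXY, vXY))));
}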
 
Depth

Maybe I'm a bit late, but Wysicon, could you please tell me how you pack the depth into the render target?

I'm having some problems with it.

Thanks.
 
I won't create a new topic, so here it is:
Code:
// object-to-MRT pixel shader:
Out.depth = float4(length(In.PosWV.xyz), 0, 0, 0);
// where, in the vertex shader:
Out.PosWV = mul(In.Pos, matWorldView);

// and lighting in the full-screen quad:
// vs:
Out.EyeRay = float3(In.Pos.x * ViewAspect, In.Pos.y,  tan(60.0f*PI/180.0f)/2.0f);
// ps:
float depth = tex2D(depthSampler, In.Tex).x;
float3 view_space_position = normalize(In.EyeRay.xyz) * depth;
float3 world_pos = mul(view_space_position, matViewInv);
The problem: view_space_position looks correct, but the world position is not quite right; it shifts a little whenever I rotate the camera.
Please help me find the correct way to write the depth and reconstruct the view-space (and ideally world-space) position.

P.S. I am sorry for my English
 
Nick, you may want to have a look at this thread on gamedev.net; it discusses some of the various techniques for reconstructing position from a depth buffer.
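For reference, one of the variants discussed there reconstructs the view-space position from a normalized linear depth and a per-vertex ray to the far plane. A minimal sketch, assuming the depth is stored as view-space z divided by the far clip distance and In.FarRay holds the interpolated view-space far-corner position:

Code:
// lighting pass pixel shader (sketch)
float fLinearDepth = tex2D(depthSampler, In.Tex).x;          // viewZ / farClip
float3 vViewPos = In.FarRay * fLinearDepth;                  // scale the far-plane ray
float3 vWorldPos = mul(float4(vViewPos, 1.0f), ViewInv).xyz; // back to world space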
 
Hi MJP,
I've already read that thread, but I still have some questions.

float far_clip = 1000.0f;
float fov_y = 60.0f*PI/180.0f;
float tangent_half_fov_y = tanf(fov_y/2.0f);
float far_y = tangent_half_fov_y*far_clip;
float far_x = far_y * 800.0f / 600.0f;

I store these four far-frustum-corner positions (in view space) in the four quad vertices:

vector4 far_vertex0 = vector4(-far_x, far_y, far_clip, 1);
vector4 far_vertex1 = vector4( far_x, far_y, far_clip, 1);
vector4 far_vertex2 = vector4( far_x, -far_y, far_clip, 1);
vector4 far_vertex3 = vector4(-far_x, -far_y, far_clip, 1);

// in the object render
// in vs: Out.PosWV = mul(In.Pos, WorldView);
//in ps:
float farZ = 1000.0f;
Out.Depth = float4(In.PosWV.z/farZ, 0, 0, 0);
----------------

They are interpolated from the vertex shader to the pixel shader, where I read them as screenDir:
//lighting vs:
Out.screenDir = In.DirTo4Corners;
//lighting ps:
half depth = tex2D(depthSampler, In.Tex).x;
float3 viewSpacePos = In.screenDir * depth;

viewSpacePos looks correct.
But the world position is not correct when I rotate the camera:

float4 worldPos = mul(float4(viewSpacePos,1), ViewInv);

screens:

deferredproblem0it1.th.jpg

deferredproblem1vh9.th.jpg



Note the green color on the leg.
Maybe my four far-frustum-corner vectors are wrong?
 
Excuse me, I made a mistake.
The algorithm works perfectly!

The positions I wrote to the depth buffer and to Out.Pos were different!
Code:
// this was a skeletal mesh!
// Pos is the position already transformed by the bone matrices
half4 pos = mul(float4(Pos, 1), ViewProj);
Out.Pos = pos;
Out.Pos2 = pos;
// error: I wrote:
//Out.PosWV = mul(float4(In.Pos,1), View);
// correct:
Out.PosWV = mul(float4(Pos,1), View);
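In other words, both outputs have to be derived from the same skinned position. Consolidated, the corrected code looks roughly like this (a sketch using the names from the thread):

Code:
// vertex shader: both outputs come from the bone-transformed position Pos
half4 pos = mul(float4(Pos, 1), ViewProj);
Out.Pos = pos;
Out.PosWV = mul(float4(Pos, 1), View); // not the un-skinned In.Pos

// pixel shader: write the normalized linear view-space depth
Out.Depth = float4(In.PosWV.z / farZ, 0, 0, 0);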
 