Texture unit problems

kangcool

Newcomer
Trying to use a shader which draws a shadow.
I can generate my shadow map and pass it into the shader (I think), but no shadow appears :cry:

Any ideas on how to get this to work :?:
Any ideas on where I can find a good tutorial on how to use texture units? Because I've got no clue what I am doing wrong :cry:
 
You will need to provide more information.

Video card?
Texture format (depth-stencil texture?)
Does a normal texture show up?
Screenshots + shader source?

Try Nvidia's or ATI's developer resource centres, e.g. developer.nvidia.com and ati.com/developer. Each has code samples and lots and lots of docs.
Also, if you are using Direct3D, make sure you have the DirectX SDK installed; it has *heaps* of samples/tutorials/docs.
 
Video Card
Nvidia 7900GT using latest drivers with OpenGL 2.0.2

Texture format
Depth test texture

does a normal texture show up?
Not sure what you mean?

screen shots + shader source
Code:
const GLchar* shaderVertSrc[] = {
"attribute float Accessibility;"
"varying vec4  ShadowCoord;"  
"const float As = 1.0 / 1.5;"
"const float Ds = 1.0 / 3.0;"
"void main()"
"{"
"    vec4 ecPosition = gl_ModelViewMatrix * gl_Vertex;"
"    vec3 ecPosition3 = (vec3(ecPosition)) / ecPosition.w;"
"    vec3 VP = vec3(gl_LightSource[0].position) - ecPosition3;"
"    VP = normalize(VP);"
"    vec3 normal = normalize(gl_NormalMatrix * gl_Normal);"
"    float diffuse = max(0.0, dot(normal, VP));"
"    float scale = min(1.0, Accessibility * As + diffuse * Ds);"
"    vec4 texCoord = gl_TextureMatrix[1] * gl_Vertex;"
"    ShadowCoord   = texCoord / texCoord.w;"
"    gl_FrontColor  = vec4(scale * gl_Color.rgb, gl_Color.a);"
"    gl_Position    = ftransform();"
"}"
};

const GLchar* shaderFragSrc[] = {
"uniform sampler2DShadow ShadowMap;"
"uniform float Epsilon;"
"uniform bool  SelfShadowed;"
"uniform float SelfShadowedVal;"
"uniform float NonSelfShadowedVal;"
"varying vec4 ShadowCoord;" /* must be vec4 to match the vertex shader, or the program will not link */
"float Illumination;"
"float lookup(vec2 position)"
"{"
"    float x = position.x;"
"    float y = position.y;"
"    float depth = shadow2D(ShadowMap, ShadowCoord.xyz + vec3(x, y, 0.0) * Epsilon).x;"
"    return depth != 1.0 ? Illumination : 1.0;"
"}"
"void main()"
"{"
"    Illumination = SelfShadowed ? SelfShadowedVal : NonSelfShadowedVal;"
"    vec2 o = mod(floor(gl_FragCoord.xy), 2.0);"
"    float sum = 0.0;"
"    sum += lookup(vec2(-1.5,  1.5) + o);"
"    sum += lookup(vec2( 0.5,  1.5) + o);"
"    sum += lookup(vec2(-1.5, -0.5) + o);"
"    sum += lookup(vec2( 0.5, -0.5) + o);"
"    gl_FragColor = vec4(sum * 0.25 * gl_Color.rgb, gl_Color.a);"
"}"
};

Been staring at all the demos and docs for weeks wondering why it does not work :cry:
 
When I say 'normal texture', I mean: if you use any old texture, say, load up bubbles.bmp out of the Windows directory, does it show up at all? If it does, that may make it immediately obvious what's wrong. If nothing shows up, then the shader might have problems.

The trouble with DST (depth) textures on Nvidia hardware is that they are not treated as normal textures: the texture lookup gets compared against the z texture coordinate, so you only ever get 1 or 0, white or black. This makes it rather difficult to debug the damned things.
 