Integer textures

onesvenus

Newcomer
Hi, I'm trying to use a 3D integer texture and I'm having some problems doing a texture lookup in the fragment shader. I'm using OpenGL 3.1 on a GTX 280, which supports the GL_EXT_gpu_shader4 extension.

In the call to glTexImage3D I'm using GL_UNSIGNED_INT_8_8_8_8 as the type parameter and GL_RGBA as the internal format.
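Roughly, the upload looks like this (the texture object, dimensions and data pointer are just placeholders for my actual ones):

glBindTexture(GL_TEXTURE_3D, volumeTex);
glTexImage3D(GL_TEXTURE_3D,
             0,                        // mipmap level
             GL_RGBA,                  // internal format
             width, height, depth,
             0,                        // border
             GL_RGBA,                  // format of the client data
             GL_UNSIGNED_INT_8_8_8_8,  // type of the client data
             voxelData);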

In the shader I'm using the following call to do the texture lookup:

uniform usampler3D tex3d;  // unsigned integer sampler for the integer texture
uvec4 vecColor = texture(tex3d, posFrag.xyz);

Now, I'm assuming that each component of vecColor will be a 32-bit unsigned int where only the lowest byte carries any information. Is that assumption wrong?

The problem is that when I write and load a texture with all values set to 0xFF000000 and check whether vecColor.r is bigger than 0 (using the comparison ((vecColor.r & 0xFFu) > 0u)), the comparison evaluates to false. In fact, it is false for every component.

Should I bitwise-AND each component with 0xFF before comparing, or can I assume the upper bytes are 0x00?

One thing I found while debugging that may help: when I load the same texture using GL_UNSIGNED_BYTE and do the texture lookup in the fragment shader with a sampler2D, the results are as expected.

Well, if someone can help me, I'd appreciate it. I've been looking at the OpenGL specs for hours without finding anything.

Thanks!
 
Thanks for your reply, Humus. It works now.

Since I'm using OpenGL 3.1 and integer textures are now part of the core spec, I just used GL_RGBA8UI as the internalformat and GL_RGBA_INTEGER as the format parameter. Looking at the OpenGL 3.1 reference I can see some tables where RGBA8UI is mentioned, but there isn't any real explanation of it.
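In case it helps anyone else, the working upload looks roughly like this (the dimensions, data pointer and nearest filtering are from my own setup, so treat it as an example rather than the only way to do it):

glTexImage3D(GL_TEXTURE_3D,
             0,                        // mipmap level
             GL_RGBA8UI,               // sized unsigned-integer internal format
             width, height, depth,
             0,                        // border
             GL_RGBA_INTEGER,          // client data is unnormalized integer data
             GL_UNSIGNED_INT_8_8_8_8,  // same packed type as before
             voxelData);
// Integer textures can't be linearly filtered, so I keep them at nearest:
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

With GL_RGBA8UI the usampler3D lookup returns the raw 0-255 value per component, so the 0xFF mask from my original comparison shouldn't even be needed anymore.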
 