Just twiddling around with filtering inside the pixel shader, and I wonder how this is supposed to work. I've got no mip-maps (the texture is too large), so I can't use HW filtering, and I'm off implementing my own.
Trilinear in the pixel shader seems easy enough. I snap the coordinates, compute four bilinear samples plus the one for the current mip-map level, and blend them together using 4 lerps (3 for the lookup at the non-existing coarser mip-map level, and one for the final blend). So far, this looks 1:1 like the HW filtering. Not sure if there is a more efficient way, as all the lerps look a bit wasteful.
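For reference, here's roughly what I mean, as a hedged GLSL sketch. The sampler, texture size, and LOD estimate are my own assumptions (not from any particular hardware spec); the missing coarser level is faked by averaging four offset bilinear taps from the base texture:

```glsl
// Sketch of manual trilinear filtering on a single non-mipmapped texture.
// `tex` and `texSize` are hypothetical names; the LOD estimate mimics
// what HW derives from screen-space derivatives.
vec4 trilinearManual(sampler2D tex, vec2 uv, vec2 texSize)
{
    // Estimate the LOD from screen-space derivatives, as HW would.
    vec2 dx = dFdx(uv) * texSize;
    vec2 dy = dFdy(uv) * texSize;
    float lod     = 0.5 * log2(max(dot(dx, dx), dot(dy, dy)));
    float blend   = clamp(lod, 0.0, 1.0); // factor between the two levels

    // Bilinear tap at the base (finest) level.
    vec4 fine = texture(tex, uv);

    // Approximate the next-coarser level with four offset bilinear taps.
    vec2 o = 0.5 / texSize;
    vec4 coarse = 0.25 * (texture(tex, uv + vec2(-o.x, -o.y))
                        + texture(tex, uv + vec2( o.x, -o.y))
                        + texture(tex, uv + vec2(-o.x,  o.y))
                        + texture(tex, uv + vec2( o.x,  o.y)));

    // Final lerp between the two levels.
    return mix(fine, coarse, blend);
}
```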
However, I'm stuck on anisotropic. I couldn't find a good reference on the web except for two. For some reason or another (maybe I'm just too stupid), I can't get the first one working, and I didn't get around to implementing the second one yet. Nevertheless, I'm wondering: how exactly is it implemented in hardware? Or at least, how can I get reasonably close to a hardware implementation in the pixel shader? Does anyone know a good source, or better yet, a working GLSL/HLSL implementation that is reasonably fast?
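The closest I've gotten is the textbook footprint approach: take several bilinear taps distributed along the major axis of the pixel footprint, with the minor axis setting the tap density. A hedged GLSL sketch, assuming hypothetical names `tex`, `texSize`, and a `MAX_ANISO` cap (I don't claim this matches any vendor's actual circuit):

```glsl
// Footprint-based anisotropic filtering sketch: average several bilinear
// taps along the major axis of the screen-space footprint.
const int MAX_ANISO = 8; // assumed cap, analogous to the HW aniso setting

vec4 anisoManual(sampler2D tex, vec2 uv, vec2 texSize)
{
    vec2 dx = dFdx(uv) * texSize;
    vec2 dy = dFdy(uv) * texSize;

    float lenX = dot(dx, dx);
    float lenY = dot(dy, dy);

    // Major axis = direction of greatest change across the footprint.
    vec2  major    = (lenX > lenY) ? dFdx(uv) : dFdy(uv);
    float minorLen = sqrt(min(lenX, lenY));

    // Number of taps ~ anisotropy ratio, clamped to MAX_ANISO.
    float ratio = sqrt(max(lenX, lenY)) / max(minorLen, 1e-6);
    int   taps  = int(clamp(ceil(ratio), 1.0, float(MAX_ANISO)));

    // Distribute the taps evenly along the major axis and average them.
    vec4 sum = vec4(0.0);
    for (int i = 0; i < taps; ++i)
    {
        float t = (float(i) + 0.5) / float(taps) - 0.5;
        sum += texture(tex, uv + major * t);
    }
    return sum / float(taps);
}
```

In a mipmapped setup each tap would also pick its LOD from the minor axis length, which is what keeps the taps from aliasing; without mip levels this just degrades to a line of bilinear taps.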