Jerry Cornelius
Newcomer
That's an interesting idea: just point sample at a ridiculously high resolution and downsample. You wouldn't get any anisotropy in a scheme like that, though, and your framebuffer bandwidth requirements would be high.
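For what it's worth, here's a rough sketch of that brute-force approach in C, assuming a single-channel texture (Texture, tex_point_sample and downsampled_pixel are made-up names, not any real API): each final pixel is just the box-filtered average of an SS x SS grid of point samples, and every subpixel is its own texture fetch, which is exactly where the bandwidth goes.

[code]
#include <math.h>

#define SS 4  /* supersampling factor per axis: SS*SS point samples per pixel */

typedef struct { int w, h; const unsigned char *texels; } Texture;

/* Nearest-texel (point) sample with wrap addressing. */
static unsigned char tex_point_sample(const Texture *t, float u, float v)
{
    u -= floorf(u);                 /* wrap u,v into [0,1) */
    v -= floorf(v);
    int x = (int)(u * t->w);
    int y = (int)(v * t->h);
    if (x >= t->w) x = t->w - 1;    /* guard against rounding up to the edge */
    if (y >= t->h) y = t->h - 1;
    return t->texels[y * t->w + x];
}

/* One final pixel: box-filter the SS x SS subpixel grid covering it.
 * (u0,v0) is the pixel's top-left uv, (du,dv) the uv step per pixel.
 * Every subpixel is its own texture fetch -- that's the bandwidth cost. */
static unsigned char downsampled_pixel(const Texture *t,
                                       float u0, float v0,
                                       float du, float dv)
{
    unsigned int sum = 0;
    for (int sy = 0; sy < SS; sy++)
        for (int sx = 0; sx < SS; sx++) {
            float u = u0 + du * ((float)sx + 0.5f) / SS;
            float v = v0 + dv * ((float)sy + 0.5f) / SS;
            sum += tex_point_sample(t, u, v);
        }
    return (unsigned char)(sum / (SS * SS));
}
[/code]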
Eventually I think TMUs will be programmable to take a patch of samples instead of just blending the four nearest.
16 or more samples, 256 possible patterns, and throw the weighted blending out the window.
You could make the density of the sample patch variable depending on the z value and do away with mip levels as well. Sort of "sparse" texture filtering for textures on far-away polygons. It would be hell on memory, but it should avoid a lot of texture aliasing.
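Something like this, maybe, reusing the Texture struct and tex_point_sample helper from the sketch above (patch_sample and FOOTPRINT_SCALE are hypothetical; the scale constant stands in for a real derivative-based footprint estimate): a fixed 4x4 patch of point samples spread over a footprint that grows with z, averaged without weights, no mip chain involved.

[code]
#define PATCH_DIM        4        /* 4x4 = the 16 samples mentioned above */
#define FOOTPRINT_SCALE  0.001f   /* made-up stand-in for a footprint estimate */

static unsigned char patch_sample(const Texture *t,
                                  float u, float v,   /* patch centre in uv */
                                  float z)            /* eye-space depth */
{
    /* Spread the patch over more of the texture as the polygon recedes,
     * so the samples cover roughly the texels one screen pixel maps to --
     * this is what stands in for picking a mip level. */
    float footprint = FOOTPRINT_SCALE * z;
    unsigned int sum = 0;
    for (int sy = 0; sy < PATCH_DIM; sy++)
        for (int sx = 0; sx < PATCH_DIM; sx++) {
            float su = u + footprint * (((float)sx + 0.5f) / PATCH_DIM - 0.5f);
            float sv = v + footprint * (((float)sy + 0.5f) / PATCH_DIM - 0.5f);
            sum += tex_point_sample(t, su, sv);
        }
    /* Straight average: the weighted blending goes out the window. */
    return (unsigned char)(sum / (PATCH_DIM * PATCH_DIM));
}
[/code]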