Problem with floating point textures on ATI hardware

poly-gone

Newcomer
Starting with Catalyst 3.9, ATI's drivers no longer seem to be "filtering" floating point textures properly (though we can't really call it filtering, since ATI's current-generation hardware doesn't support FP filtering).

The problem is that in any app that uses fp textures, the textures seem jagged and pixelated. For example, this "effect" can be seen in the HDR Lighting sample that ships with the DirectX 9 SDK, where the regions around the bloom seem "jagged".

Does anybody know what's going on?
 
Point sampling has been there all along. So that wouldn't explain why the textures appear pixelated after Catalyst 3.9.
 
Have you talked to ATI about this? They may have corrected a bug, i.e. misreporting what the hardware supports in the cap bits.
 
poly-gone said:
Point sampling has been there all along. So that wouldn't explain why the textures appear pixelated after Catalyst 3.9.

For some of the Catalyst releases they had "some" support for bilinear filtering on FP textures, but it was removed again because it only worked in some cases.
 
Have you talked to ATI about this?
Nope. I think I will.

For some of the Catalyst releases they had "some" support for bilinear filtering on FP textures, but it was removed again because it only worked in some cases.
That I didn't know. If they did support bilinear filtering, it's quite possible they still do, which could be causing the pixelation. But bilinear filtering is definitely better than point filtering, so the whole thing simply doesn't seem right :)
 
Reported months ago by Humus:
In the first version I used a floating point render target, which as you know only supports point sampling. Somehow though, it looked as if filtering worked. My app requested linear filtering, even though it wasn't supported, and somehow the driver did something odd so that it almost worked in this particular case. A bug made it work fine, so to speak. Basic and I had an email session over this theme not long after I released the demo and he had some theories that explained the behavior. Quite interesting subject anyway. It seems however that they have fixed that bug now.
And a nice description of why, by Basic:
The bilinear filtering hardware in R3x0 (and any other current chip too) can only do bilinear filtering on integers. But if bilinear filtering was enabled for a FP16 texture, and you used drivers before 4.1, it seemed to work. The problem is that it didn't do floating point filtering. It actually took the 16 bits that were supposed to be a FP16, treated them as an integer, did the bilinear filtering as an integer, and finally took the 16-bit output and said "here's your filtered FP16". That obviously doesn't work.
http://www.beyond3d.com/forum/viewt...hlight=ati+filter+floating+point&start=20
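Basic's point is easy to demonstrate. The sketch below (a simplified illustration, not driver code) averages two FP16 texels the way the old driver path effectively did, by treating the raw 16-bit patterns as plain integers, and compares that against true floating point interpolation. Python's `struct` format `'e'` gives us IEEE half floats:

```python
import struct

def fp16_bits(x: float) -> int:
    """Raw 16-bit pattern of x stored as an IEEE half float (FP16)."""
    return struct.unpack('<H', struct.pack('<e', x))[0]

def bits_to_fp16(b: int) -> float:
    """Interpret a 16-bit pattern as an IEEE half float."""
    return struct.unpack('<e', struct.pack('<H', b))[0]

def broken_lerp(a: float, b: float) -> float:
    # What the pre-4.1 driver path effectively did: average
    # the bit patterns as integers, then call the result FP16.
    mid = (fp16_bits(a) + fp16_bits(b)) // 2
    return bits_to_fp16(mid)

print(broken_lerp(0.5, 4.0))   # 1.5, not the correct 2.25
print(broken_lerp(-1.0, 1.0))  # inf, not the correct 0.0
```

The second case is the striking one: averaging the bit patterns of -1.0 and 1.0 lands exactly on the FP16 infinity encoding, because the sign and exponent bits get mixed into the arithmetic as if they were magnitude.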
 