JF_Aidan_Pryde
Regular
It's certainly possible. The NV30 demos pretty much showed real-time application of blur, noise, contrast, saturation, etc.
So why not? Instead of taking whole seconds for an image resize or minutes for a radial blur, crunching these calculations in the GPU's superior floating-point units should cut the time by an order of magnitude.
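To make the idea concrete, here is a minimal sketch of one of those per-pixel effects (a saturation adjustment) written as a GPU kernel, one thread per pixel so the card's parallel floating-point units do all the work. CUDA is used purely as an illustration; the kernel name, parameters, and Rec. 601 luma weights are my own assumptions, not taken from any particular image editor.

```cuda
// Hypothetical sketch: per-pixel saturation adjustment as a GPU kernel.
// One thread handles one pixel; amount > 1 boosts saturation, < 1 reduces it.
#include <cuda_runtime.h>

__global__ void adjust_saturation(const float3* in, float3* out,
                                  int width, int height, float amount)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;   // guard threads past the image edge

    int idx = y * width + x;
    float3 rgb = in[idx];

    // Rec. 601 luminance; lerp each channel away from (or toward) it.
    float luma = 0.299f * rgb.x + 0.587f * rgb.y + 0.114f * rgb.z;
    out[idx] = make_float3(luma + (rgb.x - luma) * amount,
                           luma + (rgb.y - luma) * amount,
                           luma + (rgb.z - luma) * amount);
}
```

Launched over a 2D grid covering the image, every pixel is processed in parallel instead of looping over them one at a time on the CPU.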
Same thing with any media application. Encoding DivX should fly on the GPU! We've already seen some limited use with video shaders etc., but the potential in this area, using the GPU to take over all the SIMD / streaming / instruction-cacheable tasks, is enormous.
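As a hedged illustration of the encoding side: the most stream-friendly piece of a DivX/MPEG-4 encoder is motion estimation, which mostly boils down to scoring candidate motion vectors with a sum of absolute differences (SAD). The sketch below maps one candidate vector to one GPU thread; the function name, the fixed 16x16 block size, and the assumption that the search window stays inside the frame are all illustrative choices, not taken from any real encoder.

```cuda
// Hypothetical sketch: SAD scoring for motion estimation on the GPU.
// Each thread evaluates one candidate motion vector (dx, dy) for a 16x16
// block; the caller is assumed to keep the search window inside the frame.
#include <cuda_runtime.h>

__global__ void sad_16x16(const unsigned char* cur,   // current frame (luma)
                          const unsigned char* ref,   // reference frame (luma)
                          int stride,                  // row pitch in bytes
                          int block_x, int block_y,    // top-left of the block
                          int search_range,            // +/- pixels to search
                          unsigned int* sad_out)       // one score per candidate
{
    int dx = (int)(blockIdx.x * blockDim.x + threadIdx.x) - search_range;
    int dy = (int)(blockIdx.y * blockDim.y + threadIdx.y) - search_range;
    if (dx > search_range || dy > search_range) return;  // threads past the window

    unsigned int sad = 0;
    for (int y = 0; y < 16; ++y)
        for (int x = 0; x < 16; ++x) {
            int c = cur[(block_y + y) * stride + (block_x + x)];
            int r = ref[(block_y + dy + y) * stride + (block_x + dx + x)];
            sad += abs(c - r);
        }

    // Store the score for this candidate; the best vector is picked afterwards.
    int side = 2 * search_range + 1;
    sad_out[(dy + search_range) * side + (dx + search_range)] = sad;
}
```

Every candidate vector is scored in parallel, which is exactly the kind of repetitive, data-streaming workload the GPU is built for.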