GPGPU Assisted Photoshop Filters

GM™

Hello everyone, I was wondering if you guys could give me some advice.

I have been reading recently that both AMD and NVIDIA have released SDKs for GPGPU. Is it possible that these could be used to develop hardware-assisted Photoshop or even GIMP filters?

Wouldn't a GPU be quite good at it?
 
Depending on the filter, yes, a GPU is likely very well suited to the task. As for whether you'd want to use their respective GPGPU setups for that, that would again depend on the filters you want to implement (and to a decent extent your platform and the hardware you want to make it available on), but you certainly could.
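To see why many filters map so well to a GPU: most point operations are embarrassingly parallel, since every output pixel is a pure function of one input pixel, with no synchronization needed between pixels. A minimal sketch in plain Python (the function names and the toy image are made up for illustration; a real GPU version would run the loop body as one thread per pixel):

```python
# A per-pixel "levels" filter: each output pixel depends only on the
# corresponding input pixel, so all pixels could be processed in
# parallel on a GPU with no inter-pixel communication at all.

def build_levels_lut(black=0, white=255):
    """Precompute a 256-entry lookup table stretching [black, white] to [0, 255]."""
    span = max(white - black, 1)
    return [min(255, max(0, (v - black) * 255 // span)) for v in range(256)]

def apply_levels(image, lut):
    """image: list of rows of 8-bit grayscale values; pure per-pixel map."""
    return [[lut[p] for p in row] for row in image]

img = [[10, 128, 245], [0, 64, 255]]   # tiny 2x3 grayscale example
out = apply_levels(img, build_levels_lut(black=10, white=245))
```

The same structure (a table lookup per pixel) covers curves, gamma, and channel mixing, which is why those were among the first filters people tried on GPUs.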
 
Mac OS X 10.4 ("Tiger") already does something like this with its Core Image API. You may want to read up on its structure and documentation to give you some ideas... :)
 
Anyone who does astro imaging (moi) would really love heavy GPU lifting when it comes to stacking, levels, curves and heavy-duty filter scripts - tasks that can take minutes on my quad-core PC.

Tapping the power of an 8800-class GPU would be very welcome!
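Stacking, in particular, is just a per-pixel reduction across exposures, which is another naturally parallel job. A hypothetical sketch in plain Python, assuming aligned frames stored as equal-sized grids of grayscale values (the function name is made up for illustration):

```python
def stack_mean(frames):
    """Average N aligned frames pixel-by-pixel.

    Each output pixel is an independent reduction over the N frames,
    so a GPU could assign one thread per pixel with no synchronization
    between pixels.
    """
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) // n for c in range(cols)]
            for r in range(rows)]
```

Real stacking software also does registration (aligning the frames first) and often sigma-clipped rather than plain averaging, but the per-pixel structure of the hot loop is the same.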
 
I'm not sure that Adobe has committed to GPU acceleration being in a specific release of Photoshop. Just that it is coming.
 
Nice. Lens geometry correction would be a good place to start as that tends to be pretty slow.

Maybe a slab of plug-ins will be part of Big Bang II. That'll make people take notice.

Jawed
 
Wouldn't a GPU be quite good at it?
Yes, but CPUs don't exactly suck at it. Many operations can be done with highly optimized MMX and SSE routines. In my experience, even for some 3D operations the CPU can keep up with mid-range GPUs. The CPU's large cache can prove quite valuable.

Furthermore, GPUs tend to produce slightly different results between brands and models. That's often unacceptable, so a software implementation remains necessary.
 
Yes, but CPUs don't exactly suck at it. Many operations can be done with highly optimized MMX and SSE routines. In my experience, even for some 3D operations the CPU can keep up with mid-range GPUs. The CPU's large cache can prove quite valuable.

I think the situations in which a CPU can approach a comparably-priced GPU in terms of graphics rendering performance are so few and far between as to be non-existent, practically speaking.

Furthermore, GPUs tend to produce slightly different results between brands and models. That's often unacceptable, so a software implementation remains necessary.

Which is why result checks are performed frequently in any serious GPGPU workload (see Folding@home, for example).
 
I think the situations in which a CPU can approach a comparably-priced GPU in terms of graphics rendering performance are so few and far between as to be non-existent, practically speaking.
If you invest in a high-end card or a recent mid-range card, then sure, the GPU will be noticeably faster. But the average Photoshop user is not a gamer. Also, the situations in which the CPU can keep up with the GPU are quite select, and in those cases it's mainly thanks to being able to do things in a smarter way. The GPU only allows a brute-force approach, which for the weaker GPUs isn't always a win. GPUs generally can't handle scatter operations, per-pixel branching, neighboring-pixel dependencies, and the like.

So all I'm trying to say is that the CPU doesn't entirely suck at Photoshop. Photoshop used the CPU exclusively for years, relying on highly optimized routines. And with CPU performance still rising rapidly (thanks to multi-core and SIMD enhancements), there's no immediate need to move the workload to the GPU.
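One way to see the scatter limitation mentioned above: a box blur is a gather (each output reads a few nearby inputs, trivially parallel), while a histogram is a scatter (many inputs write to the same output bin, which on a GPU needs atomics or a multi-pass reduction). A small sketch in plain Python, with made-up function names, just to show the two access patterns:

```python
def box_blur_1d(samples):
    """Gather: each output element only READS nearby inputs.

    Every iteration is independent, so this parallelizes trivially
    on a GPU - one thread per output element.
    """
    out = []
    n = len(samples)
    for i in range(n):
        window = samples[max(0, i - 1):min(n, i + 2)]
        out.append(sum(window) // len(window))
    return out

def histogram(samples, bins=4):
    """Scatter: many inputs WRITE to the same output bin.

    Running these increments concurrently would race, which is why
    scatter-heavy operations were awkward on the GPUs of the day.
    """
    counts = [0] * bins
    for v in samples:
        counts[v % bins] += 1   # concurrent threads would collide here
    return counts
```

Histogram equalization and adaptive filters lean on exactly this kind of scatter, which is one reason a CPU fallback stayed relevant.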
 
If you invest in a high-end card or a recent mid-range card, then sure, the GPU will be noticeably faster. But the average Photoshop user is not a gamer. Also, the situations in which the CPU can keep up with the GPU are quite select, and in those cases it's mainly thanks to being able to do things in a smarter way. The GPU only allows a brute-force approach, which for the weaker GPUs isn't always a win. GPUs generally can't handle scatter operations, per-pixel branching, neighboring-pixel dependencies, and the like.

So all I'm trying to say is that the CPU doesn't entirely suck at Photoshop. Photoshop used the CPU exclusively for years, relying on highly optimized routines. And with CPU performance still rising rapidly (thanks to multi-core and SIMD enhancements), there's no immediate need to move the workload to the GPU.

That's the beauty of add-in cards, though - you can just pop one into your machine at any time. Now that GPUs are being treated more like co-processors, there's even more reason to do so.
 