How responsible is the CPU in image resizing these days?

Ken2012

Newcomer
So, I opened a 500×1000 JPEG in XnView (a picture viewer), zoomed in and out of the image (resizing), and watched for the small CPU 'spikes' via Task Manager.

Presumably, this kind of CPU utilisation in 3D games/applications has been eliminated by now, thanks to the programmable graphics pipeline (basically SM1.1 and higher), which brought us hardware mipmapping support (i.e. the GPU handles all texture ops, per frame)... Or has it? Is image (texture) resizing still potentially a CPU-bound thing? Does it have anything to do with the compression used in e.g. JPEG; is displaying a compressed image in itself still a CPU-bound task?

Great confusion here, could use some enlightenment ;).

Thanks.
 
Image resizing is a pure CPU operation (if you resize it with some image tool in Windows).
Texture resizing is another thing...
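To make the "pure CPU operation" point concrete: scaling an image on the CPU is just per-pixel arithmetic, looping over every output pixel. A minimal Python sketch of bilinear resampling (purely illustrative, not what any particular viewer actually does):

```python
import numpy as np

def bilinear_resize(img, new_h, new_w):
    """Scale a 2-D grayscale image with bilinear interpolation,
    the kind of per-pixel work an image viewer does on the CPU."""
    h, w = img.shape
    out = np.empty((new_h, new_w), dtype=np.float64)
    for y in range(new_h):
        for x in range(new_w):
            # Map the output pixel back into source coordinates.
            sy = y * (h - 1) / max(new_h - 1, 1)
            sx = x * (w - 1) / max(new_w - 1, 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            # Weighted average of the four surrounding source pixels.
            top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
            bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
            out[y, x] = top * (1 - fy) + bot * fy
    return out
```

The inner loop runs once per destination pixel, which is why those CPU spikes show up when you zoom: a fullscreen resize is millions of multiply-adds.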
 
Picture viewers and such typically use the CPU for both (de)compression and resizing. I'm sure it's possible to use a modern programmable 3D accelerator to do these tasks, but I don't actually know of any such application.

In any case, unless great care is taken, the scaling done by a 3D accelerator would be fast but inaccurate. An advanced pixel shader program could probably apply bicubic interpolation and the like to scale an image, but the standard hardware scaling filters in 3D hardware really are very crude, all things considered.
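The gap between "crude" hardware filtering and the kind of bicubic filter a pixel shader could implement comes down to the weighting function. Fixed-function hardware typically blends the 2×2 nearest texels linearly; a bicubic filter weights a 4×4 neighbourhood with a cubic kernel. A sketch of one common such kernel (Catmull-Rom, i.e. cubic convolution with a = -0.5; one choice among several, not what any specific GPU or game uses):

```python
def catmull_rom(t):
    """Catmull-Rom cubic kernel: the weight given to a source sample
    at distance t from the output position. Covers 4 taps per axis,
    versus the 2 taps of plain bilinear filtering."""
    t = abs(t)
    a = -0.5  # the 'sharpness' parameter of cubic convolution
    if t < 1.0:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2.0:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0
```

For any sample position the four tap weights sum to 1, and the negative lobes beyond distance 1 are what give bicubic its sharper look compared with bilinear's simple two-tap blend.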
 
I'm pretty sure GDI+ will do some image rendering operations with hardware acceleration...

It's hard to say though. I just did a test doing a full screen image resize at 1600x1200 and got approximately 100fps... This sounds a tad too high to me...?
 
Thanks for the responses.

One of the reasons I asked is because I was on another forum some time ago, and someone was talking about PC Silent Hill 3. They were talking about its odd rendering system, where one can select an 'internal' resolution (controlling AA, texture res etc) that is higher than the supported screen (output) resolutions. Anyone here who has a copy of PC SH3, or has at least played/read about it, will know what I'm on about. Anyway, the long and short of it was that this image resizing was said to be CPU intensive as opposed to GPU...

I just thought for a moment that this would extend to any .bmp/.jpeg/etc-based resizing in existence, including individual texture resizing in a 3D scene.
 
Graham said:
I'm pretty sure GDI+ will do some image rendering operations with hardware acceleration...

It's hard to say though. I just did a test doing a full screen image resize at 1600x1200 and got approximately 100fps... This sounds a tad too high to me...?

Vista will have no hardware acceleration for GDI.
 
Image programs like XnView or Photoshop don't need hardware acceleration. Computers these days have multi-GHz CPUs, more than enough power to simply scale an image. Video playback/scaling is another matter; modern video cards will accelerate that.
 
Ken2012 said:
They were talking about its odd rendering system, where one can select an 'internal' resolution (controlling AA, texture res etc) that is higher than the supported screen (output) resolutions.
It selects the resolution of the backbuffer, nothing else - it has nothing to do with texture resolution. And the AA is simply supersampling - if you select a higher resolution for the backbuffer than the front buffer.

Anyway, it should definitely not have any CPU overhead - it's a simple texture resize when copying the back buffer to the front buffer. And the reason it was made this way is that the PS2 version did it the same way, so they didn't bother changing the rendering layout for the port.
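For what the supersampling actually does: rendering to a backbuffer N times larger than the front buffer and then scaling down amounts to averaging each N×N block of rendered pixels into one screen pixel. A toy sketch of that downscale step with a box filter (an idealised model; the real copy happens on the GPU during the back-to-front blit, and the hardware filter may differ):

```python
import numpy as np

def box_downsample(img, factor):
    """Downscale a 2-D image by an integer factor, averaging each
    factor x factor block - the anti-aliasing effect of rendering at
    a higher internal resolution and resizing to the screen."""
    h, w = img.shape
    # Crop to a multiple of the factor, then average each block.
    return (img[:h - h % factor, :w - w % factor]
            .reshape(h // factor, factor, w // factor, factor)
            .mean(axis=(1, 3)))
```

Each output pixel blends several rendered samples, which is why edges look smoother even though no per-pixel CPU work is involved.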
 