Why does my new TV look worse? *spawn

The algorithms are more sophisticated than straight (bi/trilinear) interpolation, which just blurs the image, and they can consume a fair bit of CPU performance. If the display device has dedicated, possibly optimised, hardware to perform the upscale, it frees the CPU/GPU from the task, and we call that a hardware scaler.
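For reference, here is a minimal sketch (my own illustration, assuming NumPy; not anything from the thread) of the "straight" bilinear interpolation being dismissed above. Each output pixel is a distance-weighted average of its four nearest source pixels, which is exactly why the result looks soft.

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Plain bilinear resize of a 2-D greyscale image (illustration only)."""
    img = np.asarray(img, dtype=float)
    in_h, in_w = img.shape
    # Map each output pixel centre back to source coordinates.
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 2)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]   # vertical blend weights
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]   # horizontal blend weights
    # Blend the four nearest neighbours of each output position --
    # the averaging is what produces the blurred look.
    tl = img[y0][:, x0]
    tr = img[y0][:, x0 + 1]
    bl = img[y0 + 1][:, x0]
    br = img[y0 + 1][:, x0 + 1]
    top = tl * (1 - wx) + tr * wx
    bot = bl * (1 - wx) + br * wx
    return top * (1 - wy) + bot * wy
```

A 720p-to-1080p pass would be something like `bilinear_upscale(frame, 1080, 1920)`.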
 
The algorithms are more sophisticated than straight (bi/trilinear) interpolation, which just blurs the image, and they can consume a fair bit of CPU performance.
??
This interests me... As far as I'm aware (from a maths point of view), for a given ordered data set on a 'grid' (= 720p), you can interpolate or reconstruct (I think for an image the two methods are fairly equivalent) to map the data set onto the new 'grid' (= 1080p).
Supposing the data are ordered, i.e. the horizontal and vertical directions can be treated separately, it should be ultra easy and cheap to use a one-dimensional high-accuracy interpolation approach (mind you, linear interpolation is for babies!!), shouldn't it?
Which sophisticated methods do you have in mind?
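To make the separable idea above concrete, here is a rough sketch (my own illustration, assuming NumPy and picking a Catmull-Rom cubic as the "better than linear" 1-D kernel): one 1-D resampling routine is applied along rows, then along columns.

```python
import numpy as np

def catmull_rom(t):
    """Catmull-Rom cubic kernel, one common step up from linear interpolation."""
    t = np.abs(np.asarray(t, dtype=float))
    out = np.zeros_like(t)
    near = t <= 1
    far = (t > 1) & (t < 2)
    out[near] = 1.5 * t[near]**3 - 2.5 * t[near]**2 + 1
    out[far] = -0.5 * t[far]**3 + 2.5 * t[far]**2 - 4 * t[far] + 2
    return out

def resample_1d(data, out_len):
    """Resample the last axis of `data` with the cubic kernel above."""
    in_len = data.shape[-1]
    x = (np.arange(out_len) + 0.5) * in_len / out_len - 0.5   # source positions
    base = np.floor(x).astype(int)
    result = np.zeros(data.shape[:-1] + (out_len,))
    for k in range(-1, 3):                                    # 4 taps per output sample
        idx = np.clip(base + k, 0, in_len - 1)                # clamp at the borders
        w = catmull_rom(x - (base + k))
        result += data[..., idx] * w
    return result

def upscale_separable(img, out_h, out_w):
    """Because the kernel is separable, do a horizontal pass, then a vertical one."""
    img = np.asarray(img, dtype=float)
    tmp = resample_1d(img, out_w)                                      # rows
    return resample_1d(tmp.swapaxes(-1, -2), out_h).swapaxes(-1, -2)   # columns
```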
 
Well, there are a number of different windowed filters such as Lanczos, Hamming, Hann, etc. The other issue to consider is the number of taps being evaluated (i.e. how many source samples go into each output pixel). Linear interpolation sucks!
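As a concrete example of a windowed-sinc filter and of what the tap count means, here is a 1-D Lanczos sketch (my own, assuming NumPy): a Lanczos-a kernel is a sinc windowed by a wider sinc, and each output sample is built from 2·a taps.

```python
import numpy as np

def lanczos_kernel(t, a=3):
    """Lanczos-a kernel: a sinc windowed by a wider sinc, supported on |t| < a."""
    t = np.asarray(t, dtype=float)
    out = np.sinc(t) * np.sinc(t / a)        # np.sinc(x) = sin(pi*x)/(pi*x)
    return np.where(np.abs(t) < a, out, 0.0)

def lanczos_resample_1d(row, out_len, a=3):
    """1-D Lanczos resample; the loop runs over the 2*a taps per output sample."""
    row = np.asarray(row, dtype=float)
    in_len = len(row)
    x = (np.arange(out_len) + 0.5) * in_len / out_len - 0.5
    base = np.floor(x).astype(int)
    acc = np.zeros(out_len)
    norm = np.zeros(out_len)
    for k in range(-a + 1, a + 1):                 # 2*a taps
        idx = np.clip(base + k, 0, in_len - 1)     # clamp at the borders
        w = lanczos_kernel(x - (base + k), a)
        acc += row[idx] * w
        norm += w
    return acc / norm                              # renormalise so the weights sum to 1
```

Raising `a` (more taps) gives a sharper result at a higher per-pixel cost, which is where dedicated scaler hardware earns its keep.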
 
Well, there are a number of different windowed filters such as Lanczos, Hamming, Hann, etc. The other issue to consider is the number of taps being evaluated (i.e. how many source samples go into each output pixel).
Thanks for the answer.

Maybe you can answer this: do people use, for instance, WENO interpolation algorithms to scale images in real time?
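Whether anyone ships WENO in a real-time scaler I can't say, but to make the idea concrete, here is a rough 1-D sketch (my own, for a 2x upscale, using simplified smoothness indicators rather than the classical Jiang-Shu ones): two quadratic sub-stencils are blended with data-dependent weights, so the scheme leans on the smoother stencil near an edge instead of ringing.

```python
import numpy as np

def weno_midpoints(f, eps=1e-6):
    """WENO-style interpolation of the midpoints of a 1-D signal (sketch only)."""
    f = np.asarray(f, dtype=float)
    fp = np.pad(f, 2, mode='edge')                 # pad so every stencil exists
    fm1, f0, f1, f2 = fp[1:-3], fp[2:-2], fp[3:-1], fp[4:]

    # Candidate quadratic interpolants evaluated at the midpoint i + 1/2.
    p0 = (-fm1 + 6.0 * f0 + 3.0 * f1) / 8.0        # stencil {i-1, i, i+1}
    p1 = (3.0 * f0 + 6.0 * f1 - f2) / 8.0          # stencil {i, i+1, i+2}

    # Simplified smoothness indicators: sum of squared jumps on each stencil.
    b0 = (f0 - fm1) ** 2 + (f1 - f0) ** 2
    b1 = (f1 - f0) ** 2 + (f2 - f1) ** 2

    # Linear weights of 1/2 each reproduce the 4-point cubic on smooth data;
    # the nonlinear weights demote whichever stencil crosses an edge.
    a0 = 0.5 / (eps + b0) ** 2
    a1 = 0.5 / (eps + b1) ** 2
    w0 = a0 / (a0 + a1)
    w1 = a1 / (a0 + a1)
    return w0 * p0 + w1 * p1
```

For a 2x upscale you would interleave the original samples with these midpoints; the nonlinear weighting is also why WENO costs noticeably more per pixel than a fixed-kernel filter.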
 