Square pixels no more?

Scott_Arm

http://www.wired.com/wiredscience/2010/06/smoothing-square-pixels

Wasn't sure which forum to put this in. Interesting read, but I wish there were a little more detail on exactly what this is. The idea is that he's abandoned the square-pixel format for storing digital information about an image. Supposedly he gets a smoother image, and it seems to suggest there are savings in compression as well. Is this the same as vector graphics? I'm also wondering, since all displays use square pixels, how this would be of benefit, unless it maintains smooth output when zooming. Vector graphics do the same, right?

Curious, but the story is lacking in detail.
 
Vector graphics are the first thing that came to my mind, too. I agree that the story is high on hype and low on actual information though...
 
Diffusion curves? Otherwise, yes this article is nearly all fluff: masks + *MAGIC* = lower size pics.
 
I dunno what the intended use is. Storing pictures or displaying pictures?
If it's for storing, then there are a bazillion formats that compress the pixels lossily or losslessly; I don't think there is much to gain here. JPEG, for instance, already doesn't store pixels but blocks.
For displaying and modifying pictures you need a solid foundation, be it vectors or pixels. I can't see how changing pixel sizes on the fly would be helpful for an artist, or how displaying them on raster devices would improve anything.
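To make the JPEG point concrete: encoders transform each 8x8 pixel block with a 2-D DCT and store quantized coefficients, not pixels. Here's a naive, illustrative sketch of that transform (real encoders use fast factored versions, and this skips quantization and entropy coding entirely):

```python
import math

def dct2_8x8(block):
    """Naive orthonormal 2-D DCT-II of an 8x8 block, the transform JPEG
    applies before quantization. Illustrative only; O(N^4) per block."""
    N = 8
    def c(k):  # orthonormal scaling factor
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

# A flat block collapses to a single DC coefficient; all AC terms vanish,
# which is exactly why blocks compress better than raw pixels.
flat = [[100.0] * 8 for _ in range(8)]
coeffs = dct2_8x8(flat)
```

After quantization, most of those near-zero AC coefficients round to exactly zero, which is where the compression comes from.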
 
This is just a load of crap. It's not technically possible to build display devices that offer morphing pixels as described in this article, unless we're willing to go back to monochrome scanning-electron-beam monitors (and even then pixels would have a fixed width and, to some extent, length...). I just don't see how this could be done, certainly not in an economical fashion.

On a conceptual level, this guy's technique resembles supersampling regular bitmaps and then running them through a 2xSaI or SuperEagle filter, as used in many oldskool game-system emulators.
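2xSaI and SuperEagle themselves are fairly involved; as a rough illustration of how this family of emulator filters works, here is the simpler Scale2x (a.k.a. EPX) rule from AdvanceMAME. It doubles each pixel and copies a neighbour's colour into a corner when two adjacent neighbours agree, which smooths staircase edges without blending colours:

```python
def scale2x(img):
    """Scale2x/EPX: expand each pixel P into a 2x2 block. Each corner
    takes a neighbour's colour when the two neighbours adjacent to that
    corner match (and the opposing neighbours don't), else it stays P."""
    h, w = len(img), len(img[0])
    out = [[None] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            p = img[y][x]
            a = img[y - 1][x] if y > 0 else p          # above
            b = img[y][x + 1] if x < w - 1 else p      # right
            c = img[y][x - 1] if x > 0 else p          # left
            d = img[y + 1][x] if y < h - 1 else p      # below
            e0 = a if (c == a and c != d and a != b) else p  # top-left
            e1 = b if (a == b and a != c and b != d) else p  # top-right
            e2 = c if (d == c and d != b and c != a) else p  # bottom-left
            e3 = d if (b == d and b != a and d != c) else p  # bottom-right
            out[2 * y][2 * x], out[2 * y][2 * x + 1] = e0, e1
            out[2 * y + 1][2 * x], out[2 * y + 1][2 * x + 1] = e2, e3
    return out
```

On a flat or fully distinct 2x2 input none of the copy rules fire, so the output is just pixel doubling; the smoothing only kicks in along diagonal edges where neighbours agree.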
 
It looks like a way to do compression (finding a better "shape" for a pixel block), which is probably not very useful compared to current image-compression algorithms. Of course, it could be useful for certain special images such as cartoons.

Also, technically, pixels are not little squares; they are sampling points :)
 
They are sample points of functions with frequency components above the Nyquist limit.

(It always annoys me that people say pixels aren't little squares and then try to pretend they are little sincs when in practice images are always sampled with significant aliasing components, because that looks better.)
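The point about frequency components above the Nyquist limit is easy to demonstrate: sampled at rate fs, any frequency above fs/2 produces exactly the same samples as a lower "alias" frequency. A minimal sketch with made-up numbers:

```python
import math

fs = 8.0               # sampling rate; the Nyquist limit is fs/2 = 4.0
f_high = 7.0           # signal frequency above the Nyquist limit
f_alias = fs - f_high  # 1.0: the frequency the samples actually represent

# cos(2*pi*7*n/8) = cos(2*pi*n - 2*pi*n/8) = cos(2*pi*n/8), so the
# 7-cycle signal and the 1-cycle signal are indistinguishable once sampled.
samples_high  = [math.cos(2 * math.pi * f_high  * n / fs) for n in range(16)]
samples_alias = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(16)]

assert max(abs(h - a) for h, a in zip(samples_high, samples_alias)) < 1e-9
```

This is why no reconstruction filter, box or sinc, can recover detail that was aliased at sampling time.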
 
> They are sample points of functions with frequency components above the Nyquist limit.
>
> (It always annoys me that people say pixels aren't little squares and then try to pretend they are little sincs when in practice images are always sampled with significant aliasing components, because that looks better.)

The problem is, apparently, that many people (as evidenced by this article) think that because pixels are created, in many cases, by little square elements (e.g. CCD sensors), they must be little squares. But of course they aren't. Trying to reproduce the image using little squares is not the best approach (although it's probably the most common approach now that LCDs have gained popularity).
 
If you display information created in an ordered grid of sensors on a screen that is not an ordered grid (or the other way around), you will introduce distortion.

Unavoidable fact. (CRT monitors prove this, I might add. :))

Also...where's the actual problem? Can't really see it. Regardless how a pixel is shaped or arranged, some imagery will be non-ideal for that arrangement, but the grid is a practical, sensible approach that is ideal for a lot of elements common in today's GUIs (windows, buttons, various bars and such tend to be constructed out of horizontal or vertical straight lines), and there are workarounds for most other problems.
 
> If you display information created in an ordered grid of sensors on a screen that is not an ordered grid (or the other way around), you will introduce distortion.
>
> Unavoidable fact. (CRT monitors prove this, I might add. :))
>
> Also...where's the actual problem? Can't really see it. Regardless how a pixel is shaped or arranged, some imagery will be non-ideal for that arrangement, but the grid is a practical, sensible approach that is ideal for a lot of elements common in today's GUIs (windows, buttons, various bars and such tend to be constructed out of horizontal or vertical straight lines), and there are workarounds for most other problems.

Eh? With CRTs the problem was screen uniformity. Any source on a CRT was going to be distorted, whether analog or digital. Switch to a digital display and it doesn't matter whether your source is analog or digital; it will still be reproduced more accurately than on an analog CRT.

Color can of course be a problem, but that doesn't speak to whether the source is analog or digital (ordered).

Regards,
SB
 
I was actually thinking of moiré rather than uniformity, because pixels didn't necessarily line up with the phosphor dots on the inside of the screen...
The assumption should be that there are many more phosphor dots than pixels, or else you will incur the wrath of Nyquist.
 
At rezzes like 800*600 on any decent-size tube there's probably plenty of phosphor to satisfy Nyquist, but at 1600*1200 on my old 19" Trinitron tube, moiré was QUITE obvious in some situations...

Even considering that the analog circuitry of many gfx cards didn't like high resolutions (for that era anyway), it wasn't too hard to spot moiré in checkered patterns even at 1280*1024 etc... Hell, turning up the bass on my subwoofer while gaming sometimes introduced some interesting on-screen graphical effects when the wires of the pixel mask started resonating. :D
 
I lusted after a 21" Trinitron for years, but they were close to £1000.
Not too long ago I saw one in a pawn shop for £12; made me laugh.
 
Ugh, my 19" weighed close to 30kg and pulled upwards of 250W or somesuch... I don't want to think how much a 21" unit would guzzle up... I actually used to heat food on that thing, if you just wanted some ready-made meatballs or something you'd stick the package on the rear of the monitor and leave it there for a while. :D Not instantly hot exactly, but it worked surprisingly well.
 
Ugh, my 19" weighed close to 30kg and pulled upwards of 250W or somesuch... I don't want to think how much a 21" unit would guzzle up... I actually used to heat food on that thing, if you just wanted some ready-made meatballs or something you'd stick the package on the rear of the monitor and leave it there for a while. :D Not instantly hot exactly, but it worked surprisingly well.

I had a 22" Philips with a Trinitron tube in it. It never used more than 150W. Image was *way* better than the POS LCD that replaced it.

Hauling it around was a hernia inducing exercise though.

Cheers
 