AA : Monitor size, viewing distance

Great links. Especially liked:
The resolution of our eyes is 12 vertical lines per arc angle (one line per arcminute for 20/20 acuity) times 2. Now 28 degrees x 12 lines x 2 = 672. This means we really can't see a display component (pixel) smaller than 1/672 x image width. Our minimum resolvable element size is about 0.065", or about twice the size of the pixels of the WXGA image! Put bluntly, from 8 feet away while watching a 50 inch plasma TV, the human eye is generally incapable of reliably distinguishing any detail finer than that shown on a true 720p display!
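That arithmetic checks out. Here's a quick sketch of my own (it assumes a 50" 16:9 panel at 8 feet and takes the 672-line figure as given; the horizontal angle comes out a little under the quoted 28 degrees, but the 0.065" element size matches):

    import math

    # Rough check of the quoted figures: 50" 16:9 panel viewed from 8 feet,
    # with the article's 672 resolvable elements across the image width.
    width_in = 50.0 * 16 / math.hypot(16, 9)    # ~43.6" picture width
    distance_in = 8 * 12                        # 96" viewing distance

    angle_deg = math.degrees(2 * math.atan(width_in / 2 / distance_in))
    min_element_in = width_in / 672             # smallest resolvable element
    pixel_720p_in = width_in / 1280             # pixel pitch of a 1280-wide image

    print(f"horizontal viewing angle: {angle_deg:.1f} deg")     # ~25.6
    print(f"min resolvable element: {min_element_in:.3f} in")   # ~0.065
    print(f"720p/WXGA pixel pitch: {pixel_720p_in:.3f} in")     # ~0.034, about half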
 
This post "Seeing the Grids" from Slashdot is very interesting too.
"Seeing the Grids" (Score:5, Informative) by Doc Ruby:

* Keep in mind this article is written in general terms, so you scientists out there don't need to stand in line to file corrections!

I was in the Joint Photographic Experts Group (JPEG) when we invented the popular image format. I also worked for a digital camera company inventing an 8Kx8K-pixel (40-bit color) scanner, and I studied both the physics of light and the brain neurology of the visual system in pre-med college. So I'll just jump that line of "scientists" to file this correction.

* It's safe to say, however, that increasing resolution and image refresh rate alone are not enough to provide a startlingly better viewing experience in a typical flat panel or rear projection residential installation.

It's safe to say that only once you've dismissed the scientists who would correct you.

The lockstep TV screen is a sitting duck for the real operation of the eyes and brain, which compensate for relatively low sampling rates with massively parallel async processing in 4D.

Joseph Cornwall's mistake in his article is to talk as if viewers were a single stationary eye nailed at precisely 8' perpendicular to a 50" flat TV, sampling the picture in perfect sync with the TV's framerate. Instead, the visual system is an oculomotor system: two moving eyes with continuous, asynchronous sampling. Each retinal cell signals at a base rate of about 40Hz per neuron, but adjacent neurons drift across different TV pixels coming through the eyes' lenses, each modulating independently and asynchronously under the light. Those neurons are distributed in a stochastic pattern in the retina which will not coincide with any rectangular grid (or any other regular, linear organization).

The visual cortex is composed of layered sheets of neurons which compare adjacent neurons for their own "difference" signal, as well as corresponding regions from each eye. The eyes dart, roll and twitch across the image; the head shakes and waves. So the brain winds up getting lots of subsamples of the image.

The main artifact of the TV the eye sees is the grid itself, which used to be only a stack of lines (of nicely continuous color within each line, on analog raster TVs). When compared retinal neurons are signaling at around 40Hz but at slightly different phase offsets, the cortex sheets can detect that heterodyne at extremely high "beat" frequencies, passing a "buzz" to the rest of the brain that indicates a difference where there is none in the original object rendered into a grid on the TV. Plus, all that neural apparatus is an excellent edge enhancer, both in space (the pixels) and in time (the regular screen refresh).
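To picture that heterodyne claim, here's a minimal sketch (my own, with made-up frequencies and sample rate): mixing two roughly 40Hz signals that are slightly offset produces components at the sum and difference frequencies.

    import numpy as np

    # Toy illustration of the "heterodyne" idea above (my own sketch, with
    # arbitrary numbers): multiplying two ~40 Hz signals that are slightly
    # offset yields sum- and difference-frequency components.
    fs = 1000.0                                  # sample rate in Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)                # 2 s window -> 0.5 Hz bins
    a = np.sin(2 * np.pi * 40.0 * t)             # one retinal signal
    b = np.sin(2 * np.pi * 41.5 * t + 0.3)       # a neighbor, slightly offset

    mixed = a * b                                # nonlinear mixing (heterodyne)
    spectrum = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(mixed), 1 / fs)

    # Expect peaks near |41.5 - 40| = 1.5 Hz and 41.5 + 40 = 81.5 Hz.
    for f in freqs[spectrum > 0.5 * spectrum.max()]:
        print(f"component near {f:.1f} Hz")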

Greater resolution gives the eyes more info to combine into the brain's image. The extra pixels make the grid turn from edges into more of a texture, with retinal cells resampling more pixels. The faster refresh rate means each retinal neuron has more chance to get light coordinated with its async neighbors, averaged by the retinal persistence into a single flow of frequency and amplitude modulation along the optic and other nerves.

In fact, the faster refresh is the best part. That's why I got a 50" 1080p DLP: the micromirrors can flip thousands of times a second (LCD doesn't help, and plasma has its own different pros/cons). 1600x1200 is 1.92Mpxl, which at 24bit is 46.08Mb per image. A 30Hz refresh would be 1.3824Gbps, but the HDMI cable delivering the image to the DLP is 10.2Gbps, so that's over 200FPS. I'm sure we'll see better video for at least most of that range, if not all of it. What I'd really like to see is async DLP micromirrors that flip off the "frame grid": at first probably just some displacement from the frame boundary, especially if the displacement changes unpredictably each flip; later maybe a stochastic shift, all to make the image flow more continuously rather than offering a steady beat the brain/eyes can detect. And also a stochastic distribution of the mirrors (or their projected pixels). The more the projector goes off the time/space grid, the more readily the eyes will send the image to our imaginations without passing along the mesh packaging.
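A quick back-of-the-envelope check of those numbers (a sketch of my own; the 10.2Gbps figure is the HDMI link rate cited above):

    # Sanity-check of the bandwidth arithmetic in the post above.
    pixels = 1600 * 1200                        # 1.92 Mpixel per frame
    bits_per_frame = pixels * 24                # 46.08 Mbit at 24-bit color

    rate_30hz = bits_per_frame * 30 / 1e9       # Gbps needed for 30 Hz
    hdmi_gbps = 10.2                            # HDMI link rate cited above
    max_fps = hdmi_gbps * 1e9 / bits_per_frame  # frames/s the link could carry

    print(f"{bits_per_frame / 1e6:.2f} Mbit per frame")
    print(f"30 Hz needs {rate_30hz:.4f} Gbps")       # 1.3824 Gbps
    print(f"10.2 Gbps allows ~{max_fps:.0f} FPS")    # ~221, i.e. over 200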

If only the TV content were improving as fast as the TVs themselves.
 
Proper distance might remove the jagged appearance of aliasing on an HDTV.

However, it does not remove the aliasing artifacts. Those are still VERY prominent, at least to me, again using the Xbox 360 on my friend's 42" HDTV.

The crawling/shimmering/distortion of what you should be seeing is most apparent, especially in situations where you have many parallel lines in close proximity to each other.

Stairs at a distance, for example, where the horizontal plane of each successive stair comes close to being level with your eye. If the game features some sort of "head bobbing" motion, that just exacerbates the problem. As you move towards or away from them, they appear to crawl, and parabolic shapes appear and crawl. Even with 6x AA on my X1800XT I can notice this phenomenon at times, although it's not nearly as pronounced as without AA.
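Here's a toy sketch of that stair effect (my own illustration, with arbitrary numbers): a pattern of edges finer than the pixel grid aliases down to a much coarser false pattern, and a tiny sub-pixel shift, like head bobbing, makes the false pattern crawl.

    import numpy as np

    # Point-sample a grating of 0.9 edges per pixel -- finer than the
    # Nyquist limit of 0.5 -- so it aliases to a false pattern repeating
    # every 10 pixels (|0.9 - 1.0| = 0.1 cycles/pixel).
    line_freq = 0.9
    for offset in (0.0, 0.25):                  # tiny "head bob" shift
        x = np.arange(40) + offset
        samples = np.sign(np.sin(2 * np.pi * line_freq * x))
        print("".join("#" if s > 0 else "." for s in samples))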

It's distracting to the point of making a game un-enjoyable (for me). Even in fast-action first- and third-person games, the aliasing artifacts, as much as or more than simple jagged edges, distract from any sort of immersion or enjoyment.

As I've theorized before, I think it's due to the fact that I rely much more on peripheral vision than on direct vision. Even the slightest movement in an area not under direct observation will immediately pull my vision in that direction.

As such, when the whole screen has crawling aliasing artifacts, my vision is constantly being pulled to all areas of the screen, and thus they become the focus.

And it's most certainly unpleasant when a series of parallel edges, rather than appearing as straight parallel lines, becomes a series of parabolas due to aliasing artifacts that introduce artificial lines spanning multiple real lines. I'm not sure how to explain this properly.

Unless I sit so far away that I cannot discern any features on the TV (greater than 20-30 feet for my friend's 42" TV), it's still quite noticeable. Of course, at that point trying to play a game is a moot point, as I can't properly tell what's going on.

Likewise, 16x non-optimized AF is still not sufficient for all situations in games I've run into, especially if there is a large viewing distance. As someone pointed out previously, this is especially noticeable in racing games.

Regards,
SB
 
Depends on the game. It'll either be 720p or 1080i. He has the X360 set to 1080i.

Doesn't bug me "as" much as playing games on my computer since I don't play games at his place every night.

I'm really hoping the rumored 24x AA on the R600 helps out in these situations on the comp.

Regards,
SB
 
Scaling probably introduces the most artifacts, and I don't know of a TV that has native resolutions of both 720 and 1080 lines at the same time. Most have neither.

AA and AF aren't going to help there.
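To make that concrete, here's a sketch (my own; it assumes a 768-line native panel and nearest-neighbor scaling, both common at the time) of how a 720-line source maps unevenly onto the panel:

    # Map 720 source lines onto a 768-line native panel by rounding;
    # some source lines get shown twice and others once, which is the
    # kind of uneven banding no amount of AA or AF can fix.
    src, panel = 720, 768
    hits = {}
    for y in range(panel):
        s = round(y * src / panel)              # nearest source line
        hits[s] = hits.get(s, 0) + 1

    doubled = sum(1 for n in hits.values() if n == 2)
    print(f"{doubled} of {src} source lines are doubled")   # 48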
 
Except that it almost exactly mimics the aliasing artifacts I see on the PC. Even with 4x AA, it's more apparent than it should be.

So the fact that I'm viewing the TV from a greater distance than I do with my PC doesn't help significantly with aliasing artifacts.

After all, aren't even those relatively simple Transformers 3D cartoons on TV rendered with 32x AA? If aliasing weren't a problem, wouldn't they save the trouble and just render with lower levels of AA?

Regards,
SB
 