Does Anti-Aliasing still make sense if you have Ultra-High DPI?

According to some studies, the average human eye has a resolution of roughly 3k x 3k, while children and more sensitive people can exceed 10k x 10k. Aliasing artifacts can be perceptible even at higher resolutions...
 
The rendering power and memory you'd have to use for 3k x 3k would be far better spent on a simple 1080p image with high quality AA.
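To put rough numbers on that claim, here is a back-of-the-envelope sketch (not from the thread; it assumes the usual MSAA behaviour where the fragment shader runs once per pixel and only coverage/depth are taken per sample) comparing per-frame shading work for a native 3000x3000 render against 1920x1080 with 4x MSAA:

// Back-of-the-envelope comparison of per-frame fragment-shading work.
#include <cstdio>

int main() {
    const double hiResFragments = 3000.0 * 3000.0;   // ~9.0M shader runs per frame
    const double fhdFragments   = 1920.0 * 1080.0;   // ~2.07M shader runs per frame

    std::printf("3k x 3k           : %.2fM fragments/frame\n", hiResFragments / 1e6);
    std::printf("1080p + 4x MSAA   : %.2fM fragments/frame\n", fhdFragments / 1e6);
    std::printf("Shading cost ratio: ~%.1fx\n", hiResFragments / fhdFragments);
    return 0;
}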
 
If you want to avoid anti aliasing then you need to have more resolution than the eye can resolve.

I assume it's much more cost effective to e.g. do 4x multisampling than to build a display that has a resolution four times what the eye can see. I don't see displays going much beyond 300 dpi, so we will still need anti aliasing.
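For what it's worth, requesting 4x multisampling is close to a one-liner in most windowing toolkits. A minimal sketch, assuming GLFW 3 and a desktop OpenGL driver (window size and title here are arbitrary):

// Request a 4x multisampled default framebuffer and enable multisample rasterization.
#include <GLFW/glfw3.h>

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D   // not exposed by very old GL headers
#endif

int main() {
    if (!glfwInit())
        return 1;

    glfwWindowHint(GLFW_SAMPLES, 4);  // ask for 4x MSAA on the default framebuffer

    GLFWwindow* window = glfwCreateWindow(1920, 1080, "4x MSAA", nullptr, nullptr);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    glEnable(GL_MULTISAMPLE);  // enabled by default on most drivers, but be explicit

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // ... draw the scene; edges are resolved from 4 coverage samples per pixel
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}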
 
The rendering power and memory you'd have to use for 3k x 3k would be far better spent on a simple 1080p image with high quality AA.

In terms of rendering power, yes. But would a 1080p image upscaled on a 4K-resolution monitor be a better solution?
 
If you want to avoid anti aliasing then you need to have more resolution than the eye can resolve.

I assume it's much more cost effective to e.g. do 4x multisampling than to build a display that has a resolution four times what the eye can see. I don't see displays going much beyond 300 dpi, so we will still need anti aliasing.

Just a dream of a Desktop Class Retina Display.
 
In terms of rendering power, yes. But would a 1080p image upscaled on a 4K-resolution monitor be a better solution?

An 'upscale' doesn't produce more information, it simply stretches the existing information over a bigger area. Native resolution is always preferable, or else you'll get all kinds of other issues.
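A toy illustration of that point (not from the thread; the 2x2 "image" and its values are made up): nearest-neighbour upscaling only replicates samples that already exist, so no edge detail that wasn't captured at render time can appear in the output.

// Nearest-neighbour upscale of a 2x2 image to 4x4: every output pixel is a
// copy of one of the four source samples, i.e. no new information is created.
#include <cstdio>
#include <vector>

int main() {
    const int srcW = 2, srcH = 2;
    const int dstW = 4, dstH = 4;
    const std::vector<int> src = { 10, 20,
                                   30, 40 };          // four source "pixels"

    std::vector<int> dst(dstW * dstH);
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x)
            dst[y * dstW + x] = src[(y * srcH / dstH) * srcW + (x * srcW / dstW)];

    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x)
            std::printf("%3d", dst[y * dstW + x]);
        std::printf("\n");
    }
    return 0;
}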
 
Small 4K monitors are an absurd idea. "Retina display" my a**, to quote myself from earlier.
 
I personally think that for 32" 1920x1080 screens at sufficient distance (2.5 m+), the image is already pretty good without AA.

The advantage of not having AA is that you can have a crisper image with better contrast. The other day I saw Aliens versus Monsters on a 140 cm display at 1920x1080 resolution, and I was standing really close to it. There were some neat particle effects going on, with pixel-sized particles; they looked great precisely because they weren't AA'd at all. The combination of high intensity and great contrast in that one particle could not have been replicated as well if AA had been enabled. Of course it's an extreme example, but it's not hard to find similar cases. If you have an image in which you blur/DoF part of the frame and leave the focused part without AA, that's going to look very good.

It may well be that for gaming, having AA is more effective than having higher resolution, as Laa-Yosh says, and for all I care 1920x1080 is here to stay for at least another 5 years. But I think that at 1920x1080, for most applications, we're already pretty close to the point where no AA is needed.
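A rough sanity check of the 32" 1080p at 2.5 m figure, assuming a 16:9 panel and the common ~1 arcminute (about 60 pixels per degree) rule of thumb for normal acuity:

// How many pixels per degree does a 32" 1920x1080 panel give at 2.5 m?
#include <cmath>
#include <cstdio>

int main() {
    const double pi = std::acos(-1.0);
    const double diagonalIn  = 32.0;
    const double aspectW = 16.0, aspectH = 9.0;
    const double viewDistM   = 2.5;
    const double horizPixels = 1920.0;

    // Physical width of a 16:9 panel from its diagonal, converted to metres.
    const double widthIn = diagonalIn * aspectW / std::sqrt(aspectW * aspectW + aspectH * aspectH);
    const double pitchM  = widthIn * 0.0254 / horizPixels;          // pixel pitch

    // Angular size of one pixel at the viewing distance, in degrees.
    const double pixelDeg = 2.0 * std::atan(pitchM / (2.0 * viewDistM)) * 180.0 / pi;
    const double ppd = 1.0 / pixelDeg;                              // pixels per degree

    std::printf("Pixel pitch      : %.2f mm\n", pitchM * 1000.0);
    std::printf("Pixels per degree: %.0f (vs ~60 for 1-arcmin acuity)\n", ppd);
    return 0;
}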
 
To my mind the best argument for higher-DPI monitors isn't graphics in games, etc., but regular desktop use, particularly text. Even with ClearType and other crutches to stand on, a standard-res LCD monitor at desktop distances fails the "why do I feel like I need new glasses?" test, in my opinion (OS X is worse, Linux worse still).
 
To my mind the best argument for higher-DPI monitors isn't graphics in games, etc., but regular desktop use, particularly text. Even with ClearType and other crutches to stand on, a standard-res LCD monitor at desktop distances fails the "why do I feel like I need new glasses?" test, in my opinion (OS X is worse, Linux worse still).
Definitely. After slowly getting used to various WVGA smartphone screens, and now the gorgeous iPhone 4 screen, I find that looking at ~100 ppi becomes less and less bearable. It's especially bad if you use an LCD in portrait orientation (as I do), which makes subpixel font rendering much less useful.

It isn't just edge clarity, though; higher pixel density also means that areas of uniform colour look much more "solid", because you can't see the subpixels forming stripes any more. For that reason I'd even take a 4K monitor if it could only upscale 1080p.
 
It's hard to compare it to desktop use though, as you're sitting far too close to the screen.

And for what it's worth, there are two things horribly broken about the Mac's desktop. One is font rendering up to and including size 12, and the other is window resizing being possible from just one corner. Everything else is either good or very good (imho).

Font rendering on the Mac, though? Yuck. They just don't have fonts that are optimised for smaller sizes. Anti-aliasing has little to do with it. In fact, non-anti-aliased fonts look much, much better. You can force Mac OS to use these for smaller font sizes, but I wish I could force it for all sizes.

Crawling edges? Possible, I haven't seen the effect, but it could be.
 
On an iPhone 4 (326 DPI), it is hit or miss.

In my tests, 4xMSAA does improve IQ in my eyes, but not by much. Several other people I asked didn't see a difference at all (people who don't know what to look for, but then again, they're representative of most of the population; we're weird).

My thought is that 4xMSAA may still be worthwhile on this display for particular kinds of content, but going to 6x or beyond is probably very well into diminishing returns.
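For context, a quick angular-size check (the ~30 cm viewing distance is an assumption, not something measured in those tests):

// How big is one iPhone 4 pixel compared with the ~1 arcminute often quoted
// for normal visual acuity, at an assumed phone-holding distance?
#include <cmath>
#include <cstdio>

int main() {
    const double ppi    = 326.0;            // iPhone 4 panel density
    const double pitchM = 0.0254 / ppi;     // ~0.078 mm pixel pitch
    const double distM  = 0.30;             // assumed viewing distance

    const double pixelArcmin =
        std::atan(pitchM / distM) * (180.0 / std::acos(-1.0)) * 60.0;

    std::printf("One pixel subtends ~%.2f arcmin at %.0f cm\n", pixelArcmin, distM * 100.0);
    // ~0.9 arcmin: right around the acuity limit, which fits the "hit or miss" result.
    return 0;
}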
 
On an iPhone 4 (326 DPI), it is hit or miss.

In my tests, 4xMSAA does improve IQ in my eyes, but not by much. Several other people I asked didn't see a difference at all (people who don't know what to look for, but then again, they're representative of most of the population; we're weird).
It depends on the content. High-contrast edges, especially slow-moving ones, still show aliasing quite clearly.
 
You could probably test whether AA makes a difference by using a printer; most can do 2400x2400 DPI without a problem,
unless there's a difference between an image on screen and one on paper.
 
Isn't the old argument that yes, it's still important, because texture filtering is a form of anti-aliasing and you wouldn't turn that off? So anti-aliasing is always needed.
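That argument is usually illustrated with mipmapping. A minimal sketch, assuming desktop OpenGL 3.0+ with a function loader such as GLAD already set up; createFilteredTexture is just a placeholder name:

// Trilinear mipmapping: anti-aliasing applied to texture content, rarely turned off.
#include <glad/glad.h>   // assumed loader; any GL 3.0+ header setup works

GLuint createFilteredTexture(int width, int height, const unsigned char* rgbaPixels) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);

    // Build the mip chain and sample it with trilinear filtering: each mip level
    // is a pre-filtered version of the texture, so minified texels don't shimmer
    // the way unfiltered, point-sampled texels would.
    glGenerateMipmap(GL_TEXTURE_2D);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}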
 