Resolutions and aliasing

zeckensack said:
The effect is similar to ordered grid supersampling. Far from equivalent though.
I notice, changing between the two, that the font here is quite aliased at 1280 where it's smooth at 1600x1200.
I def prefer 1600x1200.
Oh, and I can't stand anything lower than 1024 even with 8xS FSAA.

<1280 or bust!
 
Colourless said:
At 1600x1200 you are probably having your entire screen slightly blurred
Yup... ever so slightly. I prefer 1600x1200 for that reason though :LOL:
Webpages look a bit rough since I can make out the font aliasing.
 
I agree that dpi is definitely the right thing to concentrate on. Perhaps printers could form the basis for what we are asking? Text is pretty darn crisp on a good quality laser print, right? What is typical for a laser printer? 300 dpi? 600? (I'm not sure, but those numbers sound familiar.) I'm thinking that would certainly give a good reference for the minimum. Perhaps there is some debate as to how far below that we could stray and still be able to stave off the aliasing issue? Is it safe to say the typical monitor hovers around 100 dpi (some lower, some higher, of course), just for the sake of discussion?
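For what it's worth, here's a quick back-of-envelope sketch of where that ~100 dpi figure comes from; the 18" viewable diagonal is just my assumption for a typical 19" CRT, not a measurement:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assuming roughly 18" of viewable diagonal (typical-ish for a 19" CRT):
for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {ppi(w, h, 18.0):.0f} ppi")
# -> about 71, 91 and 111 ppi, so "around 100 dpi" is in the right ballpark
```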

From an in-practice standpoint, it seems to me the answer is simply to render at some factor n above the native resolution of the display device. Once you've exceeded that and have to scale down to present at the native resolution, you will be getting AA one way or the other (via the mathematics of the conversion, or via the device naturally blurring any detail that exceeds its native resolution).
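As a rough illustration of that, here's a minimal sketch in Python/NumPy; `render_scene` is a made-up placeholder for whatever produces the high-res frame, and the box filter is just the simplest possible downfilter:

```python
import numpy as np

def box_downsample(img: np.ndarray, n: int) -> np.ndarray:
    """Average each n x n block of samples into one output pixel.

    img: (H*n, W*n, 3) float array rendered at n times native resolution.
    Returns an (H, W, 3) image; the averaging is what produces the AA.
    """
    h, w, c = img.shape
    return img.reshape(h // n, n, w // n, n, c).mean(axis=(1, 3))

# Hypothetical usage: render at 2x per axis (4 samples per pixel), then
# filter down to a 1600x1200 native display.
# hi_res = render_scene(width=3200, height=2400)  # placeholder renderer
# native = box_downsample(hi_res, n=2)
```

This is effectively ordered grid supersampling, which is why it's a brute-force (if expensive) way to get AA on any content.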
 
I don't really care about AA on a high res display, I only care about alpha textures that look like crap and need it.
 
Hubert said:
That's theory. I would consider AA not that important above 1600x1200 (my personal opinion). But because I own only a 1280x1024 LCD, I'd say AA is quite important. CRTs don't produce as sharp an image, so one could bear going without AA at 1600x1200 there if one must, but on a 1280x1024 or even 1600x1200 LCD the jagginess is really annoying.

IMO you must antialias if you want to get rid of any sort of aliasing, and that goes even for 2048*1536. I still see aliasing at 2048 (albeit the "jaggies" tend to get smaller due to the higher frequency); besides, there's more to it than just polygon edge/intersection anti-aliasing. And that's even considering that the CRT here is actually "stretching" content a bit at that high resolution, which could be seen in a way as a form of "oversampling".
 
Colourless said:
At 1600x1200 you are probably having your entire screen slightly blurred

Far less than Quincunx, IMO, and even then it's usually the vertical axis that has the problem, not necessarily the horizontal.
 
radeonic2 said:
I notice, changing between the two, that the font here is quite aliased at 1280 where it's smooth at 1600x1200.
I def prefer 1600x1200.
Oh, and I can't stand anything lower than 1024 even with 8xS FSAA.

<1280 or bust!

I use ClearType on all my displays.
 
radeonic2 said:
ClearType is for LCDs...

Says who? It can be adjusted via the relevant application from Microsoft to suit any display out there, and at least to my eyes well fine-tuned ClearType shows benefits whether CRT or LCD.

ClearType is nothing other than font smoothing, i.e. font anti-aliasing. I don't see why any form of AA should be exclusive to LCDs; it's more of a necessity on LCDs, that much is true.
 
Ailuros said:
Says who? It can be adjusted via the relevant application from Microsoft to any display out there and at least to my eyes well fine-tuned ClearType shows benefits on any display out there.
Linky? :???:
 
I don't doubt that at all, but it actually can be helpful for CRTs as well. The more razor-sharp the display (yes, there are high-quality CRTs out there, too), the more obvious the need for text AA becomes.

I've had fun playing around with it as of late, as well. It's a fine line between making things "smooth" and going too soft (and that is where some people will like it and some won't). The other downside I've noticed is that sometimes CT can produce a "color halo" effect at the very edges of a text character, very similar to a CRT with a bad convergence setup. Sometimes it is very apparent in inverted-color situations (white text on black instead of black on white). I hope this artifact is something that can be addressed in later refinements of CT.
 
The best success I've had with CT on CRTs is with Trinitron aperture grilles. Standard shadow/slot mask results have not been as useful. A 19" Trinitron shows a noticeable legibility gain, followed by a 22" Diamondtron, whereas a 21" Hitachi SGA looks very peculiar with CT enabled. On the LCD side, just about the first thing I enabled on my Samsung 191T was CT via the online tuner.
 
CT is based on the ability to render subpixel effects. This works on LCDs because of the known fixed distribution of RGB subpixels in vertical stripes, and the known response of those subpixels to particular color values. You'll see what happens when the pattern CT thinks you have differs from the actual pattern: just flip between BGR and RGB layouts.
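To make the RGB/BGR point concrete, here's a toy sketch of mapping 3x-horizontal coverage samples onto striped subpixels. This is my own illustration, not how CT is actually implemented (real CT also filters across neighboring subpixels to tame color fringing):

```python
import numpy as np

def subpixel_render(coverage: np.ndarray, layout: str = "RGB") -> np.ndarray:
    """Map a 3x-horizontal-resolution glyph coverage mask (0..1, black text
    on a white background) onto the subpixels of a vertically striped LCD.

    coverage: (H, W*3) array of coverage samples.
    Returns an (H, W, 3) image; with the wrong `layout` the samples land on
    the wrong physical subpixels, producing the familiar color fringes.
    """
    h, w3 = coverage.shape
    triples = coverage.reshape(h, w3 // 3, 3)  # 3 consecutive samples/pixel
    img = 1.0 - triples                        # dark glyph on white
    if layout == "BGR":
        img = img[:, :, ::-1]                  # swapped stripe order
    return img
```

Feed an RGB panel output generated with layout="BGR" (or vice versa) and you get exactly the wrong-position artifacts described below for CRT phosphor triads.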

Properly implemented and calibrated, CT does not exhibit visible blurring or color halos. If yours is exhibiting those, please use the MS CT calibration tool to fix it.

CT on CRTs doesn't work, because CRTs use phosphor triads. The result will look like bad convergence. When the font rendering engine thinks it is setting a subpixel at position (0.5, 1.0) on an LCD, on a CRT it is switching on a subpixel in a phosphor triad in completely the wrong position. People who report an improvement with CT on a CRT are seeing the effects of superior font anti-aliasing; they are not seeing the effect of ClearType itself, which is to take anti-aliased samples and render them to subpixels. CT on CRT = artifacts. I've tried it on two different high-end CRTs, side by side with an LCD, and it simply does not compare. I find that ordinary font smoothing with no CT works about as well, though CT may work slightly better due to an enhanced font rendering algorithm.

But it is a myth that "sharp" displays mean you need better AA. AA enhances readability. A less sharp display is less readable. *That* is the reason for AA on fonts, NOT eliminating the "sharp, jaggy" look. Blurry pixels that blend together are not a readability enhancer.

Moreover, on the AA point: even commercial offline-rendered CGI for film at 2k+ resolution uses high levels of AA (both spatial and temporal), and some shots use 4k. Even at high resolutions you will see shimmering without AA. That's why film-quality CG uses up to 64x supersampling in some spots. 64x supersampling at 2k resolution is an 8x8 grid per pixel, i.e. an effective sample grid of about 16,384 x 12,288!
 
ohNe22 said:
Aliasing comes from "infinite" detail in the input being rendered to a finite number of pixels. So there is no resolution that can be considered "enough".

Aliasing isn't just "jaggies" or pixel popping. In fact, aliasing describes nearly all the artifacts you get when you try to squeeze your infinitely detailed content into a finite, and therefore displayable, form.

AA is the only thing that can help. It captures more detail and improves picture quality while reducing jaggies and moiré. From what I said above, it is clear that AA is necessary, and we will see much better algorithms in the future.

I think you should ask instead: what resolution can be considered "enough"? That one is easy to answer. Our eyes have a maximum resolution; deliver that in realtime plus AA and you have "enough".
IMHO this will take a long time ...
First of all, thanks for the explanation about aliasing (I'm a newbie wrt 3D). I thought aliasing could be adequately described in a single sentence (i.e. aliasing is caused by failing to correctly represent frequencies above half the sampling rate, a limit known as the Nyquist frequency), but it's always nice to hear what folks think aliasing means (and not necessarily what it is).
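Since "Nyquist" keeps coming up, the point is easy to demonstrate numerically; here's a little sketch showing a 70 Hz sine sampled at 100 Hz producing exactly the same samples as its 30 Hz alias:

```python
import numpy as np

fs = 100.0                       # sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)      # one second of sample times

# 70 Hz exceeds the Nyquist frequency fs/2 = 50 Hz, so its samples are
# indistinguishable from those of the alias at 70 - fs = -30 Hz.
high = np.sin(2 * np.pi * 70 * t)
alias = np.sin(2 * np.pi * -30 * t)

print(np.allclose(high, alias))  # True: the two signals sample identically
```

In rendering terms: scene detail above half the pixel sampling rate doesn't disappear, it folds back in as jaggies, moiré and shimmer.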

As for your suggestion about what I should've asked instead -- if you'll re-read my post, you will realize I asked exactly that ("immediate impact" = "enough").
 
Hehehe Reverend - didn't want to criticize you with my post! ;-)

Of course I know that you can just say "Nyquist" and everything's said, but in fact most people hear "aliasing" and think "jaggies"... I just wanted to clear things up. I've had a few semesters of computer graphics and learned a lot about things like AA.

By the way: YOU A NEWBIE? :D
 
radeonic2 said:
It was my understanding ClearType was developed by studying LCD monitors.. hmmph.
Well, even with the tool I still prefer standard.
While ClearType improves text quality for some things, it makes some text too blurred for my liking.

ClearType is adjustable; it's clearly a matter of preference and I respect that, but if you fiddle around with the settings a bit I'm sure you could find a happy medium that won't make your text appear blurry.

I'm using it on this CRT even now with a 1600 desktop. I usually use a high quality text printout to compare it to.
 
I'd ask another question before anything else: is there a theoretical limit in dpi values where the human eye can still see a difference?

What I mean is that if viewable areas on monitors/screens continue to grow at the same rate as resolutions scale up, I don't see how the dpi ratio would change compared to today's standards.

If someone were to tell me that a mainstream monitor in half a decade, for example, will have a native resolution of, let's say, 4096*2304, yet won't be bigger than 21-24", then I might be able to figure that the very high resolution on a relatively small viewable area (i.e. a very small dot size) might mask some if not all aliasing patterns. Anyone care to clear it up for me?
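On the first question, here's a rough stab, assuming the commonly cited ~1 arcminute acuity for 20/20 vision (an approximation, and viewing distance matters as much as dpi):

```python
import math

def ppi_limit(viewing_distance_in: float, acuity_arcmin: float = 1.0) -> float:
    """Rough ppi beyond which ~20/20 vision stops resolving single pixels.

    Assumes the classic one-arcminute acuity figure; treat the result as a
    ballpark, not a hard threshold.
    """
    pixel_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1.0 / pixel_in

for d in (12, 24, 36):  # plausible desktop viewing distances, in inches
    print(f'{d}": ~{ppi_limit(d):.0f} ppi')
# -> roughly 286, 143 and 95 ppi respectively
```

By that yardstick a 4096*2304 panel at 21-24" (roughly 195-220 ppi) would sit near or past the limit at normal desktop distances, though shader and temporal aliasing could still shimmer.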
 