Resolutions and aliasing

Reverend

At what rez will aliasing (theoretically) be a thing of the past in terms of immediate impact?

At what price, and prize, would such a rez come to an IHV?
 
Aliasing comes from "infinite" detail in the input being rendered to a finite number of pixels. So there won't be a resolution that can be seen as "enough".

Aliasing isn't just "jaggies" or pixel popping. In fact, aliasing describes nearly all the artifacts you get when you try to render your infinitely detailed content in a finite, and therefore displayable, form.

AA is the only thing that can help. It captures more detail and improves picture quality while reducing jaggies and moiré. From what I said above, it is clear that AA is necessary, and we will see much better algorithms in the future.
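To make the "infinite detail into finite pixels" point concrete, here is a minimal Python sketch (my own illustration, using a made-up checkerboard as the "infinitely" detailed input): one sample per pixel turns detail the pixel grid can't resolve into a false coarse pattern, while averaging a grid of sub-pixel samples (brute-force supersampling) turns it into smooth grey instead.

Code:
import math

def checker(x, y, freq=37.0):
    # the "infinitely" detailed input: a checkerboard far finer than the pixel grid
    return 1.0 if (int(math.floor(x * freq)) + int(math.floor(y * freq))) % 2 == 0 else 0.0

def render(width, height, subsamples=1):
    # point-sample the pattern; with subsamples > 1, average an n x n grid of
    # sub-pixel samples per pixel (brute-force supersampling)
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(subsamples):
                for sx in range(subsamples):
                    x = (px + (sx + 0.5) / subsamples) / width
                    y = (py + (sy + 0.5) / subsamples) / height
                    total += checker(x, y)
            row.append(total / (subsamples * subsamples))
        image.append(row)
    return image

aliased  = render(32, 32, subsamples=1)  # false coarse pattern: classic aliasing
smoothed = render(32, 32, subsamples=4)  # unresolvable detail averages out to grey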

I think you should ask: what resolution can be considered "enough"? That one is easy to answer. Our eyes have a maximum resolution; render at that resolution in realtime, add AA, and you have "enough".
IMHO this will take a long time ...
 
That's the theory. I would consider AA not that important above 1600x1200 (my personal opinion). But because I only own a 1280x1024 LCD, I'd say AA is quite important. CRTs don't have as sharp an image, so one could bear going without AA at 1600x1200 there if one must, but on a 1280x1024 or even 1600x1200 LCD the jagginess is really annoying.
 
Another thing to consider is that resolution by itself says little when used only to describe the number of pixels in two dimensions (as in 1600 x 1200). What matters is pixel density or dot pitch. At the size of a postage stamp, 1600 x 1200 would make aliasing difficult to see without the aid of a magnifying glass, but stretched over the face of a large building you could have "jaggies" the size of people. So, in this case, distance to the display area also matters.
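To put some (made-up example) numbers on that, pixel density is just diagonal pixels over diagonal inches:

Code:
import math

def pixels_per_inch(h_pixels, v_pixels, diagonal_inches):
    # pixel density from the resolution and the physical diagonal size
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(pixels_per_inch(1600, 1200, 1))     # ~2000 ppi: postage-stamp sized, jaggies invisible
print(pixels_per_inch(1600, 1200, 21))    # ~95 ppi: typical desktop monitor
print(pixels_per_inch(1600, 1200, 1200))  # ~1.7 ppi: building sized, jaggies the size of people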

Personally, and practically, I think a very good forward-looking resolution would be 3200 x 2400. Imagine, if you will, a 30" or so monitor with said resolution and even only 4x AA (one can dream a little). I think it would look rather good. Right now, with something like a 21" CRT, I don't see the need for more than 1600 x 1200 with 4x AA. I even play games at 1024x768 with 4x AA on such a monitor and it can look very good. Personally, I think it may be better to advance anti-aliasing than screen resolution. You will always get better refresh rates at lower rez (now that I've said "always" I will be proven wrong, hopefully) and I think that is important for enjoyment of the image(s).
 
I have a 1920x1200 LCD at home and I wouldn't go without AA - even at that resolution.

For a single still image I'd say you're right and that a high resolution diminishes the necessity of AA, but in motion it's another story. Good texture quality and a high polygon count make it even worse.

With computer generated graphics you have to fight against
- Jaggies
- Texture shimmering
- Pixel popping
- "Lost" triangles/polygons
- Moiré
and more side effects. Good AA algorithms can't solve these problems completely, but you get a much better picture out of them. Going the resolution-only route won't help that much, and the cost is normally much higher than that of a good, fast AA.
 
I'd say aliasing will become a thing of the past when 4" displays can sustain a resolution of about 5200x4072.

Resolution doesn't cure aliasing, it only makes it smaller. So it stands to reason that resolution alone is meaningless; it's the size of the display in proportion to the resolution that reduces aliasing to the point of being invisible to the human eye.

1600x1200 without AA might look decent on a 14" monitor. Put it on a 21" monitor and it's an alias fest.. etc.etc.
 
The question (and many of the answers so far :oops: ) neglects that all major AA techniques implemented today are much more efficient than increasing resolution.

Rotated/sparse grid supersampling improves edge quality more efficiently than bumping resolution. If there were a Wikipedia entry for EER, I'd link it here. Oh well ... short summary: 2xRGSS at 800x600 resolves edges roughly as well as 1600x1200 without anti-aliasing at half the fillrate cost.
Even naive (ordered grid, uncompressed) multisampling improves edge quality at a lower cost than increased resolution.
Present multisampling techniques (sparse grid plus compression) are just brutally more efficient.
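To put rough numbers on the fillrate part of that claim (a quick sketch using the figures above; it only counts samples and ignores bandwidth, compression and so on):

Code:
def samples_per_frame(width, height, samples_per_pixel=1):
    # total color samples shaded/stored per frame -- a crude stand-in for fillrate cost
    return width * height * samples_per_pixel

print(samples_per_frame(1600, 1200))    # 1,920,000 samples, no AA
print(samples_per_frame(800, 600, 2))   #   960,000 samples, 2xRGSS -> half the cost

The rotated sample placement is what makes the edge quality of the cheaper setting roughly comparable, as described above.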

(Isotropic) Mipmap texture filters improve texture quality at a much lower cost than brute-force oversampling without mipmaps (as was done in early software renderers).
Anisotropic texture filters (pre the blatant, benchmark-centric cheats) are much more efficient at increasing overall texture quality than either supersampling or increased resolution.
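As a side note on the cost of mipmapping (my own quick calculation, not something anyone above stated): a full mip chain adds only about a third to the texture memory, which is part of why it is so much cheaper than oversampling every fetch.

Code:
def mip_chain_texels(base_width, base_height):
    # total texels in a full mip chain, down to 1x1
    total, w, h = 0, base_width, base_height
    while True:
        total += w * h
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

base = 1024 * 1024
print(mip_chain_texels(1024, 1024) / base)  # ~1.333: roughly 1/3 extra memory for the whole chain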

Etc.

Pretty much the only benefit (though it's a rather important benefit) of high resolution is to maximize representable isotropic detail inside a surface (i.e. flat-on viewed minified textures, desktop stuff, small text that would be illegible with lower res). There's a point of diminishing returns there though, and IMO we've long passed it. If it weren't for broken AF implementations and boneheaded devs that do not understand nor respect vital concepts such as mipmapping, I'd look no further than 1280x960 on a 19" CRT. Then reach whatever higher quality standards you may have via multisampling and AF.

Sorry 'bout the rant. It's in my nature ;)
 
ohNe22 said:
I have a 1920x1200 LCD at home and I wouldn't go without AA - even at that resolution.
Really? Interesting, thanks.

I have an old monitor so I game at 1024x768 and need AA, I've always wondered about if it still helps at higher resolutions or not...now I know.

Thanks. :)
 
zeckensack said:
2xRGSS at 800x600 resolves edges roughly as well as 1600x1200 without anti-aliasing at half the fillrate cost.
But what you see on the screen is not comparable at all. I can play Flight Simulator at 1600x1200 and it looks excellent even without anti-aliasing. At 800x600, with the highest possible anti-aliasing and anisotropic filtering, I can barely read the gauges.

So when talking about anti-aliasing you really have to compare it with the same resolution. Sure, some games would look great at 800x600 and high anti-aliasing and would run much smoother, but that's totally subjective. The only way forward is to increase anti-aliasing capabilities -and- resolution.
 
digitalwanderer said:
Really? Interesting, thanks.

I have an old monitor so I game at 1024x768 and need AA, I've always wondered about if it still helps at higher resolutions or not...now I know.
No, it doesn't help that much at higher resolutions. I have a 1600x1200 LCD and a GeForce 6600 and I can play everything fine at that resolution without anti-aliasing. You have to stand still and actively look for aliasing effects to really see them. But they never bother me while playing.

Of course, if you have a more powerful graphics card it would be crazy not to use anti-aliasing.
 
Nick said:
But what you see on the screen is not comparable at all. I can play Flight Simulator at 1600x1200 and it looks excellent even without anti-aliasing. At 800x600, with the highest possible anti-aliasing and anisotropic filtering, I can barely read the gauges.
I totally understand and agree. You do need a reasonable base resolution, depending on what you do (I personally aim for 1280x960 most of the time as I said). With small resolutions plus supersampling spatial information between subpixels is lost, which sucks.

My point was that once you have your isotropic detail bases covered by resolution, there are, for different types of detail, more efficient solutions than upping the res.
If you want clean geometry edges => sparse grid multisampling.
If you want the bigger mipmap levels to contribute more detail to your textured surfaces => AF.
If you want to do something about alpha testing artifacts => TSAA or AAA.

None of these things can do alone what can be done by bumping up the res. But combined they can emulate most of the effect that a higher res would bring while being much easier on computing resources.
Nick said:
So when talking about anti-aliasing you really have to compare it with the same resolution.
Heh, sorry, but no, I don't have to ;)
I'd rather compare attained quality at approximately the same performance. Of course this is going to be highly subjective, but anyway ...
Assuming you have no issues with freestyle resolution switching (i.e. a CRT), I'd like to ask you to try out a (not totally CPU-limited) game at 1)1024x768, 2xAA, 2xAF and 2)1280x960, no AA, no AF, and observe which setting performs better and which one looks better.
Nick said:
Sure, some games would look great at 800x600 and high anti-aliasing and would run much smoother, but that's totally subjective. The only way forward is to increase anti-aliasing capabilities -and- resolution.
Of course improvements in both areas are always welcome. I'm merely trying to get the (for my tastes) best possible results from the hardware I have at my disposal -- which certainly isn't high end (a 6800 vanilla is my bruntiest graphics card). Obviously, if I could, I'd run everything at the limits of my CRT (1600x1200).
 
I have to disagree with Nick's observation. I too have a 1920x1200 LCD notebook display (15" - that's 160 dpi), and aliasing is still readily apparent to me on high-contrast boundaries. The imperfections may be small, but the fact that they move and shimmer just draws the eye. In fact, it's much worse when moving than when standing still.
 
What effect does running a 19" CRT beyond 1280x960 have because of dot pitch?
Like, say, you have a .26 dot pitch and run it at 1600x1200.
 
GraphixViolence said:
I have to disagree with Nick's observation. I too have a 1920x1200 LCD notebook display (15" - that's 160 dpi), and aliasing is still readily apparent to me on high-contrast boundaries. The imperfections may be small, but the fact that they move and shimmer just draws the eye. In fact, it's much worse when moving than when standing still.

I'll throw in my $0.02 and have to agree that even at 1920x1200 on a 24" panel, one still wants some sort of AA. It's not the still images you see the faults in; it's the motion that brings them out.
 
radeonic2 said:
What effect does running a 19" CRT beyond 1280x960 have because of dot pitch?
Like, say, you have a .26 dot pitch and run it at 1600x1200.
The effect is similar to ordered grid supersampling. Far from equivalent though.
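A rough back-of-the-envelope of why (the ~18" viewable figure and treating .26 mm as the horizontal pitch are both simplifying assumptions on my part):

Code:
MM_PER_INCH = 25.4

viewable_width_mm  = 18 * 0.8 * MM_PER_INCH    # ~366 mm: assumed 18" viewable diagonal at 4:3
addressable_triads = viewable_width_mm / 0.26  # ~1400 phosphor triads across the width

# At 1600x1200 there are more addressed pixels across (1600) than triads (~1400),
# so neighbouring pixels blend on the phosphors themselves -- closer to a mild
# ordered-grid downfilter than to a genuinely sharper picture.
print(round(addressable_triads))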
 
zeckensack said:
Assuming you have no issues with freestyle resolution switching (i.e. a CRT), I'd like to ask you to try out a (not totally CPU-limited) game at 1)1024x768, 2xAA, 2xAF and 2)1280x960, no AA, no AF, and observe which setting performs better and which one looks better.
I have done that before, and 1024x768 + AA/AF looks much better for the most part. Performance is more or less the same; the GF4 isn't very efficient. The notable exception is alpha tests, which really stand out with MSAA. For that last reason I usually prefer straight 1280x960, sometimes with 2xAF if performance allows.

(BTW, it's about time I upgraded from my GF4. Unfortunately, AGP is at a dead end.)
 
GraphixViolence said:
I have to disagree with Nick's observation. I too have a 1920x1200 LCD notebook display (15" - that's 160 dpi), and aliasing is still readily apparent to me on high-contrast boundaries. The imperfections may be small, but the fact that they move and shimmer just draws the eye. In fact, it's much worse when moving than when standing still.
Like I said, if you have a powerful graphics card it's still very useful to enable anti-aliasing. All I wanted to add is that for many games increasing the resolution can improve gameplay (because you can see further/sharper) while anti-aliasing 'just' removes jaggies, which are already less noticeable at high resolution. I mostly prefer 1600x1200 without anti-aliasing over 1280x960 with 4x anti-aliasing. But it all depends on the game. Doom 3 looks great at low resolution with high anti-aliasing. Flight Simulator is hardly enjoyable at 1024x768, no matter how much anti-aliasing. Half-Life 2 is somewhere in between.

For low-end to mid-end graphics cards, it's in most cases a necessity to choose resolution over anti-aliasing. Aliasing will only be "a thing of the past" once we've reached the necessary resolution for all games, and then add anti-aliasing. So for each game individually I look at what resolution is comfortable, and then enable anti-aliasing if there's performance left. Not unreasonable, is it?
 
CMAN said:
We need to ask pixels per inch (PPI).
Pixels per centimeter/millimeter, thank you very much! :devilish:

Nick said:
For low-end to mid-end graphics cards, it's in most cases a necessity to choose resolution over anti-aliasing. Aliasing will only be "a thing of the past" once we've reached the necessary resolution for all games, and then add anti-aliasing. So for each game individually I look at what resolution is comfortable, and then enable anti-aliasing if there's performance left. Not unreasonable, is it?
Reasonable, but not how I look at it. I would generally prefer 640x480 with 4-sample sparse AA over 1024x768. Aliasing is much worse to me than blurriness. But for competitive gaming, of course, AA is useless.
 