Resolutions and aliasing

I just asked because I've run into quite a few cases where those kinds of moire patterns appear, but usually go away when you switch to "high quality" AF. It doesn't have to be so in this particular case, but keep in mind that I had more sophisticated filtering methods than just AF in mind.
 
Sure, but no texture filtering is going to save you here, because it's the polygon edges that produce the moire effect. Granted, it's not a common scenario in games to have a large number of polygons in a very regular pattern, but like I said, increasing resolution just decreases the frequency of aliasing artifacts. There will always be some games where regular, high-contrast geometry makes lots of sense, as it happens to in some areas of City of Heroes.
 
I would think it no longer matters once the resolution and the distance from the screen converge. I mean, as monitors get larger, the distance between the person behind the keyboard and the screen is going to increase. As that happens, I believe AA and whatnot go out the window.

For example, I play World of Warcraft and EQ2 from time to time on my 60" HDTV, and with 3.2 million dots of resolution it does not matter. Now granted, I'm also not sitting right in front of it, more like 12 ft away. So maybe that makes a difference.

Is it there? Jaggies and such? Probably. But at the distance I'm sitting and the resolution I'm playing at, it's no longer a distraction.

Depending on which computer I move over, I'm using either a 9800 Pro or a 6800 GT. The base subsystem is the same: a P4 2.8 GHz with a GB of RAM. And not one of the people who come over has ever said which computer is running which card... /shrug.
 
saf1 said:
I would think it no longer matters once the resolution and the distance from the screen converge. I mean, as monitors get larger, the distance between the person behind the keyboard and the screen is going to increase. As that happens, I believe AA and whatnot go out the window.
Did you see the moire patterns in either my screenshot or Ailuros'? Try hiding that by moving further away from the screen.
 
saf1 said:
I would think it no longer matters once the resolution and the distance from the screen converge. I mean, as monitors get larger, the distance between the person behind the keyboard and the screen is going to increase. As that happens, I believe AA and whatnot go out the window.
You can always create a scene that will alias at any sampling resolution.
 
ohNe22 said:
Nice Moiré (in both of the shots). I guess in movement it's even more visible!?

It's hard to detect in high quality mode; both shots are with gamma correction enabled, meaning that I can hardly see any of it on this monitor here. In quality mode it's a dancing meander pattern on quite a few occasions.

But I added SSAA in both cases on purpose, since it does make a difference. Chalnoth is right that the moire pattern gets a lot worse when you enable multisampling; what I'm trying to find out here is whether any form of polygon interior data AA helps situations like that.

I could have used 1024*768 and tried everything from 8xS up to 16xSSAA; with the latter, although it absolutely kills performance, most of it goes away. The question is whether it will ever be possible to get a "treatment" of polygon interior data equivalent to 16xSS in the future without such a huge impact, and obviously not through supersampling.
 
Ailuros,

Seeing how you seem interested in the moire issues and are familiar with Chronicles of Riddick: Escape from Butcher Bay, I was wondering if you have done any tests with this title. On a 6800 Ultra with any Forceware, I see extremely horrible moire on almost all the walls, even with "High Quality" filtering. We are talking warping cobwebs here, making the game look less than appealing. I never see this mentioned and read only praise about how good this game looks. In my experience it looks very bad because of this. The same can be said for the corrugated wall surfaces in Half Life 2. There just doesn't seem to be a way to get rid of this on GeForce 6000 hardware.

I read a lot about shimmering issues, but moire has been plaguing me much more, and I am curious how this affects different hardware. It would be really interesting to compile some data on this, as I personally find it much more distracting than some shimmering here and there. At least with shimmering there is no readily discernible pattern, whereas the warping patterns of moire tend to draw the eye.
 
Chronicles of Riddick is a game that makes extensive use of bump mapping, so what you may be seeing is the fact that current games don't do any sort of antialiasing of bump maps in the shader (though some simple, high-performance techniques have been developed). But I can't be certain, so it'd be useful to do a comparison between different hardware.
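A minimal sketch of what such a technique can look like, assuming something along the lines of Toksvig's mipped-normal-map idea (illustration only, not necessarily what any shipping game or driver actually does): when mip filtering averages disagreeing bump normals, the filtered normal comes out shorter than unit length, and that shortening can be used to damp the specular exponent so the highlight blurs instead of shimmering.

Code:
# Hypothetical Python sketch of Toksvig-style specular antialiasing.
import numpy as np

def toksvig_specular_power(filtered_normal, spec_power):
    # |n| < 1 after mip filtering means the underlying bump normals disagree.
    length = np.linalg.norm(filtered_normal)
    ft = length / (length + spec_power * (1.0 - length))  # Toksvig factor
    return ft * spec_power  # reduced exponent -> wider, calmer highlight

# A heavily filtered normal (lots of bump detail averaged away) drops the
# exponent from 64 to roughly 2, trading sparkle for a broad highlight.
print(toksvig_specular_power(np.array([0.0, 0.0, 0.7]), 64.0))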
 
Simon F said:
You can always create a scene that will alias at any sampling resolution.
I find F1 2002 is very good for this.
At 1600x1200 with 8xS FSAA, it's still quite jaggy on the white lines of the tarmac texture as well as on the models themselves.
It's amazing to me how aliased the game is.
 
saf1, I totally disagree with you and agree with Chalnoth.

It doesn't matter how far away you are from the screen because at any distance there will be some sort of artifact (aka aliasing).

A good example is offset printing (and even more so b/w offset), where the dpi starts at 190 and goes up to 600 or even 1200 or 2400. So we're talking about very high resolutions.
Still, you see a lot of moiré patterns in certain prints.

It all comes down to the rasterization. Both offset printing and graphics card rasterizers use an ordered grid of sample positions. I think that if some sort of disordered sampling points were used, the effects of moiré could be diminished.
I played around with sample positions and various antialiasing techniques in a raytracer and found that randomized sampling helps a lot. But we still have an output on screen that is an ordered grid.
In the future, perhaps manufacturers will be able to build displays that have an irregular pixel pattern (or at least a good fake, like ink jet printers). Don't ask me how one would output data to such a display, but I believe it's possible and would give some good results.

Even in that configuration AA would be needed, but less than today.
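To illustrate the ordered-vs-disordered point with a toy example (my own sketch, nothing to do with how actual hardware samples): render a stripe pattern that is far too fine for the pixel grid, once with a fixed sub-pixel sample grid and once with the same number of randomly jittered samples per pixel. The ordered grid folds the error into a coherent low-frequency beat, i.e. moiré, while the jitter turns the same error into unstructured noise.

Code:
# Toy 1D sketch: 32 pixels, 4 samples each, far too few for the pattern.
import numpy as np

def stripes(x):
    return (np.sin(2 * np.pi * 133.7 * x) > 0).astype(float)

def render(width, samples, jitter, rng):
    px = np.arange(width)[:, None]
    offs = (np.arange(samples)[None, :] + 0.5) / samples   # ordered sub-grid
    if jitter:
        offs = rng.random((width, samples))                # random per pixel
    return stripes((px + offs) / width).mean(axis=1)       # box filter

rng = np.random.default_rng(0)
print(np.round(render(32, 4, jitter=False, rng=rng), 2))  # coherent beat (moire)
print(np.round(render(32, 4, jitter=True, rng=rng), 2))   # same error, as noise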
 
ohNe22 said:
It all comes down to the rasterization. Both offset printing and graphics card rasterizers use an ordered grid of sample positions. I think that if some sort of disordered sampling points were used, the effects of moiré could be diminished.
Right, this is sort of what I'm trying to say. To fix moire from edges, you can't go to higher resolution (because that just changes where it happens...it doesn't eliminate it). But even ordered-grid anti-aliasing can help significantly with moire. It helps simply because, in averaging over neighboring samples, you are effectively rendering at a higher resolution, eliminating any moire patterns smaller than your sample grid while reducing the intensity of other moire patterns.

The moire situation improves further, though, when you start to apply better sample patterns, but the only way to make it go away entirely would be to alter the sample pattern on a per-pixel basis.

One way of doing this that I discussed previously would be to make use of a sparse sample pattern on, say, a 6x6 grid, but then only make use of 4 samples per pixel. In this way, each neighboring pixel will use a set of 5 sample rows/columns that are shifted by one sample each, with typically four of the sample rows/columns being occupied, giving a simple permutation of sample positions on a per-pixel basis. Note that every sixth pixel will have 5 samples that fall within it, and thus one sample will have to be thrown out for those pixels.
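Just to make the per-pixel-varying idea concrete, here is a loose toy interpretation (my own sketch, not necessarily the exact scheme described above): start from a sparse pattern with one sample per row and column of a 6x6 grid, then let each pixel use only 4 of those 6 samples, rotating the choice with the pixel position so that neighbouring pixels never share the same sub-pixel offsets.

Code:
# Toy sketch of per-pixel varying sparse sample positions (illustration only).
BASE = [(0, 2), (1, 5), (2, 1), (3, 4), (4, 0), (5, 3)]  # one sample per row/column

def pixel_samples(x, y, count=4):
    # Rotate which 'count' of the 6 base samples this pixel uses,
    # so adjacent pixels end up with different sub-pixel offsets.
    start = (x + 2 * y) % len(BASE)
    chosen = [BASE[(start + i) % len(BASE)] for i in range(count)]
    # Map 6x6 grid cells to offsets inside the pixel, in [0, 1).
    return [((cx + 0.5) / 6.0, (cy + 0.5) / 6.0) for cx, cy in chosen]

for px in range(3):
    print(px, pixel_samples(px, 0))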
 
Chalnoth said:
One way of doing this that I discussed previously would be to make use of a sparse sample pattern on, say, a 6x6 grid, but then only make use of 4 samples per pixel.
Unless you have a large number of samples per pixel, you can get some nasty visual artefacts if you change your sampling from pixel to pixel.:cry:
 
Chalnoth said:
Did you see the moire patterns in either my screenshot or Ailuros'? Try hiding that by moving further away from the screen.

Yes, I did. And thanks for posting them.

But I believe it gets a bit complicated when TV sets have processing engines on them that also do different forms of aliasing. Which is something one looks at when buying an HDTV - or should, anyway.

I don't have an answer, nor do I really know. I was just stating an opinion based on what I see with my current computers and video cards on my Sony set.
 
radeonic2 said:
I find F1 2002 is very good for this.
At 1600x1200 with 8xS FSAA, it's still quite jaggy on the white lines of the tarmac texture as well as on the models themselves.
It's amazing to me how aliased the game is.

The only other mixed mode that works for me in F1 2002 is 4xS (and that has been the case for a long time now), so please make sure that when setting 8xS you actually get anything more than 4xMSAA in the end.
 
wireframe said:
Ailuros,

Seeing how you seem interested in the moire issues and are familiar with Chronicles of Riddick: Escape from Butcher Bay, I was wondering if you have done any tests with this title. On a 6800 Ultra with any Forceware, I see extremely horrible moire on almost all the walls, even with "High Quality" filtering. We are talking warping cobwebs here, making the game look less than appealing. I never see this mentioned and read only praise about how good this game looks. In my experience it looks very bad because of this. The same can be said for the corrugated wall surfaces in Half Life 2. There just doesn't seem to be a way to get rid of this on GeForce 6000 hardware.

I read a lot about shimmering issues, but moire has been plaguing me much more, and I am curious how this affects different hardware. It would be really interesting to compile some data on this, as I personally find it much more distracting than some shimmering here and there. At least with shimmering there is no readily discernible pattern, whereas the warping patterns of moire tend to draw the eye.


We'd really need AA within the shader for those cases; there's nothing really effective that can be done about it right now. There are also a couple of nasty cases in games like FarCry; it got a bit better when dropping to 1024 and using 16xSSAA, but it never went away entirely.

What you describe is commonplace on pretty much any hardware out there, with the only difference being that using "quality" instead of "high quality" (or optimisations "on" instead of "off") makes the patterns even worse.

For Chronicles, to make the side-effects a tiny bit better, use drivers 78.03 and upwards, where high quality has been fixed; the average performance drop in OpenGL games with those drivers isn't worth mentioning anyway.
 
Simon F said:
Unless you have a large number of samples per pixel, you can get some nasty visual artefacts if you change your sampling from pixel to pixel.:cry:

At least 16x or higher?
 
Ailuros said:
The only other mixed mode that works for me in F1 2002 is 4xS (and that has been the case for a long time now), so please make sure that when setting 8xS you actually get anything more than 4xMSAA in the end.
Well, it's doing something (fps hits the dust), but it's just not enough for the game.
It's worst at the pits.
I go damn near blind from all the aliasing.
I just tried 8xS to see if perhaps MSAA wasn't hitting the aliased portions for some reason; it did help a little, but not nearly enough.
I'll grab a screen cap tomorrow :smile:
 