AA/AF enhancements

no_way said:
Anyone willing to bet that the _AA solution_ will come from PowerVR? Probably some kind of coverage mask AA. Should be reasonable to do on chip with TBR architecture.

I think I'd loose that bet, at least for now (concerning exotic AA algorithms).
 
Althornin said:
Also, the only "bad" case comes at the angle which benefits LEAST from AA anyways - the perfect 45 degree (off of any axis) angle looks the best anyways. Jaggies are most irritating when there is a nice horizon, with like 3-4 jaggies racing across the top of it as you move around.
Very near 45 degrees is the second worst spot, next to very near horizontal/vertical. You can get "jaggies" that look something like:
Code:
  x
    x
      x
         x
           x
             x
                x
                  x
                    x
...and so on.
 
By the way, I just thought of something that might be a good way of implementing per-pixel sample pattern changing.

Imagine just taking a sparse-sampled pattern, and only using part of it per pixel. For example, one could take 4 samples of a 6-sample sparse pattern within a single pixel, letting the remaining two samples "bleed" into the next pixel.

Here's basically what I mean. Instead of using sample patterns like the Radeon 9700 does, as shown:
Radeon6x-6x.jpg


I'll divide up the same sample pattern differently, to get this:
Radeon6x-4x.jpg


The second method would essentially be a 4x method, but there is a kink that would have to be worked out in the hardware: there is the occasional pixel with 5 samples. This might be dealt with by simply modifying this 5-sample pixel by removing one of the samples (I would remove the center sample).

Anyway, it's just an idea.

(one thing to note, this idea could easily be generalized to any number of samples)
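A little numeric sketch of the bleeding idea (all sample coordinates here are invented for illustration, not the actual R300 pattern): tile a 6-sample pattern on a strip 1.5 pixels wide, so each 1-pixel column averages 4 samples, then trim any pixel that ends up with extras by dropping the sample nearest the pixel center, as suggested above.

```python
# Toy sketch of the sample-bleeding idea.  The 6 sample positions
# below are made up; they sit on a tile 1.5 pixels wide, so a
# 1-pixel-wide column receives 4 samples on average, with some
# samples "bleeding" across pixel boundaries.
import math

PATTERN = [(0.10, 0.15), (0.30, 0.80), (0.55, 0.45),
           (0.70, 0.95), (0.90, 0.30), (1.20, 0.60)]
TILE_W = 1.5   # tile width in pixels; 6 samples / 1.5 px = 4 per px

def raw_samples(px):
    """All pattern samples falling inside pixel column [px, px+1)."""
    out = []
    for tile in range(math.floor(px / TILE_W) - 1,
                      math.floor((px + 1) / TILE_W) + 1):
        for sx, sy in PATTERN:
            x = sx + tile * TILE_W
            if px <= x < px + 1:
                out.append((x - px, sy))   # local coords in the pixel
    return out

def trimmed_samples(px, target=4):
    """Cap a pixel at `target` samples, dropping those nearest the center."""
    s = raw_samples(px)
    while len(s) > target:
        s.remove(min(s, key=lambda p: (p[0] - 0.5)**2 + (p[1] - 0.5)**2))
    return s

print([len(raw_samples(p)) for p in range(6)])      # [5, 3, 4, 5, 3, 4]
print([len(trimmed_samples(p)) for p in range(6)])  # [4, 3, 4, 4, 3, 4]
```

Note how the counts in this 1-D toy come out uneven: capping the 5-sample pixels is easy, but some pixels also end up short, which is exactly the kind of corner case the hardware would have to handle.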
 
Chalnoth said:
By the way, I just thought of something that might be a good way of implementing per-pixel sample pattern changing.

Imagine just taking a sparse-sampled pattern, and only using part of it per pixel. For example, one could take 4 samples of a 6-sample sparse pattern within a single pixel, letting the remaining two samples "bleed" into the next pixel.

Here's basically what I mean. Instead of using sample patterns like the Radeon 9700 does, as shown:
Radeon6x-6x.jpg


I'll divide up the same sample pattern differently, to get this:
Radeon6x-4x.jpg


The second method would essentially be a 4x method, but there is a kink that would have to be worked out in the hardware: there is the occasional pixel with 5 samples. This might be dealt with by simply modifying this 5-sample pixel by removing one of the samples (I would remove the center sample).

Anyway, it's just an idea.

(one thing to note, this idea could easily be generalized to any number of samples)
Your example fails because the texture samples aren't at the center of the pixel.
 
Chalnoth said:
By the way, I just thought of something that might be a good way of implementing per-pixel sample pattern changing.

Imagine just taking a sparse-sampled pattern, and only using part of it per pixel. For example, one could take 4 samples of a 6-sample sparse pattern within a single pixel, letting the remaining two samples "bleed" into the next pixel.

Here's basically what I mean. Instead of using sample patterns like the Radeon 9700 does, as shown:
http://67.118.214.186/Radeon6x-6x.jpg

I'll divide up the same sample pattern differently, to get this:
http://67.118.214.186/Radeon6x-4x.jpg

The second method would essentially be a 4x method, but there is a kink that would have to be worked out in the hardware: there is the occasional pixel with 5 samples. This might be dealt with by simply modifying this 5-sample pixel by removing one of the samples (I would remove the center sample).

Anyway, it's just an idea.

(one thing to note, this idea could easily be generalized to any number of samples)

This reminds me of this paper:
interleaved.jpg
 
OpenGL guy said:
Your example fails because the texture samples aren't at the center of the pixel.
Um...obviously those would be in the center of each pixel in a real-world algorithm. Have some imagination. Those texture samples only weren't in the center of each pixel because of where I took the picture from.
 
Chalnoth said:
OpenGL guy said:
Your example fails because the texture samples aren't at the center of the pixel.
Um...obviously those would be in the center of each pixel in a real-world algorithm. Have some imagination. Those texture samples only weren't in the center of each pixel because of where I took the picture from.
Actually, the samples were in the center in the original picture. Anyway, I don't think your idea is very reasonable because of the corner cases. You need to have a consistent number of samples per pixel else you'll have all sorts of problems with compression and other Z related parts (i.e. how many depth compares do you need? If it can be different per pixel then that can cause weird problems.)

If you can find a way to avoid corner cases, then it will be an interesting idea.
 
OpenGL guy said:
Actually, the samples were in the center in the original picture. Anyway, I don't think your idea is very reasonable because of the corner cases. You need to have a consistent number of samples per pixel else you'll have all sorts of problems with compression and other Z related parts (i.e. how many depth compares do you need? If it can be different per pixel then that can cause weird problems.)

If you can find a way to avoid corner cases, then it will be an interesting idea.
Right, there is the occasional pixel with 5 samples instead of 4. The easiest way to fix it would be to just throw out the extra sample. There are a number of ways to do this. Perhaps the best would be to always throw out the sample in the central row (or column). Even easier may be to just keep the first four that the hardware algorithm calculates, whatever that calculation order may be.

And this algorithm is, btw, functionally identical to the one in the paper Ilfrin linked. The only difference is that mine results in a longer repetition period with a smaller memory footprint, but adds the need to deal with corner cases.
 
OpenGL guy, okay, so the R300 can't change the AA sample positions from one pixel to the next. But could you change it from one frame to the next? I mean if we look at a game screen with maybe 100 Hz and with 100 FPS, changing the AA sample positions from frame to frame would be some kind of "time dithering". I guess that it would look very good.

I remember back when I had an Atari ST, which only did black & white; there was a program that was able to show gray colors by switching pixels between black and white at 70 Hz. If the R300 changed the sample positions with each frame, it would have a similar effect. It could give the impression of a higher sample count.

What do you think? :p
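The frame-to-frame idea is easy to check numerically. In this sketch (all sample positions invented), one 4-sample pattern is used on even frames and a different one on odd frames; averaged over two frames, which is roughly what the eye does at a high refresh rate, the result matches a single 8-sample pattern:

```python
# "Time dithering" toy: alternate two invented 4-sample patterns
# frame to frame and compare the eye's two-frame average against a
# single 8-sample pattern.
EDGE_Y = 0.4   # the pixel is covered where a sample's y < EDGE_Y

PATTERN_A = [(0.125, 0.10), (0.375, 0.30), (0.625, 0.60), (0.875, 0.80)]
PATTERN_B = [(0.0625, 0.20), (0.3125, 0.45), (0.5625, 0.70), (0.8125, 0.95)]

def coverage(pattern):
    """Fraction of samples falling on the covered side of the edge."""
    return sum(1 for (_, y) in pattern if y < EDGE_Y) / len(pattern)

frame_even = coverage(PATTERN_A)            # 0.5
frame_odd = coverage(PATTERN_B)             # 0.25
eye_average = (frame_even + frame_odd) / 2  # 0.375
print(eye_average == coverage(PATTERN_A + PATTERN_B))  # True
```

The open question, of course, is whether the per-frame difference (0.5 vs. 0.25 here) would read as flicker rather than as a smoother edge.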
 
madshi said:
OpenGL guy, okay, so the R300 can't change the AA sample positions from one pixel to the next. But could you change it from one frame to the next? I mean if we look at a game screen with maybe 100 Hz and with 100 FPS, changing the AA sample positions from frame to frame would be some kind of "time dithering". I guess that it would look very good.
I'm not so sure. If you alternated frames, that would be 50 Hz which would flicker...
 
OpenGL guy said:
madshi said:
OpenGL guy, okay, so the R300 can't change the AA sample positions from one pixel to the next. But could you change it from one frame to the next? I mean if we look at a game screen with maybe 100 Hz and with 100 FPS, changing the AA sample positions from frame to frame would be some kind of "time dithering". I guess that it would look very good.
I'm not so sure. If you alternated frames, that would be 50 Hz which would flicker...
I don't think the flicker would be perceptible. Remember that these are much lower brightness differences than the black-white change at screen refresh.
 
OpenGL guy said:
I'm not so sure. If you alternated frames, that would be 50 Hz which would flicker...
Yes, but the differences in the pixel colors would only be small. And only edge pixels would show the effect. I don't think that the eye would see that as flickering. Or what do you think? Would it not be worth a try? :D I could imagine that it would quite significantly improve edge antialiasing!
 
OpenGL guy said:
RussSchultz said:
Actually, I was aiming at seeding it with a deterministic pseudo-random sequence so that the sampling pattern is the same frame to frame, but different pixel to pixel. That should help break up patterns that form on angles that approach the critical ones.
Yeah that would be interesting, but we can't do that yet. Also, I'm not sure how useful it would be because neighboring pixels might have some weird interactions (samples may not be evenly spaced).

What Russ is asking for could effectively emulate stochastic sampling - this would transform aliasing (appearing as low frequencies) into high frequency noise which should look much better. There's a good description of what occurs in "the bible according to Foley, van Dam, Feiner, & Hughes".
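A rough sketch of that effect (the edge, sample offsets, and seeded jitter are all invented): sample a near-horizontal edge across a row of pixels. A fixed per-pixel pattern produces long runs of identical coverage, which show up as stair steps; per-pixel pseudo-random jitter breaks those runs up into high-frequency noise while staying unbiased on average.

```python
# Fixed pattern vs. per-pixel seeded jitter on a near-critical edge.
import random

PIXELS = 32
FIXED_Y = [0.125, 0.375, 0.625, 0.875]   # same 4 y-offsets every pixel

def edge(px):
    return 0.1 + 0.02 * px               # covered where a sample's y < edge(px)

fixed = [sum(y < edge(p) for y in FIXED_Y) / 4 for p in range(PIXELS)]

rng = random.Random(12345)               # deterministic, pixel-varying jitter
jitter = [sum(rng.random() < edge(p) for _ in range(4)) / 4
          for p in range(PIXELS)]

def max_run(vals):
    """Longest run of consecutive equal coverage values."""
    best = run = 1
    for a, b in zip(vals, vals[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

print(max_run(fixed), max_run(jitter))   # long stair steps vs. broken-up noise
```

The fixed pattern gives a 13-pixel-long run of constant coverage here, i.e. a jaggy "racing across" the edge; the jittered version trades that structure for noise, which is the transform described above.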

RussSchultz said:
Now, what would be mighty cool to demonstrate the goodness of your pattern vs. that of the evil heinous competitor,
A little TLA springs to mind.... FFT.

Deleted to protect the guilty said:
I think I'd loose that bet
Please guys, "lose" not "loose". The first is "to misplace something", the second is "not tight". I'm seeing this mistake so often now even I'm beginning to lose my mind!!! :oops:
 
What Russ is asking for could effectively emulate stochastic sampling - this would transform aliasing (appearing as low frequencies) into high frequency noise which should look much better. There's a good description of what occurs in "the bible according to Foley, van Dam, Feiner, & Hughes".

We are trying to get rid of aliasing full stop, not semi-mask it. :LOL:
 
Simon F said:
Please guys, "lose" not "loose". The first is "to misplace something", the second is "not tight". I'm seeing this mistake so often now even I'm beginning to lose my mind!!! :oops:

Or are you just letting it loose?
 
K.I.L.E.R said:
What Russ is asking for could effectively emulate stochastic sampling - this would transform aliasing (appearing as low frequencies) into high frequency noise which should look much better. There's a good description of what occurs in "the bible according to Foley, van Dam, Feiner, & Hughes".

We are trying to get rid of aliasing full stop, not semi-mask it. :LOL:
Then "we" are "on a wild goose chase". Finite sampling will always have aliasing when something containing higher frequencies (eg edges of polygons) is sampled. I suggest reading the aforementioned book.
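That claim is easy to check with a one-dimensional toy (frequencies chosen purely for illustration): any content above the Nyquist limit folds back down as a low-frequency alias, so no finite sample count can remove aliasing entirely.

```python
# A 9 Hz sine sampled at 8 samples per second produces exactly the
# same samples as a 1 Hz sine: the energy above the Nyquist limit
# (4 Hz here) aliases down to low frequency.
import math

RATE = 8                                    # samples per second
ts = [n / RATE for n in range(16)]          # two seconds of samples

high = [math.sin(2 * math.pi * 9 * t) for t in ts]   # 9 Hz signal
low = [math.sin(2 * math.pi * 1 * t) for t in ts]    # 1 Hz signal

print(max(abs(h - l) for h, l in zip(high, low)))    # ~0: indistinguishable
```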

BoardBonobo said:
Simon F said:
Please guys, "lose" not "loose". The first is "to misplace something", the second is "not tight". I'm seeing this mistake so often now even I'm beginning to lose my mind!!! :oops:

Or are you just letting it loose?
Well, it is beginning to rattle a bit :)
 
Then "we" are "on a wild goose chase". Finite sampling will always have aliasing when something containing higher frequencies (eg edges of polygons) is sampled. I suggest reading the aforementioned book.

With current technology we are on a wild goose chase maybe. We may not be able to break the rules of physics but we sure as hell can bend them. :)

My outlook may not seem realistic but in 5 billion years time, we will see who's laughing. :)

BTW: Where can I read the book online? Or do I really have to borrow it from a library or buy it?
 
mboeller said:
All this talk about sampling patterns for AA. How about a really weird idea?

Link :

http://www.8ung.at/mboeller/picture.html



have fun :)
I think you're making the mistake of treating the texture sampling point as a real subsample, when it only determines where the color (or other texture data) for the two MS sampling points is taken.
 
People always say that varying the sample pattern from one pixel to the next is a good thing, but I'm not sure I buy that. In some images I've seen in the past the noise looked worse than the smooth gradient of a constant sample pattern. Before someone asks, I can't link to those images because they no longer exist.

Maybe there are a certain number of sample points necessary for the varied pattern's noise to look better than the smoothness of the constant pattern. I haven't read Foley's book in a while and I don't have a copy so it's possible the book explains this.
 