Wii games that use the deflicker filter for AA

dfi

Newcomer
Are there any games on the wii that use the HW deflicker to provide some sort of anti-aliasing?

I've played quite a few wii games, and I have to say that most of them are pretty jaggy in progressive scan and could use some sort of anti-aliasing. Since the deflicker filter basically gives you anti-aliasing with no performance penalty, you would think that developers would at least give you a menu option to turn that on in a game.

I know that quite a few gamecube games (mario golf, metroid, mario strikers, super smash bros, sonic, pikmin 2) employ this technique for anti-aliasing.

Some games such as super smash bros, sonic, and pikmin 2 even give you the option to turn the deflicker on and off.

I'm hoping that the reason this isn't done on wii games is due to inexperience and not any type of hw limitation that the wii has. I know that tiger woods 07 could have really benefited from having the deflicker filter turned on.
 
Hum.

I could be mistaken (I often am), but I think deflicker only works in display modes that actually flicker. In other words, interlaced resolutions.

So when you play in progressive scan, the screen isn't flickering, so the filter can't do anything.

Peace.
 
I'm running progressive scan, and there is an obvious difference in super smash bros when deflickering is turned on (softer) and off (harder).

So it's doing something in progressive scan.


 
Ok, talking to a developer friend of mine, there are two specific wii/gcn API calls to turn on deflickering mode.

For interlaced displays, you call GXNtsc480IntDf().

For progressive scan displays, you call GXNtsc480ProgSoft().

Since anti-aliasing is really just a side effect of deflickering, and you don't actually deflicker anything in 480p, the API name ends in "Soft" for progressive scan instead of "Df" for interlaced displays.
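To make the "softening as a side effect" point concrete: the deflicker filter can be thought of as a small vertical blend applied across neighboring scanlines when the frame is copied out for display. This is a toy sketch, not the actual GC/Wii hardware path, and the 3-tap weights are made up purely for illustration:

```python
# Toy sketch of a deflicker-style vertical filter (illustrative only).
# The real hardware blends neighboring scanlines on copy-out; the
# 3-tap weights below are an assumption, not the hardware's values.

def deflicker(lines, weights=(0.25, 0.5, 0.25)):
    """Blend each scanline with its vertical neighbors (edges clamped)."""
    out = []
    for y in range(len(lines)):
        above = lines[max(y - 1, 0)]
        below = lines[min(y + 1, len(lines) - 1)]
        out.append([weights[0] * a + weights[1] * c + weights[2] * b
                    for a, c, b in zip(above, lines[y], below)])
    return out

# A single bright scanline (worst case for interlace flicker) gets
# spread across its neighbors, which also softens jagged edges.
frame = [[0, 0], [0, 0], [255, 255], [0, 0], [0, 0]]
softened = deflicker(frame)
```

Note that the same blend that tames single-line flicker on an interlaced display is what produces the overall softening people describe in 480p.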
 
Anti-aliasing is not a side effect of deflicker; softening is. Sure, a softening filter can make aliasing less apparent, but it also softens all detail to a similar degree.
 
If I remember correctly, the deflicker filter simply results in a softer, slightly blurred image on a 480p display. Surely most wouldn't consider this an acceptable method of anti-aliasing. ;)
 
If I remember correctly, the deflicker filter simply results in a softer, slightly blurred image on a 480p display. Surely most wouldn't consider this an acceptable method of anti-aliasing. ;)

A lot of people accepted it when nVidia did it, but I agree... I wouldn't really call it AA.
 
A lot of people accepted it when nVidia did it, but I agree... I wouldn't really call it AA.

Acceptable or not, lots of gcn games used it. Metroid series, mario golf, mario strikers, super smash bros, sonic, etc.

And for the wii games I've played, I can see that sonic and the secret rings uses the deflicker filter, as does Super Monkey Ball Banana Blitz.

I personally don't think it makes the games look too soft. It makes the games look smooth while still preserving detail.
 
Quincunx isn't just a blurring of the output. It's actual anti-aliasing. It's not as accurate as supersampling or more modern multisampling algorithms, which is why it's blurrier than 4x, but it is actually antialiasing.
 
Acceptable or not, lots of gcn games used it. Metroid series, mario golf, mario strikers, super smash bros, sonic, etc.

And for the wii games I've played, I can see that sonic and the secret rings uses the deflicker filter, as does Super Monkey Ball Banana Blitz.

I personally don't think it makes the games look too soft. It makes the games look smooth while still preserving detail.

With interlaced output, there's a noticeable loss in quality on a good television; otherwise you won't notice it, and in motion you probably won't notice it much anyway.

For progressive scan, the results are completely unacceptable. 640x480 is already blurry enough as it is; it doesn't need to be made blurrier by cutting out half the resolution, or whatever it is nintendo's line filter does.
 
Quincunx isn't just a blurring of the output. It's actual anti-aliasing. It's not as accurate as supersampling or more modern multisampling algorithms, which is why it's blurrier than 4x, but it is actually antialiasing.
Yes, a blur filter does remove aliasing from polygon edges. I don't think anyone really disputes that technical point, just whether the penalty involved is worth it, and whether such a method should be considered anti-aliasing even if that is a byproduct of what happens.
 
Yes, a blur filter does remove aliasing from polygon edges. I don't think anyone really disputes that technical point, just whether the penalty involved is worth it, and whether such a method should be considered anti-aliasing even if that is a byproduct of what happens.

QxAA isn't just a blur filter, though; it actually uses 2 samples per fragment. What makes some things look blurry is the fact that 5 samples are used with a tent filter when filtering down.

 
QxAA isn't just a blur filter, though; it actually uses 2 samples per fragment. What makes some things look blurry is the fact that 5 samples are used with a tent filter when filtering down.
Yeah, I guess mentally I think of Quincunx as "only" being the tent filter (blur) downsampling, as that is what makes it different from conventional AA techniques. It does include conventional AA, but that doesn't change the fact that it also blurs.

So get rid of the association with any particular name, and my original statement stands. Blurring was accepted by a lot of people when nVidia did it.
 
Yeah, I guess mentally I think of Quincunx as "only" being the tent filter (blur) downsampling, as that is what makes it different from conventional AA techniques.

SSAA downsamples, and as far as I know, so do pretty much all MSAA techniques. That's how AA works: you have samples of some sort, and you average their values to get a pixel. The big difference with Quincunx isn't that it averages sample values while the others do something mathematically radically different, but that 3 of the 5 samples used are shared with neighboring pixels. That's why you get blurring: data is shared among pixels, and 4 of the 5 samples used are at the very edge of the pixel rather than inside its boundary.
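The sample-sharing described above can be sketched in a few lines. This is a simplified model, assuming each pixel stores one center sample plus one corner sample (top-left), and using the commonly cited resolve weights of 1/2 for the center and 1/8 for each of the four corners; treat the weights and the function name as assumptions, not NVIDIA's exact implementation:

```python
# Toy Quincunx-style resolve. Each pixel renders 2 samples: a center
# sample and a top-left corner sample. At resolve time the other 3
# corner taps are borrowed from neighboring pixels' corner samples,
# which is exactly the data sharing that causes the blur.
# Weights (1/2 center, 1/8 per corner) are an assumption.

def quincunx_resolve(center, corner):
    """center[y][x], corner[y][x] -> resolved[y][x] (edges clamped)."""
    h, w = len(center), len(center[0])

    def c(y, x):  # clamp corner lookups at the image border
        return corner[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # corner[y][x] is this pixel's own top-left corner sample;
            # the other three corners belong to neighboring pixels.
            corners = c(y, x) + c(y, x + 1) + c(y + 1, x) + c(y + 1, x + 1)
            row.append(0.5 * center[y][x] + 0.125 * corners)
        out.append(row)
    return out
```

On a flat region the weights sum to 1, so nothing changes; across an edge, the borrowed corner taps pull neighboring values into the pixel, which is the blur being debated here.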

If I am completely wrong here, feel free to correct me.
 
Yes, that's correct, and I understand what Quincunx did. It combined 2x MSAA with a blur across adjacent pixels. I'm pretty sure what I said previously was exactly that.

Quincunx may have included "real" AA, but the fact is that nVidia did implement a blur filter, which was my original point, long lost in this nonsensical quibbling over semantics.
 