Question about anti-aliasing across consoles: N64 to now.

What is the deal with anti-aliasing on consoles? I know that not all N64 games use it, but from what I can tell the N64 has this feature built into the hardware? Someone said in another thread (that I can't find) that the N64's anti-aliasing is closer to real AA than the 360's/PS3's. Huh? What did he mean by that?

And what's the deal with the PS2? I remember an issue of EGM saying it didn't have AA, but that developers found ways around it in software. Is it like the N64, where only some games use it while others don't? I remember how ugly the first PS2 games looked, but the later PS2 games don't look as bad. Or maybe I just got used to it?

Can the 360/PS3 actually afford to use AA in more games than previous generations could? I've read somewhere that they don't have the luxury either.
 
Supersampling, as used by the Voodoo4/5 and early NVIDIA and ATI cards, is basically just rendering internally at a higher resolution and downsampling. It's the best form of image sampling, but obviously it becomes a really big system drain as you move up in resolution. Later generations of ATI and NVIDIA cards, like the Radeon 9700 series, started using sample-based FSAA (multisampling). That used less power, but how good it looked depended on the sample pattern. It enabled higher sample counts, but some things, like alpha textures, weren't antialiased properly. Newer cards use both supersampling and other types of sampling: you'd normally go with a smaller supersample factor but a higher sample count, and you get the best of both worlds with a hit not as large as pure supersampling. I'm sure someone can explain it better.

The N64 and PS2 would have to use supersampling. The PS2 had a huge fillrate for its time, though, and that's most likely how they were able to do it, although I wonder how the framebuffer would work and whether it would affect games negatively. I'm sure a PS2 developer can explain that better. The Xbox 360 uses a rotated-grid sample pattern, I think, like the R300-R520 series cards. I believe it can do 2x or 4x FSAA; however, it increases the framebuffer size.
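To make the "render high, downsample" idea concrete, here's a toy sketch of 2x2 ordered-grid supersampling: the scene is "rendered" at twice the width and height, and each 2x2 block of samples is box-filtered down to one output pixel. The function name and the grayscale-list representation are just for illustration.

```python
# Toy 2x2 ordered-grid supersampling (OGSS): average each 2x2 block
# of a double-resolution grayscale image down to one output pixel.

def downsample_2x2(hires, width, height):
    """Box-filter a (2*height) x (2*width) image down to height x width."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # average the four samples covering this output pixel
            s = (hires[2 * y][2 * x] + hires[2 * y][2 * x + 1] +
                 hires[2 * y + 1][2 * x] + hires[2 * y + 1][2 * x + 1])
            row.append(s / 4.0)
        out.append(row)
    return out

# A hard diagonal-ish edge rendered at 4x2 for a 2x1 target:
hires = [
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
]
print(downsample_2x2(hires, 2, 1))  # -> [[0.25, 1.0]]
```

The edge pixel comes out at an intermediate value instead of snapping to 0 or 1, which is exactly the smoothing effect; the cost is that the GPU had to shade four times as many samples.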

beyond3d.com has a good article on Xenos, and here it shows framebuffer sizes with FSAA on and off:
http://www.beyond3d.com/content/articles/4/5
Resolution   No AA    2x       4x
1280x720     7.0 MB   14.0 MB  28.1 MB

As you can see, each level of FSAA doubles the framebuffer size. On a console with only 512 MB of RAM, you can see that increasing the FSAA level will limit the system in other ways.
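Those figures can be reproduced (within rounding) by assuming 32-bit color plus a 32-bit Z/stencil value per sample, i.e. 8 bytes per sample, with the sample count doubling at each AA level. The helper name and the 8-bytes-per-sample assumption are mine, not from the article:

```python
# Rough arithmetic behind the ~7/14/28 MB framebuffer figures:
# assume 4 bytes color + 4 bytes Z/stencil per sample.

def framebuffer_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 * 1024)

for samples in (1, 2, 4):
    print(f"{samples}x: {framebuffer_mb(1280, 720, samples):.1f} MB")
```

At 1280x720 that gives about 7.0 MB with no AA, and each doubling of the sample count doubles the buffer, which is why 4x already eats a large chunk of memory (and more than Xenos's 10 MB of eDRAM can hold in one piece).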


Like I said, there are some here who can easily go way more advanced on this than I can, but I hope I helped you out a little until others can explain it better.
 
I thought the N64 used edge AA? It's probably still supersampling, but it had something in place to apply it selectively.
 
The N64 used edge antialiasing, using the Wu line antialiasing algorithm or something similar.

The good thing is that it's basically perfect in terms of AA gradient and involves no more sampling than normal rendering.
The bad things: it needs polygons sorted back-to-front (or order-independent rendering), and it cannot be properly applied to polygons smaller than one pixel.
There are also some small errors at line ends.
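For anyone curious what Wu-style antialiasing actually does: it plots each line step as a pair of pixels whose coverage values sum to 1, weighted by how far the ideal line sits between them. That's why the gradient is "perfect" with no extra sampling. This is a simplified sketch assuming integer endpoints (the full algorithm also handles fractional endpoints and endpoint weighting):

```python
# Simplified Xiaolin Wu line antialiasing: each column (or row, for
# steep lines) gets two pixels whose coverages sum to 1, split by the
# fractional distance of the ideal line. Integer endpoints assumed.

def wu_line(x0, y0, x1, y1, plot):
    """Call plot(x, y, coverage) for each antialiased pixel, coverage in [0, 1]."""
    steep = abs(y1 - y0) > abs(x1 - x0)
    if steep:  # iterate along y instead of x for steep lines
        x0, y0, x1, y1 = y0, x0, y1, x1
    if x0 > x1:  # always draw left to right
        x0, x1, y0, y1 = x1, x0, y1, y0
    dx = x1 - x0
    gradient = (y1 - y0) / dx if dx else 1.0
    y = y0
    for x in range(x0, x1 + 1):
        frac = y - int(y)  # how far the line sits between the two pixels
        if steep:
            plot(int(y), x, 1 - frac)
            plot(int(y) + 1, x, frac)
        else:
            plot(x, int(y), 1 - frac)
            plot(x, int(y) + 1, frac)
        y += gradient

pixels = []
wu_line(0, 0, 3, 1, lambda x, y, c: pixels.append((x, y, round(c, 3))))
print(pixels)
```

Note there's no supersampling anywhere: one pass, two writes per step. The catch, as said above, is that blending those fractional coverages against what's behind the edge only works if you draw back-to-front.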

Almost all GPUs support this method, including the Voodoo, the Vérité, anything from NVIDIA and ATI, and the GS.

On the PS2, the spinning cubes at startup and the music player use this algorithm as well.
I've always wondered why this method wasn't used on some scenes and edges on the PS2.

Some games on the PS2, like Baldur's Gate: DA, used plain brute-force SSAA.
 
jlippo said:
On the PS2, the spinning cubes at startup and the music player use this algorithm as well.
I've always wondered why this method wasn't used on some scenes and edges on the PS2.
It was, but the sorting requirements make it really tricky to use often. You either run awfully slow, or have to live with sorting artifacts that may or may not be possible to minimize the majority of the time. So it comes down to the question of which type of artifacts is less noticeable (and the PS2's main issue was texture aliasing, not edge aliasing, anyway).

I remember the Jak 2 demo used it in cutscenes (it was very clearly visible because edge-sorting artifacts popped up every now and then), but I'm not sure if they left it on for the final game.
 
Supersampling, as used by the Voodoo4/5 and early NVIDIA and ATI cards, is basically just rendering internally at a higher resolution and downsampling.

I'm pretty sure Voodoos used MSAA?
 
I'm pretty sure Voodoos used MSAA?

No Voodoo that made it to shelves ever supported multisampling. The VSA-100 (Voodoo4/5) sported the so-called "T-buffer" and rotated-grid supersampling, unlike most other GPUs of the time, which supported ordered-grid supersampling. Only its successor, the mythical Rampage (which was never released), had the M-buffer and thus rotated-grid multisampling.
 
2x2 OGSS isn't as good as 4x RGMS is for geometry.

But it can purge shader aliasing, and it does AA alpha textures.

Plus it's 100% compatible.
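The geometry difference between the two grids is easy to see if you count distinct sub-pixel offsets per axis: a rotated grid's four samples project onto four different positions along both x and y, while an ordered 2x2 grid only gives two, so near-horizontal and near-vertical edges get twice as many coverage steps from the same sample count. The coordinates below are illustrative, not any specific GPU's pattern:

```python
# Distinct per-axis sub-pixel offsets: ordered 2x2 grid vs a 4x
# rotated grid. Positions are made up for illustration.

ogss_2x2 = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rgss_4x = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

for name, pattern in (("OGSS 2x2", ogss_2x2), ("RGSS 4x", rgss_4x)):
    xs = {px for px, py in pattern}  # distinct horizontal offsets
    ys = {py for px, py in pattern}  # distinct vertical offsets
    print(f"{name}: {len(xs)} x-offsets, {len(ys)} y-offsets")
# OGSS 2x2: 2 x-offsets, 2 y-offsets -> 2 coverage steps on a near-vertical edge
# RGSS 4x:  4 x-offsets, 4 y-offsets -> 4 coverage steps from the same 4 samples
```

That's why 4x rotated-grid looks closer to 4x4 ordered-grid on near-axis edges despite shading a quarter of the samples.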

I've always wondered why NVIDIA and ATI never copied the Voodoo5's method, though, with 4 or 8 samples.

I believe the GeForce FX had a 2x RGSS mode, and that the early SuperAA modes mixed 2x RGSS with MSAA. I remember thinking that was the only AA mode NVIDIA should've even spent transistors on with the GeForce FX. Still, I liked the GeForce FX better than the 9800 Pro. I'm probably the only one here who did, however.
 