In many gameplay videos, as well as in my personal memory, N64 games with antialiasing enabled always looked messy. I originally assumed this was some quick antialiasing hack from the early days that traded sharpness for performance (maybe a full-screen blur or something similar).
However, after reading through related material, that doesn't seem to be the case.
As far as I understand (my source is the Ultra64 programming manual for the N64), and to my surprise, the N64's hardware antialiasing works very similarly to 8x MSAA with a checkerboard subpixel offset pattern. The blender unit then properly resolves polygon colors against each other and the background using the coverage values.
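To make sure I'm reading the manual right, here's my own toy model (not actual RDP code, and the function name and 3-bit coverage assumption are mine) of what I understand coverage-based blending to do: each pixel carries a coverage value counting how many of the 8 subsamples the polygon hits, and the blender mixes the polygon color with what is already in the framebuffer by that weight.

```python
def blend_edge(poly_rgb, fb_rgb, coverage, max_coverage=8):
    """Mix polygon color with framebuffer color weighted by subsample coverage.

    A fully covered interior pixel (coverage == max_coverage) keeps the
    polygon color unchanged; partially covered edge pixels get a mix.
    """
    w = coverage / max_coverage
    return tuple(round(w * p + (1 - w) * f) for p, f in zip(poly_rgb, fb_rgb))

# Interior pixel, all 8 subsamples covered: polygon color wins outright.
print(blend_edge((200, 50, 50), (0, 0, 0), 8))   # (200, 50, 50)

# Edge pixel, 4 of 8 subsamples covered: a 50/50 mix with the background.
print(blend_edge((200, 50, 50), (0, 0, 0), 4))   # (100, 25, 25)
```

If that model is right, the blending only ever touches pixels with partial coverage, i.e. pixels on polygon edges.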
If that's the case, then only polygon edges should be blurred, and we shouldn't see a messy image. However, in actual games like GoldenEye, with antialiasing on, the ground texture also appears weirdly blurred (and I believe those pixels are inside a triangle). Here's a comparison video.
So, am I misunderstanding something here? Did most developers implement their own software antialiasing solution due to other concerns (performance, perhaps? I know the N64 is hungry for memory bandwidth)? Or is this "blur" not related to antialiasing at all (maybe an incorrectly processed video output signal, or compression artifacts)?
Also, a side question: where does the N64 store its frame buffer and Z-buffer? I know the N64 used a unified memory architecture. But does the RSP include any sort of cache for quickly accessing the frame buffer and Z-buffer, or do they both live in main RDRAM, with very high access latency?