Too many variables.

Reverend said: What would that be?
It would take prefiltering of your object data so that it was (at least) below the Nyquist limit (minus a bit to allow for imperfect reconstruction by the display device) before sampling at the pixel level.

What would that take?
A practical (near, but not entirely, perfect) AA would probably get away with 128x stochastic (i.e. Poisson disc) supersampling with, say, a windowed sinc filter. For your example, that'd be, say, 100 million samples.

At, say, 1024x768, what would that be?
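The scheme described here, stochastic Poisson-disc sample positions weighted by a windowed sinc, can be sketched for a single pixel. This is a minimal illustration, not anyone's shipping resolver: the dart-throwing sampler, the Lanczos window width, and the restriction of the filter footprint to one pixel are all simplifying assumptions.

```python
import math
import random

def poisson_disc(n, min_dist, rng, max_tries=10000):
    """Naive dart-throwing Poisson-disc sampler in the unit square:
    accept a random candidate only if it keeps min_dist from all others."""
    pts = []
    tries = 0
    while len(pts) < n and tries < max_tries:
        p = (rng.random(), rng.random())
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2
               for q in pts):
            pts.append(p)
        tries += 1
    return pts

def lanczos(x, a=2.0):
    """Windowed sinc: sinc(x) windowed by sinc(x/a), zero outside |x| < a."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def shade_pixel(scene, samples, a=2.0):
    """Filtered estimate of one pixel: weight each sample by a separable
    windowed sinc centred on the pixel centre (0.5, 0.5), then normalise.
    (A full implementation would also gather samples from neighbouring
    pixels, since the Lanczos footprint is wider than one pixel.)"""
    wsum = 0.0
    csum = 0.0
    for (x, y) in samples:
        w = lanczos(x - 0.5, a) * lanczos(y - 0.5, a)
        csum += w * scene(x, y)
        wsum += w
    return csum / wsum

# A hard vertical edge through the pixel centre: exactly the
# infinite-frequency case discussed above.
edge = lambda x, y: 1.0 if x < 0.5 else 0.0

rng = random.Random(1)
samples = poisson_disc(128, 0.03, rng)
# Stochastic estimate; lands near 0.5 for an edge through the centre.
print(len(samples), round(shade_pixel(edge, samples), 3))
```

Dart throwing is the simplest correct Poisson-disc generator but scales poorly; real samplers use Bridson's grid-accelerated method or precomputed tables.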
Pass.

How can shaders help?
phenix said: Super sampling?
Simon F said: Supersampling of the original object data is not a solution for perfect AA, because edges in the data => infinite frequencies => you need an infinite number of samples.
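Simon F's chain of implications can be made precise with a one-line Fourier argument. Modelling an ideal geometric edge as the unit step $u(x)$:

```latex
\hat{u}(\omega) = \int_{-\infty}^{\infty} u(x)\, e^{-i\omega x}\, dx
               = \pi\,\delta(\omega) + \frac{1}{i\omega}
\qquad\Rightarrow\qquad
|\hat{u}(\omega)| = \frac{1}{|\omega|} \quad (\omega \neq 0)
```

The magnitude spectrum decays only like $1/|\omega|$ and never vanishes above any finite cutoff, so a step edge is not bandlimited: no finite sampling rate satisfies Nyquist, which is why prefiltering (or analytic coverage) has to happen before sampling rather than being recovered by more samples afterwards.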
Unfortunately, these monitor-making people are not going to be forced into that anytime soon.

RingWraith said: Couldn't they subvert this problem by enabling higher resolutions?
I don't know about that, what with the whole UHDTV (7680x4320 at 60 fps) thing. I'm sure when that becomes a standard, there will be an Extended-UHD spec coming up. Then when that's ratified, there'll be a HypereXtended-UHD spec... And then all our bandwidth will disappear for good, especially with all the BitTorrent downloads of HyperX-UHD episodes of Naruto.
DudeMiester said: What about calculating how much of the framebuffer pixel the pixel you're drawing will cover, and blending it into the framebuffer pixel based on the percentage it covers? Then have automatic depth sorting, via some kind of per-pixel linked list of depth layers, to ensure everything is blended properly. Of course, this does nothing for AA within objects, but you get really nice edges.