Anti-Aliasing in movies?

Metalmurphy said:
I'm aware that AA is done by rendering an image at a higher resolution and then displaying it at a lower one. That alone proves it can't be done in movies, right? How can you "render" a movie at a higher resolution? oO You can't. The closest thing you can do is upscale it, but when you show it again at the lower resolution it will just look the same as the original, right?
Here's another explanation of AA to help understand what the problem is and how you fix it. Aliasing is where the individual pixels of a pixelated display can be observed as discrete changes. This gives you the stair-step look along edges and shimmer on textures. Aliasing appears in computer graphics because each pixel has its colour determined by looking at the 3D scene and just taking the colour of the polygon underneath. If you have a black background and a white triangle, you get white pixels where that triangle is, and the jaggies. That is, you only sample one point per pixel to determine that whole pixel's colour.

Antialiasing increases the number of samples per pixel, increasing the 'information density'. If you subdivide a pixel into 4 quarters and take a sample from each of those, you can average them to portray an approximation of the information held therein. In the case of the white triangle on the black background, the pixels along the edge would be one of 4 colours, shades of grey through to white. This decreases the jump in intensity between adjacent pixels, and makes the jaggies less noticeable. It's worth pointing out that the step is still there due to discrete pixels, but when you decrease the difference between adjacent pixels, that step becomes less noticeable.
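To make that concrete, here's a minimal Python sketch of the idea, using a made-up scene of one white triangle on a black background (the triangle's vertices and the 16x16 image size are just for illustration):

```python
# Toy scene: a white triangle on a black background.
# One sample per pixel gives hard black/white steps (jaggies);
# 2x2 samples averaged per pixel gives grey shades along the edge instead.

def inside_triangle(x, y):
    """Coverage test for a hypothetical triangle with vertices (1,1), (14,2), (3,14)."""
    def edge(ax, ay, bx, by, px, py):
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    a, b, c = (1, 1), (14, 2), (3, 14)
    d1, d2, d3 = edge(*a, *b, x, y), edge(*b, *c, x, y), edge(*c, *a, x, y)
    return not ((d1 < 0 or d2 < 0 or d3 < 0) and (d1 > 0 or d2 > 0 or d3 > 0))

def shade_pixel(px, py, n=1):
    """Average n*n coverage samples spread evenly inside the pixel."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            sx, sy = px + (i + 0.5) / n, py + (j + 0.5) / n
            total += 1.0 if inside_triangle(sx, sy) else 0.0
    return total / (n * n)  # 0.0 = black, 1.0 = white, in between = grey

aliased     = [[shade_pixel(x, y, 1) for x in range(16)] for y in range(16)]
antialiased = [[shade_pixel(x, y, 2) for x in range(16)] for y in range(16)]
```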

You can increase the sampling of pixel information in various ways. One is supersampling, or downscaling, where you take a larger image and shrink it to a smaller resolution, averaging the values of adjacent pixels. From 1080p to 720p, you're only getting about 2.25 samples per pixel. An improvement, but not great. This does apply to textures too, though, and can have a good effect there. From 1080p to SDTV, you've got about 6 samples per pixel, and the IQ will be very good, but of course the fidelity of the TV is less...
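To put rough numbers on that (the 720x480 SD frame size is my assumption):

```python
# Rough samples-per-pixel you get by downscaling, i.e. how many source
# pixels get averaged into each destination pixel.

def samples_per_pixel(src, dst):
    return (src[0] * src[1]) / (dst[0] * dst[1])

print(samples_per_pixel((1920, 1080), (1280, 720)))  # 2.25 - 1080p down to 720p
print(samples_per_pixel((1920, 1080), (720, 480)))   # 6.0  - 1080p down to SD
```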

When recording a film, you get very many samples per pixel. The colour information is carried on photons, and each pixel in the image represents an awful lot of samples. When scanning film, likewise, you get very high resolution sources. The quality of sampling is so good that even on an SD display you can make out individual hairs on a person's head. This is because the human brain does a lot of filling in, and instead of seeing a collection of coloured squares, it sees averages and approximations of things it can assign a real image to. When producing a CGI movie, it is rendered with high antialiasing. Laa-yosh is the guy to comment here on the average levels and methods used. You're rendering with multiple samples per pixel to eliminate the visible jaggies. When shown on a movie player, both will look fantastic, as both have this very high level of sampling per pixel.

It is possible to get aliasing on filmed sources. Car programs are good examples where you have filming inside a car and a bright day outside. The contrast between window frame and outside brightness often leads to jaggies, especially on interlaced SD feeds. This is because the brightness contrast results in discrete light and dark pixels.

Now potentially, if you have a high-res source, like 1080p, of a filmed subject with discrete pixels like that car example, when you downsample to an SDTV display it would average out pixels and give you an extra level of AA (because the data is in low dynamic range; if the source was high dynamic range you wouldn't be any better off, but we're not there yet, so no worries!). Thus it can be said that the XB360 can add AA to a movie, because downsampling 1080p to 720p increases the number of samples per pixel and helps reduce the visibility of discrete pixels. This will only work if the jaggies are visible in the source, though. Adding AA onto an already non-aliased scene isn't going to add anything. Are there any 1080p movies with jaggies that would benefit? I don't know.

But, where downsampling increases AA, it decreases fidelity. You have fewer pixels and less detail. Your friend has the option to play 720p games on an SDTV, right, which will add lots of AA (well, it should do, but some games seem to render at SD resolutions from what I hear). Does he choose to do that, or prefer the HD output, jaggies and all? On your PC, do you prefer playing games at 800x600 with 16xAA, or at higher resolutions with less AA?

Aliasing is a side effect of pixelated displays and capture methods. As you increase pixel resolution and decrease information per pixel (which isn't an issue on film), you trade fidelity for potential aliasing. Downsampling will help reduce jaggies from a higher resolution source, but it produces a softer, blurrier image. Ideally we want high resolution and antialiasing. You can't get better AA than filming that's taking crazy numbers of samples of light per pixel, and the idea of deliberately trying to AA films by downsampling is quite frankly ridiculous. That's like choosing to watch HD movies downsampled to an SD set to add antialiasing. Films don't need AA. The source material is already antialiased, and you want to benefit from the fidelity of the higher resolution display! The only situation where that argument has a case is if aliasing appears in the movie. But then there's a case for watching HD movies on SD sets, in which case, if that's what people want, why are we wasting our time with HD?
 
Metalmurphy said:
I know you're probably thinking this is a stupid thread :p But I just need your help with something.

I just got into a fight with a guy who says that the 360 can do AA in movies. Obviously that's just ridiculous. However, he truly believes that it can, and now he's not the only one; I'm almost the only one there who knows that it's impossible (or is it?).

I'm aware that AA is done by rendering an image at a higher resolution and then displaying it at a lower one. That alone proves it can't be done in movies, right? How can you "render" a movie at a higher resolution? oO You can't. The closest thing you can do is upscale it, but when you show it again at the lower resolution it will just look the same as the original, right?

Then he said, "Well, if the source is at 1080p and the 360 shows it at 720p there will be AA." This is also false, right? I mean, that's not really AA, that's just downscaling. For AA you HAVE to render it at a higher resolution first, right?

Anyway, what I wanted you guys to help me out with (the more the merrier, because if only one guy posts they'll just say that he doesn't know anything) is by posting, in technical terms, why AA in movies is simply ridiculous. The more I try to explain to him, the more he tries to make me sound ridiculous, and everyone else thinks he's the right one oO.

Thanks in advance.

(Btw, I wasn't sure if this should be posted here.)


Ok, I think I get what your friend is trying to say.

Interlaced video does have annoying aliasing artifacts if displayed on a progressive screen like a monitor.

Probably what your friend is trying to say is that the xbox360 can deinterlace the movies and avoid these aliasing artefacts.

So what your friend is talking about is deinterlacing, not anti-aliasing.
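For the curious, here's a very rough Python sketch of the simplest kind of deinterlacing ("bob"/line interpolation); the frame representation is just an assumption for illustration, and it's not meant to be how the xbox360 actually does it:

```python
# A single interlaced field carries only every other scanline. The simplest
# fix ("bob") rebuilds the missing lines by interpolating between the field's
# own lines, which hides the comb/stair artefacts at the cost of some softness.

def bob_deinterlace(field_rows):
    """Expand one field (half the scanlines) into a full progressive frame."""
    frame = []
    for i, row in enumerate(field_rows):
        frame.append(list(row))  # the scanline the field actually carries
        if i + 1 < len(field_rows):
            # fill the missing scanline with the average of its neighbours
            frame.append([(a + b) / 2 for a, b in zip(row, field_rows[i + 1])])
        else:
            frame.append(list(row))  # duplicate the last line at the bottom
    return frame

# e.g. a 3-line field becomes a 6-line progressive frame
print(bob_deinterlace([[0, 0, 0], [255, 255, 255], [0, 0, 0]]))
```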

Bye,
Ventresca.
 
Chalnoth said:
Any non-square resolution change will degrade the quality of the image in this context. 720p content displayed at 720p resolution is likely to look better than 1080p content displayed at 720p resolution.
Taking 1080p and downsampling it to 720p can look much better than working at 720p natively. Computer generated examples with no AA or AF highlight this difference clearly:

720p native

720p downscaled from 1080p with bicubic resampling.

Obviously the difference wouldn't be so profound if plenty of AA and AF were used, and with film of real life rather than CG the difference would be even less, but there is certainly nothing about the non-square resolution change that makes it look in any way degraded in comparison to the native one.
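For reference, the downscaled shot takes nothing more exotic than this (a sketch assuming Pillow, with hypothetical "native_720p.png" / "render_1080p.png" screenshots of the same scene):

```python
from PIL import Image  # Pillow

# Native 720p render, untouched.
native = Image.open("native_720p.png")

# 1080p render of the same scene, downscaled to 720p with bicubic resampling.
downsampled = Image.open("render_1080p.png").resize((1280, 720), resample=Image.BICUBIC)

native.save("compare_native_720p.png")
downsampled.save("compare_720p_from_1080p.png")
```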

Chalnoth said:
No, it never will, not if both come from the same even higher-resolution content. Think about it this way: if you start from some very high resolution (say, 4000p or somesuch), and downsample straight to 720p, it will look better than if you first downsample to 1080p, and then downsample again to 720p.
There will be a difference when downsampling in two steps rather than one, but I can't say I've ever seen a situation where the latter inherently "looks better" as you claim. Can you provide an image to use as an example to illustrate your case?
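If you want to put such an example together, something along these lines would produce the two images to compare (a Pillow sketch again; the very high resolution "master.png" source is hypothetical):

```python
from PIL import Image  # Pillow

master = Image.open("master.png")  # some much higher resolution source

# Path 1: straight down to 720p in one step.
one_step = master.resize((1280, 720), resample=Image.BICUBIC)

# Path 2: down to 1080p first, then down to 720p.
two_step = (master.resize((1920, 1080), resample=Image.BICUBIC)
                  .resize((1280, 720), resample=Image.BICUBIC))

one_step.save("720p_direct.png")
two_step.save("720p_via_1080p.png")
```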
 
Wow, I ended up learning a lot more than what I first asked for :p

Thanks for all the amazing explanations. It seems we were both wrong: I was wrong because it actually is possible for a movie to be AA'd, in the true sense of the word; he, however, was wrong because he didn't know that either, and he was talking about the same type of AA used in games.


Once again, thanks for the wonderful explanations. I'm a lot more enlightened in this area now ;)
 
Kyle, we're talking about movies, not 3D graphics here. We're not talking about rendering 3D straight to 720p, we're talking about effectively rendering to some massive resolution, then downsampling to 720p. The image you showed will obviously show an improvement, because the 1080p source came from a higher-resolution source.

I'm sorry, but poring over images isn't going to help here at all. The difference is mathematically distinct, provided we assume that the people who designed the HD content went straight from the source art to generate each level of content, or at least used high-quality filters (which won't always be the case in consumer electronics when downsampling).
 
I directly acknowledge the difference between movies and 3D graphics; but my examples show quite clearly, contrary to your claim, a case where 720p content displayed at 720p resolution looks notably worse than 1080p content displayed at 720p resolution. So my question to you is: where is this "looks better" you speak of? Your arguments from mathematics hold no weight as long as you cannot produce real-world examples to back your claims.
 
Standard movie CG resolution is 2048*1536 pixels or so, but the top and bottom get cut off for the widescreen layout.
Antialiasing isn't simple brute-force supersampling in most cases, but rather contrast-sensitive adaptive AA. Some renderers support decoupling of shading samples and geometry samples. Also, rendering technical directors usually adjust AA levels for their scenes individually, based on the requirements of the shot (complexity of geometry, shaders, displacements, etc.).

We typically use PRMan settings of a 0.25 shading rate and 10/10 pixel samples.
As far as I know, it means that every piece of geometry gets subdivided into micropolygons smaller than 1/4th of a pixel. PRMan performs shading only on micropolygon vertices, so this means that shading is performed about 4 times per pixel. Then PRMan takes 100 samples from the whole pixel, using a stochastic pattern.
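Put as plain arithmetic (this is just my reading of those numbers, sketched in Python rather than anything PRMan-specific):

```python
# What a 0.25 shading rate and 10x10 pixel samples work out to per pixel.
shading_rate = 0.25       # target micropolygon size, in pixel-area units
pixel_samples = (10, 10)  # stochastic visibility samples in x and y

shading_evals_per_pixel = 1 / shading_rate                          # ~4 shading calls
visibility_samples_per_pixel = pixel_samples[0] * pixel_samples[1]  # 100 samples

print(shading_evals_per_pixel, visibility_samples_per_pixel)  # 4.0 100
```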

Many other movie effects use Mental Ray for rendering, like Poseidon's water or some Neo vs. Smith scenes in the Matrix sequels. MR uses adaptive AA with 1, 4, 16, 64... jittered samples per pixel, but decides the actual number based on contrast. Shading and geometry are sampled together, as far as I know.
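If it helps to see the idea, here's a toy Python sketch of that kind of contrast-driven refinement (the shade() function is a made-up stand-in for the renderer, and the thresholds are arbitrary):

```python
import random

def shade(x, y):
    """Stand-in for the renderer: a hypothetical black/white checker scene."""
    return 1.0 if (int(x) + int(y)) % 2 == 0 else 0.0

def adaptive_pixel(px, py, contrast_threshold=0.05, max_samples=64):
    """Keep quadrupling the number of jittered samples (4 -> 16 -> 64) until
    the contrast between samples drops below the threshold or the cap is hit.
    (A real renderer can start from 1 sample, but a single sample has no
    contrast to measure, so this toy starts at 4.)"""
    n = 4
    while True:
        samples = [shade(px + random.random(), py + random.random())
                   for _ in range(n)]
        contrast = max(samples) - min(samples)
        if contrast <= contrast_threshold or n >= max_samples:
            return sum(samples) / len(samples)
        n *= 4

print(adaptive_pixel(3, 7))    # interior of a checker square: settles at 4 samples
print(adaptive_pixel(3.9, 7))  # pixel straddling an edge: refines up to the cap
```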
 
"CGI for films is usually rendered at about 1.4–6 megapixels. Toy Story, for example, was rendered at 1536 × 922 (1.42MP)."

http://en.wikipedia.org/wiki/Computer-generated_imagery

Rendering resolutions for CG movies go well beyond 2048x1536, and, like Toy Story, they often are not rendered with the top and bottom cut off for the widescreen layout, but rather anamorphically, to emphasize the vertical resolution, which our eyes are more sensitive to.
 
kyleb said:
I directly acknowledge the difference between movies and 3D graphics; but my examples show quite clearly, contrary to your claim, a case where 720p content displayed at 720p resolution looks notably worse than 1080p content displayed at 720p resolution. So my question to you is: where is this "looks better" you speak of? Your arguments from mathematics hold no weight as long as you cannot produce real-world examples to back your claims.

The problem with both sides of the argument is that "looks worse" and "looks better" are subjective.

I might argue that the 720p example you posted *does* look better, because it more accurately represents the information in the picture, with nice sharp edges and not overly blurred textures. The original 1080p picture, which was down-sampled, would've had sharper features - you've lost those in downsampling.

The original statement should have probably been worded in terms of accuracy, rather than subjective quality. So long as it's subjective, there can be no single definitive answer.
 
kyleb said:
"CGI for films is usually rendered at about 1.4–6 megapixels. Toy Story, for example, was rendered at 1536 × 922 (1.42MP)."

http://en.wikipedia.org/wiki/Computer-generated_imagery

Rendering resolutions for CG movies go well beyond 2048x1536, and, like Toy Story, they often are not rendered with the top and bottom cut off for the widescreen layout, but rather anthropomorphically, to emphasize the vertical resolution, which our eyes are more sensitive to.

I suspect you mean anamorphic... anthropomorphism would be the typical content of a CGI movie, and nothing to do with the rendering itself :)
 
MrWibble said:
The problem with both sides of the argument is that "looks worse" and "looks better" are subjective.

I might argue that the 720p example you posted *does* look better, because it more accurately represents the information in the picture, with nice sharp edges and not overly blurred textures. The original 1080p picture, which was down-sampled, would've had sharper features - you've lost those in downsampling.

The original statement should have probably been worded in terms of accuracy, rather than subjective quality. So long as it's subjective, there can be no single definitive answer.
The original 1080p picture does have sharper features, which were lost in downsampling; that is objective reality. Yet all that proves is that, all else being equal, higher resolution renders have sharper features than lower resolution ones; the fact you speak of does nothing to prove that an image produced at a given resolution will look better than an image produced at a higher resolution and downsampled to the lower one. So you say that you could contest that the 720p looks better, but then again I could say the sky looks green. Granted, I'm not going to say the sky looks green, as I don't have any reasonable argument to back that up; but what about your argument?

And yeah, anamorphic, I'll fix that. :oops:
 
kyleb said:
I directly acknowledge the difference between movies and 3D graphics; but my examples show quite clearly, contrary to your claim, a case where 720p content displayed at 720p resolution looks notably worse than 1080p content displayed at 720p resolution. So my question to you is: where is this "looks better" you speak of? Your arguments from mathematics hold no weight as long as you cannot produce real-world examples to back your claims.
Yes, but the situation you presented has absolutely nothing to do with the situation at hand:
Would a movie on 1080p downsampled to 720p look better than the same movie at 720p?

I still claim there's no way it could look better, and, if anything, it would look worse. Basically, you'd get some blurring, but since the 720p content is already anti-aliased to a very high degree, you get no improvement in anti-aliasing.
 
kyleb said:
The original 1080p picture does have sharper features, which were lost in downsampling; that is objective reality. Yet all that proves is that, all else being equal, higher resolution renders have sharper features than lower resolution ones; the fact you speak of does nothing to prove that an image produced at a given resolution will look better than an image produced at a higher resolution and downsampled to the lower one.

I wasn't attempting to prove or disprove either side of your argument - I was pointing out that you were arguing over something that was subjective. The 720p picture has sharp edges - the scaled one does not, even though the 1080p source did. Therefore my argument is clear - if sharp edges are desired, the 720p one is "better". If sharp edges are not desired, but are merely an artifact of rendering, then the 1080p one is "better".

An even more arguable point would be the texture quality - in your 720p image the texture is filtered directly to screen resolution, whereas the scaled one filters to 1080p and then is re-filtered down to 720p.

Also, you're cheating - you didn't just do a simple downsample, you did a bicubic filter - if you're going to apply convolutions to the image, it would be fair to filter the 720p image too, which could potentially improve many of the artefacts.

But that's just my opinion - which was kind of my point.

So you say that you could contest that the 720p looks better, but then again I could say the sky looks green. Granted, I'm not going to say the sky looks green, as I don't have any reasonable argument to back that up; but what about your argument?

Er, the colour of the sky is provable fact (provided atmospheric conditions haven't managed to *actually* turn it green). You don't need an argument to support it being another colour, you need evidence. Whether I like one image more than another is strictly opinion, no matter what arguments I present to support it.
 
I'm sorry, MrWibble, but the downsampled image that kyleb presented is unequivocally better than the native 720p image (the native image's textures are blurrier, it has less detail, and it would show more shimmering in motion). It's just that that comparison has no bearing on this thread (which is about movies, not aliased 3D content).
 
Chalnoth said:
I'm sorry, MrWibble, but the downsampled image that kyleb presented is unequivocally better than the native 720p image (the native image's textures are blurrier, it has less detail, and it would show more shimmering in motion). It's just that that comparison has no bearing on this thread (which is about movies, not aliased 3D content).

You can list reasons why you prefer it, and you may share your opinion with 99.99% of the population on the planet - I may even agree with you all - but it's still just an opinion. There are arguable reasons that someone might prefer the 720p one, and as soon as there's any argument you can't claim it's "unequivocally better".

Opinion does not become fact no matter how many people might share it.

There is considerably less aliasing in the downscaled version - this much might be considered a fact. Whether or not that is actually better is not. Perhaps those pixels were important. Perhaps they're the nice crisp edge of something horizontal and applying AA is not an improvement.

And it does kind of relate to the thread - my straw-man argument (which no-one has actually countered) is that if you lose information in the downscaling process it's arguably better to 1:1 map a lower-resolution version rather than downscale a higher one.
 
Not in the least. All that you need to do is place objective measurements on what it is to be better. The 1080p downsampled to 720p image is better because it is a more accurate representation of the infinite-dimensional content which the hardware is attempting to display.
 
MrWibble said:
You can list reasons why you prefer it, and you may share your opinion with 99.99% of the population on the planet - I may even agree with you all - but it's still just an opinion.
We are all just subjective beings, so when it comes right down to it all we truly have is opinion; but objective reality is all around us, and rational opinions follow from it. In this case you claim:
MrWibble said:
An even more arguable point would be the texture quality - in your 720p image the texture is filtered directly to screen resolution, whereas the scaled one filters to 1080p and then is re-filtered down to 720p.
I can circle all sorts of stuff all over those shots to show quite clearly the opposite of what you claim here; but can you circle a spot where you see reason to conclude otherwise?
 
Chalnoth said:
Not in the least. All that you need to do is place objective measurements on what it is to be better. The 1080p downsampled to 720p image is better because it is a more accurate representation of the infinite-dimensional content which the hardware is attempting to display.

As I said further up - the argument was not phrased in such a way as to have a solid basis to weigh one against the other.

I might also say the 720p is better because it is a more faithful representation of what was actually rendered.

So if the goal is accuracy to the source image, the winner might be different than if the goal is accuracy to some implied target image.

When simply asked "which image is better" everyone will apply slightly different criteria in making a choice.
 
kyleb said:
We are all just subjective beings, so when it comes right down to it all we truly have is opinion; but objective reality is all around us, and rational opinions follow from it. In this case you claim:

I can circle all sorts of stuff all over those shots to show quite clearly the opposite of what you claim here; but can you circle a spot where you see reason to conclude otherwise?

It is tricky when the jpeg artefacts on both images corrupt the detail.

However I see more apparent detail in the poster to the right of center, on the 720p native version.
 