X360 and Anti Aliasing?

Ya, that's what I was thinking too: your steps are now 2 pixels tall, and they are just as visible to the eye as before.

What's more, you can take that image, upscale it 500%, and the jaggies are still there, perfectly preserved.
 
Shifty Geezer said:
That's not any degree of AA and nothing I'd be happy with. AA should blend between steps whereas that scaling shows a uniform extra pixel layer being added onto some steps, actually making them larger than one pixel deep :oops:
I left the upscaled image at its full size so you can see what is happening pixel-for-pixel (assuming your display is running pixel-for-pixel), but the steps wouldn't be any larger if you actually compared the image fullscreened on two displays of equal size, one matching the resolution of the image and one with a slightly higher native resolution. Also note that the shallow grade of the line and the stark contrast between black and white were used to demonstrate basically a worst-case scenario; lines closer to 45 degrees and with less contrast would be better smoothed by the upscaling. Again, I'm not trying to pass this off as an alternative to real AA, just pointing out the way upscaling can do a bit to smooth out the jaggies.
 
kyleb said:
I left the upscaled image at its full size so you can see what is happening pixel-for-pixel assuming your display is running pixel-for-pixel, but the steps wouldn't be any larger in the case of actually comparing an image fullscreened on two displays of equal size
True. It'd perhaps be better if your image showed a monochrome gradient at one size, and a smaller monochrome gradient upscaled to the same size. Maybe.
black and white were used to demonstrate basically a worst-case scenario; lines closer to 45 degrees and with less contrast would be better smoothed by the upscaling
Well, 45 degrees doesn't need AA IMO! It's those larger steps that really strike home.
 
fireshot said:
*nothing is free in 3d. ms refers to the "free" aa of xenos relative to normal framebuffer.
Tons of things are "free" in 3D. Linear filtering, texture mapping, perspective correction, lighting...

What's free and what's not are decided by the architecture and the software implementation (If a game is shader bound, some AA, for instance, could be activated for "free").

And about your second sentence, what do you consider a normal framebuffer?
Because from what I understand, any engine that is not compatible with the tiling that occurs in the eDRAM cannot activate any AA.
 
Vysez said:
Tons of things are "free" in 3D. Linear filtering, texture mapping, perspective correction, lighting...
They are only "free" because you have already paid for them.:rolleyes:
 
Rockster said:
I find it amusing how some will judge a console's hardware capabilities based on a few launch titles. I guess the PS2's hardware prowess was showcased by games like Ridge Racer 5, Kessen, and The Bouncer.

What I find amusing about this post is that, putting aside the deficient gameplay, The Bouncer was one of the best-looking PS2 titles for a very long time. One of the few that wasn't overwhelmed by the looks of MGS2.

In fact, one of B3D's most ardent Sony critics, a poster called CHAP, used to always cite The Bouncer as one of the graphically strong titles on the PS2. He especially liked the implementation of motion blur that gave the game a film-like look.

It's a shame the gameplay was so poorly implemented.
 
Speaking of motion blur, I've never seen a film that featured motion blur throughout the whole film all the time.
The games that use motion blur use it too much, too visibly. That's not realistic, that's not even "film-like", that's just making up for the inadequacies of the game engine.
Motion blur in games makes even less sense than lens flare.
While lens flare was used to simulate an ill effect of film and still camera lenses, as if you were viewing the game through a film camera like a movie, motion blur is used more to simulate an ill effect of the human eye... as if you were viewing the game through your own eyes, or through some other person's eyes (which is supposed to explain why motion blur on top of your own motion blur equates to excessive motion blur).
It's your eyes and brain that are already doing the motion blurring (and more realistically than any game engine ever could), so why should a game simulate something like that and just make the game look weirdly blurry?

If motion blur were used to make movement seem more fluid than it really is (as TV does), would that have an ill effect on for example shooting accuracy in fps games?
 
rabidrabbit said:
Speaking of motion blur, I've never seen a film that featured motion blur throughout the whole film all the time.
Video game use of effects is almost always painfully weak. That's not to say some devs don't try to use an effect artistically rather than realistically. But any developer wanting to use certain effects to recreate a movie look should read up a bit on optics.

On a film camera, motion blur is dependent on how fast the camera is moving and how long the exposure is. In brightly lit scenes a shorter exposure time is needed, and so there's less blur. Cameras also employ a number of internal reflection-dampening features to minimise lens flare; I can't think when I last saw a pronounced flare in a movie. And the lighting is controlled by a whole lighting department to ensure, in most cases, an even exposure without huge amounts of whiteout.

It's very funny how the things the movie industry tries to eliminate are often added in spades in games!
 
rabidrabbit said:
Speaking of motion blur, I've never seen a film that featured motion blur throughout the whole film all the time.

I think you probably have.

AFAIK film cameras have their shutters open for a fixed amount of time (~1/48th of a second) and so any movement in the shot will be blurred. Unfortunately, it really needs to be open for closer to 1/24th to make the motion smoother, but that is physically difficult.

To demonstrate this, I tried to think of a "slow moving" movie and "Gosford Park" sprang to mind. Note the snow in this picture
[image: mptv1.gif]
 
The way I believe things work is that as images are upscaled they become pixelated. To hide this effect a blur is often applied. Working with Photoshop, I know the right kind of blur, applied very conservatively so you can barely notice it, has a very pleasing look. Since the job of a scaling algorithm is to decide how to fill in the additional pixels created when the image is upscaled, some algorithms have a slight blurring effect. There are several things like this that high-quality TVs and monitors can do to improve images. Some of them are quite good at it and produce some really amazing images when compared side by side to other TVs. Some PS2 games used similar techniques to reduce "jaggies". This kind of stuff is not AA, but I still think games would look a good deal better if they incorporated these types of techniques. The CG and film industries already use this kind of stuff all the time.
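To illustrate the mechanism (a minimal sketch only; Python with numpy and Pillow is just my choice of tooling here, not anything a console scaler or TV actually runs): a hard black/white stair-step edge upscaled with nearest-neighbour keeps only pure black and white, while bilinear interpolation fills the new pixels with weighted averages of their neighbours, so each step picks up a short grey ramp.

```python
# Illustrative only: compare nearest-neighbour and bilinear upscaling
# of a hard, shallow stair-step edge.
import numpy as np
from PIL import Image

# White where x > 4*y, black elsewhere: a shallow diagonal with visible steps.
src = np.fromfunction(lambda y, x: (x > 4 * y) * 255, (8, 32)).astype(np.uint8)

img = Image.fromarray(src, mode="L")
target = (src.shape[1] * 5, src.shape[0] * 5)               # (width, height) for Pillow

nearest = np.asarray(img.resize(target, Image.NEAREST))     # steps stay razor sharp
bilinear = np.asarray(img.resize(target, Image.BILINEAR))   # steps pick up grey ramps

print("grey levels, nearest: ", len(np.unique(nearest)))    # 2 (pure black and white)
print("grey levels, bilinear:", len(np.unique(bilinear)))   # many intermediate greys
```

That grey ramp is the mild, barely noticeable blur described above; fancier kernels in good scalers mostly differ in how wide and how sharp that ramp ends up.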
 
Simon F said:
AFAIK film cameras have their shutters open for a fixed amount of time (~1/48th of a second) and so any movement in the shot will be blurred. Unfortunately, it really needs to be open for closer to 1/24th to make the motion smoother, but that is physically difficult.

Isn't shutter time dependent on exposure, and thus on light conditions? Like, dark scenes need a longer exposure time to capture enough light, and thus a longer shutter time? It sure goes like this with photography, and even though movies tend to compensate for this with a lot of extra artificial lighting, I'd think that the dependency remains...
 
I'm confused on this too, but haven't found any conclusive info. If movie cameras use a fixed shutter speed then the cameraman is stuck with controlling light through aperture and filters. That might be the case. E.g. if you want a narrow DOF in low-light conditions, throwing the aperture wide open will get that. Now if you want that same DOF in bright conditions you'll overexpose, unless you apply a neutral density filter to the lens.

It's possible but I'd have thought a variable shutter speed would be more flexible and convenient. But then you WOULD get a variable motion blur that might be worse, so I can understand why they'd use a fixed shutter speed. Be nice to have a confirmation link from a camera site but I haven't found one yet.
 
Regarding the lack of AA in launch games:
Maybe the logic on the daughter die can realistically only handle one type of filter/AA type per frame?
Could it be that the hit for changing the settings is so high that it can only be done once per frame (like if DoF is used, AA can't be used)?
 
As far as I know, both DOF and motion blur are performed in the pixel shader as condition-based blur filters in their currently used implementations, whereas AA is performed by the ROP units, which are completely different components of the GPU. Thus these features should not really have anything to do with each other.

So I'd say that it's more about having enough fill rate and/or bandwidth to run an additional pass on the framebuffer - which shouldn't be a big problem with Xenos... Someone with more practical info and experience should correct me if I'm wrong though.
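To make "condition-based blur filter" a bit more concrete, here's a purely illustrative CPU-side sketch (Python with numpy/SciPy, hypothetical names, not actual shader code or anything a real engine ships): the blur strength for each pixel comes from how far its depth value sits from the focal plane, and the filter only reads buffers the renderer has already produced - which is why the cost is mainly the fill rate and bandwidth of the extra pass rather than anything tied to the ROP/AA hardware.

```python
# Illustrative depth-of-field post-filter (not real shader code):
# per-pixel blur strength is a function of |depth - focal_depth|,
# i.e. a "condition-based blur" applied after the frame is rendered.
import numpy as np
from scipy.ndimage import uniform_filter

def depth_of_field(color, depth, focal_depth, focal_range, max_blur=5):
    """color: HxWx3 float image; depth: HxW array in the same units as focal_depth."""
    # Pre-blurred copy of the frame (a real shader would sample neighbouring texels instead).
    blurred = np.stack(
        [uniform_filter(color[..., c], size=max_blur) for c in range(3)], axis=-1
    )
    # Per-pixel weight: 0 in focus, ramping to 1 over focal_range away from the focal plane.
    weight = np.clip(np.abs(depth - focal_depth) / focal_range, 0.0, 1.0)[..., None]
    # Blend between the sharp and blurred frames - the depth "condition" drives the blur.
    return color * (1.0 - weight) + blurred * weight
```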
 
Film runs through the cinema projector at 24 fps - therefore, unless you're shooting for slow motion or a speed-up, the film has to be shot at the same rate.

The frame-rate might be "fixed", but the shutter speed can be varied - as well as the aperture and the film sensitivity.

Video cameras work much the same way, producing a fixed output frame rate but varying both shutter speed and aperture (and sensor sensitivity instead of film speed).

Jawed
 
That is, as long as the exposure time is shorter than the frame interval. You can snap a 1/4000th exposure every 1/24th of a second. But if you need to snap a 1-second exposure, you can't achieve it unless you've got multiple CCDs or shutters; in fact, you'd need 24 CCDs/shutters, each "opening" to take its image 1/24th of a second later.

The result on most video cameras of using "candle mode" is a severely jerky image that looks like you're on LSD.


On the other side, if you are snapping 1/4000th exposures, you end up with the "sports video" look (or "Gladiator" fight-sequence stuff). You get images with no motion blur, but they are less pleasing to the eye.

Frame rate is fixed. If you need a lot more exposure than the frame rate allows and, artistically, you can't use a larger aperture, then you'll have to go with faster film. But faster film has annoying grain, so you'd need to clean that up in post-processing, unless you want that effect ("Collateral").
 
Most 16-35mm film is shot with a 180-degree shutter, which equals 1/48th of a second @ 24 fps in video-camera terms. This is done because motion-picture film cameras have a rotating shutter that needs to be opened and closed up to several thousand times a minute, unlike an SLR still camera, which would suffer a shutter failure after shooting a few minutes at 24 fps. There are also different shutter angles; for example a 90-degree shutter, like in the fight scenes in Gladiator, gives a more jerky feeling since there is less motion blur between the frames.
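To spell out the arithmetic behind that figure (just the standard shutter-angle relation, added here for reference):

exposure time = (shutter angle / 360°) × (1 / frame rate) = (180° / 360°) × (1/24 s) = 1/48 s

A 90-degree shutter at 24 fps works out the same way to 1/96 s - half the blur per frame, hence the jerkier look.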
Also, the only time you see what is happening through the lens is when the shutter is opened, like an SLR, because it is still an optical system that requires a mirror (though, now that I'm thinking about it, a CCD via a split mirror seems like a more intelligent approach).

With film, you don't adjust to your location. Your location adjusts to your lens. The cinematographer will light for a certain stop throughout a scene, to take full advantage of the sharpness of the best part of the lens.

Shooting 35mm is really fun. Some of the nicer cameras have heaters for the chamois (eyepiece) to keep it from fogging. Anyone want to pay for some film stock and camera rental? I'll show you how it works :D
 
Collateral was shot with a variety of different cameras with the artificial gain pushed beyond the natural ISO rating of the digital cameras, creating digitally interpolated noise. Mann also opted for a similar effect in the opening of Ali, when he used an impossible (on film) 360-degree shutter on the digital camera, which created a strange motion-blur effect. Bleh.
 