Anti-aliasing in next-gen?

Jogi

Newcomer
The current generation of consoles only uses 2x antialiasing, mostly just to reduce interlace flicker in games. Some early PS2 games didn't even use that, and one or two games used very good-looking 4x or greater AA (Baldur's Gate).

I really think that for next-gen the minimum or standard should be 4x AA, but to get the most out of our TVs and projectors/HDTVs we want 6-9x or even 16x AA! Or 4x combined with 8-16x temporal AA (motion blur), though I don't think we'll get that with the next gen yet. In theory, if fillrate increases like processing power does (with Cell) it could be done: 4x AA with 16x motion blur requires something like 32 times the fillrate (16-32x, depending on how the AA is done). That sounds reasonable compared to the 100-1000x increase in processing speed that is expected (or should I say advertised ;)), but in reality I think the majority of the graphics power will go into beautifying: bump maps, high-res textures, etc.
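For what it's worth, here's a tiny back-of-the-envelope sketch of that fillrate estimate (the cost model is deliberately naive, the sample counts are just the ones above, and whether the spatial AA multiplies colour fillrate depends entirely on how it's implemented). Depending on that assumption you land somewhere between 16x and 64x, which is at least the same ballpark as the 16-32x guess:

Code:
# Very rough cost model: just multiply sample counts and ignore bandwidth,
# framebuffer compression and everything else real hardware actually does.
def fill_multiplier(spatial_samples, temporal_samples, supersampled=True):
    """Approximate colour-fillrate multiplier vs. one plain, un-antialiased frame."""
    if supersampled:
        # every spatial sample is shaded and written for every time sample
        return spatial_samples * temporal_samples
    # multisampling-style AA: roughly one shaded colour per pixel per time sample
    return temporal_samples

print(fill_multiplier(4, 16, supersampled=True))   # 64
print(fill_multiplier(4, 16, supersampled=False))  # 16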

What do you think is coming? Are we going to get these, or just a small bump in image quality compared to, say, Xbox? (Quantity will be increased, no question about that.)
 
I think what you ask for would result in current games' visuals at higher quality, instead of more interesting game visuals. I also don't like 'temporal' AA. Blurred images that lack sharpness are of no interest!
 
Shifty Geezer said:
I think what you ask for would result in current games' visuals at higher quality, instead of more interesting game visuals. I also don't like 'temporal' AA. Blurred images that lack sharpness are of no interest!
I'm going to be blunt here, but you are mistaken.

In moving images, (proper) temporal AA will actually improve the perceived quality of the video.

There's a very simple practical example. If you watch some real-life action, e.g. sport, that has been captured on a standard video camera, the playback looks smooth. This is because it effectively has temporal AA. Note that stills from that video will, of course, look blurred.

If, OTOH, you watch video shot on a high-speed camera where the "shutter" time is only a small fraction of the frame period, the stills from that video will be sharp, but the video playback looks jerky and unpleasant.
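To make the camera analogy concrete, here's a minimal NumPy sketch (purely illustrative, not how any real renderer or camera works): it "renders" a small moving box several times inside one frame interval and averages the sub-frames. The shutter fraction controls how much of the frame period the samples span, i.e. how blurred each still is.

Code:
import numpy as np

def render_box(x_pos, width=64, height=48, box_w=6):
    """Binary image of a vertical box at horizontal position x_pos."""
    img = np.zeros((height, width))
    x0 = int(round(x_pos))
    img[:, max(0, x0):min(width, x0 + box_w)] = 1.0
    return img

def render_frame(t_frame, velocity=20.0, shutter=0.5, n_time_samples=8):
    """Temporal AA: average sub-frame renders spread over the open-shutter interval."""
    offsets = np.linspace(0.0, shutter, n_time_samples, endpoint=False)
    return np.mean([render_box((t_frame + dt) * velocity) for dt in offsets], axis=0)

sharp = render_frame(0.0, shutter=0.02)    # short "shutter": crisp still, jerky in motion
blurred = render_frame(0.0, shutter=0.8)   # long "shutter": smeared still, smooth in motion
print(sharp[24, :20])                      # hard 0/1 edge
print(blurred[24, :20].round(2))           # fractional coverage smeared along the motion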
 
Interesting point - I had never considered that before, but it makes a lot of sense. OTOH, some motion blur is good, but it should never be overdone (not saying you were implying that, of course).

On the HD subject, it shouldn't be automatically assumed that higher AA orders will be desirable just because the HD resolutions are higher. The whole point of going to higher resolutions is that you won't have to rely so heavily on AA to mask high-frequency artifacts and moiré. OTOH, if you are increasing the screen size considerably while the rendering resolution is largely out of your control (the game developers will have the ultimate say there), then stronger AA can become a beneficial feature. That is only because the screen size has out-scaled the resolution available in the program material.

Aside from all of this, the whole case for AA may become moot with typical HD displays of the current and upcoming technology generation. Everything is going to be scaled and filtered internally in the display's digital hardware anyway, so that it "fits" the native resolution of the panel. That is essentially a form of AA in itself, albeit not necessarily the most elaborate form (that would depend on the fanciness of the model, I suppose). At that point, feeding a heavily AA'd image into an HD panel just so it can be filtered again internally in a cacophony of digital trickery (to make a relatively "puny" 720p feed look good on a 40+ inch display) may not be the best prescription for image quality.
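The "scaling is already a form of AA" point is easy to demonstrate with a trivial downscale: averaging blocks of source pixels into one output pixel is just a box filter, which is also the simplest supersampling resolve. This is only a NumPy sketch, not what any actual TV scaler does (real scalers upscale as often as they downscale and use fancier kernels):

Code:
import numpy as np

def box_downscale(img, factor):
    """Average factor x factor blocks of pixels - a plain box-filter resolve."""
    h, w = img.shape
    h, w = h - h % factor, w - w % factor          # crop so the blocks divide evenly
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A hard diagonal edge rendered at 4x the target resolution...
hi = np.fromfunction(lambda y, x: (x > 1.5 * y).astype(float), (64, 64))
lo = box_downscale(hi, 4)      # ...comes out of the "scaler" with graded edge pixels,
print(lo[8, 10:16].round(2))   # much like a 4x4 ordered-grid supersampling resolve would give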

Just something to think about before you (referring to people, in general) go high diving off the AA cliff as a matter of technological progress.
 
Simon F said:
Shifty Geezer said:
I think what you ask for would result in current games' visuals at higher quality, instead of more interesting game visuals. I also don't like 'temporal' AA. Blurred images that lack sharpness are of no interest!
I'm going to be blunt here, but you are mistaken.

In moving images, (proper) temporal AA will actually improve the perceived quality of the video.

There's a very simple practical example. If you watch some real-life action, e.g. sport, that has been captured on a standard video camera, the playback looks smooth. This is because it effectively has temporal AA. Note that stills from that video will, of course, look blurred.

If, OTOH, you watch video shot on a high-speed camera where the "shutter" time is only a small fraction of the frame period, the stills from that video will be sharp, but the video playback looks jerky and unpleasant.

Isn't there a difference between:

1) Motion blur - the actual multisampling of a fast-moving piece of geometry within the scene (an extremely expensive rendering approach, which doesn't even require any consideration for AA, unless it results in some high-contrast pixel colour differences)?

2) Temporal AA.

EDIT: Motion blur in itself could still require anti-aliasing if the resulting image is high contrast. So motion blur is not a solution for AA? ...just a thought... I might be wrong... *shrug*
 
Uh, real temporal AA (not what ATI calls temporal AA) allows for motion blur. I'm not sure how you can say they are different, exactly, since if you do motion blur then you are doing temporal AA.

Now, I will agree that motion blur is often greatly exaggerated to go beyond temporal AA (like light trails that slowly fade out, which could represent an exponential falloff of the signal instead of the instantaneous cutoff current video cards do).

Edit: If you mean motion blur to include only the technique of multisampling over several time points for one frame, then yes, Motion Blur => Temporal AA, but Temporal AA does not imply Motion Blur. To me, though, motion blur just means the blur we see in a single frame of an object over the entire volume it carves out. Really, I'm for a new type of renderer that renders 3D objects instead of 2D objects; it wouldn't be anywhere near as fast, of course, but you'd get nice motion blur that way.
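The difference between a "proper" box average over the shutter interval and an exponential light trail is easy to show with simple frame blending. This is only a toy sketch of the two accumulation rules (the decay constant and the little test scene are made up), not how any shipping renderer does it:

Code:
import numpy as np

# A diagonal line sweeping across a tiny 8x8 "screen", one step per time sample.
frames = [np.roll(np.eye(8), k, axis=1) for k in range(8)]

def box_blur(subframes):
    """Box-filtered motion blur / temporal AA: uniform average of the time samples."""
    return np.mean(subframes, axis=0)

def trail_blend(subframes, decay=0.7):
    """Light-trail blending: old positions fade out exponentially instead of being cut off."""
    acc = np.zeros_like(subframes[0])
    for f in subframes:
        acc = decay * acc + (1.0 - decay) * f
    return acc

print(box_blur(frames[:4])[0].round(2))   # equal weight on the positions inside one frame
print(trail_blend(frames)[0].round(2))    # recent positions dominate, older ones linger faintly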
 
Cryect said:
Uh, real temporal AA (not what ATI calls temporal AA) allows for motion blur. I'm not sure how you can say they are different, exactly, since if you do motion blur then you are doing temporal AA.

Now, I will agree that motion blur is often greatly exaggerated to go beyond temporal AA (like light trails that slowly fade out, which could represent an exponential falloff of the signal instead of the instantaneous cutoff current video cards do).

Edit: If you mean motion blur to include only the technique of multisampling over several time points for one frame, then yes, Motion Blur => Temporal AA, but Temporal AA does not imply Motion Blur. To me, though, motion blur just means the blur we see in a single frame of an object over the entire volume it carves out. Really, I'm for a new type of renderer that renders 3D objects instead of 2D objects; it wouldn't be anywhere near as fast, of course, but you'd get nice motion blur that way.

Yeah, well, I meant that motion blur as it's done today on current hardware is not necessarily the same as temporal AA.
But temporal AA can be called motion blur from most points of view...
Not sure I was very clear :D
 
Temporal AA is no substitute for supersampling; it doesn't work on static or slow-moving objects. Isn't there an alternative, though, that only samples the edges of objects?
 
There are two kinds of temporal AA, and I think you are confusing them.
There is the tweening/smoothing between frames to make motion seem smoother.
Then there is regular "stochastic" antialiasing with a different sampling pattern from frame to frame. That one depends on high framerates to achieve the effect, because if you drop below 60fps the edges of the AA'd objects start to shimmer.
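Here's a minimal sketch of that second kind (the sample pattern is made up, and it assumes the display/eye effectively averages consecutive frames, which is exactly why it falls apart at low framerates): each frame uses only two jittered sub-pixel samples, but the offsets alternate between even and odd frames, so two consecutive frames together cover a 4-sample pattern.

Code:
import numpy as np

# Two 2-sample half-patterns that together form a 4-sample pattern.
PATTERNS = [
    [(0.25, 0.25), (0.75, 0.75)],   # even frames
    [(0.75, 0.25), (0.25, 0.75)],   # odd frames
]

def coverage(px, py, offsets, edge=lambda x, y: x > 0.5 * y + 2):
    """Fraction of this frame's sub-pixel samples that land inside a test edge."""
    return float(np.mean([edge(px + ox, py + oy) for ox, oy in offsets]))

def frame(frame_index, size=8):
    offs = PATTERNS[frame_index % 2]
    return np.array([[coverage(x, y, offs) for x in range(size)] for y in range(size)])

even, odd = frame(0), frame(1)
print(even[3])                # only 0, 0.5 or 1 is possible with 2 samples per frame
print(((even + odd) / 2)[3])  # averaged over two frames: quarter steps, like 4x AA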
 
randycat99 said:
On the HD subject, it shouldn't be automatically assumed that higher AA orders will be desirable just because the HD resolutions are higher. The whole point of going to higher resolutions is that you won't have to rely so heavily on AA to mask high-frequency artifacts and moiré.

Screen resolution doesn't determine the need for AA by itself. Rather, it's the number of texels or polygons per pixel. Obviously, for a given amount of rendering power, as screen resolution goes up, the number of texels or polys per pixel drops, and the need for AA decreases. But with the next generation, we should hope that rendering power increases more than screen resolution...
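A back-of-the-envelope illustration of that trade-off (the polygon budget here is an arbitrary, made-up number; only the ratio matters): with a fixed per-frame geometry budget, going from 640x480 to 1280x720 cuts the polygons available per pixel to roughly a third.

Code:
# Purely illustrative: a fixed (hypothetical) per-frame polygon budget spread over more pixels.
poly_budget = 1_000_000

for name, (w, h) in {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,d} pixels, {poly_budget / pixels:.2f} polys per pixel")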
 
If the R500 chip in Xenon really has 32 pipelines at 500MHz, I can see 6x FSAA being done in games. I'm doing that now in Half-Life 2 at 1600x1200 on my X800 XT PE.

Add in "temporal" AA, which looks great as long as you have the power to do it, and we could see some stellar, jaggie-free games.
 
Motion blur -can- be very distracting in most kinds of games, especially if it's applied to the full scene.

Motion blur in a racing game would probably be cool to an extent. The same on enemies in an FPS would probably be very bad (remember the 3dfx Quake3 demo). Motion blur as an effect for powerups/spells would also be cool (Giants).
But for most cases I'd rather aim for 60fps. There's no motion blur in real life either, just infinitely high FPS :)

So it really depends on the implementation.
 
Laa-Yosh said:
Motion blur -can- be very distracting in most kinds of games, especially if it's applied to the full scene.

Motion blur in a racing game would probably be cool to an extent. The same on enemies in an FPS would probably be very bad (remember the 3dfx Quake3 demo). Motion blur as an effect for powerups/spells would also be cool (Giants).
But for most cases I'd rather aim for 60fps. There's no motion blur in real life either, just infinitely high FPS :)

So it really depends on the implementation.

That's very arguable. Try waving your hand fast enough in front of you and you'll see motion blur right there. Obviously it's nothing like artificial videogame motion blur, but it's still motion blur. It all depends on our eyes and how fast our brains can process what we see.
 
That's okay as long as your eyes don't follow your moving hand. But as soon as they do, it's the hand that becomes sharp and everything else gets blurred. Sort of... I know the moving hand might not be the best example, but think about standing near a highway and looking at the cars passing by.

So this is our problem: the graphics engine can't guess where your eyes are focused and what should be motion blurred. It's a good bet that you're not interested in the fine details of the race track passing under your car, but it's just as certain that you'd rather aim at a distinct enemy than at a blurry mess in Quake...
 
jvd said:
If the R500 chip in Xenon really has 32 pipelines at 500MHz, I can see 6x FSAA being done in games. I'm doing that now in Half-Life 2 at 1600x1200 on my X800 XT PE.

Add in "temporal" AA, which looks great as long as you have the power to do it, and we could see some stellar, jaggie-free games.

If games on next-gen consoles only look twice as good as Half-Life 2, I will be very disappointed.
 
If games on next-gen consoles only look twice as good as Half-Life 2, I will be very disappointed.

Ouch. Well, at least you will have a long time to get used to that disappointment.
 
Laa-Yosh said:
...
So this is our problem: the graphics engine can't guess where your eyes are focused and what should be motion blurred. It's a good bet that you're not interested in the fine details of the race track passing under your car, but it's just as certain that you'd rather aim at a distinct enemy than at a blurry mess in Quake...

Hmm... I suppose this rules out any form of useful depth-of-field focusing in games, then... I was hoping that could be tied in nicely with motion blurring. I know certain autofocus camera viewfinders use zones to detect retina motion; maybe some integrated EyeToy device could use that functionality, or perhaps this is Nintendo's Revolution in gaming! :p
 
hovz said:
If games on next-gen consoles only look twice as good as Half-Life 2, I will be very disappointed.

Yeah, me too. If we only get 2x Doom 3-quality graphics, then I'll be disappointed as well. Call me crazy, but remember that promo trailer for KH2? I'm actually expecting something closer to that :D

Yeah, I'm an optimist, but at least I'm not going to the far end of pessimism. Like Chris Rock said at the Oscars: if what you want is FF FMV and all you can get is slightly improved HL2/Doom 3 graphics, then WAIT!!
 