Could there be any PS3 games with no AA?

Lysander said:
I think that on a 720p HDTV it is optimal and near perfect; more than that is a waste of system resources.

Wouldn't higher AA at 720p create a "foggy" picture, destroying the sharpness of edges and lines?


Hmmm, no.
Supersampling works on every part of the picture, which can result in somewhat "washed-out" textures, but MSAA works in a different way: no foggy picture, just less aliasing.
 
So? I doubt we will ever see SSAA (edited, didn't mean MSAA) on any recent AAA title on any platform.
And I agree AA is a waste of resources (either technique).
 
doob said:
So? I doubt we will ever see MSAA on any recent AAA title on any platform.

?
MSAA is almost the standard in PC, 360, and Xbox1 games; SSAA is almost dead, there are only a few titles that use it.
 
Griffith said:
Supersampling works on every part of the picture, which can result in somewhat "washed-out" textures,
What on earth are you talking about, "washed-out textures"? Supersampling provides a greater information density per pixel, improving the accuracy of every pixel in the image. It's supersampling that enables an SDTV image to capture individual hairs on a person's head. High SS eliminates jaggies (within the limits of resolution; a 160x120 screen is going to look bad no matter how much SS you apply!), both on polygon edges and textures, including transparent textures, and eliminates texture shimmer. MSAA is only preferred because of the demands of SS. If you had a choice between the two with no other impacts, SS would be selected every time.
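To make that concrete, here's a minimal sketch of ordered-grid supersampling (illustrative Python only; shade(x, y) is a hypothetical callback returning the scene colour at a floating-point screen position, not anything from this thread):

```python
# Minimal ordered-grid supersampling sketch (illustrative, not any real API).
# shade(x, y) is a hypothetical renderer callback returning a colour value.

def supersample_pixel(shade, px, py, n=4):
    """Average an n x n grid of sub-pixel samples for the pixel at (px, py)."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            # Sample at the centre of each of the n x n sub-cells, so edges,
            # textures, and shader output are all re-evaluated, not just
            # polygon coverage as in MSAA.
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            total += shade(sx, sy)
    return total / (n * n)  # box filter: every sub-sample weighted equally
```

MSAA, by contrast, runs the shader once per pixel and only resolves coverage per sample, which is why it can't help texture shimmer the way supersampling can.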
 

Griffith said:
?
MSAA is almost the standard in PC, 360, and Xbox1 games; SSAA is almost dead, there are only a few titles that use it.

4xMSAA is only a 4x bandwidth load, but 4xSSAA is a 4x bandwidth load and a 4x shader load, because 4xSSAA shades 4x as many pixels. So either the shader budget per pixel drops to 1/4 (fewer effects), or the frame rate drops to 1/4 (not so good for play). That is why so few SSAA games are available.
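A back-of-the-envelope version of that arithmetic (illustrative numbers, not measurements of any actual GPU):

```python
# SSAA multiplies shading work by the sample count; MSAA leaves shading alone
# and only multiplies the raw per-pixel sample count (bandwidth side).

def frame_cost(pixels, samples, supersampled):
    shading = pixels * (samples if supersampled else 1)   # shader invocations
    raw_samples = pixels * samples                        # pre-compression BW
    return shading, raw_samples

pixels = 1280 * 720
print(frame_cost(pixels, 1, False))  # no AA:  (921600, 921600)
print(frame_cost(pixels, 4, False))  # 4xMSAA: (921600, 3686400) - 1x shading
print(frame_cost(pixels, 4, True))   # 4xSSAA: (3686400, 3686400) - 4x both
```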
 
Vysez said:
The topic is: Could there be any PS3 games with no AA?

Let's not derail the thread into a discussion about the properties of the renderer of a particular game. Especially when that particular game is not even running on PS3.


On topic, 4xMSAA is hardly what one could consider able to correct all the edge aliasing and shimmering.
8xMSAA, or better yet 8xSSAA, is what I'd consider excellent AA. 16xSSAA could be labelled "perfect", if you ask me, that is.
But I'm a known "impossible to please" kind of enthusiast, so...

Also, and that's a point some people tend to forget when they make parallels with the PC desktop space: one has to remember that the PPI on an HDTV is generally lower than the PPI one can get on a monitor.
 
What do they use in CG movies? Or is it that the temporal motion blur is good enough to get rid of all jaggies?
 
ihamoitc2005 said:
4xMSAA is only 4x bandwidth

It is usually a lot less than that, because the data is easily compressible. The actual bandwidth increase is about 20%-50%.
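A sketch of why compressed 4xMSAA costs nowhere near 4x the raw bandwidth (the general idea only; this is not any specific GPU's compression scheme, and the 10% edge figure is an assumption for illustration):

```python
# Fully-covered pixels can store one colour flagged as "same for all samples";
# only pixels straddling a polygon edge pay for distinct samples.

def msaa_bandwidth_factor(edge_fraction, samples=4):
    interior = (1.0 - edge_fraction) * 1    # one colour, replicated by flag
    edges = edge_fraction * samples         # all samples stored separately
    return interior + edges

# If ~10% of pixels sit on a polygon edge, 4xMSAA needs about 1.3x the
# bandwidth of no AA - right in the 20%-50% range quoted above.
print(msaa_bandwidth_factor(0.10))  # ~1.3
```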
 
nintenho said:
What do they use in CG movies? Or is it that the temporal motion blur is good enough to get rid of all jaggies?

CG movies don't have a fixed rate of AA - renders are usually adjusted to use as much supersampling as it takes to get rid of the aliasing artifacts; but some of this adjustment may be automatic when adaptive (contrast-sensitive) antialiasing can be used. Sophisticated texture filtering is also heavily used to help with texture/shader aliasing, without losing any detail.

Typical AA settings for Mental Ray may use 1 to 64 samples.
PRMan is a bit more complicated, as shading is independent from rasterising (it depends on the tessellation density of the micropolygon grids), and the number of pixel samples may easily reach 100 or even more. The shading rate (average number of shading samples per pixel) is usually somewhere between 4 and 25, but it can also reach 100 when needed (I know for a fact that some Harry Potter 3 VFX scenes have indeed used such a high shading rate).
Raytracing effects like ambient occlusion, reflection occlusion, subsurface scattering and actual reflections/refractions/shadows also have adjustable sampling quality. Some of these effects need more than 1 sample per pixel; others can do with undersampling, which means that rays are fired for every Nth pixel and the results are interpolated and filtered.
One sample may need to fire several rays, though; for example, occlusion and SSS-type effects have to examine every direction at each sample point, so dozens, maybe even hundreds of rays have to be fired into the environment, using quasi-random (stochastic) patterns.

Temporal AA, which we prefer to call motion blur, is another thing which needs additional sampling to keep the image free of aliasing artifacts. The same goes for DOF. So these won't cure aliasing by themselves... which is why many target-render CGI videos from console manufacturers had some 'jaggies' that people interpreted as a sign of them being realtime...
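A sketch of the adaptive (contrast-sensitive) antialiasing idea described above: start with a few stochastic samples per pixel and keep adding more only while nearby samples disagree. The min_samples, max_samples, and threshold knobs are illustrative, not any particular renderer's parameters:

```python
import random

def adaptive_pixel(shade, px, py, min_samples=4, max_samples=64,
                   threshold=0.05):
    """shade(x, y) is a hypothetical renderer callback returning a value."""
    colours = [shade(px + random.random(), py + random.random())
               for _ in range(min_samples)]
    while len(colours) < max_samples:
        mean = sum(colours) / len(colours)
        if max(abs(c - mean) for c in colours) < threshold:
            break                      # low contrast: this pixel has converged
        colours.append(shade(px + random.random(), py + random.random()))
    return sum(colours) / len(colours)
```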
 
Laa-Yosh said:
CG movies don't have a fixed rate of AA - renders are usually adjusted to use as much supersampling as it takes to get rid of the aliasing artifacts...
So... it's way too much for next-gen consoles, huh? :cry:
 
CG quality, using high SS etc., won't be with us for several generations, if ever, unless someone invents a brand new way of rendering totally unlike anything we've seen before. If you're looking at 16x supersampling, you're talking 4x the cost of 4xSSAA, which already brings GPUs to their knees. You'd need 16x the shading power of the current console GPUs to render what the current consoles do now, only with 16xSSAA, including 16x the BW (ouch!). The moment you increase scene complexity you're asking for multiples more power beyond that. And that's the rub: next-gen consoles improve on geometry, shaders, and special effects, and in doing so up the demands of SSAA. Only if graphics stayed static at current levels would performance eventually catch up enough to AA the whole thing.
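The "rub" expressed as a toy model (the growth rates are made up for illustration, not predictions): full-scene 16x SSAA needs 16x headroom, and that headroom only accumulates if scene cost grows slower than GPU power.

```python
def gens_until_16x_ssaa(gpu_growth, scene_growth, max_gens=100):
    headroom = 1.0
    for gen in range(1, max_gens + 1):
        headroom *= gpu_growth / scene_growth  # surplus left after the scene
        if headroom >= 16:
            return gen
    return None  # never catches up within max_gens

print(gens_until_16x_ssaa(8, 1))  # static scenes: 2 generations
print(gens_until_16x_ssaa(8, 8))  # scenes scale with the GPU: None
```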
 
london-boy said:
No, it wouldn't. AA is not a blur filter, which would cause a loss of detail. AA is completely different.

So you are telling me the N64 blur filter was not a form of AA :LOL:
 
Acert93 said:
So you are telling me the N64 blur filter was not a form of AA :LOL:
Wowza, where have you been?!

I get lovely edge antialiasing on my slowish LCD monitor thanks to temporal blurring. Blurring isn't an AA technique, though, in that it doesn't seek to increase the data density of the pixels, so I think it's fair to say AA doesn't blur, but blur can provide an AA-like effect.
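The distinction as a toy sketch: a blur only re-mixes pixels that were already computed, while supersampling evaluates the scene at new sub-pixel positions. Here shade(x, y) is again a hypothetical renderer callback and image is a 2D list of greyscale values:

```python
def blur3x3(image, x, y):
    """Average the 3x3 neighbourhood of finished pixels: no new scene
    information enters the image, so edges soften and detail is lost."""
    rows, cols = len(image), len(image[0])
    vals = [image[j][i]
            for j in range(max(0, y - 1), min(rows, y + 2))
            for i in range(max(0, x - 1), min(cols, x + 2))]
    return sum(vals) / len(vals)

def ssaa_pixel(shade, x, y, n=2):
    """Average n*n fresh sub-pixel evaluations of the scene itself: the
    information density per pixel goes up, so edges resolve rather than
    merely soften."""
    return sum(shade(x + (i + 0.5) / n, y + (j + 0.5) / n)
               for i in range(n) for j in range(n)) / (n * n)
```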
 
doob said:
So? I doubt we will ever see SSAA (edited, didn't mean MSAA) on any recent AAA title on any platform.
And I agree AA is a waste of resources (either technique).

We could always see it if a particular game is running in 480p ;)
 
Shifty Geezer said:
Blurring isn't an AA technique, though, in that it doesn't seek to increase the data density of the pixels, so I think it's fair to say AA doesn't blur, but blur can provide an AA-like effect.

Not sure about the definition, but AA certainly does "blur", at least in the sense that the image appears softer and some detail is lost in the AAing process - which is why most people think PS2 has a sharper output in games that don't use AA, while Xbox/GC with AA often have a softer image. It's all semantics though. :D
 
Shifty Geezer said:
CG quality, using high SS etc., won't be with us for several generations, if ever, unless someone invents a brand new way of rendering totally unlike anything we've seen before...
10B+ transistor multi-GHz GPUs x 100 trillion in parallel, with exabytes of memory - would that suffice? Or would that not allow for beyond-present-day CG realtime graphics? ;)
 
zidane1strife said:
10B+ transistor multi-GHz GPUs x 100 trillion in parallel, with exabytes of memory - would that suffice?
When that setup finds its way into a home console, let me know :p

Actually, you'd also need an incredible memory system. It's often overlooked, but bigger, faster processors are only as fast as the memory that feeds them, and memory isn't progressing in specs as fast as the processors.
 
Phil said:
Not sure about the definition, but AA certainly does "blur", at least in the sense that the image appears softer and some detail is lost in the AAing process - which is why most people think PS2 has a sharper output in games that don't use AA, while Xbox/GC with AA often have a softer image. It's all semantics though. :D
The image may appear softer, but antialiasing adds detail, and that's not semantics.
 
Those PS2 games with unchecked aliasing and interlacing weren't any sharper, just very messy. Actually, PS2's more common use of lower-resolution front or back buffers makes games blurrier and less sharp.
 