What is the maximum AA the PlayStation 3 can have while still running at 60 fps?

Status
Not open for further replies.
And where did I say they would not be playable? I could have sworn I said "issues" depending on your rig, and we all know that not all PC gamers have the same hardware when a game is released; even with high-end parts, results are all over the place. Besides, the topic in question here is the PS3, not the Xbox 360.

By playable, I mean playable with no issues. You suggested that turning on "advanced settings" like AA in new PC games "slows them to a crawl" even on high-end hardware, and that these "driver issues" take months to be resolved.

That simply isn't the case. Where AA is an option in the game, or in the engine if not exposed by the game itself, I can't think of an example from the last few years where turning it on with a high-end GPU would slow the game to a crawl due to driver issues (assuming you are reasonable with the resolution and the amount of AA applied). I'm sure there are examples, but they will be the exception rather than the rule.

Generally, a game that supports MSAA will be usable with it on a high-end GPU from the day it's released.
 
On the Sony PlayStation 3, I have to say that IMHO, whether we see 60 FPS + 1080p + *xAA (* meaning 2x, 4x, 8x, etc.) is going to depend greatly on the game developer: what level of dev tools they are using, how talented they are at using them, and later how experienced they become with them.

PS3 hardware only supports 4x MSAA at most. If you want to go beyond that, you'll have to use a hybrid software/hardware approach. It's questionable whether the effort would be worth it, as far as bang for the buck is concerned.
 
I can see 60 FPS vs. 30 FPS as a very relevant argument due to its impact on certain genres and games where speed is important.

However, when it comes to 1080p or AA above 4x, I don't know that we've seen real, tangible benefits that would make these relevant checklist features. For now they seem like marketing tools and fodder for loyalist arguments.

Of course a game will look better rendered at 1080p than at 720p when all other variables are equal. But rendering at 1080p comes at a cost, and if staying at 720p and spending the savings elsewhere consistently produces better visuals than are possible at 1080p, why should a dev target 1080p?

While the pool of 1080p games is still small and a larger number of 1080p titles needs to be released to get an accurate measure, there is little evidence that 1080p is going to boost visual quality beyond what is presently possible. Most of the increase in visual quality that we have seen in past, current, and future next-gen games comes irrespective of resolution.

The same argument could be made for AA.
 
OT (sorta): Would you happen to know any papers on implementing software AA?

Oversampling: you render at 3840x2160 with 4x MSAA, downscale to 1920x1080, and you obtain full-HD 16x AA with incredible IQ (oversampling is very good for IQ), but it's not really feasible.
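The arithmetic works out because each final pixel averages a 2x2 block of rendered samples, and each of those already carries 4 MSAA samples, giving 4 * 4 = 16 samples per displayed pixel. A minimal sketch of the downscale step in plain Python (the function name and list-of-rows image layout are my own illustration, not any real API):

```python
def downsample_2x(image):
    """Box-filter a 2x-supersampled image down to half resolution.

    `image` is a row-major list of rows of grayscale sample values.
    Each output pixel averages one 2x2 block; combined with 4x MSAA
    per sample, a displayed pixel effectively blends 16 samples.
    """
    height, width = len(image), len(image[0])
    out = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)  # average the 2x2 block
        out.append(row)
    return out
```

An edge that crosses mid-block comes out as an intermediate gray, which is exactly the smoothing effect being described.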
 

I meant more about selective edge blurring or anti aliasing that BioShock and Gears of War appear to use at times. SSAA isn't that ingenious. :p
 

IIRC, there was a paper talking about HPF (edge detection) on z-buffer and using it as a mask for LPF (blur) .

I think the real interesting work would be doing selective supersampling possibly with software aid.
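The HPF-on-z-buffer idea can be sketched quickly: detect depth discontinuities, then low-pass only the flagged pixels. This is a toy illustration in Python, not the actual method from any paper; the function names, the simple neighbor-difference "HPF", and the 3x3 box "LPF" are all my own simplifications:

```python
def depth_edge_mask(depth, threshold=0.1):
    """High-pass filter (simple forward difference) on a depth buffer.

    Flags a pixel when its depth differs from a right or lower
    neighbor by more than `threshold`. Returns a boolean mask.
    """
    h, w = len(depth), len(depth[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    mask[y][x] = mask[ny][nx] = True
    return mask

def selective_blur(color, mask):
    """Low-pass filter (3x3 box blur) applied only where the mask is set."""
    h, w = len(color), len(color[0])
    out = [row[:] for row in color]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue  # non-edge pixels keep full sharpness
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        acc += color[ny][nx]
                        n += 1
            out[y][x] = acc / n
    return out
```

The appeal is that interior texture detail is untouched; the drawback, raised later in the thread, is that blur only fakes AA for stills.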
 
Insane levels of AA aren't required this gen; 4x AA would be sufficient IMHO.

With resolutions up to 720p and 1080p, 6x AA and beyond is just overkill IMO, as the higher resolution already eliminates jaggies.

Look at Resistance, which has 2x AA (I thought it had 4x AA): it's almost impossible to pick out a jaggy in that game. In fact, it's the cleanest next-gen game out right now,

hands down.
 
OT (sorta): Would you happen to know any papers on implementing software AA?
Hugues Hoppe got quite nice-looking results with discontinuity edge overdraw.
It basically re-renders edges with anti-aliased lines to get almost perfect-looking AA.
The biggest problems would be edge bloating and the need to sort the lines, although I'm sure the SPUs are quite fast at that kind of work.

http://research.microsoft.com/~hoppe/overdraw.pdf (.PDF)
http://research.microsoft.com/~hoppe/ (Homepage)

If this wouldn't be feasible as such because of the line rendering, it could be better to create edge fins and render using those.
Fins would give more control over the thickness and the ability to easily have proper shaders working on the edges as well.
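The first step in either variant is finding the discontinuity edges to overdraw. A simplified sketch of silhouette detection in Python (my own illustration under an orthographic-view assumption; the real technique also overdraws boundary and crease edges, which this omits):

```python
def silhouette_edges(vertices, triangles, view_dir):
    """Find silhouette edges: edges shared by one front-facing and one
    back-facing triangle relative to `view_dir`. These are among the
    discontinuity edges that edge overdraw re-renders as AA'd lines.

    vertices: list of (x, y, z) tuples.
    triangles: list of (i, j, k) index triples with consistent winding.
    view_dir: viewing direction (orthographic camera assumed).
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    facing = {}  # undirected edge -> set of facings seen ({True, False})
    for i, j, k in triangles:
        n = cross(sub(vertices[j], vertices[i]), sub(vertices[k], vertices[i]))
        front = dot(n, view_dir) < 0
        for e in ((i, j), (j, k), (k, i)):
            facing.setdefault(tuple(sorted(e)), set()).add(front)
    # A silhouette edge borders both a front-facing and a back-facing face.
    return [e for e, f in facing.items() if f == {True, False}]
```

The edge list would then be sorted and drawn as anti-aliased lines (or extruded into fins) over the normally rendered frame.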

IIRC, there was a paper talking about HPF (edge detection) on z-buffer and using it as a mask for LPF (blur) .

I think the real interesting work would be doing selective supersampling possibly with software aid.
The biggest problem with finding edges from buffers is that you lose the ability to find thin polygons, since rasterization doesn't fill every pixel a polygon passes through in screen space.
Things like phone lines and small details might end up with no AA at all.
 

I don't see how polygon phone lines can escape edge detection on the z-buffer.

If the wire is visible, its pixel depths are in the z-buffer. Since the difference between the background and the wire will be huge, it will almost certainly show up in edge detection.
Of course, I expect most engines to draw those lines with some kind of AA anyway, unlike real meshes.

That said, I believe blur is not the right approach. It fakes antialiasing only for stills and does little about flickering in motion.
 
4x AA is pretty much the standard for consoles this gen... BTW, is the 360 able to handle 4x AA at 1080p?

Yes, if you're rendering a single spinning teapotahedron.
No, if you want to render a scene with the complexity of an RTS, the texture resolution of Gears, the view distance of Halo and the lighting of a deferred renderer game like Killzone.

Now will you please ask the question in a way that a) you couldn't answer it yourself and b) it actually makes any sense?
 
Well, first of all, my asking that question obviously puts me in a position where I cannot answer it myself, as I am not familiar with the capabilities of the Xbox 360; I research the PS3 more anyway. And who are you to say that the 360 cannot have 4x AA at 1080p without seriously backing up your claim?
 

If a developer wants to make sure his game has 4x AA, he can always drop the polygon count. So instead of creatures with 1000 polys and no AA, you have creatures with 100 polys and 4x AA. It's always a tradeoff.
 
It's the same answer as for the PS3 question: it all depends on what you're drawing and how. The XB360 could do Echochrome with 16x AA at 1080p at 60 Hz, if that's any use to you.

On the whole, for average games, the XB360 is as unlikely to support 4x MSAA at 1080p as the PS3 is. The demands of 1080p are great enough without having to worry about AA. On the XB360 you'd also end up with a lot of tiles due to the limited eDRAM, which adds a fair bit of extra overhead.
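To put rough numbers on the tiling overhead (a back-of-the-envelope sketch; it assumes 8 bytes per sample, i.e. 32-bit color plus 32-bit depth/stencil, against the 360's 10 MB of eDRAM):

```python
# Rough predicated-tiling estimate for the Xbox 360's 10 MB eDRAM.
# Assumes 32-bit color + 32-bit depth/stencil = 8 bytes per sample.
EDRAM_BYTES = 10 * 1024 * 1024

def tiles_needed(width, height, msaa_samples, bytes_per_sample=8):
    """How many eDRAM tiles a framebuffer of this size would require."""
    framebuffer = width * height * msaa_samples * bytes_per_sample
    return -(-framebuffer // EDRAM_BYTES)  # ceiling division
```

Under these assumptions, 720p with 4x MSAA needs 3 tiles, while 1080p with 4x MSAA balloons to 7, and geometry overlapping tile boundaries has to be resubmitted for each tile it touches.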
 
The demands of 1080p are great enough without having to worry about AA.

<rant>
If it were up to me, I'd yank 1080p support from the SDKs and force everyone to use 1280x720 for games. 1080p is just a disservice to gamers on this round of game/TV hardware; its only purpose is as a marketing bullet point. Unless you are making a vector-based Asteroids game, a 720p game will look better than a 1080p one any time.
</rant>

Ahh, that feels so much better. Apologies for the OT post; I've been wanting to say that for a while now ;)
 

I agree with this man.

1080p really is overkill, and it's really limiting developers, forcing them to sacrifice the extra detail they could otherwise add to their games.
 

I totally agree. The jump from last gen's 480p to 720p seems fair enough at 3x the pixels. 1080p seems too much too soon IMO, not to mention the seemingly few people who even own a "True/Full" 1080p device.

For this round, 1080p would be nice for those XBL/PSN games that don't require a lot. Heck, it'd be nice if that were the standard, with downscaling for supersampling on everyone else's displays!
 
Why stop at 720p? Why not 600p, at least for the consoles that support almost-free scaling?
What about 480p? There are many platformers and third-person action-adventures where I would prefer a more realistic (or CGI) look as opposed to a 720p polygonal look with worse lighting.

On the other hand, there are FPSs where I would prefer higher resolution for better targeting.

720p shouldn't be set in stone.
 