What is the maximum AA the PlayStation 3 can have and still run at 60fps?

Do the console holders force people to go 1080p? If not, why can't we let the developers decide? Super Stardust works well at 1080p, right?
 
I don't see how polygon phone lines can escape edge detection on the z-buffer.

If it is visible, its pixel depths are in the z-buffer. Since the difference between the background and the wire will be huge, it will almost certainly show up in edge detection.
Of course I expect most engines to draw those lines with some kind of AA anyway, unlike real meshes.

That said, I believe blur is not the right approach. It fakes antialiasing only in stills and does little about flickering in motion.
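To make that concrete, here is a minimal sketch of the depth-discontinuity test I mean (NumPy, purely illustrative; a real implementation would live in a shader or on the SPUs, and the 0.01 threshold is an arbitrary assumption). A thin wire's depth differs sharply from the background behind it, so its pixels get flagged for extra AA samples:

```python
import numpy as np

def depth_edge_mask(depth, threshold=0.01):
    """Flag pixels whose depth differs sharply from a neighbour's.

    depth     -- 2D array of z-buffer values (normalized depth)
    threshold -- minimum depth delta treated as an edge (assumed value)
    """
    dx = np.abs(np.diff(depth, axis=1))   # horizontal depth deltas
    dy = np.abs(np.diff(depth, axis=0))   # vertical depth deltas

    mask = np.zeros(depth.shape, dtype=bool)
    mask[:, :-1] |= dx > threshold        # left pixel of a horizontal step
    mask[:, 1:]  |= dx > threshold        # right pixel of a horizontal step
    mask[:-1, :] |= dy > threshold        # upper pixel of a vertical step
    mask[1:, :]  |= dy > threshold        # lower pixel of a vertical step
    return mask                           # True where extra AA samples would go
```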


This would be the example I tried to explain.

These are the edges the detection finds.

No matter how you do the edge detection, with Z, normal or ID buffers, there is no way to reliably find the proper polygon edges, as there just isn't enough information about them.
Yes, SSAA on the detected pixels would make it look better by reducing contrast, but it would still lose the lines in those big gaps.
This would be the case with any very small polygons or objects.

I do agree that normal blur isn't any good.
We already have DoF and other blur effects; why would we want to blur the part of the screen that should be in focus?
 
You only have to look at the most impressive-looking titles to see that 720p is the optimal resolution this generation, leaving plenty of resources to budget for the rest:

Gears
Heavenly Sword
CoD4
Halo3
Bioshock
etc..

Of course, games like Virtua Tennis and such should be able to hit 1080p due to their environment and gameplay.

However, if you want all the bells and whistles in an epic sort of setting, 720p will have to do this gen. Stop drinking the marketing Kool-Aid...

No dev should worry about ticking the 1080p resolution checkbox when developing a game meant to compete with the list above.
 
You only have to look at the most impressive-looking titles to see that 720p is the optimal resolution this generation, leaving plenty of resources to budget for the rest:
Are you suggesting that the kind of TV they're played on doesn't matter?
 
Do the console holders force people to go 1080p? If not, why can't we let the developers decide? Super Stardust works well at 1080p, right?

Because it's a marketing checkbox feature which devs feel pressured into ticking. The graphical quality of the game thus suffers for the sake of a higher resolution which few can even use, all because of marketing pressure.

If you take 1080p out of the equation altogether then you remove that pressure on devs and everyone is free to get the best out of 720p that they possibly can.
 
Because it's a marketing checkbox feature which devs feel pressured into ticking.
Why though? All the other devs are rendering at 720p, and they accept 1080p has issues. It's not like if you target 720p you'll be in a minority and all the other devs at 1080p will laugh at you! For the disc-case checkbox, you only need upscaling to support 1080p output. In reviews, you're not going to be marked down on account of rendering at 720p either. The only place 1080p matters is in fan forums, where it's held aloft as evidence of hardware superiority, and appeasing fan forums isn't something developers should care about. It's not going to net you more sales, especially if your game would look a lot better at 720p.

AFAICS the only incentive for 1080p comes from Sony wanting to promote the new hardware standard and sell 1080p TVs. MS followed with a catch-up output option. MS aren't serious about supporting it, as they appreciate 720p is better in most instances. Sony games will likely only support it where it can be used partly as promotion of 1080p sets, e.g. LAIR and GTHD can be called 'True HD' and help market 1080p, and are funded by Sony. Most PS3 titles will remain 720p as the developers don't benefit from targeting 1080p.
 
You only have to look at the most impressive-looking titles to see that 720p is the optimal resolution this generation, leaving plenty of resources to budget for the rest:

Gears
Heavenly Sword
CoD4
Halo3
Bioshock
etc..

Of course, games like Virtua Tennis and such should be able to hit 1080p due to their environment and gameplay.

However, if you want all the bells and whistles in an epic sort of setting, 720p will have to do this gen. Stop drinking the marketing Kool-Aid...

No dev should worry about ticking the 1080p resolution checkbox when developing a game meant to compete with the list above.

I hope the games that are already out are not indicative of the best developers can achieve with the Xbox 360 and PS3. After seeing how games improved over the life of the PlayStation 2 (and reading about all the struggles devs face with the Xbox 360 and PS3 hardware), I'd be very disappointed if any of those games listed are the best we get with this generation of consoles.
 
Why though? All the other devs are rendering at 720p, and they accept 1080p has issues. It's not like if you target 720p you'll be in a minority and all the other devs at 1080p will laugh at you! For the disc-case checkbox, you only need upscaling to support 1080p output. In reviews, you're not going to be marked down on account of rendering at 720p either. The only place 1080p matters is in fan forums, where it's held aloft as evidence of hardware superiority, and appeasing fan forums isn't something developers should care about. It's not going to net you more sales, especially if your game would look a lot better at 720p.

AFAICS the only incentive for 1080p comes from Sony wanting to promote the new hardware standard and sell 1080p TVs. MS followed with a catch-up output option. MS aren't serious about supporting it, as they appreciate 720p is better in most instances. Sony games will likely only support it where it can be used partly as promotion of 1080p sets, e.g. LAIR and GTHD can be called 'True HD' and help market 1080p, and are funded by Sony. Most PS3 titles will remain 720p as the developers don't benefit from targeting 1080p.

I agree with you. However, if "True HD" starts finding its way onto retail boxes for games and becomes accepted by the general public, then there might be an issue there.

I know a lot of us wouldn't want devs' motivation to go from "Let's make the best-looking game we can on the PS3 (or 360)" to "Let's make the best-looking PS3 (or 360) game we can at 1080p."
 
At 1080p, don't the current consoles have less fill rate per pixel than Xbox 1? People are missing that little tidbit. Add AA on top of that and things are gonna get ugly.
 
At 1080p, don't the current consoles have less fill rate per pixel than Xbox 1? People are missing that little tidbit. Add AA on top of that and things are gonna get ugly.

Yeah, it's by a rather significant factor, but then the NV2A had that bandwidth problem... :???: Bandwidth aside, Xenos doing 720p at 4 Gp/s is already a better situation than the NV2A doing 480p at 932 Mp/s, but at 1080p it's quite a bit worse.
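Rough numbers, taking the peak figures above at face value and ignoring bandwidth, overdraw and RSX entirely, just to show the per-pixel headroom at 60 fps:

```python
# Back-of-envelope fill budget per displayed pixel per frame at 60 fps.
# Peak fill rates are the ones quoted above; everything else is ignored.
FPS = 60

configs = {
    "NV2A  @ 480p":  (932e6, 640 * 480),    # ~932 Mpixels/s over 307,200 pixels
    "Xenos @ 720p":  (4e9,   1280 * 720),   # 4 Gpixels/s over 921,600 pixels
    "Xenos @ 1080p": (4e9,   1920 * 1080),  # 4 Gpixels/s over 2,073,600 pixels
}

for name, (fillrate, pixels) in configs.items():
    budget = fillrate / (pixels * FPS)      # times each pixel can be touched per frame
    print(f"{name}: ~{budget:.0f} fills per pixel per frame")
# Prints roughly 51, 72 and 32 respectively, i.e. 1080p drops below Xbox 1 at 480p.
```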
 
Slightly OT, but how much more expensive is 720p/2xMSAA with twice as much AF compared to 1080p/1xAA, if you assume relatively few overdrawn/transparent pixels?

Is it possible for a developer aiming for 1080p on the PS3 to calculate IQ settings for a 720p mode that costs the same or less, or would a second rendering mode always increase the amount of testing needed?
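My own rough back-of-envelope guess (assuming MSAA shades once per pixel but writes colour/z once per sample, and leaving the AF texture cost out entirely):

```python
# Crude comparison under the assumptions stated above; real cost depends on the engine.
px_720p  = 1280 * 720        #   921,600 pixels
px_1080p = 1920 * 1080       # 2,073,600 pixels

shaded_720p_2x  = px_720p        # 2xMSAA still shades once per pixel
written_720p_2x = px_720p * 2    # but writes two colour/z samples per pixel
shaded_1080p    = px_1080p       # 1080p with no AA shades and writes once per pixel

print(shaded_1080p / shaded_720p_2x)    # ~2.25x the shading work at 1080p
print(shaded_1080p / written_720p_2x)   # but only ~1.13x the samples written
```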
 

This would be the example I tried to explain.

These are the edges the detection finds.

No matter how you do the edge detection, with Z, normal or ID buffers, there is no way to reliably find the proper polygon edges, as there just isn't enough information about them.
I was talking about visible pixels only.
If one uses open, thin polygons that may not be visible from some angles/distances, even hardware FSAA wouldn't be much help either.
 
I am slightly confused by the consoles having less pixel fill rate at 1080p than Xbox 1. I think Lair looks amazing at 1080p and things aren't ugly, in my honest opinion... can someone show me how costly it is to render at 1080p? I thought the PS3 was good enough to render at that resolution and that its potential isn't really fully tapped yet.
 
I am slightly confused by the consoles having less pixel fill rate at 1080p than Xbox 1. I think Lair looks amazing at 1080p and things aren't ugly, in my honest opinion... can someone show me how costly it is to render at 1080p? I thought the PS3 was good enough to render at that resolution and that its potential isn't really fully tapped yet.

At 1080p you're pushing ~6.75x the pixels of 480p (2,073,600 vs 307,200) and 2.25x the pixels of 720p (921,600).
 
I was talking about visible pixels only.
If one uses open, thin polygons that may not be visible from some angles/distances, even hardware FSAA wouldn't be much help either.
FSAA wouldn't be much better, but at least it would give some visual cues as to what might be in there.
If one rendered those edges as lines into the mask buffer, your method would work like FSAA even in that case.
Even without this, the method you described should indeed work nicely in most normal cases. :)
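As a rough sketch of that idea (line_coverage here is a hypothetical 1-bit mask that the thin geometry would be rasterized into as lines; the threshold is again an arbitrary assumption):

```python
import numpy as np

def selective_aa_mask(depth, line_coverage, threshold=0.01):
    """Combine depth-edge pixels with pixels covered by thin geometry drawn as lines.

    depth         -- 2D z-buffer array
    line_coverage -- 2D bool mask, True where wires/antennas were rasterized as lines
    threshold     -- assumed minimum depth delta counted as an edge
    """
    edges = np.zeros(depth.shape, dtype=bool)
    edges[:, :-1] |= np.abs(np.diff(depth, axis=1)) > threshold  # horizontal depth steps
    edges[:-1, :] |= np.abs(np.diff(depth, axis=0)) > threshold  # vertical depth steps
    return edges | line_coverage   # AA samples go wherever either mask is set
```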
 
<rant>
If it were up to me, I'd yank 1080 support from the SDKs and force everyone to use 1280x720 for games. 1080 is just a disservice to gamers on this round of game/TV hardware; its only purpose is as a marketing bullet point. Unless you are making a vector-based Asteroids game, a 720p game will look better than a 1080p one any day.
</rant>

Ahh, that feels so much better. Apologies for the OT post, I've been wanting to say that for a while now ;)

Thank you. I would do pretty much the same, maybe even force every game down to 640x480. Then use the performance and bandwidth savings to generate better, more complex graphics and/or 60 fps.
 
I think the question asked by the OP was pretty much answered by the second post of the thread.
 