720p w/ 4xAA = 1080p w/o AA?

Would the average Joe be able to tell the difference between the two resolutions if 4xMSAA is enabled at 720p? The whole point of AA is to reduce jaggies, and higher resolutions also reduce jaggies, so..........

Just thinking out loud. ;)
 
Higher resolution doesn't really reduce jaggies; it gives you smaller jaggies, but more of them. At angles near horizontal or vertical, higher res can make the jaggies even more noticeable: instead of, for instance, one stair-step to approximate the angle, you wind up with four or however many.

That said, you can put the average Joe in front of a computer and play with resolution, AA and AF all day long, and get the response that it all looks about the same.
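
To put rough numbers on the "smaller jaggies but more of them" point, here is a quick sketch. It assumes a straight edge at a shallow slope and square pixels; the slope and resolutions are just illustrative choices, not measurements from any game.

```python
# Count the stair-steps a near-horizontal edge produces at two render
# resolutions. Assumes square pixels and a perfectly straight edge; the
# slope of 1/20 is an arbitrary illustrative choice.

def stair_steps(width_px, slope=1 / 20):
    """Return (number of 1-pixel steps, length of each step in pixels)."""
    rise_px = width_px * slope            # total vertical change across the screen
    steps = int(rise_px)                  # roughly one 1-pixel step per unit of rise
    step_len = width_px / max(steps, 1)   # horizontal run of each step
    return steps, step_len

for width, label in [(1280, "720p"), (1920, "1080p")]:
    steps, step_len = stair_steps(width)
    print(f"{label}: ~{steps} steps, each ~{step_len:.0f} px long")
```

On the same physical screen, the 1080p edge breaks into roughly 1.5x as many steps; each step still spans about the same number of pixels, but every pixel is physically smaller, so the jaggies shrink without going away.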
 
Jaggies break the illusion, IMO.

That is why I run games on my PC at a lower resolution, to ensure I get a solid framerate with AA. 4x AA at 1280x960 looks better than 1600x1200 without it, IMO.

AA is an image quality (IQ) issue. Not all games benefit equally, but games with shiny surfaces (cars, plastic, leather, CGI-style characters like those in Toy Story) really look better with AA. Grittier, more fully coloured palettes seem to fare better with less AA, IMO, although lighting and shadowing have a lot to say about that.
 
It really depends on the viewer. It's all personal preference; for example, I much prefer a 1600x1200 image without AA to a less detailed 1200x1000 with AA, whereas Acert prefers the opposite.
 
It depends.

If you give someone an X360 with 4x FSAA at 720p and let them play it for 5 or 6 months and then hand them a PS3 at 1080p, they may not like the change.

With my 9700 Pro I was playing at 1024x768 with 4x FSAA; then I moved to a new screen and went to 1600x1200 without any FSAA, and the difference was apparent to me.

Your eyes get used to something, and then nothing else looks right.

How 2x/4x AA at 720p looks compared to 1080p, I dunno. But I'm sure they will look close enough that no one is going to care.
 
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423&p=2

Because of the extremely large amount of bandwidth available both between the parent and daughter die as well as between the embedded DRAM and its FPUs, multi-sample AA is essentially free at 720p and 1080p in the Xbox 360. If you're wondering why Microsoft is insisting that all games will have AA enabled, this is why.

ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.
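
As a rough sanity check on the "essentially free" claim, here is a back-of-the-envelope peak fill-rate sketch. The figures it assumes (8 ROPs at 500 MHz, 4 bytes of colour plus 4 bytes of Z/stencil per sample, read-modify-write traffic) are commonly quoted numbers and format guesses, not anything confirmed in this thread.

```python
# Peak ROP traffic at 4x MSAA versus the quoted eDRAM bandwidth.
# Assumptions (not from this thread): 8 ROPs, 500 MHz clock, 4 B colour +
# 4 B depth/stencil per sample, and each sample read and written once.

ROPS = 8                  # pixels per clock (assumed)
CLOCK_HZ = 500e6          # GPU clock (assumed)
SAMPLES_PER_PIXEL = 4     # 4x MSAA
BYTES_PER_SAMPLE = 4 + 4  # colour + depth/stencil (assumed formats)
RMW = 2                   # read + write per blended/Z-tested sample

peak_bytes_per_clock = ROPS * SAMPLES_PER_PIXEL * BYTES_PER_SAMPLE * RMW
print(f"Peak multisample traffic: {peak_bytes_per_clock * CLOCK_HZ / 1e9:.0f} GB/s")
```

On those assumptions the per-sample traffic peaks around 256 GB/s, which is in the same ballpark as the bandwidth quoted for the eDRAM and its logic; only the resolved one-sample-per-pixel image ever has to travel back to the parent die, which is how the AA cost gets hidden.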
 
@ Shapeshifter:

This was mentioned elsewhere, but the chip outputting the signal to the TV will only support up to 720p/1080i so even if the GPU theoretically could do it, the system can't.
 
Mordecaii said:
@ Shapeshifter:

This was mentioned elsewhere, but the chip outputting the signal to the TV will only support up to 720p/1080i so even if the GPU theoretically could do it, the system can't.

When did they give us encoder/decoder info? I haven't seen anything.
 
Mordecaii said:
@ Shapeshifter:

This was mentioned elsewhere, but the chip outputting the signal to the TV will only support up to 720p/1080i so even if the GPU theoretically could do it, the system can't.

But that means a game could be rendered at 1080p w/ 4x AA into the framebuffer, and then the output chip MS has would take that framebuffer image and scale it to whatever the display accepts. The X360 is supposed to have a very powerful scaler, so I could imagine it being capable of doing this.

@ Shapeshifter: It came out in yesterday's news. The chip is separate and was made by MS's WebTV division. Check FiringSquad, I believe.
 
Mordecaii said:
I do have to wonder, however... at what resolution does the 10MB eDRAM become an issue in terms of being too small?

I believe it was stated that it can tile the framebuffer out to main memory. I *thought* I remembered reading that in some of the new info.
 
It very well could have been, but even that would cause slight latency issues, would it not? I'm just wondering if it would be optimal for MS to stay at 720p instead of 1080p...
 
Mordecaii said:
It very well could have been, but even that would cause slight latency issues, would it not? I'm just wondering if it would be optimal for MS to stay at 720p instead of 1080p...
Well, I certainly imagine it's optimal, since the 360 can't do 1080p. ;) It will likely be "optimal" for the PS3 as well, since 1080p is found only on a bare minimum of REALLY expensive HDTVs right now and will take a number of years to filter out the way 720p/1080i has up to this point. (Monitors are another matter, I suppose, but I don't think we'll see many people wanting to use one as their primary output. Maybe as a spare on a second port?)

As much as Sony has referred to it as a "baseline", I don't really think they want to force it on all developers, and developers, as always, want to use the available resources where it makes the most sense for THEM. Perhaps the PS3's overall design and accompanying tools will make it less of a trade-off, but there has to be SOME, ne?
 
Acert93 said:
Mordecaii said:
I do have to wonder, however... at what resolution does the 10MB eDRAM become an issue in terms of being too small?

I believe it was stated that it can tile the framebuffer out to main memory. I *thought* I remembered reading that in some of the new info.

Tiling will come with a cost, of course. I mean, the PS2 can tile its back buffer to get around its 4MB of eDRAM as well, but it just wasn't worth it. Granted, the X360 *probably* has hardware assist, but it can't do magic. Otherwise, MS would have gone with even less eDRAM to save costs and tiled to get 720p. But they spec'd the eDRAM to hold enough for a good-looking 720p image without tiling, and offer tiling as a theoretical means of achieving 1080p (with the same level of anti-aliasing).
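
For a rough sense of the buffer sizes involved, here is a sketch assuming 4 bytes of colour plus 4 bytes of Z/stencil per sample; the actual formats and any compression could change these numbers, so treat them as illustrative only.

```python
import math

# Multisample framebuffer footprint versus a 10 MB eDRAM, and the number of
# tiles needed if the buffer has to be split. Assumes 4 B colour + 4 B
# depth/stencil per sample and no compression; illustrative only.

EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 4 + 4

def footprint_mb(width, height, samples):
    return width * height * samples * BYTES_PER_SAMPLE / (1024 * 1024)

def tiles_needed(width, height, samples):
    return math.ceil(width * height * samples * BYTES_PER_SAMPLE / EDRAM_BYTES)

for width, height, label in [(1280, 720, "720p"), (1920, 1080, "1080p")]:
    for aa in (1, 2, 4):
        print(f"{label} {aa}x: {footprint_mb(width, height, aa):5.1f} MB "
              f"-> {tiles_needed(width, height, aa)} tile(s)")
```

On those assumed numbers a plain 720p buffer (about 7 MB) fits in the eDRAM outright, while adding MSAA or stepping up to 1080p pushes the footprint past 10 MB and tiling has to pick up the slack.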

But this 1080p vs 720p debate is rather academic. You probably won't notice the difference except on very high-quality 60" screens.
 
Could the 1-5% hit that ATI talked about for 4x AA at 720p / 1080i be related to tiling?

I am not sure 1080p is a big deal. It sounds nice in theory, but look at the time to market for HD TVs in general. When you consider that 720p and 1080i are going to be the mass market low end this year, and probably even next year, 1080p would be a feature for so few people.

Even then, I believe Sony said they support 1080p. I do not believe they said all games had to support it. I would be really surprised if they did, but maybe the RSX can do it.

But it does look like there are some tradeoffs. MS focused on every game REQUIRING 2x AA and preferably 4x AA; Sony said not a word about it.

Sony focused on 1080p, while MS does not support it currently and does not plan to, instead focusing on the 720p and 1080i standards.

Btw, does the current 1080p standard support 60Hz? I heard it is currently only 24 and 30.

*IF* it does not support 60Hz, then I would take 720p at 60Hz over 1080p at 30Hz any day, and not just for console gaming but in general. From what people have said, 60Hz is on the agenda... but maybe that is why MS is not currently supporting it, because it is not yet an accepted standard?

Can anyone throw light on that issue?
 
The early 1080p displays are geared towards 30Hz. In fact, they may not accept anything greater than 1080i inputs, although it's supposedly trivial to move to 1080p inputs.

The main reason is that film content is 24 fps and video content is 30 fps at 1080p.

Early HD-DVD or Blu-ray players may only have 1080i outputs even though the content is stored at 1080p. Some people were surprised to hear the PS3 would have 1080p output (presumably for BR video).

Most CRT HDTVs support only 1080i because 720p would be more costly. In the last year or two, with the growing popularity of DLP and LCD (digital displays, obviously), there are more native 720p displays on the market, but they're at the high end, so the installed base is probably still mostly 1080i.

But as others noted, DLPs and other digital displays are pushing forward to 1080p, again on very high-end SKUs ($$$). For the next year or two, there isn't going to be much adoption of 1080p displays unless prices come down dramatically.

So it's going to be expensive for consumers, but the other question is: how much more expensive will it be to support 1080p games?

Also, is it confirmed that the PS3 won't be able to do any AA without a substantial hit to the framerate? IOW, is the premise of this thread valid? Maybe PS3 games won't be 1080p, but maybe they will have AA?
 
How would 720p with 4x AA compare to 1080p that gets downscaled to 720p? I can't afford a large 1080p TV any time soon, so I am getting a 720p TV this fall.
 
quest55720 said:
How would 720p with 4x AA compare to 1080p that gets downscaled to 720p? I can't afford a large 1080p TV any time soon, so I am getting a 720p TV this fall.

I am going to use my 19" LCD (I do not even have a TV). I would be very interested to know if the PS3 supports VGA or standard DVI (or does HDMI work with DVI?).
 
quest55720 said:
How would 720p with 4x AA compare to 1080p that gets downscaled to 720p? I can't afford a large 1080p TV any time soon, so I am getting a 720p TV this fall.

If a game supports 1080p, you can be damn sure it will also have 720p support.

So the best thing to do would be to set your console and game to run at 720p; that way it's running at the native res of your TV and will look its best.
 
Most TVs have decent enough scalers that the supersampling you get from rendering at a higher resolution and downscaling will provide better image quality than running at the native resolution.
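
As a sketch of why downscaling behaves like supersampling, here is a toy example using a plain box filter; real TV scalers use better kernels, and the 1080p-to-720p case is a non-integer 1.5x ratio, so a 3x factor is used here just to keep the block averaging simple.

```python
import numpy as np

# Toy downscale-as-supersampling: render a hard-edged shape at a higher
# resolution, then average blocks of samples down to the display resolution.
# A box filter stands in for the TV's scaler; real scalers use better kernels.

def box_downscale(image, factor):
    """Average non-overlapping factor x factor blocks of a 2-D image."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor                 # crop to a multiple of factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# A hard-edged diagonal rendered at 3x the target resolution...
hi_res = np.fromfunction(lambda y, x: (x > 2.0 * y).astype(float), (2160, 3840))
# ...averaged down: pixels along the edge take intermediate values, so the
# edge comes out anti-aliased, much like ordered-grid supersampling.
lo_res = box_downscale(hi_res, 3)
print(lo_res.shape)           # (720, 1280)
print(np.unique(lo_res)[:5])  # fractional coverage values along the edge
```

The non-integer ratio is exactly why the quality of the TV's scaler matters: with a clean kernel you get the edge smoothing, with a poor one you can get ringing or softness instead.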
 