I guess it is obvious that 540p with 4xMSAA will look better than 720p with no AA when played on an SDTV/EDTV?
Yeah, because of the AA. A game that's 720p with no AA ends up as 480p with no AA on an SDTV, while 540p with 4xMSAA ends up as 480p with 4xAA.
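As a back-of-envelope check of that claim (my numbers, not from the thread): count how many rendered samples feed each pixel of a 640x480 SD output. Bear in mind that MSAA only adds extra samples along polygon edges, so counting it as a flat 4x flatters the 540p case; even so, the 720p resolution advantage mostly disappears once both images are scaled down to 480 lines.

[code]
# Back-of-envelope sample counts once both framebuffers are scaled to SD.
# Purely illustrative: 640x480 output assumed, and MSAA is treated as 4
# samples everywhere even though it really only supersamples geometry edges.
sd_pixels = 640 * 480

paths = {
    "720p, no AA":  (1280 * 720, 1),   # (rendered pixels, samples per pixel)
    "540p, 4xMSAA": (960 * 540, 4),
}

for name, (pixels, samples_per_pixel) in paths.items():
    total_samples = pixels * samples_per_pixel
    print(f"{name}: {total_samples:,} rendered samples, "
          f"~{total_samples / sd_pixels:.2f} per SD output pixel")
[/code]

That works out to roughly 3 rendered samples per SD pixel for the 720p path and about 6.75 for the 540p/4xMSAA path.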
Yes, today. A year ago, 32" sets were 720p. That's a very common size in Europe as I understand it, although I believe Americans typically have larger sets; 40" is about the maximum common standard here, and 40" 1080p sets were rare at the beginning of this gen. My friend bought a 1080p Sammy for his PS3 some six months after its release, and he had very few options. So although 1080p is commonplace now, it wasn't for the majority of this generation, and unless everyone who bought a 720p set for their XB360 or PS3 has recently upgraded to 1080p, the expectation should be that the majority (>50%) are at 720p.
Who has a 720p TV? 99% of HDTVs are 1080p.
You know what? There have been conflicting reports on the graphics of this game. Some at GAF are claiming that the game looks gorgeous on an HDTV, and some are saying just this:
http://www.neogaf.com/forum/showpost.php?p=21070774&postcount=3809. I know GAF is also a fanboy forum, but I would still wait for a B3D mod/guru's confirmation of the game's look before I purchase it.
No one can answer that for you but yourself. You need to view upscaled content on your TV to see if you like the look of it. If you have no problem watching SD content on your TV, you shouldn't have a problem viewing Alan Wake.
Perfectly said.

Console game graphics technology development is all about finding the best compromise between your frame rate, image quality and resolution (for the output image, shadow maps, post effects, particles, etc.). All console platforms have predetermined resources to use. If you want better-looking pixels, you have to either optimize your code, reduce your resolution(s) or reduce your frame rate. Room for optimization always ends somewhere (current-gen consoles are already pretty old and well studied), so at some point you have to decide whether you want fewer better-looking pixels or more worse-looking pixels.
The current generation is the first HD generation of console hardware, and it is also the first console hardware with flexible programmable shaders (hundreds of instructions can be executed per pixel). The processing requirement per output pixel has increased dramatically since the last generation. Now we calculate real-time, pixel-perfect lighting using per-pixel normals and per-pixel material definitions, we do this for dozens of light sources every frame, and we also calculate real-time shadows from all occluders and all light sources.
Any developer can still choose to render using a simple color texture with baked light-map lighting. With technology like that, there would be no problem at all rendering at full 1080p at 60 fps (even with MSAA and anisotropic filtering). However, most developers feel that it's more important to produce a better-looking image than to hit the highest possible resolution. Pixel quality over pixel quantity.
I personally prefer Alan Wake's image quality (at quarter Full HD) over the new Perfect Dark version (at Full HD). Alan Wake has a quarter as many output pixels...
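A rough way to see the trade-off described above is to treat per-frame GPU time as a fixed budget and divide it by the number of pixels shaded per second. The sketch below is purely illustrative (it pretends the entire budget goes to per-pixel shading, which no real renderer does); none of the numbers come from Remedy or Rare.

[code]
# Illustrative only: assumes the whole GPU budget is spent on per-pixel shading.
def pixels_per_second(width, height, fps):
    return width * height * fps

modes = {
    "1920x1080 @ 60 fps": (1920, 1080, 60),
    "1280x720  @ 30 fps": (1280, 720, 30),
    "960x540   @ 30 fps": (960, 540, 30),   # quarter Full HD, as discussed above
}

baseline = pixels_per_second(1920, 1080, 60)
for name, (w, h, fps) in modes.items():
    budget_ratio = baseline / pixels_per_second(w, h, fps)
    print(f"{name}: {w * h:,} pixels per frame, "
          f"~{budget_ratio:.1f}x the per-pixel budget of 1080p60")
[/code]

Under that over-simplified assumption, 960x540 at 30 fps leaves roughly eight times as much work available per pixel as 1080p60, which is the "fewer, better-looking pixels" end of the compromise.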
Majority of the install base at 720p??? No way. 720p was only normal when HDTV market penetration was just beginning (and even then 1080p quickly battled it), which necessarily means it's a small share. You can hardly buy a TV above 32-37" today that's not 1080p; in fact, even those would be discontinued models.
But I googled and couldn't really find any statistics. OK, 99% is obviously a little on the high side.
No, 32" 720p sets are still the lion's share of consumer LCD TV purchases in the US also.

In which case, up until a couple of years ago at the earliest I think (depending how far ahead the US is in product availability), 1080p sets wouldn't have been available at that size except maybe in the very high end.
Even then, 1280x720 is just an arbitrary number. Whether you get a 720p set or a 1080p set, it's being upscaled, since 99% of 720p sets actually have a native pixel resolution of 1366x768.

That's not quite the same, as those TVs don't accept a 1366x768 signal (at least not over HDMI/composite), so games can't really target that resolution. The notion here is that if most people have 1080p sets, rendering at exactly one quarter of 1080p resolution would benefit the upscaling quality. Whatever the state of set resolutions, if they're not commonly 1080p, that idea goes out the window.
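For what it's worth, the "exactly one quarter of 1080p" point is just about scale factors. A small sketch (the panel resolutions are my assumptions, not something stated above):

[code]
# Scale factors from two render resolutions to two common panel resolutions.
panels = {
    "1920x1080 panel": (1920, 1080),
    "1366x768 panel":  (1366, 768),   # typical native resolution of a '720p' set
}
renders = {
    "960x540 (quarter 1080p)": (960, 540),
    "1280x720":                (1280, 720),
}

for render_name, (rw, rh) in renders.items():
    for panel_name, (pw, ph) in panels.items():
        print(f"{render_name} -> {panel_name}: "
              f"{pw / rw:.3f}x horizontal, {ph / rh:.3f}x vertical")
[/code]

Only 960x540 onto a true 1080p panel gives an exact 2x integer ratio (each source pixel maps to a clean 2x2 block); every other combination is a fractional resample. That's why the argument hinges on how common 1080p panels actually are.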
What you said may be perfectly fitting for last-gen consoles, along with the universal spread of SDTVs in every household. But since we're living in the HD era, with more than half of end users owning an HD-capable display (at least in the States), expectations in video games should be a lot higher than simply making a severe decrease in resolution for better effects. Of course, one can still tolerate a slight or moderate resolution decrease, but there is only so much you can cut before the picture ruins your vision. I personally prefer the best balance: a 720p native res with some sort of AA as the baseline, plus above-average effects. A game like AW can probably get away with it in the night scenes when rendering at this extremely low res, but the daytime footage looked far more jarring. All the effects and textures simply couldn't be appreciated to their full potential, so in a sense they're a little wasted. Games like Gears 2 and Res 5 looked a lot more balanced to me in comparison.