The technology of Alan Wake *spawn

I guess it is obvious that 540p with 4xMSAA will look better than 720p with no AA when played on an SDTV/EDTV?
Yeah, because of the AA. A game that's 720p/no AA will be 480p/no AA when played on an SDTV, while 540p/4xAA will be 480p/4xAA.
Unless of course the game uses a different framebuffer when displaying at SD resolution, like Gears of War 1, which downscales from 960x720 to 480p and provides a supersampling effect.
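
For anyone wanting to put rough numbers on that, here is a back-of-the-envelope sketch (my arithmetic, assuming a plain box downscale to 640x480; the consoles' actual scalers filter differently) of how many rendered samples feed each SD output pixel in the cases mentioned above:

[code]
#include <cstdio>

// Back-of-the-envelope sketch (assumes a plain box downscale to 640x480;
// the real hardware scalers filter differently): how many rendered samples
// feed each SD output pixel for the cases discussed above.
int main() {
    struct Mode { const char* name; int w, h, samplesPerPixel; };
    const Mode modes[] = {
        { "720p, no AA",              1280, 720, 1 },
        { "540p, 4xMSAA",              960, 540, 4 },
        { "960x720, no AA (Gears 1)",  960, 720, 1 },
    };
    const double outPixels = 640.0 * 480.0; // SD output

    for (const Mode& m : modes) {
        double density = (double)m.w * m.h * m.samplesPerPixel / outPixels;
        std::printf("%-28s -> %.2f samples per 480p pixel\n", m.name, density);
    }
    return 0;
}
[/code]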
 
Majority of the install base 720p??? No way. 720p was only the norm when HDTV market penetration was just beginning (and even then 1080p quickly competed with it), which necessarily means it's a small share. You can hardly buy a TV above 32-37" today that's not 1080p.
Yes, today. A year ago, 32" sets were 720p. These are a very common size across Europe as I understand it, although I believe Americans typically have larger sets. 40" is about the largest common size here. 40" 1080p sets were rare at the beginning of this gen. My friend bought a 1080p Sammy for his PS3 some 6 months after its release, and he had very few options. So although 1080p is commonplace now, it wasn't for the majority of this generation, and unless everyone who bought a 720p set for their XB360 or PS3 has recently upgraded to 1080p, the expectation should be that the majority (>50%) are at 720p.
 
Who has a 720p TV? 99% of HDTVs are 1080p.

I think you misunderstood my question. I was wondering how the game could look when displayed on a 720p or HD TV. I was asking since the game has roughly half the resolution of a normal 720p game. So my question was: could it look very blurry on an HD TV? I was also wondering if 4xAA was enough to compensate for half the resolution of a 720p game?

If you look at the non-doctored (non-bullshot) images of AW that were released a few days ago, you can see that the game looks blurry in those screens.
 
One thing you can try is to download some HQ movie in WMV format, copy it onto a pendrive and watch it through your Xbox on your TV. It should give you a relatively good idea of what to expect.
Unfortunately, very high quality movies are a rarity...
 
You know what? There have been conflicting reports on the graphics of this game. Some at GAF are claiming that the game looks gorgeous on an HD TV, while some are saying just this:
http://www.neogaf.com/forum/showpost.php?p=21070774&postcount=3809. I know GAF is also a fanboy forum, but I would still wait for a B3D mod/guru's confirmation of the game's look before I purchase it.

The low resolution is simply going to cause the game to look blurry. If you can deal with that, buy it; if not, don't ;)
 
...but I would still wait for a B3D mod/guru's confirmation of the game's look before I purchase it.
No-one can answer that for you but yourself. You need to view upscaled content on your TV to see if you like the look of it. If you have no problem watching SD content on your TV, you shouldn't have a problem viewing Alan Wake.
 
Console game graphics technology development is all about finding the best compromise between your frame rate, image quality and resolution (for the output image, shadowmaps, post effects, particles, etc). All console platforms have predetermined resources to use. If you want better looking pixels, you have to either optimize your code, reduce your resolution(s) or reduce your frame rate. Room for optimization always runs out eventually (the current gen consoles are already pretty old and well studied), so at some point you have to decide whether you want fewer, better looking pixels or more, worse looking pixels.

The current generation hardware is the first HD generation of console hardware, and it is also the first console hardware with flexible programmable shaders (hundreds of instructions can be executed per pixel). The processing requirement per output pixel has increased dramatically since the last generation. Now we calculate real-time, pixel-perfect lighting using per-pixel normals and per-pixel material definitions. And we do this for dozens of light sources every frame, and we also calculate real-time shadows from all occluders for all light sources.
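
To make that per-pixel cost concrete, here is a minimal CPU-side sketch of the kind of work being described. This is not Remedy's shader; the structures, the plain Lambert term and the constant shadow factors are illustrative assumptions. It only shows why the work grows with pixels times lights:

[code]
#include <cmath>
#include <cstdio>

// Illustrative sketch only: for every pixel, loop over every light, evaluate
// a Lambert term with the per-pixel normal, and attenuate by a shadow factor
// that would normally come from a shadowmap lookup.
struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct Light { Vec3 dir; Vec3 color; float shadow; }; // shadow: 0 = occluded, 1 = fully lit

Vec3 shadePixel(const Vec3& albedo, const Vec3& normal,
                const Light* lights, int lightCount)
{
    Vec3 result = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < lightCount; ++i) {
        float ndotl = std::fmax(0.0f, dot(normal, lights[i].dir)); // Lambert term
        float k = ndotl * lights[i].shadow;
        result.x += albedo.x * lights[i].color.x * k;
        result.y += albedo.y * lights[i].color.y * k;
        result.z += albedo.z * lights[i].color.z * k;
    }
    return result; // cost grows with pixels * lights, before shadow rendering itself
}

int main() {
    const Light lights[] = {
        { {0.0f, 1.0f, 0.0f}, {1.0f, 0.95f, 0.9f}, 1.0f }, // unshadowed key light
        { {0.7f, 0.7f, 0.0f}, {1.0f, 0.8f,  0.5f}, 0.5f }, // partially shadowed light
    };
    Vec3 albedo = {0.5f, 0.4f, 0.3f};
    Vec3 normal = {0.0f, 1.0f, 0.0f};
    Vec3 c = shadePixel(albedo, normal, lights, 2);
    std::printf("shaded pixel: %.3f %.3f %.3f\n", c.x, c.y, c.z);
    return 0;
}
[/code]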

Any developer can still choose to render using a simple color texture with baked light map lighting if they so choose. Using technology like this, there would be no problem at all rendering at full 1080p at 60 fps (even with MSAA and anisotropic filtering). However, most developers feel that it's more important to get a better looking image than the highest possible resolution. Pixel quality over pixel quantity.

I personally prefer Alan Wake's image quality (at quarter Full HD) over the new Perfect Dark version (at Full HD). Alan Wake has a quarter as many output pixels...
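
As a rough illustration of that trade (my numbers, not Remedy's): at a fixed GPU budget and frame rate, the shader time available per pixel scales inversely with the pixel count, so quarter Full HD leaves roughly four times the per-pixel budget of 1080p:

[code]
#include <cstdio>

// Rough illustration of "pixel quality over pixel quantity": with a fixed GPU
// budget and frame rate, per-pixel shader time scales inversely with pixel count.
int main() {
    struct Res { const char* name; int w, h; };
    const Res resolutions[] = {
        { "1080p (Full HD)",        1920, 1080 },
        { "720p",                   1280,  720 },
        { "540p (quarter Full HD)",  960,  540 },
    };
    const double fullHdPixels = 1920.0 * 1080.0;

    for (const Res& r : resolutions) {
        double pixels = (double)r.w * r.h;
        std::printf("%-24s %8.0f pixels -> %.2fx the per-pixel budget of 1080p\n",
                    r.name, pixels, fullHdPixels / pixels);
    }
    return 0;
}
[/code]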
 
Console game graphics technology development is all about finding the best compromise between your frame rate, image quality and resolution (for the output image, shadowmaps, post effects, particles, etc). All console platforms have predetermined resources to use. If you want better looking pixels, you have to either optimize your code, reduce your resolution(s) or reduce your frame rate. Room for optimization always runs out eventually (the current gen consoles are already pretty old and well studied), so at some point you have to decide whether you want fewer, better looking pixels or more, worse looking pixels.

The current generation hardware is the first HD generation of console hardware, and it is also the first console hardware with flexible programmable shaders (hundreds of instructions can be executed per pixel). The processing requirement per output pixel has increased dramatically since the last generation. Now we calculate real-time, pixel-perfect lighting using per-pixel normals and per-pixel material definitions. And we do this for dozens of light sources every frame, and we also calculate real-time shadows from all occluders for all light sources.

Any developer can still choose to render using a simple color texture with baked light map lighting if they so choose. Using technology like this, there would be no problem at all rendering at full 1080p at 60 fps (even with MSAA and anisotropic filtering). However, most developers feel that it's more important to get a better looking image than the highest possible resolution. Pixel quality over pixel quantity.

I personally prefer Alan Wake's image quality (at quarter Full HD) over the new Perfect Dark version (at Full HD). Alan Wake has a quarter as many output pixels...
Perfectly said.
 
Majority of the install base 720p??? No way. 720p was only the norm when HDTV market penetration was just beginning (and even then 1080p quickly competed with it), which necessarily means it's a small share. You can hardly buy a TV above 32-37" today that's not 1080p. In fact, even those would be discontinued models.

But I googled and couldn't really find any statistics. OK, 99% is obviously a little on the high side :)

IIRC 720p TVs are cheaper than 1080p ones; I have two 720p TVs.
 
Yes, today. A year ago, 32" sets were 720p. These are a very common size across Europe as I understand it, although I believe Americans typically have larger sets. 40" is about the largest common size here. 40" 1080p sets were rare at the beginning of this gen. My friend bought a 1080p Sammy for his PS3 some 6 months after its release, and he had very few options. So although 1080p is commonplace now, it wasn't for the majority of this generation, and unless everyone who bought a 720p set for their XB360 or PS3 has recently upgraded to 1080p, the expectation should be that the majority (>50%) are at 720p.

No, 32" 720p sets are still the lions share of consumer LCD TV purchases in the US also.

Regards,
SB
 
No, 32" 720p sets are still the lions share of consumer LCD TV purchases in the US also.
In which case, until a couple of years ago at the earliest I think (depending on how far ahead the US is in product availability), 1080p sets wouldn't have been available at that size except maybe at the very high end.

Even then, 1280x720 is just an arbitrary number. Whether you get a 720p set or a 1080p set, it's being upscaled, since 99% of 720p sets actually have a native pixel resolution of 1366x768.
That's not quite the same, as those TVs don't accept a 1366x768 signal (at least not over HDMI/component), so games can't really target that resolution. The notion here is that if most people have 1080p sets, rendering at exactly one quarter of 1080p resolution would benefit upscaling quality. Whatever the state of set resolutions, if they're not commonly 1080p, that idea goes out the window.
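
The arithmetic behind that (my numbers): 960x540 maps onto a true 1080p panel with an exact 2x factor per axis, whereas 1366x768 and 1280x720 panels need non-integer ratios, which is where the upscale gets softer:

[code]
#include <cstdio>

// Sketch of the scaling-factor argument: scale factors needed to map a
// 960x540 framebuffer onto common panel resolutions.
int main() {
    struct Panel { const char* name; int w, h; };
    const Panel panels[] = {
        { "1920x1080 panel", 1920, 1080 },
        { "1366x768 panel",  1366,  768 },
        { "1280x720 panel",  1280,  720 },
    };
    const double srcW = 960.0, srcH = 540.0; // Alan Wake's reported framebuffer

    for (const Panel& p : panels) {
        std::printf("%-17s horizontal x%.3f, vertical x%.3f\n",
                    p.name, p.w / srcW, p.h / srcH);
    }
    return 0;
}
[/code]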
 
Console game graphics technology development is all about finding the best compromise between your frame rate, image quality and resolution (for the output image, shadowmaps, post effects, particles, etc). All console platforms have predetermined resources to use. If you want better looking pixels, you have to either optimize your code, reduce your resolution(s) or reduce your frame rate. Room for optimization always runs out eventually (the current gen consoles are already pretty old and well studied), so at some point you have to decide whether you want fewer, better looking pixels or more, worse looking pixels.

The current generation hardware is the first HD generation of console hardware, and it is also the first console hardware with flexible programmable shaders (hundreds of instructions can be executed per pixel). The processing requirement per output pixel has increased dramatically since the last generation. Now we calculate real-time, pixel-perfect lighting using per-pixel normals and per-pixel material definitions. And we do this for dozens of light sources every frame, and we also calculate real-time shadows from all occluders for all light sources.

Any developer can still choose to render using a simple color texture with baked light map lighting if they so choose. Using technology like this, there would be no problem at all rendering at full 1080p at 60 fps (even with MSAA and anisotropic filtering). However, most developers feel that it's more important to get a better looking image than the highest possible resolution. Pixel quality over pixel quantity.

I personally prefer Alan Wake's image quality (at quarter Full HD) over the new Perfect Dark version (at Full HD). Alan Wake has a quarter as many output pixels...
What you said may be perfectly fitting for last-gen consoles, along with the universal spread of SD TVs in every household. But since we're living in the HD era, with more than half of end users owning an HD-capable display (at least in the States), expectations for video games should be a lot higher than simply making a severe decrease in resolution for better effects. Of course, one can still tolerate a slight or moderate resolution decrease, but there is only so much you can cut before the picture ruins your vision. I personally prefer the best balance: a 720p native res with some sort of AA as the baseline, plus above-average effects. A game like AW can probably get away with the night scenes when rendering at this extremely low res, but the daytime footage looked far more jarring. All the effects and textures simply couldn't be appreciated to their full potential, so in a sense they're a little wasted. Games like Gears 2 and RE5 looked a lot more balanced to me in comparison.
 
What you said may be perfectly fitting for last-gen consoles, along with the universal spread of SD TVs in every household. But since we're living in the HD era, with more than half of end users owning an HD-capable display (at least in the States), expectations for video games should be a lot higher than simply making a severe decrease in resolution for better effects. Of course, one can still tolerate a slight or moderate resolution decrease, but there is only so much you can cut before the picture ruins your vision.

I think you are looking at what sebbbi says from the wrong angle. Maybe try to consider this: compare a video of a lap of a real car on a real track in widescreen SD on an HD TV with seeing that same car and track in a video game like Forza or GT5 Prologue. Which one looks better? In my view, while the games can sometimes look really great, the video of real life still tends to look a lot better despite the lower resolution. This indicates that there is more to pixel information than just resolution, see? And that leads to the conclusion that there may be cases where spending your budget on more effects rather than on more pixels is more effective. It's very similar to games choosing 720p over 1080p - the latter is always possible too, but can still look less impressive, especially in motion.

http://www.youtube.com/watch?v=TlKOSI8PeW8&feature=fvw

And I picked a genre here that can match reality far more easily than most games can.
 
What I wonder is how many people actually sit close enough to their TV to be able to notice the difference with sub-HD games. I know I sit too far away to notice a difference on my 55" 720p plasma.
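
For what it's worth, a common rule of thumb (my assumption, not something from this thread) is that roughly 20/20 vision resolves about one arcminute, so once a single pixel subtends less than that at your seat, extra resolution is hard to notice. A quick sketch of that calculation for a 55" 16:9 screen:

[code]
#include <cmath>
#include <cstdio>

// Rule-of-thumb sketch (assumption: ~20/20 vision resolves about 1 arcminute).
// Computes how many arcminutes one pixel subtends on a 55" 16:9 screen for a
// few horizontal resolutions and viewing distances.
int main() {
    const double PI = 3.14159265358979323846;
    const double diagonalIn    = 55.0;
    const double screenWidthIn = diagonalIn * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const int    widthsPx[]    = { 960, 1280, 1920 };   // 540p / 720p / 1080p widths
    const double distancesFt[] = { 6.0, 9.0, 12.0 };

    for (double distFt : distancesFt) {
        const double distIn = distFt * 12.0;
        std::printf("Viewing distance %.0f ft:\n", distFt);
        for (int px : widthsPx) {
            const double pixelIn = screenWidthIn / px;
            const double arcmin  = 2.0 * std::atan(pixelIn / (2.0 * distIn)) * (180.0 / PI) * 60.0;
            std::printf("  %4d px wide -> %.2f arcmin per pixel%s\n",
                        px, arcmin, arcmin < 1.0 ? " (finer than the eye resolves)" : "");
        }
    }
    return 0;
}
[/code]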
 
That's why I chose 720p in the end. At the distance we'll be sitting, the benefit of 1080p on a 32" screen wouldn't really be noticeable, and it certainly wasn't worth the price premium, along with the lower-quality reviews of the 1080p sets versus the Sammy I picked (although I would have liked the better screen, as this LCD is prone to smearing before it has warmed up and on very grey frames).
 
What you said may be perfectly fitting for last-gen consoles, along with the universal spread of SD TVs in every household. But since we're living in the HD era, with more than half of end users owning an HD-capable display (at least in the States), expectations for video games should be a lot higher than simply making a severe decrease in resolution for better effects. Of course, one can still tolerate a slight or moderate resolution decrease, but there is only so much you can cut before the picture ruins your vision. I personally prefer the best balance: a 720p native res with some sort of AA as the baseline, plus above-average effects. A game like AW can probably get away with the night scenes when rendering at this extremely low res, but the daytime footage looked far more jarring. All the effects and textures simply couldn't be appreciated to their full potential, so in a sense they're a little wasted. Games like Gears 2 and RE5 looked a lot more balanced to me in comparison.

The lighting and shadows in Alan Wake look far more sophisticated than anything in Gears 2 and Resident Evil 5. Both of those are great-looking games, but they have different goals. 540p does seem incredibly low a resolution, but until the game is out we won't know how much pixel quality they gained, to use sebbbi's words. sebbbi's post was spot on, I think. I don't think resolution should be prioritized over everything else. The only thing that matters is whether the game looks good.
 