The technology of Alan Wake *spawn

The first one is not. What I find odd is that some pixel counters have counted the pixels and said it's 640p, Quaz and others counted 540p, and another user above has counted 720p :S

[attached image: Halo 3 screenshot]

This is a shot from Halo 3.
To be exact, they got it from this post:
http://forum.beyond3d.com/showpost.php?p=1070972&postcount=282

I know because I wanted to know how to count pixels. ;)
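For anyone curious, this is roughly how that counting works, a minimal sketch of the usual stair-step method (my own illustration, not Quaz51's exact procedure from that post):

```python
# Minimal sketch of the usual "pixel counting" estimate: pick a hard,
# near-vertical edge in an UNprocessed screenshot, count the stair-steps
# it shows, and measure the vertical span those steps cover in the capture.
# Each step roughly corresponds to one native scanline.

def estimate_native_height(capture_height, steps_counted, span_in_capture_px):
    """Estimate the native vertical resolution of an upscaled frame.

    capture_height      -- height of the screenshot (e.g. 720 or 1080)
    steps_counted       -- number of distinct stair-steps along the edge
    span_in_capture_px  -- vertical extent of those steps in capture pixels
    """
    return capture_height * steps_counted / span_in_capture_px

# Example: 54 steps spread over 108 rows of a 1080p capture -> ~540 native lines.
print(estimate_native_height(1080, 54, 108))   # 540.0
```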
 
The top image looks more like albedo only; there's a lot more missing than post-processing.
Indeed, I took a closer look at the before shot, and even though it's the same geometry, textures & models... I see a lack of shadows & a flat point light. You don't add things like lighting & shadowing in a post-process filter. It's almost as if it's Alan Wake with base geometry & textures and make-do lighting, nothing else!


This is a shot from Halo 3.
To be exact, they got it from this post:
http://forum.beyond3d.com/showpost.php?p=1070972&postcount=282
I know because I wanted to know how to count pixels. ;)
Ha!! You & me, my friend, are one & the same :p
 
I would never say that 720p is worse than 540p, quite the opposite; that isn't even slightly debatable. But 720p with no AA compared to 540p + 4xAA isn't that great either.

I see it as a deliberate trade-off made by the developers, knowing it gave the best image quality taking into account all the features they wanted to add to the game, while trying to preserve and accomplish their IQ goals.

Rendering at 960x540 + 4xAA means that the game is actually rendered at 3840x2160.

Does it mean that it's rendered at UltraHD? Perhaps it isn't, but it's certainly 2-3 times better than 1280x720 with no AA.
I'm guessing you would have preferred this entire generation of games to run at SD resolutions then, if 4-8x AA were a prerequisite to address edge aliasing. Because 540p comes pretty close to the SD resolutions we saw last generation with the original Xbox and PS2.
 
I would never say that 720p is worse than 540p, quite the opposite; that isn't even slightly debatable. But 720p with no AA compared to 540p + 4xAA isn't that great either.

I see it as a deliberate trade-off made by the developers, knowing it gave the best image quality taking into account all the features they wanted to add to the game, while trying to preserve and accomplish their IQ goals.

Rendering at 960x540 + 4xAA means that the game is actually rendered at 3840x2160.

Does it mean that it's rendered at UltraHD? Perhaps it isn't, but it's certainly 2-3 times better than 1280x720 with no AA.

You are very, very off.

First of all, 3840x2160 -> 960x540 = 16x SSAA;
1920x1080 -> 960x540 = 4x SSAA.

Second, MSAA is far less demanding than SSAA; that's why it's used in the first place despite its issues. So you can't just say 960x540 with 4xMSAA is equal to full 1080p with no AA, not even close.

Third, according to your logic above that says MSAA ~ SSAA (which is false), a typical game such as UC2 running at 1280x720 with 2xAA would be equivalent to 960x540 with
(1280*720*2)/(960*540) ≈ 3.56x AA.

960x540 with 4xAA is certainly not 2-3 times more demanding than 720p with no AA; I'd say they're comparable, and there are too many other factors involved.
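To put rough numbers on that (my own back-of-the-envelope arithmetic, not anything from Remedy):

```python
# SSAA multiplies the pixels actually shaded; MSAA mostly multiplies
# coverage/depth samples while colour is still shaded once per pixel.

def pixels(w, h):
    return w * h

p540  = pixels(960, 540)     # 518,400
p720  = pixels(1280, 720)    # 921,600
p1080 = pixels(1920, 1080)   # 2,073,600
p2160 = pixels(3840, 2160)   # 8,294,400

print(p1080 / p540)          # 4.0  -> 960x540 with 4x *SSAA* ~ 1080p worth of shading
print(p2160 / p540)          # 16.0 -> 3840x2160 would be 16x the pixels, not 4x
print(p720 * 2 / p540)       # ~3.56 -> the UC2 720p + 2xAA comparison above

# With 4x *MSAA* the shaded pixels stay at 518,400; only samples go up:
print(p540, "shaded pixels at 540p 4xMSAA vs", p720, "at 720p no AA")
```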
 
540p (960x540) is actually a significant jump in pixels from 480p (720x480): 480p is 345,600 pixels, exactly 2/3 of 540p's 518,400. Plus you're getting 4x MSAA. 720p (921,600 pixels) is definitely a massive increase over both.
 
The AA looks beyond 4xMSAA. I believe these are promotional shots.
Are you talking about both images or only the 2nd? The first one is just too blurry to get decent edges for counting pixels; shouldn't it be crisper if it's 720p? (Is it even 720p? Because I for one can't count the pixels there with all the blur; I'm still intermediate at counting pixels.)
 
960x540 with 4xAA is certainly not 2-3 times more demanding than 720p with no AA; I'd say they're comparable, and there are too many other factors involved.

Generally speaking, that part of your post really depends on the hardware. On 360 you are right, the two might be comparable: any losses due to extra geometry processing are offset by less pixel shader execution since it's a unified shader setup, so perhaps they are comparable. On PS3, though, definitely not, since 4xMSAA on RSX is very slow, which is why many have been researching SPU-based AA.

The other point missed, especially on a game like Alan Wake which apparently has tons of post-process related buffers, is that the size of those buffers gets tuned to the opaque buffer size. So if you bump the opaque pass up to 1280x720, then you may have to bump up many of the post-process buffers as well to maintain the desired image fidelity and effect. So even if one made the argument that 960x540 with 4xMSAA was directly comparable to 1280x720 with 0xMSAA, indirectly they aren't, with the latter likely somewhat slower, since I'd expect many of its post-process steps to be more expensive due to changes in their buffer sizes.
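As a purely hypothetical illustration of that buffer-scaling point (the buffer names and fractions below are made up; Alan Wake's actual pipeline isn't public):

```python
# Hypothetical illustration of buffers "tuned to the opaque buffer size".
# The fractions and buffer names are invented for the example; the point is
# only that everything scales together with the opaque pass.

def buffer_sizes(opaque_w, opaque_h):
    # Each post-process buffer as a fraction of the opaque resolution.
    fractions = {"bloom": 0.5, "depth_of_field": 0.5, "light_shafts": 0.25}
    sizes = {"opaque": opaque_w * opaque_h}
    for name, f in fractions.items():
        sizes[name] = int(opaque_w * f) * int(opaque_h * f)
    return sizes

for res in [(960, 540), (1280, 720)]:
    s = buffer_sizes(*res)
    print(res, "total pixels touched:", sum(s.values()))
# 960x540  -> ~810k pixels across the listed buffers
# 1280x720 -> ~1.44M pixels, i.e. every downstream pass got ~78% bigger too
```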
 
MSAA is not SSAA; the performance penalty is not even close, and IQ is quite a bit better with SSAA vs, say, 8xMSAA/TSAA. Also, from my time with PC games, having MSAA enabled vs disabled has never been a big penalty, bar a few games with upwards of a 50% framerate hit (7900/8800/48xx). 800x600 with 4xMSAA vs 1280x720 are fairly close performance-wise in several games, though 960x540 has about ~38k more pixels than 800x600.
 
Resolution starts to become a bigger factor the larger your TV is... is that a good assumption?

I think so.

And it seems Remedy isn't ashamed to show their game even on an ultra-big setup:

[attached image: Alan Wake shown on a very large display setup]



If that's what they are using to show the game to the press, and they still can't tell the game is upscaled, then I guess the game's IQ may end up being really good XD
 
Yeah, I guess they could do this to maintain superior IQ at least in some areas. Other options could've been something like 640p with 2xAA or 720p with no AA, but I guess they wanted to keep the 4xAA no matter what.

Of course Remedy had their reasons to go that low, but it's still weird, mainly because they had other options that maybe were better than 540p w/4xAA, especially when displayed on HDTVs bigger than 40".

540p with 4xAA will look better than 640p with 2xAA; COD MW1/2 run at 600p with 2xAA, and the IQ of Alan Wake is definitely better than COD's.

I've never seen a console game with variable resolution depending on workload. It would be interesting, though I suppose the consistency of the visuals would suffer between areas.

And what's this about Remedy using their own software solution to do the 720p scaling rather than relying on Xenos? Why would they do that? Can't Xenos do even Lanczos resampling for 'free', which would be very hard to better with a software scaler without your framerate plummeting?
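For reference, this is the kind of filter being talked about: a minimal, generic 1D Lanczos-3 resampler. It says nothing about how Xenos' hardware scaler or Remedy's software path is actually implemented.

```python
import math

# Generic Lanczos-3 resampling in 1D, just to illustrate the filter shape
# under discussion; not Xenos- or Remedy-specific in any way.

def lanczos(x, a=3):
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_row(row, out_len, a=3):
    """Resample a 1D list of samples to out_len entries with Lanczos-a."""
    scale = len(row) / out_len
    out = []
    for i in range(out_len):
        center = (i + 0.5) * scale - 0.5            # source position
        lo, hi = math.floor(center) - a + 1, math.floor(center) + a
        acc = wsum = 0.0
        for j in range(lo, hi + 1):
            w = lanczos(center - j, a)
            acc += w * row[min(max(j, 0), len(row) - 1)]  # clamp at the edges
            wsum += w
        out.append(acc / wsum)
    return out

# Example: upscale one 960-sample scanline to 1280 samples.
print(len(resample_row([float(i % 2) for i in range(960)], 1280)))  # 1280
```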
 
Generally speaking, that part of your post really depends on the hardware. On 360 you are right, the two might be comparable: any losses due to extra geometry processing are offset by less pixel shader execution since it's a unified shader setup, so perhaps they are comparable. On PS3, though, definitely not, since 4xMSAA on RSX is very slow, which is why many have been researching SPU-based AA.
Yeah, I was thinking more about traditional PC GPUs, which have all been unified shaders for a while now. PS3 is a different beast, but even there you could use MLAA to make up for the GPU deficiency, and for comparison's sake say it's equivalent to 4xMSAA (it's better on regular edges but has issues with subpixel detail and shimmering), so still nowhere near as bad as 4xSSAA.


The other point missed, especially on a game like Alan Wake which apparently has tons of post-process related buffers, is that the size of those buffers gets tuned to the opaque buffer size. So if you bump the opaque pass up to 1280x720, then you may have to bump up many of the post-process buffers as well to maintain the desired image fidelity and effect. So even if one made the argument that 960x540 with 4xMSAA was directly comparable to 1280x720 with 0xMSAA, indirectly they aren't, with the latter likely somewhat slower, since I'd expect many of its post-process steps to be more expensive due to changes in their buffer sizes.

The other buffers can be lower resolution than the opaque one. Most people are more sensitive to overall image blurriness than to low-res secondary buffer effects.
 
The other buffers can be lower resolution than the opaque one. Most people are more sensitive to overall image blurriness than to low-res secondary buffer effects.

For sure, but often the buffers are sized to a multiple of the opaque buffer, hence why they may still end up larger if one went with 1280x720 for opaque. I'm not so sure anymore how sensitive people really are to blurriness; it's a very confusing topic. Quincunx is by far the worst offender when it comes to blur, more so than sub-HD games, but many seem oddly fine with it. Or I had mentioned way back how all PS3 720p games look blurry on 1080p TVs compared to their 360 counterparts, because the PS3 depends on the TV's scaler, which is usually junk and results in lots of blur, but you'd be hard pressed to find anyone that can see the difference, even though to me it's as clear as night and day. I'm starting to think that some see blur as more cinematic somehow; it's weird. It must be the aliasing that bugs people with sub-HD games, not the blur, because plenty of 720p games exhibit blur.
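On the Quincunx point, a rough sketch of the commonly described resolve weighting (centre sample at 1/2, four shared corner samples at 1/8 each; an approximation for illustration, not actual RSX or driver code) shows where the softening comes from:

```python
# Why Quincunx softens the image: the resolve filter mixes each pixel's
# centre sample with corner samples shared with its neighbours, so detail
# inside a pixel gets averaged with adjacent pixels even on interior
# surfaces, not just on geometry edges.

def quincunx_resolve(centre, corners):
    """centre: this pixel's centre sample; corners: its 4 corner samples."""
    assert len(corners) == 4
    return 0.5 * centre + sum(0.125 * c for c in corners)

# A bright texel surrounded by darker neighbours gets pulled down noticeably:
print(quincunx_resolve(1.0, [0.2, 0.2, 0.2, 0.2]))   # 0.6 instead of 1.0
```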
 
540p (960x540) is actually a significant jump in pixels from 480p (720x480): 480p is 345,600 pixels, exactly 2/3 of 540p's 518,400. Plus you're getting 4x MSAA. 720p (921,600 pixels) is definitely a massive increase over both.

Oh, I wasn't saying 540p isn't a step above 480p, just that it's a pretty meager step relative to the full 1280x720 benchmark that we expected, and were promised, this generation. If I had been told before the current consoles launched that games would be standardized at 540p rather than 720p, regardless of the 4xAA, I would not have been a happy HDTV owner.

Speaking of which, did any games last generation (Xbox, PS2, Gamecube) utilize 4xAA at all? Or did they all max out at 2xAA? I can't seem to recall.

Quincunx is by far the worst offender when it comes to blur, more so than sub-HD games, but many seem oddly fine with it. Or I had mentioned way back how all PS3 720p games look blurry on 1080p TVs compared to their 360 counterparts, because the PS3 depends on the TV's scaler, which is usually junk and results in lots of blur, but you'd be hard pressed to find anyone that can see the difference, even though to me it's as clear as night and day.
Well, I wouldn't go that far. It's actually debatable how much of an impact Quincunx has on image blur; at the least it varies with each individual game and implementation. I've played Resistance 1/2 and Killzone 2, and those are all relatively clean, sharp-looking QAA games. Whatever the case, QAA generally isn't as bad as a full-on blur filter, as you can see from the screenshots someone posted detailing the QAA edge pattern in this forum.

I agree that many multiplatform games exhibit an extra layer of blur in the PS3 versions, or at least they used to, but I think that is down to a blur filter being applied, or a PS3-specific optimization that gives a slight performance increase at the cost of some screen sharpness (something the BioShock devs alluded to when they addressed the apparent "blur filter" in the PS3 version of BioShock).
 
For sure, but often the buffers are sized to a multiple of the opaque buffer, hence why they may still end up larger if one went with 1280x720 for opaque. I'm not so sure anymore how sensitive people really are to blurriness; it's a very confusing topic. Quincunx is by far the worst offender when it comes to blur, more so than sub-HD games, but many seem oddly fine with it. Or I had mentioned way back how all PS3 720p games look blurry on 1080p TVs compared to their 360 counterparts, because the PS3 depends on the TV's scaler, which is usually junk and results in lots of blur, but you'd be hard pressed to find anyone that can see the difference, even though to me it's as clear as night and day. I'm starting to think that some see blur as more cinematic somehow; it's weird. It must be the aliasing that bugs people with sub-HD games, not the blur, because plenty of 720p games exhibit blur.

I guess my TV has a decent scaler, and I don't think TV scalers mess up 720p -> 1080p since it's a really easy process; most TVs suck much worse when it comes to upscaling SD or interlaced content. The last time I compared games on PS3/360 head to head was FIFA 10, and the PS3 version uses QAA as well as relying on the TV's scaler, but the difference was very minor for me, probably because FIFA is not a game that relies on texture detail. There was little difference, compared to GTA4, which I thought was really blurry on the PS3. Strangely, although Conan is 576p, I don't remember it as a blurry experience, though maybe that's because it's been 2 years since I played it.

Still, I enjoyed GTA4, and Alan Wake being 540p should not detract from its enjoyment.
 
I'm new to all this, so forgive the newb question. What would be a likely explanation for the blurry appearance of the screens and video? Does that question fit within the scope of this topic? Basically, what technology is being employed to give that look, or is it simply due to the sub-HD resolution being upscaled?
 
I'm new to all this, so forgive the newb question. What would be a likely explanation for the blurry appearance of the screens and video? Does that question fit within the scope of this topic? Basically, what technology is being employed to give that look, or is it simply due to the sub-HD resolution being upscaled?

There is definitely post-process motion blur in the game, giving it some of the blur intentionally. But a fair amount of it is from the 540p opaque geometry buffer being upscaled. And some of it is due to JPEG compression from online hosting on some of these websites (not saying that is all of it, just some of it).
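If anyone wants to isolate just the upscaling contribution, a quick experiment along these lines works (assumes Pillow is installed; the file names are placeholders for any sharp 1280x720 screenshot you have):

```python
# Roughly simulate how much softness the 540p -> 720p upscale alone adds,
# separate from motion blur or JPEG artefacts.
from PIL import Image

src = Image.open("sharp_720p_shot.png")               # 1280x720 reference
low = src.resize((960, 540), Image.LANCZOS)           # pretend-render at 540p
back = low.resize((1280, 720), Image.BILINEAR)        # scale back up for display

src.save("reference_720p.png")
back.save("simulated_540p_upscaled.png")              # compare the two side by side
```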
 