The technology of Alan Wake *spawn

Well, many people on this board have wondered how a game that targeted SD resolutions and went crazy with post-processing and shaders would look, and it looks like that's what we got here. Pretty disappointing though; too bad they couldn't have done 720p with AA.
 
That being said, the game is easily the best-looking 360 title to date (can anyone think of a close competitor?), and the complete lack of aliasing along with all the post-processing effects gives it a very cinematic, realistic quality compared to most other titles, even though those run at higher resolutions (similar to the visual realism of KZ2).

What do you mean by lack of aliasing? It's there.
 
Well, many people on this board have wondered how a game that targeted SD resolutions and went crazy with post-processing and shaders would look, and it looks like that's what we got here. Pretty disappointing though; too bad they couldn't have done 720p with AA.

So it's as if the requirement for HD gaming never existed and developers can spend hardware power however they want. I hope this was the right decision; I expect Alan Wake to have the best post-processing and shader effects on consoles.
 
Recent screenshots @ http://www.ps3news.com/XBox-360/amazing-new-in-game-alan-wake-xbox-360-images-arrive/

I am by no means an expert at counting pixels, but looking at these, I instantly feel something is wrong. Having played many, many games on the 360 and PC in recent times (at different resolutions, of course), Alan Wake seems to have very little aliasing, but it still looks sub-720p and an overall blurry mess. This either means there is some heavy blurring filter in play throughout the entire game (no, that does NOT make it more cinematic, like films or Twin Peaks), or it's simply not 720p material and they went for AA rather than resolution.

This game may have impressive volumetric effects, shadows, and lighting, but these screenshots are thoroughly disappointing, especially after so many years of development, and especially for anyone who was looking forward to playing Alan Wake on the PC. I realize that it looks better in motion, when running or shooting baddies -- but we all know there will be a lot of slow walking or stopping to enjoy the nature and scenery. When that happens, the blurry visuals will bother me a lot. Am I the only one?
 
And despite being 540p, AW's IQ is better than COD's or Battlefield's, so I don't think it's a big deal (apart from the fact that we were told it was 720p).
I mean, I wouldn't mind seeing Halo 3 or Reach running at 540p with 4xAA; IMO that's much, much better than 640p with no AA. For me, jaggies destroy the suspension of disbelief like nothing else.
It won't work out with a game like Halo; that's a vibrant-looking game with high-contrast edges everywhere, while Alan Wake has an extremely muted color palette, so even with no AA it would've looked fairly "jaggie-free" for the most part, given the color space and the further blur caused by upscaling. The thing is, the lower the resolution, the uglier it gets on bigger sets no matter how much AA you pour in, because there simply aren't many pixels on screen.
 
It won't work out with a game like Halo; that's a vibrant-looking game with high-contrast edges everywhere, while Alan Wake has an extremely muted color palette, so even with no AA it would've looked fairly "jaggie-free" for the most part, given the color space and the further blur caused by upscaling. The thing is, the lower the resolution, the uglier it gets on bigger sets no matter how much AA you pour in, because there simply aren't many pixels on screen.

Alan Wake isn't that gray; there's plenty of color in the daytime scenes.
[image: image_alan_wake-12275-759_0008.jpg]


The other post-processing effects/blur going on in Alan Wake would also make Halo look better, as Halo looks a bit sterile and last-gen as it is compared to games like Gears and KZ2. The trend in graphics is definitely towards heavy use of post-processing effects.

And to me, the fact that Bungie's environment art is largely based around straight edges and geometric shapes, yet the engine has no form of anti-aliasing or edge smoothing whatsoever, is incredibly stupid. I would much rather see them take the 40% reduction in pixels to gain 4xAA and copious amounts of alpha-heavy and post-processing effects like we see in AW.
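
For what it's worth, here's the rough arithmetic behind that 40% figure (a minimal sketch, assuming the usual 1280x720 for 720p and the 960x540 buffer the pixel counts point to; it actually comes out closer to 44%):

```python
# Back-of-the-envelope pixel math behind the "~40% reduction" claim.
# Assumptions: 720p = 1280x720, and the 960x540 buffer reported for Alan Wake.
hd_pixels = 1280 * 720   # 921,600 pixels
sub_pixels = 960 * 540   # 518,400 pixels
print(f"{1 - sub_pixels / hd_pixels:.1%} fewer pixels to shade")  # -> 43.8%
```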
 
Alan Wake isn't that gray; there's plenty of color in the daytime scenes.
Did I ever say it's gray?
I just said Alan Wake isn't a vibrant-looking game because of its muted color palette & low-contrast edges.

And btw, I haven't noticed any object motion blur shader in the game yet; all I've seen is camera motion blur... but maybe that's just because of the videos.
 
There really should be a good balance between all those rendering buffers, and I definitely agree the main priority should be the geometry buffer. It is the HD era after all, and I just hope devs don't follow the sub-HD trend, as things would only get blurrier over time. I don't know about you guys, but personally I'd prefer a 720p 0xAA game to a 540p 4xAA game, generally speaking; I feel like the higher-res buffer would naturally help ease the aliasing anyway, or am I wrong?
 
There really should be a good balance between all those rendering buffers, and I definitely agree the main priority should be the geometry buffer. It is the HD era after all, and I just hope devs don't follow the sub-HD trend, as things would only get blurrier over time. I don't know about you guys, but personally I'd prefer a 720p 0xAA game to a 540p 4xAA game, generally speaking; I feel like the higher-res buffer would naturally help ease the aliasing anyway, or am I wrong?

I agree. It's the HD era, and 720p with 0xFSAA doesn't bother me more than a blurry 540p image with 4xFSAA (and 4xMSAA certainly doesn't hide all the jaggies at that resolution...), despite all the post-processing effects.
 
Did I ever say it's gray?
I just said Alan Wake isn't a vibrant-looking game because of its muted color palette & low-contrast edges.

The low contrast and muted color palette (and some of the things people are moaning about in the screens) could be attributed to videogameszone.de (who leaked all of these screens) being biased against the 360 and going out of their way to make the 360 versions of games look worse than the PS3 versions.

Proof of this you say? Glad to! lol

Well, it seems the main things they mess with are brightness and contrast, most likely in-game, and maybe via other methods. It's obvious in any of the leaked screens that the HUD is brighter and shows signs of having both contrast and brightness pushed further from normal than they should be.

Aside from seeing that in the difference between official shots and the leaked screens (of course ignoring some of what CAN be attributed to the lower resolution), the main damning evidence is that they HAVE done it before.

See this link for proof of that with Dante's Inferno: http://forum.alanwake.com/showpost.php?p=70705&postcount=1313

Isn't it funny how the same website that produced these blurry, color-muted, low-contrast screens (which some people here and on GAF are saying is exactly what Wake will look like) showed the exact same kind of visual problems in the 360 version of Dante's Inferno compared to the PS3 version? Especially in light of the fact that Digital Foundry and Lens of Truth have both shown that both versions of that game are IDENTICAL.

OK, that's my 10 cents. BTW, I'm not criticizing the pixel counters' work or disputing that it's a 540p base resolution. I'm disputing the quality of videogameszone.de's screens, because some people around here and on GAF seem to assume that is 100% what the game will look like when playing it. That's bullcrap. :D
 
There really should be a good balance between all those rendering buffers, and I definitely agree the main priority should be the geometry buffer. It is the HD era after all, and I just hope devs don't follow the sub-HD trend, as things would only get blurrier over time. I don't know about you guys, but personally I'd prefer a 720p 0xAA game to a 540p 4xAA game, generally speaking; I feel like the higher-res buffer would naturally help ease the aliasing anyway, or am I wrong?

Between 720p/no AA & 540p/4xAA? It's a trade-off, seriously... you get to choose between:

1) A crisp image with a uniform amount of jaggies everywhere on screen.

2) A blurred image with fewer jaggies overall... but horrible jagged edges in some areas with lots of thin objects.

In most cases I'll go with option 1, because sub-HD doesn't mix well with big TV sets.
 
DF, in their really interesting article that's up now at Eurogamer, said that last year's footage was definitely native 720p... so were the gameplay bits in that video 720p too? And if so, I guess it was Remedy's decision to lower the resolution for better performance, but that much lower? It doesn't make much sense, since they didn't add (at least from what we've seen) any new graphical effects (like object motion blur), though I must say that the frame rate is definitely better in the latest videos.
 
They must've started having problems in some of the areas later in the game. But what I don't understand is: why lower the resolution throughout? Why not just keep it variable, with 720p for those areas that worked well (like last year's gameplay), and resort to sub-HD in the areas that were a problem to render at 720p?
 
It's not such a simple trade-off at all...

If you're using MSAA, you only calculate one shading sample per pixel; the rest of the work is determining the coverage for the current triangle. The difference in fragment shader load between 540p and 720p is about 1 to 2, and it is more than significant. Going to the higher res would nearly cut the framerate in half, even if it meant avoiding tiling and such.

So no, you don't get to choose. You get various bottlenecks and performance figures, and you have to balance features against each other. Alan Wake is probably putting lighting first, rendering very expensive pixels in the fragment shader, but it's still left with enough resources elsewhere to add 4xMSAA and absorb the extra memory consumption.
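
To put rough numbers on that, here's a minimal sketch of the standard MSAA cost model as described above (the shader runs roughly once per covered pixel, while coverage/depth work scales with the sample count; the resolutions are the assumed 960x540 and 1280x720):

```python
# Sketch of the shading-load comparison: under MSAA, the fragment shader
# runs about once per covered pixel, while coverage/depth is per sample.
def per_frame_work(width, height, msaa):
    pixels = width * height
    shaded = pixels           # one shading sample per pixel (MSAA)
    samples = pixels * msaa   # coverage/depth resolved per sample
    return shaded, samples

shaded_540, samples_540 = per_frame_work(960, 540, msaa=4)
shaded_720, samples_720 = per_frame_work(1280, 720, msaa=1)

print(shaded_720 / shaded_540)    # ~1.78: the "about 1 to 2" shader load gap
print(samples_540 / samples_720)  # ~2.25: 540p/4x still touches more samples
```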
 
Well then, what about last year's video, which was 720p with 4xMSAA?
It had console-esque things like a low level of texture filtering & such... though I'm not denying it could have been from the PC with a low level of texture filtering too.
 
They must've started having problems in some of the areas later in the game. But what I don't understand is: why lower the resolution throughout? Why not just keep it variable, with 720p for those areas that worked well (like last year's gameplay), and resort to sub-HD in the areas that were a problem to render at 720p?

Yeah, I guess they could do this to maintain superior IQ at least in some areas. Other options could've been something like 640p with 2xAA or 720p with no AA, but I guess they wanted to keep the 4xAA no matter what.

Of course Remedy had their reasons to go that low, but it's still weird, mainly because they had other options that maybe were better than 540p w/4xAA, especially when displayed on HDTVs bigger than 40".
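
For reference, a quick sketch of the raw budgets of those alternatives; the buffer sizes are assumptions on my part (720p as 1280x720, "640p" as Halo 3-style 1152x640, 540p as 960x540):

```python
# Rough pixel budgets of the options mentioned above.
# Assumed buffer sizes: 720p = 1280x720, "640p" = 1152x640, 540p = 960x540.
options = {
    "720p, no AA": (1280 * 720, 1),
    "640p, 2xAA":  (1152 * 640, 2),
    "540p, 4xAA":  (960 * 540, 4),
}
for name, (pixels, samples_per_pixel) in options.items():
    print(f"{name}: {pixels:,} shaded pixels, "
          f"{pixels * samples_per_pixel:,} coverage samples")
```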
 
What I don't understand is why they don't let the 360 scale the 540p straight to 1080p. It would be pixel-perfect 1:4 scaling and would look better than 540p scaled first to 720p and then to 1080p.
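
A toy illustration of that 1:4 mapping (nearest-neighbour doubling for clarity; the 360's hardware scaler actually filters, so this is just the integer-ratio idea, not how the console literally does it):

```python
# 960x540 -> 1920x1080 is an exact 2x scale on each axis, so every source
# pixel maps to a clean 2x2 block. Toy nearest-neighbour upscale:
def upscale_2x(image):
    """image: list of rows of pixel values; returns the 2x-scaled image."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each column
        out.append(doubled)
        out.append(list(doubled))                   # duplicate each row
    return out

tiny = [[1, 2],
        [3, 4]]
assert upscale_2x(tiny) == [[1, 1, 2, 2],
                            [1, 1, 2, 2],
                            [3, 3, 4, 4],
                            [3, 3, 4, 4]]
```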
 