The technology of Alan Wake *spawn

nightshade

Mod: Spawned from the Analysis thread, this game is generating a lot of talk and so needs its own space.

Well, much of the post processing appears to be done in 720p though.
I don't think they're doing 720p-resolution post-processing... that would be madness, since even most PC games do away with full-res post-processing. On top of that, this game has volumetric lighting; doing effects like that at full resolution is overkill, and even modern graphics cards struggle with it (think of Metro 2033's and Stalker's performance when enabling the highest-quality lighting).
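To put numbers on why reduced-resolution post-processing is so common, here's a minimal sketch (pure pixel-count arithmetic, no actual engine code) of the fill cost at full, half, and quarter resolution:

```cpp
#include <cstdio>

int main() {
    const int fullW = 1280, fullH = 720;           // native 720p target
    const long fullPixels = (long)fullW * fullH;   // pixels shaded at full res

    // Each halving of width and height cuts the shaded pixel count by 4x,
    // which is why expensive passes (volumetric lighting, blur, depth of
    // field) are often done in a reduced-resolution buffer and upsampled.
    for (int div = 1; div <= 4; div *= 2) {
        long pixels = (long)(fullW / div) * (fullH / div);
        printf("%4dx%-4d : %7ld pixels (%5.1f%% of full-res fill cost)\n",
               fullW / div, fullH / div, pixels,
               100.0 * pixels / fullPixels);
    }
    return 0;
}
```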
 
I think some parts of the post processing are 720p... I don't know honestly... Forget what I said.
 
I don't understand why the low res. When using complex lighting effects on the 360, does eDRAM become an Achilles' heel? I'd like to see a DF article on it.
 
No. I don't know exactly what is rendered at what resolution. I just think relying purely on the opaque geometry buffer is a bit dated when you consider modern engines... There is simply too much going on elsewhere...
That's true of every game, and taking that view, this thread wouldn't exist! For understanding game rendering tech on these boxes, looking at the rendering resolution is a big factor. Hence for this thread, we take the rendered resolution as being the framebuffer resolution used for rendering the bulk of the opaque geometry. Otherwise it'd be impossible to get a clear position for discussion. E.g. a game with the 3D rendered at 540p, upscaled to 1080p and then with a native 1080p UI overlaid, would not be a 1080p game and would not be comparable to a game rendering in native 1080p. For discussing nitty-gritty details, like comparative alpha-blended particle framebuffer resolutions, there are the Tech Thread and individual threads.

Thanks Shifty. But when the final image is 540p (upscaled by the Xbox to 720p): does it even make sense that one of the "intermediate" buffers is > 540p?
Yes, the UI framebuffer. Upscaled low-resolution UIs look horrible, and as they are cheap to render, a native UI makes perfect sense. Also note that the response says 'render targets' and not 'framebuffers'. Some of these are going to be textures at maybe 512 x 512. You can also render larger than the output resolution and downscale for improved quality, although that's expensive. The Snowblind Studios engine on PS2 did this, adding 2x supersampled antialiasing and creating an incredibly clean look for the hardware.
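To illustrate the render-larger-then-downscale idea, here's a minimal CPU-side sketch of a 2x2 box downsample, effectively the resolve step of ordered-grid supersampling (shown here at 2x2, i.e. 4x; the Snowblind engine used a different sample count, and this is synthetic single-channel data, not their actual code):

```cpp
#include <cstdio>
#include <vector>

// Average each 2x2 block of a w x h single-channel image down to
// (w/2) x (h/2) -- the resolve step of 2x2 ordered-grid supersampling.
std::vector<float> boxDownsample2x2(const std::vector<float>& src, int w, int h) {
    std::vector<float> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y)
        for (int x = 0; x < w / 2; ++x) {
            float sum = src[(2 * y)     * w + 2 * x]
                      + src[(2 * y)     * w + 2 * x + 1]
                      + src[(2 * y + 1) * w + 2 * x]
                      + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = 0.25f * sum;
        }
    return dst;
}

int main() {
    // 4x4 source with a hard vertical edge one sample off the block
    // boundary; after the downsample that edge becomes a 0.50 grey,
    // i.e. it has been antialiased.
    const std::vector<float> src = { 0, 1, 1, 1,
                                     0, 1, 1, 1,
                                     0, 1, 1, 1,
                                     0, 1, 1, 1 };
    const std::vector<float> dst = boxDownsample2x2(src, 4, 4);
    for (int y = 0; y < 2; ++y)
        printf("%.2f %.2f\n", dst[y * 2], dst[y * 2 + 1]);
    return 0;
}
```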
 
Indeed... shadow maps could be rendered to a 1024x1024 buffer, for instance. I'd be very surprised if their shadow implementation didn't take advantage of 4xMSAA too...

Lighting -> shaders -> a per-pixel issue. eDRAM + tiling is a geometry re-processing-at-tile-boundaries issue.

They do mention deferred lighting, but the number of simultaneous render targets is a bit of an unknown. It would be interesting to see how many tiles they require...
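To put rough numbers on the tiling question, here's a back-of-the-envelope sketch, assuming Xenos's 10 MB of eDRAM, 32-bit colour and 32-bit depth per pixel, and purely hypothetical MRT counts (we don't know Remedy's actual setup). eDRAM stores every MSAA sample, hence the multiplier:

```cpp
#include <cstdio>

int main() {
    const double edramMB = 10.0;            // Xenos eDRAM capacity
    const long px720 = 1280L * 720;
    const long px540 = 960L * 540;

    // Bytes per pixel: 4 per colour render target + 4 for depth/stencil,
    // multiplied by the MSAA sample count, since eDRAM stores every sample.
    struct Cfg { const char* name; long pixels; int targets; int msaa; };
    const Cfg cfgs[] = {
        { "720p, 1 RT, no AA",  px720, 1, 1 },
        { "720p, 1 RT, 4xMSAA", px720, 1, 4 },
        { "720p, 3 RT, no AA",  px720, 3, 1 },
        { "540p, 3 RT, no AA",  px540, 3, 1 },
        { "540p, 3 RT, 4xMSAA", px540, 3, 4 },
    };
    for (const Cfg& c : cfgs) {
        double mb = c.pixels * (4.0 * c.targets + 4.0) * c.msaa / (1024 * 1024);
        int tiles = (int)((mb + edramMB - 1e-9) / edramMB);   // ceil(mb / 10)
        printf("%-20s : %6.2f MB -> %d tile(s)\n", c.name, mb, tiles);
    }
    return 0;
}
```

Under these assumed formats, a 540p three-target G-buffer just squeezes into a single tile, which would sidestep the geometry re-processing cost mentioned above; whether that played any part in Remedy's choice is pure speculation.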
 
Got it.
 
Just for the heck of it

"In the end all are combined to form one 720p image, with all intermediate buffer sizes selected to optimize image quality and GPU performance. All together the render targets take about 80 MB of memory, equivalent in size to over twenty 720p buffers."

Dear fellas, would this Inferred Lighting technique have anything to do with what the developer said (multiple low-res buffers combined to achieve a pseudo-AA effect at 720p)?
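As an aside, the arithmetic in that quote checks out. Assuming a typical 4-bytes-per-pixel format (an assumption; the actual formats aren't stated), a quick sanity check:

```cpp
#include <cstdio>

int main() {
    // Sanity-check Remedy's "80 MB ~ over twenty 720p buffers" claim,
    // assuming a common 4-bytes-per-pixel render-target format.
    double bufferMB = 1280.0 * 720.0 * 4.0 / (1024 * 1024);
    printf("one 720p buffer : %.2f MB\n", bufferMB);                  // ~3.52 MB
    printf("80 MB holds     : %.1f such buffers\n", 80.0 / bufferMB); // ~22.8
    return 0;
}
```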
 
Got what? There still isn't a definitive answer.
Yes there is. 960x540 rendering of the main polygon visuals, accounting for the visible step sizes you'd get when upscaled to 720p, with 720p UI overlays and an assortment of buffer sizes during rendering. The framebuffer resolution question as asked and used on this board has a clear answer for the current unreleased engine.
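For anyone wondering what 'visible step sizes' means in practice: upscaling 960x540 to 1280x720 is a 3:4 ratio, so a stair-stepped edge repeats rows in a fixed pattern that pixel counters can measure. A minimal simulation, using nearest-neighbour sampling for simplicity (real scalers filter, but the underlying row duplication is the same):

```cpp
#include <cstdio>

int main() {
    // A 45-degree edge rendered at 960x540 steps one pixel per row. A
    // nearest-neighbour upscale to 1280x720 maps output row y to source row
    // y * 540 / 720, so some source rows get doubled. Counting consecutive
    // output rows that share the same edge position exposes the native
    // resolution: a true 720p edge would never repeat a row.
    int prevX = -1, runLen = 0;
    printf("rows sharing one edge position: ");
    for (int y = 0; y < 16; ++y) {       // first 16 output rows
        int srcRow = y * 540 / 720;      // nearest-neighbour vertical mapping
        int x = srcRow * 4 / 3;          // edge x, upscaled 960 -> 1280 (4:3)
        if (x == prevX) {
            ++runLen;
        } else {
            if (prevX >= 0) printf("%d ", runLen);
            prevX = x;
            runLen = 1;
        }
    }
    printf("%d\n", runLen);  // prints the repeating 2,1,1 pattern of a 540p->720p scale
    return 0;
}
```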
 
I think everyone can agree that in an ideal world, every single screen space buffer would have the same resolution (and the same amount of AA as well!), be it 720p, 1080p, or just 480p. This would ensure the best possible image quality overall.

Now, if any of the intermediate buffers or passes is rendered at a lower resolution, then it will definitely detract from the global IQ. The question is how the gains from such a performance optimization measure up against the perceivable loss of IQ due to the reduced resolution.

In the case of Alan Wake, the videos so far show a remarkably aliasing-free image, which is still a rarity this generation. It also complements the movie-like post processing and camera work.
But sharpness is lost as well, especially compared to something like God of War 3, which is also quite clean but detailed too. It would obviously look significantly better if everything was rendered at 720p; then again, it'd look even better if it was 1080p with 16xAA, too...
 
Yeah, that's true. I guess it can be added to the list as 540p then. Still, I'd love to see these tests done again by DF.
 
I guess 720p and 4xAA really is too good to be true for this gen.
Do you mean with Alan Wake's tech under the hood?

There are some pretty impressive games out there running at 720p with some anti-aliasing too. Halo: Reach has some great lighting in a sandbox environment at 720p, for example.

I just don't understand Remedy's choice to drop the resolution so low; I am an image quality whore :smile:
 
They are just describing pretty much every game out there, and it isn't really an explanation! Games frequently use mixed-resolution framebuffers, like quarter-resolution particle buffers on PS3, and then combine everything into the final output. "720p" here means outputting a 720p composite of mixed-resolution framebuffers, without any clue as to what is or isn't rendered at native 720p. Everything so far is pointing towards a 960x540 main opaque-geometry framebuffer, what we gamers consider the game resolution, with a native 720p HUD/UI overlaid. This official response is a PR-phrased explanation of what they mean by 720p, given that the game was described as 720p, and as usual '720p' is used when the output is 720p irrespective of the render resolution.
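To make the mixed-resolution compositing concrete, here's a minimal CPU sketch: 'particles' are shaded into a buffer with half the width and height (a quarter of the pixels) and then alpha-blended over the full-resolution scene. Nearest-neighbour upsampling is used for brevity (real engines typically use bilinear or edge-aware upsampling), and all buffers and values are synthetic:

```cpp
#include <cstdio>
#include <vector>

int main() {
    const int W = 8, H = 4;            // "full-res" scene (stand-in for 720p)
    const int qW = W / 2, qH = H / 2;  // quarter-res particle buffer (1/4 the pixels)

    std::vector<float> scene(W * H, 0.2f);        // dim background
    std::vector<float> particleRGB(qW * qH, 0.0f);
    std::vector<float> particleA(qW * qH, 0.0f);

    // "Render" one bright particle into the low-res buffer.
    particleRGB[1 * qW + 1] = 1.0f;
    particleA  [1 * qW + 1] = 0.8f;

    // Composite: nearest-neighbour upsample + standard alpha blend.
    // Shading cost was paid for qW*qH pixels instead of W*H -- a 4x saving,
    // at the price of blockier particle edges.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            int q = (y / 2) * qW + (x / 2);
            float a = particleA[q];
            scene[y * W + x] = particleRGB[q] * a + scene[y * W + x] * (1.0f - a);
            printf("%.2f ", scene[y * W + x]);
        }
        printf("\n");
    }
    return 0;
}
```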

Much appreciated, Shifty. I keep hearing about the quarter-resolution particle buffer in KZ2, and now I see what's going on.
 
From what I remember, the 360 version of RE5 implemented 720p with 4xMSAA (with the amount of AA applied changing dynamically), and Heavenly Sword pushed the same on PS3 some years ago. Nevertheless, the engine seems to handle a huge amount of transparencies and alpha textures... maybe the reason for the 540p render size is found there.

Now, I'm very curious about the evolution of the engine during all this time. From being an ambitious technical showcase for expensive PC hardware to the more cautious console version, there's quite a change in my opinion, and the developer surely had to change their mind (and the goals of the project) a bit. I wonder how much the different versions of the engine have differed over all these years.

The developer said that the engine was reworked several times during development. Is it reasonable to think that one of these changes could have happened in the last few months? Could the moderate resolution have been forced by the annoying screen tearing reported during E3? The game now seems to be very stable.

A post-mortem analysis of the engine would be really, really interesting. I hope DF does one.

Bye
 