For comparison, it's the same type of resolution, AA, and visual style as Forbidden Siren on PSN.
It's the same choice.
The vast majority are pre-rendered with post-processing elements, and in 720p. And it's the same genre.
It's a funny coincidence, though.
More questions: Is the engine using the same resolution for cutscenes (real-time or not) and in-game? Are the cutscenes, from what we know, real-time rendered or just prerecorded?
Also, I think they use more detailed character models for cutscenes, if I'm not mistaken.
Yes, higher-quality shading as well as higher-poly models. Almost the same as UC2, which uses higher-quality skin shading but the same poly model for cutscenes (for Drake at least).
What I don't get is the following, though: the 360's OS/scaler is the one responsible for upscaling. Remedy said the signal that goes to the scaler is 720p. So before the 360's scaler even touches it, it's already a 720p signal. But because the geometry is rendered at 540p, the game gets listed that way? Right?
Yeah, but how can the signal that goes to the scaler be 720p when the framebuffer is at 540p? If the game runs at 540p, then the signal to the scaler must be 540p, and the 360 then upscales the image to 720p, 1080i/p, etc. Isn't that right?
I'm confused.
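One way to untangle the two claims is just to compare pixel counts. The sketch below assumes the commonly reported 960x540 main render target for Alan Wake's geometry pass; the numbers are illustrative, not taken from Remedy.

```python
# Compare the low-res geometry buffer with the final composed 720p frame.
# Assumes a 960x540 main render target (commonly reported, not confirmed
# in the quote) and a 1280x720 final frame sent to the console's scaler.

def pixels(width, height):
    """Total pixel count of a buffer."""
    return width * height

main_rt = pixels(960, 540)    # resolution geometry/lighting is rendered at
output = pixels(1280, 720)    # resolution of the composed frame

print(main_rt)                # 518400
print(output)                 # 921600
print(output / main_rt)       # the 720p frame holds ~1.78x more pixels
```

So both statements can be true at once: the geometry pass runs at 540p, but the frame the engine hands to the hardware scaler is already 720p.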
"Modern renderers don't work by rendering everything to a certain final on-screen resolution, but use a combination of techniques and buffers to compose the final detail-rich frames, optimizing to improve the visual experience and game performance.
Alan Wake's renderer on the Xbox360 uses about 50 different intermediate render targets in different resolutions, color depths and anti-alias settings for different purposes. These are used for example for cascaded shadow maps from sun & moon, shadow maps from flashlights, flares and street lights, z-prepass, tiled color buffers, light buffers for deferred rendering, vector blur, screen-space ambient occlusion, auto-exposure, HUD, video buffers, menus and so on. In the end all are combined to form one 720p image, with all intermediate buffer sizes selected to optimize image quality and GPU performance. All together the render targets take about 80 MB of memory, equivalent in size to over twenty 720p buffers."
There's not a single word in there that suggests the game runs at 540p; quite the contrary. I don't understand how you guys are not getting this.
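The memory figure in the quote is easy to sanity-check. Assuming 4 bytes per pixel (e.g. RGBA8; the quote doesn't specify buffer formats), 80 MB really does work out to "over twenty" 720p buffers:

```python
# Sanity-check the quote: 80 MB of render targets vs. 720p-sized buffers.
# Assumes 4 bytes per pixel (RGBA8-like); actual formats vary per target.

bytes_per_720p = 1280 * 720 * 4      # one 720p buffer: 3,686,400 bytes
total = 80 * 1024 * 1024             # 80 MB of render targets in total

print(total / bytes_per_720p)        # ~22.8, i.e. "over twenty" 720p buffers
```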
The cutscenes are prerecorded.
That's an old quote. Just read the previous post.
Maybe they're just upscaling in-engine with some ad-hoc code, so the framebuffer is actually 720p and the 360's hardware scaler never has to do any work.
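That idea can be sketched in a few lines: render low-res, resample to 720p yourself, and only then hand the frame to the video output, so the hardware scaler only ever sees a 720p signal. This is a minimal nearest-neighbor sketch in plain Python; a real engine would do a filtered resolve on the GPU, and nothing here is taken from Remedy's code.

```python
# Minimal in-engine upscale sketch (nearest-neighbor for brevity).
# src is a row-major list of pixels of size src_w x src_h; the result
# is dst_w x dst_h, ready to be presented as the "native" framebuffer.

def upscale_nearest(src, src_w, src_h, dst_w, dst_h):
    dst = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # nearest source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # nearest source column
            dst.append(src[sy * src_w + sx])
    return dst

# Toy example: a 2x2 "frame" blown up to 4x4.
frame_low = [1, 2, 3, 4]
frame_high = upscale_nearest(frame_low, 2, 2, 4, 4)
print(frame_high)  # [1, 1, 2, 2, 1, 1, 2, 2, 3, 3, 4, 4, 3, 3, 4, 4]
```

The same call with `(src, 960, 540, 1280, 720)` would turn a 540p buffer into a 720p one before it ever reaches the scaler.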