Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Unless the people coming up with their own "analysis" and claiming it's all due to the video's resolution can explain why the recent screenshots have a proper 720p HUD... everything they mention is irrelevant.

Let's just wait a while... we'll get the answers eventually.
 

I went ahead and spent time pixel-counting the pre-rendered cutscene shots. They are 1280x720 (I think). Considering videogamezone.de captured the cutscene shots in the same batch as the gameplay shots, I'd say that settles it... the shots are legit.
 
Guys, there is a response from Remedy:

"Alan Wake's renderer on the Xbox360 uses about 50 different intermediate render targets in different resolutions, color depths and anti-alias settings for different purposes. These are used for example for cascaded shadow maps from sun & moon, shadow maps from flashlights, flares and street lights, z-prepass, tiled color buffers, light buffers for deferred rendering, vector blur, screen-space ambient occlusion, auto-exposure, HUD, video buffers, menus and so on. In the end all are combined to form one 720p image, with all intermediate buffer sizes selected to optimize image quality and GPU performance. All together the render targets take about 80 MB of memory, equivalent in size to over twenty 720p buffers."

I have absolutely no idea what that means :p
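For what it's worth, the memory figure in that quote checks out arithmetically. A quick back-of-envelope check (my own arithmetic, assuming 4 bytes per pixel, which is not stated in the quote):

```python
# Sanity check of Remedy's "80 MB, equivalent to over twenty 720p
# buffers" claim. The 4 bytes/pixel is my assumption (32-bit color);
# actual render targets vary in depth, as the quote itself says.
BYTES_PER_PIXEL = 4

def buffer_mb(width, height, bpp=BYTES_PER_PIXEL):
    """Size of one render target in megabytes."""
    return width * height * bpp / (1024 * 1024)

one_720p = buffer_mb(1280, 720)
print(f"one 720p buffer: {one_720p:.2f} MB")        # ~3.52 MB
print(f"80 MB holds {80 / one_720p:.1f} such buffers")  # ~22.8
```

So 80 MB really is "over twenty" 720p-sized buffers at 32-bit color; the statement is internally consistent, it just doesn't say what resolution the main color buffer is.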
 
I do...it is 540p :(
 
Why not just say the framebuffer is 720p? To me it sounds like "we do a lot of stuff, and we use the 360 to output a 720p upscaled image." But until it's released, let's give them the benefit of the doubt :)
 
This is one very confusing reply. Is it rendering 720p natively or upscaled, for Christ's sake?
 

This is what I understand from it:

"...In the end all are combined to form one 720p image, with all intermediate buffer sizes selected to optimize image quality and GPU performance. All together the render targets take about 80 MB of memory, equivalent in size to over twenty 720p buffers."

So the console renders everything in 720p, but if it can't handle 720p at that moment then it'll lower the res? Like WipEout HD?
 
I think it can be safely deduced that it's upscaled. First they try to discredit the source, then they create a long-winded reply that omits whether the game is rendered at 720p. The last line, explaining they use memory equivalent to over twenty 720p buffers, is as good as saying it's being upscaled.
 

If it's upscaled, I don't understand the significance of 720p. Because, from what I understand, the Xbox 360 OS does the upscaling, which means the original output should be less than that.

If it indeed is 720p upscaled, then what happens if I set my 360 to 1080p mode? Will it upscale the already upscaled 720p image?
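If the game did render below 720p, the answer would just be that the scale factors stack. A tiny illustration (the 540p internal height is a hypothetical figure from this thread's speculation, not a confirmed one):

```python
# Illustrative arithmetic only: each output mode adds another
# resampling step, and the scale factors multiply.
render_h = 540                          # assumed internal height (speculation)
engine_out_h = 720                      # engine presents a 720p image
console_out_h = 1080                    # console set to 1080p mode

first = engine_out_h / render_h         # ~1.33x (engine/software upscale)
second = console_out_h / engine_out_h   # 1.5x (hardware scaler)
print(f"net scale factor: {first * second:.1f}x")   # 2.0x
```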
 
Kind of surprising for Remedy to reply like this. How hard can it be to give a straightforward answer, rather than playing with words and going in circles until the game's released? For all I know, one couldn't care less about the size/resolution of the intermediate buffers when the main doubts are over the native resolution. Also, are they trying to suggest that they are running an 80 MB framebuffer? :eek:


Btw, someone mentioned the case of the Ratchet & Clank games. Could it be possible that Remedy is pulling off something on the same level?
I don't have much idea regarding R&C's implementation, though.
 

But won't the user notice the huge difference in quality? It's almost a 50% reduction in pixels.
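Checking that figure (my own arithmetic; 960x540 is the resolution speculated in this thread, not a confirmed one):

```python
# Pixel-count comparison: speculated 540p versus native 720p.
full = 1280 * 720       # 921,600 pixels at 720p
low = 960 * 540         # 518,400 pixels at 540p
reduction = 1 - low / full
print(f"pixel reduction: {reduction:.1%}")   # 43.8%
```

So "almost 50%" is slightly generous: 540p has 43.75% fewer pixels than 720p.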
 

I do not think the "upscale" here is what we usually mean.

A traditional "upscale" means the engine renders the image into one lower-resolution framebuffer, then software or the hardware scaler upscales it to 720p.

In AW's case, several intermediate rendered sections would be combined into one "720p framebuffer", with no upscaling step involved.
 