The technology of Alan Wake *spawn

You can apply post-processing anywhere along the chain. It depends on what exactly they are doing! Adding film grain can be done on the upscaled 720p image, which will improve the sense of quality a notch versus upscaling noise applied at 540p. I'd need to know what they are doing in post-processing to guess whether the resolution is affecting perceptible quality, though.
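To illustrate the ordering point, here's a toy sketch (Python/NumPy; the functions and numbers are mine, not anything Remedy has described). Grain added before the upscale gets interpolated into soft blobs, while grain added after stays pixel-sharp:

import numpy as np
from scipy.ndimage import zoom

def add_grain(img, strength=0.03, seed=0):
    # Additive monochrome noise, clamped to the displayable range.
    rng = np.random.default_rng(seed)
    return np.clip(img + rng.normal(0.0, strength, img.shape), 0.0, 1.0)

def upscale(img, factor):
    # order=1 gives bilinear interpolation, a crude stand-in for the console scaler.
    return zoom(img, factor, order=1)

frame_540 = np.zeros((540, 960))  # stand-in for a rendered 960x540 frame

noisy_then_scaled = upscale(add_grain(frame_540), 720 / 540)  # grain gets blurred too
scaled_then_noisy = add_grain(upscale(frame_540, 720 / 540))  # grain stays crisp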
 
Yes, because every game we've counted resolution on has been treated this way. It is the standard measure. The only thorough alternative is to list 50 different rendertarget resolutions for every game, which is absurd and doesn't give any useful comparison. This isn't about applying a number to how good a game looks - "It's only 540p so it looks worse than a 720p game". How good a game looks is subjective and dependent on many factors. It's about comparing engines. Take two similar games, compare the opaque geometry complexity and rendering quality, and you'll have a reasonable comparison of what rendering performance the developers have extracted from the hardware.

Furthermore, the opaque geometry makes up by far the bulk of a player's visual experience. There'd be no point rendering particles, UI, smoke etc. at 2160p if the opaque geometry is being rendered at SD resolution with no AA. The game overall will have poor IQ and look rough. If there are cutbacks to be made in buffer resolutions, they'll be made in particles, shadows and so forth that don't have as prominent an appearance in the final image. Aiming for high opaque geometry resolution and high AA is a smart target if you want your game to look good on an HD set. There's a good argument to be made for lower resolutions as long as you crank up the AA (real-life SDTV looks better than computer games), but achieving that high AA is as much a challenge as rendering to a high resolution.

It makes sense, but I still find the source the results are based on questionable, unless the actual game/demo was analyzed; that I don't know.
If it can really be proved that the opaque geometry is actually rendered at sub-HD resolution, then those HD-not-HD debates won't make any sense anymore.
 
It makes sense, but I still find the source the results are based on questionable, unless the actual game/demo was analyzed; that I don't know.
If it can really be proved that the opaque geometry is actually rendered at sub-HD resolution, then those HD-not-HD debates won't make any sense anymore.
Those screenshots indicate a 720p HUD, a 720p cutscene & 540p gameplay. It's fair to assume that the source is legit.
 
Personally, I think saying this game runs at 540p is not a fully correct statement.

It's more than adequate, considering that the visually most important parts are indeed rendered at 540p. There's no real difference compared to any other sub-720p game out there, so let's not try to make this one an exception.
 
Yes, of course, but we have an "if" there.
Once ascertained, it'll be set down in black and white. Now, would you go so far as to state that this game has to be considered 540p? Hmm..

Is COD4 640p or 720p? Is Halo3 720p? GTA4 on the PS3?

Or is Killzone 2 1440p because they do some stuff on a 2x multisample buffer?

Don't treat this game differently just because of some personal bias, please.
 
It makes sense, but I still find the source the results are based on questionable, unless the actual game/demo was analyzed; that I don't know.

Plenty of official and evidently 540p screenshots are already out on the internet.
 
I don't understand why the low res. When using complex lighting effects on the 360, does eDRAM become an Achilles' heel? I'd like to see a DF article on it.

If by having a 540p main buffer (let's put it this way :p) they are already hitting 80 MB+ just in render targets, I can only imagine how that would skyrocket at a higher res.
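Some back-of-envelope numbers (Python; the FP16 RGBA format at 8 bytes per pixel is just an assumed example, since we don't know Remedy's actual layout):

MB = 1024 * 1024

def rt_bytes(width, height, bytes_per_pixel):
    # Raw size of one render target, ignoring alignment and tiling overhead.
    return width * height * bytes_per_pixel

print(rt_bytes(960, 540, 8) / MB)   # one FP16 RGBA target at 540p: ~3.96 MB
print(rt_bytes(1280, 720, 8) / MB)  # the same target at 720p: ~7.03 MB

# 720p has 1.78x the pixels of 540p, so an 80 MB render target budget
# would balloon to roughly 142 MB at the same formats:
print(80 * (1280 * 720) / (960 * 540))  # ~142.2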

It seems to me that they chose to have better-looking transparency and volumetric effects rather than having the geometry output at a higher resolution.

Whether that was a good choice, we'll just have to wait and see XD
 
It's more than adequate, considering that the visually most important parts are indeed rendered at 540p. There's no real difference compared to any other sub-720p game out there, so let's not try to make this one an exception.

I wouldn't call it the MOST visually important part of a game. Not anymore. Too much is going on during the post-processing phase to say that. I mean, Alan Wake is certainly proof that even at a very low resolution, a game can still look absolutely outstanding.

For the purposes of this thread and the resolution it is concerned with, yeah, opaque geometry framebuffer is all that matters. But saying that it is the MOST visually important part is something I don't completely agree with. Certainly a very important part, but most is a bit of a stretch to me when talking about modern engines. I mean, if it was really the most important part and they were ending up with a worse looking game visually, they would not have dropped the resolution to 960x540.
 
I did not say that the resolution was the most important; I said that what is most important - the scenery, characters, lighting, shadows and so on - is all rendered at 540p. I don't care if they add noise at 720p; that's not an important element of the final image.
 
Is COD4 640p or 720p? Is Halo3 720p? GTA4 on the PS3?

Or is Killzone 2 1440p because they do some stuff on a 2x multisample buffer?

Don't treat this game differently just because of some personal bias, please.

There's no personal bias whatsoever. I just think that when you put your hands on the actual game, you'll be able to firmly state whether it is X or Y. Otherwise no one would be arguing about it, no?
Even if it's not relevant, I've had my fair dose of experience with bullshots and doctored videos, and I'd tend to be prudent and 100% sure before confirming something as definitive.
I'm not questioning the PCs here, but I do have my reasons not to draw conclusions so fast.
That's just it.
 
Well, R&C: Tools of Destruction had very good IQ, at least on my set, and didn't look blurry or sub-HD at all. Also, the impressions from those who have seen the game in action, like blim from Gamersyde, said it looked great when demonstrated on a big HDTV.

The most confusing part of Remedy's answer is that whole bit about feeding the 360's scaler a 720p signal. I guess we have to wait and see the game in action on our sets to judge, though I'm sure, and hope, that Digital Foundry will have an in-depth analysis of the game really soon.

That's what I was thinking... Ratchet is so clean that I didn't even think it was sub-720p until I saw this thread XD

Hopefully Remedy has done something similar with Alan Wake.
 
For the purposes of this thread and the resolution it is concerned with, yeah, opaque geometry framebuffer is all that matters. But saying that it is the MOST visually important part is something I don't completely agree with.
What is even as important, let alone more important? Particle buffer resolution? Do blocky pixels on explosions/smoke stand out more than blocky pixels on object edges? Bloom buffer? Do you need to Gaussian blur on a native 720p buffer instead of upscaling a 1/16th-resolution one? Shadow buffer? If the shadows are silky smooth but your characters have chunky edges, is that less jarring than smooth edges and chunky shadows?
I mean, if it was really the most important part and they were ending up with a worse looking game visually, they would not have dropped the resolution to 960x540.
Without any idea what the bottleneck is, you can't make that call. However, a look at every game out there that prioritises the opaque geometry buffer above all other buffer resolutions shows it's recognised as the most important buffer to run at high resolution if you want crisp visuals. Reining back the resolution on the main render target is going to create a blocky or blurry result that will be immediately noticeable.

Putting it another way, how would you budget your memory and bandwidth? Would you rather have:

1) 1280x720 opaque geometry buffer and a 960x540 particle/shadow/whatever-you-choose buffer
2) 1280x720 particle/shadow/whatever-you-choose buffer and a 960x540 opaque geometry buffer

(If you answer 2, you'll need to say what buffer you'd prioritise above the opaque geom one!)
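For what it's worth, the raw memory cost of those two options is identical, so the trade is purely about where you can least afford the blocky pixels. A quick sketch (Python, assuming plain 4-bytes-per-pixel buffers purely for illustration):

MB = 1024 * 1024

def buf_mb(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / MB

option1 = buf_mb(1280, 720) + buf_mb(960, 540)  # 720p opaque + 540p secondary
option2 = buf_mb(960, 540) + buf_mb(1280, 720)  # 540p opaque + 720p secondary
print(option1, option2)  # both ~5.49 MB

Bandwidth is murkier, since particles tend to carry heavy overdraw, which is exactly why that's the buffer games shrink first.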
 
There's no personal bias whatsoever. I just think that when you put your hands on the actual game...but I do have my reasons not to draw conclusions so fast.
That's just it.
Don't confuse the current round of discussion with a critique of final game quality. This is spawned from the Pre-release thread with good reason, and we're only talking about the current showings of the current engine. This makes no suppositions about the final game people will get to play, or whether it'll feel high quality or not. It's a very cold, analytical consideration of the render target resolutions being used in the current (showcased) build.
 
Don't confuse the current round of discussion with a critique of final game quality. This is spawned from the Pre-release thread with good reason, and we're only talking about the current showings of the current engine. This makes no suppositions about the final game people will get to play, or whether it'll feel high quality or not. It's a very cold, analytical consideration of the render target resolutions being used in the current (showcased) build.

Okay, then that changes everything. Sorry for the misunderstanding.
Thanks.
 
That's what I was thinking... Ratchet is so clean that I didn't even think it was sub-720p until I saw this thread XD

Hopefully Remedy has done something similar with Alan Wake.

I didn't know that either.

I believe this is the only Sony-published PS3 series this gen that is not 720p and up, right?
 
Maybe they are seeking better AA at all costs rather than the largest native resolution (to avoid screen tearing?). Personally, I don't appreciate this. I'd prefer at least 720p, even with a lower resolution in some things, like the particle effects in Killzone 2.

Interesting article about this:

http://www.eurogamer.net/articles/digitalfoundry-not-so-high-definition-article

(540p is a way to do 4xMSAA in the 10 MB eDRAM space. It's something like 800x600 with 4x FSAA from when I played some PC games; many details are lost even on high settings.)
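The rough eDRAM arithmetic behind that point (Python; assuming 4-byte colour plus 4-byte depth stored per sample and ignoring alignment, so treat these as ballpark figures):

import math

def edram_mb(w, h, msaa, color_bpp=4, depth_bpp=4):
    # Colour and depth are both stored per-sample in the 360's eDRAM.
    return w * h * msaa * (color_bpp + depth_bpp) / (1024 * 1024)

for w, h, aa in [(1280, 720, 4), (960, 540, 4), (960, 540, 2)]:
    mb = edram_mb(w, h, aa)
    tiles = math.ceil(mb / 10)  # 10 MB of eDRAM per tile
    print(f"{w}x{h} {aa}xMSAA: {mb:.1f} MB -> {tiles} tile(s)")

# 1280x720 4xMSAA: 28.1 MB -> 3 tiles
# 960x540 4xMSAA: 15.8 MB -> 2 tiles
# 960x540 2xMSAA: 7.9 MB -> 1 tile (fits without tiling at all)

So even if 540p with 4xMSAA still needs two tiles by this maths, dropping from 720p saves a whole tiling pass, and 2xMSAA at 540p would fit outright.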
 
I only played the demo of A Crack in Time, but I finished both Tools of Destruction and Quest for Booty, and the first thing that popped into my head was whether I had accidentally disabled 720p/1080p in the PS3 system settings. The blurry image was immediately apparent to me.
 
Oh wow, so it's only 540p then; that's going to provide yet more console war ammunition. I always wondered how they managed to do 720p with 4xAA whilst having so much going on.

That being said, the game is easily the best looking 360 title to date (can anyone think of a close competitor?) and the complete lack of aliasing along with all the post processing effects give it a very cinematic, realistic quality to the visuals compared to most other titles even though they run at higher resolutions (similar to the visual realism of KZ2).

And despite being 540p, AW's IQ is better than COD or Battlefield, so I don't think it's a big deal (apart from the fact we were told it was 720p).
I mean, I wouldn't mind seeing Halo 3 or Reach running at 540p with 4xAA; IMO that's much, much better than 640p with no AA. For me, 'jaggies' destroy the suspension of disbelief like nothing else.
 