Image Quality and Framebuffer Analysis for Available/Release-Build Games *Read the first post*

If Bethesda is indeed using FXAA for the PS3 build, I find it a little perplexing why they didn't just opt for the tried-and-true Edge MLAA solution that so many PS3 titles use nowadays. It may be heavier in terms of SPU usage, but it seems doubtful that they're already so taxed or maxed-out in terms of SPU cycles with Skyrim that they couldn't fit Cell MLAA within their performance budget and had to use FXAA instead (which presumably taxes the GPU instead).

Somehow I don't think you're getting the point of what HDR is for... or at least proper HDR rendering. The lighting shouldn't stay overexposed and washed out; it just ruins the textures and well... the scene. The bloom shouldn't be a bleeding mess either.
Oh, I know what HDR's purpose is. I also know that it's not always implemented well... or in accordance with everyone's taste. Bloom lighting is a typical component of HDR that, like it or not, obscures details by its very nature. Again, it's not always implemented well, or to every individual's taste. The 360 versions of the Modern Warfare titles are one example I can think of off the top of my head where the bloom effects might be overdone for some people.

The PS3 version of Skyrim seems to pop more because of its more prominent bloom effects, but whether or not it looks good or gets tiring over time is up to the individual.
 
The 360 versions of the Modern Warfare titles are one example I can think of off the top of my head where the bloom effects might be overdone for some people.
Not exactly, because none of the COD games use HDR, so it's a given that the bloom will cause a loss of detail due to overexposure. Good examples would be games with good HDR, like Uncharted or Halo, where the detail isn't lost.
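To make that concrete, here's a toy numpy sketch (illustrative values only, nothing from the actual engines): with an LDR target, the highlight is clamped before the bloom bright-pass ever sees it, so the bloom source is already flat; an HDR buffer keeps the over-range values for the bloom pass and the tone mapper.

```python
import numpy as np

# A hypothetical strip of scene luminance crossing a bright highlight.
# Values above 1.0 are over-range light that an LDR buffer cannot store.
scene = np.array([0.2, 0.8, 1.5, 3.0, 6.0, 3.0, 1.5, 0.8, 0.2])

# LDR pipeline: clamp first, then bright-pass for bloom.
ldr = np.clip(scene, 0.0, 1.0)
ldr_bloom_src = np.maximum(ldr - 0.8, 0.0)    # assumed bright-pass threshold

# HDR pipeline: bright-pass on unclamped values, tone map afterwards.
hdr_bloom_src = np.maximum(scene - 1.0, 0.0)
tone_mapped = scene / (1.0 + scene)           # simple Reinhard operator

print(ldr)            # highlight flattened to 1.0 across five pixels
print(ldr_bloom_src)  # bloom source is flat too: the detail is already gone
print(hdr_bloom_src)  # still distinguishes 1.5 vs 3.0 vs 6.0
print(tone_mapped)    # compresses the over-range values without a hard clip
```

Which is basically the Uncharted/Halo vs COD point: in an LDR pipeline the bloom is fed a clipped highlight, so overexposure and lost detail come with the territory.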
 
If Bethesda is indeed using FXAA for the PS3 build, I find it a little perplexing why they didn't just opt for the tried-and-true Edge MLAA solution that so many PS3 titles use nowadays. It may be heavier in terms of SPU usage, but it seems doubtful that they're already so taxed or maxed-out in terms of SPU cycles with Skyrim that they couldn't fit Cell MLAA within their performance budget and had to use FXAA instead (which presumably taxes the GPU instead).

Well, for starters, it's 1-2ms depending on the type they've implemented. It's got heavy blur, so I presume it's the really fast and cheap version. They've already got the same thing on PC, IIRC. Who knows what their priorities are or what their in-house development is like. Just be glad you got anything, I suppose.

With FXAA they don't have to do any work at all scheduling SPUs - and how do you know what their SPU usage characteristics are? They could be doing a lot of decompression, unlike, say, Portal 2, which truly is an example of just dropping MLAA in (constant level loading and nothing else happening in the world). Nor are they wasting memory on frame copies so that the SPUs have access to the framebuffer via XDR.

Plenty of plausible and simple reasons, and I'll leave it at that because we're so far off-topic.
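For a rough sense of the frame-copy cost mentioned above, a back-of-the-envelope sketch; the buffer format and transfer rate are assumptions for illustration, not known Skyrim or RSX/Cell figures:

```python
# Rough cost of mirroring a framebuffer into XDR so the SPUs can read it.
# All numbers are illustrative assumptions, not measured Skyrim data.
width, height, bytes_per_pixel = 1280, 720, 4       # assumed 720p 32bpp target
frame_bytes = width * height * bytes_per_pixel
print(f"copy size: {frame_bytes / 2**20:.2f} MiB")  # ~3.52 MiB of XDR

# Assuming an effective ~10 GB/s for the round trip (copy in, results out):
transfer_s = 2 * frame_bytes / 10e9
print(f"transfer time: {transfer_s * 1e3:.2f} ms per frame")  # ~0.74 ms
```

Even under generous assumptions, that's memory and a slice of frame time a team might simply prefer to spend elsewhere.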
 
I believe so. The PC version's "HDR" option looked fairly different when it was doing the tonemapping stuff; or at least, it was much more limited on console, and even here I see very little in the way of dynamic range. They went a bit overboard with the sky lighting, IIRC. Skyrim so far seems quite a bit different in art style there.

Amusing, but not the point of the thread or tech discussion. *shrug*

Somehow I don't think you're getting the point of what HDR is for... or at least proper HDR rendering. The lighting shouldn't stay overexposed and washed out; it just ruins the textures and well... the scene. The bloom shouldn't be a bleeding mess either.

Are you so sure about LDR on the PS3? Because in this video posted earlier:
http://www.youtube.com/watch?v=55P2W3F26sA&feature=player_embedded
in the panoramic view of the castle, the textures on PS3 seem to reflect more of the environment light, whereas on 360 they appear darker...
 
The PS3 has inherently higher gamma compared to the Xbox 360, which usually results in a slightly brighter image. A lot of comparisons seem to forget about that and don't adjust for it. Make sure that's not what you're seeing... just because something on the PS3 might be "brighter" doesn't mean the lighting engine is doing anything different; it's probably just the way the system itself is spitting it out.

That's one of the things that bugs me about most of the comparisons I see out there. If your TV is properly calibrated on each input, the output of those two systems will be nearly identical in terms of brightness, color, and contrast, yet there's always a big disparity at places like LensofTruth. Digital Foundry seems to get it right from what I've seen.
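A quick sketch of the gamma point, with made-up curves; the actual transfer functions in either console's output path are not something I'm asserting here:

```python
# The same stored pixel value pushed through two slightly different
# display gamma curves -- illustrative numbers, not measured console output.
def to_display(value, gamma):
    return value ** (1.0 / gamma)

pixel = 0.5                    # identical framebuffer content on both systems
print(to_display(pixel, 2.2))  # ~0.730  hypothetical "360-like" output
print(to_display(pixel, 2.4))  # ~0.749  hypothetical "PS3-like", brighter mids
```

The engine wrote the same number in both cases; only the output curve differs, which is exactly the difference per-input TV calibration should cancel out before anyone starts comparing lighting engines.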
 

I'm not talking about brightness, but about the environment light reflected on the textures.
 
Regarding the shadows from the cage, I thought I noticed a slight movement on the character's body. It was a very low-res shadow and a subtle movement; you really have to stick your face in front of it to notice it. But yes, at least one shadow-casting light source is dynamic.
I've been looking for videos on the lighting of the game and I found this one. The shadow casting is completely dynamic, as you can see. And I don't think there is a difference between versions. The skellie's shadow on the PS3 being a bit less dynamic in that video could have happened because of the open nature of Skyrim.

Here is the aforementioned video:

 

I don't have the game, but I've seen enough walkthroughs to have an idea... honestly, IMHO I haven't found any sort of LDR in the PS3 version, especially after rewatching a game like Killzone where the LDR is really evident... weirdly, compared with the 360 sometimes, the latter seems too dark and the textures seem to reflect the light a bit less in the extremely high-contrast scenes... but I imagine that's caused more by the particular palette of the 360.
 
IMHO I haven't found any sort of LDR in the PS3 version, especially after rewatching a game like Killzone where the LDR is really evident...

Not very comparable considering the wealth of post-processing and lighting effects that compound the issue of LDR buffers much more severely.
 
I think it's safe to expect that, like Fallout 3, we'll see the 360 having the advantage again. It really is quite impressive that they've managed to get 4xMSAA in; what is it with open-world games and 4xMSAA? (Red Dead also had it.)

Also, going back to the BF3 analysis - with regard to the banding on the red flare in that campaign scene on 360 - I think the 360 is using a lower-precision FP10 buffer, since I see the same banding on the tac lights in multiplayer (though it's more noticeable when looking at a tac light from further away rather than when it's in your face).

Has anyone else noticed this? The other thing it could be is some sort of LOD, where further away the tac light's beam is changed from a spot light to some sort of animated sprite?
 
Off-topic...

Yeah, that seems a strong possibility (the FP10 issue). I think there was a SIGGRAPH presentation on Cars 2 that talked about the problem and how to mitigate it.
 

Sorry, was going to post it in the Digital Foundry thread but had the wrong tab open.

With FP10, wouldn't there be other effects on the lighting besides just the banding? I.e., wouldn't it look worse than the PS3 in other areas too?
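To illustrate why a low-precision float target bands on smooth gradients, here's a small sketch. FP10 on the 360 is commonly described as an unsigned minifloat with a 7-bit mantissa (vs 10 bits for FP16); treat that layout, and the simplified quantizer below, as assumptions for illustration:

```python
import numpy as np

def quantize_mantissa(x, mantissa_bits):
    """Round positive values onto a float grid with the given mantissa width.
    Ignores exponent range and denormals -- enough to show the banding."""
    x = np.asarray(x, dtype=np.float64)
    exp = np.floor(np.log2(x))                  # power-of-two bucket
    step = 2.0 ** (exp - mantissa_bits)         # value spacing in that bucket
    return np.round(x / step) * step

gradient = np.linspace(0.25, 0.5, 1000)         # one smooth dim gradient span
fp10_like = quantize_mantissa(gradient, 7)      # assumed FP10-style mantissa
fp16_like = quantize_mantissa(gradient, 10)     # half-float mantissa width

print(len(np.unique(fp10_like)))  # ~129 distinct levels -> visible bands
print(len(np.unique(fp16_like)))  # ~1000 -> effectively smooth here
```

As for it looking worse everywhere: roughly 128 levels per power-of-two octave is usually masked by texture detail and dithering; it's mostly slow, smooth gradients like flares, light beams, and fog where the coarser spacing shows up as bands.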
 
I wouldn't really be using a video that's clearly showing bugged textures on the 360 to support your point here. :p

I don't follow the logic of this quote: what does HDR have to do with bugged textures? It's just a streaming issue, nothing more...
That said, the PS3 version is a buggy mess even graphically, and I wouldn't be surprised if the dynamic lighting system followed the same path... I have great difficulty understanding what's going on; it's the first time I've seen LDR showing the 'stuttering' effects of HDR...
 

No, it isn't any kind of MSAA... it seems more like FXAA or MLAA... unfortunately, blur is a typical side effect of those filters. You can try to calibrate for more edge detection/IQ sharpness, but less blur means fewer edges detected, which causes more jaggies, while deeper use of the filter adds more blurred edges... it isn't so simple to find the best combination, because the filter can become useless for its purpose.
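To make that tradeoff concrete, here's a toy 1-D version of a luma-based post-AA filter (a generic sketch, not the actual FXAA or MLAA algorithms): lowering the edge-detection threshold catches more jaggies but also "finds" edges in texture detail and smears it.

```python
import numpy as np

def post_aa_1d(luma, threshold):
    """Toy 1-D post-process AA: blend any neighbouring pixel pair whose
    contrast exceeds the threshold. Real FXAA/MLAA are far smarter."""
    out = luma.copy()
    contrast = np.abs(np.diff(luma))
    for i, c in enumerate(contrast):
        if c > threshold:                        # "edge" detected here
            avg = 0.5 * (luma[i] + luma[i + 1])  # blend across the edge
            out[i], out[i + 1] = avg, avg
    return out

# A hard geometric edge (0.0 -> 1.0) next to fine 0.6/0.4 texture detail.
row = np.array([0.0, 0.0, 1.0, 1.0, 0.6, 0.4, 0.6, 0.4])

print(post_aa_1d(row, threshold=0.5))  # softens the jaggy, keeps the texture
print(post_aa_1d(row, threshold=0.1))  # also flattens the texture detail
```

That's essentially the calibration problem described above: a threshold low enough to catch the last jaggies is also the one that starts eating legitimate texture edges.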
 