HDR in Xbox 360 games

nightshade

I've actually spent some time in the past looking around for this, but recently I've developed more curiosity in this area. I know that a few PS3 games did HDR using the LogLuv format, like Uncharted, TLoU, God of War and Heavenly Sword. Other games like GT5/GT6 used it too, and Battlefield 3 on PS3 used FP16 versus the Xbox 360 version's FP10. But when it comes to Xbox 360 I never really heard or read of many games; in fact the only games I can recall are the Halo games with their dual buffer setup, and maybe PGR3 (though it was dropped in PGR4).

So my question is: did the vast majority of games last gen, especially on Xbox 360, end up opting for either RGBA8 (LDR) or FP10 (MDR) for their range? Or am I missing a lot here? It'd be funny if that were the case, considering HDR was such a hot topic before the seventh gen era started and was touted as one of the features that would make games look next generation, quite similar to how PBR is seen today.
 
I never shipped an Xbox 360 game, but I would assume that the FP10 format was pretty popular since it had a low eDRAM footprint and it supported alpha blending. Blending support is important not just for transparencies, but also for the kinds of deferred rendering techniques that were common last generation (most of them used additive blending to accumulate lighting from multiple light sources).

DX10 added a similar format (R11G11B10_FLOAT) that is now supported by all modern GPUs, and I wouldn't be surprised if quite a few recent games make use of it.
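As a concrete reference, here's a small Python sketch of unpacking that DXGI format (no sign bits, a 5-bit exponent with bias 15 per channel, 6-bit mantissas for R/G and a 5-bit mantissa for B, R stored in the least significant bits). It's only meant to show the layout, not any particular engine's code.

def small_float(bits, m_bits, e_bits=5, bias=15):
    # Decode one unsigned small-float channel: e_bits of exponent above m_bits of mantissa.
    m = bits & ((1 << m_bits) - 1)
    e = bits >> m_bits
    if e == 0:                              # denormal range
        return (m / (1 << m_bits)) * 2.0 ** (1 - bias)
    if e == (1 << e_bits) - 1:              # all-ones exponent: Inf/NaN
        return float('inf') if m == 0 else float('nan')
    return (1.0 + m / (1 << m_bits)) * 2.0 ** (e - bias)

def decode_r11g11b10(word):
    # R in bits 0-10, G in bits 11-21, B in bits 22-31 of the 32-bit texel.
    return (small_float(word & 0x7FF, 6),
            small_float((word >> 11) & 0x7FF, 6),
            small_float(word >> 22, 5))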
 

That makes sense, since Xenos had no support for alpha blending with FP16, which would be a no-go when dealing with MRTs.
Regarding R11G11B10_FLOAT, I read sebbi's explanation here and it seemed to have been an alternative for Xbox 360, though I don't know what games used it:
http://beyond3d.com/showthread.php?t=46241
 
in fact the only games I can recall are the Halo games with their dual buffer setup
Halo only used the dual buffers for 3 and ODST.

It was an interesting solution, allowing luminance up to 128x the LDR white level to be handled while maintaining blending support, but you wind up with 16 bits per channel and slower blending (since both buffers need to get blended).

Reach ditched the dual buffers for 7e3. This dropped the range to only 8x over white level, but it enjoys much faster blending and half the footprint. Reach mostly seems to manage to get away with this, as the imagery isn't as high contrast as Halo 3's.

I have no idea what Halo 4 does under the hood, but I've noticed bloom clamping quite a lot, so I can't imagine the HDR depth is particularly high.
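To illustrate the general shape of that dual-buffer idea, here's a rough Python sketch: two LDR buffers holding the same scene at exposures a fixed factor apart, with the darker one taking over once the brighter one saturates. The 128x factor comes from the post above; everything else (the exact encoding, how the value is actually reconstructed in the Halo engine) is a guess for illustration only.

SCALE = 128.0  # ratio between the two exposures, per the post above

def encode_dual(lum):
    # Store one luminance value into two 0..1 LDR buffers at different exposures.
    bright = min(lum, 1.0)           # clips at LDR white
    dark = min(lum / SCALE, 1.0)     # keeps headroom up to 128x LDR white
    return bright, dark

def decode_dual(bright, dark):
    # Rebuild an HDR-ish value: trust the bright buffer until it saturates.
    return bright if bright < 1.0 else dark * SCALE

Blending has to happen into both buffers, which is where the "slower blending" mentioned above comes from.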
 

Interesting, I can see it now, but I never would've guessed, since there was no noticeable black/white crushing and the light intensity from a bright source like the sun seemed strong enough to give the appearance of HDR. I say this because 7e3/FP10 isn't really HDR, it's more like MDR (if such a term existed). Isn't this also what the Gears games and most if not all UE3 games on 360 use?
 
and the light intensity from a bright source like the sun seemed strong enough to give the appearance of HDR.
I don't have a very good eye for this stuff, but what you're bringing up sounds like something that has more to do with the assets and the lighting pipeline's front end than with the backbuffer format.

I say this because 7e3/FP10 isn't really HDR, it's more like MDR (if such a term existed)
NeoGAF semantics bleedover?

;)
 
The FP11/11/10 format in DX10 is not really similar to the FP10 format on the 360, because of the vastly different dynamic range:

- FP11/11/10 uses a 5-bit exponent, giving 37 stops (bits) of range for red and green and 36 stops for blue (or maybe 36/35 stops if one of the exponent values is reserved for special values like Inf and NaN).
- FP10 on Xbox 360 (7e3) uses a 3-bit exponent, giving only 14 stops of dynamic range.

So they are nothing alike when it comes to HDR rendering; FP10 on Xbox 360 is more comparable to sRGB in dynamic range:

- 8-bit sRGB has about 11.7 stops of dynamic range.
- 10-bit sRGB has about 13.7 stops of dynamic range, almost the same as FP10 on Xbox 360.

But that assumes proper sRGB; I don't know if any hardware supports 10-bit sRGB, and 8-bit sRGB on the Xbox 360 was a rough approximation giving much less than 11.7 stops of range.

I would say that calling FP10 with a 3-bit exponent even MDR is very generous; it's really an LDR format.
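The stop counts above fall out of a one-liner: the dynamic range of an unsigned small-float format is log2(largest value / smallest non-zero denormal). A quick Python check (the bias of 3 for the 360's 7e3 format is my assumption; it matches the 14-stop figure above):

import math

def stops(m_bits, e_bits, bias, reserve_top_exponent=False):
    # Dynamic range in stops: log2(largest value / smallest non-zero denormal).
    top_e = (2 ** e_bits - 1) - (1 if reserve_top_exponent else 0)
    largest = (2.0 - 2.0 ** -m_bits) * 2.0 ** (top_e - bias)
    smallest = 2.0 ** (1 - bias - m_bits)    # smallest denormal
    return math.log2(largest / smallest)

print(stops(6, 5, 15))   # ~37 stops: the 11f channels of R11G11B10
print(stops(5, 5, 15))   # ~36 stops: the 10f blue channel
print(stops(7, 3, 3))    # ~14 stops: Xbox 360 7e3 (bias of 3 assumed)

Setting reserve_top_exponent=True gives the 36/35 figures from the parenthetical, where the top exponent value is kept for Inf/NaN.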
 
I don't have a very good eye for this stuff, but what you're bringing up sounds like something that has more to do with the assets and the lighting pipeline's front end than with the backbuffer format.
The way I look at it is that, generally speaking, games without HDR will avoid using bright light sources or low-lit areas. As you know, without HDR the range runs from a float value of 0.0 to 1.0, where the brightest light is 1; anything close to 1 gets crushed to white and anything close to 0 gets crushed to black. But a light source such as the sun is supposed to be many, many times brighter than any local light source, so to avoid white crush the game has to tone down the intensity of that light source considerably. So if the intensity of illumination from the sunlight is about the same as that from a bright lightbulb, it's probably due to the limited range, but if the difference is quite large while the details are still retained, then it's probably HDR. At least that's how I try to look for this.
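A trivial numerical example of that point (the intensities are made up): with a clamped 0..1 range the sun and a lightbulb end up identical, while HDR values run through even a simple tone mapping curve keep the difference.

def reinhard(v):
    # A simple HDR tone mapping curve, just for illustration.
    return v / (1.0 + v)

sun, bulb = 40.0, 2.0                        # made-up linear intensities

ldr = [min(v, 1.0) for v in (sun, bulb)]     # LDR clamp: both become 1.0
hdr = [reinhard(v) for v in (sun, bulb)]     # ~0.976 vs ~0.667: difference survives

print(ldr, hdr)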

That is me actually, which is why I became more curious about this. I've stopped following that discussion, so this isn't for that but rather purely for my own knowledge :)
 
7e3 (10f) is HDR. It has enough dynamic range to map the light intensity range we commonly see. However, blending to that format on Xbox 360 happens at 16-bit fixed point precision, and that limits the brightness (cutting highlights) and the precision in dark areas. Modern hardware is better at handling this format (no quality loss). On modern hardware this format has just (barely) enough precision to output a good quality HDR image. This means that if you do lots of processing passes (like adding lots of darker light sources to get the final lighting accumulation), you will get banding. 11f-11f-10f is slightly better (twice the precision in the green and red channels). Green and red are the most important colors for the human eye, as blue contributes only about 7% of perceived luminance. So 11f-11f-10f is an acceptable format if you have a compute shader based lighting pipeline (single pass tiled or clustered).
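To make the banding point concrete, here's a small Python experiment: accumulate many dim light contributions while re-quantizing the running total to a 7e3-style value after every blend (the bias of 3 is my assumption, matching the 31.875 max value), versus keeping it in full float. The quantized sum drifts, and neighbouring pixels whose true sums differ by less than the local step size snap to the same value, which is where the visible banding comes from.

import math

def quantize_7e3(x, m_bits=7, e_bits=3, bias=3):
    # Round x to the nearest value representable as an unsigned 7e3 float
    # (7-bit mantissa, 3-bit exponent, assumed bias of 3 -> max value 31.875).
    if x <= 0.0:
        return 0.0
    max_val = (2.0 - 2.0 ** -m_bits) * 2.0 ** ((2 ** e_bits - 1) - bias)
    if x >= max_val:
        return max_val
    e = max(math.floor(math.log2(x)), 1 - bias)   # binade, clamped to denormal range
    step = 2.0 ** (e - m_bits)                    # spacing of representable values here
    return round(x / step) * step

exact, quantized = 0.0, 0.0
for _ in range(200):                  # 200 additive passes of a dim contribution
    exact += 0.002
    quantized = quantize_7e3(quantized + 0.002)

print(exact, quantized)               # ~0.4 vs 0.390625: per-pass rounding adds up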

16f is obviously easier to use, since you don't always need to think about data precision. It's more robust and flexible, and modern GPUs have full fill rate when outputting to 4x16f render targets. The performance difference (if any) comes down to the available bandwidth of the rendering passes that need to read and write HDR data.
Regarding R11G11B10_FLOAT, I read sebbi's explanation here and it seemed to have been an alternative for Xbox 360
This is not right. x360 doesn't support 11 bit float formats. This format was introduced in DirectX 10. For more information see the (publicly available) XNA documentation at Microsoft's website. Microsoft has basically revealed all the console hardware details to public (to help developers and also hobbyists in making XNA games).
 
Alright, I understand now... thanks for the explanation.
So it just seems to be a matter of the hardware not being able to efficiently make full use of the available range.


This is not right. x360 doesn't support 11 bit float formats. This format was introduced in DirectX 10. For more information see the (publicly available) XNA documentation at Microsoft's website. Microsoft has basically revealed all the console hardware details to public (to help developers and also hobbyists in making XNA games).

Ah, I only assumed that because this was written above your quote:
"Other formats do exist, such as R11G11B10 on Xbox 360 as Sebbbi explains"
 
We used RGBM last gen, yes. x360 doesn't support 10f texture sampling (rendering is supported). Bilinear from 4x16f is quarter rate, meaning that RGBM post processing (such as blur kernels) is up to 4x faster. Bilinear from RGBM is not 100% correct, however: you will see a minor color band whenever the multiplier changes. It's very hard to notice, so the trade-off is definitely worth it. RGBM is also good on low end (integrated) GPUs that have similarly slow floating point texture bilinear sampling performance.
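For anyone curious, RGBM in its common form looks roughly like this (a Python sketch, not necessarily the exact encoding used here; the range constant of 6 is a typical choice, not something from this thread): the color is divided by a shared multiplier M stored in alpha, so an 8888 target can hold values above 1.

import math

MAX_RANGE = 6.0   # assumed maximum HDR value the encoding can hold

def rgbm_encode(r, g, b):
    # Scale into 0..1, pick the largest channel as the shared multiplier M,
    # and quantize M to 8 bits (rounding up so the RGB part stays <= 1).
    r, g, b = (min(c / MAX_RANGE, 1.0) for c in (r, g, b))
    m = max(r, g, b, 1e-6)
    m = min(math.ceil(m * 255.0) / 255.0, 1.0)
    return r / m, g / m, b / m, m

def rgbm_decode(r, g, b, m):
    return r * m * MAX_RANGE, g * m * MAX_RANGE, b * m * MAX_RANGE

The bilinear issue mentioned above comes from the hardware filtering RGB and M separately: the product of two interpolated values isn't the interpolation of the product, so a slight band can appear where M jumps between texels.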
 