HDR - The Halo Way * Spin-Off


Dunno why, but overall you have 27% fewer pixels to shade than you would rendering a 720p image, which can be a good thing if you're limited by pixel shaders.
At 12 bytes per pixel (8 + 4) it also fits nicely in EDRAM.
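For what it's worth, a quick back-of-the-envelope check of that footprint (my own numbers, assuming the commonly reported 1152x640 "640p" resolution and the 8 + 4 bytes per pixel above, i.e. two 32-bit colour targets plus a 32-bit depth/stencil buffer):

#include <cstdio>

int main() {
    const double kBytesPerPixel = 8.0 + 4.0;   // two 32-bit colour RTs + 32-bit depth/stencil
    const double kMiB = 1024.0 * 1024.0;

    double px640 = 1152.0 * 640.0;             // assumed "640p" framebuffer
    double px720 = 1280.0 * 720.0;

    printf("640p: %.2f MB\n", px640 * kBytesPerPixel / kMiB);   // ~8.44 MB, fits in the 10 MB EDRAM
    printf("720p: %.2f MB\n", px720 * kBytesPerPixel / kMiB);   // ~10.55 MB, would need tiling
    return 0;
}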

Hm... well, according to the Gamefest 2006 slides, they use two render targets for their HDR, as follows:

LDR: 10-bit float 7e3, resolved to 16-bit linear
HDR: 8-bit gamma 2.0 / PWL gamma

How does that factor into the eDRAM :?:
 
No idea, I have to read those slides.
BTW, I would call the 10-bits-per-component (low accuracy) format MDR, not LDR :)
 
Actually, that's what they said in the following slides too. :LOL:

Here's a link to the (public-safe) presentation:

http://www.microsoft.com/downloads/...1d-6bbd-4731-ac82-d9524237d486&displaylang=en

They go through some definitions, gamma adjustments, and options they were exploring. They don't outright say what they chose besides the two render targets, but maybe the Gamefest August 2007 slides (still unreleased!) will shed more light on what they did for the final release.

edit: I know these ones say 2007, but the slides were first seen at Gamefest 2006.
 
Read the slides, 'HDR the Bungie way'... :)
I read those slides this morning and I can't really say I'm impressed, but I don't wanna go OT. Just one thing relevant to this thread: you don't need to keep two RTs in EDRAM at the same time to do what they're (maybe) doing, so the argument that their HDR technique needs extra memory doesn't sound convincing to me.
 
It's more of a theory than an argument; in other words, I'm trying to guess why they went with a smaller buffer. The game doesn't seem to be fillrate-intensive, and they could have done 720p/no AA without tiling too.
 
Perhaps it is a pixel shader limitation, as nAo mentioned previously. How is the backbuffer calculated with their FP10 usage and MRT :?:
 
Moved the Halo HDR discussion over here so you have some space to spread out :p Hopefully I haven't missed anything. If there's a post you wanted moved/copied, just PM me.
 
I read those slides this morning and I can't really say I'm impressed, but I don't wanna go OT. Just one thing relevant to this thread: you don't need to keep two RTs in EDRAM at the same time to do what they're (maybe) doing, so the argument that their HDR technique needs extra memory doesn't sound convincing to me.

I would guess that filling both RTs in one pass was a late decision, made to keep performance above 30 fps.

I read through the slides and I did find them interesting; I'd say they had a good level of success dealing with the issues they were trying to tackle, based on their actual in-game results. However, as I was going through them, I couldn't help but think that it just wasn't worth the sacrifices they made for the 360's hardware.
 
If it was a late decision, then they probably hadn't planned to do it in a single pass - in which case the size of the EDRAM would've allowed for at least 720p, or 640p with 2xAA. So it wouldn't have been such a big sacrifice originally.

The upside is that if Bungie's gonna use this engine again, they'll probably be able to optimize content and code to work at a higher resolution as well.
 
So why didn't they enable AA on the render targets? Aren't they each under 10 MB :?: The FP10 format is 10-10-10-2, taking the same space per pixel as 8-8-8-8. What am I missing :?:
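Or is the missing piece that every MSAA sample has to sit in the EDRAM until the resolve, so 2xAA roughly doubles the colour and depth footprint? Rough numbers of my own, assuming 1152x640 and two 32-bit RTs plus depth:

#include <cstdio>

int main() {
    const double kMiB   = 1024.0 * 1024.0;
    const double pixels = 1152.0 * 640.0;      // assumed "640p" framebuffer
    const double bpp    = 8.0 + 4.0;           // two 32-bit colour RTs + 32-bit depth/stencil

    printf("no AA: %.2f MB\n", pixels * bpp * 1.0 / kMiB);   // ~8.44 MB, fits
    printf("2xAA:  %.2f MB\n", pixels * bpp * 2.0 / kMiB);   // ~16.88 MB, over 10 MB, needs tiling
    return 0;
}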
 
What NAO32 doesn't provide is alpha blending, IIRC. Given Halo 3's uber amounts of non-geometry foliage, it wouldn't be a good fit. In the Bungie slides, they also mention specifically that alpha blending is something they really wanted, which ruled out some HDR paths they might otherwise have taken.
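A toy illustration of the blending problem (my own sketch with a made-up log curve, not the actual NAO32 / LogLuv layout): the blend unit operates on whatever encoded bits are in the render target, so blending the encoded values and then decoding gives a very different result from blending in linear space.

#include <cstdio>
#include <cmath>

// Hypothetical log encoding of luminance over [2^-8, 2^8] into [0, 1].
float encode(float lum) { return (std::log2(lum) + 8.0f) / 16.0f; }
float decode(float v)   { return std::exp2(v * 16.0f - 8.0f); }

int main() {
    float a = 4.0f, b = 0.25f, alpha = 0.5f;
    float linearBlend  = alpha * a + (1.0f - alpha) * b;                          // 2.125
    float encodedBlend = decode(alpha * encode(a) + (1.0f - alpha) * encode(b));  // 1.0
    printf("blend in linear space: %.3f, blend the encoded bits: %.3f\n",
           linearBlend, encodedBlend);
    return 0;
}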
 
Out of interest, how good is this method compared to, say, NAO32? :smile:

I think words like "good" should always be avoided when setting up technical comparisons - especially these days. Better to simply ask: how are they different? Or: what are the benefits and drawbacks of each?

But when questions are phrased as in the quote, it puts posters in a mentality to knock ideas down rather than to discuss them.
 
From what I understand, Bungie was adamant about the "usable range" of precision that people would be sensitive to. They looked at several HDR implementations, including gamma adjustments, and felt that rendering the scene at different exposures gave a good amount of range whilst keeping the ability to alpha-blend.

Check out the PowerPoint slides (and the notes) for more details.
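My rough reading of the idea, as a sketch (the exposure ratio, the gamma-2.0 encode and all the names here are placeholders of mine, not Bungie's actual numbers or code): the pixel shader writes the same linear colour at two exposures into two 8-bit gamma targets, and the tone-map pass later falls back to the darker target wherever the brighter one has clipped, so you get extra range while ordinary 8-bit alpha blending still works on both targets.

#include <algorithm>
#include <cmath>
#include <cstdio>

const float kRatio = 8.0f;  // hypothetical exposure ratio between the two targets

// What the MRT pixel shader would conceptually write, per channel.
float writeBright(float lin) { return std::sqrt(std::min(lin, 1.0f)); }          // gamma-2.0 encode
float writeDark(float lin)   { return std::sqrt(std::min(lin / kRatio, 1.0f)); }

// Tone-map time: reconstruct a wider-range intensity from the two targets.
float reconstruct(float bright, float dark) {
    float b = bright * bright;   // undo gamma 2.0
    float d = dark * dark * kRatio;
    return (b < 1.0f) ? b : d;   // once the bright target clips, trust the dark one
}

int main() {
    const float samples[] = {0.25f, 1.0f, 6.0f};
    for (float lin : samples)
        printf("in %.2f -> out %.2f\n", lin, reconstruct(writeBright(lin), writeDark(lin)));
    return 0;
}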
 
I think words like "good" should always be avoided when setting up technical comparisons - especially these days. Better to simply ask: how are they different? Or: what are the benefits and drawbacks of each?

But when questions are phrased as in the quote, it puts posters in a mentality to knock ideas down rather than to discuss them.

I apologise for my phrasing. I was using "good" as shorthand for "efficient, flexible, physically accurate, etc." - all of the things that determine "goodness" :p
 
IIRC NAO32 (aka the funky colorspace) doesn't work well on 360?

It works well on any pixel-shader-powerful GPU. Regarding foliage, how is HS in that respect? The screenshots I've seen tend to be pretty sparse on alpha blending, but nAo has talked about using alpha-to-coverage instead of blending, which saves fill-rate.
 
IIRC , " free AA " is a benefit of eDRAM , right ?.. Bungie dropped the resolution to 640p for fitting the frames to eDRAM ...So , 640p fits into EDRAM but still there's no AA , why not ?..

And what's the point of eDRAM if there's no AA ?.. Does 10MB eDRAM create a bottleneck in X360 ?.. Could it be better if there's no eDRAM , i mean , for AA with 720p+ resolutions ?..
 
The EDRAM also removes the need for an additional memory bus, as the GPU's framebuffer traffic alone would easily eat up all of the ~22 GB/s available to main RAM. If there were no EDRAM at all, the system would be a lot slower at 3D rendering.
Think of it as an alternative to the PS3's two discrete memory pools; it also makes the system's motherboard simpler to build and of course a bit cheaper, which all matters in the long run.
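As a back-of-the-envelope illustration (the fill rate here is an assumption of mine, not a measured figure): even a modest sustained fill rate with alpha blending and Z read/write would already exceed that ~22 GB/s on its own, before textures, vertices and the CPU get their share of the bus.

#include <cstdio>

int main() {
    double fillRate   = 2.0e9;      // assumed sustained fill rate, pixels per second
    double bytesPerPx = 4.0 + 4.0   // colour write + colour read for blending
                      + 4.0 + 4.0;  // depth/stencil read + write
    printf("framebuffer traffic: %.1f GB/s\n", fillRate * bytesPerPx / 1.0e9);  // ~32 GB/s
    return 0;
}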
 