HDR overused?

speedsix

Newcomer
Hi,

Does anyone think developers are getting a bit carried away with HDR lighting? PGR3 is the worst offender imo, particularly the lit road signs, way OTT. Oblivion is the same in places, completely washing out details, e.g. characters' faces when you are using a light-producing spell.

GRAW seems to use it considerably but it still remains natural looking.


Dom.
 
HDR != bloom/brightness. It's just a matter of experience. Developers are currently overdoing the brightness and glare aspects of HDR; they'll come to their senses soon enough.
 
I think it's also due to user perception. If HDR was done "right" the differences would be quite subtle, as it's something that we mostly take for granted. Sure, IRL there are some instances where bloom is apparent, but not to the extent seen in games. Lighting should be a bit soft, sure, but not so soft as to wash everything out.

Now put a game without huge, unrealistic bloom on the stage, and everyone and their aunt is gonna cry: where's the HDR in that? Oblivion does it so much better, etc., even if that game implements sensible use of a high dynamic range, producing better and more accurate whites and blacks.
 
I agree that it tends to be done in totally nonsensical ways and dosages. It annoys me so much that I am grateful whenever I have the option to turn it all off, and I always do.
(That includes not using "Bloom" in Oblivion either.)

I have said it lots of times already, but the only things you need HDR colors for at all are reflections and refractions. If there is something else where you see any difference between HDR rendering and non-HDR rendering, you can take for granted that it's not realistic.
Admittedly this is just my opinion. I expect it to be debatable whether or not you need HDR rendering for the simulation of iris contraction. I think you don't really need it, but I understand that it makes things easier if you just do it.
 
I think it was near perfect in PGR... it's just like driving out of a tunnel in real life.

GRAW does have an excellent implementation though, I agree (in single player).
 
zeckensack said:
...you can take for granted that it's not realistic... Admittedly this is just my opinion. I expect it to be debatable whether or not you need HDR rendering for the simulation of iris contraction. I think you don't really need it, but I understand that it makes things easier if you just do it.
I think this is actually where it's needed most. Needing multiple textures for different intensities is a PITA. The point with HDR is keeping absolute colour detail no matter the light situation. Overbright reflections and refractions are a nice consequence, but these are fairly comfortably fakeable with additive reflection maps.

What HDR actually does is add an element of real-world skill requirements. Real-world light intensities need real-world camera and lighting knowledge and skills. The use of HDR by game developers is likely going to produce results akin to amateur photography in quality for a while. Throwing bloom around is a crass way to use HDR. The Getaway demo is a good example of realistic implementation, and that should become more prominent as HDR becomes widespread and skillsets develop, eventually producing games that use HDR as photographers and cinematographers do, artistically working with exposure to set mood (eg. Firefly/Serenity using overexposure to portray intense sunlight).
 
Ah, getting my jargon confused. I'm mainly referring to the bloom around things. Looks extremely unrealistic and very distracting.
 
Shifty Geezer said:
The Getaway demo is a good example of realistic implementation, and that should become more prominent as HDR becomes widespread and skillsets develop, eventually producing games that use HDR as photographers and cinematographers do, artistically working with exposure to set mood (eg. Firefly/Serenity using overexposure to portray intense sunlight).

Just in case anybody was wondering what Shifty was talking about, here's the Getaway video.

Click Here

Edited: Why do most next-gen games today not have that natural-looking lighting that this Getaway video is displaying? Is it technical, or is it artistic skill that's not being shown?
 
I think with Oblivion it's simply a limitation of still having to support LDR. The change in exposure from moving from shadow to looking directly at the sun is surprisingly insignificant. Everything is still LDR, just the sky is a tad brighter.
On top of that, the actual range of exposure seems artificially limited in Oblivion. You never seem to get all that dark, and quite easily you can get a solid-white screen at midday.

In an ideal game you won't notice HDR at all. The eye adjusts very fast to changing light conditions. (Although there are exceptions like extreme darkness). There seems to be a thing going on with slowing down the exposure change, for 'dramatic effect'.

The other problem is that current HDR implementations deal with a full-screen tonemapping value. However, the brain and eye are smart enough to be somewhat dynamic. If you look out a window and you have dark curtains, the edge between the very bright outside and the dark curtain will have very little detail; chances are the curtain will be black along the edge, but the rest of the curtain will still show detail and have colour. We are so used to this that we simply don't notice it, unless you get 'natural' HDR lighting in a game where this doesn't happen. The effect suddenly becomes extremely exaggerated, and can look rather ugly, with a bloomy outside and a pitch-black curtain :).
There are papers out there on solving this problem, but unfortunately the solution is spectacularly complex (however the results are very, very good - eerily natural looking).
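To make the "full-screen tonemapping value" point concrete, here is a minimal sketch of a single global exposure plus Reinhard's global operator. The key value of 0.18 and the log-average are standard conventions; the struct, names and buffer layout are purely illustrative, not from any particular engine.

```cpp
#include <cmath>
#include <vector>

struct Pixel { float r, g, b; };

float luminance(const Pixel& p) { return 0.2126f * p.r + 0.7152f * p.g + 0.0722f * p.b; }

// One exposure derived from the whole frame, applied uniformly, then Reinhard's
// global compression Ld = L / (1 + L). Because every pixel shares the same value,
// a dark curtain next to a bright window gets no local adaptation at all.
void tonemapGlobal(std::vector<Pixel>& hdr, float key = 0.18f)
{
    if (hdr.empty()) return;

    // Log-average luminance over the entire frame (the single, global value).
    double logSum = 0.0;
    for (const Pixel& p : hdr)
        logSum += std::log(1e-4 + luminance(p));
    const float avg = std::exp(float(logSum / double(hdr.size())));

    for (Pixel& p : hdr) {
        const float L  = luminance(p);
        const float Ls = key * L / avg;        // uniform exposure for every pixel
        const float Ld = Ls / (1.0f + Ls);     // compress to displayable range
        const float s  = (L > 0.0f) ? (Ld / L) : 0.0f;
        p.r *= s; p.g *= s; p.b *= s;
    }
}
```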

The other thing is, many people associate HDR with bloom. Sure, bloom can actually look good, but only when used with taste, style and subtlety. Few games do this (well, GRAW may be an exception). One thing that seems to be ignored, however, is darkness. In darkness there are far fewer photons shooting about to make up a clear image, yet games don't exploit this. HDR is perfect for creating 'dark' looking games. All you need to do is add varying levels of blur, colour correction and noise, and you can *completely* change the feel and look of a game. example.
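A rough sketch of that 'dark look', assuming the Pixel struct and luminance() helper from the sketch above plus a pre-blurred copy of the frame. The 0.05 luminance threshold and the effect strengths are invented for illustration only.

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// Blend towards blur, desaturate and add noise as the scene gets darker.
void applyLowLightLook(std::vector<Pixel>& img, const std::vector<Pixel>& blurred,
                       float sceneLuminance)
{
    // 0 = bright scene (no effect), 1 = very dark scene (full effect).
    const float darkness = std::clamp(1.0f - sceneLuminance / 0.05f, 0.0f, 1.0f);

    for (size_t i = 0; i < img.size(); ++i) {
        Pixel p = img[i];

        // Fewer photons means less sharp detail: blend towards the blurred copy.
        p.r += (blurred[i].r - p.r) * darkness;
        p.g += (blurred[i].g - p.g) * darkness;
        p.b += (blurred[i].b - p.b) * darkness;

        // Colour fades in low light (rods take over), so desaturate.
        const float grey = luminance(p);
        p.r += (grey - p.r) * darkness;
        p.g += (grey - p.g) * darkness;
        p.b += (grey - p.b) * darkness;

        // And add a little noise.
        const float n = darkness * 0.05f * (float(std::rand()) / RAND_MAX - 0.5f);
        p.r += n; p.g += n; p.b += n;

        img[i] = p;
    }
}
```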
 
rusty said:
GRAW does have an excellent implementation though, I agree (in single player).

Do they cut back the details a lot in online mode?

Is it only GRAW or all X360 games?

Do they have the disclaimers about online mode having graphics cut down?
 
wco81 said:
Do they cut back the details a lot in online mode?

Is it only GRAW or all X360 games?

Do they have the disclaimers about online mode having graphics cut down?
Online mode looks great in GRAW, but not all of the effects present in SP are there. Also, GRAW's online mode is a separate part of the game and was developed by a different team than the one that produced the SP experience.

Most games are not reduced (it depends on the game, how many players are online, how much is happening on screen, how good the network code is, etc.).

The disclaimer about "experience changing online" does not relate to the game (or graphics) at all but is regarding the age rating on the game (ESRB) and the potential for vulgarities and such over the headset by other users.
 
wco81 said:
Do they cut back the details a lot in online mode?

Is it only GRAW or all X360 games?

Do they have the disclaimers about online mode having graphics cut down?

Online has AA issues and the HDR isn't as prominent. Online and single player use two separate graphics engines.
 
Brothers in Arms 3 shows the best realtime lighting I've seen, hopefully it's just a matter of time until all games look like this:
bia4a.jpg

bia5a.jpg

bia8a.jpg
 
zeckensack said:
I expect it to be debatable whether or not you need HDR rendering for the simulation of iris contraction. I think you don't really need it, but I understand that it makes things easier if you just do it.
Iris contraction is not the problem, as that's handled by a simple scale factor. Use the previous frame's average luminance to determine how much you want to scale the data being written in the current frame, and keep adjusting this way. You can jump an order of magnitude each frame this way, so it doesn't really limit realism since the eye is much slower.

(BTW, I'm not necessarily explaining this to you, as you probably feel the same way as me.)
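A minimal sketch of the scale-factor adaptation described above, assuming the previous frame's average luminance has already been measured somehow. The middle-grey target of 0.18 is a common convention; the names and the adaptation rate are illustrative.

```cpp
#include <algorithm>

// Returns the exposure (scale factor) to apply to everything written this frame.
float updateExposure(float currentExposure, float prevFrameAvgLuminance,
                     float middleGrey = 0.18f, float adaptRate = 0.5f)
{
    // Exposure that would map the previous frame's average to middle grey.
    const float target = middleGrey / std::max(prevFrameAvgLuminance, 1e-4f);

    // Step part of the way there each frame. Even at a modest rate this covers
    // an order of magnitude within a few frames, far faster than the eye adapts.
    return currentExposure + (target - currentExposure) * adaptRate;
}
```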

The reason HDR rendering is needed is that the eye can see a simultaneous dynamic range of 10,000:1. Print film, what's used in movie theatres, has a similar contrast ratio (DemoCoder informed me of this). 35mm film can capture data over a range of 3-4 orders of magnitude in intensity. IMO this is the range of data needed for realistic rendering. The FP10 format is just about enough for this (32 / (1/256) = 8192), so it should be adequate. I think some effects can make use of higher dynamic range, but for photorealism it's enough in terms of range in a linear color space. Figuring out what to write in terms of lighting is a much bigger problem in photorealism than storing it accurately.
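For reference, that range arithmetic works out to just under four orders of magnitude, which is why FP10 lands inside the 3-4 orders attributed to 35mm film:

\[
\frac{L_{\max}}{L_{\min}} = \frac{32}{1/256} = 8192 \approx 10^{3.9}
\]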

Basically, what I'm saying is that absolute luminance of what you render is mostly meaningless, since only relative luminance should affect your final image.
 
The problem with how bloom is typically implemented is that developers just grab the current screen and use that as the basis of the bloom.
A better way is to only apply bloom to selected objects.
The reason being it stands out more and thus has more visual impact
(and you also don't get that horrible effect of things like people's faces glowing :D )

Q/ So why don't people use selective bloom?
A/ It's slightly more difficult to program and also comes at more of a performance hit.
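A minimal sketch of the selection step described above, assuming a per-object "emits bloom" flag carried through to each shaded sample. The struct and names are made up; the usual blur-and-add pass then runs on the result instead of on the whole screen.

```cpp
#include <vector>

struct Pixel  { float r, g, b; };
struct Sample { Pixel colour; bool emitsBloom; };  // one shaded sample per pixel

// Build the bloom source from flagged surfaces only; untagged things
// (faces, walls) can never glow no matter how bright they are.
std::vector<Pixel> buildBloomSource(const std::vector<Sample>& frame)
{
    std::vector<Pixel> src(frame.size(), Pixel{0.0f, 0.0f, 0.0f});
    for (size_t i = 0; i < frame.size(); ++i)
        if (frame[i].emitsBloom)
            src[i] = frame[i].colour;
    return src;  // blur this and add it back additively, as with normal bloom
}
```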
 
Mintmaster said:
Iris contraction is not the problem, as that's handled by a simple scale factor. Use the previous frame's average luminance to determine how much you want to scale the data being written in the current frame, and keep adjusting this way. You can jump an order of magnitude each frame this way, so it doesn't really limit realism since the eye is much slower.

(BTW, I'm not necessarily explaining this to you, as you probably feel the same way as me.)
Yup.
The common theme seems to be that the average luminance is generated from a downscaled framebuffer readback (or rather the more efficient equivalent using mipmap filters where available), and if you don't allow the values in your framebuffer to exceed 1.0 (i.e. if you don't use an "HDR" data format, whether INT16 per component or floating point), many of the values in the framebuffer will be clamped, and hence your average luminance reading will be skewed towards the darker range.
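A toy illustration of that skew, with made-up luminance values: averaging the clamped (non-HDR) buffer under-reports the scene's brightness compared with the real values.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

float average(const std::vector<float>& v)
{
    float sum = 0.0f;
    for (float x : v) sum += x;
    return v.empty() ? 0.0f : sum / v.size();
}

int main()
{
    std::vector<float> hdr = {0.1f, 0.3f, 0.5f, 6.0f, 20.0f};  // a few sun/sky pixels
    std::vector<float> ldr = hdr;
    for (float& l : ldr) l = std::min(l, 1.0f);                // clamp at 1.0

    // Prints roughly 5.38 vs 0.58: the clamped reading is skewed far darker.
    std::printf("HDR average %.2f, clamped average %.2f\n", average(hdr), average(ldr));
}
```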

But I'm actually a proponent of figuring out average scene luminance by other means. There are usually only very few significant light sources in a scene, and I consider it a worthwhile optimization to take the analytical approach there. If the sun's in view and you have an occlusion query pending that will tell you with healthy accuracy how much of it ends up in view, you'll have a very good first approximation of light intensity.

If there's no sun, just pick, say, the top 3 of the artificial light sources and run from there.
If there are very large highly reflecting surfaces, you have to do some boiler-plate work to take these into account, but really, the math for doing so is still simple.

The problem with this approach is a scene with a low sun over an ocean: the sun's reflection is "smeared out" over a very large area, which makes it pretty difficult to compute the average scene luminance accurately enough.
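For what it's worth, a hedged sketch of that analytical estimate. The visibility fraction is assumed to come back from an occlusion query issued against the sun's geometry; every name and constant here is illustrative, not from any real engine.

```cpp
// First approximation of scene luminance from the dominant light source.
float estimateSceneLuminance(unsigned sunSamplesPassed, unsigned sunSamplesTotal,
                             float sunIntensity, float ambientIntensity)
{
    const float sunVisible = (sunSamplesTotal > 0)
        ? float(sunSamplesPassed) / float(sunSamplesTotal)
        : 0.0f;

    // Ambient term plus the visible fraction of the sun's contribution.
    // With no sun in view, sum the brightest few artificial lights instead.
    return ambientIntensity + sunVisible * sunIntensity;
}
```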
Mintmaster said:
The reason HDR rendering is needed is that the eye can see a simultaneous dynamic range of 10,000:1. Print film, what's used in movie theatres, has a similar contrast ratio (DemoCoder informed me of this). 35mm film can capture data over a range of 3-4 orders of magnitude in intensity. IMO this is the range of data needed for realistic rendering.
I'm not so sure about that reasoning.
I know my eyes don't appreciate scene contrast ratios of 10000:1. I know I don't have a display that could ever hope to resolve that accurately, and I even think it's fine as it is. A pure grey gradient from black to white looks pretty smooth already to my eyes at just 8 bits in sRGB. In real life only masochists or well-protected people ever look at the sun for more than a fraction of a second. In games you frequently do. And it's great to clamp the sun's intensity to some "large but not crazy" value IMO.

Sane people will adjust their displays' white levels to levels they are comfortable with. A game IMO should not assume that realism is more important than that level of comfort. E.g. my iiyama CRT has an "OPQ" mode, supposedly for watching movies from greater viewing distances, where my eyes actually hurt for the split second I tried it out (being a curious cat). I will never go there again.

We just have to realize that we must stop way before achieving realism, simply for health and safety reasons. It's pretty much a given that I don't want to risk my eyesight in exchange for having a realistic game.

Mintmaster said:
The FP10 format is just about enough for this (32 / (1/256) = 8192), so it should be adequate. I think some effects can make use of higher dynamic range, but for photorealism it's enough in terms of range in a linear color space. Figuring out what to write in terms of lighting is a much bigger problem in photorealism than storing it accurately.
Agreed.
Mintmaster said:
Basically, what I'm saying is that absolute luminance of what you render is mostly meaningless, since only relative luminance should affect your final image.
Not sure.
There are limits to how relaxed or contracted the iris will get, and the one effect where this shows, which is also pretty low-hanging fruit for game engine class renderers, is near darkness. Loss of color below certain thresholds and noisiness are phenomena I certainly experience myself in low-light conditions, and I assume that's normal for humans. Right? :D
 
zed said:
The problem with how bloom is typically implemented is that developers just grab the current screen and use that as the basis of the bloom.
A better way is to only apply bloom to selected objects.
The reason being it stands out more and thus has more visual impact
(and you also don't get that horrible effect of things like people's faces glowing :D )

Q/ So why don't people use selective bloom?
A/ It's slightly more difficult to program and also comes at more of a performance hit.
Many developers are simply trying to show off. They blunt the effect so that every idiot can see it's there, even in still screens.
"HDR" is a checkbox item you can put on the box, you can tell your boss you have it, you can tell the gaming press you have it, you can even add to your resumé that you did it once already.

There are a lot of technocrats in the PC gaming public, and they currently demand "HDR" in their games because, currently, "it's modern". These are the same people who bought cards with the unspeakable family of NVIDIA chips for the "forward-looking feature-set".
*insert the usual 800-word off-the-wall rant about technocrats here*

Plus it's also easier to not be selective. The easy approach to "bloom" is to copy the whole backbuffer to a texture, filter it down, scale it back up and add it to the backbuffer with a blend. It will become fuzzy because of the downscale/upscale process. And it's inherently not selective ;)
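To make those steps concrete, a small sketch of exactly that non-selective path, operating on a single-channel luminance buffer for brevity. The factor-of-4 downscale, the point-sampled upscale and the blend strength are arbitrary stand-ins for whatever a real engine uses (a real pass would upsample bilinearly, which is what makes the result fuzzy).

```cpp
#include <algorithm>
#include <vector>

// Downscale the frame by 4 in each direction, then add it back on top of
// every pixel, as in the "easy" bloom described above.
void cheapBloom(std::vector<float>& lum, int w, int h, float strength = 0.3f)
{
    const int dw = w / 4, dh = h / 4;
    if (dw == 0 || dh == 0) return;
    std::vector<float> down(size_t(dw) * dh, 0.0f);

    // "Filter it down": average each 4x4 block.
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            float sum = 0.0f;
            for (int j = 0; j < 4; ++j)
                for (int i = 0; i < 4; ++i)
                    sum += lum[size_t(y * 4 + j) * w + (x * 4 + i)];
            down[size_t(y) * dw + x] = sum / 16.0f;
        }

    // "Scale it back up and add it": every pixel gets a contribution,
    // which is exactly why this approach can never be selective.
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            const int sy = std::min(y / 4, dh - 1);
            const int sx = std::min(x / 4, dw - 1);
            lum[size_t(y) * w + x] += strength * down[size_t(sy) * dw + sx];
        }
}
```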
 
Maybe devs will start adding an "HDR intensity" level in the options, from none to heavy, or 0 to 9, a to b...
That would certainly be to the liking of most; after that it would just be a matter of artistic implementation.

And that Getaway demo from E3 is the best I've seen so far. Some game types would probably look better with more aggressive HDR, like some adventure/fantasy games (like Kameo), but not by much.
 