What's the deal with HDR?

skilzygw

Newcomer
Can both consoles use HDR in their games?

I remember reading something about it a while back, but I've since forgotten, and since you can't search for three-letter words in this forum I can't find anything relevant when I search.

Thanks.
 
skilzygw said:
Can both consoles use HDR in their games?

I remember reading something about it a while back, but I've since forgotten, and since you can't search for three-letter words in this forum I can't find anything relevant when I search.

Thanks.

Both consoles meaning what? PS2/Xbox, or are you talking next-gen?
 
Sorry about that. Next Gen.
Thanks.

Do you think this will be the single greatest leap for next-gen consoles? Will it have the most impact on the way we perceive the improved graphics?

Thanks.
 
skilzygw said:
Sorry about that. Next Gen.
Thanks.

Do you think this will be the single greatest leap for next-gen consoles? Will it have the most impact on the way we perceive the improved graphics?

Thanks.

Now now, let's not get over-excited. HDR will help, definitely. Whether it's the biggest leap in realtime graphics remains to be seen.
 
I think that second question was a separate question, seeing as the Xbox and PS2 can already do HDR as far as I know, or at least the Xbox can in Far Cry.
 
Atsim said:
I think that second question was a separate question, seeing as the Xbox and PS2 can already do HDR as far as I know, or at least the Xbox can in Far Cry.

SOTC gives the player an HDR-like feel. Just look at some of the videos. Yes, I think HDR will be one of those things that separates today's games from tomorrow's games. To be honest, I'm noticing HDR effects in movies and TV shows now thanks to next-gen media. :p
 
Atsim said:
I think that second question was a separate question, seeing as the Xbox and PS2 can already do HDR as far as I know, or at least the Xbox can in Far Cry.

The lighting in Far Cry really is amazing for an Xbox game; I don't know if it's true HDR or fake HDR, though...
 
I think HDR will have its "places" to give a dramatic/realistic effect, but if it gets overused because it is now the new, "hot" feature, we can count on seeing plenty of gratuitous scenes that look like the console world has been attacked by a trend of over-exposed/unnaturally-exposed film shots.
 
It can't be overused any more than 32-bit color vs 16-bit, or fragment shaders, or HW T&L, or dot-product could. It's just a higher-precision format for rendering.

It's how it is used which is the problem. One word: bloom. All new features get used for piecemeal gimmicks at first. Dot-product -> shiny bump maps. HW T&L -> specular per-vertex everywhere. And each new generation of games had the "gaudy" look at first. Remember how everything used to be shiny? Then bumpy. And with fragment shading, the first effect you started to see was water. Water, water, everywhere. Because that's the easiest feature to add that doesn't affect your existing art pipeline.


HDR != bloom and over/underexposure. Properly architected games with HDR from the ground up will help remove issues with shadow detail in heavy multipass situations where precision errors creep in at 8 bits per channel. HDR will also permit the use of higher-quality HDR light probes in scenes.

And with good tone mapping, it isn't so much getting an overexposure/underexposure as getting the *right exposure*, one that preserves local contrast and eliminates as much over/underexposure as possible.

See the example HDR renders in Interactive Time-Dependent Tone Mapping Using Programmable Graphics Hardware
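
To make the tone mapping point concrete, here's a rough sketch of a Reinhard-style global operator. This is my own illustration, not the operator from that paper, and the key and white-point constants are arbitrary assumptions:

/* Illustration only: a Reinhard-style global tone mapping operator.
   Maps an HDR luminance value into [0,1] for an 8-bit display.
   The key and white-point constants are arbitrary assumptions. */
float tone_map(float hdr_lum, float avg_scene_lum)
{
    const float key   = 0.18f;  /* assumed middle-grey exposure key */
    const float white = 4.0f;   /* assumed luminance that maps to pure white */

    /* "Auto-exposure": scale by the scene's average luminance. */
    float l = key * hdr_lum / avg_scene_lum;

    /* Compress: highlights roll off smoothly instead of clipping. */
    return l * (1.0f + l / (white * white)) / (1.0f + l);
}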
 
Sounds reasonable to me, but in the end, nearly no average consumer is going to appreciate the difference unless they are handed the script that "this is HDR and it looks good to you", and/or they are beaten over the head with a pronounced "bloom" effect. Without the obvious (possibly obligatory) bloom effect, most people will be scratching their heads wondering if they are "seeing" HDR or not, or why the "HDR" is missing...
 
Well, they won't know the technology behind it, but they will appreciate better graphics. HDR, when used in a game engine written from the ground up, isn't a subtle effect, just like anisotropic filtering isn't. Consumers won't know what AF is, or perspective-correct texture mapping, but they sure as hell can recognize the difference between a PS1 and a PS2.

Or, when consumers start hooking up their XB360s to their big-screen TVs, they are going to notice an enormous improvement over the Xbox 1 due to always-on 4xMSAA, AF and higher resolution.
 
scooby_dooby said:
The lighting in Far Cry really is amazing for an Xbox game; I don't know if it's true HDR or fake HDR, though...


It really does have a different feel than the PC version, as the Xbox version was designed with bloom in mind...

Now, what you mentioned about fake HDR raised a flag for me, because in an IGN video interview Jay Stelly (Valve, HL2) mentioned that they were able to get "a version of [HDR] working." Could this mean anything, or...
 
DemoCoder said:
Or, when consumers start hooking up their XB360s to their big-screen TVs, they are going to notice an enormous improvement over the Xbox 1 due to always-on 4xMSAA, AF and higher resolution.

Most assuredly they will for those other features you mention. How HDR will stand out from those to actually be identified and noticed is not exactly clear (unless it is an intentional bloom).
 
DemoCoder said:
due to always-on 4xMSAA, AF and higher resolution.

NO...just... NO

" ALWAYS" on???

The 360 would be lucky to have 2xMSAA on all the time, but I can see devs trading AA for effects.
 
Did you just ignore the entire thread on the cost of AA?

Once game engines are built with predicated tiling in mind, 4xAA should be as small as a 1-3% hit. Engines that weren't built with tiling in mind can suffer from a 1-10% hit.

In comparison, 4xAA on the G70 can easily be >30% performance hit.

The reason devs are struggling with eDRAM right now is that they didn't build in predicated tiling.
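
To put rough numbers on why tiling enters the picture at all, here's a back-of-envelope estimate. These are my own assumptions (4 bytes of colour and 4 bytes of depth/stencil per sample), not figures from the thread:

/* Rough framebuffer-size / tile-count estimate against a 10 MB eDRAM budget.
   Assumes 4 bytes colour + 4 bytes depth/stencil per MSAA sample. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double edram_mb = 10.0;
    const int width = 1280, height = 720;
    const int samples = 4;                 /* 4xMSAA */
    const int bytes_per_sample = 4 + 4;    /* colour + Z/stencil */

    double fb_mb = (double)width * height * samples * bytes_per_sample
                   / (1024.0 * 1024.0);
    int tiles = (int)ceil(fb_mb / edram_mb);

    printf("framebuffer: %.1f MB -> %d tiles\n", fb_mb, tiles); /* ~28.1 MB -> 3 tiles */
    return 0;
}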
 
scooby_dooby said:
Did you just ignore the entire thread on the cost of AA?

Once game engines are built with predicated tiling in mind, 4xAA should be as small as a 1-3% hit. Engines that weren't built with tiling in mind can suffer from a 1-10% hit.

In comparison, 4xAA on the G70 can easily be >30% performance hit.

The reason devs are struggling with eDRAM right now is that they didn't build in predicated tiling.

The reason devs aren't using it is because it's not entirely "FREE", as pointed out in the PDZ video. As I said, I expect devs to simply use blur effects in place of AA and spend the saved power on more special effects.
 
X360 - FP10

PS3 - FP16

...

?

Is this how things are generally going to be? I've been wondering for a while about potential differences between the systems in this regard.

(Yes, I know X360's ROPs support FP16, but I feel there is a reason for the inclusion of FP10..)
 
Titanio said:
(Yes, I know X360's ROPs support FP16, but I feel there is a reason for the inclusion of FP10..)
FP10 is a lower-memory-consuming HDR mode, to fit more into the eDRAM. Using higher-quality modes like FP16 will increase the number of tiles needed.
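
For a rough sense of the gap at 1280x720 (colour buffer only, single-sampled, and these are my own numbers rather than anything official): FP10 is 32 bits per pixel, so 1280 x 720 x 4 bytes is about 3.5 MB, while FP16 is 64 bits per pixel and comes to about 7 MB. Layer a Z buffer and MSAA samples on top of that and it's easy to see why the FP16 path chews through 10 MB of eDRAM faster and forces more tiles.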
 
DemoCoder said:
And with good tone mapping, it isn't so much getting an overexposure/underexposure as getting the *right exposure*, one that preserves local contrast and eliminates as much over/underexposure as possible.

See the example HDR renders in Interactive Time-Dependent Tone Mapping Using Programmable Graphics Hardware
Nice link that shows what HDR should be doing. However, I'm concerned about how practical this is given the very last line of the document...
Plate 1: A series of 512x512 HDR images that have been tone mapped on the GPU using equation 5. Underneath each image is the compression ratio achieved by our algorithm using two adaptation zones. All images were generated at nearly 30 Hz.

So nearly 30 Hz for a 512x512 buffer, and that's without rendering the scene first! Using this algorithm in realtime on a 1280x720 buffer isn't going to happen.
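
A quick scaling estimate (my arithmetic, not the paper's): 1280x720 is about 921,600 pixels versus 262,144 for 512x512, roughly 3.5x as many, so linearly scaling that ~30 Hz figure lands you somewhere around 8-9 Hz before the scene itself has even been rendered.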

Definitely a great optical future awaits us when cameras etc. record HDR images and we can compress, stretch and select from intensities, but this level of usefulness of HDR in realtime isn't going to appear for many a year, I imagine. And perhaps HDR will add nothing more than excesses of bloom and overexposure (and oversaturation too) :(

 
Shifty Geezer said:
FP10 is a lower-memory-consuming HDR mode, to fit more into the eDRAM. Using higher-quality modes like FP16 will increase the number of tiles needed.

Even using FP10, though, with any AA you'll be tiling anyway, so it's not the breaking point in that regard. Sure, you'll likely be tiling more with FP16, but I wonder if using it will be as easy regardless of that.
 