OpenEXR (HDR): Industrial Light & Magic's next big thing?

Hello All,

I read a review a few days ago on HardOCP of Far Cry and bumped into some sweet pics of Far Cry using OpenEXR, a high-dynamic-range image file format. The stills looked amazing, and I can only guess at how cool it looks in full motion, but it cuts FPS roughly in half in Far Cry on NVIDIA's 6800 GT, from 80ish to 40ish.

I have a few questions:

1] Does it really look as cool and rich in motion as it does in still screenshots?

2] As OpenEXR is free, will it become a new standard for delivering HDR in games?

3] Does it usually cut FPS in half?

4] Do two 6800 GTs in SLI run Far Cry at acceptable levels (1024x768, 4xAA, 8xAF)?

5] Dumb question, but does the game have to be programmed for OpenEXR from the ground up, or is it a quick add-on?

6] Any other games supporting this?

7] Anyone else excited about this?
 
OpenEXR is just a lossless file format for image files, like PNG or GIF or whatever, except it uses 64-bit floating point per component rather than 8-bit fixed point. It has nothing to do with realtime graphics rendering in games...
 
Guden Oden said:
OpenEXR is just a lossless file format for image files, like PNG or GIF or whatever, except it uses 64-bit floating point per component rather than 8-bit fixed point. It has nothing to do with realtime graphics rendering in games...

I thought it used FP16 per channel????
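The basic Rgba interface in the OpenEXR library certainly stores 16-bit "half" floats per channel. A minimal sketch of writing an .exr that way (from memory and untested, so treat the details as approximate):

```cpp
// Write a width x height image of FP16 ("half") RGBA pixels to an
// .exr file using the OpenEXR C++ library's simple Rgba interface.
#include <ImfRgbaFile.h>   // Imf::RgbaOutputFile, Imf::Rgba, half
#include <vector>

int main()
{
    const int width = 640, height = 480;

    // Imf::Rgba holds four 16-bit half floats (r, g, b, a).
    std::vector<Imf::Rgba> pixels(width * height);

    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
        {
            Imf::Rgba &p = pixels[y * width + x];
            p.r = half(2.5f);   // values above 1.0 are the whole point: HDR
            p.g = half(0.5f);
            p.b = half(0.1f);
            p.a = half(1.0f);
        }

    Imf::RgbaOutputFile file("out.exr", width, height, Imf::WRITE_RGBA);
    file.setFrameBuffer(&pixels[0], 1 /*xStride*/, width /*yStride*/);
    file.writePixels(height);
    return 0;
}
```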
 
Good, good. I was pretty sure they weren't using doubles, especially since OpenEXR is already here on the NV4x and in use in "a game". (I was right the first time: how the image is displayed isn't defined in the standard.)

Unfortunately, the R4xx didn't support float blends like everyone predicted, so the only way to get full minimum-FP16 quality throughout the pipeline on R4xx cards is software emulation (it could be assisted with shaders, but it would still crawl in any scene with heavy use of transparency).
 
4] Do two 6800 GTs in SLI run Far Cry at acceptable levels (1024x768, 4xAA, 8xAF)?

I'm sure that two 6800 GTs in an SLI config will be able to run the game extremely well, perhaps even with 8xAA and high levels of AF. However, keep in mind that FP16 blending on the NV40 only works without AA, so the OpenEXR HDR in the game will only work when playing without AA. Still, SLI users will most likely be able to run HDR at 1600x1200 using high levels of AF, and the game should look quite good at these settings.
 
bloodbob said:
Unfortunately, the R4xx didn't support float blends like everyone predicted, so the only way to get full minimum-FP16 quality throughout the pipeline on R4xx cards is software emulation (it could be assisted with shaders, but it would still crawl in any scene with heavy use of transparency).
You don't even need transparency for this to break down, just multipass rendering.
 
Dr. Ffreeze said:
1] Does it really look as cool and rich in motion as it does in still screenshots?
It's kind of gimmicky. It doesn't look bad, but you can tell that the game wasn't designed for HDR rendering.

3] Does it usually cut FPS in half?
No, due to:

4] Do two 6800 GTs in SLI run Far Cry at acceptable levels (1024x768, 4xAA, 8xAF)?
FSAA is not supported with floating-point render targets on the NV4x, unfortunately. The upside is that, without AA eating fill rate, you can run a floating-point render target at rather high resolutions (on my GeForce 6800 I can run just fine at 1024x768 with 16x AF... I'm sure you can do a fair bit better with a GT or Ultra). The lack of FSAA is distracting, but it's fun to play with HDR rendering for the time being.

6] Any other games supporting this?
I'm expecting most new games (that try to push technology) coming out over the next two years to support some form of HDR rendering. I bet it will be ubiquitous within three years.

7] Anyone else excited about this?
Not OpenEXR, but rather the floating-point blending support enabled with the NV4x.
 
jimmyjames123 said:
4] Do two 6800 GTs in SLI run Far Cry at acceptable levels (1024x768, 4xAA, 8xAF)?

SLI users will most likely be able to run HDR at 1600x1200 using high levels of AF, and the game should look quite good at these settings.

:rolleyes:

[image: training_hdr.gif]


More HDR benches from Xbit: http://www.xbitlabs.com/articles/video/display/farcry13_7.html
 
I think that's pretty stupid. The reviewer claims HDR has massive memory bandwidth requirements, yet doesn't notice that the 6600 sometimes outperformed the 6800 Ultra with HDR enabled.

To tell you the truth, I never did benchmark HDR on my system. I've just been playing with it and haven't had any problems. Anyway, it'd be interesting to see whether the benchmark results change when anisotropic filtering is disabled (there should only be a change if Far Cry also uses some floating-point textures, or if anisotropic filtering affects the final rendering pass, which it shouldn't).

Edit:
I guess what these benchmark results *do* say is that if you want to play with HDR, you're much better off waiting for the refresh of the NV4x line before buying. Looks like HDR rendering may be rather flawed in the NV40.
 
Chalnoth said:
Edit:
I guess what these benchmark results *do* say is that if you want to play with HDR, you're much better off waiting for the refresh of the NV4x line before buying. Looks like HDR rendering may be rather flawed in the NV40.
My thoughts exactly 8)
 
Actually, you don't need hardware floating-point blending to do HDR; you can emulate it with fragment programs (as is done in Half-Life 2 and in several HDR demos). HDR works on ATI just as well as on the NV40; the only difference is that the NV40 makes the developer's life much easier (and performs the blending much faster).
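The usual trick is to ping-pong between two float render targets: since you can't blend into a float buffer, you bind the previously accumulated buffer as a texture and do the blend arithmetic in the fragment program yourself. A rough structural sketch; the bind/draw calls are made-up stand-ins for whatever your engine or API provides, not a real renderer:

```cpp
#include <utility>  // std::swap

// Hypothetical stand-ins for engine/API calls -- no-op stubs here,
// only to make the ping-pong structure concrete.
struct FloatTarget {};                  // an FP16 color buffer
void bindRenderTarget(FloatTarget&) {}
void bindTexture(int, FloatTarget&) {}
void drawBlendedGeometry(int) {}        // the fragment program does the blend

// Emulated blending: ping-pong between two float targets and compute
//     result = src.rgb * src.a + prev.rgb * (1 - src.a)
// in the fragment program, where 'prev' is the other target sampled
// as a texture.
void renderTransparentPasses(FloatTarget rt[2], int numPasses)
{
    int src = 0, dst = 1;
    for (int pass = 0; pass < numPasses; ++pass)
    {
        bindRenderTarget(rt[dst]);  // write into the other buffer
        bindTexture(0, rt[src]);    // read what's been accumulated so far
        drawBlendedGeometry(pass);
        std::swap(src, dst);        // ping-pong for the next pass
    }
}
```

Every blended layer needs its own pass here, which is where the performance pain comes from.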
 
Zengar said:
Actually, you don't need hardware floating-point blending to do HDR; you can emulate it with fragment programs (as is done in Half-Life 2 and in several HDR demos). HDR works on ATI just as well as on the NV40; the only difference is that the NV40 makes the developer's life much easier (and performs the blending much faster).
No, it's not that simple. There's a strong feasibility limit here. Specifically, once you go to multipass rendering, there's just no realistic way to do HDR rendering this way. And multipass rendering will become ubiquitous in any program that makes extensive use of realtime shadows.

It is conceivable that Valve's lighting equation is simple enough that they were able to do all lighting in one pass on the R300, but I think it's more likely that their HDR effects are implemented in a separate pass from normal rendering, which would essentially limit the HDR to bloom-type effects.

I'm pretty certain that if you compare the HDR seen in FarCry to that seen in Half-Life 2, when it is released, you will notice stark discrepancies.
 
Chalnoth said:
I'm pretty certain that if you compare the HDR seen in FarCry to that seen in Half-Life 2, when it is released, you will notice stark discrepancies.


Sorry for not being "in the know," but what differences do you mean :?:
 
Alstrong said:
Sorry for not being "in the know," but what differences do you mean :?:
Well, I haven't seen Half-Life 2 yet, so I don't know what exact compromises Valve has made to achieve HDR on older processors. I don't really know for certain.

One possible symptom of a partial HDR solution, for example, is that lights produce bloom effects but shiny objects do not. The screenshots I've seen so far seem to lead in this direction, but certainly haven't proven it.
 
Well, to be fair, I think you can get emulated FP16 quality on the R300+. Does it have a 16-bit-per-component integer framebuffer format?

If so, you convert a float into a coded 16-bit int on write, and then unconvert on read when multipassing. However, you'll have to do filtering inside the shader as well.

It'll be a lot slower than native FP16 filtering and blending, but it would still work.
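Something like this, I mean; a toy linear coding, where the [0, 8) range is an arbitrary choice for illustration:

```cpp
#include <algorithm>  // std::min, std::max

// Toy version of the idea: store an HDR value in a 16-bit integer
// channel by scaling into a fixed range, and undo it on read. The
// shader has to do the decode, the blend/filter math, and the
// re-encode itself, since the fixed-function units won't.
const float kRange = 8.0f;  // arbitrary; pick what the scene needs

unsigned short encode16(float v)
{
    v = std::min(std::max(v / kRange, 0.0f), 1.0f);  // clamp to [0, 1]
    return (unsigned short)(v * 65535.0f + 0.5f);
}

float decode16(unsigned short i)
{
    return (i / 65535.0f) * kRange;
}
```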
 
DemoCoder said:
Well, to be fair, I think you can get emulated FP16 quality on the R300+. Does it have a 16-bit-per-component integer framebuffer format?

If so, you convert a float into a coded 16-bit int on write, and then unconvert on read when multipassing. However, you'll have to do filtering inside the shader as well.

It'll be a lot slower than native FP16 filtering and blending, but it would still work.

You end up having to render one triangle at a time (actually, you could probably do better than one triangle with a lot of sorting and by calculating the screen space the triangles occupy). Can you imagine how long that would take? If there are 1000 transparent particles on screen, it's going to take 1000 passes.

That's because you have to do float blends, and trying to use integer math on a float packed in an integer simply doesn't work.
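Quick illustration, using 32-bit floats for convenience; the same argument applies to FP16 bits packed into a 16-bit integer channel:

```cpp
#include <cstdio>
#include <cstring>

// Integer arithmetic on raw float bit patterns is meaningless, which
// is why fixed-function integer blending can't do the float math.
int main()
{
    float a = 1.0f, b = 1.0f;

    unsigned int abits, bbits;
    std::memcpy(&abits, &a, sizeof abits);
    std::memcpy(&bbits, &b, sizeof bbits);

    // What an integer blend unit would effectively do to the bits:
    unsigned int sumBits = abits + bbits;

    float bogus;
    std::memcpy(&bogus, &sumBits, sizeof bogus);

    // 1.0f is 0x3F800000; 0x3F800000 + 0x3F800000 = 0x7F000000,
    // which decodes to about 1.7e38 -- garbage, not 2.0.
    std::printf("%g + %g = %g\n", a, b, bogus);
    return 0;
}
```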
 
DemoCoder said:
Well, to be fair, I think you can get emulated FP16 quality on the R300+. Does it have a 16-bit-per-component integer framebuffer format?

If so, you convert a float into a coded 16-bit int on write, and then unconvert on read when multipassing. However, you'll have to do filtering inside the shader as well.
I don't think that would work, as I don't think normal blending would work in any encoded format. In other words, I don't see how this would differ from just using a floating-point framebuffer on ATI hardware.
 