HDR in the Crytek demo, not only for NV (56k warning)

Apple740

Now that ATI is "in bed" with Crytek, we (X800 users) can finally see HDR in the CryEngine demo. Why not in FarCry itself?

Edit your system.cfg file:

r_HDRBrightOffset = "6.000000"
r_HDRBrightThreshold = "3.000000"
r_HDRLevel = "1.000000"
r_HDRRendering = "1"

(Default =
r_HDRBrightOffset = "6.000000"
r_HDRBrightThreshold = "3.000000"
r_HDRLevel = "3.000000"
r_HDRRendering = "0")
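
If you'd rather not edit the file, the same CVars can presumably also be set at runtime from the in-game console (I haven't double-checked the exact console syntax, so treat this as a guess):

r_HDRRendering 1
r_HDRLevel 1
r_HDRBrightOffset 6
r_HDRBrightThreshold 3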


[Screenshots: aa.jpg, bb.jpg, cc.jpg, ii.jpg, dd.jpg, ee.jpg, ff.jpg, gg.jpg]
 
None of your images work, even if I copy and paste the link manually. Also, the standard FarCry engine does not enable HDR on ATI cards no matter what you do with those CVars. I'm sure the Crytek demo uses it fine, but they have changed something else between FarCry and the ATI demo.

I kinda halfway expect Crytek to offer HDR on ATI cards in the near future now that they're showing it in the ATI demo, but it's not working currently.
 
Apple740 said:
Now that ATI is "in bed" with Crytek, we (X800 users) can finally see HDR in the CryEngine demo. Why not in FarCry itself?

How big is the performance hit?

By the way, you forgot a : in all your links.
 
Thanks for the ":" tip. :) :oops:


Some benches:

Asus X800Pro@16pipeXTPE
AMD Athlon 64 3000+
1024 MB DDR400 memory
Asus K8V SE Deluxe

Default (1024, 4xAA/8xAF), no HDR.
Min: 23 Avg: 43.39 Max: 66

with 3Dc.
Min: 23 Avg: 45.03 Max: 68

HDR:
Min: 23 Avg: 42.02 Max: 59
 
Apple740 said:
Default (1024, 4xAA/8xAF), no HDR.
Min: 23 Avg: 43.39 Max: 66

with 3Dc.
Min: 23 Avg: 45.03 Max: 68

HDR:
Min: 23 Avg: 42.02 Max: 59

Pretty amazing scores - the 6800s take a 50-60% performance hit from HDR using FP buffers in FarCry, and it can't be used with AA.

Edit:
Are you actually sure that anything other than the bloom effect is added? A 7% hit is just unbelievable for full HDR.
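
(For reference: against the 3Dc run that's (45.03 - 42.02) / 45.03 ≈ 6.7%, and against the plain default run (43.39 - 42.02) / 43.39 ≈ 3.2%.)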
 
Tim said:
Pretty amazing scores - the 6800s take a 50-60% performance hit from HDR using FP buffers in FarCry, and it can't be used with AA.

Not surprising really. Nvidia does lots of stuff first just to have ATI come after and do it better. I'm pretty much expecting them to have a better FP32/SM3.0 and SLI implementation. Just a hunch.
 
trinibwoy said:
Not surprising really. Nvidia does lots of stuff first just to have ATI come after and do it better. I'm pretty much expecting them to have a better FP32/SM3.0 and SLI implementation. Just a hunch.

But ATI has no FP blending functionality; they have to "emulate" HDR with shaders - that should be slower.
 
Tim said:
Apple740 said:
Default (1024, 4xAA/8xAF), no HDR.
Min: 23 Avg: 43.39 Max: 66

with 3Dc.
Min: 23 Avg: 45.03 Max: 68

HDR:
Min: 23 Avg: 42.02 Max: 59

Pretty amazing scores - the 6800s take a 50-60% performance hit from HDR using FP buffers in FarCry, and it can't be used with AA.

Edit:
Are you actually sure that anything other than the bloom effect is added? A 7% hit is just unbelievable for full HDR.

It looks like AA is disabled automatically when I turn HDR rendering on in the config file. No matter what I do, I can't get AA to work.
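
For what it's worth, the AA settings I've been toggling are the ones below (assuming I have the CVar names right - they may vary between patch versions):

r_FSAA = "1"
r_FSAA_samples = "4"

With r_HDRRendering = "1" they seem to be silently ignored.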
 
Tim said:
trinibwoy said:
Not surprising really. Nvidia does lots of stuff first just to have ATI come after and do it better. I'm pretty much expecting them to have a better FP32/SM3.0 and SLI implementation. Just a hunch.

But ATI has no FP blending functionality; they have to "emulate" HDR with shaders - that should be slower.

Unless ATI's shader-based approach to this effect is a lot faster/more efficient than Nvidia's FP blending implementation doing the same?
 
Since AA doesn't work properly with HDR anyway, I get better HDR performance if I turn off AA completely:

1024, noAA/8xAF = Min: 25 Max: 98 Avg: 57.614

1024, noAA/8xAF & HDR = Min: 24 Max: 68 Avg: 47.743

So performance with HDR and no AA is slightly better than with 4xAA and no HDR.
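
Worked out as a percentage, the HDR hit without AA is (57.614 - 47.743) / 57.614 ≈ 17% on the average framerate - noticeably more than the ~7% the earlier comparison suggested, but still nowhere near the 50-60% reported for the 6800s.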
 
Ok, I can see the pictures now :)

But these screenshots are just from the little ATI demo and not from FarCry itself... FarCry is still using a different engine (maybe not significantly different, but nevertheless different enough) and does not yet support HDR on ATI cards.

What would be fun/interesting is to see whether someone can figure out how to "hack" the demo's graphics portion of the engine over to a FarCry install and get it working. I wonder if it's new shaders, a new DLL file, a whole new engine, or some combination of the above?
 
Albuquerque said:
the standard FarCry engine does not enable HDR on ATI cards no matter what you do with those CVars. I'm sure the Crytek demo uses it fine, but they have changed something else between FarCry and the ATI demo.

I kinda halfway expect Crytek to offer HDR on ATI cards in the near future now that they're showing it in the ATI demo, but it's not working currently.

I dumped all the HDR shaders from the demo into the FarCry folder and edited the config file, but no progress. :D :cry:
 
Please please please, in the name of all good things, stop calling a bloom effect "HDRI"...
 
DaveBaumann said:
The engine for the demo is 1.3 whereas the game was 1.2, or something along those lines.

You mean even with the 1.3 patch it's still more like 1.2?
 
You can get the HDR effect for FREE on ANY card simply by turning the contrast up on your monitor to max ;)
 
Laa-Yosh said:
Please please please, in the name of all good things, stop calling a bloom effect "HDRI"...

I agree completely. Perhaps it is not the case here, but everywhere I look I see people confusing HDR with blooming. I'm not sure why people have it in their heads that you should *always* be able to tell when HDR is in use.
 