Techniques of special optical effects (lens flare, DOF, etc.) *spawn

Ok thanks, I have a better feel now. If these effects were implemented in a real game instead of a tech demo, it would be easier to "break" the performance. His comments were directed at games, not the tech demo.
 
They would aim for lower quality, skip them, or optimise for a newer architecture/shader model. After all, that tech demo runs on SM2.0 and was built around SM2.0 limitations. The GeForce 6xxx series introduced SM3.0, the NVIDIA 8xxx/ATI 2xxx introduced SM4.0, then the ATI 3xxx/4xxx series brought SM4.1, and DX11 brought SM5.0. Then you have custom approaches going outside the fixed API.
 
Chromatic aberration looks like this:

[image: Chromatic_aberration_(comparison).jpg]


KZ2 does not present that effect. Simple as that.

Not sure if the Crysis 2 demo has it on consoles. If it's like the Crysis games, then it's used for the HUD image, cloak, ocean water, and the screen "EMI" effect or similar. IIRC the chromatic aberration effect has a ~2 ms impact on an 8800, going by the shader notes.
 
That's just what some random guy thinks KZ2 is doing ;)

Seriously, people are quoting me on that?! :oops:

Same thread...
http://forum.beyond3d.com/showpost.php?p=1159400&postcount=50
I had a feeling it was sarcasm, but I didn't bother to check the rest of the thread to see it in context :LOL:

Now I'm scratching my head. Do you want a game to look like that?!
Not really, but since somebody claimed the game had it, discussion ensued xD.

Furthermore, where exactly are/were Guerilla talking about chromatic aberration? I've Googled to find the feature list that claims it as a feature, but haven't found it. All I can find is a repeatedly quoted passage from a Wikipedia article on raytracing.
I think the origin of this rumor was indeed you xD

Actually it does, in the loading screens. ;)
Interesting. Ever used during gameplay?
 
KZ2 does lens reflection + bloom post processing, which disperses light/colors subtly in many situations.

And yes, the loading screens do have very clear chromatic aberration (you can move the SIXAXIS during loading in a limited fashion to toy with the scene just a little). Some dispersion will lead to chromatic aberration in the lens. I'll have to play through the game again to see where else it occurs.
 
When I used the term "chromatic aberration" in that blog post, I was referring to the light being scattered and refracted inside the lens (which is what causes lens flare in the first place). In other words, the phenomenon that causes lens flares to have different colorization than the original light source. I wasn't implying that KZ2 approximates this effect for all incoming light (AKA all pixels on the screen), which is what would give you something like what you have in those Wikipedia images.

Anyway the Guerilla guys obviously didn't do it exactly the same way I did, since they were doing it on SPUs and they did the blurring a bit differently. The way I did it makes more sense for a traditional GPU post-processing pipeline. But it's definitely pretty close, based both on their description and the end result. Either way it's all a big fake, but I think it looks kinda nice. It's especially cool when it happens due to a muzzle flash or a big explosion, which you don't get from games that do it the "old-school" way.
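If anyone's curious what that kind of faked in-lens dispersion boils down to, the usual trick is to blur each color channel of the bright-pass by a slightly different amount before adding the glow back, so bright sources pick up colored fringes. Here's a rough sketch in Python/numpy (purely illustrative; the radii, weights, and function name are made up, and this is not our actual shader code or KZ2's):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dispersion_bloom(hdr, threshold=1.0, strength=0.3):
    """Sketch of dispersion-colored bloom. hdr: (H, W, 3) linear radiance.
    Blurring each channel with a different radius tints the glow around
    bright sources, faking chromatic dispersion inside the lens."""
    bright = np.maximum(hdr - threshold, 0.0)   # bright-pass
    sigmas = (6.0, 5.0, 4.0)                    # R blurred widest -> warm outer fringe
    glow = np.stack([gaussian_filter(bright[..., c], sigmas[c])
                     for c in range(3)], axis=-1)
    return hdr + strength * glow
```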
 
Chromatic aberration is a very subtle effect, and I don't see developers wasting programming time and system resources to do it, since most people wouldn't notice anyway. I've seen people toy with it in CGI, where they add it to give that extra push into photorealism, but on the flip side I've also seen editing programs with the ability to remove it to help clean up a photo.

It's interesting sometimes to see the counter-balance of effects like this. Lens flares, for example. I was watching a Photoshop video that demonstrated the removal of lens flares from a photograph, shortly after I'd spent $130 on a VFX plugin that's designed to create lens flares.

The problem with any of it is that it's nearly always overused in games. So-called HDR (bloom) just for the sake of saying that the game has "HDR lighting" (which is a completely different thing, that most games don't have at all), or in-your-face lens flares just for showing off the lens flares.
 
The problem with any of it is that it's nearly always overused in games. So-called HDR (bloom) just for the sake of saying that the game has "HDR lighting" (which is a completely different thing, that most games don't have at all), or in-your-face lens flares just for showing off the lens flares.

HDR is much more about the art pipeline than it is about bloom. It allows artists to use more realistic parameters for scene lighting and other assets, and at runtime it allows you to perform a more realistic simulation of light transport. It's not even strictly related to bloom... depending on how you implement HDR, it usually lets you have a "better" bloom that's more straightforward in approach (for instance, you can lower your exposure to create a source for the bloom effect, a la Halo 3). I'm not sure how you got the impression that "most" games don't use an HDR pipeline, but it's almost certainly false. Or at least, in terms of PC/Xbox360/PS3 games.
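To make the exposure-as-bloom-source idea concrete, here's a minimal sketch (my own simplified illustration, not Halo 3's actual implementation; the parameter values and the subtract-one step are made up):

```python
import numpy as np

def bloom_source(hdr, exposure, ev_offset=-3.0):
    """Re-expose the HDR buffer a few stops lower so only pixels well
    above the current exposure survive; blur the result and add it back
    after tone mapping to get the glow. hdr: (H, W, 3) linear radiance."""
    darker = hdr * exposure * 2.0 ** ev_offset  # e.g. 3 stops under
    return np.maximum(darker - 1.0, 0.0)        # keep only what still blows out
```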
 
Chromatic aberration is a very subtle effect, and I don't see developers wasting programming time and system resources to do it, since most people wouldn't notice anyway. I've seen people toy with it in CGI, where they add it to give that extra push into photorealism, but on the flip side I've also seen editing programs with the ability to remove it to help clean up a photo.

It's nice in the Crysis games for nanosuit HUD distortion from EMI or damage, the ocean water surface from above and below, and also for cloaked materials. It's fairly subtle but nice, and it reacts according to light-source dispersion, light strength, and even the HUD visor FOV. I also like it in Prey on PC, where it's used for force fields.
 
I was referring to chromatic aberration over the entire image, like the pic that Scofield posted, rather than used for a particular effect. The prismatic effect in Crysis is much more in-your-face than what I was referring to (not to say it's not cool, I agree with you there, it's a wonderful touch, especially on water). I've seen CGI folks use chromatic aberration, but tiny.. 0.5-1.0 pixel shift, not even noticeable, but it does make a subtle change in how the image looks. As if it were a photo taken by a camera with a slightly flawed CCD. In that arena, it's actually the imperfections in the image that make it look more real.. "perfect" images look fake.

MJP, I know very well what HDR is. While they may be using it somewhere in the rendering pipeline, I've seen very few games that actually use high-dynamic-range lighting. In my world (CGI), HDR is (at its most basic level) the ability to go outside the normal bounds of 0-100% illumination (0-255, 8 bits). To have sunlight that is literally a thousand times brighter than a light bulb. In direct application, it tends to yield results with very high contrast (very bright highlights and very dark shadows), without losing any detail. Nothing in the illuminated area is blown out, and nothing in the shadows is crushed. And yes, it can certainly result in some amazingly lifelike results, especially in a linear colorspace (which is a whole other matter; I don't know if games are even touching on that yet).
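To put rough numbers on what I mean (made-up linear values, purely for illustration):

```python
import numpy as np

bulb = 1.0                 # linear radiance, arbitrary units
sunlight = 1000.0 * bulb   # literally a thousand times brighter

# An 8-bit style clamp throws the relationship away entirely:
print(np.clip([bulb, sunlight], 0.0, 1.0))   # [1. 1.] -> both "white"
# A float HDR buffer keeps it: [1.0, 1000.0]
```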

Most games have very flat lighting.. the illuminated areas aren't really that much brighter than the shadows. Now, I've seen games use lighting solutions that yield good results, like Assassin's Creed or Uncharted, but Crysis is about the only game I've seen that really does it well. And even then, you have to push the TOD outside its normal settings to keep the lighting from looking flat.

What I was referring to by the statement of "so-called HDR" are developers who use a limited-range lighting solution, but then toss a bunch of lens flares and bloom onto the image to make it look like it's using a HDR solution. They crank up the settings on things like specular highlights and say "ooo, pretty", but it's not real HDR lighting. For an example, take a look at the "HDR mod" for World of Warcraft. It's literally just a crazy bloom filter, but they're advertising it as a HDR lighting solution for WoW, when it actually does nothing at all to change the game's own illumination settings.

They're getting better at it, and we're seeing more of it, but it's something that some developers appear to be struggling with. Whether that's on the programming side or the art side, I don't know.
 
I was referring to chromatic aberration over the entire image, like the pic that Scofield posted, rather than used for a particular effect. The prismatic effect in Crysis is much more in-your-face than what I was referring to (not to say it's not cool, I agree with you there, it's a wonderful touch, especially on water). I've seen CGI folks use chromatic aberration, but tiny.. 0.5-1.0 pixel shift, not even noticeable, but it does make a subtle change in how the image looks. As if it were a photo taken by a camera with a slightly flawed CCD. In that arena, it's actually the imperfections in the image that make it look more real.. "perfect" images look fake.

Ah, I see. You know, a few days ago I was actually toying with the original flowgraph to apply a fullscreen chroma shift, which looks similar (or is the same effect, just exaggerated).

Obviously the strength I set it to was too high, and it might give you a headache if viewed for too long. :LOL:
http://img51.imageshack.us/img51/210/editor2010061112342135.jpg
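The basic operation is just re-sampling the R and B channels with slightly different scales around the screen center. Something like this sketch (not the actual flowgraph; names are made up and the strength is exaggerated like in my screenshot):

```python
import numpy as np

def chroma_shift(img, strength=0.01):
    """Fullscreen chroma shift sketch: magnify the R channel slightly and
    shrink the B channel around the screen center, leaving G untouched.
    img: (H, W, 3) float array."""
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    out = img.copy()
    for c, scale in ((0, 1.0 + strength), (2, 1.0 - strength)):
        # To magnify a channel by `scale`, sample it closer to the center.
        sy = np.clip(cy + (ys - cy) / scale, 0, h - 1).astype(int)
        sx = np.clip(cx + (xs - cx) / scale, 0, w - 1).astype(int)
        out[..., c] = img[sy, sx, c]
    return out
```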
 
MJP, I know very well what HDR is. While they may be using it somewhere in the rendering pipeline, I've seen very few games that actually use high-dynamic-range lighting. In my world (CGI), HDR is (at its most basic level) the ability to go outside the normal bounds of 0-100% illumination (0-255, 8 bits). To have sunlight that is literally a thousand times brighter than a light bulb. In direct application, it tends to yield results with very high contrast (very bright highlights and very dark shadows), without losing any detail. Nothing in the illuminated area is blown out, and nothing in the shadows is crushed. And yes, it can certainly result in some amazingly lifelike results, especially in a linear colorspace (which is a whole other matter; I don't know if games are even touching on that yet).

Contrast and preservation of detail are not the direct results of HDR rendering. By itself, HDR is really just about what you mentioned in the first part of your post: using a wide range of values for radiance and irradiance during lighting calculations. But all this does is allow you to have a more realistic light transport simulation (AKA simulating the physics of light/material interactions)... it doesn't necessarily enhance the quality of the final image that gets displayed on your screen. In fact, on its own it will give you worse results: if you just map your irradiance values to pixel colors directly, pretty much everything will be blown-out whites.

To make it actually look good, you have to use tone mapping. Tone mapping is essentially where you simulate the process of light entering the eye or a camera, and then striking the retina/sensor/film. It is here where you take into account things like exposure, or contraction of the iris, and apply some sort of non-linear curve to bring the HDR values into the visible range. The results of this process are incredibly subjective... I mean, just think about the amount of time a photographer takes to set up exposure and shutter speed in order to produce an image that looks "good".

A lot of people lump HDR and tone mapping together, but I think it's important to realize that they're separate things. It's especially important in understanding why games look the way they do: most games use the Reinhard operator for tone mapping, which preserves detail but results in a low-contrast "washed-out" look. They also have to automate parameter selection for exposure and other values, since in a game you can't tweak shot-by-shot.
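For reference, the basic Reinhard operator is just this (a simplified per-channel sketch; real implementations usually operate on luminance and feed in an auto-exposure value):

```python
import numpy as np

def reinhard(hdr, exposure=1.0):
    """Basic Reinhard tone mapping: maps [0, inf) radiance into [0, 1).
    It never clips, so detail is preserved everywhere, but highlights
    are compressed heavily -- hence the low-contrast 'washed-out' look."""
    x = hdr * exposure
    return x / (1.0 + x)
```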

Oh, and related to the "linear colorspace" bit... this gen, plenty of games are using a gamma-correct pipeline. This is partly because it's important to getting nice results (no banding!), and also partly because current-gen consoles provide the hardware support necessary for doing it.
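For anyone unfamiliar, "gamma-correct" just means decoding sRGB inputs to linear before lighting and encoding back to sRGB at the end. The standard transfer functions look like this (the exact piecewise versions; the console hardware uses approximations of these):

```python
import numpy as np

def srgb_to_linear(c):
    """Exact piecewise sRGB decode; inputs/outputs in [0, 1]."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    """Exact piecewise sRGB encode; inputs/outputs in [0, 1]."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1.0 / 2.4) - 0.055)
```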

Most games have very flat lighting.. the illuminated areas aren't really that much brighter than the shadows. Now, I've seen games use lighting solutions that yield good results, like Assassin's Creed or Uncharted, but Crysis is about the only game I've seen that really does it well. And even then, you have to push the TOD outside its normal settings to keep the lighting from looking flat.

This is related to what I said above, but lighting looking "flat" doesn't mean a game doesn't use HDR. Ultimately it's probably more related to the kind of tone mapping and color correction used. If you want, have a look at my latest blog, where I have a bunch of screenshots using different tone mapping methods. For instance, you might want to compare this picture (which uses a typical Reinhard operator) with this one (which emulates the tonal response curve of Kodak film). The second has much greater contrast and a more satisfying color range, but both were rendered with the exact same HDR pipeline.
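As a rough illustration of the difference, compare the Reinhard curve above with a well-known filmic approximation (the Hejl/Burgess-Dawson curve; I'm not claiming it's the exact curve from the blog post, but it's the same family of film-response curves):

```python
import numpy as np

def filmic_hejl(hdr, exposure=1.0):
    """Hejl/Burgess-Dawson filmic approximation: deeper blacks and a
    shoulder like film. Note this curve has an sRGB-style gamma baked
    in, so output it directly, without a separate gamma step."""
    x = np.maximum(hdr * exposure - 0.004, 0.0)
    return (x * (6.2 * x + 0.5)) / (x * (6.2 * x + 1.7) + 0.06)
```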

What I was referring to by the statement of "so-called HDR" are developers who use a limited-range lighting solution, but then toss a bunch of lens flares and bloom onto the image to make it look like it's using a HDR solution. They crank up the settings on things like specular highlights and say "ooo, pretty", but it's not real HDR lighting. For an example, take a look at the "HDR mod" for World of Warcraft. It's literally just a crazy bloom filter, but they're advertising it as a HDR lighting solution for WoW, when it actually does nothing at all to change the game's own illumination settings.

What you're describing sounds more like the norm for the previous gen, rather than the current gen. Also using an HDR pipeline doesn't mean that artists can't crank up the bloom...a lot of them like the sort of surrealistic look that it produces.
 
Obviously the strength I set it to was too high, and it might give you a headache if viewed for too long. :LOL:
http://img51.imageshack.us/img51/210/editor2010061112342135.jpg
That's pretty convincing around the window frame, but it's still a pointless feature to go with IMO, except when you want a special effect to highlight a situation, like engaging a Berserk mode or during one of those horrible 'you're dying, so we're going to make everything that much harder to see so you can't actually save yourself' effects that are so depressingly popular these days. Otherwise, why try to recreate a bad lens? Why not add screen blur and high contrast to get as close as possible to the Holga? (example) Quincunx FTW!
 
No game developer with any brain is going to construct an engine that properly fakes chromatic aberration to make their game look like poop, nor are they going to fake it to a realistic level such that it's imperceptible, just like real photographs.

Actually it's not like that. CA has become a cheap post effect to add 'realism' in offline renders, so much so that even we added some to one of our latest works (barely noticeable, though).

I've also seen it in one recent game's trailer, using it as a 'damage' effect, i.e. when the player is shot; can't recall which one, though. If it can be faked using some cheap post effect, I'd expect it to appear in more and more games as well.
 
It makes sense to have an effect like this (or at least one that imitates it) during an EMP blast or such; from what I remember, I saw a similar effect in Crysis & Splinter Cell: Conviction... pretty cool to look at. But outside of those areas it's totally useless.
 
Actually it's not like that. CA has become a cheap post effect to add 'realism' in offline renders, so much so that even we added some to one of our latest works (barely noticeable, though).
Yes, it's the same bitter irony that has software engineers trying to create lens flare while optical engineers try to remove it. If you're going for a cheap-camera look it makes sense, but typically photography uses the best possible equipment, and the results are outstanding fidelity with very little CA; even more so now that companies like Canon provide software compensation for CA in their own lenses.

If we're not going to soften our renders to lose the pixel-perfect edges (something that's cropped up with upscaled content and people saying they prefer it) then the ever-more subtle effect of CA seems even more of a waste. Except...
I've also seen it in one recent game's trailer, using it as a 'damage' effect, i.e. when the player is shot;
...as I said, as a point effect. In which case CA could be replaced with any other fancy effect, and there's no point in targeting CA in particular per se. Just a coloured, image-ghosting, Fresnel-colouring effect of various sorts.
 
Actually it's not like that. CA has become a cheap post effect to add 'realism' in offline renders, so much so that even we added some to one of our latest works (barely noticeable, though).

Crazy. We (as in photographers, or at least big photography lovers, and I would assume videophiles too) try to get rid of CA as much as we can, and b!tch about it a whole lot when it appears in pictures, and you guys try to replicate it in CGI to make it look more realistic!! :D

Next you'll tell me you even try to fake barrel distortion.
(For anyone who doesn't know what that is: it's when you take a picture of a wall or a square head-on with a rubbish lens and it doesn't look perfectly square; it either looks rounded outwards - barrel distortion - or inwards - pincushion distortion.)
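In case anyone actually wants to fake it, the usual radial model is simple enough. A sketch (coefficient made up; with this destination-to-source convention, k > 0 gives barrel and k < 0 gives pincushion):

```python
import numpy as np

def radial_distort(img, k=0.15):
    """Simple radial distortion warp (destination-to-source mapping,
    nearest-neighbor sampling). img: (H, W, 3) array."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Normalized coordinates in [-1, 1], centered on the image
    ny, nx = (ys - h / 2) / (h / 2), (xs - w / 2) / (w / 2)
    scale = 1.0 + k * (nx * nx + ny * ny)   # grows with distance from center
    sy = np.clip(ny * scale * (h / 2) + h / 2, 0, h - 1).astype(int)
    sx = np.clip(nx * scale * (w / 2) + w / 2, 0, w - 1).astype(int)
    return img[sy, sx]
```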

We should try to make CGI look like it was shot with good lenses, not rubbish ones! :D

Just kidding, really, not only about rubbish lenses (the most amazing wide-angle/zoom lenses will still have some barrel distortion, and it's perfectly fine), but about my post in general.
 