First Killzone screenshot/details? So says USAToday..

nAo, I am not a person who knows much about graphics, but what I was thinking (probably wrongly) is this: let's say you could produce an infinite number of polygons to create your character. Would you still need AA? I would have thought that with polygons so dense there would be no way to see the jaggies with the naked eye.

Sorry for using such an extreme example, just wondering if my thinking is correct?

You'd still require AA due to the nature of raster displays. The increased polycount will make your model appear less angular, but it's the flat, grid nature of pixel plotting that prompts the need to reduce aliasing, not any inherent scene-based information.

Maybe one day we'll see vector displays but until then.. :cry:
 
Finally, thanks to nAo for clarifying a lot of the points. What are "internal lens reflection" and "depth-based color grading" used for?
'Internal lens reflection' is 'lens flare'. It gives the game optical realism, as though filmed through a camera. 'Depth-based colour grading' is, I presume from the name, adjusting a pixel's colour based on its depth in the scene. When you have 3D data, you can do interesting post-processing based on things like normals and distance to change hue, saturation, intensity, or whatever. In this case, you can add atmospherics based on depth, shifting the hue and darkening things. Ye Olde Fog was such an application of adjusting a pixel value by depth. I don't know the specifics of this particular KZ effect though. nAo may have something specific in mind.
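The depth-based grade described above can be sketched as a simple post-process: blend each pixel toward a fog tint and darken it as depth increases. A minimal per-pixel sketch in Python (the fog colour, density, and extra darkening factor are illustrative assumptions, not KZ2's actual values):

```python
import math

def depth_color_grade(rgb, depth, fog_color=(0.35, 0.30, 0.25), density=1.5):
    """Shift a pixel's colour toward a fog tint and darken it with depth.

    rgb       : (r, g, b) linear colour, each channel in [0, 1]
    depth     : normalised scene depth in [0, 1]
    fog_color : tint the scene fades toward (hypothetical values)
    """
    # Classic exponential fog factor: 0 = untouched, 1 = fully fogged.
    fog = 1.0 - math.exp(-density * depth)
    # Blend toward the tint, then darken distant pixels a little extra.
    return tuple((c * (1.0 - fog) + f * fog) * (1.0 - 0.3 * fog)
                 for c, f in zip(rgb, fog_color))

near = depth_color_grade((1.0, 1.0, 1.0), 0.0)   # depth 0: unchanged
far  = depth_color_grade((1.0, 1.0, 1.0), 1.0)   # depth 1: tinted and darker
```

Ye Olde Fog is exactly the degenerate case where the blend target is a single flat colour; a grading pass just generalises that blend into arbitrary hue/saturation/intensity adjustments.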

Terarrim said:
Sorry for using such an extreem example, just wondering if my thinking is correct?
Nope. As archangelmorph rightly says, the problem is more in the display. The display is made up of discrete pixels, each of which has a colour. Graphics are rendered as discrete pixels of separate colours to show on these displays. Jaggies appear when pixels contrast heavily. If you have a white slope against a black background, each white pixel will stand clear against the black and you'll see the stepping, whether that slope is made of 1 polygon or 1 million.

The solution is either to have pixels so small that you can't see the stepping, which probably won't happen for decades if ever, or to make sure the contrast between pixels isn't too strong. When you view a photo on a display, jaggies are hard to come by. This is because each pixel is shaded by a blend of different surfaces. In a computer rendering, each pixel is made 100% of one surface. Thus in the white slope example, a pixel either shows the white slope or the black background. If you were photographing the slope, that pixel would have a degree of white slope in it and a degree of background, and its intensity would be an average.

Antialiasing does this. It takes multiple samples and averages the results. A photo can have what amounts to basically 'infinite' samples in creating its average, as it's made of photons, which gives smooth gradients. Computers have to calculate samples, and that means not-so-smooth gradients. 2x AA means, in the above example, either white pixels, black pixels, or 50% grey pixels; a gradient shows two colour steps. 4x AA adds 25% and 75% intensity, with four colour steps on gradients.

This is how we can determine the amount of AA in a game: find an area of jaggies, a slightly off-horizontal or off-vertical line, and see how many colour steps it has. These colour steps are of less contrast than the two original colours, but can still be visible.

Infinite AA won't fix the problem either. Photographs can still show jaggies on our displays. Car programmes are a good example, with the inside view often showing a bright outside through the window, and a jaggy window frame. Gameshows with horizontal lighting on steps are another culprit. HD will reduce these issues, but won't make them go away.
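Shifty's intensity-step arithmetic can be checked with a few lines of Python. This is a toy ordered-grid supersampling model (evenly spaced sample placement is a simplifying assumption; real hardware uses rotated or sparse sample patterns):

```python
def pixel_intensity(edge, samples):
    """Average coverage of a pixel crossed by a white/black edge.

    The white surface covers the pixel from x = 0 up to x = `edge`
    (a fraction in [0, 1]); `samples` evenly spaced sub-pixel samples
    are tested and averaged, as in an ordered-grid supersample.
    """
    hits = sum(1 for i in range(samples) if (i + 0.5) / samples < edge)
    return hits / samples

# An edge covering 70% of the pixel:
no_aa = pixel_intensity(0.7, 1)   # 1 sample  -> snaps to pure white (1.0)
aa_2x = pixel_intensity(0.7, 2)   # 2 samples -> the 50% grey step (0.5)
aa_4x = pixel_intensity(0.7, 4)   # 4 samples -> a 75% step 2x can't produce
```

With 1 sample the only outputs are 0 and 1 (hard jaggies); 2 samples add the single 50% grey step; 4 samples add the 25% and 75% steps, exactly the counting trick described above for identifying a game's AA level.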
 
COD 4 looks like just another video game.

*snip*

codyo0.jpg


killzoneyk5.jpg


I really hope I don't get in trouble for doing such a blatant comparison (it's after E3, right?) but it's the simplest way to do it. This is a similar scene from both games. One of these two games looks like it's on a different generation of hardware; it's that simple. I'll let you guess which one.

Thanks for articulating yourself with a long post but I think you just hit the nail on the head with the bolded line.

COD4 has the look of a game.
KZ2 has the look of CG.

It's as simple as that.
 
'Internal lens reflection' is 'lens flare'. *snip*

Nope. As archangelmorph rightly says, the problem is more in the display. *snip*

Good and interesting post!
 
On my computer something like 2x still leaves the game looking like a flickering mess (like most PS2 games).

so, if that is what they are doing, i like it lol

Perhaps the post-process effects help to reduce the jaggies. And nAo noted that there was some aliasing visible in the IGN video (only for IGN members :cry: ).

2x MSAA can do that..if you employ a 'good' AA resolve, proper gamma correction and a dark palette!
I downloaded the 720p trailer from IGN and I can assure you that aliasing is there..
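nAo's point about a 'good', gamma-correct resolve can be illustrated: averaging MSAA samples directly in sRGB space darkens edge pixels, while converting to linear light first gives the perceptually correct blend. A minimal sketch using the standard sRGB transfer functions (the two-sample setup is illustrative, not any console's actual resolve hardware):

```python
def srgb_to_linear(c):
    # Standard piecewise sRGB decode (IEC 61966-2-1).
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Standard piecewise sRGB encode.
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def resolve_naive(a, b):
    """Average two MSAA samples directly in sRGB space (the 'bad' resolve)."""
    return (a + b) / 2

def resolve_gamma_correct(a, b):
    """Average in linear light, then re-encode: a 'good' resolve."""
    return linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)

# A white/black edge pixel with one sample on each side:
naive   = resolve_naive(1.0, 0.0)          # 0.5 -> reads too dark on screen
correct = resolve_gamma_correct(1.0, 0.0)  # ~0.735 -> perceptually halfway
```

The naive resolve makes antialiased edges look dark and ropey, which is part of why the same 2x MSAA can look noticeably better or worse between games, and why a dark palette hides the remaining steps.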
 
Thanks for the A1 explanation, Shifty. Another couple of years asking questions about graphics and I might consider myself a graphics non-newb :).
 
Thanks for articulating yourself with a long post but I think you just hit the nail on the head with the bolded line.

COD4 has the look of a game.
KZ2 has the look of CG.

It's as simple as that.

Wow.
No, not even close. Playing the gameplay videos on an HDTV looks nothing like CG.

Originally Posted by nAo
and they also have tons and tons of fillrate intensive stuff on screen with no apparent slowdowns (edram is overrated anyway this generation )
Maybe they're lowering AA at some parts to free up some fillrate? That's a problem EDRAM on 360 would have prevented.
 
Wow.
No, not even close. Playing the gameplay videos on an HDTV looks nothing like CG.

Wow.
It was soo "not even close" that it managed to fool IGN :rolleyes:

IGN said:
The opening was once again fantastic, but we couldn't help but sit there and think, "When are we finally going to see some in-game footage?" The only thing was that we had been looking at in-game footage. As soon as our soldier hits the ground and his gun comes into view, very much like what we saw with the opening to Resistance, we couldn't help but think, "Holy hell, all of that was in-game?"

Indeed, it's quickly apparent that Guerilla has come much closer to the original trailer than most anyone thought possible. It's not 100% identical to be sure, but there are times (quite often) when it's really, really damn close.
 
How hard is it to fool IGN, though? I cannot imagine it being that difficult :LOL:

They are probably technoignorant: they judge what they see and don't particularly care how it's done.

People here do.

I am more like IGN myself; if the end result looks good, I really couldn't care less how they do it.
 
Wow.
It was soo "not even close" that it managed to fool IGN :rolleyes:
It's easy to be fooled when you're not looking. First time through, the viewer is swept along on the emotion of the footage. There's a dramatic story here, and they're more interested in that than anything else. If you ignore that, stop looking at the characters, and analyse the rendering, it's very apparent. But a first look is going to be misleading, as first eyes see things very differently to later eyes.
 
It's easy to be fooled when you're not looking. First time through, the viewer is swept along on the emotion of the footage. There's a dramatic story here, and they're more interested in that than anything else. If you ignore that, stop looking at the characters, and analyse the rendering, it's very apparent. But a first look is going to be misleading, as first eyes see things very differently to later eyes.

You think I don't know all of that?

I didn't say it IS CG, but it has that "CG look" about it.
I can watch the HD trailer on an HDTV right now, knowing that it hasn't reached the level of E3 05, but still see that it has a "CG look" and comes closer than any other game to achieving that.

On the technical side maybe you can shed some light on how they are achieving that?
 
'Internal lens reflection' is 'lens flare'. It gives the game optical realism as though filmed through a camera. *snip*

Thanks ! Can always count on you for concise answers. I am praying for a dynamic environment to showcase some of those atmospheric effects and gameplay.
 
'Internal lens reflection' is 'lens flare'. It gives the game optical realism as though filmed through a camera.
Yes and no. Lens flares have been rendered as 2D sprites for ages, so even if in this case we are talking about the same physical effect, the implementation (and final result..) can be vastly different, as KZ clearly uses a full-screen post-processing method.
They got a nice chromatic aberration effect. :)
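A full-screen chromatic aberration pass of the general kind can be sketched as a per-channel sample offset. This is a minimal toy version (a fixed horizontal shift; real implementations typically scale the offset radially from the screen centre, and nothing here is from Guerrilla's actual code):

```python
def chromatic_aberration(image, strength=1):
    """Offset the red and blue channels in opposite directions.

    image    : list of rows, each row a list of (r, g, b) tuples in [0, 1]
    strength : offset in pixels (hypothetical fixed value)
    """
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Clamp sample positions at the image border.
            xr = min(max(x + strength, 0), w - 1)   # red sampled from the right
            xb = min(max(x - strength, 0), w - 1)   # blue sampled from the left
            row.append((image[y][xr][0], image[y][x][1], image[y][xb][2]))
        out.append(row)
    return out

# A white bar on black: edges pick up red/blue fringes, interior stays white.
bar = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (1.0, 1.0, 1.0),
       (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)]
fringed = chromatic_aberration([bar])
```

Because each channel reads from a different screen position, high-contrast edges split into coloured fringes, the same artefact a real camera lens produces, which is part of the 'filmed through a camera' look.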
 
nAo, where is a good place to read about it ? I tried googling for "internal lens reflection" but it's not very helpful.
 
Nope. As archangelmorph rightly says, the problem is more in the display. *snip*

The solution is either to have such small pixels that you can't see the stepping - which probably won't occur for decades if ever, or to make sure the contrast between pixels isn't too strong.


Awesome post Shifty, another "Post of the Month" for you.
:yes:
 
Perhaps the post-process effects help to reduce the jaggies. And nAo noted that there was some aliasing visible in the IGN video (only for IGN members :cry: ).

It's possible, I suppose. We've seen plenty of games use DOF and motion blur to help cover up jaggies. There's also the old "edge filter + selective blur", but IMO that gives poor results based on what I've seen in my own experiments and also in the GRAW2 demo for PC.

I think it's probably more a result of the low-contrast colors used, which did wonders for Gears of War.
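The 'edge filter + selective blur' approach mentioned above can be sketched in 1D greyscale. A minimal illustration of how it works, and why it tends to look worse than real multisampling (it softens everything the edge detector flags, texture detail included):

```python
def edge_blur_aa(row, threshold=0.5):
    """Blur only pixels that sit on a high-contrast edge (1D greyscale sketch).

    row       : list of intensities in [0, 1]
    threshold : gradient magnitude above which a pixel counts as an edge
    """
    out = list(row)
    for x in range(1, len(row) - 1):
        # Simple central-difference gradient as the edge detector.
        if abs(row[x + 1] - row[x - 1]) > threshold:
            # 3-tap box blur applied only at detected edges.
            out[x] = (row[x - 1] + row[x] + row[x + 1]) / 3
    return out

hard_edge = [0.0, 0.0, 1.0, 1.0, 1.0]   # a black/white jaggy step
smoothed = edge_blur_aa(hard_edge)       # step spreads into a short ramp
```

Unlike MSAA, the intermediate values here come from averaging already-rendered pixels rather than from extra scene samples, so no new information is added; the step is merely smeared, which is consistent with the poor results reported above.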
 