Digital Foundry Article Technical Discussion Archive [2011]

Status
Not open for further replies.
What about this effect starting at 3:20 for GOW3:

http://www.youtube.com/watch?v=QQR6reTiJGA#t=3m20s

Yeah, that is basically the same as the effect you see used on the sun in, say, RDR, etc., and described in that GPU Gems article. You just put a "sun", or a really bright light, where the dude's head is and let the post-process do its thing. You can also see that it's done at lower res, which is typical for that particular effect.

One tweak to what I said previously: a game doesn't have to be HDR to have the necessary buffers already around. You need a luminance map, so if you are doing tone mapping without HDR then you probably have a luminance buffer of sorts lying around as well.
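The GPU Gems technique referenced above can be sketched in a few lines. Below is a minimal NumPy mock-up of the screen-space radial blur: each pixel marches toward the light's screen position, accumulating an occlusion buffer with exponential falloff. Function and parameter names are illustrative; a real implementation runs as a pixel shader, usually at reduced resolution as noted above.

```python
import numpy as np

def light_shafts(occlusion, light_xy, num_samples=32, density=1.0,
                 decay=0.95, exposure=0.4):
    """Screen-space radial blur toward the light (GPU Gems-style sketch).

    occlusion: 2D array, bright where the light/sky is visible, dark
               where geometry (e.g. the character) blocks it.
    light_xy:  light position in pixel coordinates (x, y).
    """
    h, w = occlusion.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Per-pixel step vector toward the light's screen position.
    dx = (light_xy[0] - xs) * (density / num_samples)
    dy = (light_xy[1] - ys) * (density / num_samples)
    result = np.zeros_like(occlusion, dtype=np.float32)
    weight = 1.0
    sx, sy = xs.copy(), ys.copy()
    for _ in range(num_samples):
        sx += dx
        sy += dy
        ix = np.clip(sx.round().astype(int), 0, w - 1)
        iy = np.clip(sy.round().astype(int), 0, h - 1)
        result += occlusion[iy, ix] * weight  # accumulate with falloff
        weight *= decay
    return result * exposure
```

The result is added back over the scene; since it is a blur, running it at half or quarter resolution (as these games do) costs little in quality.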
 
Is that what Far Cry 2 also does? And by the way, the first Far Cry also had god rays when facing the sun in 'paradise' mode (set in the menu), though they called them procedural beams/shafts.
 
Like in this vid:

http://www.youtube.com/watch?v=_k0in9pH6e4

It does look to be the same basic idea, although their version looks tweaked a touch to make it seem more like streaks. Probably the same underlying technique, though. Haven't played the first Far Cry in ages, so I can't remember that one.



Yeah, it's tried and true. Especially if they are far off and you can't get to them, they look really cool. If you can walk up to them, then you can always put two planes perpendicular to each other and fade them based on the angle to the camera. As you start getting around to the side of one of the fake polygons, it will start to fade away and the other one will start to fade in.
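The crossed-planes fade described above boils down to weighting each plane by how directly the camera faces it. A tiny sketch (illustrative names, 2D for brevity, directions assumed normalized):

```python
import math

def crossed_billboard_alphas(view_dir, n1=(1.0, 0.0), n2=(0.0, 1.0)):
    """Fade weights for two perpendicular 'fake volume' planes.

    view_dir: normalized camera-to-effect direction in the ground plane.
    n1, n2:   the two planes' normals (perpendicular to each other).
    A plane is most visible when viewed head-on (|dot| near 1) and
    fades out as the camera swings around to its edge (|dot| near 0).
    """
    a1 = abs(view_dir[0] * n1[0] + view_dir[1] * n1[1])
    a2 = abs(view_dir[0] * n2[0] + view_dir[1] * n2[1])
    return a1, a2
```

Facing one plane head-on gives it full opacity and the other none; at 45 degrees both sit around 0.7, so the cross never reads as a flat card from any angle.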

I think Castlevania uses all three methods. They have good old fake polygons like in this pic for ambience on the sides:

http://images.gamersyde.com/image_castlevania_lords_of_shadow-13529-1869_0011.jpg

...and the sun rays in this pic:

http://images.gamersyde.com/image_castlevania_lords_of_shadow-13529-1869_0005.jpg

...and I believe they do true volumes in one instance, but I can't find a darned screen shot for that one :(

EDIT: Hmm, maybe they aren't doing volumes. In this video at 5:18 and 5:40, it looks like they are just using a combination of the other two tricks.

http://www.youtube.com/watch?v=xMzSJ0rdSfI&feature=relmfu
 
That repi tweeted it, or that the artists preferred A2C?

The use of A2C has nothing to do with resolution. It's just the alpha coverage mask ANDed with the MSAA coverage mask.

No, they're just alpha tested.

Taking repi's word at face value, you have to look at the use of A2C (effectively a dithered result with a single sample per pixel) from the standpoint that they're going to be using a lot of said textures. Again... think of your typical insect screen door and think about what happens when you put a ton of them together, overlapping, and then change your own view distance. That doesn't mean everyone's going to like it or that it ends up being ideal.
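The mask logic described a few posts up (the alpha coverage mask ANDed with the MSAA coverage mask) can be sketched directly. This is a simplified model with illustrative function names; real hardware also dithers the sample pattern per pixel, which is what produces the screen-door look at one sample per pixel.

```python
def alpha_to_coverage(alpha, num_samples=4):
    """Convert an alpha value to an MSAA coverage bitmask.

    round(alpha * num_samples) of the sample bits are set, so alpha is
    quantized to num_samples + 1 coverage levels.
    """
    covered = round(alpha * num_samples)
    return (1 << covered) - 1  # e.g. alpha=0.5 at 4x -> 0b0011

def final_coverage(alpha, geometry_mask, num_samples=4):
    """A2C mask ANDed with the triangle's MSAA coverage mask."""
    return alpha_to_coverage(alpha, num_samples) & geometry_mask
```

Note there is no blending anywhere: each sample is simply written or not, which is why A2C is order-independent and cheap compared to sorted alpha blending.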

That the artists preferred A2C.


I thought A2C is where rather than rendering the full transparency, they only render every other pixel or something (similar to interlacing).

I can see what you're saying about how the screen door effect would blend together when layered, but from the DF face off it clearly looks worse than the PS3 version:
http://www.eurogamer.net/articles/digitalfoundry-faceoff-battlefield-bad-company-2?page=1

http://images.eurogamer.net/assets/articles//a/9/9/1/9/0/7/360_screendoor2.jpg.jpg
http://images.eurogamer.net/assets/articles//a/9/9/1/9/0/7/PS3_screendoor2.jpg.jpg

So is there a performance issue, or was it really an artistic decision?
 
Humus does a good job of explaining A2C.

http://www.humus.name/index.php?page=3D&ID=61

It's far more involved than just rendering every other pixel. :)

The X360 <-> PS3 comparison shots, at least the ones you link, are complicated by the fact that the PS3 version looks like it has a full-screen blur applied to everything, which softens the edges but also makes everything a bit blurrier.

I haven't played the X360 version so I can't comment on how it looks when playing the game, but on PC I don't really notice any of the artifacts from A2C.

Perhaps the X360 version is doing something odd or missing a step or something which exists in the PC version.

Regards,
SB
 
MotorStorm Pacific Rift and Apocalypse also have them, and so does FFXIII Versus. So there go another three PS3 exclusives that sport the effect, Nebula ;).
 
FFXIII regular also had them, in the beach scene for instance (maybe even only there, not sure). Not a PS3 exclusive of course, but close enough.
 
You had them everywhere in FFXIII; it's just that the beach level was the only one (aside from chapter 10) where it was 'in your face'. In the rest of the cases it was pretty subtle and nicely done as well.
 
True, I forgot (it's been a while, and I still only played till chapter 6 or so). But on the beach it was easy to confirm that the god rays were sort of realtime, in that a lot of random people walk around the beach and can block the rays from your perspective.
 
What blur? Are you talking about the snow being blown through? There isn't any blur in the game.

The 360 version doesn't have any MSAA, which is what gets rid of the screen-door effect on PC.
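One way to see why MSAA hides the A2C dither: the resolve averages the coverage samples, so alpha is quantized to n+1 levels instead of the 2 you get at a single sample per pixel. A toy model (illustrative, ignores the per-pixel dither pattern):

```python
def resolved_alpha(alpha, num_samples):
    """Effective alpha a pixel ends up with after the MSAA resolve.

    A2C sets round(alpha * num_samples) of the coverage samples; the
    resolve averages them, quantizing alpha to num_samples + 1 levels.
    With a single sample (no MSAA), each pixel snaps to 0 or 1 --
    the visible screen-door pattern.
    """
    covered = round(alpha * num_samples)
    return covered / num_samples
```

So at 4x MSAA an alpha of 0.6 resolves to 0.5, a passable gradient step, while without MSAA the same pixel snaps fully on.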
 
eurogamer article on killzone 3 said:
...so the process of moving anti-aliasing across from GPU to CPU is a good way to free up precious RSX resources, not to mention saving around 18MB of precious graphics RAM.

As someone who doesn't really understand MLAA: I'm assuming MLAA saves graphics memory because the framebuffer is moved over to main XDR RAM, but does MLAA use more RAM overall (XDR + GDDR)?
 
MLAA saves memory because it is applied as a post-process effect, so you don't need to fit extra samples per pixel into the framebuffer.
 
n× MSAA (or QAA) requires n samples per pixel in VRAM. That's why MLAA saves memory compared to hardware AA.

While your question is still valid, you don't need to keep your framebuffer (or a full copy of it) in main memory to do full-screen CPU post-processing (a la MLAA).
 
The article is referring to the amount saved compared to KZ2's G-buffer. Going from quincunx to a single sample per pixel means the 36MB (5 × 2 × 1280 × 720 × 32bpp buffers) is halved.

MLAA done on SPUs means that they'll need a copy of the frame in XDR.
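The arithmetic behind those figures is easy to check. A quick sanity check (decimal megabytes, matching the article's ~18MB; buffer count and layout per the G-buffer description above):

```python
def gbuffer_bytes(num_buffers, samples_per_pixel, width, height,
                  bytes_per_pixel=4):
    """Total size of a multi-sampled G-buffer in bytes."""
    return num_buffers * samples_per_pixel * width * height * bytes_per_pixel

MB = 1_000_000  # decimal megabytes, as the article appears to use
quincunx = gbuffer_bytes(5, 2, 1280, 720) / MB  # 2 samples/pixel (KZ2)
single   = gbuffer_bytes(5, 1, 1280, 720) / MB  # 1 sample/pixel + MLAA
# quincunx is ~36.9 MB, single is ~18.4 MB: roughly the 18MB saving cited
```

Dropping from two samples to one halves every buffer, which is where the "around 18MB of precious graphics RAM" comes from.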
 