What about this effect starting at 3:20 for GOW3:
http://www.youtube.com/watch?v=QQR6reTiJGA#t=3m20s
Yeah, that is basically the same as the effect you see used on the sun in, say, RDR, etc., and described in that GPU Gems article. You just put a "sun", or a really bright light, where the dude's head is and let the post process do its thing. You can see as well that it's done at a lower res, which is typical for that particular effect.
One tweak to what I said previously: a game doesn't have to be HDR to have the necessary buffers already around. You need a luminance map, so if you are doing tone mapping without HDR then you probably have a luminance buffer of sorts lying around as well.
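For anyone wondering what "let the post process do its thing" actually amounts to, here's a minimal CPU-side sketch of the GPU Gems 3 style radial-blur pass. In a real game this runs as a half- or quarter-res pixel shader; the sample count, decay/weight constants and buffer handling below are illustrative assumptions, not anything pulled from GOW3 or RDR.

```cpp
#include <vector>
#include <cstddef>
#include <algorithm>

// Single-channel "bright pass" image at reduced resolution: sky/sun pixels are
// bright, occluders are ~0. In a real renderer this is a render target, not a vector.
struct Image {
    int width = 0, height = 0;
    std::vector<float> texels;                        // row-major, width * height
    float at(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return texels[static_cast<std::size_t>(y) * width + x];
    }
};

// GPU Gems 3 "Volumetric Light Scattering as a Post-Process" style radial blur:
// for every pixel, march toward the light's screen position and accumulate the
// bright-pass samples with an exponential falloff. Constants are placeholders.
Image lightShafts(const Image& brightPass, float lightX, float lightY)
{
    const int   kSamples  = 32;     // shaders typically use 32-128 taps
    const float kDensity  = 0.9f;   // how far toward the light we march
    const float kDecay    = 0.95f;  // per-tap falloff
    const float kWeight   = 0.05f;  // per-tap contribution
    const float kExposure = 1.0f;   // final scale before compositing

    Image out = brightPass;
    for (int y = 0; y < brightPass.height; ++y) {
        for (int x = 0; x < brightPass.width; ++x) {
            // Step vector from this pixel toward the light, split into kSamples taps.
            float dx = (x - lightX) * kDensity / kSamples;
            float dy = (y - lightY) * kDensity / kSamples;
            float sx = static_cast<float>(x), sy = static_cast<float>(y);
            float illumination = 1.0f, sum = 0.0f;
            for (int i = 0; i < kSamples; ++i) {
                sx -= dx; sy -= dy;                     // walk toward the light
                sum += brightPass.at(static_cast<int>(sx),
                                     static_cast<int>(sy)) * illumination * kWeight;
                illumination *= kDecay;                 // later samples count less
            }
            out.texels[static_cast<std::size_t>(y) * out.width + x] = sum * kExposure;
        }
    }
    return out;                      // additively composited over the scene afterwards
}

int main()
{
    Image bright{64, 36, std::vector<float>(64 * 36, 0.0f)};
    bright.texels[18 * 64 + 32] = 50.0f;                // a single very bright "sun" texel
    Image shafts = lightShafts(bright, 32.0f, 18.0f);
    // 'shafts' now holds light smeared outward from the bright texel at (32, 18).
}
```

The result gets additively composited over the full-res frame, which is also part of why running it at reduced resolution is barely noticeable: it's a blurry additive term to begin with.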
Is that what Far Cry 2 also does? By the way, Far Cry 1 also had god rays when facing the sun with 'paradise' mode set in the menu, though they called them procedural beams/shafts.
You reminded me of some screenshots I saw back in 2007. Like this?
http://www.bungie.net/projects/halo3/asset_popup_viewer.aspx?at=59&cc=21&item=74
http://www.bungie.net/projects/halo3/asset_popup_viewer.aspx?at=59&cc=21&item=79
That repi tweeted it or that the artists preferred A2C?
The use of A2C has nothing to do with resolution. It's just the alpha coverage mask ANDed with the MSAA coverage mask.
No, they're just alpha tested.
Taking repi's word at face value, you have to look at the use of A2C (effectively a dither result with a single sample per pixel) from the standpoint that they're going to be using a lot of said textures. Again... think of your typical insect screen door and think about what happens when you put a ton of them together, overlapping, and then change your own viewing distance. Doesn't mean that everyone's going to like it or that it ends up being ideal.
That the artists preferred A2C.
I thought A2C is where, rather than rendering the full transparency, they only render every other pixel or something (similar to interlacing).
I can see what you're saying about how the screen-door effect would blend together when layered, but from the DF face-off it clearly looks worse than the PS3 version:
http://www.eurogamer.net/articles/digitalfoundry-faceoff-battlefield-bad-company-2?page=1
MotorStorm Pacific Rift and Apocalypse also have them, and so does FFXIII Versus. So there go another three PS3 exclusives that sport the effect, Nebula.
FFXIII regular also had them, in the beach scene for instance (maybe even only, not sure). Not a PS3 exclusive of course, but close enough.
You had them everywhere in FFXIII, it's just that the beach level was the only level (aside from chapter 10) where it was 'in your face'; in the rest of the cases it was pretty subtle and nicely done as well.
Humus does a good job of explaining A2C.
http://www.humus.name/index.php?page=3D&ID=61
It's far more involved than just rendering every other pixel.
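To make the "alpha coverage mask ANDed with the MSAA coverage mask" point concrete, here's a rough conceptual sketch at 4x MSAA. The dither table and rounding are illustrative only; real GPUs use their own patterns, as the Humus post explains.

```cpp
#include <cstdint>
#include <cstdio>

// Conceptual alpha-to-coverage at 4x MSAA: the shader's output alpha is turned
// into a per-pixel mask of covered samples, which the rasterizer then ANDs with
// the triangle's geometric MSAA coverage mask. The pattern below is made up.
uint32_t alphaToCoverageMask(float alpha, int pixelX, int pixelY)
{
    // Screen-position dither so that, e.g., alpha = 0.6 doesn't snap to the same
    // 2-of-4 samples in every pixel. At one sample per pixel this is exactly the
    // "screen door" look; with a proper MSAA resolve it averages out.
    static const float bayer2x2[2][2] = { { 0.125f, 0.625f },
                                          { 0.875f, 0.375f } };
    float dithered = alpha * 4.0f + bayer2x2[pixelY & 1][pixelX & 1];
    int covered = static_cast<int>(dithered);
    covered = covered < 0 ? 0 : (covered > 4 ? 4 : covered);

    // Nested masks: 0, 1, 2, 3 or 4 of the 4 MSAA samples lit.
    static const uint32_t masks[5] = { 0x0, 0x1, 0x3, 0x7, 0xF };
    return masks[covered];
}

int main()
{
    // A 60%-alpha fragment covering the whole pixel: with 4x MSAA it ends up
    // writing 2 or 3 of the 4 samples depending on screen position.
    for (int y = 0; y < 2; ++y)
        for (int x = 0; x < 2; ++x)
            std::printf("pixel (%d,%d): coverage mask 0x%X\n",
                        x, y, static_cast<unsigned>(alphaToCoverageMask(0.6f, x, y)));
}
```

With only a single sample per pixel, A2C degenerates into exactly that kind of per-pixel dither, which is what shows up in the comparison shots; with a real MSAA resolve, or lots of overlapping layers, it averages out to something much closer to true alpha blending.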
The X360 <-> PS3 comparison shots, at least the ones you link, are complicated by the fact that it looks like the PS3 version has a full-screen blur applied to everything, which softens the edges but also makes everything a bit blurrier.
I haven't played the X360 version so I can't comment on how it looks when playing the game, but on PC I don't really notice any of the artifacts from A2C.
Perhaps the X360 version is doing something odd or missing a step or something which exists in the PC version.
Regards,
SB
Eurogamer article on Killzone 3 said: "...so the process of moving anti-aliasing across from GPU to CPU is a good way to free up precious RSX resources, not to mention saving around 18MB of precious graphics RAM."
For someone who doesn't really understand MLAA: I'm assuming MLAA saves graphics memory because the framebuffer is moved over to main XDR RAM, but does MLAA use more RAM overall (XDR + GDDR)?
MLAA saves memory because it is applied as a post-process effect; you don't need to worry about fitting the anti-aliasing into the framebuffer.
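As for where a figure like 18MB can come from, here's a rough back-of-the-envelope, assuming a 1280x720 deferred setup with 2x MSAA along the lines of what Guerrilla published for Killzone 2 (four RGBA8 MRTs plus depth/stencil). The Killzone 3 layout is an assumption here; the article doesn't spell it out.

```cpp
#include <cstdio>

// Back-of-envelope: extra VRAM needed to hold a 2x MSAA G-buffer plus depth
// versus single-sampled, at 1280x720 with 4 bytes per sample per surface.
// The "4 RGBA8 MRTs + D24S8" layout is borrowed from Guerrilla's Killzone 2
// material and assumed for illustration; Killzone 3's real layout may differ.
int main()
{
    const double width = 1280, height = 720;
    const double bytesPerSample = 4;                    // RGBA8 or D24S8
    const int    surfaces = 4 /*MRTs*/ + 1 /*depth*/;
    const double megabyte = 1024.0 * 1024.0;

    double oneSample = width * height * bytesPerSample * surfaces / megabyte;
    double twoSample = oneSample * 2.0;                 // 2x MSAA doubles the samples

    std::printf("1x: %.1f MB, 2x MSAA: %.1f MB, saved by dropping MSAA: %.1f MB\n",
                oneSample, twoSample, twoSample - oneSample);
    // Prints roughly: 1x: 17.6 MB, 2x MSAA: 35.2 MB, saved: 17.6 MB,
    // which lines up with the ~18MB figure the Eurogamer article mentions.
}
```

So the saving comes from dropping the extra MSAA samples on every G-buffer surface rather than from where the framebuffer physically lives; whatever working copy the SPU pass needs is presumably on the order of a single 720p colour buffer, which is small by comparison.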