The Game Technology discussion thread *Read first post before posting*

It took some of the most experienced SPU coders weeks of optimization to make MLAA run as well as it does in GOW3. I don't think any developer that's not trying to showcase PS3 tech (read: not owned by Sony) will spend that much time trying to get MLAA to work. Which multiplat engine is designed to pass buffers between the Cell and RSX anyways?
 
It took some of the most experienced SPU coders weeks of optimization to make MLAA run as well as it does in GOW3. I don't think any developer that's not trying to showcase PS3 tech (read: not owned by Sony) will spend that much time trying to get MLAA to work. Which multiplat engine is designed to pass buffers between the Cell and RSX anyways?

Saboteur?

GoW3 had extra constraints for SPU post-processing, being a hack and slash targeting 60 fps.
Their primary challenge (I assume) was to reduce the latency of adding SPUs to the pipeline.

That's not necessarily the case for most games, and many can get away with an extra frame of latency, making MLAA or any other SPU post-processing implementation significantly easier.
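
Roughly, the trade looks like this (just a sketch of the idea with made-up function names, not anyone's actual engine code):

Code:
/* Sketch with hypothetical helpers (not real SDK calls) of why an extra
 * frame of latency makes SPU post-processing easy: the SPUs filter the
 * previous frame while the RSX renders the current one, instead of both
 * racing to finish inside the same 16.6 ms as a 60 fps title must. */
static void rsx_render(int frame)     { (void)frame; /* kick RSX for frame n */ }
static void spu_mlaa_async(int frame) { (void)frame; /* SPUs AA frame n-1    */ }
static void present(int frame)        { (void)frame; /* flip frame n-2       */ }

void frame_loop(void)
{
    for (int n = 0; ; ++n) {
        rsx_render(n);         /* GPU busy on the newest frame            */
        spu_mlaa_async(n - 1); /* SPUs overlap on the frame just rendered */
        present(n - 2);        /* one extra frame of latency, no stalls   */
    }
}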
 
I'm just thinking that Alan Wake is 5xxp because it probably started development around the same time as a few other well-known 5xxp titles on the 360 that have already been released. This game has been in the making for a very long time, and they probably never switched to tiling, as it's much easier to add other effects than to significantly change the rendering pipeline and art asset requirements; the latter is probably about 93% of the work on this title, comparable to Heavy Rain.
 
It took some of the most experienced SPU coders weeks of optimization to make MLAA run as well as it does in GOW3. I don't think any developer that's not trying to showcase PS3 tech (read: not owned by Sony) will spend that much time trying to get MLAA to work. Which multiplat engine is designed to pass buffers between the Cell and RSX anyways?
You must have missed TB's post on this. You can drop it into any game with a few function calls. As Betan says, Santa Monica didn't want to add a frame of latency. The question is a matter of eking out performance. I imagine UE3 games probably have enough spare cycles to support the AA, though whatever scheduling methods are available in UE3 may complicate its inclusion. Ultimately though, as I understand it, if you can spare 20ms of SPU time, you should be able to slot the AA method into any engine, as it's a post effect, and the libraries are available to use.
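
For illustration only, the integration point would look something like this (made-up names, NOT the actual library API):

Code:
#include <stdint.h>

/* Hypothetical integration sketch. The point is that MLAA is a pure post
 * effect over the finished colour buffer, so hooking it in is basically
 * a sync out of RSX, one SPU job, and a sync back. */
static void wait_for_rsx(uint32_t *buf)                    { (void)buf; }
static void mlaa_kick_spu_job(uint32_t *buf, int w, int h) { (void)buf; (void)w; (void)h; }
static void mlaa_wait(uint32_t *buf)                       { (void)buf; }

void apply_spu_mlaa(uint32_t *color, int w, int h)
{
    wait_for_rsx(color);            /* RSX has finished writing the frame  */
    mlaa_kick_spu_job(color, w, h); /* the ~20ms of SPU time quoted above  */
    mlaa_wait(color);               /* AA'd buffer handed back for display */
}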
 
Then every game coming out in the near future and beyond should have this sort of AA instead of no AA or QAA, if they can spare the cycles.
 
This game has been in the making for a very long time, and they probably never switched to tiling, as it's much easier to add other effects than to significantly change the rendering pipeline and art asset requirements; the latter is probably about 93% of the work on this title, comparable to Heavy Rain.

That's not true - 540p and 4xAA would not fit into the EDRAM without tiling. The game had also been running at 720p before; the reduced resolution probably has performance- or memory-related reasons.

In fact, the entire game's tech looks mighty impressive as we're able to see more of it. And I get the impression that a lot of the really good stuff is being held back...
 
You must have missed TB's post on this. You can drop it into any game with a few function calls. As Betan says, Santa Monica didn't want to add a frame of latency. The question is a matter of eking out performance. I imagine UE3 games probably have enough spare cycles to support the AA, though whatever scheduling methods are available in UE3 may complicate its inclusion. Ultimately though, as I understand it, if you can spare 20ms of SPU time, you should be able to slot the AA method into any engine, as it's a post effect, and the libraries are available to use.
Saboteur used the much less impressive method of edge detect/blur compared to GOW3.

Then again, I thought MLAA wasn't available for other devs to use; they'd have to build their own. Did Sony decide to include GOW3's MLAA code in the dev tools now?
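
For reference, edge detect/blur really is trivial next to MLAA; a toy luma-based version (my own sketch, not Pandemic's code) is basically:

Code:
#include <stdint.h>
#include <stdlib.h>

/* Toy edge-detect + blur in the Saboteur style. Where MLAA classifies
 * edge shapes and blends with computed weights, this just finds strong
 * luma gradients and blurs across them. */
void edge_blur(const uint8_t *src, uint8_t *dst, int w, int h, int threshold)
{
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            int c  = src[y * w + x];
            int gx = abs(c - src[y * w + x + 1]);   /* horizontal gradient */
            int gy = abs(c - src[(y + 1) * w + x]); /* vertical gradient   */
            if (gx + gy > threshold) {
                /* cheap cross blur: softens the jaggies but also any
                 * texture detail near the edge, hence the soft look */
                dst[y * w + x] = (uint8_t)((2 * c +
                    src[y * w + x - 1] + src[y * w + x + 1] +
                    src[(y - 1) * w + x] + src[(y + 1) * w + x]) / 6);
            } else {
                dst[y * w + x] = (uint8_t)c;
            }
        }
    }
}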
 
That's not true - 540p and 4xAA would not fit into the EDRAM without tiling. The game had also been running at 720p before; the reduced resolution probably has performance- or memory-related reasons.

Indeed. Just a single 32-bpp RT at 540p with 4xAA is 8.29MB. Double that for depth. The 720p vids weren't of good enough quality to ascertain MSAA, though. If they were 2xMSAA, the RT would be 7.37MB.
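
For anyone who wants to check the arithmetic against the 10 MB of EDRAM:

Code:
#include <stdio.h>

/* Render target footprint = width * height * bytes per pixel * MSAA
 * samples, in decimal MB. Matches the figures above. */
static double rt_mb(int w, int h, int bytes, int samples)
{
    return (double)w * h * bytes * samples / 1e6;
}

int main(void)
{
    printf("540p 4xMSAA colour:  %.2f MB\n", rt_mb(960, 540, 4, 4));     /* 8.29  */
    printf("540p 4xMSAA + depth: %.2f MB\n", 2 * rt_mb(960, 540, 4, 4)); /* 16.59: > 10 MB, so tiling */
    printf("720p 2xMSAA colour:  %.2f MB\n", rt_mb(1280, 720, 4, 2));    /* 7.37  */
    return 0;
}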

One would argue the benefits of 4xMSAA for A2C, particles, and shadows (perhaps any reflections on water if they have them) given how the game is essentially all of the above. :p It isn't easy with other performance considerations.

We know now they have some sort of deferred lighting solution as well. So at bare minimum we would have normals + depth per pass. It just scales up from there.
 
Yeah, on top of tiling they also have to render multiple buffers - they're doing a LOT of work.
 
Saboteur used the much less impressive method of edge detect/blur compared to GOW3. Then again, I thought MLAA wasn't available for other devs to use; they'd have to build their own. Did Sony decide to include GOW3's MLAA code in the dev tools now?

You have missed his point, imho. Making smart use of MLAA depends on the developers' work. We can only discuss its 'proper' use, and maybe SMS discovered a better technique to find edges more efficiently with an alternate SPE job, but whether a 'modest' software house can reach similar results with simple edge detection in the end depends only on the developers' work.
 
You must have missed TB's post on this. You can drop it into any game with a few function calls.

In my own personal experience, nothing related to SPUs has ever been a simple matter of dropping in a few function calls. :p

Besides, like they said, it took them a lot of work to optimize their code enough to get it down to that nice 20ms number. So unless they make their code available, it would mean going through a similar effort. And even if they did make it available, it may need to be adapted to a studio's own in-house SPU/task-scheduling/post-processing framework.
 
In my own personal experience, nothing related to SPUs has ever been a simple matter of dropping in a few function calls. :p

Besides, like they said, it took them a lot of work to optimize their code enough to get it down to that nice 20ms number. So unless they make their code available, it would mean going through a similar effort. And even if they did make it available, it may need to be adapted to a studio's own in-house SPU/task-scheduling/post-processing framework.
That's what I think too. They should at least make the API available, if not the source code.
 
Laa-Yosh said:
That's not true - 540p and 4xAA would not fit into the EDRAM without tiling.

I'm thinking they're using a 16bpp framebuffer in the EDRAM at any one time. Anyone remember how Halo 3 worked?
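
Quick check on that idea, with my own guesses at the formats (16bpp colour, and either a 32bpp or 16bpp depth buffer):

Code:
#include <stdio.h>

int main(void)
{
    /* My arithmetic only: a 16bpp colour target at 540p 4xMSAA. */
    double color16 = 960.0 * 540 * 2 * 4 / 1e6; /* 4.15 MB                      */
    double depth32 = 960.0 * 540 * 4 * 4 / 1e6; /* 8.29 MB if D24S8 stays 32bpp */
    printf("16bpp colour + 32bpp depth: %.2f MB\n", color16 + depth32); /* 12.44: still > 10 MB */
    printf("16bpp colour + 16bpp depth: %.2f MB\n", color16 * 2);       /* 8.29: would just fit */
    return 0;
}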
 
So unless they make their code available...
I'm assuming they have, as Sony's Core Technology Group were involved, and it'd be a waste to develop an effective set of libraries, which T.B. has dropped into programs, but not share them. ;)
And even if they did make it available, it may need to be adapted to a studio's own in-house SPU/task-scheduling/post-processing framework.
Post-processing should be a nicely isolated point in most games. Basically you've finished the frame, and then perform a second pass. You'd want to drop the GOWAA into the initial post-processing step so the FB is antialiased, and then apply your colour filters and effects. I'm having trouble imagining a post-processing phase where that isn't possible.
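
Something like this ordering, with made-up names:

Code:
#include <stdint.h>

/* Hypothetical post chain; the ordering is the point. AA runs on the
 * finished frame first, so the game's own filters see clean edges. */
typedef struct { uint32_t *pixels; int w, h; } framebuffer_t;

static void gow_mlaa(framebuffer_t *fb)          { (void)fb; }
static void apply_bloom(framebuffer_t *fb)       { (void)fb; }
static void apply_color_grade(framebuffer_t *fb) { (void)fb; }

void post_process(framebuffer_t *fb)
{
    gow_mlaa(fb);          /* antialias the FB first...           */
    apply_bloom(fb);       /* ...then effects work on clean edges */
    apply_color_grade(fb);
}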

Remember, Santa Monica's scheduling difficulties were because they were being highly optimal with the hardware to hit their 60fps target. If they didn't care for that, the post-processing integration wouldn't be a problem once they had achieved their optimisations.
 
Are there any articles or presentations out there that describe what is involved in using AA samples to reconstruct an image used in games like Ratchet and Clank, Lair, WWE etc? Google fails me, as does Insomniac's R&D page.
 
Are there any articles or presentations out there that describe what is involved in using AA samples to reconstruct an image used in games like Ratchet and Clank, Lair, WWE etc? Google fails me, as does Insomniac's R&D page.
Yeah, I want to know about that too, and I've also tried searching for articles, with no luck.
 
Are there any articles or presentations out there that describe what is involved in using AA samples to reconstruct an image used in games like Ratchet and Clank, Lair, WWE etc? Google fails me, as does Insomniac's R&D page.
Reconstruction is just a fancy signal-processing term for a filter kernel. Nvidia started throwing it around in the graphics realm to describe their Quincunx resolve, back when the GeForce 3 was news.

Image reconstruction means an (unspecified) filter that makes pixels from samples. If someone says "AA resolve", they mean the same thing. If someone says "downsample my AA'd backbuffer", they mean the same thing. Language, 'sall.
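
E.g. the plain box-filter resolve is just this (single-channel sketch):

Code:
/* The simplest "reconstruction": a box-filter resolve that averages the
 * n samples of each pixel. Quincunx, MLAA etc. are just different
 * choices of filter over the same samples. */
void resolve_box(const float *samples, float *pixels, int w, int h, int n)
{
    for (int p = 0; p < w * h; ++p) {
        float sum = 0.0f;
        for (int s = 0; s < n; ++s)
            sum += samples[p * n + s]; /* n samples stored per pixel */
        pixels[p] = sum / (float)n;    /* one pixel out              */
    }
}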
 