Deferred Rendering on PS3 (Was KZ2 Tech Discussion)

That's pretty much an expanded version of the slides I was talking about; very interesting stuff, thanks :smile: .

I thought there was a presentation on SPURS and how the developers used it to manage culling on the SPUs? (My memory is a little vague now.) It's supposed to have a more detailed breakdown of the numbers.

Still can't find it. Maybe I'm remembering it wrongly.
 
The trouble is that the PS3 has to keep the framebuffer which it renders into in main memory. Using 2xAA doubles it, using double buffering doubles it, using multiple render targets multiplies it.

Why would double buffering double it? Seems to me that the front buffer could be kept around post-sample-resolve, and you don't need the Z-buffer for the front buffer either. And MRTs wouldn't be double-buffered either. A 1280x720 2xMSAA backbuffer should take up about 15M with Z/stencil. If you resolve the backbuffer and keep it around, you need only about 3M more. Maybe nAo can shed light on this, but I recall from years ago talking with an NVidia employee about how the G7x architecture is capable of custom MSAA resolves; it's just not exposed by any public API (to prove it, he implemented a custom gamma resolve which I had proposed). This seems like it would be ideal in the deferred renderer case, since you can do it as a post-process.

So, the difference between the X360 and the RSX in this case seems to be 8M. All of the buffers take a total of 18M, but the X360 has 10M of EDRAM, leaving a difference of 8M, or 1.5% of total memory.
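To sanity-check those numbers, here's a quick back-of-the-envelope script (assuming 4 bytes per pixel for both color and Z/stencil, which the post doesn't state explicitly):

```python
# Rough framebuffer memory math for a 1280x720 target (sizes in MiB).
# Assumes 4 bytes/pixel for color and 4 bytes/pixel for Z/stencil.
MiB = 1024 * 1024

def buffer_mib(width, height, bytes_per_pixel, samples=1):
    """Size of a render target in MiB."""
    return width * height * bytes_per_pixel * samples / MiB

w, h = 1280, 720
msaa_color = buffer_mib(w, h, 4, samples=2)   # 2xMSAA color samples
msaa_depth = buffer_mib(w, h, 4, samples=2)   # 2xMSAA Z/stencil samples
resolved   = buffer_mib(w, h, 4)              # resolved single-sample buffer

print(f"2xMSAA backbuffer + Z/stencil: {msaa_color + msaa_depth:.1f} MiB")
print(f"resolved front buffer:         {resolved:.1f} MiB")
```

This comes out to about 14.1 MiB for the MSAA backbuffer plus Z/stencil, and about 3.5 MiB for the resolved front buffer, which lines up with the "about 15M" and "about 3M more" figures above.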


The amazing thing to me is, the X360 does have an advantage. It has unified shaders, a more functional GPU, vastly more framebuffer bandwidth, and an easier CPU programming model, plus excellent MS dev support. Yet despite this, no X360 developer seems to be pushing the X360 hardware the way it's being done on the PS3. It could be the laziness of a large market with built-in zillions of sales for top-tier devs, but it seems devs are definitely working on a time-to-market basis, pushing out wares quickly, rather than on years of expensive tech development. The ubiquity of UE3 titles shows that.
 
...It's really as simple as where developers are and what they're doing, in my opinion.

On the 360, the only comparable production (that we know of) with a focus on ground-breaking technology is Alan Wake, and that could be quite something if we get to see it properly. Rare are unpredictable, even though they showed early signs of technical brilliance on the platform.

The PS3, on the other hand, has formidable in-house studios with an aim to push the PS3, in addition to a couple of third-party developers choosing to make their big, technically advanced productions on PS3.

(excuse the off topic)
 
...Its really as simple as where developers are and what they're doing in my opinion.

Doesn't this belong in this thread? http://forum.beyond3d.com/showthread.php?t=51767&page=3
 
I don't understand the point of discussing 360 and PS3 differences in this thread, or which is better or worse on the respective platform... I think in the end it's only a question of personal preference and no more.
 
The amazing thing to me is, the X360 does have an advantage. [...] it seems devs are definately working on a time to market basis, pushing out wares quickly, rather than years of expensive tech development.

I think the devs will become more ambitious over time. Just be patient.

Due to the esoteric architecture, the PS3 devs may need more creativity and supporting mechanisms to get the ball rolling. Sony's marketing people also pushed for differentiation in their first-party games right off the bat.
 
I think the devs will become more ambitious over time. Just be patient.

If they have a captive market that is more than willing to lap up UE3 derivative games then why bother with the additional time and expense of developing a new engine?

Maybe with the arrival of id's new engine we'll see a welcome change in 360 games. PS3 owners also seem pickier and more critical about the quality of games. There also seems to be a very healthy level of friendly competition amongst PS3 devs, even if they trade tech on a regular basis.
 
If they have a captive market that is more than willing to lap up UE3 derivative games then why bother with the additional time and expense of developing a new engine?

Going off-topic now. So I'll be brief. Just wait and see. No effort required on our part. :)

In the mean time, I hope Sony devs continue to surprise us. :D
 
DemoCoder said:
The amazing thing to me is, the X360 does have an advantage. It has unified shaders, a more functional GPU, vastly more framebuffer bandwidth, and an easier CPU programming model, plus excellent MS dev support.

On the flip side of that argument, though, the PS3 does have more total bandwidth to throw around, and a technically superior CPU, as we've been seeing with things like physics, animation, and even graphics-aid tasks for RSX in the SPURS docs.

RSX isn't a far weaker card either, from all indications. Criterion (Burnout Paradise) stated the difference was, at best, marginal. So really you're looking at a more flexible shader architecture and eDRAM... but the reason the eDRAM is there in the first place shouldn't be forgotten: Xenon\Xenos have to split and fight over their bandwidth (which is less than what the PS3 has dedicated per component).

I haven't heard much in the way of praise for the kind of architecture the 360 has, really. Only that it's more dev friendly... but not if you're really trying to push a top-tier project and pump money into it. In that case, if you're going to blow tons of cash, you'll probably get more interesting results from the PS3. (Which might be what we're seeing?)

Yin and yang. As a poster above me said, really it's down to developer preference. And for a deferred renderer, the PS3 is quite apparently the better machine to try to implement it on.

Although I think Crackdown used a deferred renderer.
 
Only that it's more dev friendly..

As we have seen from the results of most cross-platform game comparisons, this seems to be the most important thing -- not something dismissed easily. ;)

Again, the problem lies within the motivations and skillset of the developers. Right now the focus on the 360 seems to be cranking out as many games as possible -- and why not, they sell like gangbusters. The PS3 games need to stand out to get attention, so they get the Killzone 2 type treatment.

I'd caution strongly against making assumptions about the hardware based on what we've seen on the 360 so far, just because the developer intentions are different. And when we have the same developer on both platforms (multi-platform games), the 360 version usually gets the better result. The moral of the story here is: too many variables...
 
As long as the developers want to make the games look "equal", they will eventually be equal once they are familiar enough with both systems. Both machines are capable, but the 360 has one more year in the market, plus a better SDK from the get-go.

GG's GDC 2009 presentations, plus code sharing between studios should shed more light on the KZ2 tech topic.

The thing is, this knowledge will help propel all game devs forward. Techniques critical for the PS3 are useful for the 360 too. At the same time, since the PS3 can already achieve good results when squeezed for memory, it will be able to do more from here on out, since more memory is available now compared to a year or two ago -- if memory is a problem in these situations.

Now back to tech discussions. :)

From a tech perspective, I would like to see what else can be done on the SPUs :devilish:
Hopefully, the art side can work in tandem to produce distinguishable results.
 
Techniques critical for the PS3 are useful for the 360 too.

And this is something I don't believe anymore. It might have been true for high-level stuff like the treatment of textures or shaders, etc. But when it comes to the specifics of Cell/PS3, there's almost nothing left that can be used on the 360 (imo), because it's a totally different architecture.
 
Are you a game programmer with actual coding experience? If not, how can you tell?
 
So what exactly enables a deferred rendering engine to handle so many light sources? I mean, I know it has to do with how the rendering passes work, right?

Take a peek at this link:

http://www.ziggyware.com/readarticle.php?article_id=155

It's a step-by-step guide with code samples for a deferred rendering example. There's some tech jargon, but it's not too bad; it'll give you a better idea of what's going on than slides or PowerPoint presentations will. The setup of the G-buffers will of course vary by implementation, but you get the idea.
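As a rough illustration of why light count scales so well in a deferred renderer, here's a toy CPU-side sketch (not from the linked article; the 4x4 "screen", the flat-floor scene, and the falloff model are made up for illustration). Geometry is written into the G-buffer once, and each additional light only costs work per lit pixel, not per scene triangle:

```python
# Minimal CPU sketch of deferred shading: one geometry pass fills a
# G-buffer, then a lighting pass accumulates every light over it.
import math

W, H = 4, 4

# Pass 1: "geometry pass" fills the G-buffer once, regardless of light
# count. Each texel stores (world position, normal, albedo) for a flat floor.
gbuffer = [[{"pos": (x, 0.0, y), "normal": (0.0, 1.0, 0.0), "albedo": 0.8}
            for x in range(W)] for y in range(H)]

def shade(texel, light_pos, intensity):
    """Simple N.L point light with inverse-square falloff."""
    px, py, pz = texel["pos"]
    lx, ly, lz = light_pos
    dx, dy, dz = lx - px, ly - py, lz - pz
    dist2 = dx * dx + dy * dy + dz * dz
    dist = math.sqrt(dist2)
    nx, ny, nz = texel["normal"]
    ndotl = max(0.0, (dx * nx + dy * ny + dz * nz) / dist)
    return texel["albedo"] * intensity * ndotl / dist2

# Pass 2: "lighting pass" accumulates each light over the G-buffer.
# Cost is (lights x affected pixels), not (lights x scene geometry).
lights = [((1.0, 2.0, 1.0), 4.0), ((3.0, 1.0, 3.0), 2.0)]
framebuffer = [[sum(shade(gbuffer[y][x], lp, li) for lp, li in lights)
                for x in range(W)] for y in range(H)]
print(framebuffer[1][1])  # lit floor texel under the first light
```

A real implementation shades in screen space on the GPU and bounds each light's screen-space footprint, but the scaling argument is the same.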


Jesus2006 said:
It might have been true for high-level stuff like the treatment of textures or shaders, etc. But when it comes to the specifics of Cell/PS3, there's almost nothing left that can be used on the 360 (imo), because it's a totally different architecture.

I think people sometimes forget that the 360 has 3 CPU cores and hence 3 vector units. The PPC cores might be poop, but the vector units aren't bad. Stuff like "CPU-side skinning" may look and sound good on slides, but it's cake to implement on the 360's VMX units if you so choose.
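For concreteness, here's a scalar sketch of the kind of matrix-palette skinning loop being discussed (names and the toy bone palette are illustrative; a real VMX or SPU version would process the same math four floats at a time with SIMD intrinsics):

```python
# Scalar sketch of matrix-palette (linear blend) skinning, the kind of
# per-vertex loop that maps naturally onto VMX/SPU vector units.
# Bone matrices are row-major 3x4 (rotation + translation).

def transform(mat, v):
    """Apply a 3x4 bone matrix to a position."""
    x, y, z = v
    return tuple(mat[r][0] * x + mat[r][1] * y + mat[r][2] * z + mat[r][3]
                 for r in range(3))

def skin_vertex(pos, influences, palette):
    """Blend bone transforms; influences is a list of (bone_index, weight)."""
    out = [0.0, 0.0, 0.0]
    for bone, weight in influences:
        tx, ty, tz = transform(palette[bone], pos)
        out[0] += weight * tx
        out[1] += weight * ty
        out[2] += weight * tz
    return tuple(out)

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
shifted  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0]]  # translate +2 on x
palette = [identity, shifted]

# A vertex weighted half-and-half between the two bones lands halfway.
print(skin_vertex((1.0, 0.0, 0.0), [(0, 0.5), (1, 0.5)], palette))
```

The inner loop is pure multiply-add with no branches, which is exactly why it vectorizes so well on either platform's SIMD units.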


DemoCoder said:
Why would double buffering double it? Seems to me that the front-buffer could be kept around post-sample-resolve, and you don't need the z-buffer for the front buffer either. And MRT's wouldn't be double-buffered either. A 1280x720 2xMSAA backbuffer should take up about 15M with Z/Stencil. If you resolve the backbuffer and keep it around, you need only about 3M more.

With 2xMSAA, somewhere in PS3 memory there must reside a 2560x720 buffer to render to, as well as the regular 1280x720 buffer to resolve out to. The memory for both of these must be available somewhere in the 512MB of RAM. On the 360 you do not need the 2560x720 buffer in the 512MB of RAM: you render MSAA to eDRAM in tiles, and it gets resolved out to the 1280x720 buffer in the 512MB of RAM. That's a big memory penalty on the PS3.

Likewise, rendering PS3 particles to a 1/4-size memory buffer means that somewhere in memory there must exist a 640x360 buffer to render to. If you are lucky and can reuse a buffer from elsewhere then cool; if not, that's more memory eaten. No need for that on the 360: you render particles straight to eDRAM at full res. Etc, etc.

Put it this way: if it ever does become only a 6MB difference then I'll have a party at my LA pad and you're all invited. For now though, it's not :) Memory is what you make of it in any case; God of War looked amazing on a 32MB PS2.
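Putting rough numbers on that point (4 bytes per pixel is my assumption; the post doesn't give a format):

```python
# Extra main-memory cost on PS3 of keeping the unresolved 2xMSAA surface
# and the low-res particle buffer resident, versus rendering them in eDRAM
# tiles on the 360 and only keeping the resolved 1280x720 target.
MiB = 1024 * 1024
bpp = 4  # assumed bytes/pixel for color

msaa_surface = 2560 * 720 * bpp / MiB   # unresolved 2x-wide sample buffer
resolved     = 1280 * 720 * bpp / MiB   # buffer both machines need anyway
quarter_res  = 640 * 360 * bpp / MiB    # 1/4-size particle buffer

print(f"extra PS3 main-RAM cost for 2xMSAA color: {msaa_surface:.2f} MiB")
print(f"extra cost for 1/4-res particle buffer:   {quarter_res:.2f} MiB")
```

That's about 7 MiB for the MSAA color surface alone (before counting a multisampled Z buffer or any MRTs), plus nearly another MiB for the particle buffer, on top of the resolved target both machines carry.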
 
joker454 you're a gem as usual.
I assume that you're still primarily developing on the PS3.

I don't recall you mentioning CPU skinning in the VMX thread -- are you devs using it more effectively now?

I will read when I finish my homework.
 
joker454 you're a gem as usual.
I assume that you're still primarily developing on the PS3.


I thought Joker was only really working on the 360.
 