Question for developers... PS3 and framerate

Joker454, I've been holding off for a while on asking how development with the Edge tools is going... have you been able to meet your framerate target on PS3? Does that include any level of MSAA?

We aren't using Edge directly, just implementing some of the things it does ourselves (CPU skinning, culling, etc.) because we just can't drop Edge into our codebase. We haven't hit 60fps on PS3 yet, but we still have time. It will hit 60fps sometimes, but we're still unable to sustain it. We are doing 4x MSAA at the moment.
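For anyone wondering what "doing it ourselves" roughly looks like, here's a bare-bones sketch (not Edge code and not our code, just the general idea in plain C++) of CPU-side triangle culling: reject backfacing and fully off-screen triangles so RSX never has to touch them.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct Vec4 { float x, y, z, w; };

    // Signed area of the projected triangle; <= 0 means backfacing (CCW winding).
    // Note: triangles crossing the near plane really need clipping; skipped here.
    static bool IsBackfacing(const Vec4& a, const Vec4& b, const Vec4& c)
    {
        const float ax = a.x / a.w, ay = a.y / a.w;
        const float bx = b.x / b.w, by = b.y / b.w;
        const float cx = c.x / c.w, cy = c.y / c.w;
        return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay) <= 0.0f;
    }

    // Trivial reject: all three vertices outside the same clip plane.
    static bool OutsideFrustum(const Vec4& a, const Vec4& b, const Vec4& c)
    {
        if (a.x >  a.w && b.x >  b.w && c.x >  c.w) return true;
        if (a.x < -a.w && b.x < -b.w && c.x < -c.w) return true;
        if (a.y >  a.w && b.y >  b.w && c.y >  c.w) return true;
        if (a.y < -a.w && b.y < -b.w && c.y < -c.w) return true;
        if (a.z >  a.w && b.z >  b.w && c.z >  c.w) return true;
        if (a.z <  0.f && b.z <  0.f && c.z <  0.f) return true;
        return false;
    }

    // clipPos holds already-transformed (clip space) positions; the output is a
    // compacted index list that is all the GPU ever gets to see.
    std::vector<uint16_t> CullTriangles(const std::vector<Vec4>& clipPos,
                                        const std::vector<uint16_t>& indices)
    {
        std::vector<uint16_t> out;
        out.reserve(indices.size());
        for (std::size_t i = 0; i + 2 < indices.size(); i += 3)
        {
            const Vec4& a = clipPos[indices[i]];
            const Vec4& b = clipPos[indices[i + 1]];
            const Vec4& c = clipPos[indices[i + 2]];
            if (IsBackfacing(a, b, c) || OutsideFrustum(a, b, c))
                continue;
            out.push_back(indices[i]);
            out.push_back(indices[i + 1]);
            out.push_back(indices[i + 2]);
        }
        return out;
    }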
 
Joker and other Devs,

Could one or more of you give us a summary of what you CAN say about the RSX (or what you are willing to say)? I'm fascinated with the conversation so far in this thread, but would love to read a basic overview of the RSX.

For example, I have found many references to a document called "RSX Best Practices" but I don't think it's available to the public. It would be interesting (in my opinion) to learn what are the best practices for the RSX, which are the worst, and of course information about what falls in between.

Edge is a fascinating subject. However, the public has a lot of information about the CELL aspect of Edge. But what we don't know is the second half (the RSX).
 
I'm curious... are most PS3 games (multiplatform) that run at a low FPS mostly CPU bound rather than GPU bound? The reason I ask is that it would seem logical that in these scenarios nothing gets ported to the SPUs and PPE chokes. Also, since games that look as good as Motorstorm do not accelerate graphics through the use of the SPUs but make good use of SPUs for just about everything else... it would seem to me that most bad PS3 ports are likely mostly CPU bound rather than GPU? Or am I being too kind to RSX?
 
I'm curious... are most PS3 games (multiplatform) that run at a low FPS mostly CPU bound rather than GPU bound? The reason I ask is that it would seem logical that in these scenarios nothing gets ported to the SPUs and PPE chokes.
The GPU's just as capable of being choked if assets aren't designed with the PS3 architecture in mind. The problem could lie anywhere. Well, it's almost certainly the fault of lazy devs who spend all their time in hammocks drinking cocktails, like they do. :p
 
The GPU's just as capable of being choked if assets aren't designed with the PS3 architecture in mind. The problem could lie anywhere. Well, it's almost certainly the fault of lazy devs who spend all their time in hammocks drinking cocktails, like they do. :p

How rude, I'm sure they squeeze in some quality coding time in between the cocktails! :p
 
But as you know, 4 Gpixels/sec is enough for HD resolutions and 60fps only if the graphical complexity is limited. The limit is for complexity that is not far above last generation: improved, yes, but not much of an improvement, especially at 1080p.
If you had 50 Gpixels/sec, you could still only hit 60fps if the graphical complexity is limited. That limit is simply higher.

For 1080p I sort of agree, but not everything is about fillrate. More RAM and math help a lot.
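To put rough numbers on that, using the 4 Gpixels/sec figure quoted above (purely back-of-the-envelope; real limits come from shader cycles and bandwidth long before raw fill):

    #include <cstdio>

    // How many times could you touch every pixel per frame on raw ROP fill alone?
    int main()
    {
        const double fillrate = 4.0e9;            // pixels per second, from the quote
        const double fps      = 60.0;
        const double px720p   = 1280.0 * 720.0;
        const double px1080p  = 1920.0 * 1080.0;

        std::printf("720p60  overdraw budget: %.0f\n", fillrate / (px720p  * fps)); // ~72
        std::printf("1080p60 overdraw budget: %.0f\n", fillrate / (px1080p * fps)); // ~32
        return 0;
    }

Even at 1080p60 the raw fill budget is dozens of writes per pixel per frame, which is why the interesting constraints are math and memory rather than the ROPs themselves.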

Well, developers' decisions are often based on hardware. You can say framerates have nothing to do with hardware, but then one can make the argument that framerates have a lot to do with hardware.
It doesn't matter. They make the final call about the tradeoff between framerate and graphical fidelity.

If the hardware was more powerful, what makes you think the developer would have the same target and make the same decisions? You can only make that argument for cross platform games whose primary development platform was a different console. In that case, fine, faster hardware on the non-primary console would likely lead to higher framerates.

In all other cases, though, framerates would be more or less the same regardless of hardware power (except for the extreme case where a console has so much power that devs can't find anything to do with the extra render time that significantly improves image quality when halving the framerate).

Well, because framerates were, on some arcade platforms, effectively fixed in games of the 1990s, I don't believe it's impossible to keep framerates steady and high for many games. Namco's System 22/23 families and Sega's Model 2 & Model 3 families ran more than 95% of their games at 60fps.
They did not have fixed framerates regardless of scene complexity. I assure you that those games could run over 100fps most of the time (when scene load was low or medium) if they did not have v-sync enabled.

Again, it was developer choice back then to maintain 60fps. Part of the reason is that they didn't have a whole lot to do. 2D games rarely require you to touch each pixel more than once. Clipping is very easy. There's no 3D world to maintain. There wasn't much you could do with twice the render time per frame, so 30fps didn't make much sense. Graphical quality was almost entirely dependent on art.

With 3D games back then, there wasn't much to do beyond, say, layering a couple textures on top of one another. Having more render time in a 30fps game didn't buy you much, because you couldn't do normal mapping or dependent texturing or arbitrary math. RAM was a pretty big constraint on graphical fidelity too.
While you could say this generation's console GPUs are the most powerful console GPUs ever, and that would be true, the upgrade from last generation is smaller than the upgrade from two generations ago to last gen, which was huge.
Depends on how you look at it. I think some of the hardware from 2 generations ago had some glaring deficiencies. PS1, for example, had no filtering, and N64 had very little storage. Last gen didn't have to deal with a big resolution increase, either. Finally, if you're using XBox as your comparison point, remember that the span of time between XBox and XB360 was pretty small. (It's a shame Sony couldn't push graphics harder with the extra year they had.)

Fillrate has not kept up with advancements in resolution. Or at best, fillrate has kept even, but that doesn't allow for much of an improvement in graphics.
You're ignoring a lot of details here. First of all, fillrate has improved 4.3x from XBox to 360 (8.6x w/4xAA), and the resolution increase is a factor of 3. Secondly, the Z-culling is so fast now that hidden pixels are basically free. Finally, we have no performance hit for alpha blending either. All these factors explain why fillrate heavy things like grass and smoke look much, much better this gen.

Most important is the fact that we have a huge increase in math ability in the pixel shaders (15-30x) on top of it all being floating point. That makes for a substantial difference in graphics quality.
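Spelling those ratios out (the clock and ROP counts below are the commonly cited figures, which are assumptions here rather than anything stated in this thread):

    #include <cstdio>

    int main()
    {
        const double xboxFill = 4 * 233.0e6;      // NV2A: 4 pipes x 233 MHz
        const double x360Fill = 8 * 500.0e6;      // Xenos: 8 ROPs x 500 MHz
        const double resXbox  = 640.0 * 480.0;
        const double res720p  = 1280.0 * 720.0;

        std::printf("fillrate:   %.1fx\n", x360Fill / xboxFill);   // ~4.3x
        std::printf("resolution: %.1fx\n", res720p  / resXbox);    // 3.0x
        // The "8.6x w/4xAA" figure above credits the extra MSAA sample writes
        // that the eDRAM absorbs at no additional ROP cost.
        return 0;
    }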

You and a lot of people have forgotten what last gen really looks like.
 
(It's a shame Sony couldn't push graphics harder with the extra year they had.)

I recently got some of the backstory on why they went with RSX. Turns out it's *hugely* political alas, and there was other hardware that got pushed aside because of it. It's unlikely you'll get anyone willing to talk publicly about it though, but if you have some friends close to Sony and the like, then probe around and they may be willing to fill you in privately. It is a shame though.

Finally, we have no performance hit for alpha blending either

I agree with most of what you've said, but a quick point on the above: it's not free on PS3. You definitely want to avoid alpha blending if you can on RSX and go with alpha-to-coverage instead.
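The PS3-side API is under NDA, so purely as an illustration of that trade in desktop OpenGL (drawFoliage() is a made-up stand-in for whatever submits the alpha-tested geometry, and a multisampled render target is assumed to be bound):

    #include <GL/gl.h>

    void drawFoliage();   // hypothetical: submits the alpha-tested foliage geometry

    void drawFoliagePass(bool useAlphaToCoverage)
    {
        if (useAlphaToCoverage)
        {
            // Cheaper on blend-limited hardware: MSAA coverage dithers the edge,
            // with no read-modify-write of the colour buffer per covered pixel.
            glDisable(GL_BLEND);
            glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);   // core since GL 1.3
            drawFoliage();
            glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE);
        }
        else
        {
            // Classic alpha blending: needs back-to-front sorting and pays the
            // blend cost on every pixel the foliage touches.
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
            drawFoliage();
            glDisable(GL_BLEND);
        }
    }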

You and a lot of people have forgotten what last gen really looks like.

Amen to that. I think many would be shocked at what their old-gen games really look like if they fired them up today on, say, an original Xbox/PS2/Cube.
 
I recently got some of the backstory on why they went with RSX. Turns out it's *hugely* political alas, and there was other hardware that got pushed aside because of it. It's unlikely you'll get anyone willing to talk publicly about it though, but if you have some friends close to Sony and the like, then probe around and they may be willing to fill you in privately. It is a shame though.

Joker is your reference here in terms of the reason Sony went RSX vs their original plan (which we won't get into in this thread), or why they went RSX vs a G80 derivative? Because I think Mintmaster was referring to the latter...
 
I'm curious... are most PS3 games (multiplatform) that run at a low FPS mostly CPU bound rather than GPU bound? The reason I ask is that it would seem logical that in these scenarios nothing gets ported to the SPUs and PPE chokes. Also, since games that look as good as Motorstorm do not accelerate graphics through the use of the SPUs but make good use of SPUs for just about everything else... it would seem to me that most bad PS3 ports are likely mostly CPU bound rather than GPU? Or am I being too kind to RSX?

Initially it was probably a bit of both; today it's just the GPU. Remember though that multiplatform devs have it much harder than PS3-only devs. The main reason is that PS3-specific devs have only one version of the game, and hence nothing to compare it to. So, no one will be comparing Uncharted 360 to Uncharted PS3. This makes life much simpler for them, because they can use all kinds of tricks and shortcuts to get the game running faster, and most people will be none the wiser because there is nothing else to directly compare it to.

Multiplatform devs don't have that luxury; both versions need to look as close as possible. The little tricks that the PS3-only devs can do on their versions would get us killed on our multiplatform versions, because sites like IGN, Gamespot, etc. will almost certainly do an A-to-B comparison between our 360 and PS3 versions and cry foul when they detect that version A has a triangle someplace that version B doesn't. Worse yet are sites like this one, where it's almost certain that someone will plaster up screenshots of both and people will tear apart one version because of x, y and z differences. So we have to get the same assets working on both platforms.
 
Amen to that. I think many would be shocked at what their old-gen games really look like if they fired them up today on, say, an original Xbox/PS2/Cube.
I recently fired up GT3 and GT4 and they definitely don't look as good as I remember, but still decent. But yeah, I agree.
 
Could one or more of you give us a summary of what you CAN say about the RSX (or what you are willing to say)? I'm fascinated with the conversation so far in this thread, but would love to read a basic overview of the RSX.
Good god... what is it with your obsession to cling to some false hope that there's some deep dark secret about RSX that will create miracles? There isn't one! There's no point trying to wheedle out that which doesn't exist. Sure we can work a little closer to the metal and cheat our way through a few things because it's a console, but those details are neither significant enough to warrant your pleading nor is it something you'll get to know about anytime soon.

How rude, I'm sure they squeeze in some quality coding time in between the cocktails!
Our internal office Onion has an article about a study that proves that delays and unclosed bugs result from rampant unchecked teetotaling.

First of all, fillrate has improved 4.3x from XBox to 360 (8.6x w/4xAA), and the resolution increase is a factor of 3. Secondly, the Z-culling is so fast now that hidden pixels are basically free. Finally, we have no performance hit for alpha blending either. All these factors explain why fillrate heavy things like grass and smoke look much, much better this gen.
Considering the target mentioned as being "fillrate-limited" was PS3, I get the feeling he was talking about PS3 vs. PS2, in which case, fillrate is only up by a factor of 1.7x, but then PS2 was a fillrate monster for its time. I'd also add alpha blending on PS3 is not exactly free when you start laying it on thick. Z-culling is certainly a weapon, and it takes a lot of per-pixel hits out of the state where geometric complexity makes a difference.

Of course, the resolution of your display in turn affects asset resolution (texel fillrate), as well as things like shadow map resolution and so on, and that's where I think fillrate starts to come into question. One could also argue that if we didn't have the ability to use things like Early-Z, we'd be fillrate limited quite easily.

Either way, I think I can name a lot of people who would be happier if HD was not a requirement this gen. Unless both 360 and PS3 had GPUs that stood toe-to-toe with G80, I wouldn't have considered it a reasonable demand.
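For reference, the usual way to make Early-Z pay off is a depth pre-pass. A minimal sketch in desktop OpenGL (the draw functions are hypothetical placeholders; this isn't anyone's actual engine code):

    #include <GL/gl.h>

    void drawOpaqueGeometry();   // hypothetical: submits opaque scene geometry
    void drawShadedGeometry();   // hypothetical: same geometry with full shaders

    // Lay depth down cheaply first so the expensive colour pass only shades
    // fragments that actually survive the depth test.
    void renderOpaquePass()
    {
        // Pass 1: depth only, colour writes off, minimal fragment work.
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthMask(GL_TRUE);
        glDepthFunc(GL_LESS);
        drawOpaqueGeometry();

        // Pass 2: full shading; hidden fragments are rejected before shading.
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_FALSE);       // depth is already laid down
        glDepthFunc(GL_LEQUAL);
        drawShadedGeometry();
        glDepthMask(GL_TRUE);
    }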

You and a lot of people have forgotten what last gen really looks like.
Ah yes, the universal insult. If there's something to pick apart and mark as a flaw, it gets the designation of looking "last gen."

There's no accounting for people's lack of brain activity when the image in their head is that more power == nothing to ever complain about.
 
ShootMyMonkey,

I do not think the RSX is a miracle chip. However, I realize there is more about it than most of us have been told. I'm realizing it's severely limited, because I am seeing games getting downgraded before launch (although others are looking better).

To be honest, at this point I'm just curious about the differences for the sake of being curious. If there were anything super special in there we would have seen the results in games by now. We are seeing some great games, but nothing astonishing.

Over the past year or so I've spent a hundred... no, probably closer to a thousand hours searching Google, Yahoo, Lycos, and other search engines for documentation about the RSX. However, there is little that is publicly available.

What is interesting to me is how they have made some alterations not to improve performance but just to stop performance from being awful. For example, the post-transform cache can be used to increase polygon performance. Also, the larger texture caches are there to make sure texturing from XDR RAM does not stall everything. This stuff is fascinating to me, because even though I'm realizing the RSX is nothing special, maybe these lessons can be used to make the RSX2 for the PS4 a better GPU.
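As a toy illustration of why that post-transform cache matters, here is a tiny FIFO model of one (the 24-entry size is an assumption for illustration; real hardware differs). The lower the miss ratio for a given index buffer, the fewer vertices the GPU has to re-transform:

    #include <algorithm>
    #include <cstdint>
    #include <deque>
    #include <vector>

    // ACMR = vertices actually transformed / triangles drawn. Index buffers that
    // reuse recently seen vertices hit the cache and skip the vertex work.
    double AverageCacheMissRatio(const std::vector<uint32_t>& indices,
                                 std::size_t cacheSize = 24)
    {
        std::deque<uint32_t> cache;
        std::size_t misses = 0;
        for (uint32_t idx : indices)
        {
            if (std::find(cache.begin(), cache.end(), idx) == cache.end())
            {
                ++misses;                       // vertex has to be transformed again
                cache.push_back(idx);
                if (cache.size() > cacheSize)
                    cache.pop_front();          // FIFO eviction
            }
        }
        const double triangles = indices.size() / 3.0;
        return triangles > 0.0 ? misses / triangles : 0.0;
    }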

Any information you could provide would be appreciated.
 
However, I realize there is more about it than most of us have been told.
At what level? If you're talking about things like specific swizzle formats or something, forget it... you won't be told that by anyone. If you're talking about broader architectural details, then what you "realize" is untrue. That's all out in the open... everything else is just G7x.

To be honest, at this point I'm just curious about the differences for the sake of being curious.
Coming from you, that's difficult to believe. Given your habit of flailing your arms at dissatisfaction and screaming the same thing just so you can hear something that puts your mind at rest, and also getting defensive otherwise, I can't help but think you're looking for something that will justify your fixation with the PS3. When you "accepted" that it can't possibly be the RSX itself, you started hoping for some Cell+RSX combo miracle.

I can feel the synergy... ;)

What is interesting to me is how they have made some alterations not to improve performance but just to stop performance from being awful.
That's pretty much the most common kind of architectural alteration there is. I'd classify Xenos' eDRAM as an example since it is meant to lift a major limiting factor (that is, framebuffer bandwidth). Hell, OOOE or branch prediction in a CPU could fall in the same group.
 
Joker is your reference here in terms of the reason Sony went RSX vs their original plan (which we won't get into in this thread), or why they went RSX vs a G80 derivative?

Is there any thread where we can get into that, fruitfully?
 
Is there any thread where we can get into that, fruitfully?

First off, I'll use your post as a re-querying of Joker to answer my original question as to whether it was the pre-RSX design or an RSX+ design he was referring to. (Joker!)

Now that said... you may want to try searching some older threads on the topic - it's come up a number of times, to be sure. Those threads won't tell you what it was per se, but they'll tell you what it wasn't. Anyway, as to Joker's references: if they were related to the pre-RSX design, then politics aside, in truth there were a lot of very pragmatic reasons to go NVidia. But if he's talking about some would-be RSX replacement, well then yeah, that would be upsetting, because I was wondering myself back in the day why they wouldn't take advantage of the extra year to update the design.
 
Dunno what Joker has heard... but I heard at least 4 different versions of the same story :)
 
Hello ShootMyMonkey,

Thank you for the response to my question.

I'm really interested in learning more about how the RSX is different from the standard PC part. Anyone can go online and look up info about the 7800GTX or a similar nVIDIA GPU. However, there are certain aspects of the RSX that were modified to make it work better in the PS3, basically to prevent it from having awful performance.

I would like to know more about these alterations and how they relate to the best and worst practices when working with the RSX. Basically, I would LOVE to one day be able to read "RSX Best Practices", because obviously those practices are at least slightly different from the PC part's.

By the way, please realize that I'm the kind of person who likes to look up obscure information. For example, when I was younger I had the Official Technical Guide to the Starship Enterprise memorized.
 