Question for developers... PS3 and framerate

PS3 devs, would it be true to say that, to a large extent, the PS3's disadvantage in framebuffer bandwidth can be alleviated by using some of the SPU time for detailed culling? Or is that pushing it too far?
 
If I may add something to the list of questions here: it sounds like the general flow is "RSX renders stuff really well, but you should cull, remove bits that are smaller than a pixel, and generally preprocess the scene outside of RSX to feed it only what it really needs to work on and draw" - making it sound like a total graphics powerhouse as long as you don't make it work on stuff that won't even be rendered in the end.

As I don't really have a clue about one graphics system compared to another, or how to program at that level.... is this actually a weird method of drawing a scene compared to other systems? Until the PS3, was the general idea of console and PC graphics based on just throwing everything at the graphics chip and letting it sort out what gets drawn and what doesn't? Some people have mentioned that there was sometimes no point in optimising or backface culling on PS2 because the fill rate was so quick you could draw over unneeded stuff rather than cut it out in preprocessing.

It seems as if some people find the RSX to be weak and in need of assistance from an SPU if you want to see anything good. Even if that's true, is it really a bad thing? It sounds like a good idea to optimise as much as possible before rendering anyway... sounds like a cleaner stream of useful data for the graphics chips.

...or are functions like these things that developers would normally expect to be built into the graphics chip, and the PS3 is unique and a bit backwards by not having a strong preprocessing system on the RSX itself?
 
Until the PS3, was the general idea of console and PC graphics based on just throwing everything at the graphics chip and letting it sort out what gets drawn and what doesn't?

...or are functions like these things that developers would normally expect to be built into the graphics chip, and the PS3 is unique and a bit backwards by not having a strong preprocessing system on the RSX itself?
That's more like it, but AFAIK RSX isn't so much weak as 'not as strong', though I may be wrong on that. It should in principle be just as good at efficiency measures as G70, unless they deliberately gimped those bits to make a chip just as big and expensive but less functional... I think the advantage PS3 has is that Cell is very efficient at these graphics optimization functions. Compare that to XB360, where Xenos is very good at them. Thus the two machines need different approaches to keep everything running at top speed.
 
I'm scared that framerate performance is the real bottleneck of the PS3 architecture instead of the shared RAM.

As already mentioned, it's not "framerate performance".


The *overall* graphics performance of the PS3, or any system, has to be divided up to do many things. One thing is pushing enough frames per second to have fairly smooth movement - a playable framerate. The more graphics performance that goes into geometry, shaders, lighting, HDR, AA, special effects, etc., the less goes into pushing frames per second. HD resolution also plays a HUGE factor in harming framerates. In PS3 (and Xbox 360) games, the resolution is locked to HD, usually at 720p, sometimes 1080p or 1080i.
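To put a rough number on how much HD resolution hurts, here's a quick back-of-the-envelope comparison of pixel counts (a generic sketch, not tied to any particular game or engine):

[code]
#include <stdio.h>

/* Rough pixel counts per frame at common resolutions, to illustrate how much
 * extra fill and shading work HD asks for compared to a typical SD frame. */
int main(void)
{
    const long sd  = 640L  * 480;    /* typical last-gen output */
    const long hd  = 1280L * 720;    /* 720p */
    const long fhd = 1920L * 1080;   /* 1080p */

    printf("480p : %ld pixels\n", sd);
    printf("720p : %ld pixels (%.2fx 480p)\n", hd,  (double)hd  / sd);
    printf("1080p: %ld pixels (%.2fx 480p)\n", fhd, (double)fhd / sd);
    return 0;
}
[/code]

720p is three times the pixels of a 480p frame and 1080p is 6.75 times, before any increase in per-pixel shader cost, so the same fillrate and bandwidth are spread much thinner at HD.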


It should be no mystery why there are framerate issues with many PS3 games. A large part of the problem is pixel fillrate. It's a fairly simple thing to understand. Fillrate is part of the overall graphics performance equation - a large part. Bandwidth is another. Both factors combined, limited fillrate and limited bandwidth, account for lower-than-ideal framerates in most current-gen games, be it on PS3 or Xbox 360.

It seems we've gone backwards from last generation. Fewer games have rock-solid framerates, and there are fewer 60fps games compared to the same point in time last gen.

If Microsoft and Sony had demanded GPUs with, say, at least twice the fillrate (8 Gpixels/sec instead of 4 Gpixels/sec) and double the graphics memory bandwidth, the framerate situation could be more reasonable. There would always be developers that would just waste that on even more advanced graphics, and therefore those games might still have framerate issues - but *responsible* developers would use it to push 60fps, or at the very least 30fps with absolutely no dips below 30, a common problem in games today.
 
Shader performance

It should be no mystery why there are framerate issues with many PS3 games. A large part of the problem is pixel fillrate. It's a fairly simple thing to understand. Fillrate is part of the overall graphics performance equation - a large part. Bandwidth is another. Both factors combined, limited fillrate and limited bandwidth, account for lower-than-ideal framerates in most current-gen games, be it on PS3 or Xbox 360.

I think now they are moving to more shader programs rather than super-fill-rate effects, no?

It seems we've gone backwards from last generation. Fewer games have rock-solid framerates, and there are fewer 60fps games compared to the same point in time last gen.

I disagree, my friend. I would not say backwards so much as a clarification. Xbox/Xbox 360 showed that 30fps is an acceptable frame rate for many games and many customers. I think very few future games will be 60fps. Look at the success of PGR4, Halo 3, and Gears of War: all action games with low frame rates, and still many consumers are happy with them.

I think only sports sims (VT3, VF5, GT5, Forza2) will be 60fps.
 
The question is why you feel the need to question everything without checking the facts for yourself first... Mark Rein said it himself in an interview! ;)
Sorry for the previous post, sometimes I have a variable mood. I'm going to edit it because, except for the first few lines, it's a bunch of crap.
I disagree, my friend. I would not say backwards so much as a clarification. Xbox/Xbox 360 showed that 30fps is an acceptable frame rate for many games and many customers. I think very few future games will be 60fps. Look at the success of PGR4, Halo 3, and Gears of War: all action games with low frame rates, and still many consumers are happy with them.
ihamoitc2005, a good motion blur can help hide the "low" framerate. PGR3 is a great example of this; everything just seemed smoother than it actually was, especially when it came out.

I got bored of Forza 2 (it's a soulless game, in my opinion), but I remember playing it and then launching PGR3 just to have some arcade fun, and I noticed the framerate jumping around (or gaps, a jerking effect that's hard to describe) that I didn't notice before.

Currently PGR3's framerate is ok for me, mainly because I don't play Forza 2 anymore.
 
[*]Skin matrices, even after animation there is some work required to get them into the format used by the GPU vertex shader.
An additional note on this SPU module: it really maps badly to the SPUs, as it basically navigates through hierarchies to collect bone data that will be used (and set) as vertex shader constants by another SPU module.
Despite the fact that it's all about random accesses to memory, it runs blazingly fast. I probably can't post any numbers or comparisons with the PPU, but let me say I was very surprised when I found out how fast an SPU was at (basically) chasing pointers.
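For anyone curious what that pointer-chasing module might look like in spirit, here's a minimal sketch (my own illustration in plain C, not the code being described): walk a bone hierarchy, concatenate each bone's local transform with its parent's world transform, and pack the results into a palette that a later job can set as vertex shader constants. On a real SPU the bone and palette arrays would be DMA'd into local store first, but the traversal is the same idea.

[code]
#include <stddef.h>

/* Hypothetical layout -- the actual engine's data format is not known. */
typedef struct { float m[16]; } Mat4;        /* column-major 4x4 matrix */

typedef struct Bone {
    int  parent;   /* index of the parent bone, -1 for the root        */
    Mat4 local;    /* local transform produced by the animation system */
} Bone;

/* out = a * b (out must not alias a or b) */
static void mat4_mul(Mat4 *out, const Mat4 *a, const Mat4 *b)
{
    for (int c = 0; c < 4; ++c)
        for (int r = 0; r < 4; ++r) {
            float acc = 0.0f;
            for (int k = 0; k < 4; ++k)
                acc += a->m[k * 4 + r] * b->m[c * 4 + k];
            out->m[c * 4 + r] = acc;
        }
}

/* Flatten the hierarchy into a palette of world-space matrices ready to be
 * set as vertex shader constants. Assumes bones are ordered so that a
 * parent always appears before its children. */
void build_skin_palette(const Bone *bones, size_t count, Mat4 *palette)
{
    for (size_t i = 0; i < count; ++i) {
        if (bones[i].parent < 0)
            palette[i] = bones[i].local;
        else
            mat4_mul(&palette[i], &palette[bones[i].parent], &bones[i].local);
    }
}
[/code]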
 
[*]Blend shapes, a custom module that handles facial animations

Based on the "making of" videos, it seemed that the animation is based on FACS... were you using any corrective shapes, or just the base set? In other words, how many shapes per character? And it looks like there are some additional normal maps for eyebrow/nose wrinkles...

PS: AFAIK Half-Life 2 was the first game to use FACS, which is basically a psychological breakdown of facial expressions into about 35-40 individual shapes that can be combined to form practically all possible expressions. But it had no normal maps combined into it; in general I think the character models' faces are a bit overrated (simple photo-based textures). So far, HS seems to have the best implementation of human faces and animation.
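For reference, blend shapes (FACS-driven or otherwise) usually boil down to adding a weighted sum of per-vertex deltas on top of a neutral face. A minimal, hypothetical sketch of what such a module computes (real implementations typically store sparse deltas and skip zero-weight shapes):

[code]
#include <stddef.h>

typedef struct { float x, y, z; } Vec3;

/* One expression shape = a delta from the neutral pose for every vertex,
 * plus the animation-driven weight for the current frame. */
typedef struct {
    const Vec3 *delta;   /* per-vertex offset for this expression */
    float       weight;  /* 0 = off, 1 = fully applied            */
} BlendShape;

/* out[i] = neutral[i] + sum over shapes of (weight * delta[i]) */
void apply_blend_shapes(Vec3 *out, const Vec3 *neutral, size_t vertex_count,
                        const BlendShape *shapes, size_t shape_count)
{
    for (size_t i = 0; i < vertex_count; ++i) {
        Vec3 v = neutral[i];
        for (size_t j = 0; j < shape_count; ++j) {
            float w = shapes[j].weight;
            if (w == 0.0f)
                continue;            /* most FACS shapes are idle per frame */
            v.x += w * shapes[j].delta[i].x;
            v.y += w * shapes[j].delta[i].y;
            v.z += w * shapes[j].delta[i].z;
        }
        out[i] = v;
    }
}
[/code]

With 35-40 FACS-style shapes, only a handful have non-zero weights on any given frame, which is why skipping idle shapes (and storing deltas sparsely) matters in practice.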
 
PS3 devs, would it be true to say that, to a large extent, the PS3's disadvantage in framebuffer bandwidth can be alleviated by using some of the SPU time for detailed culling? Or is that pushing it too far?
SPU culling doesn't save any bandwidth (aside from some corner cases). All vertices still have to be loaded from XDR, except now, instead of sending them all through FlexIO and having RSX's setup hardware eliminate hidden triangles, the SPUs are doing it (at a faster rate). It could actually take up more XDR bandwidth if you decide to temporarily store the results of SPU culling rather than feeding them directly to RSX (this would prevent the SPU from waiting on RSX in portions of high pixel load and low vertex load).
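To make "eliminating hidden triangles" concrete, here's a rough, generic sketch of the usual screen-space rejection tests (my own illustration, not Sony's Edge libraries or any shipping code): drop a triangle if it's back-facing, has (near) zero area, or lies entirely off screen, and only emit the survivors into the index buffer that RSX consumes.

[code]
#include <stdbool.h>

typedef struct { float x, y; } Vec2;   /* post-projection screen coordinates */

/* Twice the signed area of the triangle; the sign gives the winding order. */
static float signed_area2(Vec2 a, Vec2 b, Vec2 c)
{
    return (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);
}

/* Return true if the triangle can be rejected before it ever reaches the GPU.
 * width/height are the render target dimensions in pixels. */
bool cull_triangle(Vec2 a, Vec2 b, Vec2 c, float width, float height)
{
    /* Back-facing (assuming counter-clockwise front faces) or degenerate.
     * A "smaller than a pixel" test would also compare the area against a
     * tiny threshold here. */
    if (signed_area2(a, b, c) <= 0.0f)
        return true;

    /* Entirely outside one edge of the screen. */
    if (a.x < 0.0f    && b.x < 0.0f    && c.x < 0.0f)    return true;
    if (a.y < 0.0f    && b.y < 0.0f    && c.y < 0.0f)    return true;
    if (a.x >= width  && b.x >= width  && c.x >= width)  return true;
    if (a.y >= height && b.y >= height && c.y >= height) return true;

    return false;   /* keep it: write its indices to the buffer sent to RSX */
}
[/code]

The compacted index buffer the SPU produces is what either streams straight to RSX or gets written back to XDR temporarily - which is the extra bandwidth cost mentioned above.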

It should be no mystery why there are framerate issues with many PS3 games. A large part of the problem is pixel fillrate. It's a fairly simple thing to understand. Fillrate is part of the overall graphics performance equation - a large part. Bandwidth is another. Both factors combined, limited fillrate and limited bandwidth, account for lower-than-ideal framerates in most current-gen games, be it on PS3 or Xbox 360.

It seems we've gone backwards from last generation. Fewer games have rock-solid framerates, and there are fewer 60fps games compared to the same point in time last gen.

If Microsoft and Sony had demanded GPUs with, say, at least twice the fillrate (8 Gpixels/sec instead of 4 Gpixels/sec) and double the graphics memory bandwidth, the framerate situation could be more reasonable. There would always be developers that would just waste that on even more advanced graphics, and therefore those games might still have framerate issues - but *responsible* developers would use it to push 60fps, or at the very least 30fps with absolutely no dips below 30, a common problem in games today.
4GPix/s is plenty for 720p or 1080p. For 720p and 60fps, you can fill the entire screen and/or 1024x1024 shadow maps a total of ~70 times per frame.
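The ~70x figure is just arithmetic on the nominal fillrate (a quick check, ignoring all the stall conditions discussed below):

[code]
#include <stdio.h>

int main(void)
{
    const double fillrate   = 4.0e9;           /* pixels per second, nominal */
    const double fps        = 60.0;
    const double pixels_720 = 1280.0 * 720.0;  /* one 720p frame             */
    const double pixels_sm  = 1024.0 * 1024.0; /* one 1024x1024 shadow map   */

    double per_frame = fillrate / fps;         /* ~66.7M pixels per frame    */

    printf("720p full-screen fills per frame: %.1f\n", per_frame / pixels_720);
    printf("1024x1024 shadow map fills/frame: %.1f\n", per_frame / pixels_sm);
    return 0;
}
[/code]

That works out to roughly 72 full 720p fills or about 64 shadow map fills per 60fps frame, hence the ~70 figure.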

The problem is that the ROPs don't always reach that rate in reality. Sometimes bandwidth isn't enough. Sometimes they're waiting for the pixel shader. Sometimes both are waiting for the setup engine. You have to divide up your frame time among the various bottlenecks that each object brings about.

In any case, the reason framerates are low has nothing to do with the hardware. It's entirely due to developers aiming for too many objects and/or too long shaders. You could halve Xenos/RSX and still get 60fps at 1080p with 4xAA if you simplify the scene enough. There's always a three-way tradeoff between scene complexity/quality, resolution, and framerate. Once the renderer and art is as good as it gets, you can't increase all three at the same time. The exact relationship between these three variables isn't fixed across all games, but the devs still subjectively decide what the optimal point is.

With framerate, suppose you have a distribution like this:
89% of the time the framerate is >60fps
10% of the time the framerate is 30fps-60fps
1% of the time the framerate is 15fps-30fps
0.1% of the time the framerate is <15fps.

What should the dev do? Halve graphical load to avoid the 1% of the time that it goes below 30fps? Quarter graphical load to ensure >99.9% of the time it's over 30fps? I say leave it as is, and maybe even increase the graphical load a bit because I wouldn't mind 2% of the time going below 30fps.

Framerate is all about decisions. It has little to do with hardware unless you're talking about a fixed rendering load. Unfortunately, rendering load is incredibly hard to judge without having full access to the assets, game engine, profiler, and hardware.
 
With framerate, suppose you have a distribution like this:
89% of the time the framerate is >60fps
10% of the time the framerate is 30fps-60fps
1% of the time the framerate is 15fps-30fps
0.1% of the time the framerate is <15fps.

What should the dev do? Halve graphical load to avoid the 1% of the time that it goes below 30fps? Quarter graphical load to ensure >99.9% of the time it's over 30fps? I say leave it as is, and maybe even increase the graphical load a bit because I wouldn't mind 2% of the time going below 30fps.

Maybe you could redesign the scenes that tax the engine too much?
 
Maybe you could redesign the scenes that tax the engine too much?

The problem in games is that it's never quite that simple.

Content is flowing in until the day you ship, there can be gameplay ramifications for content changes, there is time pressure, and, probably more relevantly, it's extremely difficult to isolate transient conditions that affect performance. With freeform cameras, locking to a framerate is extremely difficult without being massively conservative.

Now, yes, you can address this, but in practice it may not be worth, for example, changing a particle effect because in rare circumstances it impacts framerate. If framerate is not a top priority (and frankly these days there is so much other stuff to worry about), it'll be inconsistent. Gone are the days when I could personally police every piece of artwork and content in a game.
 
4GPix/s is plenty for 720p or 1080p. For 720p and 60fps, you can fill the entire screen and/or 1024x1024 shadow maps a total of ~70 times per frame.


But as you know, 4 Gpixels/sec is enough for HD resolutions and 60fps only if the graphical complexity is limited - limited to complexity that is not far above last generation. Improved, yes, but not much of an improvement, especially at 1080p.

The problem is that the ROPs don't always reach that rate in reality. Sometimes bandwidth isn't enough. Sometimes they're waiting for the pixel shader. Sometimes both are waiting for the setup engine. You have to divide up your frame time among the various bottlenecks that each object brings about.

Now that I agree with you on.

In any case, the reason framerates are low has nothing to do with the hardware.

Well, developers' decisions are often based on hardware. You can say framerates have nothing to do with hardware, but then one can make the argument that framerates have a lot to do with hardware.

It's entirely due to developers aiming for too many objects and/or too long shaders. You could halve Xenos/RSX and still get 60fps at 1080p with 4xAA if you simplify the scene enough.

True.


There's always a three-way tradeoff between scene complexity/quality, resolution, and framerate. Once the renderer and art is as good as it gets, you can't increase all three at the same time. The exact relationship between these three variables isn't fixed across all games, but the devs still subjectively decide what the optimal point is.

Yeah, absolutely.

With framerate, suppose you have a distribution like this:
89% of the time the framerate is >60fps
10% of the time the framerate is 30fps-60fps
1% of the time the framerate is 15fps-30fps
0.1% of the time the framerate is <15fps.

What should the dev do? Halve graphical load to avoid the 1% of the time that it goes below 30fps? Quarter graphical load to ensure >99.9% of the time it's over 30fps? I say leave it as is, and maybe even increase the graphical load a bit because I wouldn't mind 2% of the time going below 30fps.

Framerate is all about decisions. It has little to do with hardware unless you're talking about a fixed rendering load. Unfortunately, rendering load is incredibly hard to judge without having full access to the assets, game engine, profiler, and hardware.

Well, because framerates on some arcade platforms of the 1990s were, to a high degree, fixed, I don't believe it's impossible to keep framerates steady and high for many games. Namco's System 22/23 families and Sega's Model 2 and Model 3 families ran more than 95% of their games at 60fps. One could argue that today's games are far more complex, and they are. But then, back in the mid-to-late 1990s, the arcade games of the day were graphically the most complex games of their time, more complex than anything before them.

I do agree it's more about developers' choices than hardware. Yet it's also about console providers' choices regarding hardware. They really set the bar low for this generation as far as GPUs go. While you could say this generation's console GPUs are the most powerful console GPUs ever, and that would be true, the upgrade from last generation is smaller than the upgrade from two generations ago to last gen, which was a huge one.

Fillrate has not kept up with advancements in resolution. Or at best, fillrate has kept even, but that doesn't allow for much of an improvement in graphics. This generation is more like the PS1/N64 generation in terms of framerates - it's very close, actually. So I hope, now that resolution has stabilized and isn't going to go up next gen (I hope I'm right), that whatever improvement in GPUs we get is enough for a large leap in graphical complexity, image quality, and framerate.

Even with the ever-changing sea of game code, graphics engines, and graphical load from frame to frame, I hope that game development and tools advance enough to allow for steadier and higher framerates. Some developers achieved that with some games last generation, and even some (though fewer) this generation. I just hope it gets better again.
 
Well, because framerates on some arcade platforms of the 1990s were, to a high degree, fixed, I don't believe it's impossible to keep framerates steady and high for many games. Namco's System 22/23 families and Sega's Model 2 and Model 3 families ran more than 95% of their games at 60fps. One could argue that today's games are far more complex, and they are. But then, back in the mid-to-late 1990s, the arcade games of the day were graphically the most complex games of their time, more complex than anything before them.

I think it's just the sheer size of the code base, third-party libraries, etc., and the number of people involved in the development of modern games that make up the complexity compared to previous generations, making it hard or impossible to predict and mitigate all the worst-case scenarios that make the frame rate drop.

The effort to do that may just be too costly compared to the "improvements" to the game (maybe you need to lower the fidelity significantly, and then the improvement might be questionable), not to mention the loss of flexibility during development.

I think ERP's comment says plenty.
ERP said:
If framerate is not a top priority (and frankly these days there is so much other stuff to worry about), it'll be inconsistent. Gone are the days when I could personally police every piece of artwork and content in a game.

Besides, today we have motion blur and other techniques that cover up some of the poor frame rates, which has also made it less of an issue - but you know all that stuff, I know. :)
 
I haven't read the entire topic, so I'll just post this:

As a PS2 owner, I am very disappointed with the number of 30fps titles on next-gen consoles so far. Some of them are still enjoyable, but as I've said numerous times, there should be no substitute for framerate! :/
 
As a PS2 owner, I am very disappointed with the number of 30fps titles on next-gen consoles so far. Some of them are still enjoyable, but as I've said numerous times, there should be no substitute for framerate! :/

Visuals over needless frames any day imo.
 
Joker454, I've been holding off on asking for a while how development using the Edge tools is going... have you been able to meet your target on PS3? Does that include any level of MSAA?
 