How do console games achieve a steady framerate?

Sorry if this is obvious, but I was wondering: how is it possible for console games to achieve the same frame rates (seemingly) irrespective of the amount of on-screen complexity?

If you look at the framerate in PC games you'll see it shoot all over the place as you move to different parts of the level or even just from view to view. Console games, even ports, don't seem to have this problem and the framerate is constant whether there is one enemy on the screen or five. How is this variance smoothed out, and why is it not easy for the player to 'break' the performance?
 
I think you'll find that with a lot of PC games, you can fine-tune the effects to your hardware so that the framerate rarely drops. With most prominent console games, that tuning has already been done. If the console can't handle more than 5 guys onscreen, there won't be more than 5 guys in the room. If you don't have enough bandwidth or texel fillrate for a lot of multitexturing, there won't be a lot of multitexturing.

But there are lots of console games that have unstable framerates, even big AAA titles. Grand Theft Auto: Vice City comes immediately to mind. Quite a few of the Xbox titles noted for their fancy shader work (Chronicles of Riddick, Thief III, Deus Ex: IW, Doom 3) had unstable framerates.
 
some console games use a bit of motion blur to smooth out the action and give the appearance of a smooth framerate.
 
Subtlesnake said:
Sorry if this is obvious, but I was wondering: how is it possible for console games to achieve the same frame rates (seemingly) irrespective of the amount of on-screen complexity?

If you look at the framerate in PC games you'll see it shoot all over the place as you move to different parts of the level or even just from view to view. Console games, even ports, don't seem to have this problem and the framerate is constant whether there is one enemy on the screen or five. How is this variance smoothed out, and why is it not easy for the player to 'break' the performance?


You pick a framerate and lock to it.
Consoles have fixed hardware, so either it's fast enough or it isn't; you don't have to worry about uber high-end rigs or a minimum spec.

Generally you pick a framerate, say 30fps, and make it work inside the time budget (~33 ms); that means tweaking code and content until it works. Obviously in 3D environments it's hard to cover all the bases without being overly conservative, which is why games tend to drop frames in places.
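In rough terms, the loop looks something like this (a minimal C++ sketch; updateGame() and renderFrame() are hypothetical stand-ins, and a real console title would sync to the display's vblank rather than sleep, but the budget idea is the same):

```cpp
#include <chrono>
#include <thread>

void updateGame()  { /* hypothetical: run one simulation step */ }
void renderFrame() { /* hypothetical: submit draw calls, present */ }

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto kFrameBudget = std::chrono::microseconds(33333); // ~30fps

    for (;;) {
        const auto frameStart = clock::now();

        updateGame();
        renderFrame();

        // Under budget: wait out the remainder so the presented framerate
        // stays constant. Over budget: we just dropped a frame, which is
        // exactly what all the code/content tweaking is meant to prevent.
        const auto elapsed = clock::now() - frameStart;
        if (elapsed < kFrameBudget)
            std::this_thread::sleep_for(kFrameBudget - elapsed);
    }
}
```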
 
see colon said:
some console games use a bit of motion blur to smooth out the action and give the appearance of a smooth framerate.
Uh, which "some" games are you thinking of, exactly? I can't name a single one.

Motion blur as defined by realtime 3D hardware isn't an effective means of hiding a poor framerate. For starters, it isn't really motion blur, and it requires considerable fillrate to do a full-screen alpha blend on top of everything else. It just looks really assy, and it can't hide anything, least of all a poor framerate. Any large changes between frames, such as fast pans, will look like hell, I might add.
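For reference, the technique in question amounts to blending the previous frame over the current one, something like this (an illustrative CPU-side sketch; a real implementation would draw a full-screen alpha-blended quad on the GPU, which is where the fillrate cost comes in):

```cpp
struct Color { float r, g, b; };

// Blend factor: how much of the previous frame bleeds into this one.
constexpr float kPersistence = 0.4f;

void blendPreviousFrame(Color* current, const Color* previous, int pixelCount) {
    for (int i = 0; i < pixelCount; ++i) {
        // A simple lerp toward the old frame. This is ghosting/persistence,
        // not true motion blur: no per-object motion is ever sampled.
        current[i].r = current[i].r * (1.0f - kPersistence) + previous[i].r * kPersistence;
        current[i].g = current[i].g * (1.0f - kPersistence) + previous[i].g * kPersistence;
        current[i].b = current[i].b * (1.0f - kPersistence) + previous[i].b * kPersistence;
    }
}
```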
 
i liked how, from the early-to-mid 90s onward, most 3D arcade games from Namco and Sega were 60fps, with very few exceptions (e.g. 18 Wheeler: American Pro Trucker).

framerate was king. I had hoped that would've been a priority this generation, but it seems not. developers and gamers want the best-looking graphics first and foremost, not always the smoothest framerates.
 
Devs are still striving for a rock-solid 30 or 60 FPS, but it is not as easy as it once was, when there was far less geometry, texturing, and other asset work to juggle alongside everything else. It really all comes down to extremely careful management of every facet that goes into the game. If there is too much geometry in a scene at once, that can definitely slow down the framerate, and there are all sorts of similar examples. What it comes down to is how much time a developer has to really polish the game. Some attack the problem from the very beginning so it never becomes a huge problem during development. Other devs will wait until near the end to really nail down a good framerate, but by that time they may have limited time to solve the problem, and the publisher will put the pressure on to ship it.

We can delve deeper if you wish...

The good thing is that a console is a closed system and as devs become more intimate with the hardware it should be easier to control and sustain an acceptable framerate.
 
Guden Oden said:
Uh, which "some" games are you thinking of, exactly? I can't name a single one.

Motion blur as defined by realtime 3D hardware isn't an effective means of hiding a poor framerate. For starters, it isn't really motion blur, and it requires considerable fillrate to do a full-screen alpha blend on top of everything else. It just looks really assy, and it can't hide anything, least of all a poor framerate. Any large changes between frames, such as fast pans, will look like hell, I might add.
shadow of the colossus is one...
http://www.dyingduck.com/sotc/making_of_sotc.html
Well, SOTC is a game whose frame rate can increase and decrease wildly. Actually, we have incorporated the variable frame rate into the design - it increases and decreases with load balancing. Although there are cases when it reaches 60fps, there are also times when it falls to the 15fps range, but the motion blur helps to smooth over this, and the player's sensation of frame rate changes is kept to a minimum.

there are quite a few others i suspect are doing the same, even some FPS like unreal championship.
 
Subtlesnake said:
How is this variance smoothed out, and why is it not easy for the player to 'break' the performance?
Well the way it usually works:

- Given a fixed target/closed platform, you optimize the application for the worst-case load (which also defines your minimum framerate).
- Given a variable target (like PC), you optimize for the average case (rather than worrying about minimum framerate, you focus on keeping fps smooth in the average situation).
(The interesting side effect of the above, which isn't immediately obvious, is that the optimization strategies applied are sometimes incompatible between the two approaches.)

So to answer your last question - performance isn't easy to 'break' because the fps limit is set to what should (in theory) be the peak utilization/minimum fps for the target game.
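To make the fixed-target approach concrete, here's a toy sketch of the kind of bookkeeping involved (all names are illustrative, not from any shipping engine): on a closed platform you tune until the worst frame fits the budget, whereas on PC you would mostly watch the average.

```cpp
#include <algorithm>
#include <cstdio>

struct FrameStats {
    double worstMs = 0.0;
    double totalMs = 0.0;
    int frames = 0;

    void record(double frameMs) {
        worstMs = std::max(worstMs, frameMs);
        totalMs += frameMs;
        ++frames;
    }

    void report(double budgetMs) const {
        std::printf("avg %.2f ms, worst %.2f ms (budget %.2f ms)\n",
                    totalMs / frames, worstMs, budgetMs);
        // Fixed target: optimize code/content until worstMs <= budgetMs.
        // Variable target (PC): mostly care that the average stays smooth.
    }
};
```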
 
_xxx_ said:
No games adjusting the detail level dynamically? I thought there were some out there :???:


According to the framerate or to the view distance? I don't recall any off-hand that do the former.
 
You can't really dynamically adjust detail per frame without a hideous mess. One frame you have high-poly models, and the next, having turned a corner, the models suddenly drop to low quality. If you observe detail pop-in on terrain based on draw distance, and imagine that happening to lots of different models every frame, it'd be pretty crazy. You could potentially drop quality to boost performance in known situations, but done dynamically I think it'd be a disaster. I also recall a game that actually did this - one of the Myst series, perhaps? Someone on this forum said it exhibited visible weirdness - maybe they'll catch this thread and comment?
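That said, anything attempting this would presumably damp the switching with smoothing and a hysteresis band so detail doesn't flicker back and forth every frame. A hypothetical sketch (thresholds and names are mine, not from any real engine):

```cpp
// Pick a level of detail from a smoothed (averaged) frame time.
// The gap between the two thresholds is the hysteresis band: inside it,
// the level stays put, so models don't ping-pong between LODs.
int chooseDetailLevel(double smoothedFrameMs, int currentLevel, int maxLevel) {
    constexpr double kDropThresholdMs  = 36.0; // clearly over a ~33 ms budget
    constexpr double kRaiseThresholdMs = 28.0; // comfortably under budget

    if (smoothedFrameMs > kDropThresholdMs && currentLevel > 0)
        return currentLevel - 1;               // shed detail
    if (smoothedFrameMs < kRaiseThresholdMs && currentLevel < maxLevel)
        return currentLevel + 1;               // restore detail
    return currentLevel;                       // stay put inside the band
}
```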
 
ihamoitc2005 said:
The RalliSport Challenge developer says they have this feature to keep 60fps.

I believe a few games have it; however, they're not very aggressive with it. Oh, and video card drivers used to drop antialiasing when the framerate got too low.
 
see colon said:
shadow of the colossus is one...
Well, if it is, then it's pretty much the only one, as it's not a common technique. It's also not effective, as it's technically NOT blur at all, not by any standard whatsoever. Blending the previous frame buffer into the current one does not blur anything. It also does not help to hide framerate hitches and may in fact help to create more of them.

there are quite a few others i suspect are doing the same, even some FPS like unreal championship.
This I seriously doubt.
 
Guden Oden said:
This I seriously doubt.

Correct, UC did not have motion blur at all. It did have AA, which destroyed the framerate at all times, though. And even with the subsequent patch that turned it off, the framerate was very inconsistent. UC2 used blurring for special effects such as depth of field (in general gameplay, or selective blurring for sniping) or motion blur (aerial dashes) rather than for "hiding" performance issues. See the UE2X page for examples.
 
It sometimes looks like Dynasty Warriors 5 blurs the screen, and the framerate in this game does drop quite often, especially in the larger battles.
 