*spin* another 60 vs 30 framerate argument

So what we have here is that the top-selling games run at 60 fps while the other games try to compete with prettier graphics? How is that working out for them?

It seems to be a long-running trend with Sony/PlayStation. Shadow of the Colossus and Ico are two of my favorite PlayStation games, but they have horrible frame rates, seemingly because of an overemphasis on pushing extra graphics. Are the devs pushing the PlayStation too hard, or is it Sony pushing them to do this?

Good question. It's one thing to talk about solid 30 or solid 60 fps, but typically this gen's 30 fps games are more often mid-20s-to-28 fps games. I just read the DF Crysis 3 comparison, and in my opinion the devs did push too far: they let the fps drop to single digits and happily sacrificed gameplay. GG with Killzone MP as well, to some extent... and imo this is a much more troublesome trend on consoles these days than the difference between 30 Hz and 60 Hz. Hope next gen improves on solid fps... DF's new KZ analysis showing a rock-solid 30 fps is a real good sign imo.
 
So what we have here is that the top-selling games run at 60 fps while the other games try to compete with prettier graphics? How is that working out for them?

It seems to be a long-running trend with Sony/PlayStation. Shadow of the Colossus and Ico are two of my favorite PlayStation games, but they have horrible frame rates, seemingly because of an overemphasis on pushing extra graphics. Are the devs pushing the PlayStation too hard, or is it Sony pushing them to do this?

No, some top-selling games are 60 fps, some are not.
 
^ Which ones are not? I don't think they are "targeting" 30fps. I think they run at 30fps because making a pretty game is the only talking point of the game. So you have a self-fulfilling prophecy: put in as much graphics as you can while keeping the minimum framerate.

Perfect example: Uncharted. It's a pretty game; what do you do in it? Watch videos and shoot bots. On the PS4 it's going to be the same game running at 30fps. Nobody targets 30fps. 30fps is a side effect of poor optimization.
 
Nobody targets 30fps. 30fps is a side effect of poor optimization.
:???: You have a balancing act between many variables. Targeting 30 fps provides double the per-frame rendering budget compared to 60 fps. It is very much a target, as is the resolution. It's not a matter of poor optimisation, as that would mean every game that currently runs at 30 fps could be made to run faster, without changing any other parameters, just by coding it more effectively.
 
^ Which ones are not? I don't think they are "targeting" 30fps. I think they run at 30fps because making a pretty game is the only talking point of the game. So you have a self-fulfilling prophecy: put in as much graphics as you can while keeping the minimum framerate.

Perfect example: Uncharted. It's a pretty game; what do you do in it? Watch videos and shoot bots. On the PS4 it's going to be the same game running at 30fps. Nobody targets 30fps. 30fps is a side effect of poor optimization.

Gears of War, Halo (all of them afaik), GTA 4, Mass Effect 1-3, Battlefield 3. I'd say most games aside from CoD and racers.
 
:???: You have a balancing act between many variables. Targeting 30 fps provides double the per-frame rendering budget compared to 60 fps. It is very much a target, as is the resolution. It's not a matter of poor optimisation, as that would mean every game that currently runs at 30 fps could be made to run faster, without changing any other parameters, just by coding it more effectively.

They seem to balance them to 30fps first and foremost. So they halve the framerate and double the graphics, halve the objects on screen, no destructible environments, small stages. And you call this balancing?
 
They seem to balance them to 30fps first and foremost. So they halve the framerate and double the graphics, halve the objects on screen, no destructible environments, small stages. And you call this balancing?

COD does the same: no destruction, even smaller stages, fewer players... cut-down graphics, but 60fps... so, what exactly is your point?
 
This comment is from Drivel:

Let's get this cleared up: it's not about the FPS. It's about this:

60fps -> 16.6ms per frame
30fps -> 33.3ms per frame

In other words, you get twice the amount of time to do everything that needs to be done in order to display the next frame. That helps, but it's also a lie on many levels.

Expanding the time by a factor of 2x is roughly similar to making your code run 2x faster (it's not, but bear with me). Programmers usually don't consider a speedup of 2x to be worth investigating. Why not? Because twice as fast does not mean twice as much stuff can be done. For instance:

- Physics: twice as fast in an n-body simulation means you can do about 41% more bodies (not 100% more), since n-body sims are O(n^2).
- 3D volumes of stuff: Games are not made in 1D, they're made in 3D. To sufficiently fill a world you need to extend "stuff" along 3 axes. So twice as fast only lets you extend the world by about 26% along each axis (extending stuff in 3D is O(n^3)).

There are many examples like this where twice as fast just does not give you enough "oomph" to make it worth your while.
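To put numbers on that, here's a quick back-of-the-envelope sketch (my own illustration of the math above, not something from the quoted post):

```python
# Back-of-the-envelope: how much more "stuff" does a 2x time budget buy
# when cost grows superlinearly with the amount of stuff?

budget_factor = 2.0  # 33.3 ms per frame instead of 16.6 ms

# n-body physics: cost ~ O(n^2), so n scales with the square root of the budget
more_bodies = budget_factor ** (1 / 2) - 1
print(f"O(n^2) physics: {more_bodies:.0%} more bodies")        # ~41% more

# Filling a 3D world: cost ~ O(n^3) in linear extent, so each axis
# scales with the cube root of the budget
more_extent = budget_factor ** (1 / 3) - 1
print(f"O(n^3) world: {more_extent:.0%} more extent per axis")  # ~26% more
```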

But "30FPS -> twice as much time" is also a lie because there are things that you cannot run at these framerates at all. For instance, physics. Good physics does not run at 30FPS. You usually evaluate 90 to 180 physics steps per second (at a fixed timestep). Changing the display framerate won't change anything about that. So physics is out; no benefit there.
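For reference, that decoupling is usually done with an accumulator-style fixed-timestep loop; here's a minimal sketch (the function names are hypothetical placeholders, not from the post):

```python
import time

PHYSICS_HZ = 120       # fixed simulation rate, e.g. 90-180 steps/s
DT = 1.0 / PHYSICS_HZ  # fixed timestep

def game_loop(update_physics, render):
    """Run physics at a fixed rate regardless of the display framerate."""
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Step the simulation in fixed increments; whether the renderer
        # runs at 30 or 60 fps, the same number of steps happen per second.
        while accumulator >= DT:
            update_physics(DT)
            accumulator -= DT

        # Render once per loop iteration, interpolating by the leftover
        # fraction so motion looks smooth between physics states.
        render(alpha=accumulator / DT)
```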

So what about speedups in raw rendering? Surely that is worth it? Well, there are fixed-cost things (like SSAO, shadowing, etc.), and sure enough, doing them at half the framerate frees up time for the rest. However, 30FPS also throws up a serious issue that adds cost back in. At 60FPS you can get a decent display (albeit one that is too sharp) without artificial motion blurring. You can add motion blurring, but that's strictly speaking a "nice to have". At 30FPS, however, you *cannot* go without motion blurring. Why? Well, movies, you see, have motion blur "built in". It's free. You know, shutter speed and all. But games don't have that. So if you present a 30FPS picture without motion blurring, things will just look "stuttery". So you've got to have pretty good motion blurring, which is hellishly expensive to do.
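To put a number on "stuttery" (again, my own illustration rather than the quoted post's): the per-frame jump of anything moving across the screen doubles when you drop from 60 to 30 fps.

```python
# Why unblurred 30 fps looks "stuttery": each frame, a moving object
# jumps by velocity * frame_time, and that jump doubles at 30 fps.

speed_px_per_s = 600  # hypothetical object panning across the screen

for fps in (60, 30):
    jump = speed_px_per_s / fps
    print(f"{fps} fps: object jumps {jump:.0f} px between frames")
# 60 fps: 10 px steps; 30 fps: 20 px steps, big enough to read as
# discrete hops unless motion blur smears the gap.
```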
 
So basically the real answer to the thread is that devs are spending the extra frame time on post-processing, which is expensive, even for the PS4. And even though the PC has the 60fps thing all sorted, devs on the consoles are chasing PC quality while sacrificing frames. We might have to wait until 2020 before we get all games on console running at 60fps or above.
 
Until consoles have more power than developers can effectively make use of (because of manpower constraints), you'll see many games targeting 30fps. 60fps doesn't show up in commercials or screenshots (unless you think lower-quality assets demonstrate 60fps adequately).
 
Again, temporal anti-aliasing helps on that front. Not perfect, but pretty good.

PS. I mean actual temporal anti-aliasing, as in temporally correct motion blur, and not the badly named TXAA post-process SPATIAL anti-aliasing algo from Nvidia.

How can you do "correct" motion blur without adding shitloads of latency?
 
We're not talking about just COD though, but all games being 30/60 Hz. I started that sentence with, "That's part of the experience though for a lot of games." I agree that most people (pretty much all) who bought COD bought it to play online and do play online. A fair number did so not because COD is 60 fps but because their mates were playing it, so it's very hard to quantify how much of COD's success depends on 60 fps. For every other game, there's a mix of solo and online play. Uncharted was created for its solo performance. Targeting 30 fps there adds to the visuals, and the control latency doesn't hurt much because the player's just shooting bots. Creating a 60 fps multiplayer experience would add a lot of work and damage the style, so it makes sense to just carry over the 30fps engine to multiplayer. Chances are most of those Uncharted players online aren't overly sensitive to higher-latency controls.

COD isn't a good reference point. It's an outlier. We need to look at all games (Gears, Halo, Uncharted, Fifa, Final Fantasy, Borderlands, Sacred, Assassin's Creed, LBP, Battlefield, Tomb Raider, Dynasty Warriors, Bioshock, Sniper 2, Metal Gear, Asura's Wrath, etc.). A great many are solo-only games or sold on their solo play as much as their online play, where the control latency of 30 fps doesn't really impact the player. Fluidity will help, but Joe (and Sally, and Frederick, and José) Gamer can get by without it, while selling to them without the added eye candy that twice the rendering time permits will be pretty hard for a lot of developers.

I do not get the point here. Isn't the point of playing games to have as much fun as possible (or be immersed in some other way)? If you have more fun at 60 fps, then that is the way to go! If it is "hard for a lot of developers", well, that is not my problem. If they want to compete for my money they will have to work hard (or smart).
 
I do not get the point here. Isn't the point of playing games to have as much fun as possible (or be immersed in some other way)? If you have more fun at 60 fps, then that is the way to go! If it is "hard for a lot of developers", well, that is not my problem. If they want to compete for my money they will have to work hard (or smart).

Perhaps developers should make games so you can turn a bunch of the effects off so they run at 60fps on consoles; then you could have the most fun, and the rest of us could still enjoy our pretty pixels.
 
They seem to balance them to 30fps first and foremost. So they halve the framerate and double the graphics, halve the objects on screen, no destructible environments, small stages. And you call this balancing?
It's balancing according to the developer's feature-weighting system. It's certainly not a lack of optimisation, as you ludicrously suggested.

I do not get the point here. Isn't the point of playing games to have as much fun as possible (or be immersed in some other way)? If you have more fun at 60 fps, then that is the way to go! If it is "hard for a lot of developers", well, that is not my problem. If they want to compete for my money they will have to work hard (or smart).
Often enough it's not so much about 'fun' as about an 'experience'. A moody game can provide a moodier experience with more complex graphics than with simpler graphics at higher framerates. There's no one-size-fits-all. And clearly 30 fps isn't a bottleneck to people having fun.

I understand trying to explain the value of 30 fps to those who love 60 fps is like trying to convince those who hate Marmite that other people can actually like it, but there it is. For plenty of people, and certainly enough that developers and publishers set it as their target, 30 fps is plenty. In most cases there's not enough advantage in control latency or responsiveness for developers to bother with 60 fps; otherwise they would. I personally would like to see 60 fps games. When I played the Kingdoms of Amalur demo it was mostly 60 fps with dips, and I hoped it'd get improved, but the final game released at 30 fps, to my disappointment (the visuals were pretty simple and certainly weren't competitive with other games). But I also understand how other people can value more eye candy at a lower framerate, and I'm not going to conclude from one best-selling title that 60 fps is better or preferred. The best-selling franchise on Xbox was Halo at 30 fps, was it not? And the best-selling franchise overall, GTA, had pretty bad framerates, didn't it?
 
How can you do "correct" motion blur without adding shitloads of latency?

Well, I meant correct-ish. Like, object-based, with somewhat good quality (as much as real time allows), with the amount of blur calibrated by framerate rather than some random value an artist thought looked "cinematic".
People don't like motion blur because many times it's overdone, or it's just camera-based, or it's just shitty. But the good implementations out there (LBP1 had some of the best I've seen) really do help. I don't know how fast an object has to move for its movement to seem more fluid in motion-blurred 30fps vs. non-motion-blurred 60fps, but it certainly does happen at some threshold. I'm just curious how commonly those kinds of speeds show up in typical games.
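As a sketch of what "calibrated by framerate" could look like: derive blur length from frame time and a shutter fraction instead of a hand-tuned constant (the 180-degree shutter value is a film convention I'm assuming here, not something from the post):

```python
# Blur length derived from frame time and a shutter fraction, instead of
# a hand-tuned constant. A 180-degree shutter (exposure = half the frame
# interval) is the classic film look.

SHUTTER_FRACTION = 0.5  # 180-degree shutter

def blur_length(screen_velocity_px_per_s, fps):
    """Screen-space blur length for one object, scaled to the framerate."""
    exposure = SHUTTER_FRACTION / fps            # seconds the "shutter" is open
    return screen_velocity_px_per_s * exposure   # pixels of smear

print(blur_length(600, 30))  # 10.0 px of blur at 30 fps
print(blur_length(600, 60))  # 5.0 px at 60 fps: half the smear, same look
```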
What we need are some blind tests, with different types of gamers (casual, core, and professional), on different types of games, with different setups of framerate, motion blur, controller lag, and spatial resolution. Then we would have actual data on what is more worthwhile and where cycles are best spent in the end.
So far we are playing with guesswork here. Sure, 60fps is better; the question is how much better it really is. And our personal feelings aren't the most scientific of measurements, to say the least.
 
I think I have stated my stance on this matter in another similar thread, but I will say it again.
If it's about 720p60 vs 1080p30, it would depend on the game. But generally, racing, fighting, and FPS games would benefit from the increased framerate. Even then, I'm not really fond of 1080p and 60fps, so I would prefer 720p30 and dial the graphics up to eleven :)
Heck, I can even tolerate a JRPG @ 15fps if they give me amazing graphics.
The key for me is that a game requiring tight control must have a stable framerate. Even if the control is sampled at a consistent 16ms, if the feedback is jittery and all over the place, then it's useless.
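For what it's worth, the raw pixel throughput of those two options is closer than it sounds; quick arithmetic (mine, not the poster's):

```python
# Raw pixels-per-second for the trade-offs mentioned above.
modes = {
    "1080p30": (1920 * 1080, 30),
    "720p60":  (1280 * 720, 60),
    "720p30":  (1280 * 720, 30),
}
for name, (pixels, fps) in modes.items():
    print(f"{name}: {pixels * fps / 1e6:.1f} Mpx/s")
# 1080p30 (62.2 Mpx/s) vs 720p60 (55.3 Mpx/s): only ~12% apart, though
# per-frame costs like geometry and shadows don't scale purely with pixels.
```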
 
Well, I meant correct-ish. Like, object-based, with somewhat good quality (as much as real time allows), with the amount of blur calibrated by framerate rather than some random value an artist thought looked "cinematic".

Correct-ish will only hinder the gameplay. Imagine something stopping while you think it is still moving; then you will have several frames before you see the correct image and can react to it.
 