*spin* another 60 vs 30 framerate argument

Isn't Wii Sports also 60 fps?

Could be; when I played it, the controls were so laggy I doubt it mattered.

Point being, how many COD games do we need? Should every game just try to be COD because it's the most popular franchise? 60 fps has maybe helped COD be popular, but that doesn't mean other titles would duplicate that success by targeting 60 fps. Would Gears of War have sold more with lower visuals at a higher framerate? Halo? GTA? Doubtful.
 
Wii Sports sold more than the best-selling COD game on any platform; perhaps every game should be trying to be Wii Sports.

Wouldn't that be awesome.

I think Wii Sports is 60 fps, as are a couple of other first-party titles like Mario Galaxy. The argument has boiled down to "devs have to go 30 fps to make the game look pretty".
 
I think Wii Sports is 60 fps, as are a couple of other first-party titles like Mario Galaxy. The argument has boiled down to "devs have to go 30 fps to make the game look pretty".
That is not the argument. Devs have various choices in terms of art style and user experience, and they pick whatever parameters fit their goal. If they want 'realism' in terms of detail, variety and post effects, they'll probably go with 30 fps like Gears and Uncharted. You cannot have games that look the way they do at 60 fps, because the hardware cannot pull it off. If devs want a high framerate, they have to be willing to sacrifice elements elsewhere, like resolution and post effects. Pretty can be achieved at higher framerates with a suitable art style, at no artistic sacrifice. Some will argue that if 30 fps had never been an option, no one would miss the improved visuals it affords, but that's never been a realistic view, as developers will push visuals. They always have on every platform, pushing framerates down as the hardware gets older.

On the flip-side, the argument persists that "gamers need 60 fps for controller accuracy", despite plenty of examples to the contrary of gamers having fun with 100+ ms of controller lag, where an extra 17 ms shaved off will net them zero perceptible difference. Is anyone, anyone at all, who was saying that 60 fps is an essential gameplay feature, willing to accept that Joe Gamer, with 150 ms of lag from pressing a button to seeing the result on screen in his solo gaming, isn't going to even notice the difference 17 ms nets him?

At the end of the day, despite a lot of denial on both sides, it's a subjective choice. Neither 30 fps nor 60 fps can be objectively proven as the best option that should always be implemented in all games. I can quote one personal example of forcing a game to 720p on PS3 to get 60 fps (Age of Booty), expecting my friend to appreciate the difference, and he didn't care one jot. So he, for one, wouldn't benefit from Naughty Dog being forced to make Uncharted at 60 fps and losing all the amazing art and style that makes Uncharted as rewarding for him as it is.
 
Correct-ish motion blur will only hinder the gameplay. Imagine something stopping while you think it is still moving; you will then have several frames before you see the correct image and can react to it.

Several frames? Do you know how modern object-based motion blur works? They use velocity buffers fed from the animation system. It is quite accurate. There aren't several frames of error anywhere. Or am I wrong? I didn't quite understand what you meant....
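
For reference, here's a minimal sketch of how such a velocity buffer is typically fed, assuming a toy renderer (the Vec/Mat types and names are made up for illustration, not any particular engine's API): each object keeps its previous frame's transform, and the blur pass smears along the screen-space difference between the current and previous positions.

```cpp
// Toy sketch: computing the screen-space velocity that object-based motion
// blur writes into the velocity buffer. All types/names are illustrative.
struct Vec2 { float x, y; };
struct Vec4 { float x, y, z, w; };
struct Mat4 { float m[16]; };   // column-major 4x4 matrix

static Vec4 mul(const Mat4& M, const Vec4& v) {
    return {
        M.m[0]*v.x + M.m[4]*v.y + M.m[8]*v.z  + M.m[12]*v.w,
        M.m[1]*v.x + M.m[5]*v.y + M.m[9]*v.z  + M.m[13]*v.w,
        M.m[2]*v.x + M.m[6]*v.y + M.m[10]*v.z + M.m[14]*v.w,
        M.m[3]*v.x + M.m[7]*v.y + M.m[11]*v.z + M.m[15]*v.w,
    };
}

// Velocity of one vertex: where it is this frame minus where the same vertex
// was last frame, with both transforms fed by the animation system. This is
// what gets rasterized into the velocity buffer and sampled by the blur pass.
static Vec2 screenVelocity(const Vec4& modelPos,
                           const Mat4& currMvp, const Mat4& prevMvp) {
    Vec4 curr = mul(currMvp, modelPos);
    Vec4 prev = mul(prevMvp, modelPos);
    // Perspective divide to normalized device coordinates.
    return { curr.x / curr.w - prev.x / prev.w,
             curr.y / curr.w - prev.y / prev.w };
}
```

Because the velocity is derived from the transform the object actually had last frame, it is exact for motion that really happened; as the posts below note, the problem case is when motion changes, since extrapolating that vector forward is then wrong for a frame.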
 
30 fps does double the cycles available to render a single frame. If the engine is simply brute-force rendering everything again for each frame, this basically "doubles the GPU performance".

However, there are many techniques available that reuse old data and amortize the rendering cost across several frames. The smaller the variation between frames, the smaller the cost to render a single frame. In a 60 fps game, the camera and animation move at half the speed (in terms of the difference from one frame to the next). For example, CSM scrolling (https://d3cw3dd2w32x2b.cloudfront.net/wp-content/uploads/2012/08/CSM-Scrolling.pdf) provides a pretty much constant performance hit for sunlight shadow map rendering, independent of the frame rate. In a 60 fps game, the camera moves at half the speed (per frame), so you have to update roughly half the shadow map data per frame compared to a 30 fps game. A similar efficiency gain for complex material combining (including decal rendering) can be achieved with virtual texturing. The visible scene rendering samples a single texel from the virtual texture cache (this is very fast). All the complex material blending is rendered to the virtual texture cache, and reused for as many frames as the surface is visible. In a 60 fps game, each generated surface pixel (in the virtual texture cache) is visible for twice as many frames compared to a 30 fps game. Again, the cost is independent of the frame rate.
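
As a toy illustration of the scrolling idea (my own sketch, not the paper's code; the numbers and names are invented), the cascade keeps its cached texels between frames and only re-renders the strip newly exposed by camera movement, so the per-frame cost tracks the per-frame camera delta rather than the frame rate:

```cpp
// Toy sketch of amortized (scrolled) cascaded shadow map updates.
#include <cmath>
#include <cstdio>

struct CascadeCache {
    int   sizeTexels;      // e.g. a 1024x1024 cascade
    float worldPerTexel;   // world-space footprint of one shadow texel
};

// Number of texel columns that must be re-rendered this frame when the
// camera moved cameraDeltaWorld units along one axis. Movement is snapped
// to whole texels so the cached texels stay valid.
int texelColumnsToUpdate(const CascadeCache& c, float cameraDeltaWorld) {
    return (int)std::ceil(std::fabs(cameraDeltaWorld) / c.worldPerTexel);
}

int main() {
    CascadeCache cascade { 1024, 0.25f };  // hypothetical numbers
    float cameraSpeed = 8.0f;              // world units per second

    // The per-frame camera delta halves when the frame rate doubles, so each
    // 60 fps frame re-renders roughly half as many new texels as a 30 fps
    // frame would: the total cost per second stays about the same.
    int at30 = texelColumnsToUpdate(cascade, cameraSpeed / 30.0f);
    int at60 = texelColumnsToUpdate(cascade, cameraSpeed / 60.0f);
    std::printf("columns/frame: 30 fps = %d, 60 fps = %d\n", at30, at60);
}
```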

Compute shaders allow us to run more complex algorithms on the GPU. As soon as current generation consoles are no longer the main development platform, we will start seeing some clever rendering pipelines that do not have to use that much brute force to generate a similar end result. 60 fps will eventually be a better choice. The performance impact will be minimized, while the positive impact on game play quality will remain.
Nobody targets 30fps. 30fps is a side effect of poor optimization.
This is not true. Many console games target 30 fps from the beginning of the project. The target frame rate is decided at an early stage of a console game project (locked 30 fps, locked 60 fps, or variable). The selected frame rate affects many technical decisions, the project schedule and tasks (it takes time to create new effects, and a higher frame rate allows fewer special effects). The frame rate is monitored throughout the project. No matter what frame rate was chosen, a great deal of programmer time is spent optimizing the bottlenecks. If you are running at 60 fps, you have to micro-optimize every single thing you are running on the GPU; every GPU instruction counts. If you are running at 30 fps, you have more effects to optimize, more draw calls, more geometry. Optimization is as important as it is for the 60 fps game, but the focus is more on the fluctuating parts of the rendering process (for example object and shadow map rendering), as the relative cost of the constant-cost passes (for example post effects) is smaller.
 
On the flip-side, the argument persists that "gamers need 60 fps for controller accuracy", despite plenty of examples to the contrary of gamers having fun with 100+ ms of controller lag, where an extra 17 ms shaved off will net them zero perceptible difference. Is anyone, anyone at all, who was saying that 60 fps is an essential gameplay feature, willing to accept that Joe Gamer, with 150 ms of lag from pressing a button to seeing the result on screen in his solo gaming, isn't going to even notice the difference 17 ms nets him?

It is not just lower latency; it is also higher temporal resolution, which makes you see things in motion more clearly. And that matters a lot in fast games.
 
Several frames? Do you know how modern object-based motion blur works? They use velocity buffers fed from the animation system. It is quite accurate. There aren't several frames of error anywhere. Or am I wrong? I didn't quite understand what you meant....

Quite accurate, yes. But not accurate when you really need it (when motion changes). For that you would need a future frame.
 
Quite accurate, yes. But not accurate when you really need it (when motion changes). For that you would need a future frame.

Yes, in that case it is half inaccurate (the backward blur is OK, just the forward one is wrong) for one frame, not several. So that again boils down to whether one frame of error is that big of a problem or not. Sure, for pro players it is. For regular ones, the only way to accurately know is by testing it out, as I said. These "I don't like it" arguments are too personal and anecdotal. I want to see these people go through blind tests, where the framerate changes progressively DURING gameplay without warning, to see how long it actually takes them to notice.
 
Nobody targets 30fps. 30fps is a side effect of poor optimization.

You forget that a game is not just graphics. There are lots of other things to do that can benefit from twice the CPU time per frame.
 
Yes, in that case it is half inaccurate (the backward blur is OK, just the forward one is wrong) for one frame, not several. So that again boils down to whether one frame of error is that big of a problem or not. Sure, for pro players it is. For regular ones, the only way to accurately know is by testing it out, as I said. These "I don't like it" arguments are too personal and anecdotal. I want to see these people go through blind tests, where the framerate changes progressively DURING gameplay without warning, to see how long it actually takes them to notice.

It might feel like more than one frame, since the motion blurring fools your brain into thinking it is actually moving.

I do not buy into this categorization of pro vs regular players. How do you define that?


I also want more tests done and published.
 
Nobody targets 30fps. 30fps is a side effect of poor optimization.

I'm sorry but this is one of the most ignorant and offensive statements I've read here in a while.
Game development is one gigantic process of compromise and making tradeoffs. No one gets exactly what they want, but everyone does their damnedest to do the best they possibly can with a very limited set of resources and time.

Designers want immense worlds, complex interaction and vast scope. Artists want incredible detail and the highest fidelity. Programmers carry a lot of the weight, so they want high stability and to keep things simple and maintainable.
The problem is, those goals all fight one another, and to assume that a massive choice such as doubling the frame rate simply comes down to 'poor optimisation' is incredibly naïve.

All games are a set of very tightly controlled limitations, where everyone involved goes completely mad trying to push each of those limitations to the absolute limit. The reality is, no one gets everything they want; everyone has to sacrifice for the greater good of the game and to let all disciplines shine.

So what it comes down to is what priority you put on things. The COD developers put a big priority on frame rate. This isn't something they just decided to do; it will have had utterly massive implications for the game, and will have affected every single asset, budget, schedule, system, team member, design and feature within the project.
 
You guys ignore the fact that eventually people will experience the game and it will leave an impression. And that impression will define how much they desire the next game, and will shape their expectations of other games in the genre.

In the case of COD, the average Joes may have no freaking idea what framerate is, or even explicitly understand framerate smoothness. But they sure experience the feel provided by a smoother framerate. They can't define what it is, but they know the game does something right.
That average Joe who loves COD will feel something is missing when he plays another FPS that runs at 30 fps. He may not be able to name it, but it will have an impact on his perception.
Don't tell me all those millions who play COD are hardcore, because the majority aren't, and they are faithful to a franchise that seriously lacks personality compared to Killzone, Halo and so many others. There is a reason for this, and it is probably the experience the smooth multiplayer mode provides.
 
And will those who play 30 fps shooters feel disappointed in the visuals of COD?

I fundamentally disagree that the loyalty to COD is strongly due to its 60 fps nature - those COD gamers do buy and play other 30 fps games. COD is the current fashion; a self-propelling experience where popularity breeds popularity. It rose to ascendancy by being the right game at the right time. When COD:MW first released, they could have gone 30 fps and I expect the current popularity wouldn't be reduced one jot. They chose a visual target that preferred smoothness, but Joe Gamer didn't buy MW because it was 60 fps; there are many reasons Joe Gamer bought it. Now if they were to change the recipe and drop 60 fps, I expect players would complain because it's no longer the same experience, but I don't think responsiveness alone is the reason for many.

But most importantly, COD is not the only game in the world! And as said before, last gen the top-selling titles were 30 fps despite there being plenty of 60 fps games.
 
COD is already not that impressive visually, but those who play 30 fps shooters and are put off by that aren't numerous enough to affect its sales, because those who enjoy and play COD are far more numerous. So the former are irrelevant. I am irrelevant too, because I am actually one of those who find COD overrated and mediocre looking.

Note that I said nothing about the consciously perceived responsiveness of the average Joe, or anything that implies the average Joe is consciously aware of 60 fps.
If the next game is 30 fps, it will sell by the bucketload due to the previous games' success and positive expectations.

But from there, even if it still sells a lot, expect sales to gradually decline as the average Joe starts feeling that something is not the same.

Sure, COD is "fashionable" and people buy it because of that, but "fashion" is not immune.
 
And will those who play 30 fps shooters feel disappointed in the visuals of COD?

I fundamentally disagree that the loyalty to COD is strongly due to its 60 fps nature - those COD gamers do buy and play other 30 fps games. COD is the current fashion; a self-propelling experience where popularity breeds popularity. It rose to ascendancy by being the right game at the right time. When COD:MW first released, they could have gone 30 fps and I expect the current popularity wouldn't be reduced one jot. They chose a visual target that preferred smoothness, but Joe Gamer didn't buy MW because it was 60 fps; there are many reasons Joe Gamer bought it. Now if they were to change the recipe and drop 60 fps, I expect players would complain because it's no longer the same experience, but I don't think responsiveness alone is the reason for many.

But most importantly, COD is not the only game in the world! And as said before, last gen the top-selling titles were 30 fps despite there being plenty of 60 fps games.


You are making a lot of assumptions here. I strongly believe that CoD would not be as popular if it were not 60 fps, because it would not play the same. The game mechanics would differ quite a lot (you would basically have a much slower game).

And 60 fps is not just a visual target (like AA vs. no AA) but a game-mechanics enabler.
 