Does 30fps feel more "cinematic" than 60fps?

What did Insomniac do in that respect when they decided 60fps didn't win any interest?

They did an incredibly biased study of review scores and a very biased poll. I also think they excluded the COD games. They did not test anything. Their study also did not seem to take into account that various genres benefit more from higher fps.

If you want to test 30 vs 60 fps you do the following:

1: Make a 60 fps and a 30 fps version of your game. Strip down graphics, resolution etc. on the 60 fps version if needed, or just use the PC version of your game.
2: Test both versions on players. Do not tell them the difference, just tell them there are some differences between the two versions.
3: Ask the players to grade their enjoyment of the game. You can also ask them what they thought of the graphics, how easy the game was to control and other stuff. For extra points, hook them up to biometrics like Trumbull did. (A rough sketch of how the ratings from this step could be compared is below.)
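
As a minimal illustration of that comparison, here is a sketch of analysing the blinded ratings from step 3; the scores and the 1-10 scale are hypothetical, and a non-parametric test is used because ordinal ratings shouldn't be assumed to be normally distributed:

```python
from scipy.stats import mannwhitneyu

# Hypothetical blinded enjoyment ratings (1-10) from the two groups of playtesters.
scores_30fps = [6, 7, 5, 6, 8, 6, 7, 5, 6, 7]
scores_60fps = [7, 8, 7, 6, 9, 8, 7, 8, 6, 8]

# One-sided Mann-Whitney U test: did the 60 fps build get higher ratings?
stat, p = mannwhitneyu(scores_60fps, scores_30fps, alternative="greater")
print(f"U = {stat}, p = {p:.3f} (a small p would suggest the 60 fps build was rated higher)")
```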

Have there been any published studies using this method?
 
What gaming lacks is a dedicated research arm, just finding things out. It's left to random university investigations. Someone should be funding dedicated, targeted research into gaming fields.
 
So even if I wanted to dial down resolution or anything else to get a stable framerate, I can't. Would be cool, though (Toshinden on PS1 had a 30/60Hz toggle, which disabled environment geometry etc. to reach 60Hz).
There are some current generation console games where lowering the output resolution also lowers the rendering resolution and stabilizes the framerate, SotC HD being one fairly recent example.
 
There are some current generation console games where lowering the output resolution also lowers the rendering resolution and stabilizes the framerate, SotC HD being one fairly recent example.

I always wished current gen games would offer a 60fps mode if you have your console hooked up at SD resolution (set output at 480p). I understand why they don't (QA, testing, etc.), but it would make for an interesting stat to see how many people would purposely play at a downgraded resolution just to get 60fps.
 
There are some current generation console games where lowering the output resolution also lowers the rendering resolution and stabilizes the framerate, SotC HD being one fairly recent example.

Need For Speed Most Wanted (Xbox 360): 45/50fps mode (480p)
 
I always wished current gen games would offer a 60fps mode if you have your console hooked up at SD resolution (set output at 480p). I understand why they don't (QA, testing, etc.), but it would make for an interesting stat to see how many people would purposely play at a downgraded resolution just to get 60fps.
Definitely. Add faster framerate and higher IQ at lower resolution. Some clever dev should add it as an option to evaluate. Would be a good test case.
 
My definition of cinematic is 60fps, v-sync on, with God of War 3-quality motion blur. Hopefully that'll be standard with next-gen consoles.
 
I always wished current gen games would offer a 60fps mode if you have your console hooked up at SD resolution (set output at 480p). I understand why they don't (QA, testing, etc.), but it would make for an interesting stat to see how many people would purposely play at a downgraded resolution just to get 60fps.

It's not just QA.
You're assuming that the GPU is the only reason the game can't run at 60fps.
In my experience the CPU is way bigger of a problem. Between game update, visibility preparation and feeding the GPU, it is really hard to run 60fps unless you plan your entire engine around it.
And even then it's still very hard because the 60 fps target would be questioned and second guessed every step of the way.
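
To make the budget problem concrete, here is a back-of-envelope sketch with hypothetical per-phase CPU costs (none of these numbers come from the post above); the point is simply that work which fits comfortably inside a 33 ms frame can blow straight through a 16.7 ms one:

```python
# Frame budgets at each target frame rate, in milliseconds.
budgets_ms = {"30 fps": 1000.0 / 30.0, "60 fps": 1000.0 / 60.0}

# Hypothetical per-frame CPU costs for the phases mentioned above.
cpu_phases_ms = {
    "game update (AI, physics, animation)": 9.0,
    "visibility preparation / culling": 4.0,
    "building and submitting GPU commands": 6.0,
}

total = sum(cpu_phases_ms.values())
for target, budget in budgets_ms.items():
    verdict = "fits" if total <= budget else f"over budget by {total - budget:.1f} ms"
    print(f"{target}: budget {budget:.1f} ms, CPU work {total:.1f} ms -> {verdict}")
```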
 
It's not just QA.
You're assuming that the GPU is the only reason the game can't run at 60fps.
In my experience the CPU is way bigger of a problem. Between game update, visibility preparation and feeding the GPU, it is really hard to run 60fps unless you plan your entire engine around it.
And even then it's still very hard because the 60 fps target would be questioned and second guessed every step of the way.

Why would the 60 fps target be questioned and second guessed?
 
It's not just QA.
You're assuming that the GPU is the only reason the game can't run at 60fps.
In my experience the CPU is way bigger of a problem. Between game update, visibility preparation and feeding the GPU, it is really hard to run 60fps unless you plan your entire engine around it.
And even then it's still very hard because the 60 fps target would be questioned and second guessed every step of the way.

Yeah, that's true. Especially this late in the game, the VMX units must be getting heavily stressed.


Why would the 60 fps target be questioned and second guessed?

There is still a train of thought that 60fps isn't worth it, both because it's really hard to do and because people often don't notice it. Witness the PC/console game comparisons where people can't see any improvement with 60fps on the PC version. So it's theorized that you're better off spending the cycles on visuals rather than framerate, and it can be difficult to convince people otherwise. Of course, Infinity Ward has gone totally counter to this to huge success, as have other games like Trials HD, which makes one wonder if more attention should be directed to 60fps. Likewise, I remember the furor way back in the day when NHL Hockey came out at 60fps on the Sega Genesis and 30fps on the SNES; somehow back then gamers really noticed 60fps, and the Sega version was unanimously declared the better version for this one fact alone.
 
It's not just QA.
You're assuming that the GPU is the only reason the game can't run at 60fps.
In my experience the CPU is way bigger of a problem. Between game update, visibility preparation and feeding the GPU, it is really hard to run 60fps unless you plan your entire engine around it.
And even then it's still very hard because the 60 fps target would be questioned and second guessed every step of the way.
I concur. I don't know if it's the right example, but it reminds me of Rage, which performs better on the PS3, whereas the 360 version looks its best but with more occasional stutter. I highly doubt that on the 360, even with the same 'tech' or tricks used in the PS3 version, we would get the same results. I suspect the HDMI driver (or the 3D stereoscopic support, forgive my ignorance) on the PS3 could give it some advantage in holding a steadier 60 fps in some scenarios, but of course not without heavy compromises on IQ.
 
You can't ask this question on a tech forum. Yes, it has a more cinematic feel, because 24fps has defined that look for decades. The only question is whether you prefer that, strobing and all, or whether it ticks your OCD box so much that you'd rather have the smoother framerate.

Neither is the more valid stance, but I do harbor some resentment for 60fps maniacs, since I do enjoy strobing (and its vaguely hallucinatory effects in some instances), which seems hard to grasp for some of the hardest-edged videophiles. But nonetheless, it comes down to preference. And thankfully in the graphics world you have fine control over animation data, so you can have a game running at 60fps that still feels roughly like 30fps if you're so inclined.
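
As a minimal sketch of that last point (the helper and numbers are hypothetical, just illustrating the timing): the render loop ticks at 60Hz, but the animation clock is snapped to 30Hz steps, so character motion keeps a 30fps cadence while the camera and input still respond every rendered frame.

```python
import math

RENDER_HZ = 60
ANIM_HZ = 30  # could be 24 for an even more film-like cadence

def animation_time(render_time):
    # Snap the animation sampling time down to the last 1/ANIM_HZ step,
    # so each pose is held for two rendered frames.
    step = 1.0 / ANIM_HZ
    return math.floor(render_time / step) * step

for frame in range(6):
    t = frame / RENDER_HZ
    print(f"frame {frame}: render t = {t:.4f} s, animation t = {animation_time(t):.4f} s")
```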
 
I've recently watched the first episode of Game of Thrones in HD, meaning 1080p at 23.976fps. There's one scene at the beginning, where the group finds a dead wolf, and it obviously lacks any real "color grading" and other post-processing effects. It looks "real", as in not cinematic at all. And it isn't the framerate at all that makes it look non-cinematic; it's the unprocessed colors and unobtrusive lighting that make it look the way it does.

I liken this to post-processing in more recent games, too (i.e. newer DX9 titles). Before that, most, if not all, of what a game looked like came down to the meshes and primitive lighting. Now it's massive post-processing that "changes" how a game looks. Disable those things and most games look as bland as their predecessors (albeit with more lights and better models). And it's not a question of framerate at all that makes the newer games look more like movies. No matter the framerate I let Quake 3 Arena run at, it will never look cinematic. (Yes, I know, this is a weird comparison.)

I'd really like to see some real-world examples of 60Hz movies. Not upscaled or interpolated stuff. And not WIP stuff like "The Hobbit", either (well, technically it's 48Hz, but... meh).
 
There is still a train of thought that 60fps isn't worth it, both because it's really hard to do and because people often don't notice it. Witness the PC/console game comparisons where people can't see any improvement with 60fps on the PC version.

Have you or anyone else done any actual tests or experiments supporting these claims? All the tests/experiments I have read about find that people notice higher frame rates. Do you have any other info that you can share?
 
Definitely. Add faster framerate and higher IQ at lower resolution. Some clever dev should add it as an option to evaluate. Would be a good test case.

I do not know if I like this. For PCs, sure. For a fixed platform, I think it is the game developer's responsibility to make the game with the right tradeoff of graphical fidelity versus fluidity. I as a consumer/gamer/audience should not make that choice.
 
Have you or anyone else done any actual tests or experiments supporting these claims? All the tests/experiments I have read about find that people notice higher frame rates. Do you have any other info that you can share?

We did some tests when I was working on sports games back in the 2005-2007 timeframe. People noticed 60fps, and even noticed subtle changes like double buffering vs triple buffering during very timing-sensitive parts of the game. But those are sports games, which tend to be far less resistant to going 60fps compared to other genres. Personally I think 60fps is a huge upgrade on *any* game, but reading the constant forum posts of "can't see the difference" when comparing 30fps console games to their 60fps PC counterparts (and all the other improvements PC provides) has left me confused on the matter, truth be told. I guess it was noticed en masse in the SNES/Genesis days, but not so much anymore.
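
For anyone wondering how a buffering change could even be felt, here is a rough, simplified illustration (the render times are hypothetical): with double buffering plus v-sync the GPU stalls until the next vblank, so effective frame time rounds up to whole 16.7 ms refresh intervals, while with triple buffering throughput tracks the actual render time, at the cost of up to an extra frame of latency.

```python
import math

REFRESH = 1.0 / 60.0  # 60 Hz display refresh interval

def double_buffered(render_time):
    # Frame time rounds up to a whole number of refresh intervals.
    return math.ceil(render_time / REFRESH) * REFRESH

def triple_buffered(render_time):
    # The GPU can start the next frame immediately, so throughput follows render time.
    return max(render_time, REFRESH)

for ms in (15.0, 17.0, 20.0):  # hypothetical per-frame render times
    rt = ms / 1000.0
    print(f"render {ms:4.1f} ms -> double buffered: {1.0 / double_buffered(rt):4.1f} fps, "
          f"triple buffered: {1.0 / triple_buffered(rt):4.1f} fps")
```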
 
I do not know if I like this. For PCs, sure. For a fixed platform, I think it is the game developer's responsibility to make the game with the right tradeoff of graphical fidelity versus fluidity. I as a consumer/gamer/audience should not make that choice.
I disagree. Options are important to provide a better experience to diverse people with different tastes. That's why we have difficulty settings. I know the early Lego games this gen had v-sync as an option, giving users a choice between ghastly screen-tear and horrific judder (gee, thanks...). As an investigation, I'd be very interested to see some games designed around 30 fps ship with a toggle that drops quality (resolution etc.) to enable 60 fps, and then see which people choose. Only it has to be an obvious switch. Age of Booty on PS3 runs 60 fps at 720p, but only 30 at 1080p. I force my PS3 to 720p to play it, but my friend never noticed. Something like that in-game as a switch, properly labelled, that people will try and see and decide on, would give great data for identifying what people would actually prefer instead of us all guessing!
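
As a rough back-of-envelope for why a toggle like the Age of Booty one is plausible when a game is mostly resolution/fill-rate bound (this deliberately ignores CPU cost, which does not scale with resolution):

```python
modes = {
    "720p @ 60 fps": (1280, 720, 60),
    "1080p @ 30 fps": (1920, 1080, 30),
}
for name, (w, h, fps) in modes.items():
    # Pixels pushed per second in each mode; the two end up in the same ballpark.
    print(f"{name}: {w * h * fps / 1e6:.1f} million pixels per second")
```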
 
We did some tests when I was working on sports games back in the 2005-2007 timeframe. People noticed 60fps, and even noticed subtle changes like double buffering vs triple buffering during very timing-sensitive parts of the game. But those are sports games, which tend to be far less resistant to going 60fps compared to other genres. Personally I think 60fps is a huge upgrade on *any* game, but reading the constant forum posts of "can't see the difference" when comparing 30fps console games to their 60fps PC counterparts (and all the other improvements PC provides) has left me confused on the matter, truth be told. I guess it was noticed en masse in the SNES/Genesis days, but not so much anymore.

So your own test results dispute the nonsense people write on message boards? I have seen no "can't see the difference" posts where people have actually compared different frame rates, but you might have some examples.

Anyway, it would be really interesting if you could post the results of the tests that you did; there is a severe lack of data in this area.
 
Have you or anyone else done any actual tests or experiments supporting these claims? All the tests/experiments I have read about find that people notice higher frame rates. Do you have any other info that you can share?

It's hard to prove because it's very subjective. Even if a person subconsciously prefers 60 fps, they often don't attribute it directly to the frame rate, but rather label it as "poor / good hit detection" or "responsive / laggy" etc.

The other problem is that 30fps games tend to look a lot better in screenshots (higher resolution, more post-effects etc). And screenshots are a big part of the advertisement and "hype" effort.

Given all of the above, it takes a real commitment from a studio to stick to 60 fps, as natural forces tend to pull it towards 30 fps.
For example, even at a tech-driven company such as Id Software, apparently John Carmack had to put in a serious effort to convince the studio to stick to 60 fps for Rage. Luckily he is in a position where he can probably just veto any other option, but that's rarely the norm at other studios, where the tech team is most likely not running the show.
 
I've recently watched the first episode of Game of Thrones in HD, meaning 1080p at 23.976fps. There's one scene at the beginning, where the group finds a dead wolf, and it obviously lacks any real "color grading" and other post-processing effects. It looks "real", as in not cinematic at all. And it isn't the framerate at all that makes it look non-cinematic; it's the unprocessed colors and unobtrusive lighting that make it look the way it does.

I liken this to post-processing in more recent games, too (i.e. newer DX9 titles). Before that, most, if not all, of what a game looked like came down to the meshes and primitive lighting. Now it's massive post-processing that "changes" how a game looks. Disable those things and most games look as bland as their predecessors (albeit with more lights and better models). And it's not a question of framerate at all that makes the newer games look more like movies. No matter the framerate I let Quake 3 Arena run at, it will never look cinematic. (Yes, I know, this is a weird comparison.) I'd really like to see some real-world examples of 60Hz movies. Not upscaled or interpolated stuff. And not WIP stuff like "The Hobbit", either (well, technically it's 48Hz, but... meh).

You're on the right track here. Shooting live action at a natively higher framerate requires a different process, a reassessment of norms, and is fundamentally different from working with 60Hz in computer graphics. It's a fact that the same film shot the same way at 24fps versus 48Hz or 60Hz will look and feel different. Shooting at a higher framerate means addressing everything from the way you move the camera, to the way you ask actors to walk through a scene, to even the way you light (as counterintuitive as the latter might seem). Ignoring this is why daytime television has taken on the unpleasantly garish, hyper-real look many of us can't stand.

It's the same reason Quake 3 looks the way you describe, even when that's to its benefit as a fast-paced game. Quake 3's feel is arguably the lack of feeling altogether, for the sake of giving players the most even-handed visual experience possible. More than any other, it's a game purpose-built for sport.
 