Does 30fps feel more "cinematic" than 60fps?

A very slight amount of CA seems to help realism, so I expect it has an effect in cinematography. It's certainly present in every camera lens to some degree, even if not to the extent of clearly separate colour fields.

Yeah, the guys here came up with a few other tricks, like grime on the camera lens that gets more visible when there's a light in front of it and so on. There may be a new trailer coming out soon where I might be able to point it out... :)
 
I never really got the separation of graphics from framerate.
60fps IS eyecandy, regardless of the impact (or lack thereof) on gameplay.
60 isn't sacrificing visuals, it's trading them for a different visual effect - one that can't be shown in screenshots... now which other visual buzzword does that remind me of...

This would make a nice signature.
 
It makes films all look the same; it doesn't really make them pop. Someone in Hollywood thinks it's great.

It makes the characters pop. In other words, it makes them stand out more by enhancing the contrast between their faces and the background.
 
Yeah, the guys here came up with a few other tricks, like grime on the camera lens that gets more visible when there's a light in front of it and so on. There may be a new trailer coming out soon where I might be able to point it out... :)

Yeah, BF3 did this, although their simulation was a bit crude. I'm sure we'll see it in other games soon enough. I think Optical Flares will also do it, although if I remember correctly their simulation was really simple as well.
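
The basic idea is simple enough to sketch. Here's a toy model of it - the shape of the falloff curve and the constants are purely made-up assumptions, not how any particular engine does it:

```python
import math

def grime_opacity(view_dir, light_dir, base=0.05, boost=0.9, power=8.0):
    """Toy model of screen-space lens grime: a faint base layer that
    brightens sharply when a bright light sits near the view axis.
    Both directions are assumed to be normalized (x, y, z) tuples."""
    # Cosine of the angle between the camera's view axis and the light.
    facing = max(0.0, sum(v * l for v, l in zip(view_dir, light_dir)))
    # Raising it to a power keeps the grime subtle until the light is
    # almost directly in front of the lens.
    return base + boost * (facing ** power)

# Light straight ahead: the grime flares up to near full opacity.
print(grime_opacity((0, 0, 1), (0, 0, 1)))     # ~0.95
# Light 60 degrees off-axis: the grime stays faint.
off_axis = (math.sin(math.radians(60)), 0.0, math.cos(math.radians(60)))
print(grime_opacity((0, 0, 1), off_axis))      # ~0.05
```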
 
I went to the shop recently.

At no point while driving, walking, or operating a shopping trolley did everything suddenly look like I had cameras strapped to my eyeballs. It was raining when I walked back to the car. Water did not start running down a virtual camera lens in front of me.
 
I went to the shop recently.

At no point while driving, walking, or operating a shopping trolley did everything suddenly look like I had cameras strapped to my eyeballs. It was raining when I walked back to the car. Water did not start running down a virtual camera lens in front of me.
Man, get with the times and get glasses!
Your life will suddenly be more cinematic, and you'll even get unique effects like steamy vision.
 
Truth be told, my pals at Weta say they aren't really worried about it.
Haven't personally seen any footage at 48fps yet, so I don't know how it'll compare, but it might not be such a big issue on the content creation side. It really is more about every theater having to upgrade their equipment.
That's good to know. If content creation doesn't cost more, we can start pushing better framerates for movies.

Then again there might be two versions of the Hobbit - 3D at 48fps and a regular 2D at 24fps for the rest of the cinemas.
I presume they create the 48 fps version and just keep every other frame for the 24 fps one.
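
The mechanical part of that would be trivial, at least - a sketch, assuming the frames simply come in as a list:

```python
def decimate_to_24fps(frames_48):
    """Derive a 24 fps sequence from a 48 fps one by keeping every
    other frame (0, 2, 4, ...). Note this also halves the effective
    shutter time, so motion may look more staccato than a native
    24 fps shoot would."""
    return frames_48[::2]
```

Whether the result actually looks right, given the halved shutter, is another question.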
 
Man, get with the times and get glasses!
Your life will suddenly be more cinematic, and you'll even get unique effects like steamy vision.

I was wearing glasses, but also a hat that stopped the rain getting on them. Sorry, I messed up.

The effect would have been cheap too (you could almost say "free") so I had no excuse not to use it.
 
Now what I'm about to say may sound like blasphemy to most if not all involved in this discussion, but I honestly can't tell the difference between 30fps and 60fps in anything other than driving games and fighters.

I've recently been playing Rage, and the whole thing about it being a 60fps game is completely lost on me. Had I not known beforehand, I would never have guessed. I've played a few PC games at 60fps too and didn't really see what all the fuss was about.

Can someone kindly explain to me what this fabled "60fps" is supposed to look like over a solid "30fps"? I only notice any framerate issues in a game if the framerate is inconsistent, or in the case of games where it takes a massive dive for some reason or another.
 
It costs more in that they need a bigger renderfarm and more online storage. But hardware is still cheaper than artist time... although stuff like rotoscoping will require more work (it's basically tracing the outlines of people and set elements so that CG can be inserted; and also they'll have to paint Serkis out of twice as many frames to make room for Gollum)
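
As a back-of-the-envelope illustration of how the frame count drives those costs (the per-frame figures below are purely invented for illustration, not real Weta numbers):

```python
# Back-of-the-envelope: how the frame count scales render and storage
# costs. All per-frame figures are invented assumptions.
RUNTIME_MIN = 170        # assumed feature runtime in minutes
HOURS_PER_FRAME = 20     # assumed render-farm core-hours per frame
MB_PER_FRAME = 50        # assumed storage per full-quality frame

for fps in (24, 48):
    frames = fps * RUNTIME_MIN * 60
    print(f"{fps} fps: {frames:,} frames, "
          f"{frames * HOURS_PER_FRAME / 1e6:.1f}M core-hours, "
          f"{frames * MB_PER_FRAME / 1e6:.2f} TB")
```

Everything downstream of the camera simply doubles - it's only the per-frame hand work like roto that scales in artist time rather than hardware.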

Not sure if there's going to be a 24fps version, it's just speculation on my end for now ;) so I have no idea if simply blending frames together will work.
 
Can someone kindly explain to me what this fabled "60fps" is supposed to look like over a solid "30fps"? I only notice any framerate issues in a game if the framerate is inconsistent, or in the case of games where it takes a massive dive for some reason or another.

Found this link on Google. Flash isn't working properly for me atm but I seem to remember this page being used before:

http://www.boallen.com/fps-compare.html
 
It costs more in that they need a bigger renderfarm and more online storage. But hardware is still cheaper than artist time... although stuff like rotoscoping will require more work (it's basically tracing the outlines of people and set elements so that CG can be inserted; and also they'll have to paint Serkis out of twice as many frames to make room for Gollum)
That's what I thought. There are still a lot of hand-painted mattes and the like, and they're the biggest limiting factor to faster cinema frame rates.

Not sure if there's going to be a 24fps version, it's just speculation on my end for now ;) so I have no idea if simply blending frames together will work.
Well, the frame interpolation techniques for upscaling footage framerates point to the reverse being very serviceable IMO. That's an idea I've had for cinema - shoot at a high framerate and then combine frames down to 24 Hz to provide better motion continuity between frames, with the option to release decent framerates on disc!
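
The naive version of that downconversion is easy to sketch - assuming the frames are float image arrays, just average each adjacent pair, which also approximates a longer shutter:

```python
import numpy as np

def blend_to_24fps(frames_48):
    """Average each pair of adjacent 48 fps frames into one 24 fps
    frame. Compared with simply dropping frames, this smears motion
    across the pair - a rough approximation of a longer shutter, and
    hence smoother-looking 24 fps playback.
    frames_48: list of float numpy image arrays in [0, 1]."""
    return [(a + b) * 0.5
            for a, b in zip(frames_48[::2], frames_48[1::2])]
```

Proper interpolation would warp along motion vectors rather than just cross-fading, but even the plain blend shows the principle.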
 
Of course...they add it in post. ;)

That only covers the CA appearing in the lens. In reality any refractive surface produces CA, and while lenses are specifically made to have very few artifacts, the other surfaces (windows, water, and so on) are of much lower quality, have weirder shapes, and are hit at steeper angles. So while you can "repair" the lens situation in a game by faking a lens curvature, you are totally unable to do that for the things in the world, unless you have a raytracer (preferably one that works with wavelengths, e.g. Maxwell or fryrender).
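
For the lens part at least, the cheap post-process fake really is about this crude - a toy channel-shift approximation (real lateral CA grows radially toward the frame edges, and as said above this does nothing for CA on in-world surfaces):

```python
import numpy as np

def cheap_chromatic_aberration(img, shift=2):
    """Crudest possible screen-space CA: offset the red and blue
    channels a couple of pixels in opposite directions so that edges
    pick up colour fringes. img: HxWx3 numpy array."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], shift, axis=1)    # red right
    out[..., 2] = np.roll(img[..., 2], -shift, axis=1)   # blue left
    return out
```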
 
Thank you kindly ;-)

Can't view it here at work but will check it when i get home ;-)

It's just a little demo of 15, 30 and 60 fps, and putting the 30 and 60 fps boxes on screen at the same time should show up the differences. Try picking a corner on one of the boxes in the linked demo and keeping "locked" onto only that corner. You can do it at 30 fps, but you see a series of judders. It's so much nicer at 60 fps.
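
If that page ever stops working, you can knock up the same comparison yourself. A rough sketch, assuming pygame is installed and you're on a 60 Hz vsynced display:

```python
import pygame

# Two squares scrolling at the same average speed, but the lower one
# only has its position updated every other frame - effectively 30 fps
# on a 60 Hz display. Lock your eyes onto a corner of each in turn.
pygame.init()
screen = pygame.display.set_mode((640, 240))
clock = pygame.time.Clock()
x60 = x30 = 0.0
frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    x60 = (x60 + 4) % 640        # moves every frame (60 fps)
    if frame % 2 == 0:
        x30 = (x30 + 8) % 640    # moves every other frame (30 fps)
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x60), 40, 40, 40))
    pygame.draw.rect(screen, (255, 255, 255), (int(x30), 150, 40, 40))
    pygame.display.flip()
    clock.tick(60)               # cap the loop at roughly 60 Hz
    frame += 1
pygame.quit()
```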

Tracking - where you fix onto a point and follow it - is one of only two modes of operation of the eye iirc (the other being when it darts between points) and 60 fps makes fixing onto something that's moving, while examining it for details, more effective and enjoyable IMO. It's something that 24 fps movies absolutely suck for, and in games (where you are frequently looking at things move) I see 30 fps as being a necessary evil and not something worthy of being a goal in itself.

People rag on CoD for its low resolution and link it to "bad graphics" but people who aren't pixel counters love the way it looks. A friend of mine (that's been a gamer for decades) described Black Ops as having "probably the best graphics I've ever seen". Part of that is probably because while things are actually moving you can make out lots of detail and things appear to move more naturally. And the more aggressively you use motion blur, the smoother things will look but the more detail you will lose.

A slow, lumbering in-game camera is a poor substitute for having eyes that can dart around or lock onto a rapidly moving point. A higher frame rate makes everything better.
 
Now what I'm about to say may sound like blasphemy to most if not all involved in this discussion, but I honestly can't tell the difference between 30fps and 60fps in anything other than driving games and fighters.
I have a friend who can't distinguish 64 kbps MP3 audio from uncompressed, and another who doesn't see any flickering on a 60 Hz CRT monitor. People are different.

I personally got a really bad headache from anything less than 100 Hz on a CRT (and could clearly see the flickering at the standard 85 Hz). I have a few PC gaming friends who have bought 120 Hz LCD displays just for gaming (competitive first person shooter players). The difference between 60 Hz and 120 Hz isn't exceptionally big, but it's noticeable even on the Windows desktop (the mouse cursor and windows move/scroll more smoothly).

30 fps is bad if you have lots of sideways movement in the game (we have that in our Trials series) and you try to follow targets moving sideways across the screen. Because your eyes are following the target, there should be no motion blur on it at all; you should be able to see the target very clearly. However, at 30 fps this is impossible, as without heavy motion blur the animation becomes really choppy. Even 60 fps is not enough for scenes that have fast objects moving across the screen. Movies are slightly different, because the director can often guide the viewer to focus on the center of the screen (with camera movement, focal blur, storyline factors, etc). In games, the player can and will look wherever he/she likes. Motion blur should react to eye movement, and thus it doesn't work so well in games.
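
To put that last point as a toy calculation - the numbers are purely illustrative, and without an eye tracker a game can't actually know the eye velocity, which is exactly the problem:

```python
import numpy as np

def blur_length(pixel_velocity, eye_velocity, shutter=1.0 / 60.0):
    """Motion blur should follow motion relative to the viewer's gaze,
    not relative to the camera. A target the eye is tracking has ~zero
    relative velocity and should stay sharp, while the background
    smears instead. Velocities are screen-space (px/s) vectors."""
    relative = np.asarray(pixel_velocity) - np.asarray(eye_velocity)
    return float(np.linalg.norm(relative) * shutter)  # streak length in px

# Eye tracking a target moving at 600 px/s:
print(blur_length((600, 0), (600, 0)))  # 0.0  -> tracked target stays crisp
print(blur_length((0, 0), (600, 0)))    # 10.0 -> static background should blur
```

Camera-relative blur gets both of those backwards whenever the player's eyes are following something on screen.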
 
Not being able to tell the difference consciously is, importantly, not the same as not seeing the difference. Most people will notice the difference without knowing exactly what is different - but they'll invariably prefer the 60fps all the same.
 
I agree that some people aren't able to see the difference.

Each person is different.

In my case, I can't see the difference between 85 and 100 Hz on a CRT. My friend can't see the difference between 60, 85, and 100 Hz.

He even comfortably uses his 60 Hz CRT for working in Excel and MS Word for hours.

As for fps, only a few of my friends can see and tell the difference. Most of them can't tell the difference, but I'm sure they can see it.

They experience low fps as making their heads hurt, while high fps is comfy.

Sorry, I can't give the exact fps, but that's what they said when I was altering their graphics options. They prefer lower quality graphics but a higher fps.
 
I used to say I preferred HQ30Hz rather than LQ60Hz, but quality standards are so high today that I want 60Hz or better ;p

(On a side note, since I tried my brother's 3DS, I've been looking for a way to get my 120Hz display to do relief/3D with my ATi HD5 - does anyone know how? [Sonic Generations & Trine 2 must be absolutely fantastic in relief!])
 
30 fps is bad if you have lots of sideways movement in the game (we have that in our Trials series) and you try to follow targets moving sideways across the screen. Because your eyes are following the target, there should be no motion blur on it at all; you should be able to see the target very clearly. However, at 30 fps this is impossible, as without heavy motion blur the animation becomes really choppy. Even 60 fps is not enough for scenes that have fast objects moving across the screen. Movies are slightly different, because the director can often guide the viewer to focus on the center of the screen (with camera movement, focal blur, storyline factors, etc). In games, the player can and will look wherever he/she likes. Motion blur should react to eye movement, and thus it doesn't work so well in games.

Yeah, I was playing Sonic Generations at 60fps; I can't imagine that game at 30fps - it would totally ruin it. On the other hand, movies just look bizarre to me at higher frame rates. 24/30 fps just makes more sense to me for movies; otherwise they look like amateur home videos.
 