Does 30fps feel more "cinematic" than 60fps?

Of course. The closer you get to 24fps with realistic motion blur, the more cinematic it will look. This has nothing to do with responsiveness, or whether you like blurry/jerky/whatever; it has to do with whether what you see looks like something you'd see in a movie. Other things you could throw into this category:

1. Defocus (depth of field)
2. Chromatic aberration
3. Lens flares and glare (bloom)
4. Character lighting rigs (fill light, rim light, key light, etc.)
5. Unusually glossy/specular environments
6. Film grain
7. Color correction
8. Filmic tone mapping
9. Lens distortion (fisheye effect)
You forgot

10. Cars that explode on the slightest impact
11. Weather that always matches the story's mood, so it's always raining when something sad is happening
12. Shaky-cam footage during action so you can't see a damned thing
 
Oh, hey, wow, anyone catch that latest Michael Bay movie? It was great!

All games should try and look like cinemeratic like that! It was great! Then they could be art too, like at the cinema! But make it look like Broakeback Mountain too, because that was art, and it had a emotions. Games can do emotion too, because they can do look like cinemas.

Hhhhhhhhnnnnnnnnnnnnnnnnn film grayn.
 
Mass Effect 1 had film grain and a shit framerate, so we're already 1/2 of the way there. In fact, that's just a couple more ways that ME2 is a step backwards.
 
In an apples-to-apples comparison, no. I think it's because DS2 has higher native resolution and prettier pixels and effects, which make it look more cinematic.
 
Only on small motions. Even with motion blur you can't disguise the discontinuity between one frame and the next on a pan. I'll never forget the juddering mountains in LOTR as the epitome of the illusion-breaking framerate. We're supposed to be looking out over this mountain range, so why's it doddering along like a series of stills? Now if you extended blur across frames, so frame 2 included the motion from frames 1 and 3, then it'd look smoother, but that's not technically possible with cameras. It would be possible to create in a game, however, or with a faster camera and fancy frame blending. Don't think anyone's tried it though, and it'd probably look pretty sloppy anyhow, kinda drunk or dreamy. Like too much reverb in audio.
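
For illustration only (nobody in the thread has actually built this), a throwaway numpy sketch of that cross-frame blending idea; the 0.25/0.5/0.25 weights are just a made-up starting point:

```python
import numpy as np

def blend_across_frames(frames, weights=(0.25, 0.5, 0.25)):
    """Blend each frame with its neighbours so motion "leaks" across
    frame boundaries, a bit like a shutter held open longer than one frame.
    frames: list of HxWx3 float arrays in [0, 1]."""
    w_prev, w_curr, w_next = weights
    out = []
    for i, frame in enumerate(frames):
        prev_f = frames[max(i - 1, 0)]
        next_f = frames[min(i + 1, len(frames) - 1)]
        out.append(w_prev * prev_f + w_curr * frame + w_next * next_f)
    return out
```

Push the neighbour weights higher and you'd get exactly that drunk/dreamy, too-much-reverb look.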

It is possible to improve it with cameras and post-processing; a worse offender would be trying to play that 24fps content at 30fps with pulldown. That's part of the reason I waited a while before I bought an HDTV.
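
For reference, the 3:2 pulldown cadence that causes that judder, as a toy Python sketch (not from anyone in the thread):

```python
def three_two_pulldown(film_frames):
    """Map 24fps film frames onto 60i fields with the classic 3:2 cadence:
    alternate frames get 3 fields and 2 fields, so 4 film frames fill
    10 fields = 5 interlaced 30fps frames. The uneven repetition is
    exactly what shows up as judder on pans."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# 4 film frames -> 10 fields: A A A B B C C C D D
print(three_two_pulldown(["A", "B", "C", "D"]))
```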
 
Of course. The closer you get to 24fps with realistic motion blur, the more cinematic it will look. This has nothing to do with responsiveness, or whether you like blurry/jerky/whatever; it has to do with whether what you see looks like something you'd see in a movie. Other things you could throw into this category:

1. Defocus (depth of field)
2. Chromatic aberration
3. Lens flares and glare (bloom)
4. Character lighting rigs (fill light, rim light, key light, etc.)
5. Unusually glossy/specular environments
6. Film grain
7. Color correction
8. Filmic tone mapping
9. Lens distortion (fisheye effect)

Most films have little real chromatic aberration; it is used sometimes, but they mostly want to eliminate it from live action. Sometimes it's used on underwater CG because people associate it with underwater footage shot through incorrect (air) lenses (which is not common these days in properly filmed underwater footage). Still, it often makes stuff look a bit more real, but it's often OTT.

Colour correction, or colour grading, in films is in my opinion awfully used and uncreative at the moment. I really would prefer it if games didn't go the same way.

I was watching "The Rock" the other day, which is not the best film, but I was actually surprised how good the lighting is in many scenes. This was Michael Bay pre-digital correction.

Now you look at many films these days and they have orange-teal grading, which was fine once or twice. But now nearly every action film contains it. This is something that would have been very difficult or impossible with standard chemical film stock.

So sometimes I don't think we should look towards Hollywood.
 
Most films have little real chromatic aberration,
I used to think that, until Laa-Yosh showed his Assassin's Creed trailer. A very slight amount of CA seems to help realism, such that I expect it has an effect in cinematography. It's certainly present in every camera lens to some degree even if not to the extent of clearly separate colour fields.

I just Googled an Avatar screenshot as we know Cameron was going for all-out photorealism, and you can see they've added CA to the rendered parts. Zoom in on the left hand supporting the gun.
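
Just to make that concrete, a crude numpy sketch of the kind of subtle per-channel shift being talked about; the 0.002 strength is an arbitrary guess, not taken from any of those renders:

```python
import numpy as np

def chromatic_aberration(img, strength=0.002):
    """Crude transverse CA: magnify the red channel slightly and shrink the
    blue channel slightly about the image centre. img is an HxWx3 float array."""
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0

    def sample(channel, scale):
        # Nearest-neighbour resample of one channel about the image centre.
        sy = np.clip(cy + (yy - cy) * scale, 0, h - 1).astype(np.intp)
        sx = np.clip(cx + (xx - cx) * scale, 0, w - 1).astype(np.intp)
        return channel[sy, sx]

    out = img.copy()
    out[..., 0] = sample(img[..., 0], 1.0 - strength)  # red fringes outward
    out[..., 2] = sample(img[..., 2], 1.0 + strength)  # blue fringes inward
    return out
```

At that strength you only see it at the pixel level, which is rather the point.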
 
xedi said:
And I am also one of those weird people who are fine with 30fps and prefer the better graphics over a faster framerate....
I never really got the separation of graphics from framerate.
60fps IS eye candy, regardless of the impact (or lack thereof) on gameplay.
60 isn't sacrificing visuals, it's trading off for a different visual effect - that can't be shown in screens... now which other visual buzzword does that remind me of...
 
I never really got the separation of graphics from framerate.
60fps IS eye candy, regardless of the impact (or lack thereof) on gameplay.
60 isn't sacrificing visuals, it's trading off for a different visual effect - that can't be shown in screens... now which other visual buzzword does that remind me of...

It is a big trade-off with current hardware, though: working inside 16 ms (1000 ms / 60 ≈ 16.7 ms per frame) is much, much harder than inside 33 ms (1000 ms / 30 ≈ 33.3 ms).
 
I used to think that, until Laa-Yosh showed his Assassin's Creed trailer. A very slight amount of CA seems to help realism, such that I expect it has an effect in cinematography. It's certainly present in every camera lens to some degree even if not to the extent of clearly separate colour fields.

I just Googled an Avatar screenshot as we know Cameron was going for all-out photorealism, and you can see they've added CA to the rendered parts. Zoom in on the left hand supporting the gun.

I wasn't saying it doesn't exist or isn't used (it is added to CG a lot); my point was it's something they wanted to eliminate from real cameras. But I guess the same could be said about grain, flares and distortion.
These things need to be kept under control; if you go too far they can give bad results (in cinema as well as in games). There is a tendency in games to flaunt an effect just because you paid for it.
 
Most films have little real chromatic aberration

Of course...they add it in post. ;)

Now you look at many films these days and they have orange-teal grading, which was fine once or twice. But now nearly every action film contains it. This is something that would have been very difficult or impossible with standard chemical film stock.

It serves its purpose well, which is to make characters pop. Lots of things people do in film (and games) are for that same purpose.
 
I wasn't saying it doesn't exist or isn't used (it is added to CG a lot); my point was it's something they wanted to eliminate from real cameras. But I guess the same could be said about grain, flares and distortion.
The trick with CG is that you often have to add those little imperfections to the shot to make it look "real". The more perfect it looks, the less real it looks.

The problem with CA in particular is that you shouldn't really notice it's there, even if it's been added intentionally. Most of the time, it's a very subtle effect that requires zooming down to the pixel level to see if it's even there, but can give CG that little extra push that it needs. Gaming graphics are nowhere near the level where a subtle push of CA will make a difference.

For me, the biggest thing about the framerate argument is how smooth the motion is, whether that's achieved through a higher framerate or by simply adding motion blur to help blend the frames together. Since nearly all games lack motion blur, when you get the framerate down into the mid 20s it looks like a stuttering mess. But a game like Crysis at über-max settings, which for me runs at around 30fps, still comes across as very smooth because of the motion blur.

But that's a whole other thing, since a lot of people simply don't like motion blur of any kind. Even if they started implementing it in console games, I think they'll need an option to turn it off for the people that don't want it there. You won't be able to please everyone, I'm afraid.
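
Not that anyone here has said exactly how Crysis does it, but the usual game approach is per-pixel velocity blur; a rough numpy sketch of the idea (the sample count and names are just illustrative):

```python
import numpy as np

def motion_blur(frame, velocity, samples=8):
    """Screen-space motion blur sketch: for each pixel, average samples taken
    along its per-pixel screen-space velocity (in pixels per frame).
    frame: HxWx3 float array, velocity: HxWx2 float array of (dx, dy)."""
    h, w, _ = frame.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    acc = np.zeros_like(frame)
    for i in range(samples):
        t = i / (samples - 1) - 0.5  # spread samples across the shutter interval
        sx = np.clip(xx + velocity[..., 0] * t, 0, w - 1).astype(np.intp)
        sy = np.clip(yy + velocity[..., 1] * t, 0, h - 1).astype(np.intp)
        acc += frame[sy, sx]
    return acc / samples
```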
 
Mass Effect 1 had film grain and a shit framerate, so we're already 1/2 of the way there. In fact, that's just a couple more ways that ME2 is a step backwards.

I liked the film grain option that Capcom added in RE5; it made the campaign mode more cinematic. With it turned off the game looked too clean, though that helps when playing competitively in Versus mode because you can see distant targets more clearly.
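
For what it's worth, a toy version of a grain pass with that kind of on/off toggle might look like this (the 0.04 amount is arbitrary, not Capcom's):

```python
import numpy as np

def film_grain(frame, amount=0.04, enabled=True, seed=None):
    """Toy film grain: add monochrome per-pixel noise scaled by 'amount'.
    'enabled' mimics an on/off option like RE5's. frame: HxWx3 float in [0, 1]."""
    if not enabled:
        return frame
    rng = np.random.default_rng(seed)
    h, w, _ = frame.shape
    noise = rng.normal(0.0, amount, size=(h, w, 1))  # same grain on all channels
    return np.clip(frame + noise, 0.0, 1.0)
```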
 
Of course...they add it in post. ;)



It serves its purpose well, which is to make characters pop. Lots of things people do in film (and games) are for that same purpose.

It makes films all look the same; it doesn't really make them pop. Someone in Hollywood thinks it's great.
 
It makes films all look the same; it doesn't really make them pop. Someone in Hollywood thinks it's great.
I'm guessing this is due to a new generation of cinematographers coming up. In the early digital age, you got people taught on film trying to recreate what they knew digitally. Now you start getting people who have done mostly digital shoots and/or post all their careers, and are employing the (currently somewhat limited) bag of tricks that are the current "state of the art" of what they were taught.

As for games, I'm in the "I like the cinematic feel" camp. Framerate be damned...! :) Though, I expect that by the time consoles have the power to recreate the look and feel of film, clever people may have found a better use for those resources as our perception of what looks good/right will have shifted.
 
I'm guessing this is due to a new generation of cinematographers coming up. In the early digital age, you got people taught on film trying to recreate what they knew digitally. Now you start getting people who have done mostly digital shoots and/or post all their careers, and are employing the (currently somewhat limited) bag of tricks that are the current "state of the art" of what they were taught.

As for games, I'm in the "I like the cinematic feel" camp. Framerate be damned...! :) Though, I expect that by the time consoles have the power to recreate the look and feel of film, clever people may have found a better use for those resources as our perception of what looks good/right will have shifted.

If you look at film colour grading, it was a very fine art and somewhat limited in results; lighting would play a bigger role in bringing colour into the film. The problem with digital colour grading is that people higher up than the cinematographer know how easy it is now to change the colour. I don't know for sure, but I suspect the decisions are being made higher up the chain (producers), which is bad: they shouldn't be making artistic decisions. It's a bit like in games where a publisher says "GTA sold loads, we've gotta make a GTA."
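
To make "orange-teal" concrete: a purely illustrative grade (not any film's actual pipeline) can be as simple as pushing shadows toward teal and highlights toward orange by luminance:

```python
import numpy as np

# Illustrative tint colours and strength only; real grades are hand-tuned LUTs.
TEAL = np.array([0.0, 0.5, 0.6])
ORANGE = np.array([1.0, 0.55, 0.15])

def orange_teal_grade(img, strength=0.15):
    """img: HxWx3 float in [0, 1]. Blend shadows toward teal and highlights
    toward orange based on per-pixel luminance."""
    luma = img @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luminance
    luma = luma[..., None]
    tint = TEAL * (1.0 - luma) + ORANGE * luma       # shadows -> teal, highlights -> orange
    return np.clip(img * (1.0 - strength) + tint * strength, 0.0, 1.0)
```

Which is partly the problem: it's a few lines in post now, so everyone does it.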

Personally I think 60fps makes a game more intense to play; as I mentioned in another thread, experiments made by Douglas Trumbull convinced him that 60fps was better at stimulating the viewer.

I think games can be cinematic and run at 60fps; it's just harder with current consoles.
 
It's also why most TVs nowadays allow de-crappifying movies by interpolating them up to 100Hz or more. It doesn't work perfectly and introduces artifacts, but heck, I still like that better than jerky 24 frames/sec.

Blasphemy!!!!
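
(For reference, that "de-crappifying" is frame interpolation. A naive blend-only version is sketched below; real TVs do motion-compensated versions, since blending alone just cross-fades and ghosts.)

```python
import numpy as np

def naive_interpolate(frames, factor=4):
    """Naive frame-rate upconversion: insert (factor - 1) linearly blended
    frames between each pair of source frames (e.g. 24fps -> ~96fps).
    frames: list of HxWx3 float arrays."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append((1.0 - t) * a + t * b)  # cross-fade, not motion compensation
    out.append(frames[-1])
    return out
```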
 
There are creative issues, as Laa-Yosh will tell you. It already takes an age to create all the effects for 24fps footage. 48fps footage requires on the order of twice the work for twice the frames; only so much can be automated. There have also been storage issues, which may be solved now. Don't know.

Truth be told, my pals at Weta say they aren't really worried about it.
Haven't personally seen any footage at 48fps yet, so I don't know how it'll compare, but it might not be such a big issue on the content creation side. It really is more about every theater having to upgrade their equipment.

Then again there might be two versions of the Hobbit - 3D at 48fps and a regular 2D at 24fps for the rest of the cinemas.
 
Colour correction, or colour grading, in films is in my opinion awfully used and uncreative at the moment. I really would prefer it if games didn't go the same way.

I'd say that it's about digital grading making it far too easy to do bad stuff. Back when they did all this chemically it was a different question, but I don't know enough about cinematography to get into a discussion about it ;)

So sometimes I don't think we should look towards Hollywood.

While I don't necessarily agree with everything you wrote, I also wonder why we'd want games to look more cinematic... they're not meant to be movies after all.
 