Does 30fps feel more "cinematic" than 60fps?

The point is that you said:



And that guy said after watching footage shot natively at a high fps:


Not saying it's the same, but it is similar.

It is similar in the sense that you get a new frame 48 times per second. The detail, blur and motion will differ. Also, this is one person's opinion after watching ONE scene shot at a high fps. If he had compared the same scene shot at different frame rates, then he would be onto something. He also says that certain scenes look really amazing (like slow motion).

Some people claimed upscaled DVDs have similar quality to Blu-rays. That is totally bogus, and you can see that with your own eyes. Unfortunately, there is basically no movie footage available in 48/60 fps, which makes it impossible for us to compare. Hopefully we can do that when "The Hobbit" is released.
 
The difference in motion blur alone is worth it... especially in hectic scenes, which tend to be so blurry (with bad directors, at least) that you can't see what's happening anymore. And with the drop in motion blur, the interpolation gets better too... you cannot "remove" motion blur once it's applied, at least not without additional info and A LOT of calculations. TVs don't have that, hence "motion flow" isn't perfect. With games, however (disabling blur in PC games, for example), it's a valid way to go, I think... not sure it makes a lot of sense to go any higher than 60Hz, though... but it's valid for 30Hz-to-60Hz interpolation too.
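
To make the "blur is baked in" point concrete, here's a minimal sketch of the crudest possible interpolation, plain frame blending; the numpy frames are made up for the example. Real motion-compensated interpolation is far smarter, but the principle is the same: every in-between frame is built from already-blurred sources.

```python
import numpy as np

def blend_midpoint(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive in-between frame: just the average of its two neighbours.
    Whatever blur is baked into frame_a and frame_b ends up in the
    result -- nothing here can "un-blur" the sources."""
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

# Hypothetical 24fps clip: a list of H x W x 3 uint8 frames.
clip_24 = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8) for _ in range(4)]

# Double the framerate by inserting a blended frame between each pair.
clip_48 = []
for a, b in zip(clip_24, clip_24[1:]):
    clip_48 += [a, blend_midpoint(a, b)]
clip_48.append(clip_24[-1])  # 4 original frames -> 7 frames at "48fps"
```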

There is enough 60Hz video available... the problem is, it's never feature-film quality. The PlayStation Eye can go up to 120Hz (at 320x240), which is really nice (and, save for the drop in resolution, really worth it); others can go even higher.

It also makes a lot of sense to use it, because TVs have always had a lead in framerate... more or less since they went mainstream, TVs have been 50/60Hz... Why not take advantage of that, instead of having to deal with BS like IVTC (inverse telecine)? The move from CRTs to digital TVs allows even better material to be displayed (though mostly with worse colors).
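
For anyone unfamiliar with IVTC, here's a rough sketch of the 3:2 pulldown cadence it has to undo when 24fps film is carried in a 60i signal (illustrative Python; the frame labels are hypothetical):

```python
# 3:2 pulldown: 4 film frames (A-D) spread across 10 interlaced fields,
# i.e. 5 frames of 60i video. This is the cadence IVTC has to detect and undo.
film_frames = ["A", "B", "C", "D"]
fields = []
for frame, repeats in zip(film_frames, [2, 3, 2, 3]):  # the "2:3" cadence
    fields += [frame] * repeats

# Pair the fields back into interlaced frames. Note the mixed frames
# ("B/C", "C/D"): they are what causes combing on progressive displays.
interlaced = [f"{top}/{bottom}" for top, bottom in zip(fields[0::2], fields[1::2])]
print(interlaced)  # ['A/A', 'B/B', 'B/C', 'C/D', 'D/D']
```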

I'd really like to see the real 48Hz trailer. I mean, they already released the "low rate" trailer... so why not the better one? PCs can easily cope with the framerate; sadly, H.264 High Profile Level 4.1 doesn't allow 1080p framerates higher than 30... only Level 4.2 and higher support them (i.e., outside the Blu-ray spec).
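
That level limit falls out of the MaxMBPS (macroblocks per second) cap in the H.264 level table. A quick back-of-the-envelope check (a sketch; the constants are the level-table values as I recall them):

```python
# Max framerate at 1080p for a given H.264 level follows from the
# MaxMBPS (macroblocks per second) limit in the spec's level table.
MAX_MBPS = {"4.0": 245_760, "4.1": 245_760, "4.2": 522_240}

def max_fps_1080p(level: str) -> float:
    # 1080 lines round up to 1088 for 16x16 macroblocks: 120 x 68 = 8160 MBs/frame
    mbs_per_frame = (1920 // 16) * (1088 // 16)
    return MAX_MBPS[level] / mbs_per_frame

for level in MAX_MBPS:
    print(level, round(max_fps_1080p(level), 1))
# 4.0 30.1, 4.1 30.1, 4.2 64.0 -> 1080p at 48 or 60fps needs Level 4.2+
```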
 
It is similar in the sense that you get a new frame 48 times per second. The detail, blur and motion will differ. Also, this is one person's opinion after watching ONE scene shot at a high fps. If he had compared the same scene shot at different frame rates, then he would be onto something. He also says that certain scenes look really amazing (like slow motion).

Some people claimed upscaled DVDs have similar quality to Blu-rays. That is totally bogus, and you can see that with your own eyes. Unfortunately, there is basically no movie footage available in 48/60 fps, which makes it impossible for us to compare. Hopefully we can do that when "The Hobbit" is released.

There is a big difference between interpolating to 60fps and upscaling. With very good interpolation you don't lose detail. The real difference is that native 60fps shows no flaws, frame by frame. There is good interpolation and bad interpolation, so I can understand your reservations.

I can't download these files. Can you repost them?

Wow. My files were all individually deleted. Very immature. I'll repost them.
 
Repost: Credit to the OP on NeoGAF for the original files.

Here's a clip of a dude kicking a basketball, shot in native 24fps:

http://www.peejeshare.com/files/362104193/basement24.mov.html
Now here's the same scene reshot in native 48fps:


http://maximum-attack.com/basement_red_fps.zip
http://www.peejeshare.com/files/362104194/basement48.mov.html

Now here's the 24fps shot interpolated to 60fps:

http://www.peejeshare.com/files/362104198/basement60fps.mp4.html

As you can see, the motion blur from the 24fps version is carried over. Movement is also very similar to the 48fps version. Surprisingly, the 60fps interpolated version displays better panning on the last shot.

I hope that clears up the interpolation vs. native debate.

EDIT: So someone keeps deleting the files, huh? I'll post the originals then.
 
Reiko said:
There is a big difference between interpolating to 60fps and upscaling. With very good interpolation you don't lose detail.
Perceived detail increases with higher fps - and I'd argue it happens even with interpolation, due to how our brains do "temporal upsampling" of moving images.
 
There is a big difference between interpolating to 60fps and upscaling. With very good interpolation you don't lose detail. The real difference is that native 60fps shows no flaws, frame by frame. There is good interpolation and bad interpolation, so I can understand your reservations.

You do not lose detail when upscaling either. I do not understand what you mean by your other comment.
 
There is a big difference between interpolating to 60fps and upscaling. With very good interpolation you don't lose detail. The real difference is that native 60fps shows no flaws, frame by frame. There is good interpolation and bad interpolation, so I can understand your reservations.

Detail is already lost. For the original film stock, each frame is typically exposed for half the frame time (a 180-degree shutter), i.e., the shutter speed is twice the framerate: 1/48 s at 24fps. This adds a ton of blur. Blur which is then interpolated between frames.
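
A quick worked example of that relationship, assuming the standard 180-degree shutter convention (a sketch, not from the thread):

```python
def exposure_time(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Exposure time per frame in seconds. A 180-degree shutter exposes
    each frame for half the frame interval -- the classic film look."""
    return (shutter_angle_deg / 360.0) / fps

print(exposure_time(24))      # 1/48 s: standard 24fps motion blur
print(exposure_time(24, 45))  # 1/192 s: 4x shorter exposure -> sharp, jittery frames
```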

The beach landing scene in Saving Private Ryan and some scenes in Gladiator were filmed with a shutter speed four times faster than normal, which gives a distinct jittery look and very sharp frames.

Cheers
 
The difference in motion blur alone is worth it... especially in hectic scenes, which tend to be so blurry (with bad directors, at least) that you can't see what's happening anymore.

And I promise you that most of what you can't see is fully intended. I think it was Blade that had a director's commentary track where they originally planned to shoot the action/fight scenes from a certain distance so that you, the viewer, could see what was happening, but it took away intensity from the fight scenes. So they were glad they had kept a close-in camera as well.
 
You do not lose detail when upscaling either. I do not understand what you mean by your other comment.

Yes you can. It depends on the upscaling algorithm.
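
For example (a sketch with Pillow; the source file name is hypothetical), the choice of resampling filter makes a visible difference when enlarging:

```python
from PIL import Image  # pip install Pillow

img = Image.open("dvd_frame.png")   # hypothetical 720x480 source frame
target = (1920, 1080)

# Nearest-neighbour keeps hard pixel edges but looks blocky;
# Lanczos preserves perceived detail much better when enlarging.
blocky = img.resize(target, Image.NEAREST)
smooth = img.resize(target, Image.LANCZOS)
blocky.save("upscaled_nearest.png")
smooth.save("upscaled_lanczos.png")
```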

Interpolation can show flaws (ghosting) when you step through it frame by frame, but they're subtle since everything is moving so fast. Stepping through native 60fps content frame by frame shows no flaws, and no motion blur either.
 
And I promise you that most of what you can't see is fully intended. I think it was Blade that had a director's commentary track where they originally planned to shoot the action/fight scenes from a certain distance so that you, the viewer, could see what was happening, but it took away intensity from the fight scenes. So they were glad they had kept a close-in camera as well.

I am not saying that there shouldn't be a director's and editor's vision... Far from it. I just dislike most hectic scenes in virtually any current movie (not all of them, though). But that's not really the point here. Your argument is totally valid, yet has little to do with raising the framerate of movies...
 
I am not saying that there shouldn't be a director's and editor's vision... Far from it. I just dislike most hectic scenes in virtually any current movie (not all of them, though). But that's not really the point here. Your argument is totally valid, yet has little to do with raising the framerate of movies...

My point would be that with 120Hz the director may very well just choose to blur the shit out of the same action scenes anyway. And with 24Hz they can still choose to show what is going on.

If they can get 48Hz to look "good", they have the best of both worlds and can turn up whatever effect they want. They just need to get the magic in there :)
 
Yeah, that was interesting; it makes me wonder how that correlates to games as well. Like, maybe the Call of Duty series being 60fps on consoles helps make it the top dog there, just because it makes the game "feel" right. Someone should run those same physiological tests with games and framerates.
 
What did Insomniac do in that respect when they decided 60fps didn't win them anything?

I googled it and found this:

To back up his arguments, Acton has produced interesting data based on a large number of game reviews, which indicates that while there is a clear link between graphics and final score, there is little to no evidence that frame-rate has as much influence. He also polled readers of the Insomniac website, and found that while 16 per cent of respondents were firmly in favour of 60FPS, most are not, with the majority favouring a solid frame-rate that doesn't interfere with the gameplay.

My gut tells me it's more that trying to maintain 60fps on PS3 is incredibly difficult and just out of their scope. Even Modern Warfare 3 often drops to 40fps on that platform, and they have a ton of money and resources to make 60fps work yet still couldn't. Same with other prominent/talented studios like Sony Santa Monica, whose God of War 3 also frequently runs below 60fps. There are very few true 60fps games on that platform, so presumably it was just too expensive/difficult for a shop like Insomniac to do a true 60fps game, and they dropped it as a business decision. But that's all just IMHO :)
 
I googled it and found this:



My gut tells me it's more that trying to maintain 60fps on PS3 is incredibly difficult and just out of their scope. Even Modern Warfare 3 often drops to 40fps on that platform, and they have a ton of money and resources to make 60fps work yet still couldn't. Same with other prominent/talented studios like Sony Santa Monica, whose God of War 3 also frequently runs below 60fps. There are very few true 60fps games on that platform, so presumably it was just too expensive/difficult for a shop like Insomniac to do a true 60fps game, and they dropped it as a business decision. But that's all just IMHO :)

For God of War 3 it was mostly a compromise of fidelity vs. framerate.
 
Well... Insomniac "polled some guys", whereas Trumbull actually measured viewers' physiological responses... a very different approach.

For me, it's like this: while most people don't mind 30Hz gaming, a lot (me included) prefer 60Hz. Some even go so far as to avoid anything below 60Hz. But it's not the other way around... or, to put it differently, I've never heard anyone say "ah, this game has too many frames per second, I wish it had fewer". For what it's worth, you could just drop the additional frames anyway^
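
Dropping frames really is the trivial direction (a toy sketch; frames_60 is a hypothetical list of decoded frames). Going the other way, 30 up to 60, is what needs interpolation:

```python
frames_60 = list(range(600))   # stand-in for 10 seconds of 60fps video
frames_30 = frames_60[::2]     # keep every other frame -> 30fps
assert len(frames_30) == len(frames_60) // 2
```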

With consoles, it's clear why they mostly go with 30Hz... it's the same reason, or a similar one, why the movie industry went with 24Hz... it's the cheapest variant that doesn't put people off. You can make ANY game run at 60Hz, but there's a lot that goes with it... and with the limited hardware at your disposal, you can't just ramp everything up to your desired framerate unless you want to compromise.

My bigger pet peeve is an average framerate WELL below 30Hz, especially one that fluctuates heavily. If only all games were hard-locked to 30Hz^ (it's more a problem on PS3, though). PC gaming has an advantage here... if you have a mildly capable PC today, anything runs at maxed-out settings and still reaches 60Hz. But even with older hardware, you can simply adjust the settings and reach your goal. On consoles, those toggles aren't there... so even if I wanted to dial down the resolution or anything else to get a stable framerate, I can't. It would be cool, though (Toshinden on PS1 had a 30/60Hz toggle, which disabled environment geometry etc. to reach 60Hz).
 
Well... Insomniac "polled some guys", whereas Trumbull actually measured viewers' physiological responses... a very different approach.

For me, it's like this: while most people don't mind 30Hz gaming, a lot (me included) prefer 60Hz. Some even go so far as to avoid anything below 60Hz. But it's not the other way around... or, to put it differently, I've never heard anyone say "ah, this game has too many frames per second, I wish it had fewer". For what it's worth, you could just drop the additional frames anyway^

With consoles, it's clear why they mostly go with 30Hz... it's the same reason, or a similar one, why the movie industry went with 24Hz... it's the cheapest variant that doesn't put people off. You can make ANY game run at 60Hz, but there's a lot that goes with it... and with the limited hardware at your disposal, you can't just ramp everything up to your desired framerate unless you want to compromise.

My bigger pet peeve is an average framerate WELL below 30Hz, especially one that fluctuates heavily. If only all games were hard-locked to 30Hz^ (it's more a problem on PS3, though). PC gaming has an advantage here... if you have a mildly capable PC today, anything runs at maxed-out settings and still reaches 60Hz. But even with older hardware, you can simply adjust the settings and reach your goal. On consoles, those toggles aren't there... so even if I wanted to dial down the resolution or anything else to get a stable framerate, I can't. It would be cool, though (Toshinden on PS1 had a 30/60Hz toggle, which disabled environment geometry etc. to reach 60Hz).

This console gen was a very weird one. We had tons of games last gen that ran at 60fps, so I really don't understand how some gamers can flip out when they play a 60fps console game this gen. It felt almost like a standard on PS2.
 
We had tons of games last gen that ran at 60fps.

We have a lot of 60fps games today as well. We also had a ton of games with borderline unplayable framerates last generation. 60fps was far from a standard last generation, not to mention lots of games were simply declared 60fps and no one ever bothered to double-check.
The weird thing about this generation isn't framerate or image quality. The weird thing is how people started obsessing over stuff that never bothered them before.
 
We have a lot of 60fps games today as well. We also had a ton of games with borderline unplayable framerates last generation. 60fps was far from a standard last generation, not to mention lots of games were simply declared 60fps and no one ever bothered to double-check.
The weird thing about this generation isn't framerate or image quality. The weird thing is how people started obsessing over stuff that never bothered them before.

PC gaming standards rising could be the main culprit.

"Why isn't this game performing like it is on my blah blah blah"

"Tearing makes my eyes bleed"

"No way it could run this"

"It's not really HD"
 