The Great Framerate Non-Debate

brunogm

Newcomer
Well, the framerate debate is nonexistent, even though recent research has shown that temporal resolution matters more than a spatial boost (4K).
Add to that the great FUD around The Order: 1886 and the BS "filmic" term. There is a real need to educate people on the terms, metrics, and methodologies of perception research, so that more informed trade-offs are made in the gaming industry. Tons of games last gen failed because of inflated production costs and bad management that didn't use perception limits or physics to guide decisions. An artist who doesn't know how an asset will look at 3 m viewing distance can waste weeks adding detail. Even crazier is when those details and the gameplay feel are lost because of low fps. What a conundrum!

http://forums.blurbusters.com/viewtopic.php?f=7&t=135
I'm using a CRT, so I tried several different resolutions and refresh rates: 800x600 at 160 Hz, 1260x945 at 120 Hz, 1600x1200 at 96 Hz, and finally 2560x1920 at 57 Hz.

Playing at 160Hz with VSYNC off, hitting 1000 FPS (with a corresponding mouse polling rate), the immediacy is exhilarating. I'm not sure I could distinguish 1000 FPS from 500 or even 300, so I'd need to do some ABX testing, but it's scary good. Tearing is not an issue here. I suspect this would sound crazy anywhere other than BlurBusters, but I still want a higher framerate; I can distinguish individual frames during fast mouse movements.
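To put rough numbers on that (a quick back-of-the-envelope sketch in Python; the 800 px sweep in 0.1 s is just an assumed flick speed, not a measurement):

Code:
# Spacing between successive frame images during a fast mouse flick.
# On an impulse-type display (CRT) each frame flashes as a discrete image,
# so if the cursor moves more than a few pixels between frames you can
# still pick out the individual positions.

sweep_px = 800        # distance covered by the flick (assumed)
sweep_time_s = 0.1    # duration of the flick (assumed)
speed = sweep_px / sweep_time_s           # 8000 px/s

for fps in (60, 160, 300, 500, 1000):
    step = speed / fps                    # gap between successive frame positions
    print(f"{fps:>4} fps -> {step:6.1f} px between frames")

Even at 1000 fps the cursor still jumps about 8 px per frame for a flick that fast, which fits with still being able to pick out individual frames.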
One last provocation: how long will we keep living with lag?

More at http://www.blurbusters.com/motion-tests/tools/ including a distance/aliasing test.
What refresh rate do I need? [Analysis]
Cooked Response Times
 
I am not sure I'd define it as "FUD" when a developer is saying that their game is only 30 fps because they need "as much graphics" as possible to make the game look like it does and feel like it does.
 
I am not sure I'd define it as "FUD" when a developer is saying that their game is only 30 fps because they need "as much graphics" as possible to make the game look like it does and feel like it does.
RAD made the claim that 30fps was in and of itself more appropriate for the game's filmic aesthetic. They made it sound not only like it was the best choice for the game, but that all other things being equal, 30fps might still be better than 60fps for their game. Hence the uproar. You get dissent when you claim that 30fps was the correct compromise in general, but RAD's comments included something different and landed on a much larger powder keg.
 
I think the problem with the RAD comment is that it's trying to sugar coat their reason for 30FPS.
Just say the game is 30FPS because you valued aesthetics more than the benefits of 60FPS.
 
Not sure this warrants another thread, as it's effectively the 30 vs 60 fps argument only with another more outlandish option (few displays that consoles are connected to can provide 120 Hz and that's not going to change any time soon).

As for the aesthetic, 30 fps is going to have a different feel, just as per the HFR film examples. Whether one prefers it or not, and whether one's conditioned to it or not, it is true that choppy framerate will give a different feel to the experience. The Order at 120 Hz would feel more gamey and less filmy, even if it plays better as a result.
 
RAD made the claim that 30fps was in and of itself more appropriate for the game's filmic aesthetic. They made it sound not only like it was the best choice for the game, but that all other things being equal, 30fps might still be better than 60fps for their game. Hence the uproar. You get dissent when you claim that 30fps was the correct compromise in general, but RAD's comments included something different and landed on a much larger powder keg.
We have heard this developer explanation countless times. This is pure PR damage control. Every developer would choose 60 fps if they could achieve it.

Whenever a console developer releases a PC version of their 30 fps console game, the PC version always runs at unlocked frame rate (60 fps+ supported), and there's usually not even an option to turn the 30 fps lock on. And this happens even though the developer was vocally claiming that 30 fps is better for their game. Why not allow 30 fps lock on PC if it looks better? I don't understand this.
The Order at 120 Hz would feel more gamey and less filmy, even if it plays better as a result.
120 Hz looks much more like real life than 30 fps (or 60 fps). The debate is "real life" vs "cinema". Nobody is trying to make a "game" looking game (of course there are exceptions, some people are also trying to make "game" looking movies).

I am kind of tired of this discussion. If you want a movie-looking game, just increase the stutter time (more exposure, with longer motion blur). You can make dream-like content at HFR too if you want. But you can't make good-looking (judderless) panning at less than 60 fps (and even that is debatable when stereo 3D is used).
 
I am kind of tired of this discussion.
However, it's never going to end! As Laa-Yosh has described, HFR in movies makes everything more fake because the brain gets enough detail to interpret the content correctly. It's not an issue for games that are trying to match the real-life feel, but if you want that cinematic 'feel', a lower framerate is required to have the amount of perceptual misinformation that leaves the viewer somewhat confused and perturbed, just like watching a movie.

I'm an advocate of higher framerate in games. However, I accept that cinema has a distinct feel that, if one wants to duplicate for whatever reason, one would need to use a lower framerate. I'd be interested to see The Order or other games with 24 fps as an option. It'd be horrific for pure gaming, but it'd be interesting to see how the brain interprets the on screen action and if it decides the content on TV is more realistic.
 
However, it's never going to end! As Laa-Yosh has described, HFR in movies makes everything more fake because the brain gets enough detail to interpret the content correctly. It's not an issue for games that are trying to match the real-life feel, but if you want that cinematic 'feel', a lower framerate is required to have the amount of perceptual misinformation that leaves the viewer somewhat confused and perturbed, just like watching a movie.

I'm an advocate of higher framerate in games. However, I accept that cinema has a distinct feel that, if one wants to duplicate for whatever reason, one would need to use a lower framerate. I'd be interested to see The Order or other games with 24 fps as an option. It'd be horrific for pure gaming, but it'd be interesting to see how the brain interprets the on screen action and if it decides the content on TV is more realistic.

I don't think that would work. It would require the monitor to display at 120 Hz (or any multiple of 24, but 120 is probably the lowest option) or have G-Sync for it to work properly, and I don't think devs can actually count on that. But offering it as an option for people who have such displays, and properly educating whoever tries to enable it, might work.
 
I'm an advocate of higher framerate in games. However, I accept that cinema has a distinct feel that, if one wants to duplicate for whatever reason, one would need to use a lower framerate. I'd be interested to see The Order or other games with 24 fps as an option. It'd be horrific for pure gaming, but it'd be interesting to see how the brain interprets the on screen action and if it decides the content on TV is more realistic.

The Order director kind of hinted that they tested 24 fps and it felt like crap. Of course, we do not know if they got a 24 fps output or if they got a lot of judder.
 
I don't think that would work. It would require the monitor to display at 120 Hz (or any multiple of 24, but 120 is probably the lowest option) or have G-Sync for it to work properly, and I don't think devs can actually count on that. But offering it as an option for people who have such displays, and properly educating whoever tries to enable it, might work.
Many modern TVs support 24 Hz input for movies. I know mine does, and it's pretty entry level and a few years old. 24 Hz movies are a lot smoother than 3:2 pulldown.
 
I don't think that would work. It would require the monitor to display at 120 Hz (or any multiple of 24, but 120 is probably the lowest option) or have G-Sync for it to work properly, and I don't think devs can actually count on that. But offering it as an option for people who have such displays, and properly educating whoever tries to enable it, might work.

24 Hz doesn't always need a 120 Hz TV/monitor. 24 Hz is a standard, so most TVs support it, just like most TVs support 50 Hz. (Mine at least supports 24, 25, 30, 50, and 60 Hz.)
 
Supported doesn't mean it works properly in the way we're talking about. You need to make sure it's actually "displaying" at 24 Hz, and not doing a 3:2 pulldown, which defeats the purpose.

The ability to take the signal doesn't automatically mean it works the way we want it to.



That's exactly why it's going to be hard for the devs to properly implement this. Even the TV specs don't really make it clear if they're doing a 3:2 pulldown. Getting a 120 Hz TV instead of a 60 Hz one is probably the surest way to get it done properly.
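The cadence problem is just arithmetic; here's a small Python illustration (nothing TV-specific, only how 24 divides into the refresh rate):

Code:
# How many refreshes each 24 fps source frame gets held for on a panel
# that can only show whole refreshes. Uneven hold times are what we
# perceive as pulldown judder.

def hold_pattern(source_fps, refresh_hz, n_frames=6):
    pattern, shown = [], 0
    for i in range(1, n_frames + 1):
        total = (i * refresh_hz) // source_fps   # refreshes that should have elapsed so far
        pattern.append(total - shown)
        shown = total
    return pattern

print("24 fps on  60 Hz:", hold_pattern(24, 60))    # [2, 3, 2, 3, 2, 3] -> uneven (3:2 pulldown)
print("24 fps on 120 Hz:", hold_pattern(24, 120))   # [5, 5, 5, 5, 5, 5] -> even
print("24 fps on  72 Hz:", hold_pattern(24, 72))    # [3, 3, 3, 3, 3, 3] -> even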
 
Supported doesn't mean it works properly in the way we're talking about. You need to make sure it's actually "displaying" at 24 Hz, and not doing a 3:2 pulldown, which defeats the purpose.

The ability to take the signal doesn't automatically mean it works the way we want it to.

That's exactly why it's going to be hard for the devs to properly implement this. Even the TV specs don't really make it clear if they're doing a 3:2 pulldown. Getting a 120 Hz TV instead of a 60 Hz one is probably the surest way to get it done properly.
I think any TV listed as supporting 24 Hz actually supports it natively (shows 24 Hz refresh in signal info). Of course devs aren't going to support it just as they won't 50 Hz because you can't be 100% sure users will get the right experience.
 
I think any TV listed as supporting 24 Hz actually supports it natively (shows 24 Hz refresh in signal info). Of course devs aren't going to support it just as they won't 50 Hz because you can't be 100% sure users will get the right experience.

The "native" reality is that modern TVs run at multiples of 24, like 72 Hz or more, to solve the judder.
Now, the FUD was about the 24 Hz, not about the choice of 30 fps to get better lighting.
24 fps was originally chosen because of the cost of film stock and because it was the minimum speed the electric motors could run at while keeping sound sync from being garbage. It only worked because controlled ambient light induces mesopic vision and the relaxed state of roughly 70 ms integration time in the brain.

My problem is with so many specs defined blindly or by gut feeling, and dumb traditions causing nausea and the like. The BBC has shown that nausea is a problem of mismatch between static and motion resolution (TL;DR slides at the end; the intro covers the history of silent films and the infamous 24 fps).

There is a ton of information and experiments out there; so much can be accomplished if one is humble enough to look for it. That would make for greater immersion and sane budgets for assets and production, even in the HD or UHD era.
 
There is a ton of information and experiments out there; so much can be accomplished if one is humble enough to look for it.
Can you please not argue that people are too stupid/lazy if they don't agree with you. I think pretty much everyone on this board knows higher framerates are 'better', and plenty of us know the history behind the choice for 24 fps movies. I'm certainly one of them. That doesn't change the fact that perception is heavily influenced by multiple factors. Truth is, if you want to make a computer game that causes people to ask what film you're watching when they walk in on you playing it, you are likely going to want 24 fps. Smoother framerates don't look like film (because people are accustomed to cinema). That's a legitimate argument for targeting a lower framerate even if it's a rather daft argument for compromising gameplay. That's ultimately the developers' prerogative to target their quality/framerate balance to their preferred aesthetic, and they have to hope that the market agrees with their choices.
 
The "native" reality is that modern TVs run at multiples of 24, like 72 Hz or more, to solve the judder.
Now, the FUD was about the 24 Hz, not about the choice of 30 fps to get better lighting.
24 fps was originally chosen because of the cost of film stock and because it was the minimum speed the electric motors could run at while keeping sound sync from being garbage. It only worked because controlled ambient light induces mesopic vision and the relaxed state of roughly 70 ms integration time in the brain.

Most modern TVs do not run at multiples of 24 Hz; they run at multiples of 30 Hz because that's what TV has standardised on. Recent TV sets feature the ability to run at 120 Hz, allowing 24 Hz content to be displayed without the need for 3:2 pulldown. Most older TVs do a terrible job of 3:2 pulldown, which is why personally the only thing that interested me about 3D TVs was this native 24 Hz support.

You seem oddly dismissive of 24 Hz for film despite then listing the reasons why it works just fine and became the standard for cinema, which is very confusing to me. What works in one medium does not necessarily work for another: 24 Hz is fine for film content, janky for TV, and almost unusable for gaming.
 
Sorry, I'm not trying to make personal attacks. Let me argue objectively.

LaLaland, when 24 Hz cinema mode is engaged, the panel changes from 60 Hz to at least 72 Hz. Yes, there are 120 Hz panels, but real-world pixel response is in the 40 ms to 60 ms range and retinal persistence is a major issue. Examples are in the "Cooked Response Times" link.
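To give a sense of the persistence numbers (a rough sketch; the 1000 px/s tracking speed is just an assumed example value):

Code:
# Perceived smear on a sample-and-hold display: while the eye tracks a
# moving object, each frame stays lit and gets dragged across the retina,
# so blur (in px) is roughly tracking speed (px/s) * persistence (s).

speed = 1000.0   # assumed eye-tracking speed of a moving object, px/s

cases = [("60 Hz sample-and-hold (~16.7 ms)", 16.7),
         ("120 Hz sample-and-hold (~8.3 ms)", 8.3),
         ("slow panel response (~40 ms)", 40.0)]

for label, persistence_ms in cases:
    blur_px = speed * persistence_ms / 1000.0
    print(f"{label:<35} -> ~{blur_px:.0f} px of smear")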

Shifty Geezer, my argument is not trying to shame people. It's based on research from many sources with important results. I posted many links with full documentation to back my argument. My problem is when someone refuses to read and think about an argument, then keeps discussing while refusing to let others contribute to the question at hand.

In the '80s, from a perceptual perspective, 80 Hz was almost standardized, but bandwidth and storage costs kept 50/60 Hz, interlacing, and other artifacts with us until a few years ago.

Charles Poynton tried to establish a worldwide-compatible 72 Hz standard, but agreement was only reached on a worldwide color standard. Gaming would be very different in a 72 Hz world: a heavy-effects game could drop to 24 fps in difficult cut-scenes and not suffer from stutter, though there are hard parameters for this to work seamlessly.
EDIT: Thinking about it, most HDTVs autodetect changes in content, so today a game could switch to a 24 fps @ 72 Hz cinema mode at any time, provided the game OS allows refresh rate changes.

There is open-access research on what exactly the "filmic aesthetic" is, with hard objective parameters. 24 fps is not needed for the look; a controlled shutter angle, motion blur, strobing, and so on can take its place.

With VESA Adaptive-Sync and variable framerate, low-action close-up cut-scenes can render with motion blur at 24 fps while the display samples at 72 Hz. Let's follow the physics (Nyquist): if an object is moving at 960 pixels per second, the framerate has to reach a certain threshold, or motion blur has to compensate, and so on.
Even at fixed rates there are objective measures to be taken in each case, following the velocity maps. In the real world, camera motion blur is free, but in games motion blur needs computation. Such a conundrum: at 60 fps a little motion blur comes for free, so one is trading rendering at 30 fps plus compute resources spent on motion blur against simply doubling the framerate.
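As a minimal sketch of that Nyquist point (assuming the blur has to roughly span the per-frame displacement to hide the stepping):

Code:
# Per-frame displacement of a 960 px/s object at various framerates.
# If that jump exceeds the motion blur applied to the object, the
# stepping/strobing becomes visible, so lower framerates have to spend
# more on blur to present the same motion smoothly.

speed = 960   # px/s, from the example above

for fps in (24, 30, 60, 72, 120):
    step = speed / fps
    print(f"{fps:>3} fps -> {step:5.1f} px jump per frame")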
What cannot continue are irrational or apparently unsubstantiated arguments.

One is free to produce a game that intentionally causes nausea, but the developer has to be transparent about it and responsible for the consequences.
 