The Great Framerate Non-Debate

What are the benefits of that?
It cuts down on sample-and-hold effects. Human eye-tracking is very smooth, but eye-tracking while a screen is static (i.e. the duration of a frame) causes a bit of unwanted motion blur. This is because your eyes are moving and the on-screen image isn't; for 16ms or 33ms periods, your eyes are moving relative to the on-screen imagery. CRTs do not experience this particular effect because they output frames in brief flashes of light with darkness in between, so there's not a long-displayed static incoming light distribution sitting around to be smeared by your eye movement.
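For anyone who wants rough numbers on that, here's a minimal back-of-the-envelope sketch; the 960px/s tracking speed is just an assumption picked for illustration, not a measured figure:

```python
# Sample-and-hold smear: while the eye tracks a moving object, the
# on-screen image stays put for a full frame, so the image slides
# across the retina by roughly (tracking speed) x (frame hold time).

def blur_width_px(speed_px_per_s: float, hold_time_s: float) -> float:
    """Approximate perceived smear, in pixels, on a sample-and-hold display."""
    return speed_px_per_s * hold_time_s

speed = 960.0  # px/s: assumed, e.g. an object crossing a 1920px screen in 2s
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> ~{blur_width_px(speed, 1.0 / fps):.0f} px of smear")
# 30 fps -> ~32 px, 60 fps -> ~16 px, 120 fps -> ~8 px
```

Halving the hold time halves the smear, which is why both higher framerates and CRT-style strobing help.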

it would make the game 50% darker
It'll limit the maximum brightness of your screen, but LCDs are capable of being pretty bright anyway.
 
I already have. Read the thread.
The only thing you've said is that higher frame rates make the experience less cinematic.
No disagreement from me there, unless you also propose that this is how it will continue to be forever, and that what is considered "cinematic" won't change over time.
Audio and photography suggest that the inertia in people's opinions is indeed strong, but not infinite.
One could claim that "cinematic" inertia will be higher because cinema projection systems are replaced at a very slow rate, but the counterargument is that film watching in theaters is a minuscule percentage of total film consumption today.

Yes, this is what I'm talking about. At 60fps it's much easier to spot the non-realistic parts.
This is indeed an argument.
Just like we couldn't (quite) see the fishing line Flash Gordon's spaceship was pulled along on old flickery NTSC B&W TV sets. That does not imply that there is anything wrong with using higher-resolution standards. In fact, it implies that higher-quality standards can bring tangible benefits. It just means that the standards of content production need to follow along with the higher demands, just as they have in both photography and audio, for instance.
 
It cuts down on sample-and-hold effects. Human eye-tracking is very smooth, but eye-tracking while a screen is static (i.e. the duration of a frame) causes a bit of unwanted motion blur. This is because your eyes are moving and the on-screen image isn't; for 16ms or 33ms periods, your eyes are moving relative to the on-screen imagery. CRTs do not experience this particular effect because they output frames in brief flashes of light with darkness in between, so there's not a long-displayed static incoming light distribution sitting around to be smeared by your eye movement.

It'll limit the maximum brightness of your screen, but LCDs are capable of being pretty bright anyway.
Plus 60Hz flicker, which some people find annoying. Light output is sufficient for night viewing in a dim room, but it's not really usable in a bright room on my set (a 2014 Sony W8 with 'Motionflow'). You also need the input fps to match the strobing rate, i.e. 60fps, for it to be truly effective; 30fps will show some double-image effect. Essentially just like a CRT. That's part of the reason I want to build another gaming PC: to ensure a locked 60fps. It is fantastic, though.
 
The only thing you've said is that higher frame rates make the experience less cinematic.
No disagreement from me there, unless you also propose that this is how it will continue to be forever, and that what is considered "cinematic" won't change over time.
Audio and photography suggest that the inertia in people's opinions is indeed strong, but not infinite.
One could claim that "cinematic" inertia will be higher because cinema projection systems are replaced at a very slow rate, but the counterargument is that film watching in theaters is a minuscule percentage of total film consumption today.
I didn't say it was less cinematic; I explained that with a more realistic medium for viewing visual content, any flaws or inconsistencies in that content are far more evident than at a lower framerate, taking you out of the experience. This is not something you get used to; the brain won't suddenly become stupid and start ignoring all the obvious flaws.

This is indeed an argument.
Just like we couldn't (quite) see the fishing line Flash Gordon's spaceship was pulled along on old flickery NTSC B&W TV sets. That does not imply that there is anything wrong with using higher-resolution standards. In fact, it implies that higher-quality standards can bring tangible benefits. It just means that the standards of content production need to follow along with the higher demands, just as they have in both photography and audio, for instance.

Nobody claimed that higher IQ standards are bad per se, only their effect on media going for a photorealistic style that doesn't quite match up with reality, which is most of it. Then again, not everybody prefers photorealistic media, so just upping the fidelity of said content doesn't mean everybody will like it.
 
No, I am saying that you can't read or lack the ability to process data.

That article supports my thesis. The only game listed with sub-100ms lag was Guitar Hero 360.
??????????

Guitar Hero 3 was listed at 3/60ths of a second, which is 50ms. Every other 60fps game on that page is listed at 4/60ths, which is 67ms.

And even if modern games only went down to ~100ms, your original claim that controller and TV together were always at least 100ms would be dubious, given the sequential time spent running game logic and rendering the frame.
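To put that inference in concrete terms, here's a tiny sketch; the frame counts come from the readings above, but the two-frame pipeline figure is purely an assumption for illustration:

```python
# If a 60fps game measures 4/60ths of a second end-to-end, and the game
# itself spends ~2 frames on input sampling, logic and rendering (assumed),
# then the controller and TV can only account for what's left over.

FRAME_S = 1.0 / 60.0

measured_total_s = 4 * FRAME_S   # 4/60ths ~= 67ms, the typical reading above
game_pipeline_s = 2 * FRAME_S    # assumption: ~2 frames of logic + rendering

controller_plus_tv_s = measured_total_s - game_pipeline_s
print(f"controller + TV <= ~{controller_plus_tv_s * 1000:.0f} ms")
# -> ~33 ms, nowhere near an 'always at least 100 ms' floor
```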
 
??????????

Guitar Hero 3 was listed at 3/60ths of a second, which is 50ms. Every other 60fps game on that page is listed at 4/60ths, which is 67ms.

And even if modern games only went down to ~100ms, your original claim that controller and TV together were always at least 100ms would be dubious, given the sequential time spent running game logic and rendering the frame.

Did you miss the part where they deducted 33 ms from every reading?
 
CRTs haven't been in common use for over a decade
This is just blatantly wrong. A decade ago, CRTs were still the market leader in active sales.

Regardless, there are plenty of flat panel displays with sub-33ms input lag, some modern TVs go below 20ms, and a few computer monitors have even measured sub-10ms. So even if we're restricting our search to modern flat panel displays, it's not really that difficult to experience sub-100ms input lag at 60fps.
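As a sanity check, here's a hypothetical end-to-end budget at 60fps; every component figure below is an illustrative assumption rather than a measurement of any particular hardware:

```python
# Hypothetical input-lag budget for a 60fps game on a low-lag panel.
budget_ms = {
    "controller / USB polling": 5,            # assumed
    "game logic + rendering (2 frames)": 33,  # assumed
    "display input lag (good modern TV)": 20, # assumed
}
total_ms = sum(budget_ms.values())
print(f"total: ~{total_ms} ms")  # ~58 ms, comfortably under 100 ms
```

Swap in a sub-10ms monitor and the total drops further still.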
 
Yeah, when we moved into our current office in 2006, we were all using 21" Eizo CRT monitors. I'm not sure how long it took to replace them all, but it could've been something like 1-2 years at most.

Good LCD panels for graphics work took quite a while to become affordable. My main 24" Dell 2407WFP HC is actually quite bad ;) but I'm in modeling, so proper colors and contrast aren't as important. It's a 2007 model, so that's probably when we moved to LCDs.
 
CRTs haven't been in common use for over a decade

A decade?

I had one development machine for work driving three 24-inch Sony Trinitron CRTs up until about 2009.

There were over 150 million CRTs sold in 2004.

Worldwide LCD shipments didn't exceed CRT shipments until 2008.


[Chart: worldwide TV market by technology]
 
BFI on a 60fps game like MGS:GV looks amazing
Well, you need a pretty high framerate for the effect to make much sense. For a 30fps game, you'd have to choose between strobing at a higher frequency than the framerate (which creates ghost images in your vision for the same reason that sample-and-hold creates motion blur) or strobing at 30Hz (the resulting flicker would quite possibly be physically painful to look at).
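To illustrate the first option with rough numbers (same assumed tracking speed as earlier in the thread, purely for illustration):

```python
# Double-image effect when 30fps content is double-strobed at 60Hz:
# the eye keeps moving between the two flashes of the *same* frame,
# so the second flash lands on the retina offset from the first.

speed_px_per_s = 960.0  # assumed eye-tracking speed
strobe_hz = 60.0        # each 30fps frame is flashed twice

ghost_offset_px = speed_px_per_s / strobe_hz
print(f"ghost images ~{ghost_offset_px:.0f} px apart")  # ~16 px
```

Instead of one smooth 32px smear, you get two crisp copies 16px apart, which many people find more distracting.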
 
I'm having a hard time accepting this idea of 30 fps in games being "more cinematic". With the rash of XB1 titles that run at 30 fps and the same PS4 ports that run unlocked between 45-55 fps, I haven't seen any comments that the XB1 titles look more cinematic, nor any calls for a 30 fps vsync option on those PS4 titles motivated by a more cinematic presentation.

Movies with higher framerates and a digital recording method may ruin immersion because you can see the fakeness of the scene. But how much of that is relevant in a game? It's all 3D-rendered polygons with no ability to make you believe you are looking at reality. It's not like game artists are making rocks out of virtual styrofoam because virtual rocks are more expensive and harder to work with, such that 60 fps lets me see the game is using styrofoam props.

I haven't really gone through all the posts. But if higher frame rates do a better job of highlighting visual flaws, wouldn't the discussion be centered on how 60 fps makes aliasing artifacts, low-poly objects, dithering, poor lighting/shading, or low-resolution textures stand out more? You know, aspects of visual flaws more applicable to gaming.
 
To be honest, Tomb Raider 2013 (or should that be... 2014? lol) did look rather strange. The high framerate actually made the game look worse:
the fire effects look strange at ~60fps, and the same goes for the motion and animation. Lara running just feels off.
They should have patched the game to run at 30fps with a good object-based motion blur implementation, as well as patched the color levels, and we would have had an excellent-running next-level game. Instead... we have a sharper-looking, sped-up version of the Xbox game...

The same thing applies to the Dead Space PC version: when firing certain rifles, it's obvious that the gun animations/muzzle flashes/firing rates were optimised for 30 fps. Isaac walking (especially when viewed on a 5:4 monitor) looks weird as well.

In conclusion: if a game is not built for 60 fps, it can most certainly look worse.
 
To be honest, Tomb Raider 2013 (or should that be... 2014? lol) did look rather strange. The high framerate actually made the game look worse:
the fire effects look strange at ~60fps, and the same goes for the motion and animation. Lara running just feels off.
They should have patched the game to run at 30fps with a good object-based motion blur implementation, as well as patched the color levels, and we would have had an excellent-running next-level game. Instead... we have a sharper-looking, sped-up version of the Xbox game...

This is crazy! The best thing about TR was the nice feeling when navigating the environments. Also, what do color levels have to do with the frame rate?
 