Defining framerates and methods of describing framerate fluctuations *spawn

I've wondered about this myself, and I'm not sure what to think. On PC you can actually try it: Crysis 2 (at least on Steam) has a bug that forces you into 1080p/24 (there's a fix, but it shouldn't need one). A shooter at such a low framerate is abysmal (and don't get me started on the console versions of this game...)
 
Also, why not use the other timings? If the game runs at around 54 fps, sync it at 1080p@50 Hz for perfect reproduction. 60 Hz was only chosen for analog TV decades ago (50 Hz for Europe) to synchronize with the flicker of the ambient incandescent lamps on the mains.

Games already change the HDMI sync mode for stereoscopic 3D, so why not include 50 Hz, 25 Hz, and so on?
Smartest would be to use the standard timings from HDMI CEA/EIA-861-F. The full list is too long, so I removed all the SD, stereoscopic and interlaced modes, because many people are confused by fields versus frames. (Stereoscopic modes can be used to show 2D content, but that is a hack.)
VIC  Name      Picture AR  Pixel AR  HxV @ F
4    720p      16:9        1:1       1280x720p  @ 59.94/60 Hz
16   1080p     16:9        1:1       1920x1080p @ 59.94/60 Hz
19   720p50    16:9        1:1       1280x720p  @ 50 Hz
31   1080p50   16:9        1:1       1920x1080p @ 50 Hz
32   1080p24   16:9        1:1       1920x1080p @ 23.98/24 Hz
33   1080p25   16:9        1:1       1920x1080p @ 25 Hz
34   1080p30   16:9        1:1       1920x1080p @ 29.97/30 Hz
41   720p100   16:9        1:1       1280x720p  @ 100 Hz
47   720p120   16:9        1:1       1280x720p  @ 119.88/120 Hz
60   720p24    16:9        1:1       1280x720p  @ 23.98/24 Hz
61   720p25    16:9        1:1       1280x720p  @ 25 Hz
62   720p30    16:9        1:1       1280x720p  @ 29.97/30 Hz
63   1080p120  16:9        1:1       1920x1080p @ 119.88/120 Hz
64   1080p100  16:9        1:1       1920x1080p @ 100 Hz
65   720p24    64:27       4:3       1280x720p  @ 23.98/24 Hz
66   720p25    64:27       4:3       1280x720p  @ 25 Hz
67   720p30    64:27       4:3       1280x720p  @ 29.97/30 Hz
68   720p50    64:27       4:3       1280x720p  @ 50 Hz
69   720p      64:27       4:3       1280x720p  @ 59.94/60 Hz
70   720p100   64:27       4:3       1280x720p  @ 100 Hz
71   720p120   64:27       4:3       1280x720p  @ 119.88/120 Hz
72   1080p24   64:27       4:3       1920x1080p @ 23.98/24 Hz
73   1080p25   64:27       4:3       1920x1080p @ 25 Hz
74   1080p30   64:27       4:3       1920x1080p @ 29.97/30 Hz
75   1080p50   64:27       4:3       1920x1080p @ 50 Hz
76   1080p     64:27       4:3       1920x1080p @ 59.94/60 Hz
77   1080p100  64:27       4:3       1920x1080p @ 100 Hz
78   1080p120  64:27       4:3       1920x1080p @ 119.88/120 Hz
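To make the "sync the game to a standard timing" idea concrete, here is a sketch (the VIC numbers are transcribed from the 16:9 list above; the selection function and its name are my own illustration) that picks the lowest standard refresh rate that is an integer multiple of a game's target fps, so every frame is shown a whole number of times:

```python
# A few 1080p modes from the CEA-861-F table above: (VIC, name, refresh Hz).
MODES = [
    (32, "1080p24", 24),
    (33, "1080p25", 25),
    (34, "1080p30", 30),
    (31, "1080p50", 50),
    (16, "1080p60", 60),
    (64, "1080p100", 100),
    (63, "1080p120", 120),
]

def pick_mode(target_fps):
    """Lowest refresh mode whose rate is an integer multiple of target_fps,
    so every game frame can be flashed a whole number of times; None if no
    standard timing divides evenly."""
    candidates = [m for m in MODES if m[2] % target_fps == 0]
    return min(candidates, key=lambda m: m[2]) if candidates else None

print(pick_mode(25))   # a 25 fps game fits 1080p25 exactly
print(pick_mode(40))   # a 40 fps game needs 120 Hz (3 flashes per frame)
print(pick_mode(54))   # 54 fps divides no standard timing -> None
```

This is why a game averaging ~54 fps, as mentioned earlier, would need to be capped to 50 fps to sync cleanly to a 50 Hz timing.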
 
I've wondered about this myself, and I'm not sure what to think. On PC you can actually try it: Crysis 2 (at least on Steam) has a bug that forces you into 1080p/24 (there's a fix, but it shouldn't need one). A shooter at such a low framerate is abysmal (and don't get me started on the console versions of this game...)

Just to justify this a little: the input signal is 24 Hz, so maybe not all types of games will run well, but in this mode the display flashes at 72 Hz, so there is random noise that changes between flashes, and on your retina the image is never exactly the same. It may even engage the super-resolution mechanism of the eye, which needs motion above roughly 41.5 Hz. For more on this see http://accidentalscientist.com/2014...e-better-at-60fps-and-the-uncanny-valley.html
 
More on the topic at hand: people think that when they go to see The Hobbit in HFR 48 fps, the projector or display runs at 48 Hz. That is not true: it runs at 96 Hz in this case, and the normal 24 fps input runs at 48 Hz.
 
More on the topic at hand: people think that when they go to see The Hobbit in HFR 48 fps, the projector or display runs at 48 Hz. That is not true: it runs at 96 Hz in this case, and the normal 24 fps input runs at 48 Hz.

Could you please expand on this?
 
More on the topic at hand: people think that when they go to see The Hobbit in HFR 48 fps, the projector or display runs at 48 Hz. That is not true: it runs at 96 Hz in this case, and the normal 24 fps input runs at 48 Hz.
That would be required if you were using an active/shuttering approach to 3D. With a polarized approach (using two projectors, each one with output passing through a different lens of the glasses), a lenticular approach (directing the light to align with the viewer's two eyes), or simpler crap like a cross-eyed approach (just show the two images next to each other), you only need your video refresh to equal the display framerate.
 
If true, wouldn't you be getting a duplicate frame for each frame on the projector?
Flickering causes a retention drop on your retina. A true 24 Hz flash flickers like crazy, even in dark cinemas.

HTupolev: double the framerate, or you can go cheap and use 48 Hz for stereoscopic. That would explain why so many people feel bad...
 
My internet failed earlier, so here is some material that covers fps with double (96 Hz) and triple (144 Hz) flash and explains how many times a frame is shown.
We can use some of its terms in the present discussion.

https://hopa.memberclicks.net/assets/documents/Kennel_3D__for_HPA_022306.ppt

"Shot at higher frame rates, new 3D movies will be double-flashed by projectors to remove any hint of flickering. Fans watching a film produced at 48 FPS will see the same frame flashed twice per second, resulting in 96 FPS seen by each eye and 192 FPS overall. Films produced at 60 FPS, and then double-flashed, will result in moviegoers seeing a 3D film at an ultrasmooth 240 FPS."
http://info.christiedigital.com/lp/3d-hfr
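The arithmetic in that quote is easy to check. A minimal sketch (the function name and parameters are my own, not anything from the Christie page):

```python
def flash_rates(input_fps, flash_factor=2, stereo=True):
    """Rate each eye sees, and total light pulses the projector emits.
    flash_factor = how many times each frame is flashed; stereo doubles
    the total because left/right frames alternate."""
    per_eye = input_fps * flash_factor
    total = per_eye * (2 if stereo else 1)
    return per_eye, total

# The Hobbit HFR: 48 fps, double-flashed, 3D
assert flash_rates(48) == (96, 192)
# 60 fps, double-flashed, 3D
assert flash_rates(60) == (120, 240)
# conventional 24 fps cinema, double flash, 2D
assert flash_rates(24, stereo=False) == (48, 48)
```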
 
Oh, you're just referring to the multi-flashing that gets used to defeat flicker?

Does that actually still get done with digital projectors? (Strictly speaking it wouldn't have to for sample-and-hold types, as they wouldn't flicker in the first place.)
 
Even higher rates are planned, with quad-flash projectors. The thing is, the fps of the input is different from what your eye perceives. The fluctuations and artifacts that people fuss about, are they in your opinion dependent on the input, or on what is arriving at your eyes? My point is that at the very least we have to distinguish the cases: input average FPS, instantaneous frame time, and so on.
Also, the HDMI capture technique doesn't "capture" the flash factor of your display. So again, where do we want to measure the artifacts and fluctuations?

Yes, sample-and-hold doesn't flicker, but it brings smearing, blur and so on, which depend on your retina and the positions of your cones. My question is then: let's discuss the objectives first, and then define what is measured and over what time intervals.
 
The fluctuations and artifacts that people fuss about, are they in your opinion dependent on the input, or on what is arriving at your eyes?
Input rate, obviously. Flickering an image over time is clearly something different from changing that image over time, and a 12 fps animation flickering at 120 Hz isn't in any way smoothed in its motion. Cinema animates at 24 fps and looks awful because of it. It looks slightly less awful thanks to flicker reduction, but it still isn't smooth and clear like real life, which faster framerates (input, obviously) are needed for.
 
OK, so the reference is the input. Then maybe reporting frame-time jitter (in ms) gives a simple value that is less confusing than converting it back to some "fps" rate.

Nyquist requires a sample frequency double that of the signal in order to perfectly count the number of cycles in the data. Maybe that can be carried over here? So if we want to "see" 60 fps data, do we have to measure at 120 fps?
 
Nyquist requires a sample frequency double that of the signal in order to perfectly count the number of cycles in the data. Maybe that can be carried over here? So if we want to "see" 60 fps data, do we have to measure at 120 fps?
Be careful; a single frame is not a cycle. For instance, if you want to display a 1 Hz square wave on an LCD panel, your video signal needs to be at least 2 fps (alternating dark and bright frames or some such business).

To capture a 60 fps video signal, you can capture at 60 fps and get everything.
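That point can be shown with a toy simulation (entirely my own construction): a display alternating dark/bright every frame at 60 fps is a 30 Hz square wave. Capturing at 60 fps recovers every frame, while capturing at 30 fps aliases the blink to a constant:

```python
def display_frames(n):
    """A 60 fps signal alternating dark (0) and bright (1) each frame,
    i.e. a 30 Hz square wave -- the fastest cycle 60 fps can carry."""
    return [i % 2 for i in range(n)]

frames = display_frames(60)   # one second of a 60 fps signal
cap_60 = frames[::1]          # sample every frame (60 fps capture)
cap_30 = frames[::2]          # sample every other frame (30 fps capture)

assert cap_60 == frames       # 60 fps capture gets everything
assert set(cap_30) == {0}     # 30 fps capture sees only dark frames: aliasing
```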
 