Are higher framerates likely to proliferate?

Everyone is impacted by lowered frame rates, whether they're aware of it or not. Lower frame rates mean increased input lag. Last gen most people could tell that COD felt good to play, even if they weren't aware the difference was it running at 60Hz while most other console titles ran at 30Hz. They also were not aware it was sub-HD. I'd guess most people would not know whether games were 1800p or 1440p vs 2160p if Digital Foundry didn't tell them. Frame rate stability will always be the best option.
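For a sense of scale, here's the generic frame-time arithmetic (just a back-of-the-envelope sketch, not measurements of any particular game):

```python
# Generic frame-time arithmetic: lower frame rates stretch every stage of the
# input-to-photon chain that is quantised to frames.
for hz in (30, 60, 120):
    frame_ms = 1000.0 / hz
    # Assumed simplification: input sampled once per frame, result shown a frame later,
    # so the pipeline alone contributes roughly two frames of latency.
    print(f"{hz} Hz: {frame_ms:.1f} ms per frame, ~{2 * frame_ms:.1f} ms pipeline latency")
# 30 Hz: 33.3 ms per frame, ~66.7 ms pipeline latency
# 60 Hz: 16.7 ms per frame, ~33.3 ms pipeline latency
```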
Well, it can be split between those sensitive to input lag/stutter and those in Silent_Buddha's situation, with frame variation affecting perceived positional alignment, but yeah, both come back to not sustaining a persistent framerate, although the input lag/stutter side benefits from VRR more easily.
But which modern games meant to be locked at 30fps do not drop, say, 1-5fps (meaning a close-to-persistent 60fps is not possible even for those that can hold 30fps, such as Destiny 2), or, more relevant to this thread, hold 60fps at 4K without notable compromises?
That is ignoring games with larger spikes, and there are quite a few of those.
Eurogamer's Digital Foundry has shown that only a rare few AAA games can hold their frame-pacing correctly through most of the game, which brings input lag back in, and many are not capable of 60fps with 4K-related scaling.
I appreciate there are games that do well, but I'm thinking about the mainstream popular games played on consoles, which is exacerbated by the fact that the PS4 Pro and Xbox One X are not equal.
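For reference, Digital Foundry-style frame-pacing verdicts boil down to frame-time analysis along these lines; a minimal sketch, with the drop threshold and capture format just assumed for illustration:

```python
# Minimal sketch of frame-pacing analysis from a list of frame timestamps
# (seconds). A 60 fps target means a 16.67 ms budget; frames that take
# roughly two refresh intervals or more count as drops/stutter.
def pacing_report(timestamps, target_fps=60, slack=1.5):
    budget = 1.0 / target_fps
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    dropped = sum(1 for d in deltas if d > slack * budget)
    avg_fps = len(deltas) / (timestamps[-1] - timestamps[0])
    return avg_fps, dropped

# Example with one long frame in an otherwise steady 60 fps capture:
ts = [i / 60 for i in range(120)]
ts[50:] = [t + 1 / 60 for t in ts[50:]]   # frame 49 takes ~33 ms instead of ~17 ms
print(pacing_report(ts))                  # (59.5, 1): ~59.5 fps average, 1 dropped frame
```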
 
Then there's black frame insertion, or backlight strobing, but you're probably going to want a game running at least 60fps on a 120Hz screen for those to look good.

Strobing works well only when the source fps matches the rate of backlight strobing. From my own experience, a 60 fps source on a screen which strobes at 120 Hz produces ghosting. Matching fps is the key, but strobing at 60 Hz produces visible flicker; I would say 100 fps is the absolute minimum if you want to use strobing. In my opinion, 120 fps without any strobing is the best option.
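The ghosting is easy to put numbers on: with a 60 fps source and a 120 Hz strobe, each frame gets flashed twice while the eye keeps tracking, so the two flashes land in different places on the retina. A rough sketch, with the panning speed just an assumed example value:

```python
# Why 60 fps content on a 120 Hz strobe ghosts: each source frame is flashed
# twice, half a frame apart, while the eye keeps moving between the flashes.
panning_speed_px_s = 960        # assumed example: object crossing a 1920px screen in 2 s
strobe_hz = 120
source_fps = 60

flashes_per_frame = strobe_hz // source_fps           # 2 flashes of the same image
gap_s = 1.0 / strobe_hz                               # time between the two flashes
double_image_offset_px = panning_speed_px_s * gap_s   # retinal separation of the copies

print(flashes_per_frame, double_image_offset_px)      # 2 flashes, 8.0 px apart -> visible double image
```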
 
Black frame insertion works well only when the source fps matches the rate of backlight strobing. From my own experience, a 60 fps source on a screen which strobes at 120 Hz produces ghosting. Matching fps is the key, but strobing at 60 Hz produces visible flicker; I would say 100 fps is the absolute minimum if you want to use BFI. In my opinion, 120 fps without any strobing is the best option.

BFI doesn't require strobing though. My tv can do BFI, but doesn't strobe. It won't let me turn it on in game mode though, so I've only seen it with movies. It seems to work ok, but I think the VA panel in my tv has response-time ghosting anyway.
 
How do you insert black frames without strobing? That's the very definition of strobing!

Backlight strobing vs leaving the backlight on and just inserting a black frame. Strobing with tvs and monitors tends to refer to turning the backlight on for a fraction of the period of each frame. Black frame insertion leaves the backlight on constantly, but just inserts a black frame between each source frame. They take advantage of the 120Hz modes on tvs to do this.
 
Surely both flicker though, unless the refresh rate is high enough to mitigate it, at which point strobing shouldn't be perceivable. Strobing at 60Hz should be 120 Hz of light/dark alternation, which shouldn't read as flicker.
 
Surely both flicker though, unless the refresh rate is high enough to mitigate it, at which point strobing shouldn't be perceivable. Strobing at 60Hz should be 120 Hz of light/dark alternation, which shouldn't read as flicker.

Yes, they'd both flicker, but most LCD tvs have a noticeable backlight glow even when the screen is entirely black. It wouldn't be an on/off flicker, but a dimming flicker. Perhaps not as hard on the eyes? OLEDs don't have a backlight, and the pixels are self-emitting, so black frame insertion would effectively be like strobing the backlight on an LCD.

My only point in responding to novcze was that black frame insertion doesn't require strobing the backlight. They're two different solutions to the same problem. Technically backlight strobing is a lot better because you can make the visible period of each frame even shorter. Some backlight strobing monitors can turn on the backlight for only a fraction of a millisecond. The tradeoff is massively decreased brightness.
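To put rough numbers on the "shorter visible period" point: eye-tracking blur scales with how long each frame is lit, and so does average brightness, which is exactly the trade-off. A sketch with assumed example values:

```python
# Persistence vs. brightness trade-off: blur width ~ panning speed * on-time,
# average brightness ~ panel brightness * duty cycle. Example numbers are assumed.
panning_speed_px_s = 960                  # assumed: 960 px/s pan
refresh_hz = 120
for on_time_ms in (8.3, 4.0, 1.0):        # near sample-and-hold, ~half-duty BFI, short strobe
    blur_px = panning_speed_px_s * on_time_ms / 1000.0
    duty = on_time_ms / (1000.0 / refresh_hz)
    print(f"on-time {on_time_ms} ms: ~{blur_px:.1f} px blur, {duty:.0%} of full brightness")
# 8.3 ms -> ~8.0 px blur at ~100% brightness; 1.0 ms -> ~1.0 px blur at ~12% brightness
```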
 
My only point in responding to novcze was that black frame insertion doesn't require strobing the backlight. They're two different solutions to the same problem.

Yep, you mean a 60 fps source running on a screen that can do 120 fps, with the source alternating with black frames. I don't have personal experience with this particular technique, but imo it would flicker as well. I edited my original post to make it clear that I'm talking about my experience with strobing and not BFI.
 
I'm not sure how the driving circuits of OLED pixels relate to eye-tracking motion blur.

They don't use BFI on OLED, because depending on viewing conditions even 60Hz with "scan-and-erase" can be flicker-free.

from the same book a few pages down:

"Sony introduced an erase scan line to obtain a secure write operation by turning T4 off slightly earlier than T3. The erase scan can also control the time-averaged pixel brightness and the pixel emission duty by turning T4 on during deselect time. The pixel emission duty control produces a high-quality, fast-moving image."

(also producing the least flicker if the scanning continues until the end of the frame time, starting again at the top without delay)

CRT projectors are/were regularly used at 72, 60, and even 48p (not uncommon). There is only a ~1ms "black" period if it's properly set up. Most people with monitors never bothered to get ~60Hz (fast retrace) right.
 
Is this talking about emulating the scan and decay of a crt?

No, it's talking about current-programmed OLED pixel circuits, where "erase" means that the charge on the capacitor is cleared before setting a new value that is held until the next addressing.

They don't use BFI on OLED, because depending on viewing conditions even 60Hz with "scan-and-erase" can be flicker-free.

Panasonic, Sony and now even LG have BFI on their OLEDs, no?


btw. cool video from slow-mo guys
 
Years ago there was a Sony PVM (professional) OLED series slow-motion video showing this in action. About 20-25% of the lines were updated at once in a continuous manner, so it was about a 1/5th duty cycle. (And of course no leaking or obvious segmentation like a back/edge-lit LCD.)
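That 20-25% figure maps straight onto effective persistence; quick arithmetic under that rolling-scan assumption:

```python
# Rolling scan-and-erase: if ~20-25% of the lines are lit at any instant,
# each line is effectively visible for that fraction of the frame time.
frame_hz = 60
frame_ms = 1000.0 / frame_hz
for lit_fraction in (0.20, 0.25):
    persistence_ms = frame_ms * lit_fraction
    print(f"{lit_fraction:.0%} lit -> ~{persistence_ms:.1f} ms persistence per line")
# 20% -> ~3.3 ms, 25% -> ~4.2 ms, i.e. roughly a 1/5th to 1/4 duty cycle
```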

So if you want 60Hz it's either this stuff or the top CRTs like the Marquee 9500LC (that's still better).
 
In my opinion, UHD on consoles (even on something like a PlayStation 5) is a stupid decision and a waste of resources. 60fps is many times more important. There are many limiting factors, but something like 1620p at 60fps looks significantly better than UHD at 30fps. The quality of the TAA is more important than the resolution.

At 30fps, video games are also not as fun to play.

VRR, IMO, should die and developers should put more development effort into variable resolution with a fixed framerate (preferably minimum 60 FPS).
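For illustration, a minimal dynamic-resolution-controller sketch of what that could look like (the bounds and gain here are made-up values, not any engine's actual implementation):

```python
# Minimal sketch of a dynamic resolution controller holding a fixed frame rate:
# nudge the render resolution each frame so GPU time stays within the budget.
def update_resolution_scale(scale, gpu_ms, target_fps=60,
                            min_scale=0.5, max_scale=1.0, gain=0.1):
    budget_ms = 1000.0 / target_fps
    # If the GPU ran over budget, shrink; if it has headroom, grow back slowly.
    error = (budget_ms - gpu_ms) / budget_ms
    scale += gain * error
    return max(min_scale, min(max_scale, scale))

# Example: a heavy frame (22 ms) pushes the scale down, lighter frames recover it.
scale = 1.0
for gpu_ms in (22.0, 20.0, 15.0, 14.0, 13.0):
    scale = update_resolution_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms} ms -> render scale {scale:.2f}")
```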

Not all scenes are limited by the GPU, especially when there is a lot of AI around.
 
Another way to look at it: if next-gen consoles are 18-20TF monsters then by all means proliferate as you wish, although imagine the possibility of an unrestrained 30fps title with that much power! The truth is a 9-10TF weaksauce box is simply inadequate to provide next-gen graphics with decent resolution at 60fps. It would probably just look twice as good as Doom or CoD at 1800p? That's a pretty depressing prospect personally.
 
Another way to look at it: if next-gen consoles are 18-20TF monsters then by all means proliferate as you wish, although imagine the possibility of an unrestrained 30fps title with that much power! The truth is a 9-10TF weaksauce box is simply inadequate to provide next-gen graphics with decent resolution at 60fps. It would probably just look twice as good as Doom or CoD at 1800p? That's a pretty depressing prospect personally.

I get that, but I'll take the frame rate because the games will play better.
 
This guy comes across poorly, like audio/videophile people. BFI makes the image more engaging, lucid, glossier, glassier ... ok. Sample and hold displays have a "dead look" ... ok.

It removes motion blur caused by rapid eye movement, improving motion clarity. That's it.

BFI itself is a flawed concept in many ways:

1. Going overboard with flicker for a hypothesized "global shutter" effect, when rolling shutter can be compensated for and is probably less objectionable than what they do with VR optics.
2. It's a digital "discrete" concept; scan-and-hold displays use continuous ramp addressing, which is not a really good match.
3. Digital displays like plasma/DLP/mobile OLED using binary grayscale can get away without a "top-down ramp", but usually need many subframes on their own (for grayscale), and won't really go below a 6ms duty cycle.
 