Why did it take so long for variable refresh rate tech to appear?

green.pixel

Veteran
G-Sync and FreeSync: why weren't they available much earlier, like a decade-plus ago?

Were there genuine obstacles related to R&D, or is it just a classic case of milking the market dry?
 
The changes necessary were relatively trivial and within reach, tech-wise, but they involved many small tweaks across the data chain from GPU to LCD. The image passes through many small coders, decoders, buffers, and signal processors along the way. All were theoretically capable of handling it, or could easily have been made to be. Nobody took the time to do so because there was no perceived value in it, until somebody changed their mind.
John Carmack, for example, has said he tried to push that idea around 2006 without much success; he mentioned it in some interview or QuakeCon talk.
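To show how small the conceptual change is: a fixed-refresh display scans out on a rigid clock, while a VRR display simply stretches the vertical blanking interval until the next frame is ready, within the panel's limits. A minimal timing sketch, with invented interval bounds and frame times (not any real panel's spec):

```python
import random

MIN_INTERVAL = 1 / 240   # shortest frame the panel accepts (assumed)
MAX_INTERVAL = 1 / 48    # longest the panel can hold a frame (assumed)

def fixed_refresh(frame_ready_times, interval=1 / 60):
    """Classic scanout: a frame is shown on the next rigid clock tick."""
    scanouts, t = [], 0.0
    for ready in frame_ready_times:
        while t < ready:     # frame missed the tick: wait a whole period
            t += interval
        scanouts.append(t)
        t += interval
    return scanouts

def variable_refresh(frame_ready_times):
    """VRR scanout: vblank is stretched until the frame arrives.
    Simplified; a real panel would repeat the old frame past MAX_INTERVAL."""
    scanouts, t = [], 0.0
    for ready in frame_ready_times:
        t = min(max(ready, t + MIN_INTERVAL), t + MAX_INTERVAL)
        scanouts.append(t)
    return scanouts

# irregular render times, as a game would produce
random.seed(0)
ready, t = [], 0.0
for _ in range(8):
    t += random.uniform(0.010, 0.022)
    ready.append(t)

print("fixed:", [f"{s*1000:.1f}ms" for s in fixed_refresh(ready)])
print("vrr  :", [f"{s*1000:.1f}ms" for s in variable_refresh(ready)])
```

With fixed refresh every frame is quantized to the 16.7 ms grid (hence judder or tearing); with VRR the scanout times track the render times directly.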
 
As far as display technology goes, variable refresh rate is still a fudge rather than proper "tech": you get time-variant luminance levels on sample-and-hold displays, for example. Where is the extended bit depth to compensate for that?
 
If you have a time-variant variable, no amount of additional bits is going to compensate for it.
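A toy model of why, assuming pixel luminance decays exponentially between refreshes (the time constant is invented for illustration): the average brightness over a frame depends on how long the frame is held, so the same pixel code yields different perceived brightness at different frame durations, and no static code-to-brightness mapping, however many bits deep, can correct a value that moves with the hold time.

```python
import math

TAU = 0.030  # invented decay time constant of pixel luminance, seconds

def avg_luminance(hold_time, peak=1.0):
    """Mean luminance over one frame if the pixel decays as peak*exp(-t/TAU)."""
    return peak * (TAU / hold_time) * (1 - math.exp(-hold_time / TAU))

# same pixel code, different frame durations -> different average brightness
for fps in (144, 100, 60, 48):
    T = 1 / fps
    print(f"{fps:3d} fps: hold {T*1000:5.1f} ms -> avg luminance {avg_luminance(T):.3f}")
```

This prints roughly 0.89 at 144 fps down to 0.72 at 48 fps for the same code, and the correct compensation factor depends on a hold time that isn't known until the next frame actually arrives.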
 
Obvious "solution" of taking the brightness voltage going into the LEDs and insert a precision multiplying DAC inbetween -can' t reliably work because pixels aren't refreshed at once and there is overlap between frames.

"Digital displays" with binary grascale can probably do it though (with MDAC), it's another question you need 240hz for acceptable hold times, and there is no space for meaningful binary grayscale in such little time, also an enormous data rate.
 