AMD demonstrates FreeSync, G-Sync equivalent?

The ghosting problem is different from the below-minimum-refresh-rate problem... right? I think ghosting isn't a FreeSync problem, but rather a case of the monitor makers' implementation of Adaptive-Sync not being good enough.
Apologies. You are correct that these are two different, currently known problems with FreeSync that need to be resolved. I was hoping they were one issue that could be corrected with the same solution. If I understand the VESA standard correctly, Adaptive-Sync at the monitor level only involves adding a DisplayPort 1.2a port to the monitor (and a few modifications to the scaler) and it's supposed to work, so the monitor makers have done that. At the GPU level, FreeSync is supposed to perform its magic when running games, applications, etc., and that is where the problem lies.

I guess this Adaptive-Sync VESA standard also extends to any output device that uses DisplayPort 1.2a to carry a signal from a GPU. So whatever solution is arrived at will also have to work when I have my LCD TVs, 4K TVs, or any other output device connected to my graphics card. Can a company with AMD's reputation for driver support provide FreeSync driver-level solutions for every output device currently on the market, or any future product, that might be connected over DisplayPort 1.2a? Or does AMD intend to support only the monitors it certifies and no other output devices connected to the GPU's DisplayPort 1.2a, and is it still an open VESA standard at that point?
 
How does a display scale, say, a 720p signal to 1080p? How do TV sets perform frame-rate interpolation? How do TVs do de-interlacing?
 
In order to do that, theoretically you need to read the "value" back from an LCD cell and calculate/deduce the exact original voltage to be re-applied. There are some problems here that I can think of: can you actually read the contents of a panel cell back?

Okay, I read the DRAM wiki, and apparently the contents are read and rewritten to refresh. Wow, I thought maybe one just needed to recharge some capacitor.
 
And that's just to refresh a binary value! In this case, you'd have to maintain an 8- or even 10-bit value stored in a very tiny capacitor.
 
According to the PCPer article, when a game drops below the monitor's minimum refresh rate, at least the BenQ FreeSync monitor stays at 40 Hz instead of going to its max refresh rate. I can't understand why. If a game is running at around 30-39 fps, a 40 Hz refresh would look much worse than a 144 Hz refresh. And if one chooses to turn V-sync on, there would be a lot more "even" divisors to sync to at 144 Hz than at 40 Hz. Or maybe I got this wrong?
 
I heard the theory that faster refresh rates naturally lead to a brighter image. The sharp transition between 40 Hz and 144 Hz would manifest as a marked dimming and brightening, and since frame rates in games often cross above and below the minimum refresh rate, the result would be very distracting.

This theory seems easy to test with a FreeSync monitor: just write a program that switches quickly between 40 and 144 Hz, and make a video.
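
A crude version of that test is easy to sketch on Windows, assuming the monitor exposes fixed 40 Hz and 144 Hz modes at the current resolution. Note that this just issues ordinary mode sets rather than driving the adaptive-sync protocol, so it only approximates what FreeSync would do:

```c
/* Crude brightness test: alternate the desktop refresh rate between
 * 40 Hz and 144 Hz once per second. This is a plain mode set, not an
 * adaptive-sync refresh change, and it assumes the monitor actually
 * offers fixed 40 Hz and 144 Hz modes at the current resolution. */
#include <windows.h>
#include <stdio.h>

static int set_refresh(DWORD hz)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return 0;
    dm.dmDisplayFrequency = hz;
    dm.dmFields = DM_DISPLAYFREQUENCY;
    /* CDS_FULLSCREEN keeps the change temporary (not written to the registry). */
    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}

int main(void)
{
    for (int i = 0; i < 20; i++) {
        DWORD hz = (i % 2) ? 144 : 40;
        if (!set_refresh(hz))
            printf("switch to %lu Hz failed (mode not offered?)\n", (unsigned long)hz);
        Sleep(1000);                   /* hold each rate for one second */
    }
    ChangeDisplaySettings(NULL, 0);    /* restore the default mode */
    return 0;
}
```

Point a camera at the screen while it runs; if the brightness theory holds, the dimming and brightening should be obvious in the video.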
 
There would be no need to wait until you reach 40 Hz. At 72 Hz you could start duplicating frames, making the effective range 72-144 Hz. And since I don't believe switching between 60 and 144 Hz today on a non-adaptive-sync monitor causes any change in brightness, that would make it a non-issue.

In any case, if the panel suffers from brightness changes at different refresh rates, then the monitor itself should be the one to compensate for that.
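
As a minimal sketch of that duplication policy, assuming a hypothetical 144 Hz panel and using half the maximum (72 Hz) as the point where doubling starts, the source side would just pick the smallest repeat count that lifts the rate the panel sees back into the 72-144 Hz band. The numbers here are illustrative, not anything AMD has described:

```c
/* Sketch of the duplication policy described above, for a hypothetical
 * 144 Hz panel: once the source drops below half the maximum (72 Hz),
 * show each frame enough times to push the rate the panel sees back
 * into the 72-144 Hz band. Thresholds are illustrative only. */
#include <stdio.h>

#define PANEL_MAX_HZ    144.0
#define DOUBLE_BELOW_HZ (PANEL_MAX_HZ / 2.0)   /* 72 Hz */

/* How many times to show each source frame. */
static int repeats_for(double source_fps)
{
    int n = 1;
    while (source_fps * n < DOUBLE_BELOW_HZ && source_fps * (n + 1) <= PANEL_MAX_HZ)
        n++;
    return n;
}

int main(void)
{
    const double rates[] = { 144, 100, 72, 60, 45, 37, 25 };
    for (int i = 0; i < (int)(sizeof rates / sizeof rates[0]); i++) {
        int n = repeats_for(rates[i]);
        printf("%6.1f fps -> show each frame %d time(s) -> panel refreshes at %6.1f Hz\n",
               rates[i], n, rates[i] * n);
    }
    return 0;
}
```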
 

If FreeSync could do duplication, why would it be easier at 72 Hz than 40?
 
Because the lower you go, the more problems start to appear: ghosting, lower perceived brightness, etc. (typical LCD issues at low refresh rates). By having the monitor work as close to its maximum refresh rate as possible, you can avoid those problems, unless you already have them at 72 Hz.

When you switch your non-adaptive-sync monitor between 60 and 144 Hz, do you notice a brightness change? I've never seen it. Changing between 72 and 144 Hz should be a non-issue. And if it is an issue, the monitor should compensate for it.
 
I think the smaller the real refresh-rate range (a smaller difference between min and max), the less need there is for hardware to compensate for the refresh-rate difference, and thus the simpler the panel. Of course, it needs the duplication capability (again, can't this be added in the GPU itself?).
 
Exactly, that's why duplication should start as early as 72 Hz, which is the first rate that allows it on a 144 Hz monitor (144 / 2 = 72).

Regarding where it should happen, well, technically it can be done on either the video source or the monitor. My opinion is that it should be on the GPU side.
 
One of the important benefits of variable refresh rate is decreased lag without tearing. If you start frame doubling at higher refresh rates, you've added lag, because you can't interrupt the refresh cycle without causing a tear. For a 144 Hz panel, frame doubling at 72 Hz causes up to half a (72 Hz) frame of extra lag.

As to why not do frame doubling in the GPU: I'm really curious as to why AMD isn't doing so already, if it were that easy.
 
I guess they could, though it would require them to keep two framebuffers when using FreeSync, which would increase VRAM usage (just like V-sync). But of course they are certainly capable of doing that, since you can already run V-sync and FreeSync at the same time.
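
To put a rough number on that VRAM cost, assuming an uncompressed 32-bit colour buffer (real drivers may pad or compress the surface, so treat these as ballpark figures):

```c
/* Ballpark VRAM cost of keeping one extra copy of the framebuffer for
 * frame duplication, assuming an uncompressed 32-bit (4 bytes/pixel)
 * colour buffer. Real drivers may pad or compress the surface. */
#include <stdio.h>

int main(void)
{
    const struct { const char *name; long w, h; } modes[] = {
        { "1080p", 1920, 1080 },
        { "1440p", 2560, 1440 },
        { "4K",    3840, 2160 },
    };
    for (int i = 0; i < 3; i++) {
        double mib = (double)modes[i].w * modes[i].h * 4.0 / (1024.0 * 1024.0);
        printf("%-6s extra framebuffer: %6.1f MiB\n", modes[i].name, mib);
    }
    return 0;
}
```

Even at 4K that is a few tens of megabytes, which supports the point that VRAM is not the obstacle here.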
 
One of the important benefits of variable refresh rate is decreased lag without tearing. If you start frame doubling at higher refresh rates, you've added lag, because you can't interrupt the refresh cycle without causing a tear. For a 144 Hz panel, frame doubling at 72 Hz causes up to half a (72 Hz) frame of extra lag.

That's really the worst-case scenario: you send a frame 13.88 ms after the previous one (72 Hz), and since you have room for a duplicate frame, you send a copy 6.94 ms later to use the full 144 Hz. Now a new, different frame arrives a fraction of a millisecond after you send the duplicate, and you have to wait those 6.94 ms to paint it, or skip the new frame. This is highly unlikely, as it would mean you were rendering at 72 fps and all of a sudden you jump to 144 fps. For the more gradual ups and downs of the real world, the latency should be much lower (I'm thinking 1-2 ms as an absolute maximum, which is not noticeable at all).

Also, if you are in a competitive scenario where that minimal lag would matter, then I guess 72 fps is way lower than where you'd want to be anyway, and anything above that wouldn't have any added lag.

Unless there are other practical issues with the frame duplication, I don't see any good reason not to use it.
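
To put numbers on both cases, here is a back-of-the-envelope calculation, assuming a 144 Hz panel duplicating at a 72 Hz cadence and new frames arriving at uniformly random points in the interval (purely illustrative, not a model of any actual driver):

```c
/* Back-of-the-envelope lag figures for duplicating at 72 Hz on a 144 Hz
 * panel. Worst case: a new frame arrives just as the duplicate starts
 * scanning out and must wait one full 144 Hz refresh. The average
 * assumes arrival times spread uniformly over the 72 Hz interval.
 * Purely illustrative; no real driver behaviour is modelled. */
#include <stdio.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 144.0;   /* one 144 Hz scanout, ~6.94 ms     */
    const double frame_ms   = 1000.0 / 72.0;    /* source frame interval, ~13.88 ms */

    double worst = refresh_ms;                  /* wait out the whole duplicate     */

    /* Only arrivals that land inside the duplicate's scanout window are
     * delayed at all, and on average by half of that window. */
    double p_blocked = refresh_ms / frame_ms;           /* 0.5     */
    double average   = p_blocked * (refresh_ms / 2.0);  /* ~1.7 ms */

    printf("worst-case added lag: %.2f ms\n", worst);
    printf("average added lag   : %.2f ms\n", average);
    return 0;
}
```

The worst case comes out at one 144 Hz refresh (about 6.94 ms) and the uniform-arrival average at roughly 1.7 ms, which lines up with the 1-2 ms estimate above.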
 
According to the PCPer article, when a game drops below the monitor's minimum refresh rate, at least the BenQ FreeSync monitor stays at 40 Hz instead of going to its max refresh rate. I can't understand why. If a game is running at around 30-39 fps, a 40 Hz refresh would look much worse than a 144 Hz refresh. And if one chooses to turn V-sync on, there would be a lot more "even" divisors to sync to at 144 Hz than at 40 Hz. Or maybe I got this wrong?
Unfortunately that flies in the face of what AMD has stated... The AnandTech review stated the opposite of PCPer.
 