AMD demonstrates Freesync, G-sync equivalent?

How can it work "just fine" when it's riddled with screen flashes?

The official press release from AMD listed monitors that only supported 30Hz as the lower limit, the reason being that this is the limit of current display tech.

If the display itself isn't capable of displaying images below 30Hz without flickering, then it will communicate 30Hz as the lower bound to the GPU, which won't go lower.
If the display is capable of displaying lower frequencies without flickering, then it will communicate a lower value than 30Hz as the lower bound, and the GPU will use this instead.

In other words, FreeSync itself works all the way down to 9Hz, but it's up to display manufacturers to make sure that their monitors specify to the GPU what frequencies they can handle.
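
A minimal sketch of how that negotiation might look on the GPU side, assuming hypothetical names and a panel-reported range (real driver logic is obviously more involved):

```python
# Hypothetical sketch: the GPU clamps its refresh interval to the range the
# display reports (e.g. via its EDID), so it never drives the panel below
# whatever lower bound the panel itself advertises.

def pick_refresh_interval_ms(frame_time_ms, panel_min_hz=30, panel_max_hz=144):
    """Return the vblank interval to use for the next frame."""
    min_interval = 1000.0 / panel_max_hz   # fastest the panel allows
    max_interval = 1000.0 / panel_min_hz   # slowest the panel allows
    return max(min_interval, min(frame_time_ms, max_interval))

# A 9Hz-capable panel (111 ms frames) would simply report a lower bound of 9:
print(pick_refresh_interval_ms(111.0, panel_min_hz=30))  # clamped to 33.3 ms (30 Hz)
print(pick_refresh_interval_ms(111.0, panel_min_hz=9))   # 111.0 ms honoured
```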
 
If the display itself isn't capable of displaying images below 30Hz without flickering, then it will communicate 30Hz as the lower bound to the GPU, which won't go lower.
If the display is capable of displaying lower frequencies without flickering, then it will communicate a lower value than 30Hz as the lower bound, and the GPU will use this instead.

In other words, FreeSync itself works all the way down to 9Hz, but it's up to display manufacturers to make sure that their monitors specify to the GPU what frequencies they can handle.
You know you are arguing semantics now, right? This is just a mere software switch; the monitor has to do all the work. And since there are no monitors that can do sub-30Hz right now (see Jawed's link, and the AMD press release), FreeSync can't do it either. It's worthless to market that a tech can do something theoretical when the actual hardware can't do it yet. You don't see companies announcing GPUs that can operate at 2GHz (because hey, we could do it, we just need a better process, better cooling solutions and better architectures)!
 
How can it work "just fine" when it's riddled with screen flashes?

The official press release from AMD listed monitors that only supported 30Hz as the lower limit, the reason being that this is the limit of current display tech.

The current limit of FreeSync is 9Hz. That said, scaler manufacturers have chosen 24Hz as the minimum for now, while Nvidia G-Sync is set to go "offline" at 30Hz (33.3ms), so I'm not sure what you're implying with that.

I think you're confusing software, scalers and hardware limits somewhere. AMD clearly stated six months ago that the first iteration would not use 9Hz as the minimum, but that scaler producers would start at 24Hz.

To be clear, it's not AMD hardware or FreeSync that can't do it, but the scaler manufacturers who have preferred to use 24Hz as the lower limit (a technical decision?).

Technically, if 33.3ms / 30Hz is a limit of the panel, FreeSync will use that limit; if the panel can go under it, FreeSync is able to follow. AMD is not Samsung or LG, nor MVPA (I think that covers 90% of the LCD panels on the market); they will not create your monitor's panel. They only create the software and the part on the GPU side.

I don't know where you're going with this, graham. You're basing your question on panel / monitor manufacturers who have made their own choices, based on the technology they want to use. AMD doesn't control their choices, their choice of technology, or the panel they want to use (I want a Super AMOLED 4K on a 34").

9Hz can only be used for low-rate image display, to reduce power consumption on a static image or for light desktop operation. Do you want to play a game at 9Hz? Display a movie at 9Hz? With what, 4000Hz interpolation, like is possible with the Omega driver?

(To be clearer, it's not FreeSync that can go as low as 9Hz, but the VESA standard Adaptive-Sync; FreeSync just happens to use it, since it's part of the Adaptive-Sync spec. Nvidia can use it, Intel can use it, Qualcomm can use it, ARM can use it, my mom can use it if she wanted.)
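
For reference, the refresh rates quoted in this thread convert to frame intervals with nothing more than basic arithmetic (interval = 1000 ms / rate):

```python
# Frame interval for the figures mentioned in this thread.
for hz in (9, 24, 30, 40, 120, 144):
    print(f"{hz:>3} Hz -> {1000.0 / hz:6.1f} ms per frame")
# 9 Hz -> 111.1 ms, 24 Hz -> 41.7 ms, 30 Hz -> 33.3 ms,
# 40 Hz -> 25.0 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```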
 
You know you are arguing semantics now, right? This is just a mere software switch; the monitor has to do all the work. And since there are no monitors that can do sub-30Hz right now (see Jawed's link, and the AMD press release), FreeSync can't do it either. It's worthless to market that a tech can do something theoretical when the actual hardware can't do it yet. You don't see companies announcing GPUs that can operate at 2GHz (because hey, we could do it, we just need a better process, better cooling solutions and better architectures)!

Err, no, that's not the same thing. And as far as I know the 30Hz issue is related to the way liquid crystals work, but I don't see why it should be a problem for, say, OLED displays. FreeSync GPUs can output a 9Hz signal, and they will do so correctly if you just ask them to. No GPU can run at 2GHz.
 
And as far as I know the 30Hz issue is related to the way liquid crystals work, but I don't see why it should be a problem for, say, OLED displays.
Can you explain why this is the case?

I thought the liquid crystals were constantly exposed to a voltage maintained by the active matrix elements, thus any fading/relaxing of the liquid crystals would be a result of the circuitry not maintaining the desired voltage (a result of capacitors losing charge?). Hence the need for refreshing, like with DRAM, even when the image doesn't change. As OLED displays use similar active matrix circuitry, wouldn't they have the same problem?
 
Can you explain why this is the case?

I thought the liquid crystals were constantly exposed to a voltage maintained by the active matrix elements, thus any fading/relaxing of the liquid crystals would be a result of the circuitry not maintaining the desired voltage (a result of capacitors losing charge?). Hence the need for refreshing, like with DRAM, even when the image doesn't change. As OLED displays use similar active matrix circuitry, wouldn't they have the same problem?

My (possibly completely erroneous) understanding was that liquid crystals are controlled by pulses, whereas I'd expect continuous current for OLEDs. But if it's really just a matter of inadequate circuitry, then display manufacturers should be able to come up with designs that avoid this flaw without too much trouble, provided they feel enough incentive to do so.
 
Can you explain why this is the case?

I thought the liquid crystals were constantly exposed to a voltage maintained by the active matrix elements, thus any fading/relaxing of the liquid crystals would be a result of the circuitry not maintaining the desired voltage (a result of capacitors losing charge?). Hence the need for refreshing, like with DRAM, even when the image doesn't change. As OLED displays use similar active matrix circuitry, wouldn't they have the same problem?

The LCD pixels need to be constantly refreshed or they will lose charge and relax, causing them to fade to white. It seems to become noticeable when you start refreshing at around 30Hz or lower. The lower the response time of the LCD, the higher your minimum refresh has to be to prevent this from happening. Another reason why we need IPS G-Sync panels...

Think I posted this on the last page but it should answer your questions.
http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering

NVIDIA said:
"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.

This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.

When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."
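
To put rough numbers on that relaxation effect, here is a toy model that assumes (purely for illustration) pixel brightness decays exponentially toward its relaxed value with a made-up time constant; only the trend matters, not the exact figures:

```python
import math

# Toy model: after a refresh, brightness relaxes exponentially toward its
# "faded" value. TAU_MS is a made-up time constant chosen only to show why
# longer refresh intervals mean more visible droop.
TAU_MS = 2000.0  # hypothetical relaxation time constant

def average_droop_percent(refresh_hz, tau_ms=TAU_MS):
    t = 1000.0 / refresh_hz                                   # time between refreshes
    mean_decay = 1.0 - (tau_ms / t) * (1.0 - math.exp(-t / tau_ms))
    return 100.0 * mean_decay                                 # average droop over one interval

for hz in (144, 60, 30, 9):
    print(f"{hz:>3} Hz: ~{average_droop_percent(hz):.2f}% average droop")
# The droop grows as the refresh rate falls, which is why the brightness
# variation only becomes noticeable at very low refresh rates.
```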
 
The LCD pixels need to be constantly refreshed or they will lose charge and relax, causing them to fade to white. It seems to become noticeable when you start refreshing at around 30Hz or lower. The lower the response time of the LCD, the higher your minimum refresh has to be to prevent this from happening. Another reason why we need IPS G-Sync panels...

Think I posted this on the last page but it should answer your questions.
http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering
That doesn't really answer my question, which was in the context of Alexko saying that OLEDs shouldn't have the same problem. But if the issue is that capacitors in the active matrix backplane lose charge and need to be refreshed, then OLEDs fade, too (to black in this case, which might make it easier on the eyes but it would still be noticeable).

Just as a side note, years ago I came across a mobile SoC devkit with a transflective LCD panel which, when you cut the power, held the image for several seconds before completely fading to white.
 
Anyway, it seems the Asus MG279Q 120Hz should be able to work with FreeSync.

It uses DP 1.2a+ with Adaptive-Sync, so it should basically be compatible with it.

Quote:
it is the first display that publicly supports Adaptive Sync and DP 1.2a+ but does not have an affiliation with either branded variable refresh rate technology. As it turns out though, that isn't bad news.
Quote:
The monitor supports DP 1.2a+ and Adaptive Sync which leads us to...


...the fact that this monitor will work with AMD Radeon graphics cards and operate at a variable refresh rate. After talking with AMD's Robert Hallock at the show, he confirmed that AMD will not have a whitelist/blacklist policy for FreeSync displays and that as long as a monitor adheres to the standards of DP 1.2a+ then they will operate in the variable refresh rate window as defined by the display's EDID.

So, as described by the ASUS reps on hand, this panel will have a minimum refresh of around 40 Hz and a maximum of 120 Hz, leaving a sizeable window for variable refresh to work its magic.
http://www.pcper.com/news/Displays/C...efresh-Monitor
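
Since the variable refresh window comes from the display's EDID, here is a rough sketch of pulling the supported vertical rate range out of an EDID base block; the offsets follow my understanding of the Display Range Limits descriptor (tag 0xFD), so treat it as an assumption rather than a reference parser:

```python
def vertical_rate_range(edid: bytes):
    """Best-effort read of min/max vertical refresh from a 128-byte EDID base
    block, via the Display Range Limits descriptor (tag 0xFD). Offsets are my
    assumption of the EDID 1.3/1.4 layout, not a verified parser."""
    for start in (54, 72, 90, 108):              # the four 18-byte descriptors
        d = edid[start:start + 18]
        if d[0] == 0 and d[1] == 0 and d[3] == 0xFD:
            return d[5], d[6]                    # min Hz, max Hz
    return None

# For a panel like the MG279Q this might come back as (40, 120), i.e. the
# window the driver would then use for variable refresh.
```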
 
Anyway, it seems the Asus MG279Q 120Hz should be able to work with FreeSync.

It uses DP 1.2a+ with Adaptive-Sync, so it should basically be compatible with it.


http://www.pcper.com/news/Displays/C...efresh-Monitor

Good to hear that it'll work exactly how I expected. Now hopefully Nvidia gets off their arses and supports VESA Adaptive-Sync in the future, or I may have to go back to AMD GPUs sooner than I planned (I prefer not to buy a new GPU more often than every 2-3 years). It'll be a cold day in Hell before I buy a G-Sync monitor.

Regards,
SB
 
That doesn't really answer my question, which was in the context of Alexko saying that OLEDs shouldn't have the same problem. But if the issue is that capacitors in the active matrix backplane lose charge and need to be refreshed, then OLEDs fade, too (to black in this case, which might make it easier on the eyes but it would still be noticeable).

Maybe OLED has a slower decay time? I don't really know, but in the case of gsync/freesync super fast response times are not such a great thing because of this issue. Another reason we need some god damn IPS variable refresh displays.
 
http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering

Well, Nvidia seems to have tried to correct something related to the 33.3ms limit, but it seems to have gone a bit wild... brightness flashing...

That said, it won't necessarily occur in all games / all situations.

I hope FreeSync won't suffer from the same problem, because of course you can read how good G-Sync is in the forums, but if you go to the ROG Swift forum or other display forums, you will quickly be discouraged from buying a monitor that uses this technology right now.

Of course we all know the problems Asus had getting this monitor out and everything, but I hope we won't discover more issues from other brands with FreeSync.
 
IPS generally has a slower decay time, so it wouldn't be a huge issue like it is on shitty 1ms TN panels.
I'm not convinced that this is the case: the transition speed and the decay time are not necessarily related.

Transition time is related to how fast the liquid crystal can respond to a change in applied voltage. It's limited by mechanical characteristics.

Fade is related to the charge leakage of the capacitors that hold the pixel voltage behind the TFT. So it's an electrical characteristic.

A TN and an IPS panel can have very similar charge leakage characteristics despite having a totally different LC fluid. So their fade would be similar as well.
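
A tiny illustration of that independence, with entirely made-up numbers: the LC response time sets how fast a commanded transition completes, while a separate leakage time constant sets how much the held charge sags between refreshes, so swapping the panel type only changes the first figure.

```python
import math

# Made-up figures purely to illustrate that the two effects are independent.
panels = {
    "fast TN":  {"response_ms": 1.0, "leak_tau_ms": 2000.0},
    "slow IPS": {"response_ms": 5.0, "leak_tau_ms": 2000.0},  # same leakage
}

refresh_interval_ms = 33.3  # 30 Hz

for name, p in panels.items():
    # Charge sag on the storage cap between two refreshes (simple RC decay).
    sag = 1.0 - math.exp(-refresh_interval_ms / p["leak_tau_ms"])
    print(f"{name}: response {p['response_ms']} ms, "
          f"sag over one 30 Hz interval ~{100 * sag:.1f}%")
# Both panels sag by the same ~1.7% here, despite very different response times.
```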
 
I'm curious whether Twitch streaming and H.264 recording support variable frame rates as well. Because if not, I'm not sure how well the world will resolve this discrepancy: you play and all is nice, then you watch your own playback and it goes to hell? Maybe it revives animated GIFs with per-frame latencies, LOL.
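
For what it's worth, containers like MP4/MKV can carry a per-frame presentation timestamp, so one way to reconcile a variable-rate capture with a fixed-rate stream is to resample it, duplicating or dropping frames against a constant clock. A rough sketch of that idea (hypothetical names, not any particular encoder's API):

```python
def resample_to_constant_rate(frames, timestamps_ms, out_fps=60.0):
    """Map variable-rate captured frames onto a fixed output rate by holding
    the most recent source frame at each output tick (frames may end up
    duplicated or dropped). `frames` and `timestamps_ms` are parallel lists."""
    out, step = [], 1000.0 / out_fps
    src = 0
    t = timestamps_ms[0]
    while t <= timestamps_ms[-1]:
        while src + 1 < len(timestamps_ms) and timestamps_ms[src + 1] <= t:
            src += 1
        out.append(frames[src])
        t += step
    return out

# Frames captured at wildly varying intervals come out as a steady 60 fps
# sequence, at the cost of judder wherever the source rate dipped.
```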
 