AMD demonstrates Freesync, G-sync equivalent?

Interestingly, prior to the release of FreeSync monitors, Scott Wasson at Tech Report was fairly skeptical about FreeSync, but he was willing to give it a fair shake once he could finally test one.


With regard to the under-40 Hz performance of the BenQ monitor? It's maybe barely worse than how Nvidia handles things when it drops below the variable refresh rate cut-off.

The short vlog version of it...

http://techreport.com/news/28116/here-our-discussion-of-freesync-g-sync-and-the-benq-xl2730z

And the in-depth testing with the monitor, as well as asking both Nvidia and AMD how they handle things:

http://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed/3

Still, I made an attempt in several games intensive enough to push our R9 290X below 40 FPS. Far Cry 4 was just a stutter-fest, with obvious system-based bottlenecks, when I cranked up the image quality. Crysis 3, on the other hand, was reasonably playable at around 35 FPS.

In fact, playing it was generally a good experience on the XL2730Z. I've seen low-refresh quantization effects before (by playing games on one of those 30Hz-only 4K monitors), and there was simply no sign of it here. I also had no sense of a transition happening when the frame rate momentarily ranged above 40Hz and then dipped back below it. The experience was seamless and reasonably fluid, even with vsync enabled for "out of bounds" frame intervals, which is how I prefer to play. My sense is that, both in theory and in practice, FreeSync handles real-world gaming situations at lower refresh rates in perfectly acceptable fashion. In fact, my satisfaction with this experience is what led me to push harder to understand everything I've explained above.

So, in other words, you'll only see the "better" way that G-Sync handles this if you use an extremely contrived demo where the framerate is locked to a single static number. In games, where the framerate is going to vary from one frame to the next when below the variable refresh lower bound? It basically offers the same experience as a G-Sync monitor.
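For anyone curious about the mechanics below the cut-off, here's a rough sketch of the frame-repetition idea, with made-up numbers and a hypothetical function rather than anything AMD or Nvidia have published:

```python
# Toy sketch: if a game frame would arrive more slowly than the panel's
# minimum refresh rate allows, scan the same frame out more than once so
# the panel never has to hold an image longer than it physically can.
# PANEL_MIN_HZ / PANEL_MAX_HZ are illustrative values for a 40-144 Hz panel.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def scanout_plan(game_fps):
    """Return (effective_scanout_hz, repeats_per_game_frame)."""
    repeats = 1
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return min(game_fps * repeats, PANEL_MAX_HZ), repeats

print(scanout_plan(35))  # (70, 2): each 35 fps frame is shown twice at 70 Hz
print(scanout_plan(25))  # (50, 2)
print(scanout_plan(15))  # (45, 3)
```

As I recall, this frame-redisplay trick is roughly how Tech Report describes the G-Sync module behaving below its cut-off, while the FreeSync monitors at the time fell back to fixed-refresh behaviour with vsync on or off, as in the quote above.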

And he also talks about the "ghosting" issue. Basically it's a non-issue with the tech itself, and is going to depend purely on the monitor makers. Nvidia, with their module, do the tuning for the monitor manufacturers. Only fair, as the manufacturers are paying a premium to Nvidia for the G-Sync module.

And the anti-blur strobing doesn't work with variable refresh on the BenQ, just like it doesn't work on the Asus G-Sync monitor.

As with the ULMB mode on G-Sync displays, BenQ's blur-reduction mode is not compatible with variable refresh rates, so it's probably more of a curiosity than anything.

Which explains why it is disabled when variable refresh is enabled. As with all things internet, much ado about nothing.

Regards,
SB
 
When did PiP start to become a "normal" feature of PC monitors?

Acer is to release new 34" gaming monitors with a 3440 x 1440 resolution and IPS panel, and as it now seems, two of them. The XR341CK has AMD FreeSync support and the XR341CKA has NVIDIA G-sync support. They come in a silver and black design with a "zero-frame" panel (i.e. an ultra-thin bezel).

The XR341CK (FreeSync version) will offer HDMI 2.0 (MHL), DisplayPort, Mini DP and DP out connections, along with a 4 port USB 3.0 hub. It will support daisy chaining via the DP out port and also PiP and PbP functions.

The XR341CKA (G-sync model) will apparently offer DisplayPort, but also an additional HDMI 1.4 video connection. That would mark the first G-sync screen we've seen with more than one connection, so we will be interested to see how this works. Acer mentioned the screen was G-sync v2, so perhaps they have made adjustments to allow more than one video input. With the G-sync module being used, the XR341CKA will also support ULMB (Ultra Low Motion Blur). The XR341CKA also has a 4 port USB 3.0 hub as with the other model.
 
Well, at least someone is asking the right questions ... :LOL:

They can surely shed some light on this. If Overdrive is not working properly when Free-Sync is on, I'm sure they can tell what the monitor is doing at those times. I'm sure that AMD also knows, but since it's apparently useless to try to get the answer from them, get it from the guys that make the monitors.

....

Yet another edit, as I've gotten a reply from a rep on the BenQ support center site:

"Hi Dante,

Great question! This would indeed be a function of free-sync. As of this moment there is no overdrive functionality built in to the freesync architecture. Much like nVidias G-Sync, we unfortunately have no control over the performance of the scalars while in either G-sync or Free-sync modes. Where G-Sync currently has a solution that is part of G-Sync, there is no such offering for Freesync at this moment, although I imagine there will be in the future.

We, as well as other free sync monitor brands, are working with AMD to see if there is a fix for this issue, but currently, there isn't one, unfortuantely.

however, as soon as we have something to show for it, we will definetely let you know.

Thanks!"

So they do indeed know what is going on; they just built the spec in a way that means they can't do anything about it on their end, apparently.

How did they agree to make monitors like these in the first place? Were they under the assumption that AMD would have Overdrive support ready on their end by launch? But wouldn't you suppose that monitor manufacturers would want a working driver before launch to ensure that that was indeed the case? Otherwise, I'd assume they'd want to point out publicly that Overdrive functionality in FreeSync mode was AMD's responsibility.

So, going by this, and assuming AMD does indeed have control of Overdrive on their end, they have to know the panel characteristics and apply a suitable setting. Why haven't they done this yet?

This is essentially releasing a product before it's finished and not telling people about it. They could at least have said that Overdrive will be added later, just like they said that FreeSync support for Crossfire and Eyefinity would be added later.
 
What a load of BS. Proper overdrive is 100% on the monitor's side. It has nothing to do with anything else.
AMD stated that FreeSync is more than just Adaptive Sync, and that monitors don't automatically deserve the FreeSync badge. In other words: it reaches beyond the cable and goes inside the monitor. You may think it's BS that they are now called out for it, but it's something of their own making, and their own, lower, standards.
 
Is the ghosting issue (lack of overdrive?) on FreeSync really a problem outside of the windmill demo? Will people using it for gaming notice it?
 
AMD stated that FreeSync is more than just Adaptive Sync, and that monitors don't automatically deserve the FreeSync badge. In other words: it reaches beyond the cable and goes inside the monitor. You may think it's BS that they are now called out for it, but it's something of their own making, and their own, lower, standards.
If the FreeSync decal doesn't adhere to the bezel, the monitor is given only probationary approval...
 
AMD stated that FreeSync is more than just Adaptive Sync, and that monitors don't automatically deserve the FreeSync badge. In other words: it reaches beyond the cable and goes inside the monitor. You may think it's BS that they are now called out for it, but it's something of their own making, and their own, lower, standards.

It doesn't matter what X or Y company representative said. Overdrive is part of the monitor electronics, period.
 
Is the ghosting issue (lack of overdrive?) on FreeSync really a problem outside of the windmill demo? Will people using it for gaming notice it?
I don't know. But if it's something that's only exposed by that demo, then what does it say about the understanding of the issue by those who wrote it? ;)

It doesn't matter what X or Y company representative said. Overdrive is part of the monitor electronics, period.
That's nice, dear. I've never claimed otherwise. Overdrive is also something that impacts the visual experience of a demo that's supposed to highlight the very technology it's supposed to promote.
 
That's nice, dear. I've never claimed otherwise. Overdrive is also something that impacts the visual experience of a demo that's supposed to highlight the very technology it's supposed to promote.

You said:

AMD stated that FreeSync is more than just Adaptive Sync, and that monitors don't automatically deserve the FreeSync badge. In other words: it reaches beyond the cable and goes inside the monitor. You may think it's BS that they are now called out for it, but it's something of their own making, and their own, lower, standards.

Both bolded parts are just wrong, "dear". The first, because adaptive sync is part of the DP standard (i.e. the cable and the signal that goes through it), and AMD is doing nothing outside the standard in the cable or in the signal that goes through it. The only thing they can do is frame duplication and similar tricks before the signal enters the cable (i.e. in the video content itself), and that has nothing to do with the DP standard, nor with the adaptive sync part of it.

The second one is just a wild assumption of yours. It doesn't make any sense, because they'd need to have some kind of proprietary protocol to communicate some magic sauce to the monitor.
 
Both bolded parts are just wrong, "dear". The first, because adaptive sync is part of the DP standard (i.e. the cable and the signal that goes through it), and AMD is doing nothing outside the standard in the cable or in the signal that goes through it.
Yes, yes. That's what the Adaptive Sync standard is.

The second one is just a wild assumption of yours. It doesn't make any sense, because they'd need to have some kind of proprietary protocol to communicate some magic sauce to the monitor.
They bless a monitor with a FreeSync label if it satisfies their quality requirements. That doesn't mean the GPU has influence over the internals of the monitor. It doesn't mean that the GPU needs to signal something magic to the monitor. It's a label of quality. And dealing with ghosting is apparently not part of that quality assessment.
 
Because the LG FreeSync one, for example, doesn't seem to have a full-scene frame buffer on board.
Why do you say that?

I would hardly call an 'enhanced premium' chip a "normal" part of monitor design.
If you have HDMI in your monitor, it's likely it has memory (because scaling and de-interlacing are required on progressive-scan monitors). Low-quality scaling and de-interlacing are possible without memory - emphasis on "low quality". I merely linked a document that showed memory is a real thing inside monitors.

Scalers are normally a part of monitors. Those that are memory-based seem to add something like 10-20 ms of latency. The majority of monitors with tested lag results show approximately this value of lag, implying they're memory-based. The reference for latency is usually a CRT.
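As a quick back-of-the-envelope check (my own arithmetic, not figures from the reviews linked here), one full frame of buffering at typical refresh rates lands right in that range:

```python
# One whole buffered frame adds roughly one refresh period of lag.
for hz in (60, 75, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 75 Hz -> 13.3 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
# So 10-20 ms of measured lag is consistent with roughly one frame of
# buffering on a 60-75 Hz panel.
```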

Screens like the Dell P2714H

http://www.tftcentral.co.uk/reviews/dell_p2714h.htm

don't seem to have a framebuffer based scaler (or they have a super-fast framebuffer scaling algorithm).

The G-Sync-based Acer XB270HU obviously does have a framebuffer, but we don't know whether that's used for scaling. With G-Sync off it has very low lag:

http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

But the lag increases, "a few milliseconds at most", when G-Sync is turned on (which could imply that it doesn't use the framebuffer for scaling or it has a super-fast scaling algorithm or that the scaler is only active in non-native resolutions - most monitors have the scaler permanently active).

Tech Report showed that lag is the same on the G-Sync and the Adaptive Sync monitors it compared. That might indicate that the Adaptive Sync monitor has no scaler (very doubtful), or that it has a framebuffer-less scaler, or that it falls back to framebuffer-less scaling when Adaptive Sync is turned on (to reduce lag), or that the scaler is disengaged (again, to reduce lag). Can't tell. If the latter, then it would seem that a whole load of monitor functionality is turned off while in Adaptive Sync mode, presumably by bypassing the components that do things like scaling and response-time compensation. The chip I linked earlier does RTC, so it's likely there are other cases where scaling and RTC sit within a single chip.

Complete guess: BenQ (and the other monitor makers?) chose to do "gaming mode with Adaptive Sync" by bypassing the chip that does framebuffer scaling and RTC (or the bit of the chip that does those things?). If they're all using the same chip with Adaptive Sync support and that's the way that chip works, then whoops.

Apart from that, their thought process seems to be "we can't do variable overdrive RTC, which is the ideal approach for framerates between 40 and 144 fps, so we'll just turn it off; at 144 Hz, RTC isn't that important." Of course, at lower frame rates RTC becomes more important, as ghosting becomes more obvious.
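To be concrete about what "variable overdrive" would mean, here's a toy sketch of the concept. It's my own guess at the idea, with invented numbers and a hypothetical function, not any vendor's or scaler chip's actual algorithm:

```python
# Idea: scale the RTC boost to the time the frame will actually stay on
# screen. A drive level tuned for a 6.9 ms (144 Hz) frame will overshoot
# badly if the panel then sits on that frame for 25 ms (40 Hz).

def overdrive_target(prev_level, target_level, frame_ms,
                     tuned_frame_ms=6.9, drive_factor=0.25):
    """Return the level to actually drive the pixel to for this refresh."""
    # Back off the boost as frame times grow past the tuning point.
    gain = min(1.0, tuned_frame_ms / frame_ms)
    return target_level + gain * drive_factor * (target_level - prev_level)

print(overdrive_target(80, 160, frame_ms=6.9))   # full boost at 144 Hz
print(overdrive_target(80, 160, frame_ms=25.0))  # gentler boost at 40 Hz
```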

It's pretty stupid to turn off RTC just because Adaptive Sync is on. Maybe the chip works this way, so the monitor makers are forced to use this compromise?

And giving yet-to-come products with a projected premium price tag as an example doesn't really help either.
I listed those to simply demonstrate that non-G-Sync monitors have memory.

When did PiP start to become a "normal" feature of PC monitors?
Some time after 2007?

http://www.hdtvsolutions.com/BenQ_FP241WZ_LCD_Monitor_Review.htm

And I still don't see some kind of self refresh functionality in the features list.
Neither do I. I merely suggested that since memory is frequently a part of monitors, it is possible ;)
 
You're giving the FreeSync label a bit too much weight. It is probably simply a label given if the monitor has already been tested with AMD "FreeSync" and its adaptive sync works properly, hence the label. It's up to the monitor manufacturer to implement whatever technique to reduce ghosting. Maybe in the next BenQ FreeSync monitor, they will add a feature called SyncOverdrive and charge a premium for it.
 
You're giving the FreeSync label a bit too much weight. It is probably simply a label given if the monitor has already been tested with AMD "FreeSync" and its adaptive sync works properly, hence the label. It's up to the monitor manufacturer to implement whatever technique to reduce ghosting.

The responsibility lies with whatever is causing the issue, and in this case it's FreeSync. Without FreeSync, or with it disabled, there is no ghosting problem.

"In addition, ghosting on the screen disappears immediately after the FreeSync function is blocked and the monitor will be able to operate with a fixed refresh rate."
https://translate.google.com/transl...sl=pl&tl=en&u=http://pclab.pl/art62755-4.html

Maybe in the next BenQ freesync monitor, they will add a feature called SyncOverdrive and charge a premium for it.

In this case it is no different from G-Sync and no longer a free, open VESA standard, but a proprietary one targeted at supporting one graphics manufacturer. The monitor manufacturers have no obligation to support broken standards that are not "plug and play".
 
It doesn't matter what X or Y company representative said. Overdrive is part of the monitor electronics, period.

It's depressing that on a graphics forum I have to link this, since people can't do the basic research themselves, but there it is:

http://www.tftcentral.co.uk/advancedcontent.htm

Response time compensation is an analogue conditioning of the signal fed to the LCD matrix.
AMD seems to disagree:

AMD said:
What is Project FreeSync?
Posted by rhallock in AMD Gaming on May 29, 2014 5:03:00 PM

...

Project FreeSync is AMD's name for the complete solution: a compatible AMD Radeon™ graphics card, an enabled AMD Catalyst™ graphics driver, and an Adaptive-Sync-aware display. Together, these three pieces will abolish tearing, eliminate stuttering, and greatly reduce input latency.

http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/05/29/what-is-project-freesync

Why do you say that?
Said series, at least, only does bob de-interlacing. So basically just scaling.

Mere existence does not automatically make something normal to me.

Neither do I. I merely suggested that since memory is frequently a part of monitors, it is possible ;)

Having enough memory to store a few lines and having enough memory to store at least a couple of frames (plus all the necessary complexity required for G-Sync-like capabilities) makes a lot of difference for a business where economies of scale are an important aspect. I don't think companies like Novatek, Realtek or MStar will bother designing and producing some scaler whose potential user base is limited to a small portion of 290/290X owners.
 
I have no idea what you're talking about :???:
In the midst of the opposing reactions to the manufacturer's response, which stated that (VRR) overdrive functionality is part of the FreeSync architecture, I took your post as an argument that this is not the case.
 
I don't think companies like Novatek, Realtek or MStar will bother designing and producing some scaler whose potential user base is limited to a small portion of 290/290X owners.

They are listed as doing scalers for DP 1.2a, DP 1.3, etc., and therefore adaptive sync. Whatever the number of AMD GPU users is, that's not the question; basically, in 2016 every monitor should support adaptive sync in one way or another, if it supports the DisplayPort standard.
 