AMD demonstrates Freesync, G-sync equivalent?

The "G-Sync laptops" don't use the module; PCPer tested this on an Asus laptop with leaked drivers and G-Sync worked (mostly), so there's definitely no module. Supporting the same features on eDP and on DP 1.2a isn't quite the same thing, but after this it's pretty much certain that if/when NV supports Adaptive-Sync, they'll still call it G-Sync.

I can't remember any G-Sync laptops being shown at CES though. There were rumors about "laptop G-Sync" around CES, but that's about all I remember.
 
The first FreeSync-compatible monitors have already been available at a handful of etailers (at least here in the EU); however, to enable FreeSync you are going to need a driver. That driver, with its dynamic refresh rate capabilities, will be released on March 19th.

All AMD Radeon graphics cards in the AMD Radeon HD 7000, HD 8000, R7 or R9 Series will inevitably support Project FreeSync for video playback and power-saving purposes. The AMD Radeon R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.

AMD APUs codenamed "Kaveri," "Kabini," "Temash," "Beema" and "Mullins" also feature the necessary hardware capabilities to enable dynamic refresh rates for video playback, gaming and power-saving purposes. All products must be connected to a display that supports DisplayPort Adaptive-Sync.

It is our current understanding that the software architecture of select games may not be compatible with dynamic refresh rate technology like Project FreeSync. In these instances, users will be able to toggle the activation of FreeSync in the AMD Catalyst driver.

http://www.guru3d.com/news-story/single-gpu-amd-freesync-driver-march-19th.html
 
AMD FreeSync Review With the Acer XG270HU Monitor
Two standards mean differentiation, and that translates into higher prices on these monitors. FreeSync is another sticker on the box, and that will drive the price level up. Also, switching from AMD to Nvidia in the future becomes a hard thing to do if you've invested in a monitor that supports just one of these standards. Therefore it would be my wish to see two things happen: I'd like to see monitors support both, and not just GSYNC or FreeSync, as technically that obviously should not be an issue.
OR, if everybody could forget about proprietary standards and just follow the broad adoption of Adaptive Sync, then we really would not need brand-specific purchases like FreeSync and GSYNC offer. Given that FreeSync really is DP Adaptive Sync, I give AMD the best cards here. But in the end it is your brand preference that will be the decisive factor. The second factor is the dynamic range the monitor can handle: most FreeSync monitors start at 40 Hz, while the GSYNC monitors we have seen go down to 30 Hz. So with FreeSync that kind of blows in the 30 to 40 FPS area where you could really use this tech. Hopefully in the near future we will see better, i.e. lower ranges starting at 30 Hz.

http://www.guru3d.com/articles_pages/amd_freesync_review_with_the_acer_xb270hu_monitor,1.html
 
The 30-40 fps range is a big deal IMO. I'm inclined to support FreeSync over G-Sync, but not if I need to be hitting over 40 fps for it to kick in.
 
It looks like a compromise made in favor of time-to-market and/or broader adoption, allowing monitors whose timing ranges start that far above the minimum the specification was published as supporting. I'm curious whether the next wave of implementations will have any that stretch beyond that range.
 
The 30-40 fps range is a big deal IMO. I'm inclined to support FreeSync over G-Sync, but not if I need to be hitting over 40 fps for it to kick in.

From the linked review above...

FreeSync will work starting as low as 9 Hz, thus 9 frames per second, a great and low value. But most monitors cannot support such a low value; a number of monitors will have dynamic ranges like these: 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz. Meaning if your rendering framerate is outside of that range, FreeSync will not work and thus tearing and stuttering are back. More specifically, if you purchase a FreeSync model with, say, a dynamic 40-240 Hz range, everything below 40 Hz will result in tearing and stuttering (depending on VSYNC preference) as FreeSync gets disabled. And that surely is not why you are purchasing such a monitor, right? In the near future a lot of monitors will settle at 21-144Hz, which I certainly am comfortable with. We did notice on the ACER screen that at low, sub-35 FPS rates screen tearing was back, which was disappointing. From what we learned, the ACER dynamic range starts at 40Hz, and thus so will FreeSync.

It's a limitation of the monitor and not Adaptive Sync (FreeSync). It can go as low as 9 Hz but is then apparently limited to 60 Hz at the top. The author speculates that most monitor manufacturers will eventually opt for the 21-144 Hz option.
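To make that range behaviour concrete, here's a rough sketch of the logic (my own illustration using the example windows from the quote above, not actual driver code):

```python
# Illustration only: what happens to a given rendering framerate under the
# example dynamic-range windows quoted from the review.
EXAMPLE_WINDOWS = {
    "36-240 Hz": (36, 240),
    "21-144 Hz": (21, 144),
    "17-120 Hz": (17, 120),
    "9-60 Hz": (9, 60),
    "Acer XG270HU (per the review)": (40, 144),
}

def vrr_behaviour(fps, low, high):
    """Simplified: FreeSync only tracks the framerate inside the panel's window."""
    if low <= fps <= high:
        return "FreeSync active, refresh follows framerate"
    return "outside the window, back to fixed refresh (tearing or V-sync judder)"

for name, (low, high) in EXAMPLE_WINDOWS.items():
    print(f"{name:>30}: 35 fps -> {vrr_behaviour(35, low, high)}")
```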

Regards,
SB
 
Looking at Anandtech's review (http://www.anandtech.com/show/9097/the-amd-freesync-review )...

The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though it’s now being sold at $599). So at most, it looks like putting in the new technology to make a FreeSync display costs $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.

So a relatively small price premium compared to a similar monitor by the same manufacturer which doesn't have adaptive sync.

And a large price advantage versus similar but less feature rich G-sync models. Or an even larger feature advantage versus similarly priced G-sync models. But at least this is forcing some price cuts on some of the available G-sync monitors.

It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though there aren’t direct comparisons available for some of the models. The least expensive FreeSync displays start at just $449, which matches the least expensive G-SYNC display (AOC G2460PG) on price but with generally better specs (29” 2560x1080 and IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).

And I hadn't realized that G-sync still only lists 6 monitors as supporting it. I'd expected more. 4 adaptive sync monitors available now, 11 adaptive sync monitors soon (today's 4 + 5 more from Samsung) with 20 or more by the end of the year.

Here's to hoping that Nvidia implements support for adaptive sync soon-ish. As well as Intel.

Regards,
SB
 
HardWare.fr has a short review as well: http://www.hardware.fr/focus/108/freesync-disponible-premiers-ecrans-decoivent.html

They found that FreeSync works as advertised, but the Acer display they tested has a pretty crappy panel that kind of ruins it. They're concerned that AMD might be too lax when bestowing the FreeSync label.
The panel only turns crappy when FreeSync is active. Their attempt to explain the problem is in this part:
But for the XG270HU, either Acer simply disables overdrive when the variable refresh rate is used (three options are offered and they make no real difference), or the overdrive was calibrated only for 144 Hz operation and uses parameters that are not suited to lower refresh rates. In the end, this problem means that on this screen, at performance levels realistic for most gamers, FreeSync adds one more compromise: you have to choose between smoothness and the absence of ghosting.

I can't speak French, but from a rough Google translation it seems HW.fr's conclusion is not that different from PCPer's, which also saw a similar level of ghosting on the two different FreeSync models they tried. The way I understand it, different refresh rates require different levels of voltage overdrive. According to PCPer, G-Sync adjusts the voltage dynamically in sync with the refresh rate, while these FreeSync monitors can only have overdrive either always on or turned off altogether, nothing in between. Hence, when the overdrive level is mismatched with the dynamically changing refresh rate, ghosting appears.
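If that reading is right, the difference can be sketched very roughly like this (purely illustrative values and logic on my part, not anything taken from AMD's or NVIDIA's actual firmware): a per-refresh-rate overdrive setting versus one fixed setting calibrated for 144 Hz.

```python
# Purely illustrative: overdrive strength picked per refresh rate vs. a single
# fixed setting. The gain values are made up; real panels tune this per model.

def overdrive_per_refresh(refresh_hz):
    """Hypothetical per-refresh-rate lookup (the G-Sync-module-style approach)."""
    tuned = {144: 1.00, 120: 0.90, 90: 0.75, 60: 0.55, 45: 0.40}  # made-up gains
    nearest = min(tuned, key=lambda hz: abs(hz - refresh_hz))     # nearest tuned point
    return tuned[nearest]

def overdrive_fixed(refresh_hz):
    """What the reviews describe on these first FreeSync panels: one setting, tuned for 144 Hz."""
    return 1.00  # same drive strength no matter what the (variable) refresh rate is

for hz in (144, 90, 48, 40):
    print(f"{hz:>3} Hz  per-refresh overdrive = {overdrive_per_refresh(hz):.2f}  fixed = {overdrive_fixed(hz):.2f}")
```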

PCPer said:
The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync? NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrated vBlank extensions and has pointed to instances like this as an example as to why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

One of PCPer editor said:
As we understand it, G-Sync uses its on-board memory to buffer the previous frame, using this data combined on-the-fly with the next incoming frame to calculate appropriate drive voltages and compensate for ghosting (similar to overdrive).

FreeSync relies on the regular adaptive sync TCONs that at present don't seem capable of correcting for this when operating in VRR modes. The BENQ panel does have a 'Blur Reduction' mode, but that drops it out of FreeSync mode and shifts to constant refresh when enabled. One note - switching this while running the FreeSync demo hung the display and I could only get the screen back up after rebooting both the system and hard restarting the panel by unplugging it.

Question is, can this be fixed by the scaler alone?
 
I can't speak French, but from a rough Google translation it seems HW.fr's conclusion is not that different from PCPer's, which also saw a similar level of ghosting on the two different FreeSync models they tried. The way I understand it, different refresh rates require different levels of voltage overdrive. According to PCPer, G-Sync adjusts the voltage dynamically in sync with the refresh rate, while these FreeSync monitors can only have overdrive either always on or turned off altogether, nothing in between. Hence, when the overdrive level is mismatched with the dynamically changing refresh rate, ghosting appears.

Yes, they're saying that overdrive is either disabled when variable refresh rate is enabled, or running with 144Hz settings, which produces a lot of ghosting. So it's more the electronics than the panel, you're right.
 
Hence, when the overdrive level is mismatched with the dynamically changing refresh rate, ghosting appears.

This sucks... it could be a disaster for this "standard" if additional hardware is required in FreeSync monitors to resolve this issue. It seems that may be the case, since Nvidia states in the PCPer article:

NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

Edit: Ghosting Concerns: http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion

FreeSync is doing the right things and is headed in the right direction, but it can’t claim to offer the same experience as G-Sync. Yet.
 
From the linked review above...

It's a limitation of the monitor and not Adaptive Sync (FreeSync). It can go as low as 9 Hz but is then apparently limited to 60 Hz at the top. The author speculates that most monitor manufacturers will eventually opt for the 21-144 Hz option.

Regards,
SB

Ah fair enough, that's fine then. 21-144 Hz is more than enough; I really don't think having vsync turned off when you're running at less than 21 fps should be top of your list of concerns!

Well done to AMD; this seems like a much preferable solution to G-Sync. It's just a shame I'm currently locked into NV via my 3D Vision monitor. That tech seems to be dying a rapid death though, tbh; perhaps once VR comes along I can trade it in for an adaptive refresh model. I quite like the look of those 21:9 monitors, but I'm a bit concerned about game support for that non-standard resolution.

EDIT: hmm, after reading the most recent posts about ghosting it seems I may have been a little hasty in my conclusion. If that's true across the board for FreeSync then it would seem that G-Sync is indeed the better solution. Oh well, at least G-Sync will still come down in price.
 
The 40 Hz display rate for low framerates is quite interesting...
Basically we get a monitor which changes the image every 25 ms, a 9 ms difference to a 60 Hz monitor.

V-sync judder and especially tearing might be quite visible.
Also, the possible v-synced framerates are:
40, 20, 13.3, 10, 8, 6.7, 5.7 ...
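Those values are just the 40 Hz refresh divided by a whole number of refreshes per frame; a quick check (illustrative Python, nothing more):

```python
# With v-sync on a fixed 40 Hz refresh, each frame is held for a whole number of
# 25 ms refresh intervals, so the achievable framerates are 40/n.
refresh_hz = 40
for n in range(1, 8):
    print(f"{n} refresh(es) per frame -> {refresh_hz / n:5.1f} fps ({1000 * n / refresh_hz:.0f} ms per frame)")
# -> 40.0, 20.0, 13.3, 10.0, 8.0, 6.7, 5.7 fps
```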
 
Can't they make the base rate variable? Like if you need 39 Hz, it would change the base to 78 Hz? Or maybe it needs additional hardware to handle lower than 40 fps? How does this thing work exactly?
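My naive guess at how it could work (pure speculation on my part, not a description of AMD's shipping driver): the source could just resend each frame enough times that the effective refresh rate lands back inside the panel's window, e.g. a 39 fps frame shown twice at 78 Hz. A rough sketch of that idea:

```python
# Speculative sketch of frame multiplication: repeat each frame so that the
# resulting refresh rate falls inside the panel's variable-refresh window.
def refresh_for(framerate, window=(40, 144)):
    low, high = window
    if framerate >= low:
        return min(framerate, high), 1   # in range: refresh simply tracks the framerate
    n = 2                                # below the window: show each frame n times
    while n * framerate < low:
        n += 1
    if n * framerate > high:
        return None, None                # cannot be mapped into the window
    return n * framerate, n

for fps in (120, 60, 39, 30, 15):
    hz, repeats = refresh_for(fps)
    if hz is None:
        print(f"{fps:>3} fps -> no valid mapping")
    else:
        print(f"{fps:>3} fps -> refresh {hz} Hz (each frame shown {repeats}x)")
```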
 
The 40 Hz display rate for low framerates is quite interesting...
Basically we get a monitor which changes the image every 25 ms, a 9 ms difference to a 60 Hz monitor.

V-sync judder and especially tearing might be quite visible.
Also, the possible v-synced framerates are:
40, 20, 13.3, 10, 8, 6.7, 5.7 ...
Yeah, it is amusing. After all the talk about two-way GPU-monitor handshaking to determine the minimum refresh rate, at sub-minimum refresh rates they ended up with... nothing intelligent. It is worse than a regular monitor.

At 40 Hz it is basically:
V-sync on -> no tearing but judder, 50% worse than a 60 Hz monitor (25 ms vs 16 ms)
V-sync off -> tearing AND still the same amount of judder, due to the large ratio between refresh interval and refresh speed.
 
Yeah, it is amusing. After all the talk about two-way GPU-monitor handshaking to determine the minimum refresh rate, at sub-minimum refresh rates they ended up with... nothing intelligent. It is worse than a regular monitor.

At 40 Hz it is basically:
V-sync on -> no tearing but judder, 50% worse than a 60 Hz monitor (25 ms vs 16 ms)
V-sync off -> tearing AND still the same amount of judder, due to the large ratio between refresh interval and refresh speed.

For that one monitor, when it is outside of the adaptive sync band, I believe it operates as it would if it were a 144 Hz monitor without adaptive sync. Hence, below 40 Hz you get Vsync at a 144 Hz panel refresh, or no Vsync at a 144 Hz panel refresh. At least they allow you to choose to turn Vsync off if you would rather have tearing than laggy input and/or judder, unlike Gsync.
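For what it's worth, a 144 Hz fallback gives much finer v-sync steps below the window than a fixed 40 Hz would; a quick comparison (illustrative only):

```python
# Outside the adaptive-sync band the panel runs at a fixed refresh, so v-synced
# framerates snap to refresh/n. Comparing a 144 Hz fallback with a fixed 40 Hz:
for refresh in (144, 40):
    rates = [round(refresh / n, 1) for n in range(1, 8)]
    print(f"{refresh:>3} Hz fallback, v-synced framerates: {rates}")
# 144 Hz: [144.0, 72.0, 48.0, 36.0, 28.8, 24.0, 20.6]
#  40 Hz: [40.0, 20.0, 13.3, 10.0, 8.0, 6.7, 5.7]
```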

Regards,
SB
 