AMD demonstrates Freesync, G-sync equivalent?

Looks like even the small-time Korean monitor makers are getting into FreeSync. A user at overclock.net found firmware on Wasabi Mango's website that appears to enable adaptive sync (FreeSync) on the UHD550 and UHD420 (55" and 42" 4K monitors using LG AH-IPS panels).

http://www.overclock.net/t/1554580/...s-monitor-what-tests-to-run/520#post_24193708

If it does, the scaler/video processor used to enable Adaptive Sync must be cheap as hell.

I've been looking at the 55" version for use with a wall mount. With that screen real estate I'd just replace all 3 of my current monitors with the single one. And if it has FreeSync then I might be ditching Nvidia sooner than I was thinking. I was going to see if they supported adaptive sync with Pascal before deciding whether to go back to AMD or not.

The 55" goes for 1.2-1.5k USD (under 1K USD in Korea), while the 42" goes for under 1k USD.

Hopefully the Crossover 494k (49" 4K monitor with an AH-IPS LG panel) gets a similar update, as I'd rather get that one. Pixels per degree of visual angle would be more similar to my 30" monitor then, considering the wall mount will move it further away.

Regards,
SB
 
Well, some confirmation that Crossover are working on a firmware patch to enable adaptive sync (FreeSync) on their monitors that feature the correct hardware.

http://www.overclock.net/t/1565292/...bit-ips-4k-uhd-dp-60hz-43-49/50#post_24205030

For their 434k (43") and 494k (49") AH-IPS 4K LG panel displays.

A user on the forum also confirmed that FreeSync is working fine on his Wasabi Mango UHD 420 after flashing the firmware.

So, that's at least 2 of the budget Korean manufacturers (who don't export; you can only get them from Korean sellers on eBay, Amazon, Newegg, or other outlets) using the scalers that support it. That's the nice thing with open standards. It wouldn't surprise me if in a year or two every monitor sold supported adaptive sync.

So, I'm now trying to decide on taking a chance on either the Wasabi Mango UHD 420, Crossover 434k or 494K. I believe both the WM UHD 420 and Crossover 434K use the same LG panel (it's a 42.5" panel). The UHD 420 is going for 650-1000 USD while the 434K should be cheaper once it's available and more sellers appear.

Regards,
SB
 
Acer XR341CK is a curved 3440x1440 IPS screen with 30-75Hz Freesync support.

Acer XR341CK name changed to Predator X34 ...

Acer released two 34" ultra-wide gaming screens, the XR341CK and the XR341CKA. These are 34" curved screens with a 21:9 aspect ratio and 3440 x 1440 resolution based on an LG AH-IPS panel, with boosted refresh rates and a selection of advanced gaming features including AMD FreeSync and NVIDIA G-Sync respectively.

These screens were released as 75 Hz screens; however, it now seems Acer has decided to change the model name to the simpler "Predator X34", and there's a new spec: it will support refresh rates up to 100 Hz. Reports suggest that is only available when using G-Sync specifically; AMD users and non-G-Sync users can only run the screen at up to an expected 75 Hz.
http://www.guru3d.com/news-story/acer-predator-x34-will-get-100hz-refresh-rate-support.html
 
Crazy Koreans. 65" AH-IPS 4k monitor with 3D and adaptive sync (same controller as the UHD 420 and 550).

http://www.wasabimango.co.kr/bbs/board.php?bo_table=product_1&wr_id=33&sca=65형&sfl=wr_10&stx=

Too big for me, but the passive 3D is intriguing. Tempted to replace my living room TV with this. It doesn't come with a TV tuner (the tuner is an optional accessory, and for Korean stations anyway), and I don't need one. Love my LG TV but the input lag is horrible.

Regards,
SB
 
Excellent news, that should completely redress the balance of power between these two specs. With Intel support I'd expect to see most monitors in the future supporting adaptive sync. Although we're still several years away from that position.
 
It is very good news ... either Intel or Nvidia needed to support adaptive sync to make it viable. I imagine Nvidia will probably provide support within the next 6 months, and I look forward to buying a monitor that supports both G-Sync and Adaptive Sync.
 
It is very good news ... either Intel or Nvidia needed to support adaptive sync to make it viable. I imagine Nvidia will probably provide support within the next 6 months, and I look forward to buying a monitor that supports both G-Sync and Adaptive Sync.

You may have hit the nail on the head there in terms of what will now happen, i.e. people may now start to hold back from purchasing a G-Sync monitor on the expectation that NV will ultimately support adaptive sync (either instead of or alongside G-Sync). That should further speed up Nvidia's uptake of adaptive sync if sales of G-Sync monitors start to decline. G-Sync is great tech and kudos to NV for introducing it to the market, but an open standard is clearly the way to go here. It's like the Mantle/DX12 situation in reverse.
 
Excellent news, that should completely redress the balance of power between these two specs. With Intel support I'd expect to see most monitors in the future supporting adaptive sync. Although we're still several years away from that position.

Not as far away as you think. It shouldn't be long before all of Samsung and LG's monitors support adaptive sync. I believe all of the Samsung and LG 4K PC monitors will be supporting it. Asus and BenQ are on board. Multiple second-tier Korean manufacturers are using either the Realtek or MStar chips to provide adaptive sync as a free bonus (their monitors have had the chips in them for months now, but they only recently started advertising FreeSync and providing firmware updates for early buyers of those monitors). With the lower-tier manufacturers getting in on adaptive sync, it wouldn't surprise me if there were 30-ish or more Adaptive-Sync-supporting monitors available around the world.

With Intel support, it should be only a matter of time before the PC OEMs (Dell, HP, etc.) start to support it on their monitors as well.

And then it's only a short jump to adaptive sync over HDMI and potential next generation support in consoles (or maybe even a mid-life console revision to include it).

At this point AMD could go belly up and adaptive sync will survive and thrive.

The chip manufacturers just need to work on lowering the minimum Hz for adaptive sync. It doesn't affect me, as I'd kill myself before I ran a game that would hit a 30 Hz refresh. For me, it'd just be a nice safeguard for the inevitable dips below 60 fps during hectic play.


PCPer said it perfectly in their review of one of the second-tier Korean manufacturers' monitors (a 42" 4K AH-IPS display with FreeSync and a 42 Hz cutoff): you set your game settings for 60 FPS play, and adaptive sync provides a nice buffer in case your FPS ever drops below it.

I've purchased their 49" version of that monitor and am just waiting for it to arrive. Now I just need Nvidia to add support for adaptive sync, or to get rid of my Nvidia card and get an AMD card again.

Regards,
SB
 
I've purchased their 49" version of that monitor and am just waiting for it to arrive. Now I just need Nvidia to add support for adaptive sync, or to get rid of my Nvidia card and get an AMD card again.


I'm sorry, but can I ask you something? Why did you buy this monitor, which supports adaptive sync / FreeSync (and that seems to be the reason you got it), if you don't have an AMD GPU? I say that because I'm nearly 99% sure that at some point Nvidia will support adaptive sync (they already do it on mobile, in laptops with eDP), but you may have to wait a year or more before they do, maybe even longer if they decide to consolidate G-Sync first by all but giving it away to panel manufacturers.

Even Intel, if the rumors are true, is aiming to support it (and maybe sooner than we might think).

That said, I'm really curious about this panel; please post your impressions and findings once you have it.
 
I'm sorry, but can I ask you something? Why did you buy this monitor, which supports adaptive sync / FreeSync (and that seems to be the reason you got it), if you don't have an AMD GPU?
I believe that for Nvidia to add support for Adaptive Sync, it only takes a driver update. But what I'm not certain about is whether Intel and Nvidia will have their own proprietary versions of FreeSync-type software.
 
I believe that for Nvidia to add support for Adaptive Sync, it only takes a driver update.

I think the same, as proven by their laptop variant (which uses adaptive sync via eDP), but the politics behind it are another story. Personally, I think this would be a really good move from Nvidia.

As for the proprietary question: no problem. Whatever the name and however it works, as long as it uses the Adaptive-Sync standard, every compatible monitor can work with any brand of GPU, and vice versa.

And that is exactly the point of the standard: whatever the brand of your GPU, with any compatible monitor, if you change GPU brands or, for example, use your laptop as the source instead of your desktop PC, with a GPU from green, red, or blue, you can enable adaptive sync on that monitor.
 
I'm sorry, but can I ask you something? Why did you buy this monitor, which supports adaptive sync / FreeSync (and that seems to be the reason you got it), if you don't have an AMD GPU? I say that because I'm nearly 99% sure that at some point Nvidia will support adaptive sync (they already do it on mobile, in laptops with eDP), but you may have to wait a year or more before they do, maybe even longer if they decide to consolidate G-Sync first by all but giving it away to panel manufacturers.

Even Intel, if the rumors are true, is aiming to support it (and maybe sooner than we might think).

That said, I'm really curious about this panel; please post your impressions and findings once you have it.

I'm not buying it because of adaptive sync. I'm buying it because it's a 4K 49" AH-IPS screen without PWM flicker. I was also considering the 43" versions. Adaptive sync is basically a nice freebie: there's no cost saving for not having it, and there are no equivalent 4K 43" or 49" monitors without PWM flicker that don't also have adaptive sync.

In other words, I can't buy a monitor without adaptive sync that is IPS, 4k, large screen, HDCP 2.2, and PWM flicker free.

In order to take advantage of that freebie, I'll either need Nvidia to add support or need to get an AMD card again. If I'd been planning to use variable refresh monitors, I never would have gotten an Nvidia card in the first place, as I'll never buy a G-Sync monitor. I only got it because I wasn't planning on limiting my monitor choices to those with variable refresh (adaptive sync), and it had the best price/performance at the time for my budget range.

Regards,
SB
 
42 Hz minimum is too high IMO. As a "freebie" it's nice to have, but I wouldn't pay extra for it. In fact you might end up with a worse gaming experience than regular vsync, since if your fps ever strays below 42 (not exactly crazy low) then you're instantly locked to 21 fps. Not good at all. Personally I think the sweet spot for any type of adaptive sync (including G-Sync) should be about 25 Hz.
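For reference, that 21 fps figure comes from vsync quantization: with vsync on, each frame must be displayed for a whole number of refresh intervals, so on a panel pinned at its 42 Hz floor the achievable rates are 42, 21, 14, and so on. A quick sketch of the arithmetic (`vsync_fps` is a hypothetical helper for illustration, not any driver's API):

```python
import math

def vsync_fps(render_fps, refresh_hz):
    """Effective frame rate under double-buffered vsync.

    Each rendered frame is held until a refresh boundary, so the
    frame interval rounds UP to a whole number of refresh periods,
    quantizing output to refresh_hz / n for some integer n.
    """
    n = math.ceil(refresh_hz / render_fps)  # refresh periods per frame
    return refresh_hz / n

# Rendering just below a 42 Hz panel's refresh drops you straight to 21 fps:
# vsync_fps(41, 42) -> 21.0
```

This is also why a lower VRR floor matters: with a 25 Hz minimum, the panel can track the frame rate all the way down to 25 fps before any quantization kicks in.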
 
Intel to support Adaptive-Sync (Freesync) in future generations of graphics hardware:
http://techreport.com/news/28865/intel-plans-to-support-vesa-adaptive-sync-displays

Intel will most likely be focused more on mobile displays. Still, a good move by them.

Crazy Koreans. 65" AH-IPS 4k monitor with 3D and adaptive sync (same controller as the UHD 420 and 550).
Mah gawd, what a beast.

I would like a 40" 4K 4:4:4 FreeSync monitor [~35 fps minimum for the FreeSync range] with passive 3D for 400-500€. One day...
 
42 Hz minimum is too high IMO. As a "freebie" it's nice to have, but I wouldn't pay extra for it. In fact you might end up with a worse gaming experience than regular vsync, since if your fps ever strays below 42 (not exactly crazy low) then you're instantly locked to 21 fps. Not good at all.
You could always turn off v-sync.
 
In fact you might end up with a worse gaming experience than regular vsync, since if your fps ever strays below 42 (not exactly crazy low) then you're instantly locked to 21 fps. Not good at all.
It's strange that the default behavior is to fall back to vsync, rather than, as (Shifty?) suggested previously, just start duplicating the most recent frame at full clip to keep latency as low as possible.

You'd think the designers of adaptive sync would have run into this situation, but changing this behavior should only need a GPU driver update, I'd think, so it's likely fixable in software.
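The frame-duplication idea suggested above is essentially what AMD's Low Framerate Compensation does: when the frame rate falls below the panel's minimum, the driver shows each frame an integer number of times so the effective refresh lands back inside the VRR window. A rough sketch of how the multiplier could be chosen (`vrr_refresh` is my own illustrative name, not actual driver code), assuming a 42-60 Hz panel like the one discussed:

```python
def vrr_refresh(fps, vrr_min=42.0, vrr_max=60.0):
    """Pick a panel refresh rate for a given frame rate.

    Inside the VRR window the panel simply tracks the frame rate.
    Below it, try showing each frame m times so fps * m lands back
    inside the window; if no multiple fits, fall back to the fixed
    maximum refresh (vsync-like behavior).
    """
    if vrr_min <= fps <= vrr_max:
        return fps                # 1:1 variable refresh
    if fps > vrr_max:
        return vrr_max            # cap at the panel maximum
    m = 2
    while fps * m < vrr_min:
        m += 1                    # smallest multiple that reaches the floor
    if fps * m <= vrr_max:
        return fps * m            # each frame displayed m times
    return vrr_max                # no multiple fits the narrow window
```

Note the catch: on a 42-60 Hz window, 35 fps has no integer multiple inside the range (35 is below it, 70 is above), so frame doubling can't always save a narrow-range panel. That's why this kind of compensation generally needs the maximum refresh to be at least roughly double the minimum.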
 