AMD demonstrates Freesync, G-sync equivalent?

How possible is it to develop an A/V receiver that is compatible with Freesync?
Or even a dedicated external box for Freesync?
 
Interesting. Anyway, aside from the color change on the cable guide on the stand, the design of the monitor is exactly the same as the PB278Q's: https://www.asus.com/Commercial_Monitors_Projectors/PB278Q/

And a different panel: that one uses a PLS panel, while the Adaptive-Sync one uses an IPS panel. Other than that, they do look remarkably similar.

How possible is it to develop an A/V receiver that is compatible with Freesync?
Or even a dedicated external box for Freesync?

Until it is included in the HDMI spec, it'd likely need to have a DP output. I haven't been in the market for an A/V receiver in a couple years now, so I'm not sure if any come with a DP output.

Regards,
SB
 
Interesting stuff.
Look at Tech Report's review of that BenQ FreeSync monitor: http://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed

"I asked AMD's David Glen, one of the engineers behind FreeSync, about how AMD's variable-refresh scheme handles this same sort of low-FPS scenario. The basic behavior is similar to G-Sync's. If the wait for a new frame exceeds the display's tolerance, Glen said, "we show the frame again, and show it at the max rate the monitor supports." Once the screen has been painted, which presumably takes less than 6.94 ms on a 144Hz display, the monitor should be ready to accept a new frame at any time."

This is not what PC-per found in their in depth test: http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

They used an oscilloscope and found that the monitor refreshed at its lowest refresh rate when the frame rate was down in the 30 FPS region.

What gives?
 
If you play the 720p 60 FPS pendulum video at 25% speed, the judder pattern of the update/refresh of the FreeSync monitor can be easily observed:
1, 2, 3, stop, 1, 2, 3, stop...

It is consistent with how a fixed 40 Hz refresh rate monitor would display a 30 FPS source. If it refreshed at 144 Hz, it would very closely match G-Sync's fluidity, but that is not the case here. For comparison, the image below illustrates the frame update pattern at several refresh rates.

[Image fpDlQsN.jpg: frame update pattern at several refresh rates]


So certainly that monitor did not show the frames at the max rate it supports (144 Hz), nor did it accept a new frame at any time after a refresh (which would imply a variable rate at work). It only updates at a fixed 40 Hz refresh rate. So somebody is lying or misleading, and for sure it isn't the high-speed camera or the oscilloscope. And some reviewers just pass it on to their readers at face value.
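
As a quick illustration (my own sketch in C, not from either review), here is what a fixed 40 Hz scan-out can show when new frames arrive at 30 FPS. A 1/120 s tick is used only because it makes both intervals exact integers.

```c
#include <stdio.h>

/* Sketch: which 30 FPS source frame a fixed 40 Hz display can show at each
 * refresh. In 1/120 s ticks, one 40 Hz refresh = 3 ticks and one 30 FPS
 * source frame = 4 ticks, so everything stays in integer math. */
int main(void)
{
    int last_frame = -1;

    for (int r = 0; r < 12; ++r) {
        int t = r * 3;           /* time of this refresh, in 1/120 s ticks */
        int frame = t / 4;       /* newest source frame available by then  */
        printf("refresh %2d: frame %2d %s\n",
               r, frame, frame == last_frame ? "(held)" : "(new)");
        last_frame = frame;
    }
    return 0;
}
```

The output settles into three refreshes with a new frame followed by one held refresh, i.e. exactly the 1, 2, 3, stop cadence described above; a display that truly refreshed on every new frame would never show that pattern.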
 
If you play the 720p 60 FPS pendulum video at 25% speed, the judder pattern of the update/refresh of the FreeSync monitor can be easily observed:
1, 2, 3, stop, 1, 2, 3, stop...

It is consistent with how a fixed 40 Hz refresh rate monitor would display a 30 FPS source. If it refreshed at 144 Hz, it would very closely match G-Sync's fluidity, but that is not the case here. For comparison, the image below illustrates the frame update pattern at several refresh rates.

[Image fpDlQsN.jpg: frame update pattern at several refresh rates]


So certainly that monitor did not show the frames at the max rate it supports (144 Hz), nor did it accept a new frame at any time after a refresh (which would imply a variable rate at work). It only updates at a fixed 40 Hz refresh rate. So somebody is lying or misleading, and for sure it isn't the high-speed camera or the oscilloscope. And some reviewers just pass it on to their readers at face value.

Are you using the demo or the video?
 
How does a display scale, say, a 720p signal to 1080p? How do TV sets perform frame-rate interpolation? How do TVs do de-interlacing?

You can do scaling without buffering the whole scene. Frame interpolation and de-interlacing introduce input lag, and they are pointless for PC monitors anyway when they can be done cheaply in software.

I heard the theory that faster refresh rates naturally lead to a brighter image. The sharp transition between 40 Hz and 144 Hz would manifest as a marked dimming and brightening, and since games often trip above and below the minimum refresh rate, the result would be very distracting.

This theory seems easy to test with a FreeSync monitor: just write a program that switches quickly between 40 and 144 Hz, and make a video.
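
Something along these lines would do it on Windows (a rough sketch only; it assumes the panel exposes fixed 40 Hz and 144 Hz modes at the current resolution, which not every FreeSync monitor does):

```c
#include <windows.h>
#include <stdio.h>

/* Rough sketch: toggle the primary display between fixed 40 Hz and 144 Hz
 * so a camera can catch any brightness step. Assumes both modes exist at
 * the current resolution. */
int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm))
        return 1;

    dm.dmFields = DM_DISPLAYFREQUENCY;
    for (int i = 0; i < 20; ++i) {
        dm.dmDisplayFrequency = (i & 1) ? 144 : 40;
        LONG res = ChangeDisplaySettings(&dm, CDS_FULLSCREEN);
        if (res != DISP_CHANGE_SUCCESSFUL)
            printf("switch to %lu Hz failed (%ld)\n",
                   (unsigned long)dm.dmDisplayFrequency, res);
        Sleep(500);   /* hold each rate for half a second for the camera */
    }
    ChangeDisplaySettings(NULL, 0);   /* restore the registry default mode */
    return 0;
}
```

CDS_FULLSCREEN keeps the change temporary, and the final call with NULL puts back whatever mode is stored in the registry.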

In the PCPer video, the scope shows that the amplitude gets smaller as the refresh rate goes up, which indicates that the monitor does compensate for the brightness change.

When you switch your non-adaptive-sync monitor between 60 and 144 Hz, do you notice a brightness change? I've never seen it. Changing between 72 and 144 Hz should be a non-issue. And if it is, the monitor should compensate for it.

While the monitor can and does compensate for the brightness change, it can't compensate for the loss of contrast range. At higher refresh rates, colors become washed out.

One of the important benefits of variable refresh rate is decreased lag without tearing. If you start frame doubling at higher refresh rates, you've added lag, because you can't interrupt the refresh cycle without causing a tear.

In the case of an LCD monitor, I think a collision will cause flicker in the long run, not tearing, assuming that the self-refresh is performed by the monitor instead of the GPU.
 
Videos simply can't capture these phenomena, can they?
Just having slightly different timing on the video changes everything (the video camera isn't synced to the display; it doesn't capture its frames at the same time the display changes its frames, so the results don't match real life).
 
Maybe the difference between what David Glen said and PCPer's test is due to whether V-sync is on or off? If V-synced, it defaults to the monitor's maximum refresh rate; if off, it doesn't?
 
The theory behind video scaling is trivial: you're applying an FIR filter over the input pixels both horizontally and vertically.
The latency of the scaler is determined by the number of taps of the FIR filter. If you have 11 taps (which should be plenty for good quality scaling), you have 10 lines of latency. So you're talking an order of magnitude of 60 µs or so (depending on refresh rate and resolution). When you're not scaling (because the input is native resolution), that 60 µs doesn't even have to be there: just set the first (instead of the center) tap to 1.0 and all the others to 0.0 and you're good to go.
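
For a sense of the numbers (my own back-of-the-envelope arithmetic, assuming roughly 1125 total lines per frame as in standard 1080p timing, blanking included):

```c
#include <stdio.h>

/* Back-of-the-envelope: latency of 10 buffered lines, assuming ~1125 total
 * lines per frame (typical 1080p timing including blanking). */
int main(void)
{
    const double total_lines = 1125.0;
    const double rates_hz[]  = { 60.0, 144.0 };

    for (int i = 0; i < 2; ++i) {
        double line_us = 1e6 / (rates_hz[i] * total_lines);  /* one scanline */
        printf("%3.0f Hz: %5.2f us/line, 10 lines = %3.0f us\n",
               rates_hz[i], line_us, 10.0 * line_us);
    }
    return 0;
}
```

That works out to roughly 62 µs at 144 Hz and about 150 µs at 60 Hz, which is where the "depending on refresh rate and resolution" caveat comes from.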
 
The DDR memory port :oops::rolleyes:
So what?

Look at the spec that you linked:
- SoC: where do they store the code that runs it?
- OSD with bitmaps etc.: where are those stored?
- advanced overdrive for 3D: where do they store that?
- PIP support: where do you store the secondary video images?

There: four reasons to use DDR. None of them is related to scaling. None of them inherently requires adding latency on the primary input.

You point to a spec and somehow we need to make an educated guess about what you mean? I still don't know what you're trying to get across, TBH.
 
Videos simply can't capture these phenomena, can they?
Just having slightly different timing on the video changes everything (the video camera isn't synced to the display; it doesn't capture its frames at the same time the display changes its frames, so the results don't match real life).
The 60 FPS video runs at half real-world speed, so we are looking at an effective capture rate of 120 FPS here. That should be more than enough to discern the frame update pattern between 40 Hz and anything more than 120 Hz.

If they release the 240 FPS video, it'll be even better -- more ways to analyze.

Maybe the difference between what David Glen said and PCPer's test is due to whether V-sync is on or off? If V-synced, it defaults to the monitor's maximum refresh rate; if off, it doesn't?
If in those circumstances the GPU driver can drive the monitor at will -- as AMD claims -- then VRR is active by definition, in which case V-sync is basically irrelevant.

So toss maybe $50+ more onto the BOM, on top of the $50 FreeSync premium, to make it just $50 short of the greedy NV G-Sync "license" tax? Yet still, does it have enough bandwidth to read/write the framebuffer at the same time in case of a refresh + update collision? Can it handle VRR overdrive? And who knows.

If the answer is yes, AMD's partners may indeed be a bunch of incompetent fools. And there is this thread from last year that is more focused on discussing the matter.
 
So toss maybe $50+ more onto the BOM, on top of the $50 FreeSync premium, to make it just $50 short of the greedy NV G-Sync "license" tax?
I have no idea why you think the BOM would increase, because chips like these with memory are normally part of the monitor design.

https://www.overclockers.co.uk/showproduct.php?prodid=MO-217-SA&groupid=17&catid=948

https://www.overclockers.co.uk/showproduct.php?prodid=MO-213-SA&groupid=17&catid=948

Features that are going to need memory:

- Upgraded HDMI Support: With an upgraded HDMI that supports UHD resolutions at a 60Hz refresh rate, 4K content plays smoothly without delay, even when connected to AV devices.
- PIP 2.0: Multitask while you watch a variety of content, thanks to support for even more inputs with Picture-in-Picture 2.0. Resize the second picture to cover up to 25% of the screen and position it anywhere you please, and even control the sound settings with ease

Not to mention that most G-Sync displays do not support HDMI or scaling. So despite the lowered BOM because of the less-than-premium feature set and dearth of ports, they still incur a huge price increase.
 
So toss maybe $50+ more onto the BOM, on top of the $50 FreeSync premium, to make it just $50 short of the greedy NV G-Sync "license" tax? Yet still, does it have enough bandwidth to read/write the framebuffer at the same time in case of a refresh + update collision? Can it handle VRR overdrive? And who knows.

That chip is likely just 1-3 USD. I'd be surprised if it was as much as 5 USD.

Regards,
SB
 
I have no idea why you think the BOM would increase,
Because the LG FreeSync one, for example, doesn't seem to have a full-scene frame buffer on board.

because chips like these with memory are normally part of the monitor design.
I would hardly call an 'enhanced premium' chip a "normal" part of monitor design.

And giving yet-to-come products with a projected premium price tag as examples doesn't really help either.

- Upgraded HDMI Support: With an upgraded HDMI that supports UHD resolutions at a 60Hz refresh rate, 4K content plays smoothly without delay, even when connected to AV devices.
- PIP 2.0: Multitask while you watch a variety of content, thanks to support for even more inputs with Picture-in-Picture 2.0. Resize the second picture to cover up to 25% of the screen and position it anywhere you please, and even control the sound settings with ease
When did PiP start to become a "normal" feature of PC monitors?
And I still don't see any kind of self-refresh functionality in the feature list.

That chip is likely just 1-3 USD. I'd be surprised if it was as much as 5 USD.

Regards,
SB
I don't want you to have a heart attack, so open at your own risk.
[Attached image: SDkl0hD.png]
 