Deleted member 13524
Guest
How possible is it to develop an A/V receiver that is compatible with Freesync?
Or even a dedicated external box for Freesync?
Interesting. Anyway, aside from the color change on the cable guide on the stand, the design of the monitor is exactly the same as that of the PB278Q: https://www.asus.com/Commercial_Monitors_Projectors/PB278Q/
If you play the 720p 60 FPS pendulum video at 25% speed, the judder pattern of the update/refresh of the FreeSync monitor can be easily observed:
1, 2, 3, stop, 1, 2, 3, stop...
It is consistent with how a fixed 40 Hz refresh rate monitor would display a 30 FPS source. If it refreshed at 144 Hz, it would very closely match G-Sync's fluidity, but that is not the case here. For comparison, the image below illustrates the frame update pattern at several refresh rates.
So that monitor certainly did not show the frames at the maximum rate it supports (144 Hz). Nor did it accept a new frame at any time after a refresh (which would imply a variable rate at work). It only updates at a fixed 40 Hz refresh rate. So somebody is lying or misleading, and it certainly isn't the high-speed camera or the oscilloscope. Yet some reviewers just pass it on to their readers at face value.
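To make the "1, 2, 3, stop" cadence concrete, here is a minimal sketch (my own illustration, not from the thread's demo code) of which refreshes of a fixed-rate monitor carry a new source frame. A 30 FPS source on a fixed 40 Hz refresh yields three new frames followed by one repeat in every group of four refreshes, matching the pattern described above:

```python
# Sketch: which refreshes of a fixed-rate monitor show a *new* source frame.
# At refresh i, a fixed-rate display shows source frame floor(i / hz * fps).
def frame_pattern(refresh_hz, source_fps, refreshes):
    shown = [int(i / refresh_hz * source_fps) for i in range(refreshes)]
    pattern, prev = [], None
    for f in shown:
        # "new" if this refresh carries a frame not shown on the previous one
        pattern.append("new" if f != prev else "repeat")
        prev = f
    return pattern

# 30 FPS content on a fixed 40 Hz panel: 3 new frames, then a repeat, cycling.
print(frame_pattern(40, 30, 8))
```

The repeat every fourth refresh is the judder the slowed-down pendulum video makes visible.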
How does a display scale, say, a 720p signal to 1080p? How do TV sets perform frame-rate interpolation? How do TVs do de-interlacing?
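As a rough answer to the scaling question: the simplest thing a scaler chip can do is nearest-neighbor mapping, where each output pixel just picks the closest input pixel. A toy sketch (purely illustrative; real scalers use better filters such as bilinear or polyphase):

```python
# Sketch: nearest-neighbor upscaling, the simplest form of what a monitor's
# scaler does when mapping e.g. a 720p input onto a 1080p panel.
def upscale_nn(src, out_w, out_h):
    src_h, src_w = len(src), len(src[0])
    # Each output pixel (x, y) samples the source pixel at the scaled position.
    return [[src[y * src_h // out_h][x * src_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

# A tiny 2x2 "image" scaled to 3x3: each output pixel picks a nearby source pixel.
img = [[1, 2],
       [3, 4]]
print(upscale_nn(img, 3, 3))
```

Frame-rate interpolation and de-interlacing are much more involved, since they need pixels from more than one input frame or field.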
I heard the theory that faster refresh rates naturally lead to a brighter image. The sharp transition between 40 Hz and 144 Hz would manifest as marked dimming and brightening, and since games often dip below and climb back above the minimum refresh rate, the result would be very distracting.
This theory seems easy to test with a FreeSync monitor: just write a program that switches quickly between 40 and 144 Hz, and make a video.
When you switch your non-adaptive-sync monitor between 60 and 144 Hz, do you notice a brightness change? I've never seen it. Switching between 72 and 144 should be a non-issue, and if it is one, the monitor should compensate for it.
One of the important benefits of variable refresh rate is decreased lag without tearing. If you start frame doubling at higher refresh rates, you've added lag, because you can't interrupt the refresh cycle without causing a tear.
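The frame-doubling idea above is what AMD later shipped as low-framerate compensation: when content falls below the panel's VRR minimum, the driver repeats each frame at an integer multiple so the effective refresh stays inside the supported window. A hedged sketch of that selection logic (my own simplification, not AMD's actual algorithm):

```python
# Sketch: pick a refresh multiple so a VRR panel with a [min_hz, max_hz]
# window can display low-FPS content by repeating each frame.
def refresh_for(content_fps, min_hz, max_hz):
    mult = 1
    # Repeat each frame until the resulting rate is inside the VRR window.
    while content_fps * mult < min_hz:
        mult += 1
    rate = content_fps * mult
    if rate > max_hz:
        raise ValueError("content rate cannot be matched within the VRR window")
    return rate, mult

print(refresh_for(25, 40, 144))  # 25 FPS -> refresh at 50 Hz, each frame shown twice
```

The catch is exactly the one described in the post: once a repeat refresh has started, a freshly finished frame has to wait for it to complete (or tear), so doubling reintroduces some lag.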
The "judder pattern" refers to the video. The graph is just an illustration; its code calculated/generated it.
Are you using the demo or the video?
http://www.st.com/web/catalog/mmc/F...129?sc=internet/imag_video/product/252129.jsp
You can do scaling without buffering the whole scene. Frame interpolation and de-interlacing introduce input lag, and they are pointless for PC monitors anyway when they can be done cheaply in software.
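The "scaling without buffering the whole scene" point can be sketched as a streaming scaler: for nearest-neighbor vertical scaling, the chip only ever needs to hold the most recent input line, not a full frame. A toy illustration (assumed model, not the ST part's actual datapath):

```python
# Sketch: vertical nearest-neighbor scaling as a stream. Each input line is
# consumed as it arrives; only the latest line is held, so no full-scene
# frame buffer is required.
def upscale_rows(rows, src_h, out_h):
    it = iter(rows)
    current, consumed = None, 0
    for y in range(out_h):
        need = y * src_h // out_h + 1   # how many source rows must have arrived
        while consumed < need:
            current = next(it)          # hold only the most recent line
            consumed += 1
        yield list(current)

src = [[1, 1], [2, 2], [3, 3]]              # 3 source rows arriving one by one
print(list(upscale_rows(iter(src), 3, 4)))  # scaled to 4 output rows
```

Interpolation across frames, by contrast, inherently needs at least one whole buffered frame, which is where the extra memory and lag come from.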
What kind of conclusion are we supposed to make from this link?
So what?
The DDR memory port.
The 60 FPS video runs at half the real-world speed, so we are looking at a 120 FPS effective capture rate here. That should be more than enough to discern the frame update pattern between 40 Hz and anything more than 120 Hz.
Videos simply can't capture these phenomena, can they?
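The arithmetic behind that claim is simple enough to spell out (a tiny sketch of my own, just to show the relationship between capture rate, playback speed, and samples per refresh):

```python
# Sketch: effective temporal resolution of a slowed-down video.
# A 60 FPS recording played back at half speed resolves 120 samples
# per real-world second.
def effective_rate(capture_fps, playback_speed):
    return capture_fps / playback_speed

eff = effective_rate(60, 0.5)
print(eff)        # 120.0 effective samples per real-world second
print(eff / 40)   # 3.0 captured frames per 40 Hz refresh
```

Three captures per 40 Hz refresh is enough to see whether each refresh carried a new frame, even though the camera is not synced to the display.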
Just having slightly different timing in the video changes everything: the camera isn't synced to the display, so it doesn't capture its frames at the same moments the display changes its frames, and the results don't match real life.
If in those circumstances the GPU driver can drive the monitor at will -- as AMD claims -- then VRR is active by definition, in which case VSync is basically irrelevant.
Maybe the difference between what David Glen said and PCPer's test comes down to whether V-Sync is on or off? If V-synced, it defaults to the monitor's maximum refresh rate; if off, it doesn't?
So toss maybe $50+ more onto the BOM, on top of the $50 FreeSync premium, to end up just $50 short of the greedy NV G-Sync "license" tax? And still: does it have enough bandwidth to read and write the framebuffer at the same time in case of a refresh/update collision? Can it handle VRR overdrive? And who knows.
I have no idea why you think the BOM would increase, because chips like these with memory are normally part of the monitor design.
Because the LG FreeSync one, for example, doesn't seem to have a full-scene frame buffer on board.
I would hardly call an 'enhanced premium' chip a "normal" part of monitor design.
And citing yet-to-come products with projected premium price tags as examples doesn't really help either.
When did PiP start to become a "normal" feature of PC monitors?
- Upgraded HDMI Support: With upgraded HDMI that supports UHD resolutions at a 60 Hz refresh rate, 4K content plays smoothly without delay, even when connected to AV devices.
- PIP 2.0: Multitask while you watch a variety of content, thanks to support for even more inputs with Picture-in-Picture 2.0. Resize the second picture to cover up to 25% of the screen, position it anywhere you please, and even control the sound settings with ease.
I don't want you to have a heart attack, so open at your own risk.
That chip is likely just 1-3 USD. I'd be surprised if it was as much as 5 USD.
Regards,
SB