AMD demonstrates Freesync, G-sync equivalent?

Discussion in 'Rendering Technology and APIs' started by Kaotik, Jan 6, 2014.

  1. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,889
    Likes Received:
    4,536
  2. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,423
    Likes Received:
    10,316
    Interestingly, prior to the release of FreeSync monitors, Scott Wasson at Tech Report was fairly skeptical about FreeSync, but he was willing to give it a fair shake once he could finally test one.



    With regards to the under-40 Hz performance of the BenQ monitor? It's maybe barely worse than how Nvidia handles things when the framerate goes below the variable refresh rate cut-off.

    The short vlog version of it...

    http://techreport.com/news/28116/here-our-discussion-of-freesync-g-sync-and-the-benq-xl2730z

    And the in-depth testing with the monitor, as well as asking both Nvidia and AMD how they handle things.

    http://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed/3

    So, in other words, you'll only see the "better" way that G-Sync handles this in an extremely contrived demo where the framerate is locked to a single static number. In games, where the framerate varies from one frame to the next when below the variable refresh lower bound? It basically offers the same experience as a G-Sync monitor.
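The behaviour below the variable refresh floor can be sketched in a few lines. This is a hypothetical model of the frame-duplication idea discussed in the thread, not AMD's or Nvidia's actual implementation; the 40-144 Hz window is an assumed example matching the BenQ's advertised range:

```python
def effective_refresh(frame_rate_hz, vrr_min=40.0, vrr_max=144.0):
    """Keep the panel inside its variable-refresh window by repeating
    each frame an integer number of times when the source frame rate
    drops below the window's floor (hypothetical 40-144 Hz panel)."""
    if frame_rate_hz >= vrr_min:
        # Inside the window: the panel refreshes once per rendered frame.
        return frame_rate_hz, 1
    # Below the floor: show each frame n times so that
    # frame_rate * n lands back inside the supported window.
    n = 2
    while frame_rate_hz * n < vrr_min:
        n += 1
    return frame_rate_hz * n, n

print(effective_refresh(60))  # (60, 1) - no duplication needed
print(effective_refresh(25))  # (50, 2) - each frame shown twice
print(effective_refresh(12))  # (48, 4) - each frame shown four times
```

The point of the model is that once frames are duplicated to stay in range, a variable source framerate below the floor produces the same kind of repeat cadence on either technology.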

    And he also talks about the "ghosting" issue. Basically it's a non-issue with the tech itself and depends purely on the monitor makers. Nvidia, with their module, does the tuning for the monitor manufacturers. Only fair, as they are paying a premium to Nvidia for the G-Sync module.

    And the anti-blur strobing doesn't work with variable refresh on the BenQ, just like it doesn't work on the Asus G-Sync monitor.

    Which explains why it is disabled when variable refresh is enabled. As with all things internet, much ado about nothing.

    Regards,
    SB
     
    BRiT likes this.
  3. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,889
    Likes Received:
    4,536
     
  4. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,889
    Likes Received:
    4,536
    Well, at least someone is asking the right questions ... :lol:

     
  5. STaR GaZeR

    Newcomer

    Joined:
    Dec 10, 2011
    Messages:
    16
    Likes Received:
    0
    What a load of BS. Proper overdrive is 100% on the monitor's side. It has nothing to do with anything else.
     
  6. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    AMD stated that FreeSync is more than just Adaptive Sync, and that monitors don't automatically deserve the FreeSync badge. In other words: it reaches beyond the cable and goes inside the monitor. You may think it's BS that they are now called out for it, but it's something of their own making, and their own, lower, standards.
     
  7. Rurouni

    Veteran

    Joined:
    Sep 30, 2008
    Messages:
    1,101
    Likes Received:
    432
    Is the ghosting issue (lack of overdrive?) on FreeSync really a problem outside of the windmill demo? Will people using it for gaming notice it?
     
  8. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    If the FreeSync decal doesn't adhere to the bezel, the monitor is given only probationary approval...
     
  9. STaR GaZeR

    Newcomer

    Joined:
    Dec 10, 2011
    Messages:
    16
    Likes Received:
    0
    It doesn't matter what X or Y company representative said. Overdrive is part of the monitor electronics, period.
     
  10. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    It's depressing that on a graphics forum I have to link this, since people can't do the basic research themselves, but there it is:

    http://www.tftcentral.co.uk/advancedcontent.htm

    Response time compensation is an analogue conditioning of the signal fed to the LCD matrix.
     
    Lightman likes this.
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I don't know. But if it's something that's only exposed by that demo, then what does it say about the understanding of the issue by those who wrote it? :wink:

    That's nice, dear. I've never claimed otherwise. Overdrive is also something that impacts the visual experience of a demo that's supposed to highlight the very technology it's supposed to promote.
     
  12. STaR GaZeR

    Newcomer

    Joined:
    Dec 10, 2011
    Messages:
    16
    Likes Received:
    0
    You said:

    Both bolded parts are just wrong, "dear". The first because Adaptive Sync is part of the DP standard (i.e. the cable and the signal that goes through it), and AMD is doing nothing outside the standard in the cable or in the signal that goes through it. The only thing they can do is frame duplication and similar tricks before the signal enters the cable (i.e. in the video content itself), and that has nothing to do with the DP standard, nor with the Adaptive Sync part of it.

    The second one is just a wild assumption of yours. It doesn't make any sense, because they'd need some kind of proprietary protocol to communicate some magic sauce to the monitor.
     
  13. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Yes, yes. That's what the Adaptive Sync standard is.

    They bless a monitor with a FreeSync label if it satisfies their quality requirements. That doesn't mean the GPU has influence over the internals of the monitor. It doesn't mean that the GPU needs to signal something magic to the monitor. It's a label of quality. And dealing with ghosting is apparently not part of that quality assessment.
     
    nutball likes this.
  14. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    Why do you say that?

    If you have HDMI in your monitor, it's likely it has memory (because scaling and de-interlacing are required on progressive scan monitors). Low-quality scaling and de-interlacing are possible without memory - emphasis on "low quality". I merely linked a document that showed memory is a real thing inside monitors.

    Scalers are normally part of monitors. Those that are memory based seem to add something like 10-20ms of latency. The majority of monitors with lag results tested show this approximate value of lag, implying they're memory-based. The reference for latency is usually a CRT.
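The 10-20 ms figure is consistent with a simple back-of-envelope check: a memory-based scaler that buffers a full frame before displaying it adds roughly one refresh interval of lag. A quick illustration (my own arithmetic, not from the reviews):

```python
def frame_buffer_lag_ms(refresh_hz):
    """Approximate latency added by buffering one full frame
    before display: roughly one refresh interval."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 120, 144):
    print(f"{hz:>3} Hz -> ~{frame_buffer_lag_ms(hz):.1f} ms per buffered frame")
```

At the 60-75 Hz refresh rates typical of most tested monitors, one buffered frame is ~13-17 ms, which sits right inside the measured 10-20 ms band.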

    Screens like the Dell P2714H

    http://www.tftcentral.co.uk/reviews/dell_p2714h.htm

    don't seem to have a framebuffer based scaler (or they have a super-fast framebuffer scaling algorithm).

    The G-Sync based Acer XB270HU obviously does have a framebuffer, but we don't know whether that's used for scaling. With G-Sync off it has very low lag:

    http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm

    But the lag increases, "a few milliseconds at most", when G-Sync is turned on (which could imply that it doesn't use the framebuffer for scaling or it has a super-fast scaling algorithm or that the scaler is only active in non-native resolutions - most monitors have the scaler permanently active).

    Tech Report showed that lag is the same on the G-Sync and the Adaptive Sync monitors it compared. That might indicate that the Adaptive Sync monitor has no scaler (very doubtful), that it has a framebuffer-less scaler, that it falls back to framebuffer-less scaling when Adaptive Sync is turned on (to reduce lag), or that the scaler is disengaged entirely (again, to reduce lag). Can't tell. If the latter, then it would seem that a whole load of monitor functionality is turned off while in Adaptive Sync mode, presumably by bypassing the components that do things like scaling and response-time compensation. The chip I linked earlier does RTC, so it's likely that there are other cases where scaling and RTC are within a single chip.

    Complete guess: BenQ (and the other monitor makers?) chose to do "gaming mode with Adaptive Sync" by bypassing the chip that does framebuffer scaling and RTC (or the bit of the chip that does those things?). If they're all using the same chip with Adaptive Sync support and that's the way that chip works, then whoops.

    Apart from that, their thought process seems to be "we can't do variable-overdrive RTC, which is the ideal approach for framerates between 40 and 144 fps, so we'll just turn it off; at 144 Hz, RTC isn't that important." Of course, at lower frame rates RTC becomes more important, as ghosting becomes more obvious.

    It's pretty stupid to turn off RTC just because Adaptive Sync is on. Maybe the chip works this way, so the monitor makers are forced to use this compromise?
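Why variable refresh complicates overdrive can be sketched with a toy lookup table. At a fixed refresh the controller knows the frame time in advance and can pick a drive strength tuned so the pixel lands on target exactly when the next frame arrives; under variable refresh the next frame time is unknown when the drive value must be applied. All names and values below are made up for illustration:

```python
# Hypothetical overdrive tuning table, indexed by refresh rate.
# Shorter frame times need harder drive; values are illustrative only.
OVERDRIVE_LUT = {
    144: 1.00,   # short frame time: drive hard
    60:  0.55,   # longer frame time: drive gently
    40:  0.40,
}

def pick_overdrive(expected_hz):
    """Pick the tuning for the nearest refresh rate in the table."""
    nearest = min(OVERDRIVE_LUT, key=lambda hz: abs(hz - expected_hz))
    return OVERDRIVE_LUT[nearest]

# Fixed refresh: the frame time is known, the choice is trivial.
print(pick_overdrive(144))  # 1.0

# Variable refresh: the controller can only guess the NEXT frame time,
# e.g. from the previous frame. Guess wrong and the pixel over- or
# undershoots, which shows up as inverse ghosting or ghosting.
previous_frame_hz = 90
print(pick_overdrive(previous_frame_hz))  # 0.55 - may be wrong for this frame
```

Which is presumably why the easy engineering answer is to disable overdrive entirely in variable refresh mode, as the BenQ appears to do.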

    I listed those to simply demonstrate that non-G-Sync monitors have memory.

    Some time after 2007?

    http://www.hdtvsolutions.com/BenQ_FP241WZ_LCD_Monitor_Review.htm

    Neither do I. I merely suggested that since memory is frequently a part of monitors, it is possible :wink:
     
  15. Rurouni

    Veteran

    Joined:
    Sep 30, 2008
    Messages:
    1,101
    Likes Received:
    432
    You're giving the FreeSync label a bit too much weight. It's probably just a label granted when a monitor has been tested with AMD's FreeSync and its adaptive sync works properly. It's up to the monitor manufacturer to implement whatever technique reduces ghosting. Maybe in the next BenQ FreeSync monitor they will add a feature called SyncOverdrive and charge a premium for it.
     
  16. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,889
    Likes Received:
    4,536
    The responsibility lies with whatever is causing the issue, and in this case it's FreeSync. Without FreeSync, or with it disabled, there is no ghosting problem.

    https://translate.google.com/transl...sl=pl&tl=en&u=http://pclab.pl/art62755-4.html

    In that case it is no different from G-Sync: no longer a free, open VESA standard but a proprietary one targeted at supporting one graphics manufacturer. The monitor manufacturers have no obligation to support broken standards that are not "plug and play".
     
    #436 pharma, Apr 18, 2015
    Last edited: Apr 18, 2015
  17. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
    AMD seems to disagree;

    http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/05/29/what-is-project-freesync

    Said series, at least, only does bob de-interlacing. So basically just scaling.

    Mere existence does not automatically make something normal to me.

    Having enough memory to store a few lines versus enough memory to store at least a couple of frames -- plus all the necessary complexity required for G-Sync-like capabilities -- makes a lot of difference in a business where economies of scale are an important aspect. I don't think companies like Novatek, Realtek or MStar will bother designing and producing a scaler whose potential user base is limited to a small portion of 290/290X owners.
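The gap between those two memory budgets is easy to put numbers on. A rough illustration, assuming a 2560x1440 panel at 24 bpp (figures are mine, purely for scale):

```python
WIDTH, HEIGHT, BYTES_PER_PIXEL = 2560, 1440, 3  # 24 bpp, e.g. a 1440p panel

line_bytes = WIDTH * BYTES_PER_PIXEL            # one scanline
frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # one full frame

# A scaler that only holds a handful of lines:
few_lines = 8 * line_bytes
# A G-Sync-like design that holds a couple of full frames:
two_frames = 2 * frame_bytes

print(f"8 lines:  {few_lines / 1024:.0f} KiB")        # ~60 KiB
print(f"2 frames: {two_frames / (1024**2):.1f} MiB")  # ~21.1 MiB
```

Roughly a 350x difference in on-chip (or external) memory, which is the cost gap the scaler vendors would have to justify.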
     
  18. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    I have no idea what you're talking about :???:
     
  19. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
    Amid the opposing reactions to the manufacturer's response, which stated that (VRR) overdrive functionality is part of the FreeSync architecture, I took your post as an argument that this is not the case.
     
  20. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    They are listed as producing scalers for DP 1.2a, DP 1.3, etc., and therefore Adaptive Sync. The number of AMD GPU users is not in question; basically, by 2016 every monitor should support Adaptive Sync in one way or another, if it supports the DisplayPort standard.
     
    #440 lanek, Apr 19, 2015
    Last edited: Apr 19, 2015