AMD demonstrates Freesync, G-sync equivalent?

Discussion in 'Rendering Technology and APIs' started by Kaotik, Jan 6, 2014.

  1. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,850
    Likes Received:
    2,772
    Location:
    Finland
  2. cal_guy

    Newcomer

    Joined:
    Jun 27, 2008
    Messages:
    216
    Likes Received:
    2
Considering that current implementations of G-Sync require 768MB of DRAM, I'd say G-Sync is as good as dead now, even if FreeSync is substantially inferior.
     
  3. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,201
    Likes Received:
    2,143
    Go AMD go! I really hope they will manage to push this tech to the market soon.
     
  4. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,345
    Likes Received:
    3,851
    Location:
    Well within 3d
    There is a VESA standard for variable VBLANK, which AMD is using for Freesync.
    Wouldn't Nvidia have been aware of this while conceiving of or developing G-Sync?
    If it was aware, what's the catch that would make Nvidia aim for custom hardware and larger on-board memory?
     
  5. Wynix

    Veteran Regular

    Joined:
    Feb 23, 2013
    Messages:
    1,052
    Likes Received:
    57
This is great; my question now is which monitors support the VBLANK standard.
Nvidia will need to prove G-Sync's worth now (if FreeSync is comparable).
     
  6. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
I could be wrong, but VBLANK is still part of the standard for LCDs (a holdover from CRT times that has been kept on LCD monitors).
     
    #6 lanek, Jan 6, 2014
    Last edited by a moderator: Jan 6, 2014
  7. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,850
    Likes Received:
    2,772
    Location:
    Finland
    To be more precise, it needs to support variable VBLANK, not just VBLANK (at least that's how I understood it)
     
  8. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    745
    Likes Received:
    39
    Location:
    Copenhagen
Damn.. while I really liked the idea of variable sync, and something finally being done about this long-standing problem, the G-Sync implementation of it is just unnecessarily expensive and generally wrong in so many ways.

But I could imagine it only works out of the box (i.e. on existing devices) on laptops and other integrated (and power-saving-aware) displays, not through HDMI/DisplayPort.
     
  9. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
  10. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
I think they do. The VESA standard that deals with variable frame refresh is Embedded DisplayPort (eDP), but according to its spec, eDP "was developed to be used specifically in embedded display applications." Another VESA standard, Direct Drive Monitor, is said to be able to work around this eDP limitation. But whether the combination of both standards is actually feasible, or whether it would end up any different from NVidia G-Sync, is still a big question at this point. Competition is always welcome though.

NVidia G-Sync is more likely an evolution of genlock technology. They even had a genlock add-in board with the GSync brand at some point, way back in the FX 4000 days.
     
  11. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    Either to circumvent some patent bobbing around out there, or else just to lock people in on a vendor-specific piece of tech that NV can overcharge for to buff revenue.

    Like I have always said with all of NV's proprietary shit, we can do better without that crap. Proprietary only ever leads to headache in the PC space, ever. It's always been universally true. Always.

    (Now someone will drag up some successful examples of proprietary to thump me in the head - lol. Windows itself would be a prime example I suppose! :D)
     
    digitalwanderer likes this.
  12. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,247
    Likes Received:
    3,447
Apparently it is not like G-Sync at all, as it requires V-Sync to be active, which means additional lag and the potential for cutting the frame rate in half.

Also, in AMD's assessment, NVIDIA is doing it with a combination of variable refresh rate and triple buffering, which is not accurate, as that would add even more lag. The AMD representative offered another theory later on, but the whole thing doesn't appear to have been studied closely by AMD yet.



    http://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech
     
    digitalwanderer likes this.
  13. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
Triple buffering doesn't have to add any lag at all if you only use the third buffer when you miss a vsync (and in the event of a miss, you actually lower lag compared to double buffering with vsync on :))
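A minimal sketch of that flip logic (hypothetical Python, not any real driver code; all names are invented for the sketch):

```python
# "Lazy" triple buffering, as described above: behave like double buffering
# while vsync is being hit, and only press the third buffer into service
# when a frame misses its vsync window.

def flip(front, back, spare, frame_ready, vsync_hit):
    """Return the (front, back, spare) roles after one vsync interval."""
    if frame_ready and vsync_hit:
        # Normal flip, identical to double buffering: no added latency.
        return back, front, spare
    if frame_ready and not vsync_hit:
        # Missed the window: park the finished frame in the spare buffer so
        # rendering continues instead of stalling until the next vsync.
        return front, spare, back
    # No frame finished: keep scanning out the current front buffer.
    return front, back, spare
```

With vsync hit, this is exactly double buffering; the spare buffer only comes into play on a miss, which is why no extra latency is added in the common case.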
     
    digitalwanderer likes this.
  14. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
With GSYNC the display update is of course also synced to the buffer flip. That's the whole point (to sync the display to the buffer flip and to have a variable spacing between display updates to do that) ;). So no, that would be no difference.

Edit:
And the "triple buffering" pertains to the external GSYNC board (which has an FPGA and 768MB of memory for some reason). I think his theory was that nV GPUs don't support the variable refresh rate out of the box and need that external board to construct the input data for the panel (for which one or even more frames get stored in that onboard memory). I actually have no better idea why nV needs such an expensive board.
     
    #14 Gipsel, Jan 7, 2014
    Last edited by a moderator: Jan 7, 2014
    digitalwanderer likes this.
  15. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    745
    Likes Received:
    39
    Location:
    Copenhagen
Isn't the "third buffer" in Nvidia's solution just at the monitor end of the display cable? At least from Scott's description it sounds like the backup buffer needed for the case where the panel demands a refresh due to fading.
And if you don't want tearing, you need to have vsync active one way or another: at some point you choose to start a (panel) refresh, and then you have to wait for that to finish before starting the next one, aka vsync. But as long as you can delay the next refresh, you're not "cutting the number of frames in half" just because you missed the start-refresh window by a millisecond.
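That delay-the-refresh idea can be modeled in a few lines (illustrative Python; the 7 ms interval in the example below assumes a roughly 144 Hz panel, rounded):

```python
import math

# Toy comparison of fixed vsync vs a delayable ("variable") refresh.
# A panel scan-out takes interval_ms; with variable refresh the next scan-out
# can start as soon as a frame is ready (but not before the previous scan-out
# ends), while fixed vsync quantizes to the next refresh boundary.
# All numbers are illustrative.

def display_times_vrr(frame_done_ms, min_interval_ms):
    out, last = [], None
    for t in frame_done_ms:
        start = t if last is None else max(t, last + min_interval_ms)
        out.append(start)
        last = start
    return out

def display_times_vsync(frame_done_ms, interval_ms):
    out, last = [], None
    for t in frame_done_ms:
        start = math.ceil(t / interval_ms) * interval_ms
        if last is not None and start <= last:
            start = last + interval_ms
        out.append(start)
        last = start
    return out
```

With a 7 ms interval and frames finishing at 0, 10 and 17 ms, the variable scheme displays them at 0, 10 and 17 ms, while fixed vsync displays them at 0, 14 and 21 ms: missing a refresh boundary by a millisecond costs a whole interval only in the fixed case.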
     
  16. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    In that case you could simply send the same buffer from the GPU again. It's basically just the maximum vblank period the panel supports (which it can report to the GPU/driver). One doesn't need an additional buffer in the monitor for that.
     
  17. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    745
    Likes Received:
    39
    Location:
    Copenhagen
But what is the large buffer in the G-Sync module doing then? Receiving the frame before the panel refresh? That also seems unnecessary (unless the GPU's display output isn't capable of variable refresh, which would seem strange).

But yes, you're right that two buffers should be enough for tear-free operation and optimal sync. Three buffers would allow starting on the next frame earlier (i.e. without waiting on an ongoing refresh to finish), but unless the minimum refresh rate (like 30 Hz) is close to the maximum, it would not be of much benefit.

    Any idea of the minimum refresh rate for the panels?
     
  18. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
You can't, because flipping can happen at any time during the image transfer, and you'd end up with tearing. The idea of G-Sync is that it is the card that should be driving the monitor, not the other way around.


Edit:
Also, what is the point of demoing variable frame rate using a scene that does not change its content at all (thus producing a constant fps)? It does not look like AMD is really proving anything here.

Say the demo runs at a constant 20 fps, laptop A is set to refresh at 30 Hz and laptop B at 60 Hz; then the latter will be smoother without any need for variable refresh.
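That judder argument can be made concrete with a toy model (illustrative Python counting how many panel refreshes each frame stays on screen):

```python
import math

# For a constant-fps source on a fixed-rate panel, count how many refreshes
# each frame is held on screen. Even hold counts look smooth; alternating
# counts look juddery. Numbers are illustrative only.

def hold_pattern(fps, hz, frames=6):
    # Refresh tick at which frame i first appears
    # (the next tick at or after its completion time).
    shown = [math.ceil(i * hz / fps) for i in range(frames + 1)]
    return [b - a for a, b in zip(shown, shown[1:])]
```

hold_pattern(20, 60) gives [3, 3, 3, 3, 3, 3] (every frame held exactly 50 ms), while hold_pattern(20, 30) gives [2, 1, 2, 1, 2, 1] (alternating ~67 ms and ~33 ms holds), which is why the 60 Hz laptop looks smoother even without variable refresh.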
     
    #18 madyasiwi, Jan 7, 2014
    Last edited by a moderator: Jan 7, 2014
  19. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
Exactly what I said, and also the theory of that AMD guy as I understood it. I don't grasp what nV does with an expensive FPGA and 768MB RAM there. It would be completely unnecessary just for GSYNC/variable vblank if the GPU supported a variable vblank output directly.

In that case you simply delay the buffer flip, just as with vsync. The same happens with GSYNC above the maximum refresh rate of the panel, btw (this happens if a frametime is shorter than the minimum refresh interval of the panel, so above let's say 120 or 144Hz; otherwise the transfer of the frame is already complete before the next buffer flip). And if you don't like that minimal additional delay when dropping below 30Hz or 24Hz framerates (whatever the panel can do as minimum), just go triple buffered; the retransfer of the frame adds at most 1/144Hz ≈ 7ms, it won't cut the framerate in half ;). But you don't want to go into that low-framerate territory anyway. And ask yourself what GSYNC does if the framerate drops below 30Hz (the stated minimum): it has exactly the same problem, in that it needs to refresh the panel and can't accept a new frame during that time.
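To put rough numbers on that retransfer penalty (illustrative Python arithmetic; the 144 Hz panel and 25 fps frametime are assumed figures, not measurements):

```python
# Worst case for a self-refresh collision: a new frame finishes just as the
# panel began a forced refresh, so it waits for one full scan-out before it
# can be displayed. All figures are illustrative.
max_rate_hz = 144
scanout_ms = 1000 / max_rate_hz             # one panel scan-out: ~6.94 ms

frametime_ms = 40.0                         # a 25 fps frame
worst_case_ms = frametime_ms + scanout_ms   # ~46.9 ms, far from 80 ms

print(f"added delay at most {scanout_ms:.2f} ms, "
      f"worst-case frame latency {worst_case_ms:.1f} ms")
```

Even in this worst case the extra latency is bounded by one scan-out, so the effective rate drops from 25 fps to roughly 21 fps rather than being halved.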
     
    #19 Gipsel, Jan 7, 2014
    Last edited by a moderator: Jan 7, 2014
  20. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
Ok, scrap what I said. Now I think it's actually simpler than that. You can't do that because, in the case of G-Sync, the video card only keeps the back buffer. And there can't be any VSync-induced delay, because eliminating VSync input lag is one of the main selling points of G-Sync. And when the frame rate is higher than the max panel refresh rate, the latest frame image is stored in the G-Sync memory module. No delay necessary.

I don't see how zero tearing and zero VSync delay are simultaneously achievable in any other way.
     