Nvidia G-SYNC

Discussion in 'Rendering Technology and APIs' started by DSC, Oct 18, 2013.

  1. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    Well you have to admit that G-Sync is actually a pretty good name to describe the technology :D

    Anyway, at a high level, the fundamental idea behind NVIDIA's G-Sync is pretty simple and straightforward (i.e. instead of the monitor driving timing to the GPU, the GPU drives timing to the monitor). This basic idea has probably been thought about and expressed for many years (NVIDIA claims to have been working on G-Sync technology for several years). The difficulty has been in actually implementing the idea. What spiked_mistborn suggested was to use a framebuffer in the monitor, but he never actually suggested how to conveniently do that.
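    To make the contrast concrete, here is a toy model (entirely my own sketch, with made-up frame times) of monitor-driven vs. GPU-driven presentation timing:

```python
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz scanout interval

def display_times(render_times_ms, gpu_driven):
    """Return when each frame reaches the screen, in ms."""
    out, t = [], 0.0
    for r in render_times_ms:
        finish = t + r
        if gpu_driven:
            # G-Sync style: the panel refreshes the moment the frame is done
            disp = finish
        else:
            # Classic v-sync: wait for the next fixed refresh boundary, and
            # (double-buffered) the GPU stalls until the buffer flips
            disp = math.ceil(finish / REFRESH_MS) * REFRESH_MS
        out.append(disp)
        t = disp
    return out

frames = [15.0, 17.0, 15.0]  # the 17 ms frame just misses a 60 Hz slot
print(display_times(frames, gpu_driven=False))  # one frame slips a whole refresh
print(display_times(frames, gpu_driven=True))   # frames appear as they finish
```

    With a fixed refresh, the 17 ms frame gets punished with a full extra 16.7 ms wait; with GPU-driven timing it is simply 2 ms late.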

    Edit: the name "G-Sync" is actually not new at all. NVIDIA has been using a Quadro graphics technology called "G-Sync" for some years now. This newer G-Sync monitor technology is obviously much different in comparison, and it appears that NVIDIA has a trademark on the name.
     
    #181 ams, Oct 25, 2013
    Last edited by a moderator: Oct 25, 2013
  2. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,057
    Likes Received:
    3,114
    Location:
    New York
    Probably because nobody thinks engineers with PhD's get their ideas from random forum posts on the internet :)
     
  4. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    The funny thing is just that he proposed the G-Sync name. Otherwise, such discussions (detailing the exact idea of how it works, with transfers only occurring when a new frame has been rendered) can be found several years back. The idea is kind of obvious and actually quite old.
     
  5. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,995
    Likes Received:
    1,062
    Location:
    Finland
    This G-Sync thing is no joke. It brings some pretty major advancements to displays. It will even bring a better version of the LightBoost 2D hack, but they'll talk about that later. Near-zero motion blur on an LCD display.

    Stuff about the lightboost hack.
    http://www.blurbusters.com/zero-motion-blur/lightboost-faq/
     
  6. imaxx

    Newcomer

    Joined:
    Mar 9, 2012
    Messages:
    131
    Likes Received:
    1
    Location:
    cracks
    In Jung's "Psychological Types" there is an interesting passage that I still remember and love:
    it was about the difference between Schopenhauer's theory of the world as 'illusion' and the speech of a madman, recorded years before him, saying that the world does not exist, etc.

    ...his answer was: the former built a full-blown system around it, the latter was just chatting without putting any real work into it.

    (btw this is why I *hate* software patents, like Abrash was commenting ~20 years ago in DDJ...)
     
  7. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    It's worth noting, if we're talking about LightBoost, that it's an either/or thing: G-Sync or LightBoost. The monitor supports both, but not at the same time, as LightBoost relies on a regular refresh, at least if you want to implement it easily.

    In fact you can use a G-Sync monitor on an AMD card, the G feature is wasted but Lightboost is doable.

    Improvements are promised for LightBoost, but I suppose the focus is first on brightness, color, gamma accuracy, and maybe 144Hz operation.
     
  8. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    It should be relatively simple to implement a continuous blend from backlight strobing (at high frame rates) to an almost continuous backlight (at low frame rates). That would mean the LCD-typical hold-type blur just grows continuously towards lower frame rates.

    edit:
    It would work like that:
    (i) backlight off, pixel matrix is getting new values
    (ii) backlight strobe
    (iii) as long as there is no new frame, turn on backlight with the average brightness

    That way one wouldn't need to know beforehand how long a frame will be displayed; it can be extended indefinitely (as long as the pixels can hold their value) because it is not working with a predetermined duty cycle for the backlight.
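    A minimal sketch of that three-step scheme (my own interpretation of steps (i)-(iii); the timings and brightness levels are invented for illustration):

```python
# Backlight level as a function of time since a new frame hit the panel.
MATRIX_SETTLE_MS = 1.0   # (i) backlight off while pixels take new values
STROBE_MS = 1.5          # (ii) short full-brightness strobe
HOLD_LEVEL = 0.09        # (iii) dim steady level approximating the average
                         #       brightness of a strobed 60 Hz cycle

def backlight(t_ms):
    """Backlight intensity (0..1) t_ms after the latest frame arrived."""
    if t_ms < MATRIX_SETTLE_MS:
        return 0.0                    # pixels still transitioning: stay dark
    if t_ms < MATRIX_SETTLE_MS + STROBE_MS:
        return 1.0                    # crisp low-persistence flash
    return HOLD_LEVEL                 # hold dimly until the next frame,
                                      # however long that takes
```

    At high frame rates the next frame arrives before step (iii) contributes much, so the panel behaves like a strobed backlight; at low frame rates the hold phase dominates and blur grows gradually, as described.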
     
    #188 Gipsel, Oct 26, 2013
    Last edited by a moderator: Oct 26, 2013
  9. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,563
    Likes Received:
    171
    Location:
    In the Island of Sodor, where the steam trains lie
  10. Priyadarshi

    Newcomer

    Joined:
    Sep 22, 2012
    Messages:
    57
    Likes Received:
    0
    Location:
    USA
    Nvidia demoed this at our uni a few weeks ago and it looked pretty sweet in motion. Variable frame rates still cause stutter though ;)

    What we really need is to fix this problem in software. Game engines should try to predict the scene complexity of the next frame and adjust effects accordingly. Similar to what Netflix does, which is far better than YouTube any day.
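    One concrete software-side approach along these lines is dynamic resolution scaling driven by a smoothed frame-time estimate; the class below is purely a hypothetical sketch, not any engine's actual code:

```python
TARGET_MS = 16.6  # frame budget for 60 Hz

class DynamicResolution:
    """Adjust render scale so predicted frame time tracks the budget."""
    def __init__(self):
        self.scale = 1.0       # fraction of native resolution per axis
        self.ema = TARGET_MS   # smoothed frame-time estimate

    def update(self, frame_ms):
        # Exponential moving average as a cheap next-frame predictor
        self.ema = 0.8 * self.ema + 0.2 * frame_ms
        # Shading cost grows roughly with scale^2 (pixel count),
        # so correct by the square root of the overshoot
        self.scale *= (TARGET_MS / self.ema) ** 0.5
        self.scale = max(0.5, min(1.0, self.scale))
        return self.scale

dr = DynamicResolution()
for _ in range(10):        # a run of slow 20 ms frames...
    s = dr.update(20.0)
print(round(s, 2))         # ...steadily pulls the render scale below 1.0
```

    The square-root correction and the 0.5 lower clamp are arbitrary tuning choices; the point is only that the engine reacts before the frame misses its slot, rather than after.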
     
  11. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    If the framerate varies wildly, sure. But a drop from 60Hz to 55 won't cause all the weird problems it causes now.

    As for predicting the complexity of the next frame, that must be near impossible.
     
  12. Priyadarshi

    Newcomer

    Joined:
    Sep 22, 2012
    Messages:
    57
    Likes Received:
    0
    Location:
    USA
    There is a huge difference between "exactly predicting what the next frame will look like" and "what the next frame might look like". Sometimes even simple extrapolation of motion vectors will give you an idea of what you might be looking at.

    Even in Source engine games the frame rate is all over the place, from 50 to 300.
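    For what it's worth, the "simple extrapolation" mentioned here can be as small as constant-velocity prediction (a hypothetical sketch, not any engine's actual code):

```python
def extrapolate(prev, curr):
    """Predict the next-frame position assuming constant velocity."""
    return tuple(c + (c - p) for p, c in zip(prev, curr))

# An object that moved from (0, 0) to (1, 2) over one frame is guessed
# to reach (2, 4) in the next; good enough to estimate, say, how much
# of the screen a fast-moving effect will cover.
print(extrapolate((0.0, 0.0), (1.0, 2.0)))  # → (2.0, 4.0)
```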
     
  13. Andrew Lauritzen

    Andrew Lauritzen Moderator
    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,629
    Likes Received:
    1,227
    Location:
    British Columbia, Canada
    I agree that this doesn't excuse developers from choosing techniques and optimizing engines for smooth frame generation, but it does solve the problem that there's a cliff at 16.6ms and if you miss it - even by a single microsecond - it looks like trash. That visual discontinuity needs to go away and I'm glad NVIDIA has done the legwork to make that happen.
     
  14. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    Guru3D

    G-Sync has been designed in good part in collaboration with Asus... I can imagine they want some profit from it.
     
    #194 lanek, Nov 2, 2013
    Last edited by a moderator: Nov 2, 2013
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,057
    Likes Received:
    3,114
    Location:
    New York
    I was planning to upgrade to a 2560x1440 27" monitor but don't know if that's wise at this point. From all accounts g-sync really makes a difference. Asus exclusivity sucks and it's not clear whether the likes of Dell will even bother. Maybe I'll just go for the 2713hm anyway and upgrade again if gsync gains traction.
     
  16. Npl

    Npl
    Veteran

    Joined:
    Dec 19, 2004
    Messages:
    1,905
    Likes Received:
    7
    I'd say go for it now. G-Sync's main impact is to unsettle the public enough to force a common standard to emerge.
    I guess in around 1.5-2 years you'll see monitors supporting a good standardized protocol - and quite possibly not G-Sync anymore.
     
  17. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
    You say go for it now, and then you back that up with a statement saying it's better to wait.
    Your thought process is very strange...
     
  18. Npl

    Npl
    Veteran

    Joined:
    Dec 19, 2004
    Messages:
    1,905
    Likes Received:
    7
    It only depends on how long you want to wait.
    Something better is always around the corner. If he wants a new monitor now then there's no reason to wait for a G-Sync enabled one, since this seems to be a transitional step at a premium price and not a good long-term investment.

    That better?
     
  19. The first G-Sync monitors will probably use TN panels to achieve the high refresh rates.

    I'll stick to my old 24" until all the dust surrounding stereo 3D, OLED, gsync, etc. etc. settles a bit.
     
  20. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    I've decided to purchase a G-sync monitor as soon as they are available. I am very sensitive to latency and tearing; this will transform my gaming experience.
     