Nvidia G-SYNC

Discussion in 'Rendering Technology and APIs' started by DSC, Oct 18, 2013.

  1. Rodéric

    Rodéric a.k.a. Ingenu
    Moderator Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,080
    Likes Received:
    997
    Location:
    Planet Earth.
    I think they are 144 Hz tops atm.
    Honestly if you go faster, just throttle your game ;p
    (Well or drop frames or whatever...)
     
  2. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    It works exactly like vsync there: it locks to the (maximum) refresh rate.
     
  3. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,236
    Likes Received:
    4,259
    Location:
    Guess...
    Thanks everyone. So the higher your monitor's refresh rate the better, since it gives you more headroom for increased frame rates, but you never have to worry about going over that refresh rate and tearing, since the GPU would wait for the next monitor refresh to send the next frame once you hit the ceiling.

    So it's very similar to adaptive vsync, with the advantage that below the monitor's max refresh rate you'll see no tearing. Apparently stutter is also experienced below the refresh rate with vsync off, but this isn't something I've personally noticed (playing all games with vsync off on a 120 Hz monitor).
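    The behaviour described above can be sketched as a toy model (the 144 Hz ceiling and the `scanout_times` helper are assumptions for illustration, not actual driver logic):

```python
# Toy model of variable refresh with a ceiling: a frame is scanned out as
# soon as it's ready, unless that would exceed the panel's maximum refresh
# rate; then the GPU waits for the next allowed refresh, like vsync.
MAX_HZ = 144
MIN_INTERVAL = 1000.0 / MAX_HZ  # ~6.94 ms minimum between refreshes

def scanout_times(render_times_ms):
    """Return when each frame hits the screen under variable refresh."""
    out, last, t = [], float("-inf"), 0.0
    for rt in render_times_ms:
        t += rt                              # frame finishes rendering at t
        start = max(t, last + MIN_INTERVAL)  # respect the panel's max rate
        out.append(start)
        last = start
    return out

# 5 ms frames (200 fps) get held to the 144 Hz ceiling; 20 ms frames
# (50 fps) are displayed as soon as they finish, with no tearing.
print(scanout_times([5, 5, 20, 20]))
```

    So the only place G-Sync behaves like classic vsync in this sketch is at the ceiling; everywhere below it, the display simply follows the GPU.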
     
  4. Has anyone commented on what perks (or limitations?) this will bring to SLI?
     
  5. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    SLI's microstutter is gone. Other than that, I can't see any impact off the cuff.
     
  6. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Shouldn't it just display the SLI stutter with high fidelity, with no tearing or further induced stutter? Or maybe that question can be ignored, since I'm not accounting for SLI frame pacing.
     
  7. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Gone due to frame pacing or G-Sync?
     
  8. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    The differing latencies for alternate frames in SLI are gone. There will be no tearing with G-Sync, ever.

    The driver doesn't need to artificially delay half the frames anymore.
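    A toy sketch of the artificial delay being described (hypothetical numbers and the `pace` helper are illustrative, not the actual driver's frame-pacing logic): the driver holds back the "early" frame of each AFR pair so that frames reach the fixed-rate display at even intervals.

```python
# Frame pacing for a fixed-refresh display: delay frames so the intervals
# match the average. Frames can only be delayed, never shown early.
def pace(presents):
    """presents: raw completion times (ms); returns evenly paced times."""
    avg = (presents[-1] - presents[0]) / (len(presents) - 1)
    paced = [presents[0]]
    for t in presents[1:]:
        paced.append(max(t, paced[-1] + avg))  # hold the early frame back
    return paced

# Classic AFR microstutter: intervals alternate 5 ms / 25 ms.
raw = [0, 5, 30, 35, 60, 65, 90]
print(pace(raw))  # paced to an even 15 ms cadence
```

    With G-Sync the panel refreshes when a frame arrives, so this added latency on half the frames serves no purpose.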
     
  9. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Well, GPU B releases a frame at t0 = 0 ms, GPU A releases a frame at t1 = 15 ms, GPU B releases a frame at t2 = 30 ms, then GPU A has a hard, long frame to render: it goes out at t3 = 57 ms, but GPU B puts out its frame at t4 = 62 ms. Now, if every frame takes 30 ms again from then on, aren't the frames from GPU A and GPU B virtually locked at 5 ms from each other?
    That'll give you a jerky and slow picture while the framerate counter is happy showing a big number.

    That's what I understand of the situation: G-Sync alone gives you an excellent, tear-free rendition of a jumpy, stuttered framerate.
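    The timeline above can be checked in a few lines (t0..t4 taken straight from the post; the continuation assumes each GPU keeps taking 30 ms per frame, alternating):

```python
# AFR timing example: after GPU A's one long frame, the two GPUs end up
# presenting only 5 ms apart even though each still takes 30 ms per frame.
present = [0, 15, 30, 57, 62]          # t0..t4 from the post (ms)

# From t4 on, each GPU delivers its next frame 30 ms after its previous one:
for _ in range(3):
    present.append(present[-2] + 30)   # next frame comes from the other GPU

gaps = [b - a for a, b in zip(present, present[1:])]
print(gaps)  # intervals settle into an alternating 5 ms / 25 ms pattern
```

    So the spike permanently skews the phase between the two GPUs, which is exactly the microstutter pattern, and a variable-refresh display will reproduce it faithfully.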
     
    #169 Blazkowicz, Oct 24, 2013
    Last edited by a moderator: Oct 24, 2013
  10. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    After the first spike, the fps counter stays stable at 33 fps.

    As far as interleaved slow and fast refreshes are concerned, there is nothing G-Sync can do about it. Removing it won't fix the problem either.
     
  11. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Well, I'm not sure exactly how the FPS counter would behave, but averaged it will be 66 fps (30 ms between one GPU's two frames, so two frames per 30 ms).

    Found a framerate graph to illustrate the issue, in v-sync off

    Red is GeForce DDR, blue is ATI Rage Fury Maxx :razz:
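    The averaging can be sanity-checked (idealized numbers, assuming the two GPUs keep alternating perfectly):

```python
# Each GPU delivers one frame every 30 ms, and the two GPUs alternate,
# so an averaging counter sees 2 frames per 30 ms window.
frame_time_per_gpu_ms = 30
gpus = 2
fps = gpus * 1000 / frame_time_per_gpu_ms
print(round(fps, 1))  # → 66.7
```

    The counter reports ~66 fps even though the 5 ms / 25 ms spacing makes the motion look closer to 33 fps.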
     
    #171 Blazkowicz, Oct 24, 2013
    Last edited by a moderator: Oct 25, 2013
  12. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
  13. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    I think ToTTenTranz was pointing to a particular comment related to GSync (note the date), not the article content itself.
     
  14. Yes, it's a direct link to the comment, not the article.

    It's a guy in March 2013 who says "hey, what if we did something.. let's call it GSync to solve this" and then proceeds to describe nVidia's G-Sync.
     
  15. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    wow....
     
  16. Yes.
    I'm wondering how/why this hasn't gone viral yet.
     
  17. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,236
    Likes Received:
    4,259
    Location:
    Guess...
  18. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I wonder if this guy works for Asus or Nvidia (lol, if that's the case, some of the replies to his post must have made him laugh like never before; I imagine the guy working on G-Sync getting responses like "it's impossible, it will never work" lol).
     
  19. I don't think so.
    I just happened to stumble on this post from the same user in the comments to Anandtech's review of the R9 290X:

    It's just that no one paid attention to it.
     
    #180 Deleted member 13524, Oct 25, 2013
    Last edited by a moderator: Oct 25, 2013