Nvidia G-SYNC

Discussion in 'Rendering Technology and APIs' started by DSC, Oct 18, 2013.

  1. pharma

    pharma Veteran

    @danybonin
    I see you already posted the same question in the GeForce forums. Creating a dynamic FPS limiter is not an easy task, so I doubt you will find someone to do it.

    You might want to look at Nvidia Inspector to set Custom FPS limiter rates ...

     
    Last edited: Jun 6, 2015
  2. pcchen

    pcchen Moderator Moderator Veteran Subscriber

    I think there's a thread buried somewhere with discussion about SLI micro-stuttering.
    In a nutshell, the reason why it's difficult to eliminate SLI micro-stuttering is because it's very difficult to predict how much time it takes to render the next frame (as in predicting the future...). Or, more precisely, it's difficult to make a perfect prediction every time. It's just like predicting weather: most of time it's correctly predicted, but when it's wrong (especially when the weather man says it's going to be a bright sunny day) it's very annoying.
    When you set the frame limiter to 50 fps you set the bar low so it's very rare that the rendering time exceeds the limit. It's like bringing your umbrella with you every day, you'll be dry unless something extraordinary happens (like a superstorm or something). To make a dynamic frame limiter, you'll need to be able to accurately predict how much time it takes to render the next frame (or next few frames), and that's the problem from the start.
    It might be possible if it's the game engine setting the frame limit, since game engines normally have better idea on how complex the next few frames are going to be, but that's still a difficult problem as it's not that easy to find the relationship between the complexity of your scenes and the rendering time on a specific GPU set up (and on a PC there're countless of different set ups).
     
    pharma likes this.
  3. Thanks for your reply.

    Of course it's probably not an easy task to make something like that. But why not just set the fps limiter a few fps below the actual fps? If I'm at 90 fps, the dynamic limiter would set 85 and stay there until the fps reaches 84 or less, then again move the limit a few fps below the current rate. So whenever it detects fps below the limit, it moves that limit down a few fps, and so on. The more complicated part is how to move the fps limiter back up. That could be harder to do.

    Maybe each time the limiter is moved down (because fps fell below it), take note of the GPU usage. If GPU usage then drops by a set percentage, ramp the limiter up until fps falls below it again, then move the limiter down a few fps as it normally would.
    This would be complicated, I know; it would have to use an average fps over time, maybe calculated over a second... I don't know... but I think it's possible to do.
    I'm sure those who made RTSS, or programs like EVGA Precision, or of course Nvidia, could do it. Nvidia made a Smooth VSync option for SLI users. Normal vsync (on a 60 Hz screen) switches from 60 Hz to 30 Hz as soon as fps drops below 60, then back to 60 Hz as soon as fps hits 60+. But with Smooth VSync they managed to make it so it won't switch back to 60 Hz until the fps is stable and sufficiently above 60, so there is far less switching between 30 Hz and 60 Hz. The same approach could be used in a dynamic fps limiter for ramping the limiter back up.

    Anyway, I'm sure it's possible to do, and I'm sure a lot of people would appreciate it. At least I would.
     
  4. Ethatron

    Ethatron Regular Subscriber

    What you describe is the same problem as for CPU clock states. You can clock up with a low threshold of high utilization (say 0.5 seconds at 75%), and then clock down with a high threshold of low utilization (say 2 seconds at 25%).
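    The limiter-with-hysteresis idea sketched in the last two posts can be put into toy code. Everything below is illustrative: the class name, the thresholds, and the assumption that you can poll an averaged fps figure and a 0..1 GPU load are all made up for the sketch, not any shipped RTSS or Nvidia API.

```python
class DynamicFpsLimiter:
    """Toy dynamic FPS limiter with asymmetric hysteresis: the cap is
    dropped quickly once the measured rate has sat below it for a short
    while, and raised only slowly after a long stretch of GPU headroom,
    mirroring how CPU governors clock up fast and down slow."""

    def __init__(self, start_cap=90.0, margin=5.0,
                 lower_after=0.5, raise_after=2.0,
                 low_load=0.60, raise_step=5.0):
        self.cap = start_cap            # current FPS cap
        self.margin = margin            # step below measured fps when lowering
        self.lower_after = lower_after  # seconds of misses before lowering
        self.raise_after = raise_after  # seconds of headroom before raising
        self.low_load = low_load        # GPU load treated as "headroom"
        self.raise_step = raise_step    # how far to creep the cap back up
        self._missed_since = None
        self._idle_since = None

    def update(self, now, measured_fps, gpu_load):
        """Feed one sample (timestamp in seconds, fps averaged over ~1 s,
        GPU load in 0..1) and return the current cap."""
        # Fast path down: sustained misses pull the cap a few fps below
        # the rate actually being achieved.
        if measured_fps < self.cap:
            if self._missed_since is None:
                self._missed_since = now
            if now - self._missed_since >= self.lower_after:
                self.cap = measured_fps - self.margin
                self._missed_since = None
        else:
            self._missed_since = None

        # Slow path up: only a long stretch of low GPU load raises the
        # cap, avoiding the rapid flip-flopping Smooth VSync is built
        # to suppress.
        if gpu_load <= self.low_load:
            if self._idle_since is None:
                self._idle_since = now
            if now - self._idle_since >= self.raise_after:
                self.cap += self.raise_step
                self._idle_since = None
        else:
            self._idle_since = None

        return self.cap
```

    A real tool would hook an actual frame limiter (RTSS-style) rather than return a number, and the windows and thresholds would need tuning per game, but the control loop itself is this simple.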
     
  5. But a dynamic fps limiter is something possible to program, no? Probably not an easy task, but we're not talking about sending a man to Mars... no?
    I think that would be a very useful tool for any SLI + G-Sync user. If only I were able to program this myself... lol.
     
  6. pharma

    pharma Veteran

    I think if Nvidia sees a benefit in this, then they will create it .....
     
  7. Alessio1989

    Alessio1989 Regular

    Does anyone know how G-Sync and Free/Adaptive-Sync behave on the flip presentation model compared to bitblt?
     
  8. Nvidia won't sell more video cards with this, so they don't care. But anyone with SLI + G-Sync @ 144 Hz who can't see the benefit of this is... ignorant.
     
  9. pharma

    pharma Veteran

  10. pharma

    pharma Veteran

    Tom's Hardware's G-Sync Vs FreeSync Event - July 18 (Updated)

    http://www.tomshardware.com/news/g-sync-vs-freesync-event,29393.html
     
  11. pjbliverpool

    pjbliverpool B3D Scallywag Legend

    Sounds awesome, shame it's on the other side of the world! I really hope they don't disclose which systems are which, in order to give completely objective results.
     
    homerdog and pharma like this.
  12. pharma

    pharma Veteran

    Results are in ....

    AMD FreeSync Versus Nvidia G-Sync: Readers Choose
    http://www.tomshardware.com/reviews/amd-freesync-versus-nvidia-g-sync-reader-event,4246.html
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag Legend

  14. pharma

    pharma Veteran

    Performance Outside the “Zone”

    http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69379-amds-freesync-long-term-review-3.html
     
    pjbliverpool likes this.
  15. Albuquerque

    Albuquerque Red-headed step child Veteran

    Eww, that behavior (AMD bottoming out at 20 fps) sucks and, in my opinion, defeats much of the purpose of adaptive sync. I have to assume (hope?) this is simply bad driver behavior and that alternate means of dealing with sub-40 fps refreshes will arrive. If the framerate drops off the planet, why can't they refresh the same frame at the highest supported refresh rate to get minimum latency to the next completed frame?
     
    Grall likes this.
  16. What's the point of enabling V-Sync over FreeSync? Doesn't FreeSync eliminate tearing by itself?

    Moreover, that Hardware Canucks article is from May. A lot has been updated in the meantime. For example, FreeSync + CrossFire has been available since 15.7.
     
  17. Blazkowicz

    Blazkowicz Legend

    The point is that to keep it tear-free the whole way through, you fall back to vsync when you go below the panel's minimum refresh rate. Thus you jump from 40 fps to 20 fps instantly.

    If the panel had 24 Hz as its minimum refresh, for instance, then the issue would not appear in the two benchmarks above.
    If the panel stays at 40 Hz and you fall back to vsync off instead, I suppose you would get very severe tearing (tearing is awful enough at 60 Hz, so at 40 Hz it should be extremely bad).

    FreeSync is potentially awesome if the refresh range of the panel is wide (e.g. a monitor with bounds at 20 Hz and 75 Hz would be very decent, compared to either a 40 Hz minimum or a 60 Hz maximum).
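    The arithmetic behind that 40-to-20 jump can be shown in a few lines. This is a toy model, not any driver's actual logic: the function name and parameters are invented, and the "frame doubling" branch is a simplified version of the frame-multiplication idea suggested above (which AMD later shipped as Low Framerate Compensation), assuming the panel's range is wide enough to absorb the multiplied refresh.

```python
import math

def displayed_fps(render_fps, panel_min, panel_max, frame_doubling=False):
    """Frame rate the viewer effectively sees for a given render rate.

    Inside the panel's variable range, adaptive sync simply tracks the
    render rate. Below the floor, plain vsync can only wait whole
    refresh periods, so the rate snaps to an integer divisor of the
    minimum refresh; frame doubling instead re-scans each frame so
    every rendered frame is shown.
    """
    if panel_min <= render_fps <= panel_max:
        return render_fps                # adaptive sync tracks it
    if render_fps > panel_max:
        return panel_max                 # capped at the max refresh
    if frame_doubling:
        return render_fps                # each frame scanned k times
    # vsync fallback below the floor: whole refresh intervals only,
    # so 39 fps on a 40 Hz floor snaps straight down to 20.
    return panel_min / math.ceil(panel_min / render_fps)
```

    It also illustrates the 24 Hz point above: with a 24–75 Hz panel, a 39 fps render rate sits inside the range and is displayed as-is, with no fallback at all.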
     
    Kyyla likes this.
  18. I see, so it's a panel-specific limitation and not a FreeSync one.

    For example, this model goes from 30 to 75 Hz, so its low-framerate zone would be below 30 fps.



    Regardless: why is V-Sync being enabled at the same time as FreeSync?
    Why isn't FreeSync alone enough?

     
    Last edited by a moderator: Aug 12, 2015
  19. Alexko

    Alexko Veteran Subscriber

    I guess there must be tearing with V-Sync off when you dip below the Freesync threshold. To me, the take-home message is that you really don't want a Freesync display with a high threshold. And I would even argue that AMD screwed up by making Freesync such a free label without strict requirements, and without tiers.

    I don't know why they don't make a "Freesync Gold" label with guarantees on proper overdrive and a wide range of frequencies.
     
    Kyyla and Deleted member 13524 like this.
  20. Silent_Buddha

    Silent_Buddha Legend

    Last edited: Aug 12, 2015