Nvidia G-SYNC

@danybonin
I see you already posted the same question in the GeForce forums. Creating a dynamic fps limiter is not an easy task, so I doubt you will find someone to do it.

You might want to look at Nvidia Inspector to set Custom FPS limiter rates ...

 
I think there's a thread buried somewhere with discussion about SLI micro-stuttering.
In a nutshell, the reason it's difficult to eliminate SLI micro-stuttering is that it's very difficult to predict how much time it will take to render the next frame (as in, predicting the future...). Or, more precisely, it's difficult to make a perfect prediction every time. It's just like predicting the weather: most of the time the prediction is correct, but when it's wrong (especially when the weatherman says it's going to be a bright sunny day) it's very annoying.
When you set the frame limiter to 50 fps you set the bar low, so it's very rare that the rendering time exceeds the limit. It's like bringing your umbrella with you every day: you'll stay dry unless something extraordinary happens (like a superstorm or something). To make a dynamic frame limiter, you'd need to be able to accurately predict how much time it takes to render the next frame (or next few frames), and that's the problem from the start.
It might be possible if it's the game engine setting the frame limit, since game engines normally have a better idea of how complex the next few frames are going to be. But that's still a difficult problem, as it's not easy to find the relationship between the complexity of your scenes and the rendering time on a specific GPU setup (and on PC there are countless different setups).
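To make that concrete, here's a toy sketch (hypothetical Python, not code from any real limiter): a simple exponential moving average tracks steady frame times well, but a sudden spike, the "unexpected rainy day", is only seen after the fact.

```python
# Hypothetical sketch: predicting the next frame's render time from history.
# An EMA tracks the trend, but a sudden spike (new scene, particle burst)
# is mispredicted -- by the time the limiter knows, the budget is blown.

def make_predictor(alpha=0.2):
    """Returns a closure that predicts the next frame time (ms) via EMA."""
    state = {"ema": None}

    def predict(last_frame_ms):
        if state["ema"] is None:
            state["ema"] = last_frame_ms
        else:
            state["ema"] = alpha * last_frame_ms + (1 - alpha) * state["ema"]
        return state["ema"]

    return predict

predict = make_predictor()
for t in [11.0, 11.2, 10.9, 11.1, 25.0]:  # last frame: sudden spike
    print(f"actual {t:5.1f} ms, next predicted ~{predict(t):5.1f} ms")
```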
 
Thanks for your reply.

Of course it is probably not an easy task to make something like that. But why not just set the fps limiter a few fps below the actual fps? Like if I'm at 90 fps, the dynamic limiter would set 85. It would stay there until fps reaches 84 or less, then again limit fps to a few fps below that. So when it detects fps below the limit, it moves the limit down a few fps, until it reads fps below the limit again... etc. The more complicated thing would be how to move the fps limiter up. That could be harder to do.

Maybe each time the limiter is moved down (because fps fell below it), take note of GPU usage. If GPU usage later drops by a determined %, ramp the limiter up until it detects fps below the limiter again, then move the limiter down a few fps like it normally would.
This would be complicated, I know; it would have to use an average fps over time, maybe calculated over a second... I don't know... but I think it is possible to do.
I'm sure those who made RTSS and programs like EVGA Precision, or of course Nvidia, could do that. Nvidia made a Smooth Vsync option for SLI users. Normal vsync (on a 60Hz screen) switches from 60Hz to 30Hz as soon as fps drops below 60, then back to 60Hz as soon as fps is 60+. But with Smooth Vsync, they managed to program it so that it won't switch back to 60Hz until fps is stable and high enough above 60. The result is far fewer switches between 30Hz and 60Hz. The same thing could be done for ramping up a dynamic fps limiter.
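Roughly, the logic I imagine would be something like this (just a sketch in Python; read_avg_fps, read_gpu_usage and set_limit are made-up stand-ins for whatever RTSS or the driver would actually expose):

```python
# Rough sketch of the proposed dynamic limiter. The three callbacks are
# hypothetical stand-ins; no real tool is assumed to expose exactly these.
import time

STEP = 5          # move the cap in 5 fps steps
USAGE_DROP = 15   # % drop in GPU usage that triggers a ramp-up attempt

def dynamic_limiter(read_avg_fps, read_gpu_usage, set_limit):
    limit = read_avg_fps() - STEP        # start a few fps below current fps
    set_limit(limit)
    usage_at_last_drop = read_gpu_usage()

    while True:
        time.sleep(1.0)                  # work on ~1 s fps averages
        fps = read_avg_fps()
        if fps < limit:                  # fps fell under the cap: step down
            limit = fps - STEP
            set_limit(limit)
            usage_at_last_drop = read_gpu_usage()
        elif read_gpu_usage() < usage_at_last_drop - USAGE_DROP:
            limit += STEP                # headroom appeared: try stepping up
            set_limit(limit)
```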

Anyway, I'm sure it is possible to do, and I'm sure a lot of people would appreciate it. At least I would.
 
So when it detects fps below the limit, it moves the limit down a few fps, until it reads fps below the limit again... etc. The more complicated thing would be how to move the fps limiter up. That could be harder to do.

What you describe is the same problem as for CPU clock states. You can clock up with a low threshold of high utilization (say 0.5 seconds of 75%), and then clock down with a high threshold of low utilization (say 2 seconds of 25%).
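In sketch form, reusing those example thresholds (hypothetical Python, illustrative only):

```python
# Hysteresis sketch borrowing the CPU-governor numbers above: step up
# quickly after a short burst of high utilization, step down only after
# a long stretch of low utilization.
def update(state, utilization, dt):
    """state tracks seconds of sustained high/low load; returns -1/0/+1."""
    if utilization >= 75:
        state["high_t"] += dt
        state["low_t"] = 0
    elif utilization <= 25:
        state["low_t"] += dt
        state["high_t"] = 0
    else:
        state["high_t"] = state["low_t"] = 0

    if state["high_t"] >= 0.5:   # 0.5 s at >=75%: raise the clock/limit
        return +1
    if state["low_t"] >= 2.0:    # 2 s at <=25%: lower it
        return -1
    return 0

state = {"high_t": 0.0, "low_t": 0.0}
print(update(state, 80, 0.6))    # +1: 0.6 s at >=75% clears the 0.5 s bar
```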
 
But a dynamic fps limiter is something possible to program, no? Probably not an easy task, but we are not talking about sending a man to Mars... no?
I think that would be a very useful tool for any SLI + G-Sync user. If only I were able to program this myself... lol.
 
But a dynamic fps limiter is something possible to program, no? Probably not an easy task, but we are not talking about sending a man to Mars... no?
I think that would be a very useful tool for any SLI + G-Sync user. If only I were able to program this myself... lol.

I think if Nvidia sees a benefit in this then they will create .....
 
Tom's Hardware's G-Sync Vs FreeSync Event - July 18 (Updated)

On July 18 in City of Industry in Los Angeles, California, Tom's Hardware will be hosting Battle of the Brands: G-Sync vs FreeSync, and we need you to be our judge and jury. We'll have gaming systems, built courtesy of our sponsor, Digital Storm, half of them running G-Sync, half running FreeSync. Your job: Play some of the games we'll have set up, and answer a series of qualitative questions about the experience. It's that simple. Play games, eat food, and know that your fun-filled Saturday afternoon will help us determine the champion between these two technologies from Nvidia and AMD.

Update, 7/8/15, 10:40am PT: This event was effectively sold out in a little more than half a day. Thank you for that. And for those of you who made it, we look forward to seeing you there. For those of you still interested, I will build a very limited waiting list of about 20 people — again, first come, first served. Those 20 people can attend, and there's a decent chance you'll get to test G-Sync vs. Freesync, but no guarantee. But even so, some of our sponsors are going to have gaming stations set up. We'll have food, music, giveaways, and more gaming, so it's worth the drive. Fill out the form further below and we'll be in touch if you make it on the list.

http://www.tomshardware.com/news/g-sync-vs-freesync-event,29393.html
 
Sounds awesome, shame it's on the other side of the world! I really hope they don't disclose which systems are which, in order to give completely objective results.

Results are in ....

AMD FreeSync Versus Nvidia G-Sync: Readers Choose
Naturally, game selection was its own issue, and it quickly became clear that both AMD and Nvidia knew where their respective strengths and weaknesses would materialize. I’ll spare you the politicking, but we eventually settled on a pair of titles commonly associated with AMD (Battlefield 4 and Crysis 3), and two others in Nvidia’s camp (Borderlands: The Pre-Sequel and The Witcher 3). Battlefield and Borderlands were deemed our “faster-paced” selections, while Crysis and The Witcher were on the slower side. At the end of the day, though, as a PC gamer, I wanted each title to run at the lushest playable detail settings possible. That’d prove to be another topic for zero-day debate between the two companies. Admittedly, any preference toward quality or performance is going to be subjective. We conceded that both schools of thought are valid, and one gamer's preference may even change depending on genre.

http://www.tomshardware.com/reviews/amd-freesync-versus-nvidia-g-sync-reader-event,4246.html
 
Really good article with some very interesting results! Personally I would have preferred to see vsync left on though. How do the two solutions differ below the variable refresh limit when vsync is on?
[Charts from the linked review: FREESYNC-14.jpg and FREESYNC-12.jpg]

Performance Outside the “Zone”

In order to visually compare this situation to G-SYNC, I ran the identical tests but modified settings so the GeForce card (in this case a GTX 980) was operating at even lower framerates than AMD’s solutions.
In Unigine we can see that AMD’s step-down process is a marked departure from NVIDIA’s implementation which provides a universally smooth output below 40FPS and even as low as 28FPS in some cases.
...
My experience with FreeSync was for the most part overwhelmingly positive but only under those “right conditions” as I mentioned above. Those conditions arise when FreeSync is used in a framerate zone that mirrors that panel’s native minimum and maximum refresh rates. Go below the lower of those two points and things start getting ugly. The panel suddenly finds itself in a 40Hz V-Sync mode and framerates plummet down to 20FPS, leaving a massive 20FPS dead zone between 20 and 40.

This situation doesn’t affect G-SYNC and is one of the major deficiencies of AMD’s technology right now. Could this change in the future? With panels capable of wider refresh rate ranges and possible driver tweaks, there’s certainly some room for hope. However, APUs, which struggle to achieve 40FPS under the best of circumstances, may find themselves better off without FreeSync in many games.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/69379-amds-freesync-long-term-review-3.html
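The 40-to-20 cliff follows from how V-Sync snaps frame delivery onto refresh boundaries once the panel is pinned at its 40Hz floor: a frame that misses one 25ms refresh has to wait for the next, so delivery halves. A quick back-of-the-envelope check (hypothetical sketch):

```python
# Effective fps when V-Sync forces frames onto refresh boundaries.
import math

def vsync_fps(render_ms, refresh_hz):
    interval_ms = 1000.0 / refresh_hz
    refreshes_waited = math.ceil(render_ms / interval_ms)
    return refresh_hz / refreshes_waited

# Panel stuck at its 40 Hz floor: anything slower than 25 ms/frame
# snaps straight down to 20 fps -- the "dead zone" described above.
for ms in (24, 26, 40, 51):
    print(f"{ms} ms/frame -> {vsync_fps(ms, 40):.0f} fps")
# 24 ms -> 40 fps, 26 ms -> 20 fps, 40 ms -> 20 fps, 51 ms -> 13 fps
```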
 
Eww, that behavior (AMD bottoming out at 20fps) sucks and in my opinion defeats much of the purpose of adaptive sync. I have to assume (hope?) this is simply bad driver behavior and that alternate means of dealing with sub-40fps refreshes will arrive. If the framerate drops off the planet, why can't they refresh the same frame at the highest supported rate to minimize latency to the next completed frame?
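In sketch form, that redisplay idea would be: pick the smallest multiple of the content framerate that lands inside the panel's refresh range and re-scan each frame that many times (hypothetical Python; it assumes a driver that can re-send the last frame, which nothing here confirms exists today):

```python
# Sketch of the re-scan idea: if content fps drops below the panel's VRR
# floor, show each frame N times so the physical refresh stays in range.
def rescan_multiplier(content_fps, panel_min_hz, panel_max_hz):
    """Smallest N with content_fps * N inside [min, max], or None."""
    n = 1
    while content_fps * n <= panel_max_hz:
        if content_fps * n >= panel_min_hz:
            return n
        n += 1
    return None  # range too narrow for this trick at this fps

# Hypothetical 40-75 Hz panel:
print(rescan_multiplier(25, 40, 75))   # 2: scan each frame twice at 50 Hz
print(rescan_multiplier(35, 40, 75))   # 2: 70 Hz, still in range
print(rescan_multiplier(39, 40, 75))   # None: 78 Hz would overshoot
```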
 
What's the point of enabling V-Sync over FreeSync? Doesn't FreeSync nullify tearing by itself?

Moreover, that hardwarecanucks article is from May. Lots of stuff has been updated in the meanwhile. For example FreeSync + Crossfire is now available since 15.7.
 
The point is that to keep it tear-free the whole way through, you will fall back to vsync if you go below the panel's minimum refresh rate. Thus you jump from 40 fps to 20 fps instantly.

If the panel had 24Hz as the minimum refresh, for instance, then the issue would not happen with the two benchmarks above.
If the panel stays at 40Hz and you fall back to vsync off instead, I suppose you would get very severe tearing (tearing is awful enough at 60Hz, so at 40Hz it should be extremely bad).

FreeSync is potentially awesome if the refresh range of the panel is wide (e.g. a monitor with bounds at 20Hz and 75Hz would be very decent, compared to either a 40Hz minimum or a 60Hz maximum).
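As a rule of thumb, re-scanning frames (as sketched a few posts up) can cover every framerate below the floor only if the panel's maximum is at least twice its minimum; otherwise a gap opens just under the floor. A quick sanity check (hypothetical sketch):

```python
# A range can cover all fps below its floor by frame multiplication only
# if max >= 2 * min; otherwise fps just under the floor have no multiple
# that lands in range (e.g. 35 fps doubled = 70 Hz on a 40-60 Hz panel).
def range_has_gap(panel_min_hz, panel_max_hz):
    return panel_max_hz < 2 * panel_min_hz

print(range_has_gap(40, 60))   # True:  70 Hz is out of range
print(range_has_gap(30, 75))   # False: 2x the 30 Hz floor fits
print(range_has_gap(20, 75))   # False: very comfortable range
```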
 
I see, so it's a panel-specific limitation and not a FreeSync one.

For example, this model goes from 30 to 75Hz, so its low-framerate zone would be below 30FPS.

Regardless: why is V-Sync being enabled at the same time as FreeSync?
Why isn't FreeSync alone enough?

 
I guess there must be tearing with V-Sync off when you dip below the FreeSync threshold. To me, the take-home message is that you really don't want a FreeSync display with a high threshold. And I would even argue that AMD screwed up by making FreeSync such a free label, without strict requirements and without tiers.

I don't know why they don't make a "Freesync Gold" label with guarantees on proper overdrive and a wide range of frequencies.
 