Nvidia G-SYNC

Back then I had no understanding of why this was happening. I thought it was the strangest damn thing ever.

Now I understand better, but even though it's been explained to me multiple times, I'm still not 100% clear on why capping framerate below the refresh rate is better for input lag, or under what conditions that's the case. I think it has something to do with preventing the CPU from queuing up a bunch of frames, but I thought Reflex was supposed to solve that.
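If the queuing story is right, a toy Python model shows the shape of it (all numbers invented, and this is a sketch of the idea, not how any real driver schedules things):

```python
# Toy model of why a GPU-bound game queues frames and what a cap does.
# All numbers invented: 4 ms of CPU work, 16 ms of GPU work per frame,
# and up to 3 frames allowed in flight between CPU and GPU.
CPU_TIME, GPU_TIME, QUEUE_MAX = 0.004, 0.016, 3

def worst_lag(fps_cap=None, n=60):
    """Worst input-sample-to-render-complete latency over n frames, in ms."""
    interval = 1 / fps_cap if fps_cap else 0.0
    cpu_end = gpu_end = next_tick = worst = 0.0
    gpu_starts = []
    for i in range(n):
        start = max(cpu_end, next_tick)        # input is sampled here
        if i >= QUEUE_MAX:                     # queue full: CPU must wait
            start = max(start, gpu_starts[i - QUEUE_MAX])
        cpu_end = start + CPU_TIME             # frame submitted to the queue
        g_start = max(gpu_end, cpu_end)        # GPU picks it up when free
        gpu_starts.append(g_start)
        gpu_end = g_start + GPU_TIME
        next_tick = start + interval           # honor the fps cap, if any
        worst = max(worst, gpu_end - start)
    return worst * 1000

print(f"uncapped:         {worst_lag():.0f} ms")    # queue fills: ~64 ms
print(f"capped at 58 fps: {worst_lag(58):.0f} ms")  # queue empty: ~20 ms
```

Uncapped, the CPU runs ahead until the queue is full, so each input is several GPU-frames old by the time it renders; capped just below what the GPU can sustain, the queue never fills and lag drops to one CPU plus one GPU frame time. Which is essentially what Reflex does automatically, as I understand it.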

The PC space is too complicated. Reflex at least fixes that part: you can just turn it on, and then you only need to worry about vsync in terms of tearing, assuming you have a VRR monitor.

When you enable vsync in game you get either double or triple buffered output. Triple buffering keeps one extra frame queued, so it feels laggier.

Assuming a fixed-refresh display, double buffering with vsync is basically one buffer for the frame being displayed and one buffer being written to by the GPU. When the display finishes showing a frame, the buffers swap and the next frame starts being displayed. If the buffers would swap before the GPU has finished writing, the swap is held until the next interval, which is how 60Hz becomes 30Hz when an interval is missed. Lag from double-buffered vsync is basically one frame time, or two frame times when you miss the window. There's also some lag because if your GPU finishes a frame in 10ms, it still has to wait for the next fixed interval, e.g. every 16.7ms on a 60Hz display.
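Here's a toy Python sketch of that swap-or-wait behavior (made-up frame times, just to show the 60-to-30 cliff):

```python
import math

# Toy model of double-buffered vsync on a fixed 60 Hz display
# (frame times made up for illustration).
REFRESH = 1 / 60  # one refresh interval, ~16.7 ms

def present_times(gpu_frame_time, n_frames=6):
    """Vblank timestamps at which each frame actually hits the screen."""
    times, gpu_done = [], 0.0
    for _ in range(n_frames):
        gpu_done += gpu_frame_time
        # The swap is held until the first vblank after the GPU finishes.
        vblank = math.ceil(gpu_done / REFRESH) * REFRESH
        times.append(vblank)
        # With only two buffers, the GPU can't start the next frame
        # until the swap frees the back buffer.
        gpu_done = vblank
    return times

for ft in (0.010, 0.018):  # 10 ms fits the interval, 18 ms misses it
    t = present_times(ft)
    gaps = [round((b - a) * 1000, 1) for a, b in zip(t, t[1:])]
    print(f"{ft * 1000:.0f} ms GPU frames -> gaps between frames: {gaps} ms")
# 10 ms frames still land every 16.7 ms (60 fps); 18 ms frames land
# every 33.3 ms, so the 60 Hz display effectively runs at 30 Hz.
```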

If you disable vsync, the buffers swap immediately when the GPU finishes writing a frame. That means the display can be partway through drawing one frame, the buffers swap, and it continues drawing with data from the next frame, which is the tearing artifact.
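You can even estimate where the tear line lands, since it's just however far down the panel the scanout had gotten when the swap happened. A quick sketch, ignoring vblank time and with made-up numbers:

```python
# Where a mid-scanout swap puts the tear line (ignores vblank time;
# numbers made up).
REFRESH_HZ = 60
V_RES = 1080  # vertical resolution of a hypothetical 1080p panel

def tear_scanline(swap_time_s):
    """Scanline the display was drawing when the buffers swapped."""
    scanout_fraction = (swap_time_s * REFRESH_HZ) % 1.0
    return int(scanout_fraction * V_RES)

print(tear_scanline(0.025))  # swap 25 ms in -> tear about halfway down (540)
```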

With G-Sync the same double buffering is used, but the sync control is on the GPU side: after a frame is rendered, the GPU signals that it's swapping buffers, so the next frame can begin drawing without waiting for a fixed interval. If you force vsync on with a G-Sync display, it actually only controls what happens when you reach or exceed the refresh rate of your monitor, for example: do you vsync at 60Hz, or do you uncap and tear? I think the Nvidia control panel ensures double buffering, where in-game vsync could be triple. G-Sync modules are also supposed to do some frame-pacing optimization, but I'm not sure how much that matters on other VRR displays without the module.
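Continuing the toy model from above, the VRR version simply drops the wait-for-the-next-fixed-vblank step. A sketch (the 165Hz ceiling is a made-up example, and real displays' LFC behavior below the VRR floor isn't modeled):

```python
# Same toy model with VRR: the panel starts a scanout when the GPU says
# the frame is ready, as long as the max refresh rate isn't exceeded.
MIN_INTERVAL = 1 / 165  # hypothetical 165 Hz ceiling

def vrr_present_times(gpu_frame_times):
    """Timestamps at which each frame hits the screen under VRR."""
    t = last = 0.0
    out = []
    for ft in gpu_frame_times:
        t += ft
        present = max(t, last + MIN_INTERVAL)  # no fixed vblank to wait for
        out.append(present)
        last = present
    return out

times = vrr_present_times([0.010, 0.018, 0.013])
print([round(x * 1000, 1) for x in times])  # -> [10.0, 28.0, 41.0]
# Each frame is shown the moment it's done: an 18 ms frame costs 18 ms,
# not a whole missed 16.7 ms interval.
```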

Long story short: enable Reflex when it's available, because it will eliminate the input lag caused by frame queuing. Then, in terms of your display, if you want to avoid tearing, turn on vsync in the Nvidia control panel if you have a G-Sync compatible VRR display. It'll automatically limit your fps to stay within the VRR window to avoid tearing.

If you don't have VRR, turn on Reflex, leave vsync off, and just live with the tearing.

If the game doesn't support Reflex, then try using an in-game limiter to cap your fps at a point where your GPU never exceeds maybe 95% usage, to stay CPU-limited and reduce frame queuing. External limiters can add a frame of input lag; some in-game limiters do as well, but they're your best bet.
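For a feel of what an in-game cap is doing, here's a bare-bones limiter loop (a rough sketch, not any real engine's code; the 157fps target is just an example):

```python
import time

# Bare-bones frame limiter loop. The point of the cap: the CPU waits
# instead of racing ahead of the GPU, so no queue of pre-rendered
# frames builds up between your input and what's on screen.
TARGET_FPS = 157  # hypothetical cap, a bit under a 165 Hz refresh rate
FRAME_BUDGET = 1 / TARGET_FPS

def run(render_one_frame, n_frames=1000):
    deadline = time.perf_counter()
    for _ in range(n_frames):
        deadline += FRAME_BUDGET
        render_one_frame()  # sample input as late as possible in here
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # real limiters busy-wait the last bit
        else:
            deadline = time.perf_counter()  # missed the budget; resync
```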
 
Do most VRR monitors have G-Sync modules? I thought those had mostly gone away now that VRR is standardized.
what @arandomguy said. You don't need it, though G-Sync Ultimate should be pretty nice to have. Page 15 of the Blur Busters article details that the recommendations apply to both G-Sync and FreeSync. My best monitor at the time, a 240Hz Samsung display, featured the G-Sync logo but worked fine with FreeSync.

now that you mention framerate limiters: you can have Nvidia Reflex enabled in Rivatuner, for anyone who uses a framerate limiter like I do. It works very well on my Intel GPU.
 
what @arandomguy said. You don't need it, though G-Sync Ultimate should be pretty nice to have. Page 15 of the Blur Busters article details that the recommendations apply to both G-Sync and FreeSync. My best monitor at the time, a 240Hz Samsung display, featured the G-Sync logo but worked fine with FreeSync.
I can't even remember the last time someone released a G-Sync display, as in, a display with a G-Sync module. That's the only way to get G-Sync (at least until the MediaTek G-Sync scalers are used).

What you probably had, and what practically all "G-Sync" displays currently are, is "G-Sync Compatible", not "G-Sync".
"G-Sync Compatible" is simply NVIDIA's branding for Adaptive-Sync displays they've tested to work fine with GeForces. FreeSync isn't picky and works on any Adaptive-Sync display, no matter the branding used in the display's marketing.

now that you mention framerate limiters: you can have Nvidia Reflex enabled in Rivatuner, for anyone who uses a framerate limiter like I do. It works very well on my Intel GPU.
Wait what? Reflex works on any GPU?
 
What you probably had, and what practically all "G-Sync" displays currently are, is "G-Sync Compatible", not "G-Sync".
yes, that's right. This was like 5 years ago.

Wait what? Reflex works on any GPU?
you can enable it on Rivatuner.

[screenshot: Rivatuner framerate limit settings with the Nvidia Reflex option]


I used it mostly on Resident Evil 2 Remake, with the max framerate limiter in Intel's control panel set to 162fps (3fps less than the max refresh rate of my monitor, as recommended by the Blur Busters article), VRR on, and vsync on. The game worked like a charm using Lossless Scaling FGx3, while Rivatuner limited the base framerate of the game to 54fps.
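Spelling out the cap math from that setup (the 165Hz max refresh is inferred from the 162fps figure; the minus-3fps headroom is the Blur Busters recommendation):

```python
# Deriving the two limits in that setup (the 165 Hz max refresh is
# inferred from the post; headroom per the Blur Busters article).
max_refresh = 165    # monitor's max refresh rate, in Hz
vrr_headroom = 3     # stay a few fps under it to hold the VRR window
fg_multiplier = 3    # Lossless Scaling FGx3

output_cap = max_refresh - vrr_headroom  # 162 fps, set in Intel's panel
base_cap = output_cap // fg_multiplier   # 54 fps, set in Rivatuner
print(output_cap, base_cap)              # -> 162 54
```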

It was one of the smoothest experiences I've had with any video game to date, which is quite nice considering the game still looks really good and modern.
 
Shouldn't work in the absence of Nvidia's driver, since that option is basically RTSS using Nvidia's driver for framerate limiting.
you are absolutely right. My bad, I was just so happy to have Nvidia Reflex enabled. Still, the default async method RTSS uses if you don't have Nvidia hardware works really well.

[screenshot: Rivatuner framerate limiter set to the async method]
 
what @arandomguy said. You don't need it, though G-Sync Ultimate should be pretty nice to have. Page 15 of the Blur Busters article details that the recommendations apply to both G-Sync and FreeSync. My best monitor at the time, a 240Hz Samsung display, featured the G-Sync logo but worked fine with FreeSync.
I love PC gaming but sometimes it cracks me up. You have to read a 15-page thesis to know the best way to play your video games. The TLDR is on page 15, which of course has 10 subsections, and the first subsection has 38 references :LOL:
 
I love PC gaming but sometimes it cracks me up. You have to read a 15-page thesis to know the best way to play your video games. The TLDR is on page 15, which of course has 10 subsections, and the first subsection has 38 references :LOL:

Whether it's more complex than setting up HIMEM.SYS in CONFIG.SYS is left to the reader as an exercise. ;)
 
Whether it's more complex than setting up HIMEM.SYS in CONFIG.SYS is left to the reader as an exercise. ;)
Himem.sys was a cakewalk.

Let's chat about optimizing EMM386.exe include and exclude regions for upper memory, to maximize the space you could reclaim for loadhigh statements. Gotta push all those TSRs out of low memory to keep that 624KB free for all those memory-starved 16-bit apps :D
 
I love PC gaming but sometimes it cracks me up. You have to read a 15-page thesis to know the best way to play your video games. The TLDR is on page 15, which of course has 10 subsections, and the first subsection has 38 references :LOL:
hahahah, when you read that you feel like you actually studied in Germany, and can distance yourself from those who only look like they studied in Germany while hiding the fact that they actually just read something on Wikipedia.
 
Himem.sys was a cakewalk.

Let's chat about optimizing EMM386.exe include and exclude regions for upper memory, to maximize the space you could reclaim for loadhigh statements. Gotta push all those TSRs out of low memory to keep that 624KB free for all those memory-starved 16-bit apps :D
oh man, the autoexec.bat and config.sys days... a lost art. Some games were so difficult to run in Windows 95 because they needed more conventional memory (that's the term I was missing) than was left of those 640KB.
 
Even to this day I consider myself a black-belt ninja voodoo master of DOS batch. I've written more dumb automation scripting in batch than I care to admit, even as recently as a year ago, for an arcane phone system that needed robocopy backups ( ! ) because nothing else could do the job. That little side gig made me a thousand dollars for what ended up being about two hours of investigation and basically a five-line batch script :D
 