Nvidia G-SYNC

I guess there must be tearing with V-Sync off when you dip below the Freesync threshold. To me, the take-home message is that you really don't want a Freesync display with a high threshold. And I would even argue that AMD screwed up by making Freesync such a free label without strict requirements, and without tiers.

I don't know why they don't make a "Freesync Gold" label with guarantees on proper overdrive and a wide range of frequencies.

Beyond that, why in the world would you ever want a game that consistently dips below 40 FPS? The GTA V example above would be unplayable to me. Yes, it wouldn't tear, but the control response would still be that of a game fluctuating between 28 and 51 FPS on the Gsync setup.

Personally, I'd want to make sure that the majority of frames were at or near 60 FPS and adjust visuals accordingly (gameplay > graphics) in which case, it'd be unlikely for the Freesync system to offer a significantly different experience from the Gsync system, assuming a similar level of performance.

At any (to me) playable setting the experience would be the same or similar. Scott Wasson made a similar observation when he tested both: yes, below the threshold Freesync wasn't great, but he'd never actually play a game that went below the threshold. PCPer made a similar observation with the el cheapo Korean brands that can only be imported. Tune the game for a near-constant 60 fps, then with either Gsync or Freesync you'll have tear-free and responsive gaming. And that was with a 40 Hz cutoff.

Obviously this doesn't work well if you are happy gaming at 30 FPS, but a game tuned for 30 FPS will likely dip below the Gsync cutoff as well. And for adaptive-sync this may not always be the case, as the spec allows for variable refresh rates far lower than anything currently seen. It'll all depend on whether a monitor chip manufacturer wants to go to the trouble of implementing it.
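Dips below the VRR floor are usually handled by repeating frames rather than refreshing the panel out of range. A minimal sketch of that frame-multiplication idea (a hypothetical illustration of low framerate compensation, not any vendor's actual algorithm):

```python
def effective_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Pick a panel refresh rate inside the VRR window for a given frame rate.

    Below the floor, each frame is shown an integer number of times so the
    panel still refreshes inside its supported range (frame doubling,
    tripling, and so on). Hypothetical sketch, not a real driver algorithm.
    """
    if fps >= vrr_min:
        # Within the window: refresh the panel once per rendered frame.
        return min(fps, vrr_max)
    # Below the floor: smallest multiplier that lands back in the window.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

# A 28 fps game on a 40-144 Hz panel gets each frame shown twice (56 Hz).
print(effective_refresh(28, 40, 144))  # -> 56
```

A panel with a narrow window (say 48-75 Hz) has less room for this trick, which is part of why a high Freesync threshold hurts.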

Regards,
SB
 
That amounts to buying high end or powerful hardware and/or upgrading it all the time.
I bet most people would be very happy to have a PC that runs Valve and Blizzard multiplayer games at 100 fps but drops down to 30 fps in GTA V or Crysis.
 
The really interesting bit there was that even though G-sync was preferred, most of the people there, 89.3%, weren't willing to pay the G-sync price premium (57.1% of respondents would pay up to 100 USD more, and 32.1% would pay 100-200 USD more).

Regards,
SB
The cost difference never exceeds $200 -- especially since in that test the G-Sync system is equipped with a card that is $80 cheaper (and yet it was still preferred by the majority of test subjects).

Given that people usually stick with their monitors through two or more generations of video cards, the total cost of ownership disparity between G-Sync and FreeSync could eventually diminish -- if not reverse completely -- while the former gives a better experience all along.
 
Acer Announces 35in Predator Z35 with G-sync and 200Hz

Like that model, the native refresh rate of the AU Optronics panel is 144Hz, but Acer states in its press release that this can be overclocked to 200Hz. The Acer Predator Z35 will support NVIDIA G-sync technology, which is presumably where the 200Hz maximum refresh rate comes in; it is probably only supported when using G-sync, although again that's to be confirmed. The resolution is relatively low, offering only 2560 x 1080, as opposed to the 3440 x 1440 we've seen from some 34" ultra-wide screens.

  • The Predator Z35 touts a 35-inch 21:9 UltraWide Full HD (2560x1080) panel with a curvature of 2000R for an immersive, dynamic and wraparound gaming experience
  • The Predator XB1 Series includes 27-inch and 28-inch models that give gamers a competitive edge with incredibly detailed images and superbly smooth visuals

http://www.guru3d.com/news-story/acer-announces-35in-predator-z35-with-g-sync-and-200hz.html
 
I have the Acer 27" 144Hz IPS G-sync monitor and while I love it, I have a minor annoyance: if the monitor is turned off with the button while the computer is on (that is, the display has not gone into power-saving mode by itself), the display somehow tells the driver/OS that my desktop resolution has changed to some low number, like 800x600. That means all the windows I had open get resized to this fake low resolution and moved up into the top left corner.

I think it is an artifact of the G-sync module because I have never experienced this before. Has anyone else experienced this?
 
Shit like this has happened a lot to me since moving to Windows 10. My Radeon R290X refuses to put my monitor into power save for whatever reason, so I have to turn it off with the power button. Frequently I come back to a desktop that has had its windows moved around.
 
Win8 and Win7 will do the same thing. It's an issue with how Windows handles having 0 monitors.
So is my venerable HP ZR24w a special monitor, or could it be down to the interface? Is DVI more resilient where DP isn't?
 
Happens with my basement HTPC in Kodi after the TV is turned off. My living room HTPC has an HDMI Detective, which basically means the PC always thinks there's something attached and receiving a signal.
 
This video is VERY interesting. Using a framerate limiter (capping fps to 3 fps less than your max refresh rate) seems ideal for a GPU like mine. Combined with V-Sync on (preferably in your GPU's options; disable it in-game) and G-sync on, it results in the least input lag.


A screengrab from the video with the conclusions (in my case the option at the middle is the best one, and also now the Intel Graphics Software app has a "Low Latency" setting).

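The cap rule from the video is simple arithmetic; a quick sketch, assuming the video's suggested margin of 3 fps (the exact margin is a recommendation, not a spec):

```python
def vrr_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Suggested FPS limit for G-Sync/Freesync use: stay `margin` fps
    under the panel's maximum so frames never arrive faster than the
    display can refresh (which would re-engage V-Sync and its added lag)."""
    return max_refresh_hz - margin

print(vrr_fps_cap(144))  # -> 141
print(vrr_fps_cap(60))   # -> 57
```

Setting the cap in the driver rather than in-game keeps it consistent across titles, which is what the video's middle option amounts to.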
 
I've noticed since Oblivion that Bethesda games would have horrible mouse lag with vsync on unless you capped FPS to just under the monitor refresh rate. You could force triple-buffered vsync with a cap of 59 FPS and the input lag was fine, but with all other settings unchanged, removing the cap would introduce game-breaking lag. I wonder if this is related to the problems that are now being solved with Reflex etc.
 
if you think about it, this is really old stuff, but it's good to refresh the memory once in a while. I remember thinking many years ago that those people limiting the framerate on their 60fps display to 57fps or so, were odd, until I understood what it was about. That's pre G-Sync era though.

This page also tests how G-Sync/Freesync is working on your display:

 

Back then I had no understanding of why this was happening. I thought it was the strangest damn thing ever.

Now I understand better, but while it's been explained to me multiple times I'm still not 100% clear on why capping framerate below the refresh rate is better for input lag, or under what conditions that is the case. I think it has something to do with preventing the CPU from queuing up a bunch of frames, but I thought Reflex was supposed to solve that.
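One way to see the queuing argument is a toy latency model: every finished frame sitting in the render queue adds roughly one refresh interval before your input reaches the screen. The numbers below are illustrative, not measurements:

```python
FRAME_TIME_MS = 1000 / 60  # one refresh interval on a 60 Hz display

def input_lag_ms(queued_frames: int) -> float:
    """Rough lower bound on input-to-photon latency when `queued_frames`
    finished frames sit between the game and the display. Each queued
    frame delays your input by one more refresh interval. Toy model only:
    real pipelines add sampling, scanout, and panel response on top."""
    return (queued_frames + 1) * FRAME_TIME_MS

# Uncapped V-Sync: the GPU renders ahead and the queue fills (say 3 frames).
print(round(input_lag_ms(3)))  # ~67 ms
# Capped just below refresh: the queue drains, each frame is consumed fresh.
print(round(input_lag_ms(0)))  # ~17 ms
```

Capping below the refresh rate keeps the display slightly "hungrier" than the GPU, so the queue never builds up; Reflex attacks the same queue from the driver side.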
 
Yup, I also need a refresher from time to time. Pages 14 (especially) and 15 of the G-Sync 101 article from Blur Busters have a great summary of the ideal G-Sync settings, along with details like mouse polling rate (1000Hz, to avoid mouse micro-stutters) and the best Windows power plan to use (High performance), etc:


 


Do most VRR monitors have gsync modules? I thought that had mostly gone away now that VRR is standardized.
 
Most displays wouldn't have the physical module. But the physical module really only governs the display side; the input-side considerations on the PC can be the same regardless of the terminology.

G-Sync though is used for more branding than just the physical module. Much like how DLSS has a larger branding umbrella than what it originally was.

If we want to be pedantic, VRR, VESA adaptive-sync, HDMI VRR, Freesync (Premium/Premium Pro/over HDMI), G-Sync (Compatible/Ultimate/Pulsar), etc. all mean different things, but they often just get used interchangeably to refer to variable refresh rate.
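For what it's worth, a rough map of those labels to what they denote (simplified, and the tier requirements are paraphrased from memory, so treat the details as approximate):

```python
# Simplified cheat sheet: brand label -> underlying mechanism or tier.
# Tier details are approximate; check each vendor's current requirements.
vrr_terms = {
    "VESA Adaptive-Sync":   "the DisplayPort-level VRR standard itself",
    "HDMI VRR":             "the equivalent mechanism in the HDMI 2.1 spec",
    "Freesync":             "AMD branding on Adaptive-Sync / HDMI VRR",
    "Freesync Premium":     "Freesync plus LFC and a higher minimum refresh",
    "Freesync Premium Pro": "Premium plus HDR requirements",
    "G-Sync":               "NVIDIA's dedicated scaler module in the display",
    "G-Sync Compatible":    "Adaptive-Sync displays validated by NVIDIA",
    "G-Sync Ultimate":      "the module plus HDR requirements",
    "G-Sync Pulsar":        "VRR combined with a strobed backlight",
}

for label, meaning in vrr_terms.items():
    print(f"{label}: {meaning}")
```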
 