Nvidia G-SYNC

So does anyone know what happens when your rendering speed exceeds your monitor's refresh rate under G-Sync? Let's say you have a 60 Hz monitor and your frame rate is 100-130 fps?

I think they top out at 144 Hz at the moment.
Honestly, if you go faster, just throttle your game ;p
(Well, or drop frames or whatever...)
 
Thanks everyone. So the higher your monitor's refresh rate the better, since it gives you more headroom for increased frame rates, but you never have to worry about going over that refresh rate and tearing, since the GPU will wait for the next monitor refresh to send the next frame once you hit the ceiling.
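If it helps, here's a toy model of that ceiling behaviour in Python (my own sketch of what's described above, not Nvidia's implementation; it ignores back-pressure on rendering):

```python
# Toy model of G-Sync at the refresh ceiling (my assumption, not Nvidia's
# implementation): the panel refreshes when a frame is ready, but never
# sooner than its minimum refresh interval.
REFRESH_HZ = 60
MIN_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms floor between refreshes

def present_times(render_ms):
    """Times at which frames hit the screen (back-pressure on rendering ignored)."""
    presented = []
    t = 0.0
    last = float("-inf")
    for r in render_ms:
        t += r                                 # frame finishes rendering
        show = max(t, last + MIN_INTERVAL_MS)  # wait for the panel if too early
        presented.append(round(show, 1))
        last = show
    return presented

# Rendering at ~120 fps (8.3 ms/frame) gets paced down to the 60 Hz ceiling:
print(present_times([8.3] * 6))  # gaps settle at ~16.7 ms -> no tearing
```

Below the ceiling the max() never kicks in and each frame is shown the moment it's done; at the ceiling the panel's minimum interval paces everything, like vsync but without tearing below it.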

So very similar to adaptive vsync, with the advantage that below the monitor's max refresh rate you'll see no tearing. Apparently stuttering is also experienced below the refresh rate with vsync off, but this isn't something I've personally noticed (I play all games with vsync off on a 120 Hz monitor).
 
Has anyone commented on what perks (or limitations?) this will bring to SLI?
 
Shouldn't it just display SLI stutter with high fidelity, with no tearing or further induced stutter? Or maybe that sentence can be ignored, since I'm not accounting for SLI frame pacing.
 

The differing latencies for alternate frames in SLI are gone. There will be no tearing with G-Sync, ever.

The driver doesn't need to artificially delay half the frames anymore.
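For anyone unfamiliar with what that artificial delay (SLI frame metering) does, here's a toy model with made-up timestamps; nothing here is from Nvidia's driver:

```python
# Hypothetical sketch of SLI frame metering: the driver holds frames back
# so presents land evenly spaced instead of in close AFR pairs.
def metered(completions_ms, target_gap_ms):
    presented = []
    last = None
    for t in completions_ms:
        show = t if last is None else max(t, last + target_gap_ms)
        presented.append(show)
        last = show
    return presented

# Raw AFR completions pair up 5 ms apart; metering spreads them to 15 ms gaps.
raw = [0, 15, 30, 57, 62, 87, 92, 117, 122]
print(metered(raw, 15))  # -> [0, 15, 30, 57, 72, 87, 102, 117, 132]
```

In this model the 5 ms pair-ups get spread out into even 15 ms gaps; the post above argues G-Sync makes this smoothing step unnecessary.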
 
Well, GPU B releases a frame at t0 = 0 ms, GPU A releases a frame at t1 = 15 ms, GPU B releases a frame at t2 = 30 ms, and then GPU A has a hard, long frame to render: it goes out at t3 = 57 ms, but GPU B puts out its frame at t4 = 62 ms. Now, if every frame takes 30 ms again from then on, aren't the frames from GPU A and GPU B virtually locked 5 ms apart from each other?
That'll give you a jerky and slow picture while the framerate counter happily shows a big number.

That's my understanding of the situation: G-Sync alone gives an excellent, tear-free, yet jumpy and stuttered framerate.
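Spelling that out, here's the same timeline replayed in Python (the timestamps are just the ones from this post, plain arithmetic, nothing G-Sync specific):

```python
# Replay the AFR timeline above and print the frame-to-frame gaps.
frames = [(0, "B"), (15, "A"), (30, "B"), (57, "A"), (62, "B")]

t_a, t_b = 57, 62
for _ in range(3):          # from here on, every frame takes 30 ms
    t_a += 30
    frames.append((t_a, "A"))
    t_b += 30
    frames.append((t_b, "B"))

frames.sort()
for (t0, g0), (t1, g1) in zip(frames, frames[1:]):
    print(f"GPU {g0} @ {t0:3d} ms -> GPU {g1} @ {t1:3d} ms : gap {t1 - t0} ms")
```

After the 57 ms spike the gaps settle into an alternating 5 ms / 25 ms pattern, i.e. the classic AFR micro-stutter, which G-Sync would faithfully display.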
 

After the first spike, the fps counter stays stable at 33 fps.

As far as interleaved slow and fast refreshes are concerned, there is nothing G-Sync can do about them. Removing it won't fix the problem either.
 
Well, I don't know how accurately the FPS counter would really report it, but averaged out that will be 66 fps (30 ms between one GPU's two frames, so two frames per 30 ms).
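For the record, the arithmetic behind both numbers in this sub-thread (just math, nothing G-Sync specific):

```python
# The counter averages ~66 fps over the alternating cadence, but the useful
# update rate of a 5 ms / 25 ms pattern is closer to one new image per 30 ms.
gaps_ms = [5, 25] * 10                              # alternating gaps after the spike
counter_fps = 1000.0 * len(gaps_ms) / sum(gaps_ms)  # -> ~66.7 fps
effective_fps = 1000.0 / 30.0                       # -> ~33.3 fps, what it feels like
print(counter_fps, effective_fps)
```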

Found a framerate graph to illustrate the issue, with v-sync off:
[framerate graph: graphafr.gif]


Red is GeForce DDR, blue is ATI Rage Fury Maxx :p
 
Yes, it's a direct link to the comment, not the article.

It's a guy in March 2013 who says "hey, what if we did something... let's call it GSync to solve this" and then proceeds to describe Nvidia's G-Sync.
 
I wonder if this guy works for Asus or Nvidia (lol, if that's the case, some of the replies to his post must have made him laugh like never before; I imagine the guy working on G-Sync getting responses like "it's impossible, it will never work" lol).
 

I don't think so.
I just happened to stumble on this post from the same user in the comments to Anandtech's review of the R9 290X:

spiked_mistborn said:
Nice job AMD! Competition is good! Also, feel free to use my GSYNC idea about putting a frame buffer in the display and letting the video card control the refresh rate. This post is from March 2013. Apparently adding a dash to make it G-Sync makes it different somehow. http://techreport.com/discussion/24...nvidia-frame-capture-tools?post=719842#719842

It's just that no one paid attention to it.
 