Screen Tearing Debate

I'm in the middle of an argument about screen tearing and refresh rates, and you guys seemed like a tech-savvy bunch to ask. Basically, I'm arguing with a guy who thinks that a game has to have vsync on to prevent screen tearing, regardless of the game's frame rate. He claims that even a game running at a solid 60fps (never going above or below 60fps) on a 60Hz monitor without vsync will have screen tearing. I argue that a solid 60fps or even 30fps (again, assuming it's perfectly 60 or 30) without vsync will not have screen tearing. Am I wrong?
 
The only way to ensure that tearing doesn't occur is to synchronize the back<->front buffer flip to the display's vertical refresh signal. Tearing occurs if the buffers are flipped while the DAC (the scanout hardware) is reading the frame buffer from memory.
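
To make that concrete, here's a minimal sketch of how an application typically requests that synchronization. I'm assuming GLFW 3 with desktop OpenGL here (my choice of library, not anything named in the thread); glfwSwapInterval(1) ties the buffer flip to the vertical refresh:

```c
/* A minimal sketch, assuming GLFW 3 + desktop OpenGL.
   glfwSwapInterval(1) asks the driver to hold each glfwSwapBuffers()
   flip until the next vertical refresh, i.e. classic vsync. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *win = glfwCreateWindow(640, 480, "vsync demo", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);  /* 1 = flip on vblank (vsync on); 0 = flip immediately (may tear) */

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);  /* blocks until the vertical refresh when vsync is on */
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```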

NVIDIA's new G-Sync technology sends frames to the display as the GPU finishes them and refreshes the display at the rendering rate (it shows only finished frames). This doesn't need vsync and doesn't tear, but only a few (expensive) displays support the tech.
 
The guy's right.
It's gotta be in sync, even if it's a solid 60fps on 60Hz. How would it be at the 'right' phase? It could always be out of sync and thus tear like a bastard; you've gotta wait for the right moment to present the image on screen.
 
That doesn't make sense, then. That would mean that NVIDIA's G-Sync wouldn't actually eliminate tearing.
 
Normally the GPU sends a frame to the display every 16.7 milliseconds (60 Hz), and the display refreshes the pixels as it receives the data (from the upper part of the screen to the lower part). If the new frame is not ready when the data transfer starts, you will see tearing (the upper part of the screen still shows the old frame, while the part below the tear line shows the new one). Vsync is the most common method of solving the tearing issue (the buffer is flipped during the vblank, so the scanout always reads a complete frame).
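
A toy model makes the point about the OP's question (my own sketch, all numbers hypothetical): if the flip isn't aligned to the vblank, a locked 60 fps on a 60 Hz display just pins the tear line to the same scanline every frame instead of removing it:

```c
/* Toy model: the display scans out one frame every REFRESH_MS, top to
   bottom. The game also finishes a frame every REFRESH_MS, but with a
   fixed phase offset. Without vsync the flip happens the instant the
   frame is done, so the tear appears at whatever scanline the display
   is reading at that moment. */
#include <stdio.h>
#include <math.h>

#define REFRESH_MS 16.667   /* 60 Hz scanout period */
#define HEIGHT     1080     /* visible scanlines */

int main(void)
{
    double phase_ms = 5.0;  /* hypothetical offset: frames finish 5 ms after each vblank */

    for (int frame = 0; frame < 5; frame++) {
        double flip_time = frame * REFRESH_MS + phase_ms; /* flip the instant the frame is done */
        double into_scan = fmod(flip_time, REFRESH_MS);   /* how far into the current scanout we are */
        int tear_line = (int)(into_scan / REFRESH_MS * HEIGHT);
        printf("frame %d: flip %.2f ms after vblank -> tear at scanline %d\n",
               frame, into_scan, tear_line);
    }
    /* Every frame tears at the same scanline: matching 60 fps to 60 Hz
       pins the tear line in place, it doesn't remove it. Only a flip
       inside the vblank (phase_ms ~ 0) avoids a visible tear. */
    return 0;
}
```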

G-Sync, on the other hand, sends the frame when it's ready, and the display has some internal memory to store it. When the data has been fully received, the display updates all the pixels at once. There is no fixed 16.7 millisecond (60 Hz) refresh interval, so there's nothing to synchronize to, and tearing is thus impossible.
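
The same toy model for the G-Sync case (again my own sketch, render times invented): the refresh is driven by frame completion, so a flip can never land mid-scanout:

```c
/* Sketch of the variable-refresh behavior described above: the display
   has no fixed refresh clock; it buffers each incoming frame and
   updates all pixels once the frame has fully arrived. */
#include <stdio.h>

int main(void)
{
    /* Hypothetical render times for five frames, in milliseconds. */
    double render_ms[5] = { 14.0, 21.0, 9.0, 33.0, 16.0 };
    double now = 0.0;

    for (int i = 0; i < 5; i++) {
        now += render_ms[i];  /* GPU finishes the frame... */
        printf("frame %d shown at %.1f ms (refresh driven by the GPU, no tear)\n",
               i, now);       /* ...and the display refreshes right then */
    }
    return 0;
}
```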
 
Would G-Sync introduce latency (as in mouse/controller input latency)?
No, it should actually reduce latency, since the frame transfer can start immediately after the frame is completed. Normally you wait for the next vsync (8.33 milliseconds on average at 60 Hz). I'm not sure whether the display can start updating the pixels as soon as the frame data starts to arrive, or whether it stores the frame first and updates once it's fully received.
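
The 8.33 ms figure is just half the 60 Hz refresh period; here's the back-of-the-envelope arithmetic as a runnable snippet (numbers mine, assuming a 60 Hz panel):

```c
/* With vsync, a frame finished at a uniformly random point in the
   refresh interval waits on average half a period for the next vblank.
   With G-Sync the transfer can start immediately. */
#include <stdio.h>

int main(void)
{
    double period_ms = 1000.0 / 60.0;      /* ~16.67 ms at 60 Hz */
    double vsync_avg_wait = period_ms / 2; /* ~8.33 ms average queueing delay */
    double gsync_wait = 0.0;               /* frame is sent as soon as it's done */

    printf("vsync : avg extra latency %.2f ms (worst case %.2f ms)\n",
           vsync_avg_wait, period_ms);
    printf("G-Sync: extra latency %.2f ms (transfer starts immediately)\n",
           gsync_wait);
    return 0;
}
```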
 
Adaptive-Sync, now included by VESA in DisplayPort 1.2a, is also a promising, freely available technology (both AMD and NVIDIA can use it) that eliminates tearing and judder in games.
 