V-Sync vs. FPS (revisited)

Chris_T

Every so often a forum war gets started regarding this issue.

Someone in another forum said:
Think of it like this: refresh rate is basically your monitor's own frames-per-second output. When a video card's frame rate goes above this rate, it inserts partial frames into each refresh cycle. This results in horizontal "tearing" of the image. Tearing is best described as seeing lines across your screen where the images do not line up, and is most noticeable when spinning your character around in a game. On the other hand, when your card's FPS drops below the refresh rate, all you get is duplicate video frames being displayed during multiple refresh cycles.

Now this sounds like an argument to have V-sync turned on, and with HIGH (over 100-140Hz) refresh rates this usually works out well. But for people playing at higher resolutions with lower refresh rates there is actually a disadvantage, because it lowers your game's FPS overall. This is because all current and previous video cards are brute-force renderers. Much like a car trying to get through a mudhole, it does better hitting it at 100mph than at 60mph. When a graphics card gets to a part of a game where the frame rate drops (the mudhole), it's better to have higher frames overall to help it pull through.

VS.

A Person disagreeing said:
Untrue - in fact, VSYNC is a bigger problem with high refresh rates. If your monitor is, say, a 60Hz refresh, you will never have VSYNC frame rate cuts so long as the card maintains >=60 frames/sec rendering. With VSYNC on at a high refresh rate (on a CRT, say, 100-140Hz+), you will have frame rate cuts any time the rendering rate drops below that - and few cards around can render at 100+ FPS consistently at any settings and resolution other than very low.

Who is correct? Both correct? Both have it wrong?
As always, if this is in the wrong forum, move it appropriately.
 
If you have triple buffering enabled, there is no framerate penalty for enabling vsync (other than the framerate being limited to your monitor's refresh). The penalty is only with double-buffering.

So, enable triple buffering and vsync, take the small performance drop associated with the extra framebuffer usage (if there is any drop), and have a tear-free experience that's as high-performance as with vsync disabled.
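
For the curious, here is roughly what "triple buffering + vsync" looks like at the API level. This is only a hedged sketch in Direct3D 9 terms (the window handle and device-creation details are my own illustration, not anything from this thread): two back buffers plus the front buffer gives triple buffering, and a presentation interval of one makes Present() wait for the vertical retrace.

Code:
#include <d3d9.h>

// Sketch: create a D3D9 device with triple buffering and vsync enabled.
// Assumes an existing window handle; error handling omitted for brevity.
IDirect3DDevice9* CreateTripleBufferedDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.hDeviceWindow        = hwnd;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferCount      = 2;                        // 2 back + 1 front = triple buffering
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;           // match the desktop format
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // Present() waits for vblank

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}

Whether the driver honors BackBufferCount as true triple buffering is, of course, up to the driver.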
 
Guess the issue is someone claiming that you get "tearing" when a video card's FPS drops below your monitor's refresh rate.

(EDIT: I was informed that the use of "cuts" in the reply wasn't the same as tearing; it's just that I think of tearing as cuts of frames across the screen)

Also, isn't triple buffering only available in DirectX through the game's own video settings, which means in most games it's unavailable? OTOH I know you can force TriBuff in OpenGL through the card's drivers.

Speaking as an ATI owner.
 
Chris_T said:
Guess the issue is someone claiming that you get "tearing" when a video card's FPS drops below your monitor's refresh rate.
Er, you get tearing all the time, no matter what your framerate or refresh rate are, if you are running with vsync disabled.

Also, isn't triple buffering only available in DirectX through the game's own video settings, which means in most games it's unavailable? OTOH I know you can force TriBuff in OpenGL through the card's drivers.

Speaking as an ATI owner.
Yes. Also, in nVidia's current drivers, there is a setting for enabling triple buffering, and that setting is independent of Direct3D or OpenGL. I haven't tested it yet, however, as tearing just never bothered me; I only ever notice it in very rare, typically contrived situations.
 
Chris_T said:
Every so often a forum war gets started regarding this issue.
[...]
Who is correct? Both correct? Both have it wrong?
As always, if this is in the wrong forum, move it appropriately.

I just can't understand why some people are that ignorant. They're probably the same people who say that you can't see more than 60 fps.

short answer: vsync doesn't lower your fps.

long answer:

horizontal monitor frequency = how many scanlines the monitor draws per second
vertical frequency (the refresh rate) = how many times a second the monitor refreshes the whole image (between each refresh a different image can be shown - the image that is sent from the graphics card to the monitor)

let's say that my monitor does 120 Hz at YxZ resolution in game X and I get a constant 160 fps in that game. With vsync or without, I will only see 120 FPS _no matter what_, because that monitor, at that resolution, can only refresh the image 120 times a second. Meaning I will see a different image 120 times/second.

With vsync on, the card stays synchronized with the monitor's refresh and won't swap in a new frame until the monitor has finished drawing the current one. That prevents image tearing, because only one frame can be seen in one refresh. Without vsync the card is not synced with the monitor and just hands frames over as they are rendered. The problem here is that my monitor may have a refresh rate of 120 Hz, but the card will render the scene at (say) 360 fps. So in the time it takes the monitor to draw one picture (1 refresh), the card will render 3 frames and send them to the monitor. I will see "3 frames in one" - tearing. This is more noticeable when moving around in a game or turning horizontally.

But remember: even with vsync disabled I will still see only 120 refreshes on the monitor, even though the card renders 360 frames - it's just that the 3 frames the card renders will show up within the same refresh. In a perfect situation, every 1/3 of the monitor screen would be a different image. Or maybe the monitor won't even show 2/3 of the 360 fps you're getting, because they might arrive between the monitor's refreshes, after it has already scanned out the first frame.
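
To make that 360-vs-120 example concrete, here's a toy back-of-the-envelope sketch (my own illustration, nothing from any driver): with the card finishing 360 frames a second and the monitor scanning out 120 of them, each refresh contains parts of 3 different frames without vsync, and exactly one frame with it.

Code:
#include <cstdio>

int main()
{
    const int refreshHz = 120;  // monitor refreshes per second
    const int renderFps = 360;  // frames the card finishes per second

    // Without vsync the card swaps buffers the moment a frame is done,
    // even mid-scanout, so several frames share one refresh.
    int framesPerRefresh    = renderFps / refreshHz;  // 3
    int tearLinesPerRefresh = framesPerRefresh - 1;   // 2 visible seams

    // With vsync the swap waits for vertical blank: one frame per refresh,
    // and the surplus rendered frames are simply never displayed.
    int framesNeverShown = renderFps - refreshHz;     // 240 per second

    std::printf("no vsync: %d frames per refresh, %d tear lines\n",
                framesPerRefresh, tearLinesPerRefresh);
    std::printf("vsync on: 1 frame per refresh, %d frames/sec never shown\n",
                framesNeverShown);
    return 0;
}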

No matter how you look at it, vsync is a perfect solution without any drawbacks. You WON'T lose any frames, because the extra ones rendered by your card wouldn't have shown up on your monitor anyway. The only scenario I can think of where vsync might not be a good idea is e.g. Q3, where the physics are fps-dependent - meaning that in Quake 3 you can actually jump higher with more frames. If your refresh rate is below 125 Hz, then disabling vsync in Q3 is a good idea to get "better" physics. You will get more fps, you just won't see them.

Now for the tricky part. There is no such thing as a vsync switch in OpenGL itself. As I don't know much about OGL, I found this comment from the lead programmer at Raven Software (SoF, Q4):

"There is no Vsync in OpenGL as a command. Most apps use the GLFlush command, sometimes followed by a GLFinish command. The GLFlush command basically says "Ok, what ever commands you have in your buffer, send 'em to the rendering device now, and get it working." It doesn't care where the raster is in the drawing sync, it just goes out and does it. The GLFinish command will then make the app wait until the rendering device has completed all the commands it has been sent up til then. This gives you the fastest feedback, fairly obviously. Now, depending on whether you are double buffering your video displays (ie rendering to the back one while the front one is being displayed) you might want to use a swapbuffers command. This means that you can afford to slap out commands to the rendering device when ever you feel like it, since it's always going to be rendering to an unseen buffer. The SwapBuffers command does what it says, it swaps the buffers between the front and the back. When it actually does this, ie at Vsync or just randomly whenever it can depends on the card you are using. Sometimes you can set the 'wait for Vsync' in the properties dialog for your card, sometimes it has to be set via registry options. It's messy and highly card dependant. Obviously working in a window you don't get any kind of Vsyncing going on.

As for Quake II & III - John C. makes the game run the fastest he can. Obviously waiting for Vsync before window swapping can cause a slow down. If you take 1.1 frames to draw a scene, then wait for Vsync before swapping frame buffers that means that .9 of that frame is spent doing nothing on the card. The OpenGL context can accept commands and buffer them up, but it's not going to be doing any rendering until the buffers are swapped and the back buffer is unlocked for rendering again. You can see why this would slow the game down."
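
(For reference, the less messy way this is usually exposed on Windows is the WGL_EXT_swap_control extension - not something the quote mentions, so treat this as a hedged aside. A minimal sketch, assuming a GL context is already current on the given device context:)

Code:
#include <windows.h>
#include <GL/gl.h>

// WGL_EXT_swap_control: choose how many vblanks SwapBuffers waits for.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void RenderWithVsync(HDC hdc)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");

    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);  // 1 = sync to vblank, 0 = swap immediately

    // ... issue GL drawing commands here ...

    SwapBuffers(hdc);           // with interval 1, the swap waits for retrace
}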

Just to make it clear once again: you do get a lower fps number with vsync on, but if your card produces more frames than your monitor can display, then you weren't seeing those extra frames anyway. Period.

Don't believe me? Here's a quote from Tim Sweeney:

"I don't have any clue why someone would disable VSync for gameplay. The only legit reason for this is to benchmark 3D card performance without the monitor's refresh rate skewing the results."

I hope that settles it, because I am sensitive on that topic :)

Now let's talk about how our eyes can't see more than 30 fps :)))
 
Varg, my understanding of the benefits of disabling vsync is somewhat different from yours.

"No matter how you look at it, vsync is a perfect solution without any drawbacks. You WON'T lose any frames, because the extra ones rendered by your card won't show up on your monitor."

You may not lose whole frames, but even if your framerate is higher than your refresh rate, you're going to lose partial frames. If your framerate is below your refresh rate, then it seems to me that you are going to "lose" a frame if a new one isn't complete by the next vsync, as you're going to see the last one again. Without vsync, you'd at least see part of a new frame.

Vsync obviously can lower your framerate with timedemos, and it does have the potential to lower the amount of frames (partial or whole) that you can physically see, no?

"Just to make it clear once again: You do get lower fps with vsync on, but if your card produces more frames than your monitor can display then you don't see those extra frames. Period."

Here again you're arguing from the perspective of the card rendering faster than the monitor refreshes (GPU snob :p ;)). I don't think it's often that people's cards hit 85+fps average (to say nothing of 160fps), especially given the general consensus of 60fps as an acceptable average (see 3D card reviews, some game timedemo results, etc.).

The two quotes in the OP were referring to different things: the first, to visible frames; the second, to visual tearing associated with rendering partial frames. I'm with Chal: triple buffering and vsync.

I would like to know where to enable triple buffering in D3D with ATT, though. Is it via Flip Queue Size (and if so, does that mean I can force sextuple buffering :oops:)? Or maybe it's only in the latest version, 1.04.780?
 
Varg Vikernes said:
No matter how you look at it, vsync is a perfect solution without any drawbacks.
No, it's not.

First of all, you have to use triple buffering for there not to be a framerate penalty, and that only comes at an extra cost in memory space, which can in turn adversely affect framerates.

Why do you need triple buffering? Consider the following double-buffering scenario:
0ms: monitor refresh comes, frame begins rendering
10ms: monitor refresh comes
15ms: frame ends rendering, but no refresh is currently happening, so the hardware waits for VBLANK before swapping buffers so that there is no tearing.
20ms: monitor refresh comes, buffers swapped, next frame begins rendering.

Notice that the video card, in this situation, was completely idle for 5ms waiting to sync up with the refresh rate. This is the framerate penalty associated with enabling VSYNC with only double buffering. This must occur because with double buffering, there's only storage space enough for the frame currently being output to the monitor, and for the frame currently being rendered. There's no extra space for the frame that is done rendering, but waiting for VSYNC.

Triple buffering adds this space, removing the direct framerate penalty associated with enabling VSYNC, but costs more memory, and thus may indirectly decrease the framerate somewhat.

Edit:
Just to be a little bit more explicit, if, for example, your monitor is running at a refresh of 100Hz and you are running with double-buffering and VSYNC enabled, the only possible framerates you can get are 100fps, 50fps, 33fps, 25fps, 20fps, and so on down the line.
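
(To put a formula on that: with double buffering, a frame that takes longer than one refresh interval has to wait for the next vblank, so its display time rounds up to a whole number of refresh periods. A quick sketch of the arithmetic, with made-up render times:)

Code:
#include <cmath>
#include <cstdio>

// Double-buffered vsync: effective fps = refresh rate divided by the number
// of whole refresh intervals each frame occupies (render time rounded up).
double vsyncFps(double renderMs, double refreshHz)
{
    double periodMs = 1000.0 / refreshHz;
    return 1000.0 / (std::ceil(renderMs / periodMs) * periodMs);
}

int main()
{
    // At 100 Hz, anything between 10 and 20 ms of render time lands on 50 fps.
    std::printf("%.1f fps\n", vsyncFps( 9.0, 100.0));  // 100.0
    std::printf("%.1f fps\n", vsyncFps(11.0, 100.0));  //  50.0
    std::printf("%.1f fps\n", vsyncFps(19.0, 100.0));  //  50.0
    std::printf("%.1f fps\n", vsyncFps(21.0, 100.0));  //  33.3
    return 0;
}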
 
One thing to consider with vertical sync is that it increases "output lag" (which can feel like input lag, since it takes a while before you see the result): you may have to wait up to a whole frame longer before switching buffers (though with triple buffering, work begins on a new frame while waiting for the next vertical retrace to display the completed one). In many games, being good depends very much on defeating the latency inherent in your brain and memorizing a bunch of short sequences of events so you can do them quickly, without having to see the result of one before you do the other.

For slow rate-of-fire weapons in FPS games you can just "twitch" to what you want to shoot at (at least if mouse acceleration is off and the framerate is decent) and fire without waiting to see if you are on target. In this case you don't depend on feedback, and vsync or no vsync doesn't matter as much (it still means you are aiming at where your target was, on average, half a frame ago; but learning to aim by "twitch" is something you have to practice for every specific game anyway, at least for me, so I don't think this matters).

With something like an SMG in an FPS you depend on following where the target is now and where it is moving to, and correcting your aim continuously. Here a little latency in output or input is devastating. The >100 ms latency in your brain would make it impossible to follow targets that move quickly and erratically in an arbitrary fashion. Thankfully that's not the case: there are rules for how a player, enemy or object may move, or wishes to move. Knowing that a player is heading for a power-up, trying to dodge someone else's rocket, bunnyhopping or whatever will let you predict their movement with much success.

But if you tack on half a frame's extra latency at, say, 50 fps, that's 10 ms extra with triple-buffered vsync over no vsync, and even more over double buffering with vsync (since you will have a lower framerate). Given how players move when they move erratically (by controlling acceleration and angle), deviation from the expected path ought to be roughly quadratic in screen position over a short period of time, meaning that any short latency before input is accepted, or before the result of that input is displayed, is a severe punishment.
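
(The 10 ms figure is just half the frame time at 50 fps. One line of arithmetic, under the rough assumption that a frame completes at a uniformly random point within a refresh interval and therefore waits half an interval on average:)

Code:
#include <cstdio>

// Rough model: a finished frame waits, on average, half a frame interval
// for the next vblank before it can be scanned out.
double avgExtraLatencyMs(double fps) { return 0.5 * 1000.0 / fps; }

int main()
{
    std::printf("%.0f ms\n", avgExtraLatencyMs(50.0));   // 10 ms, as above
    std::printf("%.0f ms\n", avgExtraLatencyMs(100.0));  //  5 ms
    return 0;
}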

This can be due to vertical sync, ghosting, low framerate, or a low mouse sample rate (which can also cause graphical stuttering: since during some frames the mouse will have sent its new position to Windows once, and during others twice, the movement will be very unsmooth). The default 125 Hz of the USB port is quite awful; if you don't feel like "overclocking" your USB port rate (which may cause problems, though rarely anything that can't be fixed by setting it back down again), you can use the PS/2 port at 200 Hz and still be within spec (which the mouse was most likely designed to be usable with). It can also be due to mouse "smoothing", which is mostly needed because people use the default 125 Hz sample rate of USB, or the default 100 Hz of PS/2 that some mouse drivers default to.

I'm not one to play games just for the level design and graphics alone. In the types of games where latency really matters I would seriously consider using double buffering and no vertical sync, especially if I can't force the game to use triple-buffered vsync.

Chalnoth said:
Er, you get tearing all the time, no matter what your framerate or refresh rate are, if you are running with vsync disabled.


Yes. Also, in nVidia's current drivers, there is a setting for enabling triple buffering, and that setting is independent of Direct3D or OpenGL. I haven't tested it yet, however, as tearing just never bothered me; I only ever notice it in very rare, typically contrived situations.

People with LCDs tend to notice it much less. And of course you notice it much less in slow games than in fast-paced shooters.

One of the places where it is most annoying is the muzzle flash in FPSs, and flickering lights (how come even "realistic" shooters have these HUGE muzzle flashes for every single shot fired? That's just not realistic at all).
 
Well, gamers are like certain birds. We're attracted to shiny things, like ginormous muzzle flashes and ridiculous sun flares and washouts. At least, that's what focus groups appear to be telling game devs and publishers. ;)

Good point, I forgot to mention the extra tiny bit of lag. It's probably not an issue where you'd sacrifice some framerate to avoid screen tearing (dbl buff, no vsync). I wonder if it isn't as big of a deal at a constant framerate (for instance, fps above the vsync'ed refresh rate), as it's constant. It'd probably be more annoying with a wildly fluctuating framerate, but that's probably a case where you'd want to disable vsync anyway.
 
Pete said:
You may not lose whole frames, but even if your framerate is higher than your refresh rate, you're going to lose partial frames. If your framerate is below your refresh rate, then it seems to me that you are going to "lose" a frame if a new one isn't complete by the next vsync, as you're going to see the last one again. Without vsync, you'd at least see part of a new frame.
The way I see it, you wind up losing partial frames either way. You lose them like you explained with vsync on, but without it those new partial frames come at the expense of seeing the rest of the frame they're replacing. So you can either get one whole image of a complete frame with vsync on, or a whole image that is a collage of parts of different frames when you turn vsync off. I'd much rather have the former, especially in games with lots of flashing lights like soylent mentioned.
Pete said:
I would like to know where to enable triple buffering in D3D with ATT, though. Is it via Flip Queue Size (and if so, does that mean I can force sextuple buffering :oops:)? Or maybe it's only in the latest version, 1.04.780?
I'm not sure what the Flip Queue Size thing is, but triple buffering can be forced in D3D through the profiles section's additional options. That did crash one game on me, though I don't even recall which game it was, and I've used it in quite a few others, so it works great for the most part.
 
Varg Vikernes said:
short answer: vsync doesn't lower your fps.
Sure, if you play hypothetical games in the land of theory. In the real world, however, try playing any fast-paced FPS on an oldish graphics card that only averages around 30-40 fps (with vsync off), then turn vsync on and it will be unplayable. Whether you are seeing "whole" frames or not makes little difference to the perceived smoothness.
 
Diplo, I'd agree with you if you were using double buffering in these games. If you also enabled (or forced) triple buffering, I should think that it'd still work quite well.
 
Exactly - as long as you have triple buffering and enough vram to support it without resorting to swapping, your framerate will come out a hair less at most. That isn't the land of theory, either: I do exactly that in most every game I play.
 
Chalnoth said:
Diplo, I'd agree with you if you were using double buffering in these games. If you also enabled (or forced) triple buffering, I should think that it'd still work quite well.
Yeah, that's probably true (but I'd have to test it in practice, as there's theory and then there's the 'feel' of playing UT deathmatch ;) ). However, the people most likely to suffer low frame rates are usually the ones with lower-end cards that have the least memory, and hence may not have 'room' for an extra frame buffer. I agree, though, that ideally you want vsync on with triple buffering - but it's an ideal, and something that many (if not most) people cannot always achieve.
 
Er, low-end cards usually tend to have far more memory than they will ever need. It's the aging high-end cards that people may have issues with.
 