Every so often a forum war gets started regarding this issue.
Someone in another forum said: Think of it like this: refresh rate is basically your monitor's own frames-per-second output. When a video card's frame rate goes above this rate, it inserts partial frames within each refresh cycle. This results in horizontal "tearing" of the image. Tearing is best described as seeing lines across your screen where the images do not line up, and is most noticeable when spinning your character around in a game. On the other hand, when your card's FPS drops below the refresh rate, all you get is duplicate frames being displayed across multiple refresh cycles.
Now this sounds like an argument for turning V-sync on, and with high refresh rates (over 100-140 Hz) this usually works out well. But for people playing at higher resolutions with lower refresh rates there is actually a disadvantage, because it lowers your game's FPS overall. This is because all current and previous video cards are brute-force renderers. Much like a car trying to get through a mudhole, which it clears more easily at 100 mph than at 60 mph, when a graphics card hits a part of a game where the frame rate drops (the mudhole), it's better to have higher frames overall to help it pull through.
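
If it helps picture what that first poster is describing, here is a rough back-of-the-envelope sketch in Python; the function and the numbers are mine, not theirs, and all it does is count how many rendered frames land in each refresh interval:

def frames_per_refresh(render_fps: float, refresh_hz: float) -> float:
    """How many rendered frames fall inside a single refresh interval."""
    return render_fps / refresh_hz

for fps in (180, 120, 45, 30):
    ratio = frames_per_refresh(fps, 60.0)
    if ratio > 1:
        # V-sync off: pieces of more than one frame per refresh -> tearing
        print(f"{fps} FPS @ 60 Hz: ~{ratio:.1f} frames per refresh (tearing)")
    else:
        # V-sync on: each frame is held across multiple refreshes (duplicates)
        print(f"{fps} FPS @ 60 Hz: each frame shown for ~{1/ratio:.1f} refreshes (duplicates)")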
VS.
A person disagreeing said: Untrue - in fact, VSYNC is a bigger problem with high refresh rates. If your monitor's refresh is, say, 60 Hz, you will never have VSYNC frame rate cuts so long as the card maintains >= 60 frames/sec rendering. With VSYNC on at a high refresh rate (on a CRT, say, 100-140+ Hz), you will have frame rate cuts any time the rendering rate drops below that - and few cards around can render at 100+ FPS consistently at any settings and resolutions other than very low.
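
For the second poster's point, here is a back-of-the-envelope model of those "frame rate cuts", assuming classic double-buffered V-sync with no triple buffering (my simplification, not something from the thread): a finished frame has to wait for the next refresh, so every frame ends up occupying a whole number of refresh intervals.

import math

def effective_fps(render_fps: float, refresh_hz: float) -> float:
    # Each frame is held for a whole number of refresh intervals,
    # so the displayed rate is the refresh rate divided by that count.
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

# 60 Hz monitor: no cut as long as the card keeps up with 60 FPS...
print(effective_fps(60, 60))    # 60.0
print(effective_fps(59, 60))    # 30.0  (barely missing a refresh halves the rate)

# ...but at 120 Hz, any dip below 120 FPS cuts the displayed rate.
print(effective_fps(130, 120))  # 120.0
print(effective_fps(119, 120))  # 60.0
print(effective_fps(90, 120))   # 60.0

Which, as far as I can tell, is the situation being described: at 60 Hz a decent card stays at or above the refresh rate, but at 100-140 Hz almost nothing does, so the cuts show up constantly.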
Who is correct? Both correct? Both have it wrong?
As always, if this is in the wrong forum, move it appropriately.