The old Refresh Rate question.

Riptides

Newcomer
We all know the old bit: the refresh rate is how many times per second the monitor refreshes the screen. With V-sync enabled your video card displays one frame per monitor refresh, hence 60Hz = 60 FPS max in a game. Optimally you would want to play games with V-sync enabled, since with it disabled what you see is still limited by the refresh rate of the monitor, hence the tearing. So at an 85Hz refresh you will never see more than 85 of the rendered frames that the video card puts out per second, regardless of your V-sync setting. Am I correct here in my thinking?

But the thing that has me wondering is...
With V-sync enabled, is the actual rendering process taking place on the card matched to the vertical refresh rate? Or is the card still rendering above and beyond the refresh rate, and only outputting frames to match the monitor's rate?

And is there any truth at all to the myth that enabling V-sync raises the minimum FPS, since the card does not have to work so hard to maintain a higher FPS?

hope these are valid questions :-?
-rip
 
You are correct. With an 85Hz refresh rate you cannot see more than 85 fps, because you are looking at the monitor displaying 85 images per second.

(added: I wasn't careful enough when reading your first question. OpenGL guy is correct in that you see parts of several frames simultaneously. By "85 fps" in "you cannot see more than 85 fps" I don't mean rendered frames, but displayed "frames".)

With vsync enabled the rendering process never renders more fps than the refresh rate: the process is stopped (blocked) when all framebuffers are complete and it is waiting for the next display refresh.

About minimum fps:
With vsync and double buffering (one back buffer being rendered into and one front buffer being displayed) your minimum fps can be lower than with vsync off. This happens when the rendering rate is lower than the refresh rate, because every now and then the process still has to stop and wait for the next refresh even though it is already falling behind. For example, at 60Hz a frame that takes 20 ms to render misses the vblank at 16.7 ms and has to wait for the one at 33.3 ms, so the displayed rate drops to 30 fps even though the card could have delivered 50 fps.

With vsync and triple buffering you have an extra back buffer that you can start rendering into while waiting for the next refresh. The only time the rendering process is stopped is when the rendering rate is faster than the refresh rate. Thus vsync + triple buffering doesn't lower the minimum fps, but it costs more framebuffer memory and adds a little latency.
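To put some numbers on the double vs. triple buffering difference, here is a minimal timing sketch in Python. It's purely my own toy model (constant render time, fixed refresh rate, vsync on, one pending frame flipped at each vblank), not how any real driver works, but it shows the arithmetic:
Code:
def simulate(render_ms, refresh_hz, total_buffers, sim_ms=10_000.0):
    """Average rate (fps) at which *new* frames reach the screen with vsync on.

    total_buffers = 2 -> double buffering (1 front + 1 back)
    total_buffers = 3 -> triple buffering (1 front + 2 back)
    """
    period = 1000.0 / refresh_hz      # ms between vblanks
    back_buffers = total_buffers - 1

    t = 0.0             # current GPU time (ms)
    pending = 0         # finished frames waiting to be flipped at a vblank
    next_vblank = period
    frames_shown = 0

    while t < sim_ms:
        # Block until a back buffer is free to render into.
        while pending >= back_buffers:
            t = max(t, next_vblank)   # wait for the vblank...
            frames_shown += 1         # ...which flips one pending frame
            pending -= 1
            next_vblank += period
        # Render one frame.
        t += render_ms
        # Vblanks that passed while rendering flip previously finished frames.
        while next_vblank <= t:
            if pending:
                frames_shown += 1
                pending -= 1
            next_vblank += period
        # The frame just finished now waits for a future vblank.
        pending += 1

    return 1000.0 * frames_shown / t


for name, bufs in (("double", 2), ("triple", 3)):
    fps = simulate(render_ms=20.0, refresh_hz=60.0, total_buffers=bufs)
    print(f"60Hz, 20 ms/frame (50 fps possible), {name} buffering: {fps:.0f} fps shown")

With a 20 ms render time at 60Hz this prints roughly 30 fps for double buffering and 50 fps for triple buffering, which is exactly the minimum-fps penalty described above.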

Oh, and about "stopping the rendering process": of course the rendering also stops if the CPU doesn't continuously issue new rendering commands.
I'm assuming that isn't the case above.

added:
OpenGL guy said:
but some people prefer the faster updates that vsync off can give
I agree it's a matter of preference. It does seem that a lot of people disable vsync just because it gives a higher measured fps, and therefore conclude it must be "better", though.
IMO a consistent framerate without tearing is much better than slightly lower latency.
 
ripvanwinkle said:
So at an 85Hz refresh you will never see more than 85 of the rendered frames that the video card puts out per second, regardless of your V-sync setting. Am I correct here in my thinking?
Not exactly. Picture a case where your monitor is at 60Hz and the application is running at a consistent 180 fps with vsync disabled. The result on your monitor would be something like:
Code:
---------------------
|  Frame 1          |
|                   |
---------------------
|  Frame 2          |
|                   |
---------------------
|  Frame 3          |
|                   |
---------------------
In other words, you would see parts of 3 frames per vsync. Yes, you would get tearing, but some people prefer the faster updates that vsync off can give. It's mainly a matter of personal choice.
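The slice count in that picture is just the ratio of the two rates. Here's a quick back-of-the-envelope sketch in Python (assuming the new frames arrive evenly spaced during scan-out and ignoring the vblank interval, so only a rough illustration):
Code:
refresh_hz = 60.0
render_fps = 180.0   # vsync off, rendering faster than the refresh

frames_per_refresh = render_fps / refresh_hz
print(f"{frames_per_refresh:.0f} different frames shown per refresh")   # -> 3

# Rough scanline positions of the tear lines on a 480-line screen:
lines = 480
for i in range(1, int(frames_per_refresh)):
    print(f"tear line near scanline {lines * i / frames_per_refresh:.0f}")
# -> tear line near scanline 160
# -> tear line near scanline 320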
 
Ah, so that clears up an important part, and also describes tearing to me perfectly now. Honestly I had never seen it this bad until my Ge4ti. That's why I see things being drawn at the bottom of my screen before it fills the entire screen.

But optimally/visually speaking, I would want to have V-sync enabled; the whole crux of current tech is that cards brute-force the rendering, and so the max FPS is relative to the minimum FPS a card can get.

Really I would want a card that was optimized to maintain a maximum FPS in the area of 100-120, but that could also keep the minimum FPS within that window.
 
ripvanwinkle said:
Really I would want a card that was optimized to maintain a maximum FPS in the area of 100-120, but that could also keep the minimum FPS within that window.

I read something about future cards doing something like this (hopefully).
Well, I think that's what it means :)

digit-life said:
In the near future, pixel processors will become full equals (in capability) of vertex processors because they share the same data format and the same arithmetic instructions; the only thing still lacking is instruction flow control, but that problem can be solved. The distinction between pixel and vertex processors will vanish. Within several architecture generations a graphics accelerator will turn into a set of identical general-purpose vector processors with flexible, configurable queues for asynchronous transfer of parameters between them. The processors' effort will be distributed on the fly, depending on the rendering approach used and the balance of performance required for particular tasks:

some will be in charge of animation and tessellation (geometry generation),
some will handle geometric transformations,
some will manage shading and lighting,
some will deal with texture sampling (they will be intelligent texture units able to run arbitrary programmable filtering methods or compute procedural textures).

It talks about a lot more than this, but I pasted the whole part anyway :)
 