DiGuru said:
Sandwich said:
On the contrary, for consoles there's all the more reason to output frames faster: the TV signal is 60 Hz, or 50 Hz for PAL.
There's a reason why people with TFT screens complain more about tearing than gamers with proper CRTs.
If you turn on vsync, you will have no tearing. And most of the time, there is tearing because you don't use triple buffering and render too many frames (see the sketch below).
Could be a particular game that only works with double buffering.
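For anyone who wants the mechanics spelled out, here's a toy sketch (my own simplified model with made-up numbers, not any real API): with vsync off, a buffer swap that lands mid-refresh leaves a tear at whatever scanline is being scanned out at that moment.

REFRESH_MS = 1000.0 / 60.0  # one refresh interval on a 60 Hz display
LINES = 1080                # vertical resolution, arbitrary choice

def tear_line(swap_time_ms):
    """Scanline where old and new frames meet, or None if the swap hit the retrace."""
    phase = swap_time_ms % REFRESH_MS        # how far into the current refresh the swap lands
    line = int(phase / REFRESH_MS * LINES)   # scanline being scanned out at swap time
    return None if line == 0 else line

print(tear_line(8.3))  # mid-refresh swap -> tear around line 537
print(tear_line(0.0))  # swap exactly on the retrace -> None, no tear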
So I don't see why a framerate much higher than the monitor refresh would reduce tearing. It doesn't. It increases the number of tears, but makes them less visible, as the changes between frames get smaller.
A CRT doesn't increase framerate, obviously. I'm pointing to the higher refresh rate of good CRTs. With twice the refresh rate, torn images alternate with complete frames: at 60 fps on a 120 Hz CRT, a refresh with a tear line is followed by a refresh showing the finished frame. Much better.
Aiming for 60 fps, as is common for console shooters, is really the worst-case scenario.
Example: vsync on. Retrace begins. Generating the new frame takes 17 ms. You missed the retrace by a hair, so you wait nearly a full refresh (~16.7 ms) for the next one.
What you're seeing on screen is what occurred up to 2 frames ago.
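To put numbers on that example, a minimal sketch (assuming a 60 Hz display and this idealized model of double-buffered vsync):

import math

REFRESH_MS = 1000.0 / 60.0  # ~16.67 ms per refresh at 60 Hz

def display_time_ms(render_ms, refresh_ms=REFRESH_MS):
    """When a frame whose rendering starts at t=0 reaches the screen, vsync on."""
    # With vsync, the buffer swap only happens at the first retrace
    # after rendering finishes.
    return math.ceil(render_ms / refresh_ms) * refresh_ms

print(display_time_ms(17.0))  # ~33.3 ms: missed the retrace by a hair, wait out another interval
print(display_time_ms(16.0))  # ~16.7 ms: caught the retrace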
That's why frames are buffered. That doesn't mean that the next frame isn't displayed, or that the computer won't start rendering the next frame. But you might skip a frame every once in a while.
Of course, I'm not saying that the fps decreases, but you still get the lag. Triple buffering will do little about the lag, but unlike with double buffering, you don't get your fps halved as well.
You still get a lag of between 1 and 2 frames, but with the advantage that slow frames can also be skipped, so on average less than 1.5.
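A rough simulation of that trade-off (again my own simplified model, not any actual driver behaviour; 17 ms renders on a 60 Hz display):

import math

REFRESH_MS = 1000.0 / 60.0
RENDER_MS = 17.0  # just slower than one refresh interval

def average_fps(buffers, frames=600):
    t = 0.0  # wall-clock time in ms
    for _ in range(frames):
        t += RENDER_MS  # render the next frame
        if buffers == 2:
            # Double buffering + vsync: the renderer stalls until a retrace
            # frees the back buffer again.
            t = math.ceil(t / REFRESH_MS) * REFRESH_MS
        # With a third buffer the renderer keeps drawing into the spare buffer;
        # since rendering here is slower than the display, it never stalls.
    return frames / (t / 1000.0)

print(average_fps(2))  # ~30 fps: every frame waits out a missed retrace
print(average_fps(3))  # ~58.8 fps: fps isn't halved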
What you should actually want is a great disparity between fps and refresh rate. On the one hand, rendering excess frames reduces lag. On the other hand, low fps with a fast refresh hides tearing and minimizes lag (you can up the detail even further and use double buffering without vsync for extra efficiency).
No, using the resources for things other than rendering excess frames reduces lag and tearing, and gives you more detailed visuals, simply because the resources aren't wasted.
Slower frames don't reduce lag, because the time it takes to render a frame itself IS lag, but you get less retrace lag if the refresh rate is much higher. You get the maximum additional lag when fps and refresh rate are equal.
And by upping the detail by hand, you reduce the fps in any case. You're only saying that if you have excess frames, you can trade them for better visuals by changing the settings yourself.
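Back-of-the-envelope numbers for the retrace-lag claim (a simplified model: worst-case lag = render time plus one full wait for the next retrace):

def worst_case_lag_ms(fps, refresh_hz):
    render = 1000.0 / fps                # the render time is itself lag
    retrace_wait = 1000.0 / refresh_hz   # worst case: just missed a retrace
    return render + retrace_wait

print(worst_case_lag_ms(60, 60))   # ~33.3 ms: fps == refresh rate, the worst combination
print(worst_case_lag_ms(60, 120))  # ~25.0 ms: same fps on a 120 Hz CRT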
I'm saying faster is better. Speed for visuals is a trade-off.
But if you leave frame rate management to the game, it can make sure you always get the minimum you want, instead of a highly varying frame rate that would make you want to change the detail settings every minute.
You always get variation. Nothing prevents that except the pointless capping of frames.
It's just pathetic for a console shooter to assume less than 60 fps will somehow be enough. 60 fps should be the bare minimum. Consoles should be about blazingly fast action. That's what a console is supposed to be good at.
I'm not even a hardcore gamer, but even I can clearly see how sluggish console shooters are compared to the PC.
For a shooter, you shouldn't aim for 60 fps just because the display is 60 Hz.
Aim higher. 90 fps will do.
You get less lag, which many gamers *will* notice at these low speeds.
The minimum framerate would also increase, which is always a good thing.