ROG27 said: Does anyone know if tile-rendering GPUs are prone to screen tearing/vsync problems? Almost all Xbox 360 games I've played have vsync issues. What is up with that?
A technical explanation would be greatly appreciated if indeed there is such an issue.
Rockster said: I have speculated that the apparent lack of vsync in most 360 titles is due to their use of an outboard video display chip. Because it's not integrated into the GPU, it may not have communication channels equivalent to those of a modern PC chip for coordinating page flips.
see colon said: The GameCube has some titles with screen tearing as well. I remember noticing it in more than a few games, but the only one I remember off the top of my head is Simpsons: Road Rage. IIRC, all of the games I noticed it in were multiplatform. Also, on the original Xbox, I think Bruce Lee had screen tearing. Though, to be honest, it's been so long since I played it, and I played it for such a short time, and it was so horrid, that perhaps my mind invented the screen tearing in that game to help blunt the pain of playing it.
A176 said:...consoles have frame tearing?
Fox5 said:Surprised me too. I used to think the only solution to an unsteady framerate on consoles was triple buffering, but apparently they can disable vsync.
I had just kind of assumed it was necessary to have vsync enabled, especially since any TV-out function on a PC that I've used forces vsync on.
london-boy said: Well, disabling vsync is a little trick that started being used a few years ago when devs couldn't get stable framerates but didn't want the framerate to plunge. With vsync enabled, any fluctuation in framerate is very noticeable, because the framerate has to be an integer divisor of the refresh rate (60/50 Hz): if something slows down, it drops straight to 30/25, or even worse, 15/12.5...
With vsync disabled, the framerate can go down to 49, 43, or whatever the system can manage, causing a less noticeable slowdown. The only drawback is that the screen tears.
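The "snapping to divisors" behaviour london-boy describes can be sketched numerically. This is a simplified model (assuming a 60 Hz display and a constant per-frame render time; the function name and numbers are illustrative, not from any real API):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh interval

def effective_fps(render_ms, vsync=True):
    """Framerate seen on screen for a constant per-frame render time."""
    if not vsync:
        # Tearing allowed: each frame is shown the moment it's done.
        return 1000 / render_ms
    # Vsync: a frame that misses a refresh waits for the next one,
    # so frame time rounds up to a whole number of refresh intervals.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

# A frame taking 17 ms only just misses the 16.67 ms budget:
print(effective_fps(17, vsync=False))  # ~58.8 fps, with tearing
print(effective_fps(17, vsync=True))   # 30.0 fps: snaps down to 60/2
```

This shows why a tiny slowdown is so visible with vsync on: missing the budget by a fraction of a millisecond halves the displayed framerate.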
Win-win as long as you don't care about quality updates. Tearing drops a game's classiness by 48.4% according to a recent poll.
Fox5 said: Disabling vsync also requires less vram, win-win situation?
Only if they are tiling and trying to render with only a front buffer.
zeckensack said: If you're double-buffering anyway, no, it doesn't save vram to disable vsync.
If you disable vsync as an alternative to vsync on + triple-buffering, then yes, of course.