Question about Tile Rendering

ROG27

Does anyone know if tile-rendering GPUs are prone to screen tearing/vsync problems? Almost all the Xbox 360 games I've played have vsync issues. What is up with that?

A technical explanation would be greatly appreciated if indeed there is such an issue.
 
ROG27 said:
Does anyone know if tile-rendering GPUs are prone to screen tearing/vsync problems? Almost all the Xbox 360 games I've played have vsync issues. What is up with that?

A technical explanation would be greatly appreciated if indeed there is such an issue.

Nothing to do with tile rendering.

It's an application-level decision whether to turn vsync on; tearing is simply a consequence of vsync not being enabled.
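On PC Direct3D 9, which the 360's graphics API closely resembles, that decision is literally a single field chosen at swap-chain creation. A minimal sketch; the helper function is invented for illustration, but the two interval flags are the real D3D9 ones:

```c
/* Sketch: vsync is an application choice made when setting up the
 * swap chain. Only the two D3DPRESENT_INTERVAL_* flags are real
 * Direct3D 9 identifiers; the helper function is hypothetical. */
#include <d3d9.h>

void configure_present(D3DPRESENT_PARAMETERS *pp, int vsync_on)
{
    /* ONE:       wait for vblank before flipping -> no tearing  */
    /* IMMEDIATE: flip the moment the frame is ready -> can tear */
    pp->PresentationInterval = vsync_on ? D3DPRESENT_INTERVAL_ONE
                                        : D3DPRESENT_INTERVAL_IMMEDIATE;
}
```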
 
Tile-based deferred rendering can actually make a single-buffered framebuffer a workable option for a game, though not necessarily an attractive choice.
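A toy CPU-side model of why that is, with every name and size invented for illustration: each tile is composed entirely in fast on-chip memory and only written out once finished, so a single front buffer only ever contains whole tiles, never half-drawn pixels.

```c
/* Toy model of a tile-based renderer targeting a single (front)
 * buffer. Purely illustrative; not a real GPU API. */
#include <stdint.h>
#include <string.h>

#define SCREEN_W 640
#define SCREEN_H 480
#define TILE      32

static uint32_t front_buffer[SCREEN_W * SCREEN_H]; /* the only buffer */

/* Stand-in for rasterizing everything that touches one tile into
 * on-chip tile memory. */
static void render_tile(uint32_t *tile_mem, int tx, int ty)
{
    for (int i = 0; i < TILE * TILE; i++)
        tile_mem[i] = 0xFF202020u + (uint32_t)(tx + ty);
}

static void render_frame(void)
{
    uint32_t tile_mem[TILE * TILE]; /* models fast on-chip storage */

    for (int ty = 0; ty < SCREEN_H / TILE; ty++)
        for (int tx = 0; tx < SCREEN_W / TILE; tx++) {
            render_tile(tile_mem, tx, ty);
            /* Resolve: the only external write is a finished tile, so
             * the visible buffer never holds a partially drawn tile. */
            for (int row = 0; row < TILE; row++)
                memcpy(&front_buffer[(ty * TILE + row) * SCREEN_W + tx * TILE],
                       &tile_mem[row * TILE], TILE * sizeof(uint32_t));
        }
}

int main(void) { render_frame(); return 0; }
```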
 
I have speculated that the apparent lack of vsync in most 360 titles is due to their use of an outboard video display chip. Because it's not integrated into the GPU, it may not have communication channels equivalent to those of a modern PC chip for coordinating page flips.
 
The GameCube has some titles with screen tearing as well. I remember noticing it in more than a few games, but the only one I remember off the top of my head is The Simpsons: Road Rage. IIRC, all of the games I noticed it in were multiplatform. Also, on the original Xbox I think Bruce Lee had screen tearing, though to be honest it's been so long since I played it, and I played it for such a short time, and it was so horrid, that perhaps my mind invented the screen tearing to help blunt the pain of playing it.
 
Rockster said:
I have speculated that the apparent lack of vsync in most 360 titles is due to their use of an outboard video display chip. Because it's not integrated into the GPU, it may not have communication channels equivalent to those of a modern PC chip for coordinating page flips.

Nope, it has all the support it needs.
 
see colon said:
The GameCube has some titles with screen tearing as well. I remember noticing it in more than a few games, but the only one I remember off the top of my head is The Simpsons: Road Rage. IIRC, all of the games I noticed it in were multiplatform. Also, on the original Xbox I think Bruce Lee had screen tearing, though to be honest it's been so long since I played it, and I played it for such a short time, and it was so horrid, that perhaps my mind invented the screen tearing to help blunt the pain of playing it.

God of War on PS2 has severe tearing.

Strange that GameCube games would have tearing. I thought the standard choice on GameCube was triple buffering rather than disabling vsync (or using double buffering), but I guess it's different for third-party games. Perhaps it was easier to just disable vsync than to rework the game to fit within the GameCube's memory limitations.
 
Deathrow in 50 Hz mode had loads of tearing, and so did practically every other demo on the Xbox.
 
A176 said:
...consoles have frame tearing?

Surprised me too. I used to think the only solution to an unsteady framerate on consoles was triple buffering, but apparently they can disable vsync.
I had just kind of assumed vsync had to be enabled, especially since any TV-out function on a PC that I've used forces vsync on.
 
Fox5 said:
Surprised me too. I used to think the only solution to an unsteady framerate on consoles was triple buffering, but apparently they can disable vsync.
I had just kind of assumed vsync had to be enabled, especially since any TV-out function on a PC that I've used forces vsync on.

Well, disabling vsync is a little trick that devs started using a few years ago when they couldn't get stable framerates but didn't want the framerate to plunge too far. With vsync enabled, any fluctuation in framerate is very noticeable because the framerate snaps to an integer divisor of the refresh rate (60 Hz NTSC / 50 Hz PAL), which means if something slows down, it drops to 30/25 or, even worse, 15/12.5...
With vsync disabled, the framerate can drop to 49, 43 or whatever the system manages, causing a less noticeable slowdown. The only drawback is that the screen tears.
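A small standalone simulation of that quantization, assuming classic double buffering where a frame that misses a 60 Hz vblank has to wait for the next one (the render times are made-up examples):

```c
/* Shows why vsync quantizes the framerate to integer divisors of the
 * refresh rate (60 -> 30 -> 20 -> 15 ...). Numbers are illustrative. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 60.0;  /* 60 Hz vblank period */
    const double render_ms[] = { 15.0, 17.0, 20.0, 25.0, 34.0 };

    for (int i = 0; i < 5; i++) {
        double t = render_ms[i];
        /* vsync on: the flip waits for the next vblank boundary */
        double shown_ms = ceil(t / refresh_ms) * refresh_ms;
        printf("render %4.1f ms -> vsync on: %4.1f fps, vsync off: %4.1f fps\n",
               t, 1000.0 / shown_ms, 1000.0 / t);
    }
    return 0;
}
```

A 17 ms frame, just 0.4 ms over budget, drops straight to 30 fps with vsync on but only to about 59 fps with it off, which is exactly the trade-off described above.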
 
london-boy said:
Well, disabling vsync is a little trick that devs started using a few years ago when they couldn't get stable framerates but didn't want the framerate to plunge too far. With vsync enabled, any fluctuation in framerate is very noticeable because the framerate snaps to an integer divisor of the refresh rate (60 Hz NTSC / 50 Hz PAL), which means if something slows down, it drops to 30/25 or, even worse, 15/12.5...
With vsync disabled, the framerate can drop to 49, 43 or whatever the system manages, causing a less noticeable slowdown. The only drawback is that the screen tears.

Disabling vsync also requires less vram, win-win situation?
 
Fox5 said:
Disabling vsync also requires less vram, win-win situation?
Win-win as long as you don't care about the quality of the updates. Tearing drops a game's classiness by 48.4% according to a recent poll
(1 person asked, all me)
 
Personally I've always hated tearing, but I can see how disabling vsync is an attractive option if you're regularly just over one frame's budget, or you have intermittent spikes.
 
Fox5 said:
Disabling vsync also requires less vram, win-win situation?
Only if they are tiling and trying to render with only a front buffer.
 
Fox5 said:
Disabling vsync also requires less vram, win-win situation?
If you're double-buffering anyway, no, it doesn't save vram to disable vsync.
If you disable vsync as an alternative to vsync on+triple-buffering, then yes, of course.
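To put rough numbers on that, assuming a 1280x720 target with 32-bit color (figures chosen purely for illustration):

```c
/* Back-of-the-envelope color-buffer cost per buffering scheme.
 * Resolution and pixel format are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    const double w = 1280, h = 720, bytes_per_pixel = 4;
    const double one_mb = w * h * bytes_per_pixel / (1024.0 * 1024.0);

    printf("single buffer: %4.1f MB\n", 1 * one_mb);
    printf("double buffer: %4.1f MB\n", 2 * one_mb);
    printf("triple buffer: %4.1f MB\n", 3 * one_mb);
    return 0;
}
```

Each buffer you drop saves about 3.5 MB at that resolution, which was real money on consoles of that generation.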
 
zeckensack said:
If you're double-buffering anyway, no, it doesn't save vram to disable vsync.
If you disable vsync as an alternative to vsync on+triple-buffering, then yes, of course.

Hmm, I thought if you disabled vsync you only needed a single buffer, since you're just drawing/updating the image as it's ready. I've even seen it hypothesized that with LCD tech and the right driving controller, only part of the image has to be redrawn (and thus the data resent), rather than following the typical scanline method of redrawing from top to bottom. (Perhaps the Game Boy or the Nintendo DS can operate this way?)
 
Since this thread is about tile rendering, I have a question of my own.

Is it hard to implement tile rendering in the Xbox 360 versions of multiplatform games? If it is, this might be quite a big problem for the X360, or at least it seems that way to me. It has always been clear that exclusives are the ones that show a console's true potential, but this could make the gap even bigger.
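For background: the 360's eDRAM holds 10 MB, so a 720p frame with 4x MSAA doesn't fit in a single pass, and the scene's geometry has to be resubmitted once per screen tile ("predicated tiling"). The sketch below is only a conceptual model of that loop; the stub functions are hypothetical, not the real Xbox 360 D3D entry points.

```c
/* Conceptual model of predicated tiling: split the frame into
 * screen-space strips that fit in eDRAM and replay the scene's draw
 * calls once per strip. All function names here are hypothetical. */
#include <stdio.h>

typedef struct { int x0, y0, x1, y1; } Rect;

/* Stubs standing in for the real console API. */
static void set_scissor_and_viewport(Rect t)  { (void)t; }
static void replay_scene_draw_calls(void)     { /* resubmit geometry */ }
static void resolve_edram_to_main_memory(Rect t)
{
    printf("resolved tile rows %d..%d\n", t.y0, t.y1);
}

static void render_frame_tiled(int screen_w, int screen_h, int tile_h)
{
    for (int y = 0; y < screen_h; y += tile_h) {
        Rect tile = { 0, y, screen_w, y + tile_h };
        set_scissor_and_viewport(tile);
        replay_scene_draw_calls();           /* the per-tile CPU/GPU cost */
        resolve_edram_to_main_memory(tile);  /* copy finished tile out    */
    }
}

int main(void) { render_frame_tiled(1280, 720, 240); return 0; }
```

The extra cost of replaying the scene per tile is exactly why bolting tiling onto a multiplatform engine late in development is painful.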
 