What effect does overscan compensation have on games? *spawn

I've got a question: when a game asks you to adjust the arrows on each side of the screen to fit your TV before you start playing, does that mean it's not rendering at native 720p or the proper HD ratio?

No, it's a best practice that Sony asks developers to follow, in case you do have a TV with so-called 'overscan'.

However, if you enter this calibration screen and the maximum (biggest) setting places the arrows outside of your view (they drop off the edge of the TV image), then your TV most likely has overscan. My living-room TV (1366x768 or something) has this and the one in my study doesn't, which is how I noticed the difference.

I actually noticed a performance difference in one game (Zen Pinball 2) between the maximum setting and something smaller, which was actually that game's default. I think Sony actually recommends defaulting to 10% smaller. I have asked Digital Foundry to do framerate tests sometime on games that support these settings, because it makes sense that some form of scaling could be involved where the setting is smaller than the native framebuffer resolution. They haven't gotten round to it though, and it's probably not a very hot item.
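Just to put rough numbers on that 10% figure: if a game really did shrink its framebuffer to match the overscan setting, the savings compound per axis. A quick back-of-the-envelope sketch (the 1280x720 target and the inset values are my own illustrative assumptions, not anything from Sony's guidelines):

```c
/* Illustrative only: pixel count of a hypothetical reduced framebuffer
 * for a few per-axis overscan insets on an assumed 1280x720 target. */
#include <stdio.h>

int main(void) {
    const int native_w = 1280, native_h = 720;
    const double insets[] = { 0.00, 0.05, 0.10 };

    for (int i = 0; i < 3; ++i) {
        int w = (int)(native_w * (1.0 - insets[i]) + 0.5);
        int h = (int)(native_h * (1.0 - insets[i]) + 0.5);
        double fraction = (double)(w * h) / ((double)native_w * native_h);
        printf("%2.0f%% inset: %4dx%-4d -> %5.1f%% of native pixels\n",
               insets[i] * 100.0, w, h, fraction * 100.0);
    }
    return 0;
}
```

So a 10% inset per axis already drops you to roughly 81% of the pixels, which is the kind of margin that could plausibly show up in a framerate test.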
 
The game asks me to adjust the arrows on each side of the screen so the tip of each arrow reaches the edge of the TV screen, just like every COD game. Does that mean the game is not rendering at 720p?
That's adjusting for overscan. Some TVs still cut off part of the frame, despite LCD panels being an exact fit, unlike the CRT tubes they replaced. This means you'd miss the outer 10% of the view unless the image is shrunk to fit. You can render at any resolution and perform a final shrink, so your choice of overscan adjustment shouldn't affect render resolution. Conceptually, you could take the user-selected shrunk resolution and render to a smaller framebuffer, which would free up some resources for a little extra framerate, but I don't think anyone does that.
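To make the "final shrink" idea concrete, here's a minimal sketch of the arithmetic (the 0.90 scale is an assumed example value): the user's arrow adjustment only selects the destination rectangle of the last scaled copy to the display, so the render resolution never changes.

```c
/* Minimal sketch: the calibration screen picks a scale, and the only
 * thing it affects is the centred destination rectangle of the final
 * scaled blit. The frame is still rendered at full resolution. */
#include <stdio.h>

int main(void) {
    const int display_w = 1280, display_h = 720;  /* output video mode */
    const double user_scale = 0.90;               /* from the arrow screen (assumed) */

    int dst_w = (int)(display_w * user_scale + 0.5);
    int dst_h = (int)(display_h * user_scale + 0.5);
    int dst_x = (display_w - dst_w) / 2;
    int dst_y = (display_h - dst_h) / 2;

    printf("render 1280x720, blit into (%d,%d) %dx%d, borders black\n",
           dst_x, dst_y, dst_w, dst_h);
    return 0;
}
```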
 
No, it's a best practice that Sony asks developers to follow, in case you do have a TV with so-called 'overscan'.

However, if you enter this calibration screen and the maximum (biggest) setting places the arrows outside of your view (they drop off the edge of the TV image), then your TV most likely has overscan. My living-room TV (1366x768 or something) has this and the one in my study doesn't, which is how I noticed the difference.

I actually noticed a performance difference in one game (Zen Pinball 2) between the maximum setting and something smaller, which was actually that game's default. I think Sony actually recommends defaulting to 10% smaller. I have asked Digital Foundry to do framerate tests sometime on games that support these settings, because it makes sense that some form of scaling could be involved where the setting is smaller than the native framebuffer resolution. They haven't gotten round to it though, and it's probably not a very hot item.

Actually, it would make sense that tuning it smaller would significantly increase performance.
Let's take it to (sort of) an extreme: tune it so badly that the rendered area is only 50% as wide and 50% as tall as the original, leaving only 25% of the pixels to render and the remaining 75% simply black.

It seems pretty straightforward that this would result in an increase in performance, as long as they don't just render the whole damn thing and then resize it to fit.
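One caveat on how big that increase could be: even in the 25%-pixels case, only the per-pixel portion of the frame scales down. A toy frame-cost model (all the millisecond figures here are made up purely for illustration):

```c
/* Toy, made-up cost model: per-pixel work scales with rendered area,
 * but CPU, vertex and other fixed per-frame costs do not, so a frame
 * with 25% of the pixels is faster, yet nowhere near 4x faster. */
#include <stdio.h>

int main(void) {
    const double fixed_ms = 8.0;       /* assumed CPU/vertex/setup cost */
    const double per_pixel_ms = 8.0;   /* assumed fill/shading cost at full area */
    const double areas[] = { 1.00, 0.81, 0.25 };

    for (int i = 0; i < 3; ++i) {
        double frame_ms = fixed_ms + per_pixel_ms * areas[i];
        printf("area %3.0f%%: %5.2f ms/frame (%5.1f fps)\n",
               areas[i] * 100.0, frame_ms, 1000.0 / frame_ms);
    }
    return 0;
}
```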
 
That's adjusting for overscan. Some TVs still cut off part of the frame, despite LCD panels being an exact fit, unlike the CRT tubes they replaced. This means you'd miss the outer 10% of the view unless the image is shrunk to fit. You can render at any resolution and perform a final shrink, so your choice of overscan adjustment shouldn't affect render resolution. Conceptually, you could take the user-selected shrunk resolution and render to a smaller framebuffer, which would free up some resources for a little extra framerate, but I don't think anyone does that.

Thanks, Shifty, for the detailed run-through; I'd always wondered about the reason behind it.
 
That's adjusting for overscan. Some TVs still cut off part of the frame, despite LCD panels being an exact fit, unlike the CRT tubes they replaced. This means you'd miss the outer 10% of the view unless the image is shrunk to fit. You can render at any resolution and perform a final shrink, so your choice of overscan adjustment shouldn't affect render resolution. Conceptually, you could take the user-selected shrunk resolution and render to a smaller framebuffer, which would free up some resources for a little extra framerate, but I don't think anyone does that.

Wouldn't that final resize add overhead to the whole pipeline? Doing the latter (rendering to the smaller framebuffer) seems much more straightforward.
 
Very interesting. Well, for Hitman I'm pretty content with the performance so far, with the tips of all the arrows just touching the border.
 
Wouldn't that final resize add overhead to the whole pipeline? Doing the latter (rendering to the smaller framebuffer) seems much more straightforward.
Depends. If you have a post-processing step, adding a rescale might add virtually nothing to the workload. If the downsample is in addition to the normal rendering, then yes. Supporting arbitrary render resolutions might also add complexity to the engine that isn't worth it, so you'd take the performance hit instead.
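A toy CPU-side illustration of the "virtually nothing" case (the gamma step and all the numbers are my own assumptions; a real engine would do this on the GPU): if a full-screen post pass already reads every source pixel and writes every destination pixel, sampling the source at scaled coordinates folds the overscan shrink into that same loop.

```c
/* Toy illustration: a full-screen post pass (here, a gamma step) costs
 * one read and one write per output pixel either way, so resampling at
 * scaled coordinates makes the overscan shrink almost free. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define SRC_W 1280
#define SRC_H 720

int main(void) {
    const double user_scale = 0.90;             /* from the calibration screen (assumed) */
    const int dst_w = (int)(SRC_W * user_scale);
    const int dst_h = (int)(SRC_H * user_scale);

    float *src = calloc((size_t)SRC_W * SRC_H, sizeof *src);
    float *dst = calloc((size_t)SRC_W * SRC_H, sizeof *dst); /* full frame; borders stay black */

    /* Post pass: gamma + resample in one loop. Without overscan, the source
     * coordinate would simply be (x, y); the extra divide is noise next to
     * the memory traffic the pass already pays for. */
    int x0 = (SRC_W - dst_w) / 2, y0 = (SRC_H - dst_h) / 2;
    for (int y = 0; y < dst_h; ++y) {
        for (int x = 0; x < dst_w; ++x) {
            int sx = (int)(x / user_scale);     /* nearest-neighbour for brevity */
            int sy = (int)(y / user_scale);
            float v = src[sy * SRC_W + sx];
            dst[(y0 + y) * SRC_W + (x0 + x)] = powf(v, 1.0f / 2.2f);
        }
    }
    printf("post+shrink wrote %dx%d into a %dx%d frame\n", dst_w, dst_h, SRC_W, SRC_H);
    free(src); free(dst);
    return 0;
}
```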

It's worth an investigation by someone.
 
Some games might only be adjusting the HUD to make sure you can see that stuff while leaving the game rendering alone.
 
Some games might only be adjusting the HUD to make sure you can see that stuff while leaving the game rendering alone.

Yes, and I think the fact that there are several different ways of dealing with it is why it hasn't been much on the radar for performance checks before.
 