HDTVs: Game Mode 2.0... tearing clean-up? What else?

I tried to find a thread where people discussed how to make the best use of an HDTV and its various settings to get the best image quality, and this is the closest thread I found on that particular subject.

Best thing I ever did was enable native mode on my TV (note this is not the same as Game Mode). It lets the TV display a 1920x1080 image completely untouched by any of its post-processing, so all my native 1920x1080 gaming goes through unmolested and looks so much clearer than with native mode off. As a bonus you see more of the game, because the TV no longer applies its roughly 10% overscan; instead the game fills your screen exactly, border to border. I was really amazed at the clarity jump I got from this, so if your TV supports it, definitely give it a try. It's not always obvious which setting it is, because each manufacturer seems to call it something different. On mine I think it was called "Pure mode" or something like that.
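
To put rough numbers on the overscan point, here's a quick back-of-the-envelope sketch (my own illustration; the 10% figure is from the post above, and many sets actually crop less):

```python
# How many pixels a 1920x1080 frame loses when the TV crops `pct` percent
# of the width/height overall and rescales what is left to fill the panel.
def overscan_loss(width: int, height: int, pct: float) -> tuple[int, int]:
    cropped_w = round(width * (1 - pct / 100))
    cropped_h = round(height * (1 - pct / 100))
    return width - cropped_w, height - cropped_h

lost_w, lost_h = overscan_loss(1920, 1080, 10)
print(lost_w, lost_h)  # 192 108 -> nearly 200 columns of the game image are
                       # cut off, and the remainder is rescaled (softened)
```

So besides hiding part of the image, overscan forces a rescale of everything that survives, which is exactly the extra softness a native/1:1 mode avoids.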
 
All I'm interested in is reducing latency to as close to 0 as possible.
Any processing that requires a stored frame is detrimental to that.
 

That's why lag tests are important: some LCDs only add 0-33 ms of delay over a CRT, but a big share of them add 66 ms or more, and on some TVs Game Mode actually has more lag, which is very strange…
And yes, Game Mode on my TV turns all the processing off (though I can turn individual settings back on if I want). I measured 33 ms in the general mode, so hopefully it's less in Game Mode. :)

@Joker: 1:1 pixel mapping is a must for an HDTV used for gaming; the image is much better, like you said.

@Arwin: There's an automatic select mode on my Sony… but it doesn't seem to work with the HDMI input. Maybe it works over TNT (French digital terrestrial TV); I need to test.
 
I recently played Guitar Hero at a friend's house on his "Ambilight" TV... I couldn't hit a SINGLE note... he, however, played rather well. I then went into the calibration menu to sort things out and found that the TV did indeed have >160 ms of lag according to the game... it's quite insane. But it also tells me (since my friend was able to play the game like this) that people can adapt to the situation. It's surely not ideal, by any means... but well, "it works". After I set the correct latency values, I did "win", and he didn't hit a single note.
 
Another thing I want games to detect automatically: when I was playing Zen Pinball 2 yesterday, after a while I noticed in the settings that it was using a 10% border margin. When I set that to 0%, the game actually ran with less lag, so I'm guessing it scales the image, and that the scaling introduces lag. If my TV shows 1920x1080 pixels without overscan, I'd love it if games knew about this automatically and disabled any overscan compensation.
 
That seems to be the case for some Panasonic TVs as well. Feed them a 1080p signal and you are supposed to get 16 ms of input lag, but when I measured a 720p signal with RB3 I got 50 ms.
 
Regarding sharpness... I've seen several TVs (IIRC Samsungs in particular) that blur the image if you go below a certain value in the Sharpness setting (e.g. the value has a range of 0 to 100, where 50 is "off", and anything lower actually blurs the image).

I usually hate most of the "post-processing" stuff TVs do, especially when it can't be disabled. Luckily, my TV lets me switch all of it off. But it IS highly subjective, too: some people prefer the "TV store look", whereas others want pristine output displayed at D65.

Usually, setting Sharpness to high values leads to "halos" around characters etc., which I absolutely hate. So, to each his own, I guess.
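
For intuition on where those halos come from: sharpening is essentially an edge-enhancement convolution, and pushing it hard makes the signal overshoot on both sides of an edge. A minimal sketch (my own illustration, not any TV's actual pipeline):

```python
import numpy as np

# One row of pixels with a dark-to-bright edge (values 50 -> 200).
row = np.array([50, 50, 50, 50, 200, 200, 200, 200], dtype=float)

def sharpen(signal: np.ndarray, amount: float) -> np.ndarray:
    """Add `amount` times a Laplacian edge term, like a Sharpness slider."""
    edges = np.convolve(signal, [-1.0, 2.0, -1.0], mode="same")
    return signal + amount * edges

print(sharpen(row, 0.0))  # unchanged: no halo
print(sharpen(row, 0.5))  # dips to -25 and spikes to 275 around the edge: the
                          # dark/bright "halo" rings (the first and last values
                          # are just zero-padding artifacts)
```
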
That's quite interesting. My HDTV doesn't have an option to disable all post-processing.

I have a Samsung and I know what you mean, because low Sharpness certainly blurs the image.

It doesn't look bad if you love watercolour paint, but most of the finest detail is gone.

E.g. in RDR my Hungarian half-bred loses a lot of detail if Sharpness is set to a low value: the horse goes from his whitish flea-bitten colour to something like a cel-shaded watercolour white, where you can't discern the little spots covering his coat.

Maybe this isn't the best example because of the low quality of the pictures, but my Hungarian half-bred (Snow) :) goes from looking like this:

[image: 1000px-Hungarian_Half-bred.jpg]



...to looking something like this, where the fine details are lost:

[image: 1000px-004.JPG]


Aside from that, in RDR the beautiful distant embankments go from a clay-like, embossed brown with subtle shades of detail and strata to a flat, uniform brown; once again, watercolour-like.

I have also been playing Williams Pinball Classics lately, and on the Funhouse table (one of my favourite pinball machines) some letters, like the P on the playfield above the clock, become a pixelated mess, and others at the top of the playfield become practically indistinguishable.

[image: Funhouse_B-800-by-480-595x357.jpg]

 
Finally, out of curiosity I increased the Sharpness to 100, but I didn't notice any halos.

I want it as an option, always. Some TVs look like $*"(@# with all their post-processing turned off, and if the lag isn't that bad, the choice should lie with the user. Heck, even if the lag IS that bad, the choice should lie with the user!

Game Mode can be enabled on my TV, but I can't tell whether it turns off all the post-processing, because everything looks as if nothing changed.

The most obvious difference I found is that when you enable Game Mode, the default picture setting switches to Standard (Movie and Dynamic aren't available then, and are greyed out), but I can make the image match the IQ of Movie mode, so the difference isn't apparent at first glance.
 
I would LOVE to try that native/Pure mode, but unfortunately my TV doesn't have that setting. I know there is a secret menu, but I don't think it's in there, nor do I want to fiddle with those values, because I wouldn't know what I was doing, and I remember reading somewhere that I could render the TV useless.

I enabled Game Mode, and I can't cycle through aspect modes any more, but I made the image match the settings I had before, even if Game Mode sacrifices some of the post-processing, which I honestly don't care about.

That "secret" menu I was talking about is called picture menu, iirc. Thanks for your suggestion by the way. :)

I solved that overscan problem a long time ago: I send the native resolution to the Samsung panel using Just Scan (it's only available for HDMI inputs on my TV, though).

As Rekator mentioned, Just Scan is a 1:1 picture. What you see is the exact image and resolution the console is sending to your TV.

Just Scan means the picture has no overscan or underscan, so what you are seeing is an edge-to-edge picture without any downscaling or upscaling.

Ever since I "discovered" this I gained a lot of horizontal field of view especially, compared to settings like Wide, 16:9, 4:3, Wide Zoom, etc., and although I didn't check the vertical FOV much, it makes a huge difference in every possible way.

Another dramatic and positive difference on Samsung TVs comes from switching the Color Space setting from Auto to Wide: the colour is a lot more realistic and vivid.

Wide Color Space and Just Scan are the epitome of image quality for me.
 

How were you measuring? The only valid way is to compare against a CRT with no processing, showing the same image.

The TV is only one part of input lag, so the time from button press to visible response is much higher.
For a double-buffered game with synchronous gameplay, the worst-case input lag breaks down as follows:
- You just miss the poll for the input, so you wait 1 frame (on average 1/2) for it to be read.
- 1 frame as you process and render the frame; all you're doing is putting some commands in a buffer.
- If you are GPU bound, you get to wait up to another frame before those commands become an image in the back buffer.
- Another frame to send the complete frame to the TV.
- Whatever TV processing there is comes on top, after the 3-4 frames the game took.

So you start with 32-54 ms of latency in a 60 fps game before the TV gets involved. Worse, if your frame rate is 30 fps, that's 64-90ish ms.
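
To make that arithmetic concrete, here is a minimal sketch of the breakdown above (the frame counts are the assumptions just listed, not measurements):

```python
# Rough worst-case button-to-photon latency for a double-buffered,
# synchronous game loop, per the frame-by-frame breakdown above.
def worst_case_lag_ms(fps: float, gpu_bound: bool, tv_ms: float) -> float:
    frame_ms = 1000.0 / fps
    frames = 1.0       # just missed the input poll: wait up to a full frame
    frames += 1.0      # CPU spends a frame filling the command buffer
    if gpu_bound:
        frames += 1.0  # commands wait up to a frame more before rendering
    frames += 1.0      # one more frame to scan the finished image out to the TV
    return frames * frame_ms + tv_ms

print(worst_case_lag_ms(60, gpu_bound=True, tv_ms=16))   # ~82.7 ms
print(worst_case_lag_ms(30, gpu_bound=False, tv_ms=50))  # 150.0 ms
```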

And trust me, this has a huge impact on the way a game feels; most people, though, never get to A/B the game.
 
Likely he was using Rock Band's own lag-calibration system.
 
Sadly, those wouldn't work unless the game's native resolution is the output resolution.

I'd rather have the AA performed by the native hardware. Also, wouldn't this add a lot of rendering time to the picture?

I mean, AA is a post-process, and I think it would be a big overhead for a simple TV.
 

Generally speaking, on most Samsungs, Sharpness set to 0 is the correct setting to avoid any edge enhancement or unnatural softening. You can use test patterns to find out on your set.

I think the issue here is that Samsung likes to have some kind of noise reduction permanently enabled on their TVs, at least on most of their European models. Film grain and other high-frequency detail get smoothed over as a result, which is a real pain if you can't get around the problem.

Try using Game Mode and then switching to the Standard picture mode (i.e. using the Standard picture mode with Game Mode activated). This works as a fix for most of the 2010-2012 models, although it can create auto-dimming issues with the backlight on some sets, and floating blacks on the plasmas.

The above works a treat on my C580, though, and the APL fluctuations aren't visible by eye either, only when using a meter. Scaling is just as good too, judging by the SMPTE and HQV test patterns.
 
Remember, the Rock Band / Guitar Hero calibration settings cannot resolve the actual problem, which is the lack of immediate visual/audio feedback when you hit a note. You really need a low-latency display for those games, especially if you're playing splitscreen with other people.
 
Well, of course it can... if the game knows how much latency the display has, it can "pre-render" the screen shifted by that latency, so the user sees the correct timings. It doesn't, however, reduce any "user involved" latency at all (i.e. if you hit a wrong note, it'll still show up x ms late).

And since the user has no influence on how the song is played (beyond putting in the notes), it's rather straightforward, too. It won't work for most other games, though.
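
As a sketch of that idea (my own illustration, not how the Rock Band / Guitar Hero calibration is actually implemented; the lag value is made up):

```python
# Rhythm-game video calibration: submit each note to the display early by
# the measured lag, so it reaches the hit line on beat despite a slow TV.
DISPLAY_LAG_MS = 160.0  # hypothetical result of the in-game calibration test

def draw_time_ms(note_hit_time_ms: float) -> float:
    """When to start drawing a note so it *appears* exactly on time."""
    return note_hit_time_ms - DISPLAY_LAG_MS

# A note due at t=5000 ms is drawn at t=4840 ms and shows up on the beat.
# A wrong note you play still appears 160 ms late, though: the calibration
# cannot touch that user-facing feedback lag.
print(draw_time_ms(5000.0))  # 4840.0
```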
 
It doesn't, however, reduce any "user involved" latency at all (i.e. if you hit a wrong note, it'll still show up x ms late).

That's what I mean, and it's important to have that visual feedback when you hit a note.
 
Thanks for the input, KyoDash. I have taken some of the opinions in this thread into consideration and definitely switched to Game Mode, and now I am trying to find the Sharpness setting I am most comfortable with. My Samsung TV is European, as you mention, from 2008; it's relatively small for an HDTV (22"). I have a larger one (46") in the living room, but I always play in my bedroom.
 
Alas, I don't have the same model as you; the C580 seems to be a more modern HDTV than mine, but activating Game Mode with the Standard picture might produce similar results.
 
After some more experimenting, I find Sharpness to be a tricky feature.
 
I jacked it up the other day when playing Red Dead Redemption, and it looked way better than the setting I had been using (Sharpness at 20).

I find that everything looks better, more embossed, but fellow forumer TheWretched mentioned halos and such, which might mean that Sharpness adds noise, and that by adding more sharpness you are also altering the original picture coming from the console in weird ways.
 
That must be especially true in the case of fonts, but games do look a bit crisper to me. Oddly enough, I can't recall whether it was the manual of the TV in my bedroom or the one in the living room, but it recommended not using the Sharpness setting on HDMI inputs, which is the input I use.
 
Yesterday I did a little experiment decreasing Sharpness to 0, and at some point lowering Sharpness seems to soften the image rather than just remove noise.
 
You suggest setting it to 0, TheWretched mentioned that 50 can be the real "off" on some TVs, and the manual says you're not supposed to set it above 0 over HDMI.
 
The problem is what true "0" is versus what the TV *says* "0" is. Finding that out takes some time and effort. Your C580 has neutral sharpness at 0, others seem to have it at 50, and maybe others have it at some other odd number.
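
One way to picture the "what is true 0" problem (purely illustrative; the neutral point and scale are invented, not any real set's behaviour):

```python
# Hypothetical mapping from a TV's Sharpness slider to its real effect:
# below `neutral` the set softens, above it the set sharpens, and only at
# exactly `neutral` does the image pass through untouched.
def effective_sharpening(slider: int, neutral: int = 50) -> float:
    return (slider - neutral) / 50.0  # < 0 softens, 0 neutral, > 0 sharpens

for s in (0, 20, 50, 100):
    print(s, effective_sharpening(s))  # on a set like this, "0" actually blurs
```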
 
I had read a while ago that a low Sharpness level produces the best image quality for HD content.
 
However, on my HDTV it doesn't seem that simple, because when I actually do this, I have the feeling it really makes the picture soft, sometimes even out of focus.

Maybe sharpness was meant for CRT TVs, and doesn't function, or doesn't make sense, on modern displays like LCDs, DLPs, OLEDs, etc.

My experience is that there has to be a value that balances the noise added by sharpening against the crispness it brings. That should be the correct value for Sharpness.
 
So in order to get it *right*, I think the only way is to adjust it until I find something I like.

My interpretation is that on a non-CRT television there is perhaps no such thing as "correct", knowing how complex these TVs are; there is only what looks good to you.

As I said, yesterday I set Sharpness to 0 (Game Mode/Standard this time around). The flea-bitten spots on the Hungarian half-bred in RDR almost didn't vanish this time, unlike the other day, when I felt I wasn't discerning the fine detail well; the contrast did seem to fall off noticeably, though.
 
But yesterday it looked good to me. I could distinguish the aliasing in some places, and I was happy with that, because it is how the image is meant to be displayed: the game doesn't render at 1080p with 4x AA but at 720p with 2x AA, so that's the game's original image quality.
 
Also, my TV's native resolution is 1680x1050 and it is set to Just Scan, so maybe the scaling adds a little noise. Like joker454, I don't like an artificially enhanced picture at all if it adds lag, so being able to discern little flaws here and there was good news for me.

I also switched values: I reset it to 0, then stepped slowly up to 75, using the Dynamic mode default (a mode I like, but its colours are oversaturated and a bit off!).

There was very little increase in detail, but noise started to become clearly apparent in some places in the background.
 
As I dialled back to 0, disabling sharpening, the noise decreased, but I am still trying to find out whether specific details in textures and such stay roughly constant, because I really dislike a blurred image.
 
I am thinking about keeping Sharpness at 0 (I had it at 20 for almost a year), and I hope it will only take me a day to get used to it, but I've noticed the image does look more uniform, and still watercolour-like.
 
So here I am, stuck with my Sharpness dilemma. In short, I switched to Game Mode, Standard picture and Sharpness 0. :) Let's see how it goes.
 
So I have been using an internet search engine and I found this VERY interesting article about how controls like Contrast, Brightness, Tint and Sharpness shouldn't even exist on modern HDTVs in the first place.

They are a thing of the past according to the author:

Controls of a Bygone Era

Even more shocking, today’s digital monitors and HDTVs still have the same basic user controls that were found in the original analog NTSC color TVs from 1953: brightness, contrast, tint, and sharpness. These controls only made sense for analog signals on the old NTSC television system. Brightness controlled the CRT direct-current bias, contrast controlled the video amplifier gain, tint controlled the phase of the color subcarrier, and sharpness performed analog high-frequency peaking to compensate for the limited video bandwidth of the old vacuum tube amplifiers. Today, none of these controls are necessary for digital signals.


Rotary controls for a mid-century analog CRT. Those Contrast and Brightness controls are legit.


Brightness and contrast controls shouldn’t be there because, for digital video, the black level is fixed at level 16, reference white at 235, and peak white at 255. Similarly, tint and phase have no real meaning for digital signals. Finally, the sharpness control isn’t appropriate for digital displays because in a digital image there’s no transmission degradation—the image is received exactly as it appeared at the source. Sharpening the image involves digitally processing the pixels, which leads to artifacts and noise unless it’s done at resolutions much higher than the final displayed resolution, which, of course, isn’t the case inside your monitor or HDTV.
Controls that Do Worse Than Nothing

Most monitor and HDTV user-menu options are actually unnecessary features added for marketing purposes—gimmicks to suggest the display has unique features that other models lack. Even worse, most of these options actually decrease image and picture quality.

In many cases, it’s not even clear what these sham controls really do. The documentation seldom explains them, and I even know engineers from high-level manufacturers who don’t know what the controls do, either. When I test TVs, I spend an inordinate amount of time using test patterns to figure out what the options and selections really do, and in most cases, turning off the fancy options leads to the best picture quality and accuracy.



Digital on-screen controls for a Samsung Syncmaster 242MP—that really have no business being labeled Contrast and Brightness.


The following is a list of useless (or near-useless) menu options and selections from three HDTVs sold by major brands: Black Corrector, Advanced CE, Clear White, Color Space, Live Color, DRC Mode, DRC Palette, Dynamic Contrast, xvYCC, Color Matrix, RGB Dynamic Range, Black Level, Gamma, White Balance, HDMI Black Level, Fresh Contrast, Fresh Color, Flesh Tone, Eye Care, Digital NR, DNIe, Detail Enhancer, Edge Enhancer, Real Cinema, Cine Motion, Film Mode, Blue Only Mode.
Some of the terms sound impressive, but almost all of this is unnecessary puffery and jargon that confuses not only consumers but the pros, as well.

Link to the full article:

http://www.maximumpc.com/article/features/display_myths_shattered
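
For reference, here's a small sketch of the fixed "studio range" levels the article mentions (black pinned at code 16, reference white at 235), and how a display expands them to full range; my own illustration, assuming 8-bit BT.601/709-style video levels:

```python
# Map an 8-bit studio-range ("video level") code to full-range 0-255.
# Because black and reference white sit at fixed codes on a digital link,
# there is nothing left for user "brightness"/"contrast" knobs to correct.
def studio_to_full(code: int) -> float:
    return (code - 16) * 255.0 / (235 - 16)

print(studio_to_full(16))   # 0.0   -> black
print(studio_to_full(235))  # 255.0 -> reference white
```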
 