HDTVs: Game Mode 2.0... tearing clean-up? What else?

Sony's reveal of 3D technology built into its new Bravias that works with PlayStation 3 got me thinking. What other kinds of interfaces with games consoles would we like to see in our HDTVs?

How about v-sync tear correction?

http://www.eurogamer.net/videos/digitalfoundry-uncharted-the-v-sync-difference?size=hd

You'd obviously get additional lag, and on 1VBL games the algorithm would be a lot more difficult to pull off, but it could work...
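For illustration, here's a toy model of what a torn frame actually is and how a TV-side fixer might locate the tear line: scanout starts from the old buffer and switches to the new one mid-frame. This is purely a sketch; frames are reduced to per-scanline content ids, and real detection on pixel data would be far messier.

```python
import numpy as np

HEIGHT = 1080

def scanout_with_flip(old_frame, new_frame, flip_line):
    """Simulate an unsynced buffer flip: scanlines above flip_line were
    already sent from the old buffer, the rest come from the new one."""
    torn = np.empty(HEIGHT, dtype=old_frame.dtype)
    torn[:flip_line] = old_frame[:flip_line]
    torn[flip_line:] = new_frame[flip_line:]
    return torn

def find_tear_line(torn, previous_full_frame):
    """Naive detector: the first scanline that no longer matches the
    last fully displayed frame."""
    changed = np.flatnonzero(torn != previous_full_frame)
    return int(changed[0]) if changed.size else None

old = np.full(HEIGHT, 1)   # frame N, reduced to a per-scanline content id
new = np.full(HEIGHT, 2)   # frame N+1
torn = scanout_with_flip(old, new, flip_line=400)
print(find_tear_line(torn, old))  # -> 400
```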

How about... triple buffering? :LOL:

On topic: actually I'm quite happy with HDMI-CEC. It's cool that the PS3 turns on my HDTV on the correct input, and that the HDTV can turn off the PS3.
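As an aside, that "PS3 turns the TV on" behaviour is CEC's One Touch Play, which at the wire level is just two tiny frames. A minimal sketch (the opcode values and logical addresses are from the CEC spec; the helper function and the example physical address are just for illustration, not a real CEC driver):

```python
TV, PLAYBACK_1, BROADCAST = 0x0, 0x4, 0xF  # CEC logical addresses

IMAGE_VIEW_ON = 0x04   # "wake the TV and switch to me"
ACTIVE_SOURCE = 0x82   # broadcast: "I am now the active source"
STANDBY       = 0x36   # what the TV sends to put devices to sleep

def cec_frame(initiator, destination, opcode, *params):
    """Build a raw CEC frame: a header byte with initiator/destination
    nibbles, then the opcode, then any operands."""
    return bytes([(initiator << 4) | destination, opcode, *params])

# One Touch Play, as a source on HDMI input 2 (physical address 2.0.0.0):
print(cec_frame(PLAYBACK_1, TV, IMAGE_VIEW_ON).hex())                     # "4004"
print(cec_frame(PLAYBACK_1, BROADCAST, ACTIVE_SOURCE, 0x20, 0x00).hex())  # "4f822000"
```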
What I would like to see, however, is auto-calibration: the game could send optimal display settings to the HDTV. Of course, this will never be possible as there are too many HDTVs, but it would be nice.
 
No, game-driven auto-calibration wouldn't work.

Even the same TV model will have different factory tolerances, and thus two sets can vary in their calibration.

Games could adopt the ISF standard, especially those going for realism.
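If games did ship ISF-style calibration aids, the most basic one is a gamma-encoded grayscale ramp for spotting crushed blacks or clipped whites. A minimal sketch, assuming 8-bit studio-swing levels (16-235) and a 2.2 gamma target:

```python
STEPS = 10
LOW, HIGH = 16, 235          # 8-bit studio-swing black and white points

for i in range(STEPS + 1):
    linear = i / STEPS
    encoded = linear ** (1 / 2.2)                 # gamma-2.2 encode
    level = round(LOW + encoded * (HIGH - LOW))   # map to video levels
    print(f"step {i:2d}: level {level}")
```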
 
HDMI CEC doesn't work properly for me. When I turn my PS3 on, it turns the TV on but goes to the wrong input: HDMI1, which is my HTPC. It also doesn't fully turn my AV receiver on; it just leaves it in standby mode so the HDMI signal can pass through the receiver. I also have very limited control with my TV remote (Samsung A650 LCD); I can navigate through the XMB and press enter (which is basically X).
 
Does tearing affect all HDTVs differently? Do some cope better with tearing than others, or, given the same fixed refresh rate (50/60Hz), will there be the same amount of tearing assuming the scene is constant and not dynamic like gameplay? Does the motion technology (100Hz/120Hz) you see on most modern TVs help or worsen the effect?
 
HDMI CEC does work for me: the PS3 switches on the TV, and when I switch off the TV the PS3 switches off, as long as it's idle (i.e. not playing video or games). But it doesn't work the other way around (the PS3 doesn't switch off the TV), though I guess that's by design, so the TV doesn't get switched off while the user is watching something else.

Using my TV remote doesn't work, though (or I guess I have an older TV that doesn't support this), even though it's a Sony.

I don't use my receiver (it only has HDMI passthrough anyway), so I can't say whether that works.
 

Tearing is not affected by the TV, unless maybe you have a 1080p set and a specific game performs better in 720p mode or some such. But that would be a very rare exception.

You shouldn't be using frame interpolation for games as it increases input lag. Though it would be interesting to see what happens when frame interpolation meets screen tearing :devilish:
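As rough arithmetic on why interpolation costs lag: the TV can't synthesize an in-between frame until it has the next real one in hand, so at minimum you wait one source frame, plus whatever the video processor itself takes. The processing figure below is a made-up placeholder; real sets vary widely.

```python
SOURCE_HZ = 60
frame_time_ms = 1000 / SOURCE_HZ   # ~16.7 ms between source frames

buffered_frames = 1    # minimum: the TV must hold frame N until N+1 arrives
processing_ms = 20     # placeholder for the video processor's own latency

added_lag_ms = buffered_frames * frame_time_ms + processing_ms
print(f"extra input lag from interpolation: ~{added_lag_ms:.1f} ms")  # ~36.7 ms
```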
 

Does this HDMI CEC behaviour work only with the PS3 Slim?
 

I've just seen the new 3D technologies at work in a theatre for the first time, in the form of Pixar's Up at a Dolby 3D cinema that uses polarised-glasses technology and an interestingly curved screen. I have to say I was completely blown away by it, and the idea that I could one day have that technology for movies at home, or play Uncharted 2, God of War, Gran Turismo or whatever other game with it, just floors me!
 
Do we need at least double the frame rate for current games to display smoothly in 3D, or would 15fps per eye (for current 30fps games) be smooth enough? If the frame rate has to double and someone wants to make a 1080p 60fps game, does the TV need to accept 1080p at 120Hz, and does the current or a future HDMI standard support that? From the 3D TV info I've read, they do 240Hz (120Hz per eye?): what kind of input (connector) do they use, or is it just interpolation like current 100/120Hz TVs?
 
A "30fps" game is going to need to output 60 frames, 30 for the right eye, 30 for the left eye, to reach a "fluid" 30fps. Most of these offset the rendering for each eye, thus the resulting 3D look.

If Next Gen goes this route GPUs will need a nice bump (maybe 2x GPUs?) Of course if they go this route you wonder what type of fall back they will do for non-3D displays? 60fps for non-3D, 30fps for 3D? Assuming GPU limited...
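To put numbers on that, here is a sketch of the per-eye budget, plus the usual way the camera gets offset per eye. The 65mm interaxial value is just an assumed human-ish default, not anything from a real engine:

```python
def per_eye_budget(game_fps, eyes=2):
    """Renders per second and the time budget for each rendered view."""
    renders = game_fps * eyes
    return renders, 1000 / renders

def eye_positions(cam_pos, right_vec, interaxial=0.065):  # 65 mm: assumed default
    """Offset the camera half the interaxial distance along its right vector."""
    half = interaxial / 2
    left  = tuple(p - half * r for p, r in zip(cam_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(cam_pos, right_vec))
    return left, right

renders, budget_ms = per_eye_budget(30)
print(f"{renders} rendered views/s, {budget_ms:.1f} ms per view")  # 60, ~16.7 ms
print(eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0)))
```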
 
What happens if there are frame drops? Does the 3D effect go away? That would be a bit jarring I'd think.
 
I tried to find a thread where people talked about how to make the best use of an HDTV and tweak its settings for the best image quality, and this is the closest thread I found on that particular subject.

I remember similar threads on the forum but I couldn't find them, so if this isn't the right place for this post, I wouldn't mind if the moderators moved it to a different one. :I

A while ago I followed the guidelines of an article published on a specialised site about how to calibrate your HDTV, and I set the values of every setting as suggested, with some personal touches. :smile:

I've always been a huge advocate of sharp image quality, and there was one setting in particular I wasn't happy about, because the article recommended setting the Sharpness to 20 or less. I did that and the image looked blurry to me, losing some fine detail.

This was about a year ago, and I have barely fiddled with the HDTV settings since then.

Deep down I regarded that particular Sharpness value with distrust, so lately, while playing Red Dead Redemption, I suddenly felt like increasing the Sharpness to a value I was more comfortable with.
 
I had read people on this forum saying that after the Mexico part the game looks incredible in general, and I have to admit they are spot on.

It looked amazing before too, but after increasing the Sharpness from 20 to 75 or more, RDR went from gorgeous to absolutely jaw-dropping. It looks simply stunning now; I was awestruck.

I think the step up from the previous Sharpness setting to the current value is a HUGE jump, and I can only say that I didn't expect such a specific, apparently simple setting to have such a dramatic effect on image quality.

It's like the image has a lot more "relief" and emboss, as if it comes to life.

Aside from that, RDR is a very clean game with very crisp IQ, and it looks great despite the fact that more Sharpness brings the flaws out more. It all looks better and more comfortable on the eyes to me anyway. The text looks a bit more aliased, but I like it that way.

My point in posting this message is that the image quality on my HDTV has improved drastically after this change. Has the same happened to you? I wonder...

Don't underestimate such a simple but tremendously useful setting; it's more comfortable for your eyes, and the image actually gets much sharper and more lush. :cool:
 
Regarding sharpness... I've seen several TVs (IIRC Samsungs in particular) that blur the image if you go below a certain value in the Sharpness setting (i.e. the value has a range of 0 to 100 where 50 is "off", and anything lower actually blurs the image).

I usually hate most of the "post-processing" stuff TVs do, especially if it can't be disabled. Luckily, my TV lets me switch all of it off. But it IS highly subjective, too: some people prefer the "TV store look", whereas others want pristine output displayed at D65.

Usually, setting "sharpness" to high values leads to "halos" around characters etc., which I absolutely hate. So, to each his own, I guess.
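Those halos are classic unsharp-mask overshoot. TV "sharpness" is usually some flavour of unsharp masking: add back a scaled difference between the image and a blurred copy of it. A rough 1-D sketch of a hard edge, with the two amounts loosely standing in for "20" vs "75+" on a 0-100 dial:

```python
import numpy as np

edge = np.array([0.0] * 8 + [1.0] * 8)                 # a hard black-to-white edge
blurred = np.convolve(edge, np.ones(3) / 3, mode="same")

for amount in (0.2, 1.5):                               # low vs high "sharpness"
    sharpened = edge + amount * (edge - blurred)
    print(f"amount={amount}: min={sharpened.min():.2f}, max={sharpened.max():.2f}")
# the high setting overshoots past black (min < 0) and white (max > 1)
# right next to the edge; that overshoot is the visible halo
```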
 
What happens if there are frame drops? The same as with standard 2D and double buffering: you see the same frame until the new one is ready. That's also why both eyes receive an image from the same point in time (60 rendered images, 30fps game logic).
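The "same frame until the new one is ready" behaviour is also why v-synced, double-buffered frame rates quantise to 60/n. As quick arithmetic:

```python
import math

REFRESH_HZ = 60
vblank_ms = 1000 / REFRESH_HZ   # ~16.7 ms per refresh

for render_ms in (10, 20, 40):
    # with v-sync + double buffering, a late frame waits for the next vblank
    vblanks = math.ceil(render_ms / vblank_ms)
    print(f"{render_ms} ms render -> {REFRESH_HZ / vblanks:.0f} fps")
# 10 ms -> 60 fps, 20 ms -> 30 fps, 40 ms -> 20 fps
```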
 
Carmack, in his keynote, suggested that there should be an HDMI CEC code that could put TVs into game mode (minimal-lag mode) automatically whenever a game that needs low latency (like an FPS) starts.
 
I want it as an option, always. Some TVs look like $*"(@# when all their post processing is turned off, and if the lag isn't that bad, the choice should lie with the user. Heck, even if the lag is that bad, the choice should lie with the user!
 

Well, in that case I'd say make it an option in the game that's on by default but that the user can turn off.

Or perhaps even better: add an HDMI CEC 'game mode' signal (on by default, but configurable in-game) and let TVs allow users to configure how the TV should behave in 'game mode'.
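There's no such message in the CEC spec today, but it could in principle ride on the existing <Vendor Command> opcode. A purely hypothetical sketch: only the 0x89 opcode and the logical addresses are real; the sub-opcode and payload are invented for illustration.

```python
PLAYBACK_1, TV = 0x4, 0x0       # CEC logical addresses (real)
VENDOR_COMMAND = 0x89           # real CEC opcode; its payload is vendor-defined
GAME_MODE_REQUEST = 0x01        # hypothetical vendor sub-opcode
ON, OFF = 0x01, 0x00

def request_game_mode(enable):
    """Build a hypothetical 'please enter/leave game mode' CEC frame."""
    header = (PLAYBACK_1 << 4) | TV
    return bytes([header, VENDOR_COMMAND, GAME_MODE_REQUEST, ON if enable else OFF])

print(request_game_mode(True).hex())   # "40890101"
```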
 