120hz technology in future consoles?

specwarGP2

Regular
I'm curious, has anyone seen these 120hz TVs with motion enhancers in action? They take games that run at 30fps and interpolate frames to pump it up to 60 or 120fps. I have to say that it looks incredibly smooth.

Is there some reason this type of technology won't be used in future consoles? The processing required must not be that intensive... it is in a TV after all.
 
There's a problem with that: lag.

Let's say you have frame1 and frame2 generated by the console. If your TV (or whatever) is going to interpolate a frame between frame1 and frame2, let's call it iframeA.

So that means the game has already generated frame1 and frame2 while your TV interpolates between them and displays iframeA. As you can see, the game is a frame or two ahead of what's on screen.
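To put rough numbers on it (just a sketch with assumed values, not measurements from any actual set): at a 30fps source, buffering that extra frame plus the TV's own processing already adds something like a frame and a half of delay.

```cpp
// Rough sketch of the added lag (assumed numbers, not measured from a real TV):
// the in-between frame can't be built until the *next* real frame has arrived,
// so everything on screen is delayed by at least one source frame interval.
#include <cstdio>

int main() {
    const double sourceFps   = 30.0;                // game output (assumption)
    const double frameTimeMs = 1000.0 / sourceFps;  // ~33.3 ms between real frames
    const double tvProcessMs = 20.0;                // assumed TV processing overhead

    // To show iframeA between frame1 and frame2, the TV must buffer frame2 first.
    const double addedLagMs = frameTimeMs + tvProcessMs;

    std::printf("Added display lag: at least %.1f ms (~%.1f source frames)\n",
                addedLagMs, addedLagMs / frameTimeMs);
    return 0;
}
```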
 
I can't wait to try that feature out on my X360 and PS3 when I get my Samsung. It would be a good workaround for all the games with struggling framerates. Hopefully the lag isn't that noticeable; I don't play fighting games where frame-accurate moves are important. Of course, hopefully next-gen consoles will get a handle on all the framerate problems and won't have to disable v-sync to make the damn game playable. This is ridiculous.
 
There's a problem with that: lag.

Let's say you have frame1 and frame2 generated by the console. If your TV (or whatever) is going to interpolate a frame between frame1 and frame2, let's call it iframeA.

So that means the game has already generated frame1 and frame2 while your TV interpolates between them and displays iframeA. As you can see, the game is a frame or two ahead of what's on screen.

Right, that's where the problem is currently, because the TV has to do the interpolating. Wouldn't the lag be eliminated if the console did the interpolating?
 
I'm fairly sure that the HDMI interface doesn't support 120Hz, so any sort of interpolation could only be done up to 60Hz. I'd be interested to know what algorithms are used for the interpolation if it's that good - personally I've not seen it.
 
HDMI spec can go pretty high: http://www.hdmi.org/press/pr/pr_20060622.aspx

"Higher speed: HDMI 1.3 increases its single-link bandwidth from 165MHz (4.95 gigabits per second) to 340 MHz (10.2 Gbps) to support the demands of future high definition display devices, such as higher resolutions, Deep Color™ and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds."

Personally, I loathe frame interpolation in sets, and if it makes its way into games, I'd like an option to turn it off.
 
With my Bravia TV, I've got to turn game mode on. If not, games can appear choppy or alias a lot.

I suspect this 120 Hz stuff is the problem, and maybe some other post-processing stuff, but I am not sure.

Does the screen actually refresh at 120 Hz? Because I can't get 120 Hz with a PC connected to it. I can get 85 Hz with a VGA connection, but over HDMI only 60 Hz. Do I need to install a TV driver or something?
 
I switched off the interpolation tech in my TV. Sure, it looked good when it worked, but it was inconsistent, visibly switching on and off, and produced nasty artifacts with parallax (e.g. when the camera pans past a close chain-link fence).

I'd be interested to see a game implement a similar effect. I've wondered about this in the past: instead of using rendered motion vectors in a motion blur shader, why not render the frame in tiles over X displayed frames? Then, instead of motion blur, apply a similar parallax-like effect when displaying the completed frame. So if you have three tiles to fill the screen, you render one tile per frame off-screen while drawing the last generated frame full screen with parallax.
You would be a frame or three behind, but the jump in framerate could somewhat compensate. With an engine that already tiles on the Xbox, I can't see it being an enormous challenge.
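Something like this scheduling loop is what I have in mind (purely a structural sketch with made-up function names, no actual rendering):

```cpp
// Structural sketch only: render one of N screen tiles per displayed frame into a
// back buffer, and present the last *completed* frame each time, shifted by a
// reprojection/parallax step. Function names here are placeholders.
#include <cstdio>

const int kTiles = 3;  // assumed tile count: a full frame completes every 3 displays

void renderTileOffscreen(int tile, int frame) {
    std::printf("  render tile %d of frame %d off-screen\n", tile, frame);
}

void presentWithReprojection(int completedFrame) {
    std::printf("  present completed frame %d with camera-based shift\n", completedFrame);
}

int main() {
    int inProgressFrame = 1;
    int completedFrame  = 0;  // whatever finished last

    for (int display = 0; display < 9; ++display) {
        const int tile = display % kTiles;
        renderTileOffscreen(tile, inProgressFrame);
        presentWithReprojection(completedFrame);
        if (tile == kTiles - 1) {   // frame finished: swap roles
            completedFrame = inProgressFrame;
            ++inProgressFrame;
        }
    }
    return 0;
}
```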

Of course it may look terrible :)
 
We're already scaling the heck out of our horizontal and vertical resolutions to blanket discrepancies between source and native panel resolution; let's not bloat our motion while we're at it. Please. To me it's like blurring your eyes to call the world a soft place.
 
TV frame interpolation just adds more potential visual lag. It'll probably get worse now that 240Hz TVs are already appearing. It works for film content because it's passive (i.e. the user isn't waiting for his/her input to be displayed) and film is shot at a relatively low framerate that doesn't sync well with a standard screen refresh, which leads to jerkiness and occasional stroboscopic effects. Secondly, assuming this was something even desirable, why waste precious processing resources on your console when the TV is available to offload that for you?

V3 said:
Does the screen actually refresh at 120 Hz? Because I can't get 120 Hz with a PC connected to it. I can get 85 Hz with a VGA connection, but over HDMI only 60 Hz. Do I need to install a TV driver or something?

Yes, however they don't sync at 120Hz. They typically TMDS-sync at 24Hz, 30Hz and 60Hz and interpolate the signal up to 120Hz. The reason 120Hz (and now 240Hz) was chosen is simply that it's a multiple of those frequencies (5×, 4× and 2× respectively; 10×, 8× and 4× at 240Hz), which makes for easy interpolation.
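The arithmetic is trivial, but spelled out (nothing TV-specific here, just the multiples):

```cpp
// Why 120Hz (and 240Hz) panels: every common source rate divides in evenly,
// so each source frame maps to a whole number of panel refreshes.
#include <cstdio>

int main() {
    const int panels[]  = {120, 240};
    const int sources[] = {24, 30, 60};
    for (int panel : panels) {
        for (int src : sources) {
            std::printf("%d Hz panel, %d fps source: %d refreshes per frame\n",
                        panel, src, panel / src);
        }
    }
    return 0;
}
```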
 
I switched off the interpolation tech in my TV. Sure, it looked good when it worked, but it was inconsistent, visibly switching on and off, and produced nasty artifacts with parallax (e.g. when the camera pans past a close chain-link fence).

Isn't the interpolation switching off due to the inconsistent source framerate? I agree that there is some nasty artifacting going on and interpolation can't be used all the time, but when it's working properly it looks incredibly smooth.

Secondly, assuming this was something even desirable, why waste precious processing resources on your console when the TV is available to offload that for you?

Well, if you moved the interpolation from the TV to the console, you could figure out a way to eliminate the lag, right? Also, how resource-intensive can interpolation be if a TV can do it? It seems like a "cheap" way to double the framerate.
 
I would much rather see console manufacturers make it a requirement for games to run at a solid 60 fps than have to rely on any type of interpolation from your TV. Devs would cry about the headaches it would initially cause them, but if it were a requirement they would stop crying once they realized that, for the game to be released, it would need a stable framerate.
 
Right, that's where the problem is currently, because the TV has to do the interpolating. Wouldn't the lag be eliminated if the console did the interpolating?

If it's on the console, I could imagine it being used for cutscenes, or anywhere the interaction doesn't require a quick response. Yeah, so it's possible... but is it worth it? I guess it depends on how much it would cost, given that the same silicon could instead be applied to making the game run faster in general.
 
I can't wait to try that feature out on my X360 and PS3 when I get my Samsung. It would be a good workaround for all the games with struggling framerates.

A 120Hz TV doing interpolation will not help you at all with games that already struggle with framerates. It may look a little smoother, but the results of your input (what you do with your gamepad) will still be displayed at the same piss-poor framerate the game runs at.

Of course, hopefully next-gen consoles will get a handle on all the framerate problems and won't have to disable v-sync to make the damn game playable. This is ridiculous.

Um... it has nothing to do with the power of the console; as long as there is a strong focus on graphics, framerate problems are bound to happen.
 
A 120Hz TV doing interpolation will not help you at all with games that already struggle with framerates. It may look a little smoother, but the results of your input (what you do with your gamepad) will still be displayed at the same piss-poor framerate the game runs at.

Which the vast majority of gamers wouldn't even notice. Most LCD TVs out there don't have a game mode, and have extreme lag.
 
I thought the main objective of 120Hz was eliminating interpolation/pulldown.

movies at 24 fps display each frame 5 times in a row
TV/games at 30 fps display each frame 4 times in a row
TV/games at 60 fps display each frame 2 times in a row

lowest common multiple
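Quick check that 120 really is the lowest rate all three divide into evenly (nothing fancy, just the math):

```cpp
// 120 is the least common multiple of 24, 30 and 60, i.e. the lowest refresh
// rate at which each source frame can be shown a whole number of times.
#include <cstdio>
#include <numeric>  // std::lcm (C++17)

int main() {
    const int rate = std::lcm(std::lcm(24, 30), 60);
    std::printf("lcm(24, 30, 60) = %d Hz\n", rate);
    std::printf("repeats: 24fps -> %d, 30fps -> %d, 60fps -> %d\n",
                rate / 24, rate / 30, rate / 60);
    return 0;
}
```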
 
Isn't the interpolation switching off due to the inconsistent source framerate? I agree that there is some nasty artifacting going on and interpolation can't be used all the time, but when it's working properly it looks incredibly smooth.

I didn't clarify: I never used it for games. I turned it off for TV too.
 
I thought the main objective of 120Hz was eliminating interpolation/pulldown.

movies at 24 fps display each frame 5 times in a row
TV/games at 30 fps display each frame 4 times in a row
TV/games at 60 fps display each frame 2 times in a row

lowest common multiple

120Hz eliminates judder for 24fps movies, but used with interpolation it can also noticeably reduce the blur caused by the sample-and-hold effect.
 
specwarGP2 said:
Wouldn't the lag be eliminated if the console did the interpolating?
No, because you inevitably have to run rendering at least one additional frame behind for this to work. I'm not sure if it's as bad as the TV-added lag, but you will always add some latency by doing this.

grandmaster said:
I'm fairly sure that the HDMI interface doesn't support 120Hz, so any sort of interpolation could only be done up to 60Hz.
Hence why it's done by the TVs, not the consoles? :p
Anyway, as far as console-side processing goes, interpolation for rendering is becoming pretty common (e.g. run the game simulation at 30 or whatever, but render at 60 with interpolation). Of course that doesn't help if you're primarily rendering-limited, but there are plenty of games that aren't.
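For what it's worth, the usual shape of that (a generic sketch, not any particular engine's code) is a fixed simulation step with state interpolation at render time:

```cpp
// Generic sketch: simulate at a fixed 30Hz, render at 60Hz by blending between
// the previous and current simulation states. Names are illustrative only.
#include <cstdio>

struct State { double x; };  // stand-in for the interpolatable part of game state

State lerp(const State& a, const State& b, double t) {
    return { a.x + (b.x - a.x) * t };
}

int main() {
    const double simDt    = 1.0 / 30.0;  // fixed simulation step
    const double renderDt = 1.0 / 60.0;  // display refresh

    State prev{0.0}, curr{0.0};
    double accumulator = 0.0;
    const double velocity = 10.0;        // units per second, arbitrary

    for (int frame = 0; frame < 8; ++frame) {      // 8 displayed frames
        accumulator += renderDt;
        while (accumulator >= simDt) {             // advance simulation when due
            prev = curr;
            curr.x += velocity * simDt;
            accumulator -= simDt;
        }
        const double alpha = accumulator / simDt;  // fraction between sim states
        const State drawn = lerp(prev, curr, alpha);
        std::printf("display %d: x = %.3f\n", frame, drawn.x);
    }
    return 0;
}
```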
 