anthonyredou
I think yes, it was at 60.
Considering TVs have a 60 Hz interlaced refresh, there won't be any difference. You'd need some sort of motion-blur tech to see any.
How can Sony do interpolation? It would have to know the contents of a future frame, and since a TV displays each frame as soon as it's transmitted, it couldn't know.
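The usual answer is that the set doesn't need the future, it just delays the present: it buffers incoming frames and displays one frame behind the signal, so the "next" frame is already in memory when it synthesizes the in-between one (this is also why these modes add input lag). A minimal sketch of that idea, with made-up frame data and a simple linear blend standing in for whatever motion estimation a real TV does:

```python
# Hypothetical sketch: a TV buffers one frame of latency so the "future"
# frame is available, then inserts a blended midpoint frame between each
# pair, doubling the effective rate (e.g. 60 Hz in -> 120 Hz out).

def blend(frame_a, frame_b, t):
    """Linear blend of two frames (flat lists of pixel values), t in [0, 1].
    A real set would do motion-compensated interpolation, not a crossfade."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

def interpolate_stream(frames):
    """Emit each original frame plus a midpoint frame before its successor.
    Output necessarily lags the input by one frame: that's the buffer."""
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.append(prev)
        out.append(blend(prev, nxt, 0.5))
    out.append(frames[-1])
    return out

source = [[0, 0], [10, 20], [20, 40]]   # three tiny 2-pixel "frames"
doubled = interpolate_stream(source)
print(doubled)   # 5 frames: originals with midpoints in between
```

So the trick isn't clairvoyance, it's buffering: the picture you see is always slightly behind the picture being received.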
15 fps is really too low for most types of game, though. It doesn't start to get decent until around 25-30 fps. I want a stable frame rate: stable 15 fps, stable 30 fps, stable 800 fps.
I'd rather have it stable at 15 fps than a frame rate that's all over the place but averages 60 fps. Those drops to 1 fps followed by jumps to 100 fps really screw up the game.
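There's a simple arithmetic reason behind this preference: average fps is computed over total time, so it completely hides the frame-time spikes the player actually feels. A rough illustration with invented frame-time numbers (not from the thread):

```python
# Why "averages 60 fps" can feel worse than a locked lower rate:
# the average hides worst-case frame times, which are what you perceive.

smooth = [1 / 30] * 60                  # locked 30 fps: every frame ~33.3 ms
spiky = [1 / 100] * 58 + [0.5, 0.5]     # mostly 100 fps, with two 500 ms hitches

def avg_fps(frame_times):
    """Frames rendered divided by total elapsed seconds."""
    return len(frame_times) / sum(frame_times)

def worst_frame_ms(frame_times):
    """The single longest frame, in milliseconds: the hitch you notice."""
    return max(frame_times) * 1000

print(avg_fps(smooth), worst_frame_ms(smooth))   # ~30 fps, ~33 ms worst frame
print(avg_fps(spiky), worst_frame_ms(spiky))     # higher average, 500 ms worst frame
```

The spiky run wins on average fps yet stalls for half a second twice, which is exactly the "all over the place" feel being complained about.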
Send them down to the local big-screen TV store to see the Sony TV demoing 60 Hz vs. 120 Hz side by side.
That's just interpolated data, and it looks considerably better.
I think you can try Quake 3 or other older games on a CRT monitor capable of hitting high refresh rates. Just play it at different rates.
In the demo I saw it was split screen with movie clips showing. They didn't seem particularly designed to show off anything like you suggest. I watched some 120 Hz LCDs and regular ones, and there wasn't any real difference I could see, Russ.
I think it depends on the media they're playing. If they did something on purpose to show it off (like the spinning-room kind of thing), then maybe, but the regular media didn't show anything I could tell.
In the demo I saw it was split screen with movie clips showing. They didn't seem particularly designed to show off anything like you suggest.
The one clip I remember was the "Night at the Museum" scene where Ben Stiller is getting attacked by the cowboys and Romans.
The difference between normal 60 Hz and 120 Hz was pretty striking in split screen. I'm sure it would be less noticeable on two sets side by side. I'm sure it also helps get rid of the judder from the 3:2 pulldown of film.
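For anyone unfamiliar with the pulldown point: 24 fps film doesn't divide evenly into 60 Hz video, so the standard telecine trick holds film frames for 3 fields, then 2, alternately. A small sketch of that cadence:

```python
# 3:2 pulldown sketch: stretch 24 film frames/s onto 60 video fields/s
# by alternately repeating each frame for 3 fields, then 2.

def pulldown_32(film_frames):
    """Map 24 fps film frames onto 60 Hz fields using the 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeats)
    return fields

film = list(range(24))        # one second of film, frames numbered 0-23
fields = pulldown_32(film)
print(len(fields))            # 60 fields = one second of 60 Hz video
print(fields[:10])            # uneven hold pattern: 3 fields, then 2, ...
```

Because frames are held for unequal times, motion stutters slightly (judder). 120 Hz is an even multiple of 24, so each film frame can be shown for exactly five refreshes instead, which is one reason the 120 Hz demo can look smoother on film content.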