30fps vs 60fps - is there a tech demo out there to persuade the ignorant?

Considering TVs have a 60 fps interlaced refresh, there won't be any difference. You'd need some sort of motion blur tech to see any.

The drive for 120 Hz televisions is far smoother display of both NTSC (60 Hz) and standard movie (24 Hz) frame rates. On a standard 60 Hz set, playing 24 Hz media requires 3:2 pulldown: film frames are alternately held for three refreshes, then two, so you get weird motion artifacts (judder) when smoothly panning through a scene.

I don't believe the push for 120 Hz was really driven by anything else.
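
To make that cadence concrete, here's a quick Python sketch of how 24 fps film maps onto 60 Hz refreshes under 3:2 pulldown (frame counts are just for illustration):

    # 3:2 pulldown: film frames alternate between being held for 3
    # refreshes and 2 refreshes (3 + 2 = 5 refreshes per 2 frames,
    # so 24 film frames fill exactly 60 refreshes).
    def pulldown_32(num_film_frames):
        """Return the film frame shown at each 60 Hz refresh."""
        refreshes = []
        for frame in range(num_film_frames):
            repeats = 3 if frame % 2 == 0 else 2
            refreshes.extend([frame] * repeats)
        return refreshes

    print(pulldown_32(4))
    # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] -- frame 0 lingers for an extra
    # refresh, frame 1 doesn't, and that uneven cadence is the judder
    # you see in slow pans.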
 
The point I made about 120 vs. 60 was to show him that "yes, Virginia, you can perceive the difference between the two refresh rates" using a simple demo.

(And yes, the Sony stuff does interpolation to 'fill in' the new frames.)
 
How can Sony do interpolation? It would have to know the contents of a future frame, and since the picture is displayed as soon as each frame is transmitted, it couldn't know, could it?
 
I want a stable frame rate: stable 15 fps, stable 30 fps, stable 800 fps.

I'd rather have it stable at 15 fps than a frame rate that's all over the place but averages 60 fps. Those dips to 1 fps that then jump to 100 fps really screw up the game.
 
How can Sony do interpolation? It would have to know the contents of a future frame, and since the picture is displayed as soon as each frame is transmitted, it couldn't know, could it?

Why would modern TVs necessarily display a frame as soon as it's received? They can buffer a frame (that's part of the input lag these sets add) and interpolate between the buffered frame and the one that just arrived.
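
A minimal sketch of that buffering idea in Python (frames as NumPy arrays; a real set does motion estimation rather than this naive 50/50 blend, but the latency trick is the same):

    import numpy as np

    def interpolating_display(incoming_frames):
        """Yield frames at double rate, at the cost of one frame of lag."""
        previous = None
        for current in incoming_frames:
            if previous is None:
                yield current                     # nothing to blend with yet
            else:
                yield (previous + current) / 2.0  # synthesized tween frame
                yield current                     # the real frame, delayed
            previous = current

    # Three "frames" whose pixels are all 0.0, 1.0, 2.0:
    frames = [np.full((2, 2), v) for v in (0.0, 1.0, 2.0)]
    print([out[0, 0] for out in interpolating_display(frames)])
    # [0.0, 0.5, 1.0, 1.5, 2.0] -- tweens appear between the real frames.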
 
I want a stable frame rate: stable 15 fps, stable 30 fps, stable 800 fps.

I'd rather have it stable at 15 fps than a frame rate that's all over the place but averages 60 fps. Those dips to 1 fps that then jump to 100 fps really screw up the game.
15 fps is really too low for most types of game, though. It doesn't start to get decent until around 25-30 fps.
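
To put a number on that complaint, here's a quick Python sketch with made-up frame times: two runs that both average about 60 fps, but one of them hitches hard:

    # Frame times in milliseconds; both runs cover 60 frames in ~1 second.
    steady = [16.7] * 60                  # locked at ~60 fps
    spiky = [10.0] * 50 + [50.0] * 10     # mostly fast, with 50 ms hitches

    for name, times in (("steady", steady), ("spiky", spiky)):
        avg_fps = 1000.0 * len(times) / sum(times)
        print(f"{name}: {avg_fps:.0f} fps average, worst frame {max(times):.0f} ms")
    # Both report ~60 fps average, but the spiky run stalls for 50 ms
    # at a time -- that's what "screws up the game".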
 
Send them down to the local big-screen TV store to see the Sony TV demoing 60 vs. 120 side by side.

That's just interpolated data, and it looks considerably better.

I watched some 120 Hz LCDs and regular ones, and there wasn't any real difference I could see, Russ.

I think it depends on the media they're playing. If they did something on purpose to show it off (like the spinning-room kind of thing), then maybe, but the regular media didn't show anything I could tell apart.
 
I think you can try Quake 3 or other older games with a CRT monitor capable of hitting high refresh rates. Just play it at different rates.


or let it vary with the com_maxfps console variable (you can use OpenArena, which is a free-software Quake 3 clone)
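
Something like this in the in-game console should do it (cvar names from memory, so double-check them in your build):

    cg_drawFPS 1      // show the frame counter
    com_maxfps 30     // cap rendering at 30 fps
    com_maxfps 60     // then compare at 60
    com_maxfps 125    // and at the classic 125 cap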
 
I watched some 120 Hz LCDs and regular ones, and there wasn't any real difference I could see, Russ.

I think it depends on the media they're playing. If they did something on purpose to show it off (like the spinning-room kind of thing), then maybe, but the regular media didn't show anything I could tell apart.
In the demo I saw, it was split screen with movie clips playing. They didn't seem particularly designed to show anything off like you suggest.

The one clip I remember was the "Night at the Museum" scene where Ben Stiller is getting attacked by the cowboys and Romans.

The difference between normal 60 and 120 was pretty striking in split screen. I'm sure it would be less noticeable on two sets side by side. I'm sure it also helps get rid of the jitters of film's 3:2 pulldown.
 
In the demo I saw, it was split screen with movie clips playing. They didn't seem particularly designed to show anything off like you suggest.

The one clip I remember was the "Night at the Museum" scene where Ben Stiller is getting attacked by the cowboys and Romans.

The difference between normal 60 and 120 was pretty striking in split screen. I'm sure it would be less noticeable on two sets side by side. I'm sure it also helps get rid of the jitters of film's 3:2 pulldown.

I know about the jitters bit, but these were just 1080i over-the-air signals.

I would be wary of demos myself, and I'd prefer they show it on separate TVs. Split screen can make one look weird when it would look fine otherwise.

I have just seen a lot of bogus demos, image stabilization for example. Panasonic has this demo mode where, instead of turning off image stabilization, they just blur the image... kudos to them. I figured it out because I set the camera on a table with the timer, so there was no shaking whatsoever, and it wasn't even that dark. The image was just as blurry as when I held it all shaky. Totally bogus.

BTW, that isn't to say they did such a nefarious thing, just that I'm a bit skeptical of these demos. I actually wonder why gaming LCDs aren't already 120 Hz, though.
 
Here is an ancient demo (DOS, but it has worked on every XP machine I've tried) that shows a strange side effect of running at half or quarter refresh rate:

http://www.marky.com/files/dos/motion.zip

In the demo, what looks like monitor ghosting or crappy motion blur on the low-fps boxes is actually all in your head. Your cave-brain is not built to watch things flashMove-flashStop-flashMove-flashStop. Somehow it interprets double-refreshes of a moving object as two moving objects.

This tripped me up when I was making a software rasterizer long ago. "How did that crappy motion blur get in there? I sure didn't write it, and I'm writing the friggin' triangle rasterizer!" It was because my demo ran at 30 fps.
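
If the DOS demo won't run for you, here's a rough equivalent sketched in Python with pygame (assuming a 60 Hz display): two boxes move at the same speed, but the bottom one only updates every other refresh, like a 30 fps game. Watch it appear to split in two.

    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 240))
    clock = pygame.time.Clock()

    x_fast = x_slow = 0.0
    frame = 0
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        x_fast = (x_fast + 4) % 640       # moves 4 px every refresh
        if frame % 2 == 0:
            x_slow = (x_slow + 8) % 640   # moves 8 px every other refresh

        screen.fill((0, 0, 0))
        pygame.draw.rect(screen, (255, 255, 255), (int(x_fast), 60, 40, 40))
        pygame.draw.rect(screen, (255, 255, 255), (int(x_slow), 140, 40, 40))
        pygame.display.flip()
        clock.tick(60)                    # lock the loop to ~60 Hz
        frame += 1

    pygame.quit()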
 
So that explains why the motion blur from 30 fps doesn't show up in screenshots. I suspected it was the human element rather than the display, but it's nice to have confirmation.
 