london-boy said:
Makes two of us then.
EDIT: And more links, since I don't like talking out of my ass. I still can't find a single source anywhere regarding 50fps broadcasts in Europe, or 60fps in the rest of the world. Everything says 50Hz/25fps and 60Hz/30fps. And I'm a pretty good Googler.
http://www.google.com/search?sourceid=gmail&q=pal%20frame%20rate
I'm not too surprised you have trouble finding solid information out there - accurate material is thin on the ground, while less accurate stuff is plentiful. For some reason, even TV engineers themselves, or other "professionals" working with this stuff, can't seem to get their heads around the concepts.
Everything *used* to be at 25 or 30 fps because it was all shot on film and the TV systems were designed to show stuff that only really needed to update at 24fps but with as much resolution as possible. The most complex "algorithm" used was to either repeat fields or speed everything up.
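To make the field-repeat trick concrete, here's a toy Python sketch of the two approaches (the function name and frame labels are mine, not anything standardised): NTSC-land holds alternate film frames for 3 fields then 2 fields ("3:2 pulldown"), while PAL-land simply runs the film 4% fast at 25fps and shows each frame for 2 fields.

```python
# Toy sketch: fitting 24fps film onto interlaced TV.
# NTSC (~60 fields/s): 3:2 pulldown - alternate frames held for 3 then 2 fields.
# PAL (50 fields/s): speed the film up to 25fps, 2 fields per frame.

def pulldown_32(frames):
    """Expand a list of film frames into an NTSC field sequence via 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # 3 fields, then 2, alternating
        fields.extend([frame] * repeats)
    return fields

film = ["A", "B", "C", "D"]    # 4 film frames...
print(pulldown_32(film))       # ...become 10 fields (so 24 frames -> 60 fields)
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Note that either way, no field ever shows a moment that a neighbouring field doesn't - the motion still only updates 24 or 25 times a second.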
As far as some experts are concerned, nothing has progressed since then. Digital video apparently never happened...
It's the same as this persistent fallacy that the human eye can't see 50Hz movement. People just keep trotting out the same old tired myths even when the truth is quite literally staring us in the face.
The real situation is far more complicated.
Given how many stages the picture goes through between "something happening in the real world" and "light hits your eye", it's no surprise that quite a lot of those stages mangle the picture in interesting ways, even assuming you have a 50/60 image to start with.
Obviously anything shot on film is ruled out straight away. Stuff shot on video might well start off OK but get deinterlaced at source and processed (for example, converted to an MPEG stream for storage or broadcast).
Even when it gets to your TV it's not safe - many flat-panel displays and modern CRTs do their own deinterlacing, and some are not entirely smart about it, effectively building a 30/25fps progressive signal out of something that really should be 60/50.
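To show what I mean by "not entirely smart", here's a toy Python sketch of the two simplest deinterlacing strategies (fields as plain lists of scan lines, names mine): "weave" merges a field pair into one frame and halves the motion rate, while "bob" line-doubles each field and keeps the full 50/60 motion.

```python
# Toy sketch: the two simplest deinterlacers.
# A "field" here is just a list of its scan lines; top fields carry the
# even lines, bottom fields the odd lines, captured 1/50 s apart.

def weave(top, bottom):
    """Merge two fields into one progressive frame.
    Halves the motion rate, and combs wherever the two fields differ."""
    frame = [None] * (len(top) + len(bottom))
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field):
    """Stretch one field to a full frame by line doubling.
    Keeps the 50/60 motion rate, at half the vertical resolution."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

top = ["t0", "t2", "t4"]       # even scan lines, captured at time t
bottom = ["b1", "b3", "b5"]    # odd scan lines, captured 1/50 s later
print(weave(top, bottom))      # one frame per TWO fields -> 25fps motion
print(bob(top))                # one frame per field      -> 50fps motion
```

A dumb display is effectively doing weave on everything, which is exactly how 50-fields-a-second material gets flattened to 25fps.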
These days I can definitely see smooth motion combined with combing artefacts while watching some TV programs. I would only get this if the source material is at 50fps and isn't being deinterlaced. The smarter deinterlace filter I used to use could make a reasonable stab at removing the combing without killing the motion, but the current one I use doesn't do that. Either is better than the stupid default filters which deinterlace to 30/25...
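Roughly what those smarter filters do is adapt per area: keep the woven detail where the picture is static, and interpolate where it's moving, so you lose neither detail nor motion. A very hand-wavy toy version (real filters like yadif work per pixel and compare same-parity fields; this one just compares corresponding lines of grey values, and the threshold is made up):

```python
# Toy sketch of a motion-adaptive deinterlacer: weave where the picture
# is static (full detail), line-double where fields differ (no combing).
# Scan lines are single grey values here, purely for illustration.

def deinterlace_adaptive(prev_field, cur_field, threshold=10):
    """Build one progressive frame per field, at the full field rate."""
    frame = []
    for a, b in zip(prev_field, cur_field):
        if abs(a - b) <= threshold:
            frame.extend([a, b])    # static area: weave the real lines
        else:
            frame.extend([b, b])    # moving area: double the new line
    return frame

static = deinterlace_adaptive([50, 50, 50], [52, 50, 48])
moving = deinterlace_adaptive([50, 50, 50], [200, 50, 48])
print(static)   # all differences within threshold -> fully woven
print(moving)   # first line changed a lot -> line-doubled there only
```

That's the "reasonable stab" behaviour: combing goes away in the moving parts, but the output still updates once per field.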
So it's definitely there, I see it every day, but I'm not at all surprised that you could be watching the same stuff and not be seeing it, nor that you can't find concrete information about it.