Interpolation on a TV increases input lag and creates artifacts. The frame-rate upscaling method described by Digital Foundry reduces input lag and deals with artifacts by deliberately adding motion blur.
As described above, there are inherent problems with motion blur in games.
It can look nice in demos, obviously, because a demo has a defined item of interest that the motion blur can be tuned for. Games, in general, do not.
The profile of B3D forum members has changed over the years. When I first came here, a fair number of visitors belonged to the Quake 1/2/3/UT/CS communities, and their reason for being interested in the technical side of 3D rendering was performance - high frame rate rendering (and high frequency keyboard/mouse polling) simply meant better competitive performance.
These days those individuals are all but gone. Their problem is solved.
On the PC. You can pretty much always dial down the settings to meet 120 or 60 solid FPS, and USB polling rate is decent even without modification.
However, console gamers do not have the same graphical options, their displays are anything but fast, and their input devices do not support fast and accurate positioning. So we still have these 30 vs 60 fps discussions. People aren't allowed to make their own decisions, and the underlying hardware does a poor job of supporting the benefits of high frame rates.
You guys who look at it from a static reaction-time point of view are somewhat missing the point. The question isn't only, or even primarily, the time from detection to response, even though that is indeed a factor. Rather, it is the process of forming the appropriate response. If something is coming in fast from the left and you need to shoot it down, you have to judge two things - speed and trajectory - and respond accordingly. The simple fact is that the more samples your brain has to work with, the better it can evaluate that movement and let you intercept the trajectory with a well-placed bullet.
It's a bit like catching a ball under stroboscopic light - the longer the interval between strobes, the harder it is to catch the ball. (Again, in an actual gaming situation motion blur doesn't help you here, since it doesn't know which movement, if any, you're trying to follow. It would do fine at the proof-of-concept stage, where there is only one opponent to care about.) Everyone can understand this, but when more input data (more frames) no longer yields a better prediction depends on the game, the hardware support, individual player skill, et cetera. It lies in the nature of the problem that the better you are, the better your means of control, and the faster the game, the more critical this factor becomes.
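To make the sampling argument a bit more concrete, here is a rough illustrative sketch in Python (entirely my own toy example - the speed, noise level and observation window are arbitrary assumptions, not measurements of anything). It fits a line to noisy position samples gathered over a fixed observation window and extrapolates to an intercept point; packing more samples into that same window, i.e. a higher frame rate, tightens the trajectory estimate.

# Toy model of trajectory estimation from sampled positions.
# A target moves linearly; we only see noisy position samples taken over a
# fixed observation window, then extrapolate to an intercept point.
# More samples in the same window (= higher frame rate) should give a better
# velocity estimate and a smaller prediction error. All constants are arbitrary.

import numpy as np

rng = np.random.default_rng(0)

def prediction_error(frame_rate_hz, window_s=0.25, predict_ahead_s=0.3,
                     true_speed=12.0, obs_noise=0.05, trials=2000):
    """Mean absolute error when extrapolating a linearly moving target's
    position, given noisy samples collected over `window_s` seconds."""
    n = max(2, int(frame_rate_hz * window_s))        # samples seen in the window
    t = np.linspace(0.0, window_s, n)                # sample timestamps
    errors = []
    for _ in range(trials):
        true_pos = true_speed * t                    # true 1-D trajectory
        observed = true_pos + rng.normal(0, obs_noise, n)
        speed_est, offset_est = np.polyfit(t, observed, 1)   # least-squares line fit
        t_pred = window_s + predict_ahead_s
        predicted = speed_est * t_pred + offset_est
        errors.append(abs(predicted - true_speed * t_pred))
    return float(np.mean(errors))

for fps in (15, 30, 60, 120):
    print(f"{fps:>3} fps -> mean intercept error {prediction_error(fps):.3f} units")

Running this should show the intercept error shrinking as the frame rate goes up, but with diminishing returns per extra frame - which is exactly why where the useful cutoff lies depends on the game, the hardware and the player.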
Ergo: There will never be universal agreement on how high a frame rate is desirable.
I will say this though. Peter Jackson filmed The Hobbit at double the normal frame rate, and even without viewers being able to A/B test, and without control issues or any means for the viewer to affect the viewport or anything else, there has been a vast internet storm decrying the movie as "not looking like film". The difference is apparent enough even under such controlled circumstances. So the issue of frame rate is definitely not only about control, but also a matter of graphical quality. Although, as with The Hobbit, the absence of the higher frame rate may actually be what you prefer.