Monitors tend to have higher refresh rates, and they're built differently from HDTVs, so I figure those are the big contributing factors. Monitors have been doing 120 Hz, 240 Hz, etc. for years now; 4K televisions seem to be only just starting to get into that range.
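Just to put rough numbers on that refresh-rate gap, here's a quick back-of-the-envelope sketch (the rates are just common examples I picked, nothing official):

```python
# Illustrative only: how long each game frame sits on screen, and how many
# refresh cycles it gets held for, at a few common refresh rates.
for refresh_hz in (60, 120, 240):
    for fps in (30, 60):
        holds = refresh_hz / fps          # refresh cycles per game frame
        frame_time_ms = 1000 / fps        # how long one frame persists
        print(f"{fps} FPS on a {refresh_hz} Hz panel: "
              f"each frame held for {holds:.0f} refreshes (~{frame_time_ms:.1f} ms)")
```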
Even then, things like color balance differ between the two, and a monitor packs more pixel density, so even at the same frame rate, a 30 FPS game probably looks more "sluggish" on a monitor because of that higher pixel density over the surface area. I'd also guess the motion-smoothing features televisions have built in are absent from the vast majority of monitors. And I wouldn't be surprised if televisions being more prone to burn-in inadvertently benefits 30 FPS games, since those games tend to apply a lot of motion blur to smooth out animations (certain developers in particular excel at this).
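For what it's worth, the crudest version of that motion blur is just blending the last few rendered frames together; real engines blur along per-pixel motion vectors instead, but here's a toy sketch of the basic idea (all names and numbers are mine, purely illustrative):

```python
import numpy as np

def blend_motion_blur(frames, weights=None):
    """Naive temporal motion blur: weighted average of recent frames.

    `frames` is a list of HxWx3 float arrays in [0, 1]. This is only a toy
    illustration, not how any particular engine actually does it.
    """
    stack = np.stack(frames, axis=0).astype(np.float32)
    if weights is None:
        weights = np.ones(len(frames), dtype=np.float32)
    weights = weights / weights.sum()
    # Weighted sum over the frame axis -> one blurred frame.
    return np.tensordot(weights, stack, axes=1)

# Toy usage: a bright pixel sliding one column per frame across three frames.
frames = [np.zeros((4, 4, 3), dtype=np.float32) for _ in range(3)]
for i, f in enumerate(frames):
    f[1, i] = 1.0
blurred = blend_motion_blur(frames)
print(blurred[1, :, 0])  # the pixel's path gets smeared across columns
```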
Also, in my experience monitors seem to give better color saturation and sharpness; maybe 30 FPS games on a television benefit from the slightly less saturated pop of colors, since you aren't distinguishing color boundaries as sharply as you would on a monitor. Combined with the lower pixel density, being slightly more prone to burn-in, and the rest, I can see why 30 FPS is generally the target for console games that aren't competitively driven.