One of the reasons comes down purely to source material: in the professional TV world, the equipment used to produce and edit Hi-Def (and SD as well) varies to an extreme extent.
You have anything from 960x720, 1280x720 and 1440x1080 to 1920x1080, at frame rates from 25 to 60, either progressive or interlaced.
Using chroma subsampling anywhere from 4:1:1, 4:2:0 and 4:2:2 to 4:4:4 (rare for recording devices).
With bitrates from 18 to 50 to 100 Mbit/s, using either CBR or VBR in MPEG-2, with or without long GOP. And some even use AVC at up to 100 Mbit/s.
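For scale, here's a quick back-of-the-envelope sketch (my own illustrative arithmetic, assuming 8-bit samples; pro gear is often 10-bit) of what those formats weigh uncompressed, which is why everything above gets squeezed into 18–100 Mbit/s in the first place:

```python
def raw_mbit_per_s(width, height, fps, subsampling):
    """Uncompressed video data rate in Mbit/s for a given chroma subsampling."""
    # Samples per group of 4 pixels: luma plus the two chroma planes.
    # 4:4:4 = 12, 4:2:2 = 8, 4:2:0 and 4:1:1 = 6.
    factors = {"4:4:4": 12, "4:2:2": 8, "4:2:0": 6, "4:1:1": 6}
    samples_per_pixel = factors[subsampling] / 4
    bits_per_sample = 8  # assumption; broadcast equipment is frequently 10-bit
    return width * height * fps * samples_per_pixel * bits_per_sample / 1e6

for w, h, fps, sub in [(1280, 720, 50, "4:2:2"),
                       (1920, 1080, 25, "4:2:2"),
                       (1920, 1080, 25, "4:4:4")]:
    print(f"{w}x{h}@{fps} {sub}: {raw_mbit_per_s(w, h, fps, sub):.0f} Mbit/s")
```

Even plain 1080i25 at 4:2:2 comes out around 830 Mbit/s raw, so a 50 Mbit/s MPEG-2 recording is already compressing well over 15:1 before editing even starts.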
And then we have the editing process, where hardware, resources and money decide how gently the original material gets treated. Do we dump it into a less bitrate-heavy format with fewer demands on hardware and storage, or do we dump it into an insanely high bitrate and struggle with hardware demands?
Finally we render and master the product, and the mastering medium again decides the quality: is it a 25 Mbit/s XDCAM HD disc, or is it an HDCAM SR HQ tape at 880 Mbit/s?
I can only say that real SD material recorded on professional equipment, edited at 1:1 and put on Digibeta tape via SDI, looks fucking amazing. It's astonishing how good SD can look, but pretty much no one ever gets to see just how good.
Now imagine how good HD material looks at the source, in a Hi-Def OB truck recording concerts. It's almost sad how much can be lost by the time the final product is out on TV.
Sorry, got carried away.