
Aaaah, I certainly don't want to start that...not my intention.
I love a good hijack too...and maybe start another flame war.
It depends on what you watch. I can't say I care much for digital noise and pixelation on broadcast TV, but then maybe I've been lucky so far. I'm a bit more picky about what I watch, too. Admittedly, I've never been a fan of broadcast TV, but from time to time I catch movies at a friend's house with DirecTV, and the picture looked great to me. You can't really tell me that a 480i broadcast from a quad videotape machine will look as good as the same program given a new 2K-4K digital transfer and broadcast in HD, because I won't believe you.
That said, maybe it's what you're watching. Shit shot digitally will always look like shit shot digitally, no matter what resolution it's at. So maybe that's your problem? I'm still coming down from the golden age of television (the '90s), when damn near every program of substance was shot on 35mm film, and it's really incomparable. Thankfully, that hasn't disappeared altogether. If you have a chance to catch The Walking Dead (Super 16), Boardwalk Empire (35mm), or something else that shoots film, I think you'd be astounded at how good TV can look...better yet, get the Blu-ray.
Hmmm...we're going to have to agree to disagree on your "shit shot digitally" comment. The original feed (before any compression) coming from a properly adjusted broadcast HD camera looks stunning. The problems only start when people throw MPEG or DV compression into the mix. I'd put good modern electronic cameras well ahead of most 35mm once you start thinking about grain etc. (and, for those of you on NTSC, the 3:2 pulldown kludge needed to squeeze a 24fps source into a 30fps format).
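For anyone who hasn't met it, here's a rough sketch of that 3:2 pulldown cadence (a Python toy, purely illustrative; the frame labels are made up):

```python
# Rough sketch of 3:2 pulldown: four 24fps film frames become
# ten interlaced fields, i.e. five 30fps video frames.
def pulldown_32(film_frames):
    """Map film frames to fields using the alternating 2-3 cadence."""
    fields = []
    cadence = [2, 3]  # one frame held for 2 fields, the next for 3
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * cadence[i % 2])
    return fields

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 second at 24fps
fields = pulldown_32(film)    # 10 fields = 1/6 second at 60 fields/s
print(fields)                 # ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# Paired into video frames: AA, BB, BC, CD, DD -- two of the five
# frames mix fields from two different film frames.
```

Two out of every five video frames end up as hybrids of two different film frames, which is exactly the motion judder the kludge buys you.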
And, of course, all this is still back at the studio, where the pictures probably (if you're lucky) exist as RGB or YUV. However, at some point in the process these good pictures will be converted to NTSC or PAL, with the "footprint" of artifacts that entails. This is where any analogy with analogue/digital audio disappears. Analogue TV AS VIEWED IN THE HOME was restricted to the encoded formats of PAL or NTSC (or SECAM if you're French) and covered with blurry, swimming edges etc. There was certainly no golden age of analogue TV once you got down to PAL or NTSC rubbish.
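To put numbers on that RGB-to-YUV step, here's a quick sketch using the standard BT.601 luma weights (values normalised to 0..1; the exact scale factors vary between standards, so treat it as illustrative):

```python
# Sketch of the RGB -> YUV conversion that sits behind PAL/NTSC
# encoding, using the BT.601 luma coefficients. R, G, B in 0..1.
def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma (the detail channel)
    u = 0.492 * (b - y)                    # blue colour difference
    v = 0.877 * (r - y)                    # red colour difference
    return y, u, v

# Pure red: much of the signal lands in the chroma channels, which
# get band-limited hard -- why saturated colours smear on composite.
print(rgb_to_yuv(1.0, 0.0, 0.0))  # (0.299, -0.147..., 0.614...)
```

The chroma channels are then squeezed into a narrow band and modulated onto a subcarrier, which is where those blurry, swimming edges come from.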
Of course, now the broadcasters are doing the same thing and converting their pristine digital pictures into almost unwatchable rubbish by cramming up to 3 channels into one of your 6MHz UHF channels (or 4 into an 8MHz UK channel). Clever tricks like statistical multiplexing can help, but in the end you're still cramming a quart into a pint pot.
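Just to put figures on the quart-into-a-pint-pot point, a back-of-the-envelope sum (the mux payloads here are typical values, not gospel, and a stat mux shuffles bits between programmes from moment to moment):

```python
# Rough sums on what sharing one RF channel does to per-programme
# bit rate. Payload figures are typical/approximate, not exact.
ATSC_6MHZ_MBPS = 19.39   # ATSC payload of a 6 MHz US channel
DVBT_8MHZ_MBPS = 24.1    # common UK DVB-T mode (64QAM, 2/3 FEC)

for name, payload_mbps, programmes in [
    ("US 6 MHz mux", ATSC_6MHZ_MBPS, 3),
    ("UK 8 MHz mux", DVBT_8MHZ_MBPS, 4),
]:
    each = payload_mbps / programmes
    print(f"{name}: {payload_mbps} Mbps / {programmes} = {each:.1f} Mbps per programme")

# US 6 MHz mux: 19.39 / 3 = 6.5 Mbps per programme
# UK 8 MHz mux: 24.1 / 4 = 6.0 Mbps per programme
# ...versus the mid-teens and up that a decent HD encode really wants.
```

Six-ish Mbps for an HD programme is exactly the "almost unwatchable rubbish" territory.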
I agree that Blu-ray looks a lot better than the HDTV being broadcast but, even there, there are huge variations. Video bit rates range from under 15 Mbps up into the low 30s and, of course, the codec can be MPEG-2, VC-1 or AVC.
Anyhow, there are too many variables for a generalisation...but, if you go back to the original camera output, HD digital video is stunning and so far ahead of either 480i/576i or 35mm passed through a telecine that there's simply no comparison. Alas, what you see at home bears no relationship to that.