Am I seeing MPEG Encoding on my new HDTV?

SouthSIDE Glen

independentrecording.net
I upgraded the video display on my editing/mixing desk a few weeks ago; I now have a 22" 1080p LCD TV/monitor. When it comes to watching TV (not counting computer video, DVD, etc., just actual *television* programming), I limit myself to the new over-the-air digital broadcasts. (Why I don't do cable or satellite is a whole other story/thread that I don't want to get into.)

Anyway, I have noticed quite a bit of artifacting, of a couple of types that look awfully familiar to me.

The first is in the display of flat art graphics (e.g., company logos in commercial advertising and some cartoon animations), where I often see artifacting that looks an awful lot like heavy JPEG compression: the kind we've all seen on heavily compressed internet images, where sloppy dithering forms a sort of digital "fog" around what should be sharp borders or small, narrow objects.
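For reference, that fog is easy to reproduce on a computer. Here's a minimal sketch in Python using the Pillow imaging library (assuming it's installed); the synthetic "logo", the quality setting, and the filenames are all my own illustrative choices, not anything from the broadcast chain:

from PIL import Image, ImageDraw

# Build a synthetic "logo": sharp white bars on a flat blue field.
img = Image.new("RGB", (320, 180), (20, 60, 160))
draw = ImageDraw.Draw(img)
for x in range(40, 280, 40):
    draw.rectangle([x, 60, x + 6, 120], fill=(255, 255, 255))  # thin vertical bars

img.save("logo_clean.png")            # lossless reference
img.save("logo_q10.jpg", quality=10)  # starved JPEG: coarse 8x8 DCT quantization

# Reload and compare: the flat field stays mostly clean, but every 8x8 block
# that straddles a sharp edge picks up ringing ("mosquito noise"), the fog.
Image.open("logo_q10.jpg").show()

The fog is ringing from the block transform: with only a few DCT coefficients left per 8x8 block, a sharp edge can't be represented, and the error smears across the whole block.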

I also see a related kind of distortion when watching reruns of "Family Guy", where vertical lines or borders "ghost" horizontally; e.g., where there is a vertical black line, there are anywhere from six to eight "shadow" lines ghosting to the right of the original. Interestingly, these artifacts do not show up in commercials for the show or in still-picture bumpers, only in the actual broadcasts.

The second artifact is a pixelated dithering in otherwise solid or very slightly shaded color areas. This one is rarer; about the only place I can predict seeing it with any certainty is in the video screens showing the answers and the contestants' scores on the game show "Jeopardy!" (which, curiously, proudly advertises as being in HD), but I sometimes see it in the playback of pre-recorded location video on newscasts.
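That second artifact can be faked just as easily: coarse quantization collapses a gentle shade into discrete bands. A minimal sketch, again in Python (numpy and Pillow assumed installed), where the panel color and the quantization step are arbitrary guesses:

import numpy as np
from PIL import Image

# A gently shaded "solid" blue panel, like a game-show answer screen.
h, w = 180, 320
panel = np.zeros((h, w, 3), dtype=np.float32)
panel[..., 2] = np.linspace(90.0, 110.0, w)  # subtle horizontal shading in blue

# Crude quantization, roughly what a bit-starved encoder does to flat areas:
# snapping every value to a coarse step turns the smooth ramp into visible bands.
step = 8
banded = (panel // step) * step

Image.fromarray(panel.astype(np.uint8)).save("panel_smooth.png")
Image.fromarray(banded.astype(np.uint8)).save("panel_banded.png")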

This stuff does not happen all the time, by any means. In fact, live shows that broadcast in HD and use live HD graphics, such as newscasts (when not playing back video) and the late-night talk shows, seem to be artifact-free. Where it happens most is during commercials, though by no means all of them.

I know it's not my reception; I know what that kind of digital artifacting looks like. And I doubt it's my TV, because of the content-related pattern to the appearance of the artifacting. (BTW, I never get any of this artifacting when watching VHS, DVD, or computer video on that same TV.)

It appears to me that perhaps these artifacts are caused by the behind-the-scenes, post-production (or perhaps live?) conversion of an analog source to a digital signal, but that is only a guess on my part, based on how much the artifacting resembles the lossy compression I've seen in the computer world.

Does anybody have any insight into what the actual process going on here might be? Just curiosity on my part.

G.

 
It appears to me that perhaps these artifacts are caused by the behind-the-scenes, post-production (or perhaps live?) conversion of an analog source to a digital signal, but that is only a guess on my part, based on how much the artifacting resembles the lossy compression I've seen in the computer world.

Does anybody have any insight into what the actual process going on here might be? Just curiosity on my part.
I think you pretty much nailed it in the first paragraph I quoted. Scaling content that was originally done in other formats at varying resolutions, along with signals that were once progressive or interlaced, all thrown into the digital blender, tends to leave a few lumps.

The curse of any good high-definition TV, like a really good pair of studio monitors, is that it lets you see every fault in the signal with unwavering judgment.

It also doesn't help that they originally came out with more than a dozen DTV signal formats and then asked a TV set that was primarily designed to accurately scan only one of them to do a decent job of dealing with all the variables. Some screen technologies are better than others in that sense; plasma sets do tend to scale more gracefully than most if not all of the LCD sets I've seen. The actual scaling engines in the sets also vary in quality by fairly wide margins, which is why sets like Pioneer's Elite plasma line will look head and shoulders better than almost anything else on the market. They also cost way more, so the old adage "you get what you pay for" still factors large in the equation.

Cheers! :)
 
I am surprised that you see so much artifacting on over-the-air digital broadcasts. I have set up an antenna and have DirecTV, and I can tell you that the local channel broadcasts definitely look better viewed through the antenna than through DirecTV, because DirecTV adds its own compression to fit 10 million channels into the available bandwidth.

Admittedly, some artifacting is visible on over-the-air broadcasts as well, and I think it's a combination of upconversion of old program material and some lossy compression, but it's definitely nowhere near as bad as what I see on some DirecTV programming.

I especially hate when solid blacks look like a blotchy, square-pixellated mess instead of solid black.

This is why I like Blu-rays: no such compression artifacts whatsoever.
 
The curse of any good high-definition TV, like a really good pair of studio monitors, is that it lets you see every fault in the signal with unwavering judgment.
The irony here, if my working theory is right and the codec used in the various ADC conversions is causing lossy-style compression artifacts, is that I don't think the artifacting I'm seeing is so much a matter of the monitor revealing faults as it is macro-level artifacting *generated* by the need to transmit a "superior" signal, to the point where the original analog source actually looks superior to the digital version in the most obvious ways.

I'm going to have to check my mother's setup and see if the same artifacts happen there. She just has an old Mitsubishi 26" analog CRT TV with a $60 LG converter box hooked up via composite connection. I believe she should still be able to see that "JPEG-like fog dither", assuming it's actually part of the signal. It could take a while to compile a reliable and repeatable list of sources (e.g., this commercial on this channel) and then go check for them on her TV, but it'll be interesting to see whazzup wid dat.

I am surprised that you see so much artifacting on over-the-air digital broadcasts. I have set up an antenna and have DirecTV, and I can tell you that the local channel broadcasts definitely look better viewed through the antenna than through DirecTV, because DirecTV adds its own compression to fit 10 million channels into the available bandwidth.
I can't make that comparison, as I haven't seen a DirecTV signal in quite a while. It's very interesting, though - and admittedly somewhat personally gratifying, to be honest ;) - to hear that over-the-air DTV is actually of higher quality than DirecTV. I'm not quite sure how to quantify just how "much" artifacting I get. There is plenty of programming where I won't catch any such artifacting, but there is enough to pique my curiosity about just what's going on.

Again, this is only a quasi-wild guess on my part, but I wonder what the process chain is when one starts with an analog signal, because it really does seem very source-dependent; i.e., commercial X or program Y will always expose artifacting, regardless (I think, but I'm not positive) of channel or broadcast resolution, whereas commercial or program Z will almost always be artifact-free. It seems as if maybe the ad agency or program distributor is responsible for taking their original analog material and re-mastering or creating a digital version to send to the stations for broadcast, and some of them go cheap or otherwise drop the ball on the quality of that conversion.

As I'm typing this, a perfect example just happened. I mentioned the "Family Guy" example before. I'm watching it on WGN-DT right now, replete with the vertical-line ghosting. Yet just 60 seconds before it started, there was a WGN-created commercial showing "what's next", and the FG images were ghost-free. It's as if they both used a similar source, but WGN's digital mastering was of better quality.
I especially hate when solid blacks look like a blotchy, square-pixellated mess instead of solid black.
I haven't seen that on black yet, but that's exactly what I'm talking about with the blue video screens on "Jeopardy!" (which, ironically, proudly advertises as being in HD... I guess so we can finally see Alex Trebek's nose hairs? ;) )

But yeah, as Ghost referred to, the really annoying part is the lack of standards for aspect ratio, resolution, etc., and having to witness those differences from source to source.

G.
 
The irony here, if my working theory is right and the codec used in the various ADC conversions is causing lossy-style compression artifacts, is that I don't think the artifacting I'm seeing is so much a matter of the monitor revealing faults as it is macro-level artifacting *generated* by the need to transmit a "superior" signal, to the point where the original analog source actually looks superior to the digital version in the most obvious ways.
It doesn't really matter where the junk is coming from; the key, at least to me, is the quality of the scaling engine in the TV set itself and the underlying screen technology, LCD or plasma. Plasma has many design similarities to CRTs in that each pixel produces its own brightness and color, and it also shares the use of phosphors in the screen to smooth out the colors and, most importantly, the detail, so that staircased diagonal lines appear closer to straight lines. That facet alone covers up a lot of the hash that you don't want to see in the picture.

Think back also to when you used a CRT computer monitor: how it handled different resolution settings, and how it didn't need "ClearType" for you to read text on your screen! LCD monitors have not overcome these critical issues, and again, plasma TVs are as close to a CRT as you're gonna get in this brave new flat-panel world.

I use a 37" Panasonic plasma TV in my living room, and I watch a lot of SD channels on it; all of them are very watchable and clean-looking. When I do the same thing on my 22" LCD in the studio, the noise is absolutely insane, and I truly can't stand to watch it for more than a few minutes unless I'm viewing HD content, and even then the picture is too crisp. It's essentially useless unless I'm just using it for computer graphics, and even then I find my eyes tire faster working with it... sad, too, because it's supposed to be a very well-rated monitor.

Cheers! :)
 
As I'm typing this, a perfect example just happened. I mentioned the "Family Guy" example before. I'm watching it on WGN-DT right now, replete with the vertical-line ghosting. Yet just 60 seconds before it started, there was a WGN-created commercial showing "what's next", and the FG images were ghost-free. It's as if they both used a similar source, but WGN's digital mastering was of better quality.

Can you describe this "ghosting" a bit more, please? Are you saying that when watching a program that was originally in standard definition, recorded in 4:3 aspect ratio, you see some gray shading on the black stripes on the sides? Or does it look more like a 1-2 pixel line along an edge where things are changing really fast? I ask because, if it's the second case, what you're seeing is sometimes the "overscan" area that's usually cut off on standard-definition TVs. This is usually the area where things such as SMPTE timecode are burnt into the film, along with other similar information. Since only a very little of that area shows on the TV, it looks like a crazy moving, noisy line at the top of the screen or on one of the sides.

So, if it's the latter case, change your TV's aspect ratio from 16:9 to "Just Scan"; this will enable the "overscan" feature, chopping those "extra" areas off the program material.
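If you want to see what that cropping amounts to, here's a rough sketch in Python with Pillow; the 3.5% margin is a typical overscan figure I'm assuming, not a number from any particular set, and the filename is hypothetical:

from PIL import Image

def crop_overscan(frame, margin=0.035):
    # Chop a safe-area margin off each edge, then scale back to full size.
    w, h = frame.size
    dx, dy = int(w * margin), int(h * margin)
    return frame.crop((dx, dy, w - dx, h - dy)).resize((w, h), Image.LANCZOS)

# frame = Image.open("capture.png")  # hypothetical frame grab
# crop_overscan(frame).show()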
 
It doesn't really matter where the junk is coming from; the key, at least to me, is the quality of the scaling engine in the TV set itself and the underlying screen technology, LCD or plasma.
If this artifacting is in the source, as we theorize, the reproduction technology is irrelevant, because the artifacting is on a macro, low-resolution/large-surface-area scale that any screen technology would reproduce.

Again, it seems to me that it's not an artifact of the TV in and of itself; there is plenty of source material at ultra-high HD resolution as well as low resolution in which no artifacting occurs. I am, in fact, *very* impressed with the overall picture quality in every respect, but when a digital codec pixelates and dithers certain details of the image on a scale far larger than even NTSC analog resolution, it's going to show up everywhere.
Can you describe this "ghosting" a bit more, please? Are you saying that when watching a program that was originally in standard definition, recorded in 4:3 aspect ratio, you see some gray shading on the black stripes on the sides?
No, it's not an edge of frame or AR thing at all.

First, I'll say that FG reruns are the ONLY program where I have seen this particular artifact; I have not seen any first-run FG on Fox on this TV yet, so I can't say whether it's a rerun-on-WGN-only thing, only that that is the only place I have watched it thus far. I can say, though, that I have seen other animations on this TV on various channels, none of which exhibited this "ghosting"; so far it's been an FG-only thing. But it's guaranteed to happen on FG; every episode I have watched (and there have been many) exhibited the same artifacting.

It's all happening within the frame (a standard 4:3 aspect-ratio display that's not stretched or cropped), and it takes the form of a "ghosting" of dark vertical lines within the image. For example, say there's an image of a doorway in the living room wall, drawn with basically a black rectangular border. Both vertical sides of that rectangle will be accompanied by multiple "ghosts" of those vertical lines (as many as 6-8 at times) that are fainter in intensity/contrast than the original, and always to the right of it. The separation between the original line and the ghosts is at a regular interval, and that interval is far larger than any screen-resolution-scale issue. This happens only on vertical (or nearly vertical) lines; no horizontally aligned details ghost or smear in any visible way, and there is no ghosting of "areas" like color swatches, only of dark vertical lines.

I'm not currently set up to capture these images (I have an HDR, but I have a re-wiring project slated for my desk before I add that box into the mix), but I can try to photoshop a simulation of what it looks like to illustrate, if need be.
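Actually, the fake might be easier to script than to photoshop. Here's roughly what I have in mind, in Python with numpy and Pillow; the 8-pixel spacing, the six echoes, and the halving intensity are all eyeballed guesses from what I see on screen, and the filenames are hypothetical:

import numpy as np
from PIL import Image

def add_ghosts(frame, spacing=8, echoes=6):
    # Stamp fainter copies of the dark detail at regular intervals to the right.
    out = frame.astype(np.float32)
    darkness = 255.0 - out  # how far below white each pixel sits
    for i in range(1, echoes + 1):
        shifted = np.roll(darkness, spacing * i, axis=1)  # shift columns right
        shifted[:, : spacing * i] = 0                     # kill the wrap-around
        out -= shifted * (0.5 ** i)                       # each echo fainter
    return np.clip(out, 0, 255).astype(np.uint8)

# gray = np.asarray(Image.open("fg_frame.png").convert("L"))  # hypothetical frame grab
# Image.fromarray(add_ghosts(gray)).save("fg_ghost_sim.png")

A black vertical line comes out with a train of progressively fainter copies marching off to its right at a fixed interval, which is exactly the look.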

G.
 
Yeah, a picture of the actual problem would be helpful, because I'm having trouble visualizing exactly what we're talking about here.

Cheers! :)
 
Yeah, a picture of the actual problem would be helpful, because I'm having trouble visualizing exactly what we're talking about here.
OK, I'll work on synthesizing both types of artifacting I'm talking about. I'm pretty confident I can deliver a fairly accurate representation of both. Just give me some time to work that into my day...

G.
 
Samples

To hell with my day! Here are some faked examples of what I'm talking about, but they're pretty true to the real thing... enough to give you guys an idea, I hope...

The first one is the "jpeg-style" pixelating fog. I see this most often in commercials, but I can sometimes also see it in background details of on-scene video reports on newscasts. The amount of artifacting varies; many instances are even higher contrast than what we see here. Note that this is not on *all* commercials, just some (a good half of them, I'd estimate). Also note that CG graphics generated directly in digital - like titles and graphics on most late-night TV shows, and captions and crawls on newscasts - almost never display these kinds of artifacts for me.

And finally, note that the size of the image doesn't much matter. Even if the titles below took up 50% of the screen, the "pixel fog" could still be substantial - enough to swamp any screen-resolution issues.

Here's a before and after of the "jpeg fog" effect:

[inline images: "jpeg fog" before/after - broken links; see attachments below]
And here's a representation of the Family Guy-only "ghosting". It appears slightly exaggerated here because of the lower resolution of the graphic I'm using compared to the actual live video, but you should get the general idea. Again, a before and after:

[inline images: FG "ghosting" before/after - broken links; see attachments below]

G.
 
Glen, your picture links don't appear to be working... I can't see anything other than tiny little boxes with a red X in the middle!

Cheers! :)
I'm not sure why that's happening; they're just some small in-line images with public read permission set properly, and they work fine for me. Well, here's an upload of them to this server; maybe that'll help (in order: jpeg fog before/after, then FG ghosting before/after):
 

Attachments

  • logo_test.jpg (15.7 KB)
  • logo_comp.jpg (8.8 KB)
  • fg_plain.jpg (27.9 KB)
  • fg_ghost.jpg (29 KB)
OK, on your first picture example/simulation: this looks like the classic JPEG artifacting effect, and again I'm going to return to my original answer and blame it on the TV's scaling engine and inherent LCD technical shortcomings.

Yes, many other signals will look 100% clean on these screens, and that's because the program content is matched to the native scanning resolution of the screen, or is a mathematical fraction that divides evenly into the scanning lines and still comes out cleanly.
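The "divides evenly" point is easy to put in numbers. A back-of-the-envelope sketch in Python, assuming a 1080-line native panel; the source line counts are just the common broadcast formats:

native = 1080
for src in (480, 540, 720, 1080):
    factor = native / src
    note = "integer, maps cleanly" if factor.is_integer() else "non-integer, must interpolate"
    print(f"{src} -> {native} lines: x{factor:g} ({note})")

480 and 720 lines land on the awkward x2.25 and x1.5 factors, so every output line is a blend of input lines, and that interpolation is where a cheap scaling engine shows its quality.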

Yes, a plasma TV will still show these issues too, but it will generally show them to a smaller and more tolerable degree. And even there, there will still be differences between plasma screens because of the quality of the scaling engines; some are much better than others!

If this artifacting is in the source, as we theorize, the reproduction technology is irrelevant, because the artifacting is on a macro, low-resolution/large-surface-area scale that any screen technology would reproduce.
The reproduction technology is definitely not irrelevant, and I have no idea why you refuse to believe that. That's kind of like saying that all loudspeaker technology is irrelevant.

The ghosting artifacts that you're seeing in the Family Guy shows I have no explanation for, other than that I remember seeing similar artifacts in some of the older Simpsons shows too, and I can only attribute it to poor transfers at the distribution level, when duplicate copies are made for independent release to various affiliate channels and networks. It's probably some type of timecode clock feedback jitter, and that issue should show up on all screen technologies.

But the JPEG noise in your first example can be presented less objectionably by a better TV.

What TV is this that you're using? I don't believe you've said what it is.

Cheers! :)
 
OK, on your first picture example/simulation: this looks like the classic JPEG artifacting effect, and again I'm going to return to my original answer and blame it on the TV's scaling engine and inherent LCD technical shortcomings.
And again, I'm going to say that it probably looks like a classic compression artifact because it probably IS a classic compression artifact :). Which would put the creation of the artifact far upstream of the receiver or the display.
Yes, many other signals will look 100% clean on these screens, and that's because the program content is matched to the native scanning resolution of the screen, or is a mathematical fraction that divides evenly into the scanning lines and still comes out cleanly.
Except that I cannot find a correlation between the broadcast resolution and the appearance of artifacting.

First, when the commercial for Family Guy on WGN at 9:59 is at 720i, and the program itself on the same station one minute later at 10:00 is also at 720i, and the program artifacts but the commercial does not, there's something else going on. The refresh rate isn't changing any more than the resolution is. And when I see compression artifacting in a newscast's on-scene video package but not in the in-studio feeds of the exact same broadcast, that tends to indicate that the difference lies pre-transmitter.

Second, the artifacting shows up on some material broadcast at 480i, 720i, and 720p, and it fails to show up on other material at each of those resolutions; there is no direct correlation I can find between artifacting and broadcast resolution. Rather, the only correlations I can find are based upon *source type*, and the artifact-prone sources seem to be ones that probably started out as analog content and were converted to digital either before broadcast or on the fly before hitting the digital transmitter. The examples I gave, TV commercials (especially from non-big-name national advertisers) and on-scene video packages, are both likely to be shot with analog or non-HD digital cameras and put on tape the same way (at the source, anyway). That would seem to indicate that the artifacting is coming out of the transmitter and not being generated on the receiving end, because the transmission specs are not changing.
The reproduction technology is definitely not irrelevant, and I have no idea why you refuse to believe that. That's kind of like saying that all loudspeaker technology is irrelevant.
I *meant* that it was irrelevant to these particular forms of artifacting, because the artifacting appears to be a function of upstream encoding, not of receiver compatibility or display technology, and is of such a macro effect as to swamp the subtle detail differences between a plasma display and a comparable LCD display. Oh, and BTW, I couldn't care less what technology goes into my loudspeakers. As long as they deliver the sound I want for as long as I expect, I don't care whether they're made with a high-tech ion-vapor dispersion technique or hammered out by a 19th-century blacksmith on an anvil.
The ghosting artifacts that you're seeing in the Family Guy shows I have no explanation for, other than that I remember seeing similar artifacts in some of the older Simpsons shows too, and I can only attribute it to poor transfers at the distribution level, when duplicate copies are made for independent release to various affiliate channels and networks. It's probably some type of timecode clock feedback jitter, and that issue should show up on all screen technologies.
At least there we're in general agreement. I guess we'll just have to agree to misunderstand each other (as most gearheads and non-gearheads tend to do ;) ), because I can't understand why you're willing to accept this as a re-mastering/duplication issue, but the 'PEG-style compression artifacting has to be because I'm not using a plasma display, and not because of a re-mastering/distribution issue - especially in the face of the evidence suggesting that the artifacting is probably due to signal *compression* and not to rate or resolution conversion.

The TV is a Vizio VO22L-10A. I know a lot of gear guys will look down upon it as a lowly Sam's Club model, but I went to three different Sam's Clubs and compared it against similar models from Sony, HP, LG, Philips, etc., etc.,* and it beat all competing models in all three stores in picture appearance, and most of them in printed specs, and did so at an extremely reasonable price.

Here's a sample of the printed specs:

Size: 22.6", 16:9 aspect ratio
Resolution: 1920 x 1080
Dot pitch: 0.248 mm x 0.248 mm
Signal compatibility: 480i, 480p, 720p, 1080i, 1080p
Display compatibility: 720p
Colors: 16.7 million
Brightness (typical): 300 nits
Contrast: 5000:1
Response time (typical): 5 ms

Not being much of a gear chaser, for me plasma is too large, too expensive, and too limited in life expectancy to justify seeing whether Alex Trebek's third nostril hair faces slightly left or just straight down. IT'S ONLY TV, and not worth fussing over, IMHO. Good enough is good enough.

Besides, when you get to be my age, you realize that you've spent the best part of your life throwing money away on technology that you'll just be replacing in five to ten years or so anyway, and that you were no more or less happy that whole time than your next-door neighbor who had nothing to play with but a stick and an imagination.

But that doesn't mean that I stay away from understanding the technology behind all this stuff and knowing what's what and *why*. And I find it quite curious just how much the content technology seems to lag behind the transmission technology. Why, for god's sake, do they feel the need to so heavily compress the digital conversion of some analog content, as it increasingly seems to me is what's happening here? That question, which probably has more to do with human psychology than with human technology, just fascinates me.

* Almost as fascinating, but actually far more humorous, is the re-appearance of long-defunct brand names and logos that have been bought up by (usually) overseas companies with no relation whatsoever to the original companies. The DTV converters I bought for my mother are manufactured by LG Electronics, but they actually carry the lightning-bolt Zenith logo so that my 88-year-old mother will think it's as American as John Wayne :rolleyes:. And right next to all those flat-screen brands I mentioned above were screens branded "Magnavox" and "Westinghouse". What a joke. (Even more of a joke is the fact that anybody would even WANT to pretend to be "Magnavox" :p)

G.
 
Well, no offense intended, but Vizio is pretty much at the bottom of the totem pole when it comes to overall picture quality. I know you said it seemed to look better than the Sony, Philips, and HP models, but unfortunately those brands are also low on the pole! :p

For LCD TVs in those sizes, I'd look at Sharp or Panasonic.

I'm not saying plasma TVs are perfect, just that they handle the types of issues you're talking about a bit more gracefully. And the short life span is an old wives' tale at this point; current Panasonic models have a 100,000-hour life span. The one true problem with plasmas is that they tend to start at the 42" size and go up from there. So, if you need a smaller screen and want plasma, you're out of luck.

I hear you on the whole wasting-too-much-money-on-technology deal! Lord, I must have $80,000 worth of technology sitting in my two-bedroom apartment between my studio gear, camera gear, and TV/surround/stereo gear! :o I could have bought an entire neighborhood in Detroit for that! :D

Anyway, not much else I can add at this point so...

Cheers! :)
 
I'm not sure why that's happening; they're just some small in-line images with public read permission set properly, and they work fine for me. Well, here's an upload of them to this server; maybe that'll help (in order: jpeg fog before/after, then FG ghosting before/after):

Well, I don't think I can offer any helpful advice here, except...

Try a different mic position :D
 