Monday, June 13, 2005

HDTV Quirks, Part I, Grayscale

In Picking an HDTV I gave a rundown on some of the basics of choosing an HDTV. Now I'd like to start pointing out some of the quirks I've run into with my own two HDTVs, in hopes that they may prove instructive to other HDTV buyers.

My first HDTV is a 61" Samsung DLP-based rear projector with native 1,280 x 720p resolution. My second HDTV is a Hitachi 32" plasma. The Hitachi's resolution involves a 1,024 x 852 grid of pixels — greater than the Samsung's resolution vertically but less horizontally.

These two TVs share a common flaw. The nature of this fault is alluded to in a review of another, newer Hitachi plasma in the March/April 2005 The Perfect Vision. Randy Tomlinson writes in his review of the Hitachi 42HDX61 42" Ultravision (pp. 74-76), "Various older black-and-white movies on some of VOOM's Original movie channels were shown without a color tint — which is very rare."

Neither of my HDTVs can make the same boast. Black-and-white material on both of them usually takes on a slight greenish cast.

On the Hitachi, furthermore, how green the cast is depends on the setting of the Color control. When Color is set moderately high, there's less green in a black-and-white picture, oddly, than there is when Color is reduced to lower and lower values ultimately approaching zero. (But when Color is reduced all the way to zero, the greenish tinge completely disappears!)

The green cast on the Samsung is independent of the Color control. However, it's strongest on composite video and S-video inputs and almost absent on component video and DVI.

Selecting different color temperature settings on the respective sets makes little difference, except that in some cases the greenishness may be partly masked by a notable reddishness or bluishness, depending on the color temperature selected.


I take it that the greenishness is associated with the sets' grayscales. "Grayscale" is geek-talk for how precisely the red, green, and blue primary colors are balanced into a neutral gray. This optimum grayscale balance has to exist across the entire range of shades of gray between pure black and pure white.

Exactly which version of neutral gray the set is ostensibly balanced for depends on the color temperature setting the user selects. "Cool" settings produce bluer grays, while "warm" settings give redder grays. Supposedly, the "correct" color temperature setting produces a very slightly reddish gray; this color temperature is the one geeks call "D6500" or "6500° Kelvin."

In all this "science" of color temperature, there's no real provision for opting for greener or less-green grays. The reason is that the various "legitimate" color temperatures lie along a well-defined curve on a piece of graph paper. For this particular curve, the green component of gray is always cast in concrete. If a black-and-white image on a real-world TV gives grays that are too green (or not green enough), it's because the set is "calibrated" to a point on the graph paper that doesn't lie on the approved curve.

As a result, not only will black-and-white images look green around the gills, but color images will not be exactly as they are supposed to be.
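To make that concrete, here's a rough numerical sketch (in JavaScript, since that's what I use elsewhere on this blog) of what an off-curve grayscale looks like when measured. The D65 white point coordinates (x = 0.3127, y = 0.3290 on the CIE 1931 diagram) are standard; the measured values, the function name, and the "greenish" interpretation are just illustrative, not output from any real calibration tool.

// Illustrative only: compare a measured gray-patch reading against the
// D65 white point.
var D65 = { x: 0.3127, y: 0.3290 };

function grayscaleError(measured) {
  var dx = measured.x - D65.x;
  var dy = measured.y - D65.y;
  // Roughly speaking, a positive dy means the gray is shifted toward green.
  return { dx: dx, dy: dy };
}

// A hypothetical reading off a mid-brightness gray window:
grayscaleError({ x: 0.310, y: 0.345 });   // dy = +0.016: a visibly greenish gray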


Calibration is the key word here. As it comes from the factory, almost no model of HDTV is calibrated to give a uniform, spot-on-D6500, not-at-all-greenish grayscale at every brightness level it can display.

The reasons why these sets are not "properly" calibrated at the factory are murky. One of them seems to be that these sets are actually miscalibrated to look good under harsh, bright fluorescent lighting in video stores. In such environs, the sets have to be "cranked" to maximize light output. That usually means that one or two of the three primaries will "run out of gas" before the third does. Green, for instance, often doesn't have the dynamic "legs" of blue ... and red is even worse.

So green may be artificially boosted in the underlying grayscale to compensate for the difference between the most green and the most blue the set can crank out. Meanwhile, red is typically boosted in the circuits that decode an actual full-color signal, thus keeping flesh tones right. This boost to red is what geeks call "red push."

Hence, it's up to the finicky user to have the set re-calibrated. That usually involves paying a technician certified by the Imaging Science Foundation (ISF) to bring weird instruments into your home and use them to guide tweaks he or she will make to numeric parameters in the TV's hidden "service menu."


I haven't had my sets ISF-calibrated. I tried — oh, yes, indeed, I tried. I found a local TV repair outfit which offers the service. I called on the telephone for an in-home consultation. A professional phone-answering person was my contact. She wouldn't let me speak to the actual technician, nor leave a message to be called back by him personally. I was instead told to let my new Samsung DLP "settle in" for at least three months, in order that the calibration, when done, would be a proper and permanent one.

Fine, said I. But can I make an appointment in advance for a date three months from now?

No, I was told, just call back when the time comes. Which is what I did ... only to be told that (a) the ISF tech does his calibrations on Saturdays only, never on weekdays, and that (b) he was at that time all booked up for the coming eight weeks.

So I put my name on his queue, waited eight weeks, and at the very last minute was called by the professional phone-answerer to say that the tech had fallen ill and regrettably needed to take the weekend off.

I insisted on talking to the guy (for the first and only time). He was really sick, when he at long last called me back, so much so that his voice was a mere croak. I felt sorry for him ... but when I mentioned how long I'd been waiting for his services, and how much I really didn't want to go back on the very end of his 8-week-long queue, and how I thought he might make an exception in my case and do my calibration on a weekday — after he got well, of course — he stiffed me.

That was the last contact I had with him or his company.


Then I found a guy online who claimed he could walk me through self-calibrating my Samsung remotely. We went through a number of iterations in which he fed me several bunches of service-menu changes he'd successfully used on Samsung DLP's he'd actually calibrated in person.

Trouble was, none of these sets of changes made my picture any better. For the most part, they made it worse. I suspect the fact that he conspicuously wasn't succeeding in getting my TV dialed in aright eventually took a toll on his ego. After a while, he stopped answering my e-mails.

Having had such lousy luck getting the Samsung calibrated, I never even tried with the Hitachi.


There is, I understand, no guarantee that even the best ISF calibration will produce a perfectly neutral grayscale, anyway. Not with these newfangled fixed-pixel, non-CRT, "digital" displays.

In the old days, a TV with a color picture tube didn't really have to work all that hard to produce a uniformly neutral grayscale. The light-emitting phosphors were, after all, of fairly standard hues, red, green, and blue. If the circuitry driving the CRT was set up properly at one image brightness or "IRE" level, chances were all other IRE levels would fall right in line.

The circuitry was not digital, but analog. Furthermore, it did not have to do weird things to the input signal to compensate for the inherent non-CRT-like characteristics of the display. After all, it was a CRT, through and through.

Not so, plasma. Not so, LCD, DLP, LCoS, DILA, and all the other arcane letters in the digital display alphabet soup. These fixed-pixel digital devices typically don't have phosphors of the same exact colors as a CRT's (in the case of plasma), or they don't have phosphors at all (in all the other cases). Additionally, their operating characteristics aren't at all CRT-like. So they have to digitally process the input signal to hide these facts.

The digital signal processing (DSP) functions they use to do this are not perfect. Compromises are required. It's simply a matter of moving the compromises around to where they're least bothersome to the viewer ... which is why there are parameters that can be tweaked in the TVs' service menus.


But even with the most skilled tweaking by highly trained geeks wielding extremely expensive and sensitive instruments, results vary. I recently read in the "Technical Forum" portion of the January/February 2005 The Perfect Vision (p. 16) this response to a reader's question by the same well-versed Mr. Tomlinson:
[The] excessive green [in the ISF-calibrated picture] could be coming from two places. First, the ISF technician's color analyzer could have been inaccurate, and even the slightest errors toward green are instantly picked up by the eye. ... The second possibility lies in the color decoder adjustment. If that's done [by the technician] using color filters (the way we usually do it), there will be a green push to the picture. This green [unlike the first kind] will not show up on a black and white picture ... .
So color analyzers costing up to $15,000 can be inaccurate, says Tomlinson, as can the color filters used to gauge "green push." What are we paying these guys for, if the results do not necessarily come out spot-on accurate? Inaccurate is what we went in with!


I'm not saying fixed-pixel HDTVs are crap, or that ISF calibrations aren't worth it. I'm just saying that in this life we should expect to have to make tradeoffs. When we buy fixed-pixel, high-tech, exotic HDTVs, we should realize going in that they're not perfect. They may be really, really bright. They may have dazzling colors. They may be capable of stunning video resolution. But they have Achilles' heels.

For instance, few of them can render deep, satisfying blacks, or nuanced shadow detail. Many of them are guilty of "false contouring," where shades and colors that are supposed to grade smoothly into one another appear to have ersatz discrete bands. And most of them, apparently, have grayscales that, come hell or high water, are slightly tinted green or some other hue.

Some things you just have to live with.

Sunday, June 12, 2005

Picking an HDTV

In order to watch HDTV, you have (surprise, surprise) to possess an HDTV. (You also have to have at least one HD signal source ... but that's a topic for another time.) What kind of HDTV should you get?

As far as I'm concerned, your HDTV needs to have a widescreen aspect ratio, expressed numerically as 16:9 or 1.78:1. There are some so-called HDTVs that have a squarish 4:3 screen, like in the old days. When they receive a 16:9 program, they put black bars above/below the image. Many do so without losing vertical resolution, thanks to a clever signal-processing trick. But the image's frame is still scrunched, relative to the full height of the screen. Who needs that?

To be a true HDTV, your unit must be able to display either 1080i or 720p (or both). 1080i puts 1,920 pixels across the frame and 1,080 up and down. The odd-numbered pixel rows of each frame arrive in the first 1/60 second, the even-numbered rows in the next 1/60, so a complete frame takes 1/30 second. This is what is meant by "interlaced scanning," the "i" in 1080i.

720p frames are 1,280 pixels across by 720 up and down. All pixels are refreshed at once, every 1/60 second. This is "progressive scan," the "p" in 720p.

Not all true HDTVs can cope with the full 1,920-pixel horizontal resolution of 1080i (but all can at least render all 1,280 pixels in each 720p line). When 1080i is being displayed, the set's lesser native horizontal resolution may reduce the amount of detail present in the image somewhat. That slight reduction is not usually considered a major defect.
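For the numerically inclined, here's a back-of-the-envelope comparison of the two formats' pixel rates. It's only a sketch, assuming the usual 60 fields per second for 1080i and 60 frames per second for 720p, and ignoring film-rate material.

// 1080i: 1,920 x 1,080, delivered as two interlaced fields per frame,
// 60 fields per second. 720p: 1,280 x 720, 60 full frames per second.
var pixelsPerSec1080i = 1920 * (1080 / 2) * 60;   // 62,208,000 pixels/sec
var pixelsPerSec720p  = 1280 * 720 * 60;          // 55,296,000 pixels/sec
// Each format moves roughly 55-62 million pixels per second; 1080i trades
// temporal resolution (half the rows at a time) for extra spatial detail.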

Any so-called digital TV that cannot give the full resolution of 720p, but which has more than standard-def resolution, is an "extended definition television," or EDTV. EDTVs give a nice picture, but they're not true HD.

All true HDTVs, like EDTVs, have some way of inputting, as well as displaying, 1080i/720p signals. Those that are "HDTV monitors" or "HD-ready TVs" can't receive HDTV over-the-air broadcasts without an external settop box and antenna. Those that are marketed as HDTVs per se, without any further qualification, or those that say "HD built-in," can.

All HDTVs and HDTV monitors can receive 1080i and/or 720p signals from outboard gear like cable-TV boxes and satellite receivers — plus, a lot of DVD players now upconvert their outputs to 720p or 1080i.

These hi-def signals coming into the HDTV from external gear can be analog or digital. If analog, they will typically come in over three-headed "component video" cables. If digital, DVI and/or HDMI connections will frequently be used. Either of those digital connections should be capable of honoring HDCP copyright protection, or the HDTV might not be compatible with certain external gear.

That's basically it. Any TV made for the U.S. market which complies with those criteria is an HDTV. But which one should you buy?


There is, of course, no quick answer.

First off, you have to choose from four basic flavors: the flat panel, the rear projector, the front projector, and the direct-view TV. Of these, the last is the most familiar. A direct-view HDTV uses (gasp!) a "picture tube": a cathode ray tube or CRT. On the face of this tube appears a full-color picture ... sound familiar? It's just that the picture has a lot more detail (and is wider) than ever before.

A flat-panel TV is just that: a thin, attractive, light-producing slab, usually an LCD (liquid crystal display) or a plasma panel.

A front projector makes an image and throws it out onto a distant external screen, just like in movie theaters, enlarging it as it is projected. LCD and CRT technology are often in use here ... one tiny LCD panel or three separate picture tubes, red, green, and blue. Other technologies used for front projectors include DLP (digital light processing) chips. A front projector is the option of choice for the true "home theater."

A rear projector is just like a front projector except the screen is housed within the TV itself and is translucent. The picture is again produced by LCD, tri-CRT, or DLP technology, or something similar. With tri-CRT technology, the image is optically thrown onto the back of the translucent screen from three modest-sized cathode ray tubes, one red, one green, and one blue, inside the set. LCD and DLP rear projectors replace the three tubes with a single tiny digital image producer — or, in the case of very expensive DLPs, three separate "micromirror devices," one for each primary color.

Any of the technologies used in flat-panel, rear-projector, or front-projector displays — with the exception of the venerable CRT, that is — is a "fixed-pixel" technology. The image is made of tiny discrete sources of light — in DLP, they're actually individual micromirrors — not by an electron beam sweeping across a phosphorescent coating. Each fixed-pixel display has a fixed number of these light sources — which constitute the pixels themselves — lined up in the horizontal direction, and also in the vertical. These two numbers are the limiting factors on how hi-def the displays can be.

When it comes to CRTs, the upper resolution limits are not as easy to know. (I'll leave it at that.) No direct-view CRTs can produce all the fine detail of 1920x1080i, but some (ultra-pricey) front- and rear-projectors can. Many aficionados think CRTs give the best, smoothest, most natural video renditions, with the most accurate colors, the deepest blacks, and the greatest amount of shadow detail.


So, again, how do you pick an HDTV? Once you know which basic type you want, you can use my Viewing Angle Calculator and/or Viewing Distance Calculator to figure out how big an HDTV you need, based in part on your expected viewing distance and desired viewing angle.

Once you've reached this point in your deliberations, however, you may find you need to revise your thinking. For instance, if you decide you want a TV over 45" in diagonal measure, that pretty much lets out flat-panel LCDs, which may otherwise have been your first choice. Anything over about 34" will eliminate direct-view CRTs. And if you just want a modest-sized HD display for the kitchen counter, plasmas are decidedly too big.

Rear- and front-projectors as well as plasmas, on the other hand, can give you pictures that are pretty darn big ... for a price. The price is apt to be fairly reasonable for big-screen, CRT-based rear projectors and much steeper for all other truly large-screen displays and projectors.

You also need to make sure you physically have room for what you hanker to buy ... and all its outboard gear. This is generally a major problem only with the two types of projectors, front and rear. Direct-view CRTs are usually not all that big. Flat panels are quite space-efficient, even in the larger screen sizes, and can be mounted on a wall.

Rear projectors typically require a lot of depth clearance and a lot of height clearance. Those that are CRT-based usually house their three CRTs in the base, which means they take up a lot of space below the screen where you cannot put other gear. Non-CRT RPTVs are, on the other hand, getting thinner and thinner, and they usually sit tamely on a shelf above other gear.

Front projectors are a special case. They pretty much require a full-fledged home theater arrangement, in a dedicated room. The projector mounts on the ceiling, or at the back wall, or behind the (translucent) screen ... or sometimes it just sits on a raised table in the middle of the theater area. The other connected A/V gear is stashed elsewhere, often out of sight. The seats are positioned in front of the screen at an ideal distance from the projected image. The room is darkened for viewing, which means no exposed windows — and the result often outshines the local cineplex.


All that said, once again, how do you pick an HDTV? One way is to do a lot of research, read a lot of equipment reports and reviews. You can find a treasure trove of links to equipment reviews listed online at the eCoustics.com website. Lots of valuable online home-electronics reviews can be found at c|net.

You also need to devour the enthusiast magazines such as Sound & Vision, The Perfect Vision, Widescreen Review, and Home Theater.

You need to visit Circuit City, Best Buy, Tweeter, and so forth — the actual bricks-and-mortar stores. But take any technical information you get there with a grain of salt. On the other hand, you can trust your eyes a little more. If a display looks particularly good, investigate it. (But remember, displays that look good under intense store lighting may not look good at home. Then again, those that seem weak in the store may be superior at home under more restrained lighting conditions.)

Then there are the online forums, two of the best being the AVS Forum and the Home Theater Forum.

Also, pay attention to your friends' HDTVs. What kind(s) do they have? How do they like them? How do you like them? What strengths/weaknesses most impress/bother you?

Happy hunting!

Viewing Angle Calculator

Here's another calculator, somewhat like my Viewing Distance Calculator, except that this one computes the viewing angle in degrees for each viewing distance from 1 foot through 15 feet. You simply enter your diagonal screen size in inches, then click on the Calculate Now button. For each one-foot increment in viewing distance, the calculator displays the angle, in degrees, subtended at your eyes by the two sides of the screen. (The assumption is that the display is an HDTV with a widescreen aspect ratio of 16:9 or 1.78:1.)

VIEWING ANGLE CALCULATOR
Enter diagonal screen size in inches: ______
[Results: the computed viewing angle, in degrees, at each distance from 1 foot through 15 feet]
Viewing angles in the range between 26° and 36° are usually considered best for HDTV. For 1,920-horizontal-pixel 1080i HDTV, 31° gives you a seating position at which each pixel subtends 1 minute of arc on your retinas — the theoretical optimum, in terms of human visual acuity.

For 1,280-horizontal-pixel 720p HDTV, the theoretical optimum viewing angle is 21.5° — but you can probably move closer with no noticeable ill effect.
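For the curious, the math behind the calculator boils down to a couple of lines. What follows is just a sketch of the same trigonometry (with a function name of my own choosing), not the actual script embedded in this page.

// For a 16:9 screen, width is about 0.87 times the diagonal. The angle the
// two sides of the screen subtend at distance d is 2 * atan((width/2) / d).
function viewingAngleDegrees(diagonalInches, distanceFeet) {
  var widthInches = diagonalInches * 0.87;
  var distanceInches = distanceFeet * 12;
  return 2 * Math.atan((widthInches / 2) / distanceInches) * 180 / Math.PI;
}
// Example: viewingAngleDegrees(61, 8.3) is about 30°, while at 12 feet the
// same 61" screen subtends only about 21°.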

Saturday, June 11, 2005

Viewing Distance Calculator

Here is the viewing distance calculator I mentioned in More on Viewing Distance, in which I extolled the riveting, electrifying, immersive experience of watching HDTV from a position close enough to the screen to produce a "viewing angle" of somewhere around 30°:

VIEWING DISTANCE CALCULATOR
Enter screen diagonal size in inches: ______
Enter viewing angle in degrees: ______
Recommended viewing distance: ______ feet
Recommended viewing distance: ______ screen heights
The calculator's purpose is to let you enter the diagonal screen size in inches of a 16:9 HDTV display along with a desired viewing angle, the angle in degrees subtended at your eyes by the two sides of the screen. Various experts recommend viewing angles such as 26° or 30° or 33° or 36° for full-fledged 1080i or 720p HDTV.

If you are lucky enough to have a TV that offers true 1080i resolution, with 1,920 pixels worth of detail across the screen, you may want to use 31 degrees as your viewing angle. It's the angle that makes each pixel subtend an arc of 1 minute on your retina — the smallest angle discernable by the eye. It requires a TV-to-you viewing distance of 3.2 screen heights.

For those of you with 720p resolution — 1,280 pixels horizontally — the equivalent viewing angle is 21.5 degrees, and the number of screen heights of distance from you to the screen will be 4.7. In other words, you will need to sit further back from the screen than with 1080i, if you don't want to be looking at pixels bigger than the smallest dot your eye can distinguish. (When I'm watching actual hi-def material, I personally do not find sitting closer to my own 720p monitor than 4.7 screen heights bothersome. So I recommend pretending you have true 1080i resolution, even if you don't.)

So enter your screen size and a desired angle, and when you click on the "Calculate Now" button, the calculator will tell you what viewing distance, expressed in feet and also in multiples of the screen's height, will produce that angle.
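The underlying math is simple enough to show here. This is a bare-bones sketch of the calculation, with names I've made up for illustration, not the contents of calcvdnew.js itself.

// For a 16:9 screen, width is about 0.87 x diagonal and height about
// 0.49 x diagonal. The distance that makes the screen subtend a given
// angle is (width/2) / tan(angle/2).
function viewingDistance(diagonalInches, angleDegrees) {
  var widthInches  = diagonalInches * 0.87;
  var heightInches = diagonalInches * 0.49;
  var halfAngle = (angleDegrees / 2) * Math.PI / 180;
  var distanceInches = (widthInches / 2) / Math.tan(halfAngle);
  return {
    feet: distanceInches / 12,
    screenHeights: distanceInches / heightInches
  };
}
// viewingDistance(61, 30) comes out to roughly 8.3 feet, or 3.3 screen heights.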

For my own future reference, here is how I managed to incorporate the calculator in this Blogger post. ("{" and "}" replace "<" and ">" in what follows.) First of all, Blogger does not permit the text of JavaScript scripts to appear right in the HTML of a post. But JavaScripts can appear between the {HEAD} and {/HEAD} tags in a Blogger template. So I put this there, just before {/HEAD}:

{!-- This is where JavaScript scripts go: --}
{div id="scripts"}

{script type="text/javascript" src="http://home.comcast.net/~epstewart/calcvdnew.js"}
{/script}

{/div}
{!-- This ends where JavaScript scripts go --}
I put the actual JavaScript code in the external file http://home.comcast.net/~epstewart/calcvdnew.js. (Actually, I created it locally on my Mac using TextEdit, then I uploaded it to my Comcast Personal WebPages using Fetch.)

Finally, I created a {TABLE}, within a {FORM}, within a {CENTER}'ed block. {TD} details in the table were specified in such a way as to interact with the (external) JavaScript function which I named, without much originality, compute(form).

For example, one of my table rows said:
{TR align=center}
{TD colSpan=2}{INPUT onclick="compute(this.form)" type=button value="Calculate Now"}{/TD}
{/TR}

It created the "Calculate Now" button in such a way as to cause it, when clicked on, to invoke the compute(form) JavaScript function.

Other table rows contained JavaScript-compatible language to either obtain as input or report as output the appropriate numeric values. These values matched variables defined in compute(form) in http://home.comcast.net/~epstewart/calcvdnew.js.
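For illustration, an input row and an output row might look like the following, using the same {}-for-angle-bracket convention as above. The field names diag and feet are made-up examples, not necessarily the ones in my actual form; inside compute(form), the script reads form.diag.value, does the trigonometry, and assigns the result to form.feet.value.

{TR align=center}
{TD colSpan=2}Screen diagonal (inches): {INPUT type=text name="diag" size=6}{/TD}
{/TR}
{TR align=center}
{TD colSpan=2}Viewing distance: {INPUT type=text name="feet" size=6} feet{/TD}
{/TR}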

Note that I actually had to edit my table HTML to remove all line breaks. The result is unsightly, but it keeps browsers from adding spurious blank lines above the displayed table, one for each line break embedded in the table.

I set things up this way so I could, if I want, come back at a future date and make more Blogger post-accessible JavaScripts. I would create and upload the script as a .js file on my Personal WebPages. I would insert a reference to that file in my template. Then I would enter as HTML in a regular post the necessary {FORM} and {TABLE} stuff to make use of the script.

Friday, June 10, 2005

More on Viewing Distance

I've come up with a handy-dandy calculator which lets you input your diagonal screen size in inches and desired viewing angle in degrees. It spits out the appropriate viewing distance from your 16:9 HDTV screen, in feet and in screen heights, to achieve that viewing angle (i.e., the angle subtended on the retina by the full width of the TV screen). My single-purpose calculator was derived from the more elaborate one by C.M. Collins on this web page. Some folks may find mine easier to use.

The reason I came up with it is that I want to update what I originally suggested in On Viewing Distance. I said there that, however nice it might be to sit approximately 3 screen heights away from a 1080i image or 4.5-to-5 screen heights away from a 720p image, a 6x-to-7x seating distance was more doable in terms of "the exigencies of space, cost, and furniture arrangement." And I intimated that the compromise was perfectly OK.

Then last night I made a liar of myself. I was watching the DVR recording I'd made of HBO HD's The Matrix Revolutions. On impulse, in mid-movie I dragged a chair to the 30° spot that put my retinas roughly 8.3 feet (3.3 screen heights) away from my 61" screen. Plopping myself down in that "sweet spot" made all the difference. It turned a pallid viewing experience into an electrifying one.

The Matrix Revolutions is the final installment in the Wachowski brothers' Matrix trilogy. The original movie met with great applause, the second with much less enthusiasm, and the third with even less. The Matrix "world" is nonetheless one that, obviously, thrills a certain minority of moviegoers. I must say I'm not one of them. Yet when I watched the reputedly weakest of the three Matrix films in hi-def with a 30° viewing angle in my living room, I was swept away.

Whatever else you say about it, Matrix 3 has some pretty cool CGI battle scenes. They're primarily what won me over to the 3.3x/30° camp. I was immersed in the action to an extent I've never experienced at home before.

The psychology was weird. It was late and I was tired, so I was half tempted to shut down for the night and pick up on the morrow ... but I simply couldn't. I was in the grip of the time flow of the movie, and until it ended I knew I wasn't going anywhere. Part of me said, "C'mon. At least, hit pause. Break the spell." The rest of me said, "Not on your life."

I realized then that I had stumbled on the crucial difference between "watching TV" and "experiencing home theater." After all the pieces are in place — a 16:9 HDTV monitor, a 5.1 Dolby Digital sound system, an HD signal source — it's still not "home theater" until you achieve something close to the SMPTE-recommended viewing angle of 30°.

Note that the 3.3x/30° viewing distance/angle is, strictly speaking, right only for 1080i, not for 720p (which is what I actually have). As I said in On Viewing Distance, you need to get that close to 1080i to avoid compromising the ultra-fine-grain "retinal resolution" which the format offers. For 720p, though, 4.5x-to-5x gets you all the "retinal resolution" that format has to offer. Sit any closer and you invite "video noise, artifacts, and poor-quality low-resolution sources" to ruin your day. Moreover, your eyeballs are supposedly apt to notice, and complain, that they're not getting full 1080i resolution.

Well, none of those things happened. The 1080i of the HBO HD transmission was converted to 720p by my cable box and exported to my Samsung DLP via DVI. On its screen I saw no noise, no artifacts, and no telltale signs of compromised video resolution. So I sat there wondering whether a true 1080i display would have given me any more convincing an experience than I was already having. My best guess was that it wouldn't.

Thursday, June 09, 2005

On Viewing Distance

"Resolution starts (and ends) with the eye," writes Geoffrey Morrison in "GearWorks: Viewing Distance vs. Resolution," covering an important topic in the January 2005 issue of Home Theater magazine (pp. 38-39). "If you're one of the lucky few with 20/20 vision ... your eye can discern one-sixtieth of a degree of arc ... ." So "objects roughly 0.067 inches wide" are the smallest we can distinguish at 20 feet.

At a more typical screen-to-retina viewing distance of 10 feet — half of 20 — we need objects to be at least 0.067 ÷ 2 = 0.033 inches wide in order to be able to distinguish them. At 10 feet, an object 0.033 inches wide subtends one-sixtieth of a degree: one minute of arc.

The "objects" Morrison has in mind are screen pixels. If we are sitting 10 feet from a TV, the smallest pixels we can discriminate from adjacent pixels of sufficiently different light intensity are 0.033 inches wide. If we move up to 5 feet away from the same screen, then we can "see" individual image pixels that are only 0.033 ÷ 2 = 0.0165 inches wide.

My Samsung 61" DLP rear projector has a 720p-based pixel grid that is 1,280 pixels wide by 720 pixels tall. To find the width of a single pixel, I can simply multiply the screen's diagonal measurement, 61 inches, by 0.87. (For calculator geeks, 0.87 is cos[tan-1(9/16)], 9/16 being the inverse of the screen's 16:9 aspect ratio.) Then I divide that result by 1,280, the number of pixels of horizontal resolution the screen provides.

It all comes to [(61 * 0.87) / 1280], giving my Samsung a pixel width of approximately 0.042 inches.


How far back from the Samsung can I sit before those 0.042-inch pixels begin to fuzz together? The formula for maximum viewing distance (MVD) as a function of pixel width (PW) is:

MVD = PW / tan(1/60°)

Calculators ready? Mine reports that tan(1/60°) is 0.0002909, so:

MVD = PW / 0.0002909

When PW is 0.042 inches, the formula gives an MVD of roughly 141 inches, or about 12 feet, give or take.
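Here's the whole calculation in one place, as a few lines of JavaScript. It simply restates the arithmetic above, with my Samsung's numbers plugged in; swap in your own diagonal and pixel count as needed.

var diagonalInches = 61;
var horizontalPixels = 1280;

var widthInches = diagonalInches * 0.87;            // about 53.1"
var pixelWidth = widthInches / horizontalPixels;    // about 0.042"

var tanOneArcMinute = Math.tan((1 / 60) * Math.PI / 180);   // 0.0002909
var mvdInches = pixelWidth / tanOneArcMinute;               // about 142"
var mvdFeet = mvdInches / 12;                               // about 12 feet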


What would be my maximum viewing distance if my 61" TV were not 720p-native but 1080i? Well, 1080i puts 1,920 pixels across the screen instead of 1,280. 1,920 divided by 1,280 yields 1.5:1, or 3:2. So the 1080i pixel width would be the inverse of that, 2/3, times the width of the 720p-style pixels.

Which means I'd have to sit at 2/3 of the 720p viewing distance to avoid any loss of what I like to call "retinal resolution": 8 feet.


The lesson here is that for any screen size and horizontal resolution there is a maximal viewing distance beyond which "retinal resolution" suffers. This maximal distance can be expressed in screen heights, rather than inches or feet, thus factoring out the specific diagonal screen measure in inches.

For my 61" 720p-native TV, the screen height is 61 x 0.49, which is about 30 inches. (0.49 is sin[tan-1(9/16)].) My maximal viewing distance, brought down from above, is 141 in. Dividing that figure by the screen's height in inches, I get a maximal viewing distance of 4.7 screen heights.

If the TV were 1080i, my maximum viewing distance would be 2/3 that, or 3.1 screen heights.
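Expressed as code, continuing the sketch above (the 141-inch figure is carried down from the 720p calculation):

var heightInches = 61 * 0.49;                      // about 30" for my 61" set
var mvd720pHeights = 141 / heightInches;           // about 4.7 screen heights
var mvd1080iHeights = mvd720pHeights * (2 / 3);    // about 3.1 screen heights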

So if you get much further away from a 1080i display than 3 screen heights — a "3x seating distance" — you start to lose "retinal resolution." But 720p is more forgiving. It lets you back up to a 4.5x or 5x viewing distance.

There is a handy-dandy calculator for some of these values on this web page. (And on this page there is yet more discussion of the viewing distance topic.) The calculator tells me on its very bottom line that, at 12 feet, my 61" 16:9 display is too far away, under the assumption that I want to see a fully resolved 1080i image. It implies I ought to move up to an 8-foot viewing distance.

It doesn't tell me that 12 feet is just right for 720p, but I can tell this by multiplying 8 feet by 3/2.

The calculator also tells me my "viewing angle," which is the angle subtended by the full width of the screen when I sit at my customary 12-foot viewing distance, is 20.9 degrees. In his article, Morrison says the best full-screen viewing angle is either 30 or 33 degrees, depending on which of two sentences you read. The handy-dandy web calculator says that in order for me to get an ideal 30-degree viewing angle, I'd have to sit 8.3 feet away from my display ... which is basically the same viewing distance required for 1080i resolution.
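Those two numbers are easy to double-check with the same trigonometry used earlier. This is just a sketch of the arithmetic, not the web calculator's actual code.

var widthInches = 61 * 0.87;    // about 53.1"

// Viewing angle at my customary 12-foot (144-inch) distance:
var angleAt12ft = 2 * Math.atan((widthInches / 2) / 144) * 180 / Math.PI;   // about 20.9°

// Distance needed for a 30-degree viewing angle:
var distFor30deg = (widthInches / 2) / Math.tan(15 * Math.PI / 180) / 12;   // about 8.3 feet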

Moral: when we move up to "just the right" (3x) seating distance from a 1080i display, we get ideal "retinal resolution" and, furthermore, we get the best viewing angle.

Or if, with a 720p display, we back up to 4.5-to-5 screen heights away — tolerable with 720p — we lose no "retinal resolution." However, we compromise the viewing angle that brings (in Morrison's phrase) "optimum viewer enjoyment."

But Morrison also points out that the material we're watching need not have "anything close to that level of detail" — 1,920 distinct pixels across the screen. Moreover, "video noise, artifacts, and poor-quality low-resolution sources ... are all too noticeable at [such] close distances." So he recommends we adopt a blanket 5x optimum seating distance, even for 1080i displays.

Truth be told, though, I personally find even 5x tough to achieve, given the exigencies of space, cost, and furniture arrangement. I actually sit more like 14-15 feet, not 12, from my Samsung. That's more like 6x than 5x.

Wednesday, June 08, 2005

HBO-HD's Mystic River: Missing "OAR"

I recently "taped" (actually, recorded to my cable DVR) HBO HD's cablecast of director Clint Eastwood's 2003 noir thriller Mystic River and then watched it yesterday. I just so happened to have the DVD of the same movie on hand from Netflix for comparison purposes. The thing that leapt out at me is that the transfer HBO used is "zoomed in" to fill the entire height of a 16:9 (or 1.78:1) screen. The original aspect ratio of this Panavision film was 2:35:1, which the DVD honors with an anamorphic transfer that puts black bars at the top and bottom of the screen.

I toggled back and forth and was surprised to find that the full-screen zoom on HBO made for (in my humble opinion) a noticeably ugly composition, compared to the anamorphic letterbox on the DVD. To put the same sentiment in reverse, the actual original-aspect-ratio composition of the film's shots on DVD was as aesthetically pleasing as the HBO 16:9 crop was not.

I say I was surprised, because I don't usually consider myself a purist about such things.

When you think about it, the 2:35-to-1.78 crop gives you fully 75% of the scene, with the remaining 25% shaved off at an average of just 12.5% per side ... though I'm sure the film transfer can be panned slightly within those confines to take a little more off one side than the other, when need be. How important can twice 12.5% be?

As it turns out, quite important.

It's not that anything crucial to seeing what's going on is lost. Rather, it's that the framing of a film shot "says" something crucial — something ephemeral, something subliminal — that gets trampled on by the crop.

That's why home theater aficionados are always nattering on and on about preserving OAR: the original aspect ratio. My side-by-side test with Mystic River, the HD cable version vs. the DVD, says they're absolutely right. Even though the DVD looks noticeably softer because it's not hi-def, it still provides the more pleasing viewing experience.


So why is HBO using a zoomed-in 16:9 transfer? The most likely reason is that there are so many customers who resent the letterbox bars. "I spent a kazillion dollars on this big-screen plasma TV," many probably say, "and I want every pixel lit."

That, and black bars can cause screen burn-in, an uneven dimming of the pixels that do get lit vs. the ones that stay dark and never age. Supposedly the bane of plasma sets, it's less problematic for other TV types.

My attitude is, the TV itself should offer a Zoom function or mode that will optionally excise the black bars, so the program source doesn't have to. Unfortunately, all too many TV sets' Zoom modes inexplicably disappear when 720p/1080i HD is input. That's true of both my Samsung DLP rear projector and my Hitachi wall-mounted plasma flat panel.

So we urgently need HDTVs and monitors that don't disable hi-def Zoom. It can be done. I'm presently ogling a Sony KD-34XBR960 Direct View CRT for my bedroom that enables all aspect modes for all inputs.


HBO and cable companies, as well as consumers in general, might also note that more and more of us are watching DVDs, anamorphic or otherwise, via hi-def digital connections to our HDTVs. A number of high-end DVD players can upconvert any DVD's content to 720p or 1080i at their digital DVI or HDMI output.

This doesn't improve picture detail, but it still gives a better picture because the video stays in the digital domain all the way from disc to TV. No digital-to-analog conversions degrade the picture. But when the typical HDTV monitor receives the digital video input, it lacks the ability to Zoom away whatever black bars there are. And plasma HDTV owners and others are putting up with that.

Black bars are not the end of the world, even for plasma owners. The cable industry ought to consider leaving all movies in their original aspect ratio, black bars or no.

For one thing, the black bars make the video easier to compress. MPEG compression takes advantage of redundancy in the picture — such as in unrelieved black bars on the screen — permitting itself a lower output bitrate from the compressor. The same picture quality can be achieved at a lower number of megabits per second.

Lower bitrates in turn mean a channel, hi-def or not, takes up a smaller slice of a digital slot in the cable bandwidth spectrum. (See HDTV and Cable Systems for more on how cable slots are allocated.) The arithmetic here is simple. Each digital slot in the cable system's bandwidth "shelf space" can hold 10-12 SD channels or 2-3 HD channels. It's more likely to be able to use the higher number, with no picture degradation, if each channel's actual bitrate is lower than its maximum theoretical bitrate — which is exactly what happens when the MPEG encoder can take advantage of lots of black-bar redundancy.
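The arithmetic, roughly, looks like this. The payload figure is approximate (the sources I cite put it at 38-39 Mbps per 256-QAM slot), and real systems use statistical multiplexing, so treat these as ceilings rather than guarantees.

// Rough channel-packing math for one 6-MHz, 256-QAM digital cable slot.
var slotMbps = 38.8;
var hdMbps = 19.2;    // nominal MPEG-2 HD stream
var sdMbps = 3.5;     // typical MPEG-2 SD stream

var hdPerSlot = Math.floor(slotMbps / hdMbps);   // 2 (3 only if the HD streams run lighter)
var sdPerSlot = Math.floor(slotMbps / sdMbps);   // 11, i.e. the "10-12" range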


This is not a matter on which everyone will agree, unfortunately. It's the longstanding letterbox-vs.-pan & scan dispute brought up to date.

Before the home video era began back in the 1970s, there was no such thing as letterboxing. Purists began to howl about the cropping of movies on cable and VHS, and when the now-defunct laserdisc was introduced, widescreen movies were presented in a letterbox transfer for the first time. Unsurprisingly, the same "correct" film-to-video transfers began to invade premium cable and VHS ... and run-of-the-mill non-purists began to fume.

After all, TV screens were small and squarish back then, with marginal vertical resolution. Letterboxing made the picture yet smaller and sacrificed much of what little vertical resolution there was.

Digital TV's designers had that complaint in mind when they agreed on a 16:9 screen aspect ratio: not quite wide enough for most movies, but the lingering presence of (thinner) letterboxing bars was ostensibly to be made up for by larger, wider screens with greater vertical resolution.

Still, there remains a psychological aversion to letterboxing among many TV consumers. A lot of folks feel they're getting cheated if every pixel isn't lit.

Perhaps the answer is multiple transfers, one for the every-pixel-lit crowd and one for the OAR crowd. HBO could alternate between the two, since they show each movie umpty-ump times a week anyway. People could pick the version they prefer, record it to their HD DVRs, and watch it whenever.

Or am I being too pie-in-the-sky here?

HDTV and Cable Systems

It seems like all of a sudden I'm into cable HDTV ... which brings up the question of how HDTV is even compatible with cable. I've found an article online, "What It Takes to Get HDTV on Cable," by Leslie Ellis, an excellent tech writer, which explains some of it in layman's terms.

Ms. Ellis wrote the article almost three years ago, when cable operators were (albeit reluctantly) facing the fact that they would have to carry HDTV, which even with MPEG-2 compression of the digital video stream requires a data rate of 19.2 megabits per second (Mbps). That compares with 3.5 Mbps for a standard-def channel.

Each cable transmission channel — I'll call it a "slot" to distinguish it from a video channel of the type we're more familiar with — has a 6-megahertz bandwidth, but cable operators can squeeze multiple video channels such as HBO or TNT into one digital cable slot. (Not so, analog slots — see below.) And there are many 6-MHz slots on the cable system.

The cable operators — the biggest, such as Comcast, are the "MSOs," for "multiple-system operators" — use a technique called quadrature amplitude modulation, or QAM, to do this. Modulation is the way digital video and audio streams are piggybacked onto the carrier frequency for the slot. "QAM" rhymes with either "mom" or "ma'am."

Specifically, the method called "256 QAM" shoehorns a total data rate of 39 Mbps into one 6-MHz transmission channel. That translates to two 19.2-Mbps HDTV video channels per slot.

Or in some cases, writes Ellis, "three HD channels can slip into a 256-QAM channel, depending on the source material." Translation: not all HDTV channels make full use of the 19.2 Mbps they are allotted. The reason may be that the video image they are transmitting doesn't contain a lot of detail to begin with. Or it may be that they are filtering out "excess" detail or removing it by means of overcompression.

Ellis compares the temptation to squeeze too many HDTV channels into one 6-MHz slot with

... the early days of video compression, when 24 channels of video were going to fit snugly into one 6 MHz channel ... That much snug affected picture quality, which made content creators grimace. Now, most operators don’t push more than 10 or so SD channels into one 6 MHz channel.

QAM is, as I understand it, not used to modulate the digital signal onto a carrier for over-the-air transmission, just for cable. For OTA a different method is used, called 8-VSB. "VSB" stands for vestigial sideband. It's another modulation scheme entirely.

A digital tuner in a TV or settop box may be designed to receive QAM, or it may be 8-VSB, or in some cases it may be both. Some HDTV sets contain an onboard 8-VSB tuner and also a physical slot that accepts an optional CableCard. The latter is a small card that contains a QAM tuner and allows digital cable reception without a settop box.


Ms. Ellis's entire "Translation Please" archive can be accessed here. These are columns Ellis originally wrote for the Broadband Week section of Multichannel News. Her column titled "Why 6 MHz Channels Take up 6 MHz: Part 1" sheds more light on all this.

If the entire 750-MHz bandwidth of the cable system were a shelf 750 inches long, each 6-MHz slot would occupy 6 inches. The first 54 inches (equivalent to 9 slots, by my calculation) are reserved for miscellaneous uses. Then comes the "analog shelf space," ending at inch 550. This analog-only space can hold around 75 channels. I gather there can be but one analog channel per slot.

Next, the "digital shelf space" consists of around 33 slots, each holding around 10-12 standard-def digital TV channels.

True, when you divide 750 by 6, you get 125 potential 6-MHz slots — and 9 miscellaneous + 75 analog + 33 digital only comes to 117. Ellis doesn't explain that discrepancy. But she does mention that there's really no big reason the digital space needs to be carved up into tidy 6-MHz slots at all, a bandwidth relic of analog days. Digital gurus want to know not about bandwidth in MHz, but data rate in Mbps.

In concept, the "digital shelf space" could be carved up differently, or used as one big source of bitrate. There is something in the works called "DOCSIS 3.0, also known as 'wideband DOCSIS' and 'channel bonding'," that addresses this possibility. Meanwhile, it's still 6 MHz per digital slot.


I should note, in this regard, that my local Comcast outfit has notified its customers that it's about to begin a shift to all-digital. I assume this means that at a date uncertain in the future, there will be no analog slots left. All the current analog channels — many of which are picked up by Comcast from a satellite in digital form and converted to analog — will change to digital.

That will mean every cable customer will need either a digital cable box or CableCard for each TV, I gather. No more running the cable feed directly into the TV's "cable ready" tuner.

I assume the present "analog shelf space" will gradually shrink in favor of more "digital shelf space," in this scheme. Comcast says it will be shuffling the channel lineup around as it converts. I imagine they always want to consolidate the remaining analog channels at the low end of the shelf. The space thus freed up will be converted to digital slots. And I'll bet DOCSIS 3.0 — whatever it actually may be — is providing some sort of road map for Comcast's conversion-to-all-digital process.

Ellis also says, a wee bit cryptically, "A digital channel earmarked for broadband data ... moves at 38 Mbps per 6 MHz channel, sliced according to downstream speed ceilings." I assume the "downstream speed ceiling" for HDTV is 19.2 Mbps. That's why two HDTV channels can go in one digital slot without further tweaks.

It seems intuitively obvious that eliminating the artificial and unnecessary 6-MHz chunks for digital channels, to the extent that that is possible, can do nothing but make it easier to blend more HD and SD digital channels into the one "big fat pipe" (as Ellis calls it), without compromised picture quality. If that's what DOCSIS 3.0 is about, I'm all for it.

And I'm all for cable going all-digital, even if some customers need added hardware to accommodate it. Digital channels simply look a whale of a lot better than analog channels, though they are still just standard-def.

Sunday, June 05, 2005

30-Second-Skip on Motorola DCT6412

Offering a lot of info and tips on my particular model of HD DVR cable box, this webpage says 30-second-forward-skip on the Motorola DCT6412 is possible!

Here's how:

How To Add 30-Second Skip

The following technique can be used to map an unused or unneeded button on the "silver" remote to the 30-second skip command. Current versions of the iGuide DVR software will skip forward 30 seconds into a recording when this command is sent. The '15-second back' button can be a good choice, since PgDn already provides that functionality.

  1. Press the "Cable" button at the top of the remote to put it into Cable Box control mode.
  2. Press and hold the "Setup" button until the "Cable" button blinks twice.
  3. Type in the code 994. The "Cable" button will blink twice.
  4. Press (do not hold) the "Setup" button.
  5. Type in the code 00173.
  6. Press whatever button you want to map the 30-second skip command to.

To get rid of it:

How To Restore a Remapped Button

The following technique can be used to restore the original function to a button that has been remapped.

  1. Press the "Cable" button.
  2. Press and hold the "Setup" button until the "Cable" button blinks twice.
  3. Type in the code 994. The "Cable" button will blink twice.
  4. Press (do not hold) the "Setup" button.
  5. Press the button you are restoring twice.

(If the above doesn't work, retry and skip step 4 (do not press the "Setup" button))

I tried it with the apparently functionless "- Day" button (the one with a white "B" in a blue square). It worked, but then I realized the button does have a function ... to go back a day in the Program Guide. So I tried the restore procedure, which also worked (after I realized that the double-press in step 5 had to include a pronounced pause between the two presses).

So then I tried reprogramming the "Lock" button (gray "A" in yellow triangle), since I have no use for its function (which is to set up a 4-digit PIN to block access to certain features and programs). This button is conveniently near the skip-back-15-seconds button, so I can jog forward and back quite easily.

The "Lock" button also worked fine as a skip-forward-30-seconds button.

This is a great commercial-skip feature, which of course is why it's hidden. I have a likewise-reprogrammed skip-forward-30-seconds button on my DirecTV-with-TiVo receiver. Getting through the long commercial stretch before "Final Jeopardy" is a matter of some seven quick button presses!

Making Hi-Def Archival Copies

In Motorola DCT6412 HD DVR Cable Box I talked about the high-definition cable box-cum-DVR Comcast supplies to its customers here in Baltimore County, Maryland. One thing I didn't mention is that it sports a FireWire connector. That means it can possibly be used to export copies of digital TV programs!

I got the idea from a letter in the January/February 2005 The Perfect Vision magazine (p. 10; not available online). Reader Chris Teague writes that he (she?) archives HD material from a similar Motorola box to a D-VHS videocassette recorder.

An item on p. 16 of the March/April issue also mentions using FireWire to copy HD and other digital TV from a Sony cable box to D-VHS, and also to an external RCA hard-drive recorder.

What's going on here? Isn't that sort of thing supposed to be impossible?


Not, apparently, with FireWire. FireWire is a type of digital interface which can transport digital video, along with digital audio, from one device such as a cable box to another such as a D-VHS recorder. It's also used to connect PCs and Macs for transmitting data to/from peripherals. Special FireWire connections allow home movies to be copied from camcorders to PCs for editing. Other names for FireWire include IEEE 1394, i.LINK, and DTVLink.

FireWire (FW) differs from the Digital Visual Interface (DVI). FW uses different cables and connectors than DVI. FW carries only digitally compressed video and audio, while DVI carries only uncompressed digital video, and no audio. FW is two-way and its devices can be elaborately networked, while DVI is one-way and cannot.

Both DVI and FW are copy-protected. Each has its own protection scheme, HDCP for DVI and DTCP or "5C" (because five companies designed it) for FW. The two schemes have certain features in common. Basically, in either scheme, when a digital video stream is to be sent from a source device to a receiving device, both devices have to prove their bona fides as authorized units that are guaranteed by their manufacturers to play by the rules.

But FW's 5C copy-protection scheme adds "copy control" to the protection used for DVI. DVI's uncompressed digital video uses data rates too high to be copied by any consumer device, so copy control is a non-issue. But D-VHS recorders and certain other types of devices can record the compressed video (and audio) streams associated with FireWire.

So the FW source device plants a digital "flag" in the output stream, based on a similar flag it finds in the content it is outputting.

Content providers of over-the-air digital TV must, by law, flag their content "copy freely" — even when it comes to you courtesy of a cable system or satellite.

Pay-per-view and video-on-demand content is flagged "copy never." (This may be why my Motorola box can't record this content.) "Copy never" content can be viewed over a FireWire connection, but not recorded.

In between these extremes, "linear" (i.e., not user-selected) content on premium channels such as HBO bears a "copy once" or "copy one generation" flag. That lets you make an archival copy. The copy itself is marked "no more copies," so it cannot be duplicated.
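Put in code form, my understanding of the flag behavior looks something like this. It's a simplified sketch of the rules as described above, with names of my own invention, not the actual DTCP implementation.

// How a 5C-compliant recorder treats the copy-control flag on incoming content.
function recordOverFireWire(contentFlag) {
  switch (contentFlag) {
    case "copy-freely":
      return { recorded: true, flagOnCopy: "copy-freely" };
    case "copy-once":
      // Recording is allowed, but the copy itself is downgraded.
      return { recorded: true, flagOnCopy: "no-more-copies" };
    case "no-more-copies":
    case "copy-never":
      // Viewable over the FireWire link, but the recorder must refuse to record.
      return { recorded: false };
  }
}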

Supposedly, the 5C standard does make provision for a "no more copies" archival copy to be moved — that is, copied to another recording device, with the original recording then being erased or deleted. But as far as I know, the "move" function remains dormant in all current 5C-compliant devices. (See the January/February 2005 issue of The Perfect Vision, pp. 53-54, for a bit more on this.)

My question is, accordingly, can I even make archival copies of hi-def movies airing on HBO and other premium channels? Very likely the answer is yes, with this proviso: I doubt I can copy the version recorded on the DVR.

That is, I imagine that with a FireWire-connected device I can externally record "copy once" content as it is being transmitted. But I assume that, as it is recorded on the DVR, it's flagged "no more copies."

But that assumption is subject to doubt. Here is a thread on the AVS Forum in which a poster claims that, by rights, "recording to the Hard Drive of the [Motorola 6412] STB does NOT count" against the "copy once" flag. True, the poster was in fact unable to copy HBO-type material to his D-VHS recorder, suggesting he may have been wrong. Moreover, he failed to follow up after being asked whether the same problem cropped up with realtime HD content, or with presumably un-flagged material on INHD. So in my mind the whole issue remains unresolved, and I can as yet find no further information on the Web that would resolve it.


Another possibility is to record the cable box's realtime and/or recorded HD or SD content as sent via FireWire to a computer. I have a Mac, and there is information about how to do this sort of thing here. Also here. And also here.

One of the caveats here is that the content you want to record on your computer cannot be encrypted or scrambled. Encryption is separate and distinct from copy-control flagging. It scrambles the digital TV data in a way that (presumably) only an authorized receiving device can cope with. A Mac or PC is not an authorized device that knows how to decrypt scrambled cable channels.

Another caveat for us Mac'ers is that Mac OS version 10.3.x (Panther) must be used. No earlier versions will work ... and it's not absolutely clear that the latest-and-greatest version, 10.4.x Tiger, works either.

I have two Macs: an iMac running Panther and a now-ancient PowerBook laptop running 10.2 Jaguar. The latter is so sluggish, I'm not sure I want to burden it with Panther (or Tiger). Its processor is too slow for importing HD video, and its hard drive is too small.

The iMac would probably work ... if I had a lot more free disk space. But it's on the top floor and the Motorola is on the ground floor. Connecting it (or lugging it) would be a hassle.

If I were to do this, I'd clearly need to replace the old PowerBook with a new one with a lot of horsepower and disk space.


But I don't really fancy the "computer option." At best, it would be true HD in name only, since I know of no way to save any significant amount of HD video as such without chewing up however much disk space I have. I figure maybe 5-to-10 GB per hour of HD material would be needed.
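A quick sanity check on that guess, assuming the full 19.2 Mbps HD stream gets written straight to disk with no overhead:

var mbps = 19.2;                              // nominal MPEG-2 HD bitrate
var gbPerHour = mbps * 3600 / 8 / 1000;       // about 8.6 GB per hour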

I could perhaps cut that to a tenth (I'm guessing) if I used DivX compression. DivX is how most video content shared illegally over the Web is compressed. But that would compromise the video quality, so what's the point?

No, the way I'd really like to go would be to use something like a D-VHS recorder ... but disk-based.

Using DVD-like, removable media.

That I could keep forever.

Right now, that's not possible. In a few months, though, high-definition DVDs will arrive — including recordable versions thereof.

There are actually some hi-def DVD formats here today, but the ones people are waiting for use a blue laser, not red. Red lasers can't focus their beam tightly enough to cram truly vast amounts of data on a single disc. Blue lasers can.

Trouble is, there are likely to be two competing blue-laser formats, HD DVD and Blu-Ray. Each has a roster of important companies in its stable, and neither can be played in the other format's players. Can you say "format war"?

There are some indications the HD DVD and Blu-Ray camps may hammer out a compromise to avoid a war. We'll probably know very soon.

If some version of prerecorded blue-laser HD disc captures consumers' hearts in the next year or two, then recordable versions thereof ought to serve nicely as hi-def archival media. One can at least hope a FireWire-capable blue-laser recorder/player, suitably authorized for the purpose of archiving HDTV, will come to market not long after the blue-laser format is (formats are?) introduced.

I'll keep my fingers crossed.

Saturday, June 04, 2005

Aspect Ratios & Film Transfers

[Images: a 16:9 frame and a 4:3 frame]
HDTV uses a "widescreen" 16:9 aspect ratio: a screen 9 units tall has 16 units of width. Traditional TV programming is 4:3 — 4 units wide by 3 tall.

My two 16:9 hi-tech TV displays consequently can have problems displaying old-style 4:3 material without stretching the image horizontally to fill the screen's full width, making everybody look short and fat.

I say I "can have problems" because in many situations there is a quick remedy. I can use the Aspect button on the remote to select the "Normal" mode, rather than "Wide." Black "pillarbox" bars magically appear at either side of a properly proportioned 4:3 image.

Only problem is, that doesn't work quite the way I want it to when I'm using the Samsung's DVI (Digital Visual Interface) input from my Motorola cable box. In the DVI input's "Normal" mode, the TV not only squeezes the image horizontally, it squeezes it vertically as well. I end up with a very small, properly proportioned, 4:3 picture in the center of the screen.

What's going on is that — for DVI only, not for component-video input — the Samsung in "Normal" mode uses a 640-pixel by 480-pixel area of the whole 1280x720 screen, so as to do an "ultra-correct" one-to-one, pixel-by-pixel mapping of input pixels to output pixels.
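
To put rough numbers on what that one-to-one mapping costs, here's a quick back-of-the-envelope Python sketch. The only figures in it are the panel and input sizes just mentioned; the TV's own scaler (or the cable box) is what actually does this work.

    # Back-of-the-envelope arithmetic for a 4:3 picture on a 1,280 x 720 panel.

    panel_w, panel_h = 1280, 720

    # What the Samsung does on DVI in "Normal" mode: map a 640x480 input
    # one-to-one onto panel pixels, leaving a small centered window.
    window_w, window_h = 640, 480
    used = window_w * window_h / (panel_w * panel_h)
    print(f"One-to-one window: {window_w}x{window_h}, about {used:.0%} of the panel")

    # The alternative: use the full 720-pixel height and scale the width
    # to keep the 4:3 shape, with black pillars filling the rest.
    image_w = panel_h * 4 // 3            # 960 pixels wide
    bar_w = (panel_w - image_w) // 2      # 160-pixel black bar on each side
    print(f"Full-height 4:3 image: {image_w}x{panel_h}, with {bar_w}-pixel pillars")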

That may be "ultra-correct," but it's not smart. I'd prefer the entire height of the screen to be used, for 4:3 material. What to do?

One option is to switch to the Samsung's component-video input, using "Normal" mode, to view 4:3 images. But that's a pain.

A better solution is to twiddle the user settings of the Motorola DCT6412 cable box. If I set TV Type to 16:9 (the default setting) and DVI/YPbPr Output to 720p or 1080i rather than 480p or 480i, I can choose Off, 480i, 480p, or Stretch for 4:3 Override. It turns out that Off does exactly what I want: "pillarboxes" 4:3 material between black bars at either side.


That works only if the cable box can tell the material is 4:3, though. Today I tuned in the TNT HD feed (on cable channel 249) while it was playing the Sylvester Stallone action thriller Assassins. The screen's width was filled by fat-faced characters, as before. I tuned to the regular TNT channel (30) and found a properly proportioned 4:3 image of the same movie there — meaning that the 4:3 "pan and scan" version of the film was being digitally stretched by TNT for the HD feed. My gear couldn't do much about that (unless I tediously set it up to watch channel 249 over the component-video input, using "Normal" mode).

A bit later, Assassins ended and TNT began its telecast of Clint Eastwood's Blood Work. That one looked just fine, over DVI, via channel 249 ... and also just fine via channel 30! For this movie, TNT was showing a widescreen HD version on 249 and chopping its sides off to make a 4:3 image on 30. Over "Wide" mode DVI input on my Samsung, the latter showed up at a correct 4:3, while the former came in at the intended 16:9.

The moral: TNT ought to use the second method exclusively, never the first. It ought to chop a 16:9 image's sides for standard-def 4:3 feeds, not stretch 4:3 to make 16:9.


But that assumes TNT has access to 16:9 "transfers" of all its widescreen movies.

Every TV movie has to undergo a film-to-video transfer. The original celluloid is scanned, frame by frame, on a machine that is generally called a telecine. In modern telecines, an optical sensor follows a sharply focused beam of light and turns minute spots on the film into pixels, one pixel at a time. The output pixels are recorded on, say, a computer hard drive. From there, they can be manipulated in various ways and sent out to storage media such as magnetic tape.

The telecine operator, sometimes called a colorist, often has choices to make. One choice may be what digital video format to use for the output pixels. Commonly used today are "2K" and "4K" formats. 2K scans a typical 35mm film frame at 2048 pixels across by 1556 vertically. 4K does 4096 x 3112 scans.

See this web page for more on 2K and 4K. One poster there says,

It's generally taken that 4K is the resolution needed to fully capture all the detail down to the level of individual grains, on 35mm film. 2K has been used more often though, as it's half the resolution for 1/4 the data and storage space.

Notice that both 2K and 4K give better pixel resolution than (1920 x) 1080i, much less (1280 x) 720p. A 2K or 4K scan of a film has to be downconverted for HDTV ... and possibly again for 480i SDTV.
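
For a sense of scale, here are the raw pixel counts involved — a quick Python sketch using only the figures quoted above:

    # Pixel counts for the scan and broadcast formats just mentioned.
    formats = {
        "4K film scan":   (4096, 3112),
        "2K film scan":   (2048, 1556),
        "1080-line HDTV": (1920, 1080),
        "720-line HDTV":  (1280, 720),
    }

    for name, (w, h) in formats.items():
        print(f"{name:15s} {w} x {h} = {w * h / 1e6:5.2f} megapixels")

    # 4K doubles 2K in each direction, hence roughly four times the data --
    # which is the "1/4 the data" trade-off the poster quoted above is
    # describing, run in the other direction.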

2K and 4K telecines are relatively recent technology. Earlier telecines produced lower-resolution — often analog — video scans: HDTV-quality digital, only fairly recently; EDTV-quality digital, less recently; or even plain old 480i analog, in the good old days. Those older transfers or video "masters" abound in the vaults today. They were used not only for telecasts, but also for mastering DVDs, VHS tapes, and laserdiscs.


Another decision the colorist makes is which aspect ratio to use: 4:3 or 16:9.

For 4:3 transfers, when the source film is widescreen, there has to be an accommodation:

"Pan and scan" 4:3 transfers zoom in on the part of the scene where the action is, then hop to a different part of the image for the next scene. Sometimes there are artificial pans from one talking head to another in the same scene, ones the director of the movie wouldn't approve. Yuck!

"Letterbox" 4:3 transfers put black bars above and below the entire widescreen image, with no cropping or zooming in. Some people say yuck! to that, as much of the screen isn't used.

"Anamorphic" 4:3 transfers compress the widescreen image horizontally, with the understanding that playback gear can stretch it back out for display on a 16:9 screen. In some cases, there may still be minimal black bars at top and bottom.

Most aficionados think anamorphic is the way to go. It puts the greatest possible amount of the original image into the greatest possible portion of the 16:9 TV screen. For a 4:3 screen, the playback hardware simply simulates letterboxing.
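
Here's a tiny sketch of the arithmetic behind the anamorphic trick. I'm assuming the familiar 720 x 480 NTSC DVD frame as the container; the transfers sitting in TV vaults may use other formats, but the principle is the same.

    # What "anamorphic" amounts to, using the standard NTSC DVD frame
    # (720 x 480 pixels, a nominally 4:3 container) as the assumed example.

    frame_ar = 4 / 3               # shape of the container
    image_ar = 16 / 9              # shape of the widescreen picture stored in it

    squeeze = image_ar / frame_ar  # how much narrower everything is on disc
    print(f"Horizontal squeeze factor: {squeeze:.2f}")   # about 1.33

    # On playback to a 16:9 screen, the stored columns are stretched back out
    # across the full width, so no horizontal picture detail was thrown away --
    # unlike letterboxing, which spends scan lines on black bars instead.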

Then there are the true 16:9 transfers. If the source movie also happens to use the 16:9 (aka 1.78:1) aspect ratio, then there's no problem. But few movies do. Most widescreen movies use either 1.85:1 or 2.35:1. Then there are basically two options:

"Matted" or "cropped" 16:9 transfers just chop off enough of the two sides of the widescreen (say, 2:35:1) film to make the picture fill an entire 16:9 aspect ratio frame from top to bottom. This is roughly equivalent to "pan and scan" for 4:3 transfers. It destroys the original framing of the scene, a.k.a. the OAR, which stands for "original aspect ratio."

(Actually, I made the names "matted" and "cropped" up. For all I know, the industry calls this sort of transfer "full screen," or something like that. It might also be well described as "zoomed," since the Zoom mode of an HDTV produces a similar effect.)

"OAR" 16:9 transfers, like those for 4:3, put black bars above and below the image, allowing the entire image to be seen, exactly as framed by the filmmaker, in OAR. Naturally, if the movie was originally filmed with a 16:9 or 1.78:1 frame, there are no black bars in an OAR transfer.

(There is no such thing, by the way, as an anamorphically squeezed 16:9 transfer. Anamorphic squeezing is done only to shoehorn 16:9 material into a nominally 4:3 frame.)
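
To put rough numbers on the matted/cropped approach described above, here's a quick sketch. The exact loss depends on the film's aspect ratio, and a real transfer may pan within the frame rather than always taking the dead center.

    # How much picture a matte/crop/zoom transfer gives up when the source
    # is 2.35:1 and the target frame is 16:9.

    source_ar = 2.35
    target_ar = 16 / 9             # about 1.78

    kept = target_ar / source_ar   # fraction of the original width that fits
    print(f"Width kept: {kept:.0%}; chopped off at the sides: {1 - kept:.0%}")
    # -> roughly 76% kept, 24% of the original composition lost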

TNT's copy of the originally 1.85:1 Assassins is, clearly, 4:3 pan and scan. There are boatloads of pan and scan and letterbox 4:3 transfers in the vaults, from the days before DVD players and other digital devices could unsqueeze anamorphic transfers.

TNT's copy of the originally 2.35:1 Blood Work (since it fills a 16:9 screen side-to-side and top-to-bottom) looks like a matte/crop/zoom transfer. Given the choice, I'll take it over stretching a 4:3 pan and scan transfer to fill a 16:9 frame. But, best of all in my book, give me a true OAR 16:9 presentation, even if it leaves the top and bottom of the screen black.

Friday, June 03, 2005

Pulling the Analog Plug

An article titled "The end of analog TV" recently appeared on the MSNBC website. "Depending on the outcome of discussions in Congress," it starts off, "television as we know it may end at exactly midnight Dec. 31, 2006."

The article is talking about the date the plug may be pulled on the current television standard, which is analog, in favor of all-digital TV.

HDTV is digital TV and can't be carried in analog form. You can't cram enough pixels into the channel space unless you use digital compression techniques such as MPEG-2.

So Congress and the Federal Communications Commission told broadcasters a decade ago that they could use new, previously unallocated channels to send out digital signals over the air — and that at a date uncertain, far off in the future, they'd have to shut down their analog broadcasts and return those channels to Uncle Sam. When a certain fairly hefty percentage of households were digital-ready — but not before the end of 2006 — the old analog channels would go dark.

We can now begin to descry Dec. 31, 2006, on the far horizon ... and it looks as if the requisite 85 percent of homes will not have digital TVs.

"Over 1400 broadcasters now transmit in digital as well as analog," the article says, "reaching 99 percent of the U.S. television market. So the problem is not the readiness of broadcasters. No, despite the recent upsurge in sales of plasma flat panels and other wow-factor digital TVs, not enough of them have invaded enough living rooms. The article says that "at present there are only about 30 million televisions with digital tuners in American homes, out of a total of several hundred million installed sets."


Of course, neither of my hi-tech screens has an onboard digital tuner.

In addition to my Samsung 61" DLP rear-projector, which has true 1,280 x 720p resolution, I have a 32" Hitachi plasma monitor. The Hitachi uses 1,024 pixels across the 16:9 screen and 852 vertically. Its screen pixels are wider-than-square, a common situation with plasma sets. When it receives a 720p or 1080i signal with square pixels, its internal digital circuitry maps them to the less-horizontally-dense, non-square screen pixels.
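
For the curious, the arithmetic of that remapping looks roughly like this — a Python sketch using the panel numbers above; the set's real scaler is, of course, far more sophisticated than simple ratios:

    # Rough arithmetic on the Hitachi's non-square pixels, using the panel
    # figures above (1,024 x 852 on a 16:9 screen).

    screen_ar = 16 / 9
    panel_w, panel_h = 1024, 852

    # How much wider than tall each screen pixel must be.
    pixel_ar = screen_ar / (panel_w / panel_h)
    print(f"Each panel pixel is about {pixel_ar:.2f}x wider than it is tall")

    # Remapping a square-pixel 720p (1280 x 720) signal onto that grid:
    print(f"Horizontal resample factor: {1024 / 1280:.2f}")   # 0.80 -- downscaled
    print(f"Vertical resample factor:   {852 / 720:.2f}")     # 1.18 -- upscaled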

At any rate, I can't decode over-the-air (OTA) digital transmissions on either digital TV ... but so what? I have digital-cable-cum-HDTV, and I also have DirecTV satellite service, which is likewise digital (but not hi-def, since I have yet to invest in the necessary technology upgrade).

So, who needs an OTA digi-tuner? I sure don't — but see below for who else might.


The MSNBC article, under the assumption that the New Year's Eve '06 85-percent deadline is going to be missed, goes on:

Congress needs to do something ... . For starters, there’s the remarkable fact that Americans are still buying over 20 million analog sets each year, all of which could be obsolete rather quickly. If Detroit was selling cars that used a type of gasoline that would soon no longer be available, consumers would expect to be informed. Thus analog sets clearly need some kind of warning label ... .

Those who are gung ho for digital television want Congress to now make 12/31/06 a firm "date certain" upon which the analog plug will definitely be pulled.

If that happened, folks who didn't buy a TV with an onboard digital tuner — and weren't interested in relying on cable or satellite — would presumably have to spend perhaps $100 on a "converter box." But, says the article,

85 percent of Americans now get all their television from cable or satellite providers, so for the most part the change-over won’t affect them.

Admittedly, some of those households have second or third TV sets that are OTA-dependent and would need converter boxes. But the real problem is

... the 15 million or so U.S. households whose only television service comes over the air. For these people, predominately lower-income and disproportionately black and Hispanic, the cut-off will be bad news indeed.

The citizens who can least afford to plunk down a C-note for a converter are the ones that would be left high and dry.


Uncle Sam is making noises about possibly subsidizing low-income households' converters to the tune of a billion (or several billion) dollars; if the transition to digital is delayed, the anticipated cost would shrink, since converter box prices should drop over the next several years. This is not empty largesse, since the current analog channels, once they revert to Uncle Sam, will be auctioned off for big money. Anything which hastens that influx of dollars will help offset the huge deficits expected over the next umpty-ump years.

(Actually, I gather the scenario is a little more complex than that. The digital broadcast streams now occupying UHF TV channels 52-69 would, I believe, move to the channels currently in use for analog broadcasting. Then the channels in the 52-69 range would be sold off at auction by the federal government.)

Also urging a quick transition to all-digital TV are the consumer electronics manufacturers, who expect to reap a bonanza. Too, Intel and other computer-industry giants want analog TV to go bye-bye so they can use the freed up channel space for high-speed wireless Internet services. So there are a lot of heavy hitters behind the transition.

So, the article says,

Rep. Joe Barton [R-TX], chairman of the House Energy and Commerce committee, ... is expected to introduce a cut-off bill sometime in the next few months.

Maybe, for all I know, he's already done so in the time since the MSNBC article was first published. Barton's ally in the Senate is John McCain (R-AZ), who also wants a "date certain" to be set sooner rather than later. But it's not clear that either legislator is wedded to 12/31/06 specifically as the cut-off date. Meanwhile, other movers and shakers seem to prefer "a more gradual, region-by-region approach that might even extend to the end of this decade."

Analyzing this, I'd say there are two major impediments to a rapid analog cut-off. One, the effects on low-income households, particularly if (as the Bush administration wants) there are no subsidies. Two, the effects on Mr. and Mrs. Average Homeowner, who will (however unjustifiably) feel put upon, dazed and confused, and angry at "those politicians in Washington."

My feeling is that settling this politically will take up all the time that exists between now and 12/31/06, and then some. But at some point down the road, the analog plug will be pulled. It may happen all at once, or it may happen region-by-region or city-by-city, but it will happen ... and lots of people will be panic-stricken. Ain't that just like a human: to pay no attention at all to that locomotive that's heading down the track for you, until it squashes you and then you complain at the top of your voice.

As for the subsidies for the poorer among us, I'm all for them. Since the tab Uncle Sam picks up will shrink the further off the cut-off date is set, I suspect this is another reason to believe in a cut-off date in the 2008 or 2009 range rather than in 2006 or 2007.

***

P.S. There is a follow-up to the MSNBC article here, in which the writer points out that non-HD digital sets, much less pricey than the HD ones, are coming soon:

Last week I had a chance to see one of the first non-HD digital televisions at a home entertainment show in New York. RCA showed its 27-inch 27V514T set displaying a digital over-the-air broadcast next to a set receiving the same channel in analog form, through the same antenna. The digital image was significantly better ... .

As a matter of fact, the original article states that by 2007, all new TV sets sold in the U.S. must have digital tuners. This is "under current law." Or is it actually an FCC rule which mandates it? Never mind; the point is that I doubt Congress will pull the plug on analog broadcasts until that law (or rule) has had a chance to whittle down the number of tunerless households.

Again, it looks to me as if 2008 or 2009 is a more likely time frame for the demise of the last over-the-air TV "ghost." (Digital transmissions aren't burdened with those faint double images, once so familiar, which plagued analog broadcasts as they bounced off buildings and trees en route to the receiving antenna.)

Hi-Def Birds?

[Image: Orioles team logo]
So far I've managed to catch just one Baltimore Orioles baseball game in high definition since I've had my new cable box/DVR. It was last Friday's game against the Tigers, in which the O's clawed their way back from a 4-0 deficit to go into the bottom of the ninth a single tally behind, at 4-3. They loaded the bases with two out when one of their youngest hitters, Jeff Fiorentino, struck out to end the game.

The game was broadcast live from the Birds' home venue in Baltimore, Oriole Park at Camden Yards, on Comcast SportsNet Mid-Atlantic, HD channel 200 on my Comcast box.

The picture was absolutely stunning on my 720p 61" Samsung DLP. Did it start out as 1080i, or 720p? Who knows? For sports fans, though, this kind of picture (and sound) quality is reason enough to go hi-def.

It looks to me as if, sadly, none of the televised O's games that don't originate from Camden Yards — i.e., none of the away games — are in HD. But it's real hard to find out for sure.

For example, take a look at the team's broadcast schedule here. Now, compare that listing with this one. Notice the listings for the games against Houston and Colorado on June 13 through 17. On the second list, which is from the Comcast SportsNet website, one sees "CSN/CSNHD," implying these games will be shown in HD on channel 200. But in the first list, which is from the Orioles' official website, it just says "CSN." No mention of CSNHD.

It gets hairier. Some of the games on the official O's website schedule are shown as "WJZ / PAX 66," and yet others as "WB54 / PAX 66." Of the three channels mentioned, as far as I know only WJZ has a hi-def twin in my cable lineup: WJZ-DT, channel 212. WB54 is WNUV-54, channel 14 — non-hi def, and not even digital. PAX 66 is channel 20 on cable, also non-digital lo-def.

So possibly, just possibly, home games on WJZ will show up on WJZ-DT in HD. One can always hope. (Notice also that the CSN listing is entirely silent about these other broadcast outlets.)


My cable box's onboard program guide does in fact tell me about Birds' games appearing on supposedly-HD channel 200 — except that it forgets to distinguish between those carried only in standard definition and those actually carried in true high definition.

Plus, channel 200 is presently omitted from the guide's dedicated list of supposedly HDTV-only channels, with the lo-def CSN sister channel, channel 7, appearing in its stead. If I select from the list a game that I know damn well is going to be shown in HD on 200 and tell the DVR to record it, the channel-7 lo-def version is what gets recorded — which is exactly what you'd expect, since it says channel 7 right there on the screen. Comcast simply needs to make sure it's channel 200 that shows up in the HDTV channel list, not channel 7.

Notice also that the O's official broadcast schedule makes no mention of games on ESPN or ESPN2. Fox's Games of the Week are listed, though — but there's no indication whether any of them are going to be in hi-def. (I do receive a hi-def twin of my local Fox station: channel 213, WBFF-DT, so, again, I can hope.)

ESPN and ESPN2 present their upcoming HD offerings here. Only the next month or so gets listed, and I find no Baltimore Orioles games shown at the present time (the list runs from June 3 through July 4th).

Joy of joys, I do get ESPN-HD on channel 202. I'm guessing that it carries whatever HD offerings happen to be on either ESPN or ESPN2, but I'm not sure about that. Maybe it's just duplicating ESPN, not ESPN2. Maybe there's a separate ESPN2-HD which my cable system doesn't carry.

Confusin', ain't it?


Of course, the problem is basically that so few people's homes are HD-ready yet. To be HD-ready, they'll have to be at least digital-ready. I'm not clear on how digital-ready they'll need to be before the powers-that-be in Washington deem it time to pull the plug on analog TV. More on that in my next post.

Thursday, June 02, 2005

Motorola DCT6412 HD DVR Cable Box

[Image: Motorola DCT6412 HD DVR cable box]
My new hi-def cable box-cum-DVR from Comcast is a Motorola DCT6412. It has a 120 GB hard disk drive capable of holding some 15 hours of hi-def material or 60 hours of lo-def. It uses a fairly nice "TV Guide" program locator — though TiVo's much-vaunted interface has a more elaborate capability for setting up "wishlists" and such. (I understand Comcast will be switching to TiVo in 2006.)

A pair of internal tuners let me watch/record two channels at once. And I've just discovered that the DVI output actually works!

Digital Visual Interface is a digital alternative to a wideband component video (YPbPr) connection, which is analog. Using YPbPr, the cable box has to convert the incoming digital channel into three analog components. After receiving those three components, my Samsung 61" DLP TV has to re-convert them to digital. But with DVI — transmitted via a pricey special cable — the intermediate digital-to-analog and analog-to-digital conversions are bypassed. With an all-digital signal path from source to screen, the result seems to be a slightly cleaner picture.

I must admit I'm a bit surprised Comcast's box supports DVI. Although DVI is digitally copy-protected, there's always the possibility someone will crack the protection and allow perfect digital bitstreams of the latest and greatest movies to BitTorrent their way around the Internet.

And wouldn't it be nice to be able to record those hi-def digi-goodies on, say, D-VHS tape? Not possible, alas, with DVI.

Still, the picture across the DVI interface is out of this world.

HBO's Empire Falls

The new HBO miniseries Empire Falls, with a stellar cast headed by Ed Harris, Paul Newman, Joanne Woodward, and Helen Hunt, is the first made-for-TV movie I've watched on my new HD cable box-cum-DVR. It's a two-parter (2 hrs./1:30) based on a quirky novel by Richard Russo, who also did the equally quirky screenplay.

As a quick Google around the web tells me, many armchair critics have found it boring, yet the IMDb listing rates it at a high 7.8 out of 10, based on (currently) 207 votes. I found the acting so good, the relationships revealed by the plot so subtle and intricate, and the visual quality so excellent that it decidedly held my interest, even though I don't usually go for such "serious" fare.

[Image: Paul Newman in HBO's Empire Falls]
Besides, how often do you see Paul Newman look this disreputable? The crumbs in his beard show up nicely in hi-def, and you can almost tell how bad his character smells.

This production was apparently shot on film, like a movie, but designed especially for transfer to 1080i HDTV. It exactly fills a 16:9 screen with luscious color in most segments, while flashback sequences are intentionally washed out to a pastel look. There is little if any film grain apparent on my 61" 720p Samsung. Nor do any digital or compression artifacts mar the view. Certain images feature post-production special effects that are very "TV-like," such as the red X's that appear over an old photograph, crossing out Indians who succumbed to "various European diseases" during the town of Empire Falls's colonial era.

All in all, I would say Empire Falls is quite representative of the state of the art in made-for-HDTV film work — but what do I know, it's just about the first such effort I've viewed. It deserves a look by all who don't consider nuanced psychological relationships a yawn. HBO-HD will be repeating it often over the near-term time frame.

Monday, May 30, 2005

The Definition of Definition?

What is the definition of "definition," as in "high-definition"? Is it simply how-many-pixels by how-many-pixels the transmitted signal provides, or is there more to it than that?

A 1080i signal has exactly 1,920 pixels across the screen horizontally, while there are exactly 1,080 pixels up and down the screen vertically. Each pixel of the frame is refreshed (updated) once every 1/30 second — though, actually, half (those on the odd-numbered scan lines) are refreshed in the first 1/60 second, for one field, and the other half (on the even-numbered lines) are done to make the second 1/60-second field. If every pixel is (spatially or temporally) distinct from every adjacent pixel, you get the maximal resolution or definition.

The same theory applies to 720p, except the pixel grid is 1,280 pixels across by 720 pixels up and down, while every pixel is refreshed once every 1/60 second. Notice that in both 1080i and 720p the pixels are square. The pixels used for encoding digital video on DVD are oblong.
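
A quick bit of arithmetic — a Python sketch, nothing more — shows how the two formats' raw pixel budgets compare:

    # Raw pixel arithmetic for the two HD broadcast formats (before any
    # filtering, subsampling, or compression).

    def describe(name, width, height, frames_per_sec):
        per_frame = width * height
        per_sec = per_frame * frames_per_sec
        print(f"{name}: {per_frame / 1e6:.2f} Mpixels per frame, "
              f"{per_sec / 1e6:.1f} Mpixels per second")

    describe("1080i", 1920, 1080, 30)   # 30 full frames/sec, delivered as 60 fields
    describe("720p ", 1280,  720, 60)   # 60 full frames/sec

    # 1080i packs more pixels into each frame; 720p refreshes its smaller
    # frame twice as often.  The per-second totals come out surprisingly close.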

So a plethora of pixels that are spatially or temporally distinct is the essence of high definition. Still, for several reasons, there's more to it than that. There are several things that stand in the way of getting maximal theoretical definition ... and some of them can even be considered good.


For example, a 1080i signal is vertically filtered. About 30-40 percent of the potential vertical resolution is filtered away in order to avoid interlace artifacts such as details flickering on and off as they rise slowly in the picture.

This happens because the two interlaced "halves" of the picture are offset slightly in time. It is even possible for tiny details that are moving upward (or downward) at just the right rate to be completely missed by the interlaced scanning of the image, if they happen to always fall on one of the "missing" scan lines in each 540-line field. But if the rate of ascent or descent is slightly different, these small details will blink.

Or, a completely stationary detail that is present in one of the two fields but not in the other will also blink, in what is referred to as "twitter."

The vertical filtering of 1080i to eliminate the blinking and twitter takes the potential 1,080 "lines of vertical resolution" down to 756 effective lines. It's a good tradeoff: slightly less detail for a calmer picture.

720p is not filtered, since it is not interlaced. Notice that the 756 effective lines of 1080i are not that many more than the 720 actual lines of 720p. For more on this, see DVE Frequently Asked Questions, a discussion by TV guru Joe Kane of his then-upcoming Digital Video Essentials test DVD and D-VHS tape.

Joe Kane writes, in Interlaced Video Go Away, about how the resolution of 1080i video is a lot less than you might think:

Most films transferred to video come in at 800 to 1100 pixels [in each horizontal pixel row of the image] and video material will often be in the order of 1300 to 1400 lines [of horizontal resolution, not the nominal 1920 lines of 1080i]. The clear winner in picture quality is 720p over 1080i. The reason is interlaced artifacts and the vertical filtering required to get from progressive to interlaced. The real vertical resolution of 1080i images in motion is somewhere around 640 lines [due to the filtering]. The true horizontal resolution capability of the broadcast 1080i signal is 1440 pixels or less. The limitations are MPEG encoding and the bandwidth of a TV channel. There is little hope of that getting better any time soon. Even at 1440 x 1080i the MPEG artifacts and lack of vertical resolution in a moving image are far worse than at 720p.

Decoding that: in 1920x1080i video you get just 640 lines of vertical resolution(!), not 1,080, owing to image filtering that has to be done in order to head off the possibility of "interlace artifacts" on your TV screen. These artifacts, if allowed to show up on the screen, would lead to an unwelcome, visible structure of pixel rows/scan lines surrounding moving objects in the picture. They would also cause images to flicker and shimmer when, for example, there is a camera pan taking place.

Furthermore, MPEG encoding, done to compress the digital video signal by drastically reducing the number of bits per second in it, needs to have a lot fewer distinct pixels per pixel row than the nominal 1,920 pixels per row of 1080i: "1440 pixels or less." Otherwise, it is hard to get the desired compression ratios between the number of bits per second going into the encoder and the number of bps coming out. The only other way to get the desired compression would be to put up with irritating "macroblocking" artifacts. Most people prefer a slightly softer image.

Film-based material is notoriously harder to compress than video-based material, so for it, "1440 pixels or less" per pixel row has to be further reduced, to "800 to 1100 pixels." But 720p video, because it is progressive, not interlaced, does not have to be filtered in the way 1080i does. Accordingly, its 720 rows times 1,280 pixels per row arrives intact on the HDTV screen.


Also good, in a sense, is the "chroma subsampling" which reduces the number of bits in a 1080i or 720p signal.

Each pixel actually starts out as three color values, one red, one green, one blue. These R, G, and B numerical values are combined algebraically according to a certain formula to derive Y, the luminance or luma signal. Y represents a black-and-white or monochrome picture. (Actually, all these values are "gamma-corrected" to stretch the contrast ratios at one end of the brightness range and compress them at the other, but I'll ignore that.)

Once Y is obtained for a pixel, the two values (B - Y), or Pb, and (R - Y), or Pr, are derived. Pb and Pr are the two chrominance or chroma signals. Together, Y, Pb, and Pr are the three separate signals of "component video."

Pb and Pr in effect "color in" the Y monochrome signal with blue and red, respectively. Algebraic manipulation of Y, Pb, and Pr can derive, in effect, the (G - Y) color difference signal which allows green to be "colored in" as well.

But whereas Y is transmitted at full resolution, Pb and Pr are downrezzed somewhat by means of 4:2:2 chroma subsampling. The "4:2:2" notation means essentially that each pair of horizontally adjacent Pb pixels — and, separately, each Pr pair — are blended into one double-width pixel, thus cutting the number of bits needed to represent Pb and Pr in half.

This also serves to halve the horizontal resolution of the two chroma signals. (The vertical resolution is left unchanged, as are the vertical and horizontal resolution of the luminance signal, Y.) Yet, thanks to the fact that the acuity of human vision is lower for color than for monochrome information, the reduction in color resolution is unnoticeable at normal viewing distances.
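
Here's a toy sketch of the idea, in Python, with made-up sample values. Real hardware uses proper decimation filters rather than crude averaging, but the bookkeeping is the same.

    # Toy illustration of 4:2:2 chroma subsampling on a single row of samples.

    y  = [16, 45, 110, 200, 235, 180, 90, 30]      # luma: kept at full resolution
    pb = [120, 124, 130, 128, 90, 94, 140, 138]    # one of the two chroma signals

    # Each pair of horizontally adjacent chroma samples becomes one value.
    pb_sub = [(pb[i] + pb[i + 1]) / 2 for i in range(0, len(pb), 2)]

    print(f"Luma samples kept:   {len(y)} of {len(y)}")
    print(f"Chroma samples kept: {len(pb_sub)} of {len(pb)}")

    # With Pb and Pr each halved and Y untouched, the three streams together
    # shrink to two-thirds of their original size.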


Saving bits is important. It is the whole rationale of digital video compression. According to the online article High-Definition Television Overview, each broadcast HDTV channel has to be shoehorned into an existing analog channel 6 MHz wide — the channel's "bandwidth." This can be done only if the digital data rate is limited to roughly 20 (actually, 19.2) megabits of information per second.

But HDTV can generate about 120 megabytes per second, uncompressed. (See Charles Poynton, Digital Video and HDTV Algorithms and Interfaces, p. 117.) That's 48 times what's allowed.

Chroma subsampling cuts two of the three YPbPr streams, Pb and Pr, in half, which by my calculation cuts the 120 MB/s down to 80 MB/s. That data rate is not small enough.

Eschewing progressive scan and using interlaced scanning, à la 1080i, cuts that in half: 40 MB/s, or 320 Mbits/s. Still not small enough. True digital compression is needed. Enter the MPEG suite of digital video compression techniques. The standard used for HDTV is MPEG-2; specifically, "MPEG-2 Main Profile at High Level." (DVDs are encoded at much lower data rates using "MPEG-2 Main Profile at Main Level.")
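
Putting the whole chain of reductions into one little Python sketch, using the round figures quoted above:

    # The bit-budget arithmetic of the last few paragraphs, spelled out.

    MB = 1e6                     # one megabyte, in bytes
    rate = 120 * MB              # uncompressed HD video, roughly (per Poynton)

    rate *= 2 / 3                # 4:2:2 chroma subsampling: Pb and Pr halved -> 80 MB/s
    rate /= 2                    # interlaced (1080i) scanning: half the data -> 40 MB/s

    channel = 19.2e6 / 8         # the ~19.2 Mbit/s broadcast channel, in bytes/sec

    print(f"After subsampling and interlacing: {rate / MB:.0f} MB/s")
    print(f"MPEG-2 still has to squeeze roughly {rate / channel:.0f}:1")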

MPEG-2 compression, whatever its profile and level, first removes redundant information that the decoder can restore on its own. But that still isn't enough, so it uses an algorithm to strip out more information. This information cannot be restored by the decoder — the compression is technically "lossy" — but the algorithm is designed to remove only information whose loss is undetectable to the human eye.


That holds true as long as the MPEG compression ratio, which is adjustable, is not too high. But what constitutes "too high" depends on the scene. Busy scenes with fast motion cannot stand as much compression as static scenes with little fine detail.

DVD compressionists adjust the compression ratio scene by scene, but HDTV is broadcast in real time. Usually, there has to be a single compression ratio chosen to accommodate the busiest, most dynamic scenes. If too much compression is done, some scenes can lose visual detail, especially when full of motion.

Here's a case where how-many-pixels by how-many-pixels doesn't really tell you what the definition is. But keep in mind that overly enthusiastic digital compression produces eye-disturbing artifacts above and beyond reducing apparent resolution, so rarely do you hear too much lossy compression blamed for poor picture definition per se.

Too much lossy compression is more likely with 1080i, less likely with 720p. Although 720p refreshes each pixel twice for every one time a 1080i pixel gets updated, the spatial resolution within the 720p frame is so much lower that it's easier to shoehorn 720p into a 6 MHz channel. So 720p needs less compression than 1080i.


And now we come to the vexed question of horizontal resolution. In theory, 1080i can support 1,920 "lines" of it, 720p just 1,280 (since each "line" is really a column of pixels whose width is that of a single pixel).

But Joe Kane says in D-Theater - Questions and Answers that there are several caveats. Due to "many places in the production and distribution chain where image resolution can be lost" — i.e., due to signal-processing compromises — the cruel fact is "that the broadcast limitation of horizontal resolution for the 1080i system is about 1400 lines." Meanwhile, 720p's horizontal resolution remains as advertised: 1,280 lines.

Also, says Kane, "film content ... usually doesn’t get much above the 1300 line mark in horizontal resolution." Or, again, "Horizontal resolution of most film masters in 1080p is in the area of 800 to 1300 lines."

(1080p? That's like 1080i except that it's intended for uses such as film-to-video mastering which can benefit from higher data rates than broadcast HDTV allows. So its 1,920 x 1,080-pixel frames can use progressive rather than interlaced scanning, it needs no vertical filtering, and it can use other frame rates than 30 frames per second.)

The important thing to notice here is that two things can reduce the actual horizontal resolution below its theoretical maximum. One is signal processing, especially with 1080i; the other is limited resolution in the source material (for instance, a movie).

Both of these things can, of course, also reduce vertical resolution. In fact, the vertical filtering which eliminates 1080i interlace artifacts is a type of signal processing that limits vertical resolution.

More problematic is what happens when video starts out at standard definition. Say it begins life at 480i or 480p, the scan rates associated, respectively, with standard-def TV and with DVDs, when progressively scanned. The former is interlaced; the latter is, unsurprisingly, progressive. 480i/p SDTV can be scaled or upconverted to, say, 1080i for HDTV broadcast. But the amount of detail in the picture — both horizontally and vertically — stays the same.

So an HDTV channel that broadcasts upconverted SD material doesn't look much better (if any) than an SDTV channel broadcasting the same SD material. In fact, the "pseudo-HD" broadcast might have even less detail, if some of it was lost in the signal processing for the upconversion.
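
Here's a toy Python sketch — made-up numbers, simple linear interpolation — of why scaling up can't manufacture detail:

    # Toy illustration: upconversion adds pixels, not detail.  Stretch a short
    # "scan line" of six SD samples out to thirteen; every new value is just
    # a blend of values that were already there.

    def upscale(samples, new_len):
        out = []
        for i in range(new_len):
            pos = i * (len(samples) - 1) / (new_len - 1)
            lo = int(pos)
            hi = min(lo + 1, len(samples) - 1)
            frac = pos - lo
            out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
        return out

    sd_line = [10, 10, 200, 200, 10, 10]     # a crude bright "edge" in an SD picture
    hd_line = upscale(sd_line, 13)

    print([round(v) for v in hd_line])
    # More samples, but the only real transition is still the one that was
    # already there -- which is why upconverted SD never looks like true HD.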

One wrinkle on pseudo-HD is what I just encountered on the ESPN-HD channel, on the Memorial Day broadcast of the National Lacrosse Championship (Johns Hopkins 9, Duke 8). Quite on purpose, owing to the fact that they were using SD cameras and transmission equipment, they took a standard-def picture with the squarish 4:3 aspect ratio and put it between two hi-def "pillarboxes" on the 16:9 screen. Though the actual picture was clean (because it was digital) it didn't have that crisp HD feel to it.

This same kind of pseudo-HD thing can reportedly happen inadvertently in a signal transmission chain that isn't set up right. Suppose a TV network sends a member station both a 1080i and a 480i version of a program. The station is supposed to send the former out over its digital channel, the latter over its traditional analog channel. But what if there's a screwup, and the 480i feed gets upconverted for the 1080i broadcast, while the 1080i feed is ignored? Result: a nominally 1080i broadcast with just 480i-like resolution.


There is a third thing which can harm horizontal resolution. It is actually itself a kind of signal processing: intentional downresolution or downconversion. "Downrezzing," it's familiarly called. For example, the May/June issue of The Perfect Vision magazine cites the TiVo Community Forum web site to the effect that DirecTV is reducing the resolution of its 1080i channels from 1,920 x 1,080 to 1,280 x 1,080. (See "Has DirecTV Downrez'd HDTV?," pp. 14-15).

The reporter could not get DirecTV to confirm this policy. If true, it is doubtless being done in order not to overload the data transmission capacity of its satellite transponders, while still offering the same number of HD channels.
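
The arithmetic, for what it's worth — a quick sketch based on the figures reported above:

    # Rough arithmetic on the reported DirecTV downrez.
    full    = 1920 * 1080    # nominal 1080i frame
    downrez = 1280 * 1080    # what is reportedly being transmitted instead

    print(f"Pixels per frame kept: {downrez / full:.0%}")
    # Two-thirds as many pixels to encode, so -- very roughly -- each channel
    # needs correspondingly fewer bits, or more channels fit on a transponder.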


So there are several things which make the effective, as opposed to theoretical, definition in a digital HDTV picture hard to pin down. As we have seen, filtering, chroma subsampling, MPEG compression, limited resolution in source material, losses in digital signal processing, and downrezzing are among the most important.

We can compare either of the HDTV formats — 1080i or 720p — to a straw, and the picture content to a milkshake. The more detail exists in the content, the "thicker" the shake. The thicker the shake, the "fatter" the straw ought to be — i.e., the higher the definition of the format should be.

All the same, just because you have a super-fat straw doesn't guarantee that your milkshake isn't thin and soupy. Just because you are receiving a 1080i or 720p signal doesn't mean the content isn't essentially 480i.