TCM HD arrives!
That "original aspect ratio" thing is big, by the way. OAR applies to movies shot in wider than the 4:3 Academy ratio of Hollywood's Golden Age. TV screens used to be 4:3 too, so Golden Age movies fit them perfectly. But OAR rendering of widescreen flicks — as opposed to "panning and scanning" to fill the entire 4:3 TV screen with selected parts of the original film frame — meant putting letterboxing bars at the top and bottom of the screen. Many who were not celluloid cognoscenti hated the bars. The cognoscenti loved them.
Along came today's 16:9 widescreen behemoths, and OAR still meant putting letterboxing or "matte" bars at screen top and bottom, for the oodles of movies whose frame dimensions are notably wider than 1.7777...:1 (which is 16:9 reduced to a decimal value). CinemaScope from the 1950s, in particular, was 2.35:1!
If you want to see all of Around the World in 80 Days (1956) or 20,000 Leagues Under the Sea (1954), among countless other memorable spectaculars of the era, you have to see them in OAR.
If you want to see great movies shown to their best advantage on TV, you also need to see them in HD. People who have Blu-ray players know that. Still, it may be years before Around the World in 80 Days and 20,000 Leagues Under the Sea show up on Blu-ray. Having TCM come into our homes in 1080i gives us hope that one day soon, even before said titles come to Blu-ray, such fare may be visible in a format putting over 2,000,000 pixels on the TV screen.
But there remain obstacles. Whether TCM shows 20,000 Leagues or ATW80 in glorious HD on its HD channel, or simply upconverts it from a standard-def version, is a huge question.
It all has to do with how TCM's vault copy of the title has been transferred from film to video.
Today, film-to-video transfer is usually done using film scanners: devices into which you feed a reel of celluloid, wait a good long while as the scanner records every tiny detail of every film frame, and out comes a digital copy in a high-definition-plus video format such as 2K or even 4K.
2K: that's "one K" — where "K" refers to the power-of-two number 1,024, not 1,000 — multiplied by two, yielding 2,048, which is the number of pixels per row, or video scan line, in the digitally scanned 2K output. So each frame of 2K video has 2,048 pixels in every row. That makes its resolution better than that of 1080i/1080p HD, whose video frames have only 1,920 pixels per row.
4K scans double the 2K per-row number, to 4,096 pixels per row.
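The per-row arithmetic is simple enough to sanity-check in a few lines. This little sketch just restates the numbers above, taking K = 1,024 as the text does:

```python
# Per-row pixel counts for the scan formats discussed above,
# using K = 1,024 as the text does. Purely illustrative arithmetic.
K = 1024

widths = {
    "2K":    2 * K,   # 2,048 pixels per row
    "4K":    4 * K,   # 4,096 pixels per row
    "1080p": 1920,    # 1080-line HD has fewer pixels per row than 2K
}

print(widths)
# 2K beats 1080-line HD horizontally, and 4K doubles 2K:
assert widths["2K"] > widths["1080p"]
assert widths["4K"] == 2 * widths["2K"]
```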
2K and 4K scans have varying numbers of pixel rows. They generally have 1,152 and 2,304 pixel rows, respectively, for images using the 16:9 aspect ratio — see the Wikipedia article List of common resolutions. Those numbers are 9/16 the number of pixels per row.
CinemaScope and other super-wide film formats have aspect ratios that exceed 16:9, so for them, 2K/4K scans have fewer than 1,152/2,304 pixel rows. And there are many widescreen films whose frames have, say, the moderately narrow 1.66:1 aspect ratio, so 2K and 4K scans will have more than 1,152 and 2,304 pixel rows, respectively.
But never mind the exact number of pixel rows. 2K or 4K resolution is still better than 1080i/p. The true figure of merit is how many pixels there are per row.
Here's a graphic from the Wikipedia article Digital cinematography that shows what's going on:
The assumed aspect ratio in this illustration is 2.39:1, which is quite a bit wider, proportionally, than the 1.77:1 of 1080i/p or 720p HDTV. That means a 2K scan of the 2.39:1 film will have 857 pixel rows, and 4K will have 1,714 rows. (To get the number of pixel rows, divide 2,048 for 2K, or 4,096 for 4K, by 2.39, and round fractional values up.)
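That divide-and-round-up rule is easy to capture in code. Here's a small sketch (the function name is my own) that reproduces both the 2.39:1 figures above and the 16:9 figures from the List of common resolutions article:

```python
import math

def pixel_rows(pixels_per_row, aspect_ratio):
    """Rows in a scan of the given width at the given aspect ratio,
    rounding fractional values up as described in the text."""
    return math.ceil(pixels_per_row / aspect_ratio)

# 2.39:1 'Scope-style frame, as in the Wikipedia illustration:
print(pixel_rows(2048, 2.39))    # 2K -> 857 rows
print(pixel_rows(4096, 2.39))    # 4K -> 1,714 rows

# 16:9 frame, matching the List of common resolutions figures:
print(pixel_rows(2048, 16 / 9))  # 2K -> 1,152 rows
print(pixel_rows(4096, 16 / 9))  # 4K -> 2,304 rows
```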
This illustration represents, for each resolution, the relative number of pixels per video frame in the scanner's output. As displayed on a video screen, of course, all the resolutions would exactly fill the width of the screen. But a 4K scan would have twice the detail of 2K in the horizontal dimension and twice the detail in the vertical dimension. Hence, the respective sizes of the rectangles in the illustration represent not the image size but the amount of detail present in the various video scans.
1080i/p and 720p use a narrower-than-2.39:1 aspect ratio, 1.77:1, so when a 2.39:1 film is being scanned in OAR for 16:9 HDTV, letterboxing bars do generally need to be added. Some of the resolution that is available in the vertical dimension of the HDTV screen is accordingly wasted. Film cognoscenti don't mind; they'd rather see the film in OAR.
Moreover, the film scanner may eschew adding the letterboxing matte bars under the assumption that they will be added at some later stage of the film-to-TV-screen "bucket brigade."
TCM HD might ideally like to have 2K or 4K scans for each film in its vaults. For showing on the 1080i TCM HD cable channel, each 2K/4K archive copy would need to be downconverted to 1080i.
Another option would be to start with, specifically, a 1080p scan of any given film. Each frame of the film would be scanned to a single frame of the film scanner's 1,920 x 1,080-pixel video output. There are 24 frames per second in film, so there would be 24 fps of video output from the scanner. Each video frame would represent the entirety of one (and only one) input film frame.
The result would be "1080p24" video, where the "p" says that the video frames are "progressive": they're not separated into two video "fields" per frame, with each digital video field carrying, in odd-even alternating sequence, just the odd-numbered or just the even-numbered pixel rows of the image. Video in which there are alternating fields is "interlaced." The "i" in 1080i says the video is interlaced.
Progressive video is more filmlike than interlaced video. Films on Blu-ray use progressive video at 24 fps, and many modern HDTVs can input 1080p24 video from a Blu-ray player over an HDMI connection. That gives the ultimate in video quality on an HDTV screen.
TCM HD is stuck with transmitting 1080i video. It can't use 1080p, because cable TV (even when digital) isn't able to carry that video format; 1080p uses too many bits per second of channel bandwidth. Cable TV has to use 1080i, not 1080p. Furthermore, cable TV has to use 1080i at 60 video fields per second — "1080i60," it's called, or "1080i @ 60 Hz". That field rate amounts to 30 video frames per second, but the second half of the information in each frame arrives 1/60 second later than the first half.
If TCM has in its vaults a 1080p24 video transfer of a film, it can convert it to 1080i60 for cablecast. There are technical issues that affect the video quality of the result, but the conversion itself is otherwise pretty straightforward.
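The usual way to fit 24 film frames into 60 fields is the classic "2:3 pulldown" cadence. Here's a simplified sketch of it (the frame names and the t/b top-field/bottom-field notation are my own, and real pulldown also alternates field parity in a way this toy version glosses over):

```python
# Sketch of "2:3 pulldown": 24 progressive frames become 60
# interlaced fields per second by holding alternate film frames
# for 2 fields, then 3 fields. Four frames thus become 10 fields.

def pulldown_fields(frames):
    """Emit top ('t') and bottom ('b') fields for each film frame,
    alternating 2 fields and 3 fields per frame."""
    fields = []
    for i, frame in enumerate(frames):
        if i % 2 == 0:
            fields += [frame + "t", frame + "b"]               # 2 fields
        else:
            fields += [frame + "t", frame + "b", frame + "t"]  # 3 fields
    return fields

print(pulldown_fields(["A", "B", "C", "D"]))
# ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Ct', 'Cb', 'Dt', 'Db', 'Dt']
```

Since every 4 frames yield 10 fields, 24 frames per second yield exactly 60 fields per second.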
Among the technical issues involved in converting 1080p24 to 1080i60 is the need to avoid "interlace artifacts." One of the most problematic of the interlace artifacts has to do with scene details that are very small — smaller in their vertical dimension than the height of two adjacent scan lines or pixel rows.
In interlaced scanning, such tiny scene elements can, either wholly or partially, briefly disappear. That can happen when, for example, there is a diagonal camera pan with respect to a stationary scene. In any given video field, a tiny element of the scene may happen to partially or wholly coincide with a pixel row that is missing in that field. If so, the detail simply isn't fully represented in the field — if it's there at all.
In the very next video field, the same detail may have moved slightly with respect to the frame of the picture, owing to the camera pan. Now the detail may show up in its entirety, or only partially, or (again) not at all. In the next field in the sequence, it may show up to a different extent — and so on.
To the eye, the result of all this fine detail being shown to varying degrees in successive video fields may be an impression of false shimmering or flickering in the picture.
This is why interlaced video is often filtered. In the vertical direction with respect to the video screen, a (today, usually digital) filter can be used to remove details of the picture that may cause shimmer.
Unfortunately, filtering to avoid interlace artifacts such as shimmer also reduces the amount of "good" vertical detail in the picture, softening the image somewhat even while retaining all of the image's horizontal detail.
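A toy sketch makes the trade-off concrete. Assume a simple [1/4, 1/2, 1/4] vertical weighting — a common filter shape, though real broadcast filters are more sophisticated — applied to a column of pixels containing one single-row-high bright detail:

```python
def vertical_filter(column):
    """Blend each pixel with its vertical neighbors using
    [0.25, 0.5, 0.25] weights; edge pixels are left unchanged.
    One-row-high details get smeared across neighboring rows."""
    out = list(column)
    for y in range(1, len(column) - 1):
        out[y] = 0.25 * column[y - 1] + 0.5 * column[y] + 0.25 * column[y + 1]
    return out

# A single bright one-row detail on a dark background:
column = [0, 0, 0, 100, 0, 0, 0]
print(vertical_filter(column))
# The 100-high spike becomes a 25/50/25 smear across three rows:
# too spread out to shimmer between fields, but visibly softened.
```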
Yet another option is for TCM HD to convert to 1080i a lower-resolution film-to-video transfer for a given film.
For example, TCM might have in its vaults a DVD-quality transfer of, say, Alfred Hitchcock's 1954 classic Rear Window. If it's DVD quality, that means it's probably in the 480i format. There are 480 pixel rows per field in the interlaced video, and there are (up to) 720 pixels per row. The pixels don't have the square shape of the pixels in the formats I just talked about, so when the image is spread across a wide 16:9 screen, the apparent resolution isn't as great as it might otherwise be.
(Actually, since Rear Window was shot with an aspect ratio of 1.66:1, which is narrower than HDTV's 16:9, this film would likely be shown on TCM HD with thin vertical letterboxing bars at the sides of the screen. Or, since the term "letterboxing" properly refers to horizontal matte bars only, the term "pillarboxing" can be used instead.)
Moreover, there are 60 fields per second in this hypothetical 480i scan of Rear Window. Since 60 fields per second is not an integer multiple of film's 24 frames per second, some of the fields have to be repeated an extra time. This creates so-called "telecine judder" — telecine (which can be pronounced with three or four syllables) being how film was transferred to video before there were digital film scanners. That word, telecine, refers to the machine that was used, and also to the process of using it.
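The judder is easy to see in numbers. Under the usual 2-3 field-repetition cadence, alternate film frames are held on screen for 2 or 3 field times (a field time being 1/60 second), so successive frames get unequal screen time — this sketch just tallies it up:

```python
# How long each film frame stays on screen under 2-3 field repetition.
# A field time is 1/60 s; the alternating cadence is the usual
# telecine pattern for 24 fps film on 60 Hz interlaced video.
FIELD_TIME = 1 / 60

durations = []
for i in range(24):                  # one second of film, 24 frames
    fields = 2 if i % 2 == 0 else 3  # alternate 2 and 3 fields per frame
    durations.append(fields * FIELD_TIME)

print(sum(durations))              # the 24 frames fill one second of video
print(durations[0], durations[1])  # ~0.033 s vs 0.05 s: unequal hold times
```

That alternation between roughly 1/30-second and 1/20-second holds is what turns a smooth camera pan herky-jerky.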
Telecine judder shows up quite readily when there is a smooth camera pan across a scene. It looks herky-jerky instead. (Fast pans always involve some so-called "strobing," even when a pristine copy of the film is projected on a theater screen. Telecine judder simply accentuates it.)
Before TCM HD can show this hypothetical scan of Rear Window — "480i60," as it's technically called — the video must first be upconverted to 1080i60.
Converting among different digital video formats is called scaling. Going from a lower resolution to a higher one is scaling up or upscaling. Both are synonyms for upconverting.
The result of upscaling 480i60 to 1080i60 would be noticeably less video resolution than true 1080i60, because video upconversion can never increase true resolution.
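A crude sketch shows why. Nearest-neighbor scaling — the simplest possible method; real scalers interpolate more cleverly, but the principle is the same — just repeats existing rows, so no new picture information appears. (And note that 480 to 1,080 rows is actually a non-integer 2.25× factor, which is part of what makes clumsy scaling produce artifacts; the sketch uses a clean 2× for clarity.)

```python
def upscale_rows(rows, factor):
    """Nearest-neighbor vertical upscaling: each source row is
    repeated; no new picture information is created."""
    out = []
    for y in range(len(rows) * factor):
        out.append(rows[y // factor])
    return out

rows = ["row0", "row1", "row2", "row3"]   # a tiny stand-in image
big = upscale_rows(rows, 2)
print(big)
# ['row0', 'row0', 'row1', 'row1', 'row2', 'row2', 'row3', 'row3']
# Twice as many rows, but still only four distinct ones: the true
# resolution of the content is unchanged.
```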
There would be just as much telecine judder as in the 480i60 transfer.
Moreover, there could be nasty video artifacts visible in the image that results from the upconversion. Such artifacts might include aliasing, making for spurious moiré patterns in the picture:
(Click the image to enlarge it, then look at the brick walls behind the girl to see the false moiré pattern.)
The aliasing/moiré artifact in a still picture such as that one can get worse when parts of the scene are in apparent motion. If a TV camera taking a picture of a brick wall zooms outward, when the apparently "moving" bricks get small enough, moiré can suddenly appear.
Similarly, a clumsy upconversion can in effect add "tiny bricks" — false detail — that then leads to the shimmering problem described earlier during camera pans, unless the added false detail is filtered out of the 1080i image (along with some "good" detail, unfortunately).
In fact, technically speaking, aliasing, the moiré effect, and shimmering and similar interlace artifacts are all examples of the same underlying problem: image details that are, in size, too near to the sizes of individual pixels or (for interlaced video) pixel rows.
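That common thread can be shown with a few lines of sampling arithmetic. In this toy sketch (real moiré is two-dimensional, but the mechanism is the same), a stripe pattern that repeats every 3 pixels is sampled only every 2 pixels — finer than the sampling grid can faithfully capture — and a false, slower pattern, the alias, comes out:

```python
# A stripe pattern that repeats every 3 pixels, sampled every 2 pixels.
def stripes(x):
    """1 on the first two pixels of each 3-pixel cycle, 0 on the third."""
    return 1 if (x % 3) < 2 else 0

fine   = [stripes(x) for x in range(12)]        # the real pattern
coarse = [stripes(x) for x in range(0, 12, 2)]  # sampled every 2 pixels

print(fine)    # [1, 1, 0, 1, 1, 0, ...]: true period of 3 pixels
print(coarse)  # [1, 0, 1, 1, 0, 1]: the samples trace out a pattern
               # repeating every 6 original pixels -- a false, slower
               # "beat" that was never in the scene.
```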
There are other artifacts, too, that can appear (or be accentuated) when digital video is upscaled from a lower resolution to a higher one. That's why it is to be hoped that TCM HD will ultimately replace its non-HD vault copies with 1080p, 2K, or 4K scans — or, better yet, 8K scans! — which can then be skillfully downconverted to 1080i with (hopefully) a minimum of visual artifacts.