In the highest-definition digital TV format available over the air, 1080i, the width of the 16:9-aspect-ratio image holds fully 1,920 pixels. That number is accordingly the upper limit on "horizontal spatial resolution," which in turn means that no detail narrower than 1/1920 of the screen's width (a single pixel) can be resolved.
In a 2K scan, the number of pixels across the width of the scanned film frame goes up to at most 2¹¹, or 2,048. In a 4K scan, that upper limit is doubled, to 2¹², or 4,096.
The actual count of pixels produced horizontally in a 2K or 4K scan can be less than the stated upper limit: for example, when the aperture used by the film camera to enclose the film frame doesn't allow the image to span the whole 35-mm frame from "perforation to perforation," or when room is left along one edge of the film for a soundtrack. Furthermore, fewer pixels will be generated vertically than horizontally, since the image is wider than it is tall.
Even so, 2K and 4K scans generally represent levels of visual detail higher (though, in the case of 2K, only somewhat higher) than a so-called "HD scan" at 1920 x 1080.
With the advent of high-definition 1080i DVDs expected imminently (see HDTV ... on DVD Real Soon Now?), do yet-more-detailed film scans at sampling rates such as 2K and 4K have any relevance to us now?
Apparently, yes. Experts seem to agree that it's a good idea to digitize anything — audio, video, or what-have-you — at a sampling frequency a good deal higher than the eventual target rate. The term for this is "oversampling."
For example, The Quantel Guide to Digital Intermediate says on pp. 17-18, "Better results are obtained from ‘over sampling’ the OCN [original camera negative]: using greater-than-2K scans, say 4K. All the 4K information is then used to produce a down converted 2K image. The results are sharper and contain more detail than those of straight 2K scans. Same OCN, same picture size, but better-looking images."
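To make that down-conversion concrete, here's a minimal sketch in Python (my own illustration, not Quantel's actual pipeline) that reduces a 4K scan to a 2K frame with a proper resampling filter via the Pillow imaging library; the file names, and the assumption that the scan is exactly twice the target size, are mine.

    from PIL import Image

    # Hypothetical 4K scan, e.g. 4096 x 3112 pixels from the film scanner.
    scan_4k = Image.open("scan_4k.tif")

    # Halve each dimension to reach the 2K picture size.
    target_size = (scan_4k.width // 2, scan_4k.height // 2)

    # Lanczos resampling folds a neighborhood of 4K source pixels into each
    # 2K output pixel, so the extra scanned detail yields a cleaner, sharper
    # 2K image than a straight 2K scan of the same negative would.
    frame_2k = scan_4k.resize(target_size, Image.LANCZOS)
    frame_2k.save("frame_2k.tif")

Every 2K output pixel is computed from several 4K source pixels, which is the "over sampling" payoff Quantel describes.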
Furthermore, according to "The Color-Space Conundrum," Douglas Bankston's two-part technical backgrounder for American Cinematographer online (here and here):
A 2K image is a 2K image is a 2K image, right? Depends. One 2K image may appear better in quality than another 2K image. For instance, you have a 1.85 [i.e., 1.85:1 aspect ratio] frame scanned on a Spirit DataCine [film scanner] at its supposed 2K resolution. What the DataCine really does is scan 1714 pixels across the Academy [Ratio] frame (1920 from [perforation to perforation]), then digitally up-res it to 1828 pixels, which is the Cineon Academy camera aperture width (or 2048 if scanning from perf to perf, including soundtrack area). ... Now take that same frame and scan it on a new 4K Spirit at 4K resolution, 3656x2664 over Academy aperture, then downsize to an 1828x1332 2K file. Sure, the end resolution of the 4K-originated file is the same as the 2K-originated file, but the image from 4K origination looks better to the discerning eye. The 4096x3112 resolution file contains a tremendous amount of extra image information from which to downsample to 1828x1332. That has the same effect as oversampling does in audio.
In 1927, Harry Nyquist, Ph.D., a Swedish immigrant working for AT&T, determined that an analog signal should be sampled at twice the frequency of its highest frequency component at regular intervals over time to create an adequate representation of that signal in a digital form. The minimum sample frequency needed to reconstruct the original signal is called the Nyquist frequency. Failure to heed this theory and littering your image with artifacts is known as the Nyquist annoyance – it comes with a pink slip. The problem with Nyquist sampling is that it requires perfect reconstruction of the digital information back to analog to avoid artifacts. Because real display devices are not capable of this, the wave must be sampled at well above the Nyquist limit – oversampling – in order to minimize artifacts.
So avoiding artifact creation — "the Nyquist annoyance" — is best done by oversampling at more than twice the sampling frequency of the intended result and then downconverting to the target frequency.
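As a rough illustration of the principle (my own sketch, drawn from neither source), the Python snippet below samples a tone that lies above the target rate's Nyquist limit. Keeping every Nth sample lets that tone alias into the result as false detail, whereas oversampling and then down-converting with SciPy's decimate(), which low-pass filters before discarding samples, removes it. The rates and the 4,000-Hz tone are arbitrary assumptions.

    import numpy as np
    from scipy import signal

    fs_high = 48000                # oversampled rate, in samples per second
    fs_low = 6000                  # target rate; its Nyquist limit is 3,000 Hz
    t = np.arange(0, 1.0, 1.0 / fs_high)

    # A 4,000 Hz tone: "detail" the 6,000 Hz target rate cannot represent.
    tone = np.sin(2 * np.pi * 4000 * t)

    # Naive down-conversion: keep every 8th sample. The 4,000 Hz tone folds
    # ("aliases") down to a spurious 2,000 Hz tone, the Nyquist annoyance.
    naive = tone[::8]

    # Oversampled down-conversion: decimate() low-pass filters below the new
    # Nyquist limit before discarding samples, so the unrepresentable tone is
    # removed instead of masquerading as false detail.
    clean = signal.decimate(tone, 8)

    print(f"naive RMS: {np.sqrt(np.mean(naive ** 2)):.3f}")   # ~0.707: aliased tone present
    print(f"clean RMS: {np.sqrt(np.mean(clean ** 2)):.3f}")   # ~0.000: artifact filtered out

The same logic applies to pixels: down-converting a 4K scan to 2K, or a 2K scan to HD, should filter away detail the target raster can't carry before the samples are thrown out.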
This would seem to imply that it would be wise to transfer a film intended for Blu-ray or HD DVD at a scan rate of at least 2K — but 4K would be better!
It's hard to know how much the "Nyquist annoyance" would mar a typical 2K-to-HD transfer, since (see View Masters) most projected 35-mm movies have way less than 1920 x 1080 resolution to begin with. Original camera negatives generally have up to 4K resolution, or even more, but by the time release prints are made from internegatives which were made from interpositives, you're lucky to have 1K resolution, much less HD or 2K.
So if the "Nyquist annoyance" became an issue with an OCN-to-2K-to-HD transfer, the transfer could be intentionally filtered below full 1920 x 1080 HD resolution and still provide more detail than we're used to seeing at the movies.
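If that sort of safety margin were wanted, the band-limiting step itself is straightforward. The sketch below (again my own, with a hypothetical file name and an arbitrary blur radius) softens a 2K frame slightly before it's down-converted to a 1,920-pixel-wide HD raster, attenuating detail near the HD Nyquist limit instead of letting it alias.

    from PIL import Image, ImageFilter

    frame_2k = Image.open("frame_2k.tif")        # hypothetical 2K master frame

    # A gentle Gaussian blur acts as a low-pass filter, rolling off detail
    # near the limit of what a 1,920-pixel-wide raster can represent. The
    # radius is an arbitrary illustrative choice, not a calibrated value.
    band_limited = frame_2k.filter(ImageFilter.GaussianBlur(radius=0.7))

    # Down-convert to HD width, preserving the frame's own aspect ratio
    # (real transfers also handle letterboxing, which is omitted here).
    target_w = 1920
    target_h = round(frame_2k.height * target_w / frame_2k.width)
    frame_hd = band_limited.resize((target_w, target_h), Image.LANCZOS)
    frame_hd.save("frame_hd.tif")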
All the same, video perfectionists will surely howl if filtering is done to suppress the "Nyquist annoyance." They'd presumably be able to see a real difference between a filtered OCN-to-2K-to-HD transfer and an unfiltered OCN-to-4K-to-HD transfer, given the opportunity.
All of which tells me the handwriting is on the wall. Someday in the not-far-distant future, aficionados are going to be appeased only by HD DVD or Blu-ray discs that are specifically "4K digitally scanned," or whatever the operative jargon will be. All of the farsightedness ostensibly displayed today by studios using "HD scans" for creating standard-def DVDs — intending one day to reuse those same transfers for Blu-ray or HD DVD — will turn out to be rank nearsightedness. Sometimes the march of progress turns into a raging gallop!