Sunday, May 28, 2006

1080p from High-Definition DVDs?

Now debuting: two, count 'em, two, mutually incompatible formats for high-definition DVDs: HD DVD and Blu-ray. These are the first two commercially available standards for video discs whose players (not the old-style DVD players we have now) can feed a suitable HDTV all the resolution the TV can reproduce.

Unless, that is, the HDTV uses the gold standard of high-definition television display: 1080p resolution. The initially available HD DVD players don't output 1080p. And it's not absolutely clear which, if any, of the Blu-ray players we're eagerly awaiting over the next few months will deliver 1080p.

In the last year or so, TVs whose "native" resolution is 1080p have sprung up, big as life, and taken a noticeable slice of the market. Their screens offer fully 1,080 rows of 1,920 pixels each, yielding the best spatial resolution available in consumer video history: roughly two million pixels overall on the screen.

1080i, the standard in use by many over-the-air HDTV broadcasters, has the same number of pixels as 1080p, but only half of them actually change with each screen update: first the odd-numbered rows, and then, a fraction of a second later, the even-numbered rows. This alternation of scan lines is "interlaced scanning," the origin of the "i" in 1080i.

In 1080p — "p" for "progressive scanning" — the entire pixel array gets updated, every time. Motion is smoother. Jaggies at the edges of moving objects, "twittering" scan lines, and now-you-see-it-now-you-don't tiny details — symptoms experts call "interlace artifacts" — are pretty much history.
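
If it helps to see the difference in code rather than prose, here's a toy Python sketch of the two scanning schemes. It's purely illustrative (the grid is shrunk way down from 1,920 x 1,080, and the function names are my own invention), but it captures the key point: a progressive update repaints every row, while an interlaced update repaints only every other row per pass.

    # Toy sketch of progressive vs. interlaced updates. The grid is shrunk
    # from 1,920 x 1,080 for readability; the logic is the same.
    ROWS, COLS = 6, 8   # stand-ins for 1,080 rows of 1,920 pixels each

    def progressive_update(screen, new_frame):
        """1080p-style: every row is repainted on every update."""
        return [row[:] for row in new_frame]

    def interlaced_update(screen, new_field, odd_pass):
        """1080i-style: only half the rows are repainted per update --
        the odd-numbered scan lines on one pass, the even ones on the next."""
        updated = [row[:] for row in screen]
        start = 0 if odd_pass else 1   # index 0 is scan line 1, the first "odd" row
        for r in range(start, ROWS, 2):
            updated[r] = new_field[r][:]
        return updated

    # And the multiplication behind "roughly two million pixels":
    print(1920 * 1080)   # 2073600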


So, do HD DVD and Blu-ray support 1080p?

The answer is, alas, complicated. In order to get 1080p to your eyeballs, you need at least four things:

  • a disc encoded at 1080p
  • a DVD player that can output 1080p to the TV
  • a TV that can receive 1080p from the player
  • 1080p native resolution at the TV screen

Reportedly, most HD DVD and Blu-ray discs encode their main content, usually a movie, at 1080p, so no worry there.

Any TV that is advertised as 1080p-capable must have 1080p as its native screen resolution, so this is not a huge problem (as long as you happen to own such a TV).

But most of the initial crop of "1080p" TVs cannot actually receive 1080p signals; 1080i is their maximum input capability. (That situation may have changed with the most recently introduced 1080p TVs, however.)

What's more, few if any of the initial HD DVD players actually output 1080p. (That may not be as true for Blu-ray; see below.) They convert the 1080p on the disc to 1080i and output that. The 1080p-native TV receives the signal as 1080i and deinterlaces it for display on the screen. That interlace-deinterlace round trip can introduce pesky artifacts.


Foolish, right, for HD DVD players not to support full-fledged 1080p output from the get-go? Well, part of the reason for the foolishness is that 1080p must travel between the player and the TV, if at all, along an HDMI cable. HDMI is a standard for the transmission of video data, audio, and other digital goodies between source devices such as DVD players and TVs. It uses HDCP copy protection to keep anyone from intercepting the digital stream and diverting it to their own (illegal) advantage — so it's very, very complicated.

But HDMI/HDCP's ability to transmit and receive 1080p was originally strictly optional, and it is only now appearing in consumer electronics gear for the first time. The initial HD DVD players, meanwhile, stick with an older implementation of HDMI that doesn't carry 1080p.

For instance, one of the very first HD DVD players is the Toshiba HD-A1. According to Amazon.com, it can output either 720p or 1080i at its HDMI connection. No 1080p.

As for Blu-ray, it looks as if at least some of the initial player models will output 1080p on HDMI. For example, the yet-to-arrive (as of early June, 2006) Sony BDP-S1 will reportedly do so.


Another complicating factor: there are different flavors of 1080p, distinguished by their frame rates. How many frames of video per second are going to be transmitted? There are at least three popular answers: 24, 30, and 60.

24 fps (frames per second) matches the rate at which movies are shot. Hence, most HD DVD and Blu-ray discs are encoded at 1080p/24.

30 fps is typical of television broadcasts, both standard definition and high. If they're 1080i hi-def, as opposed to 720p, they are more precisely 1080i/60, not 1080i/30, since there are two interlaced "fields" per frame, one for the odd-numbered scan lines and one for the even. With interlaced transmission the number after the '/' is the field rate, not the frame rate ... and by the way, often the '/' is omitted: you'll see "1080p24," "1080i60," etc., instead of designations with slashes.

So the frame rate of 1080i/60 is actually 30 fps. 720p high-definition television (1,280 pixels across the screen by 720 vertically) doubles that frame rate to 60 fps: 720p/60. In 720p there are fewer pixels per frame than either 1080i or 1080p, but they're updated twice as often as 1080i/60. 720p is excellent for fast-action sports.
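
For the arithmetic-minded, here's a little back-of-the-envelope Python comparison of the formats mentioned so far. The pixel counts and rates are the published ones; the script itself is just my illustration, not anything a TV or player actually runs.

    # Rough pixel arithmetic for the formats discussed above. For the
    # interlaced format, the number after the slash is the field rate, and
    # each field carries only half the rows.
    formats = {
        # name:     (columns, rows, updates per second, fraction of rows per update)
        "1080p/24": (1920, 1080, 24, 1.0),
        "1080p/60": (1920, 1080, 60, 1.0),
        "1080i/60": (1920, 1080, 60, 0.5),   # 60 fields/sec = 30 full frames/sec
        "720p/60":  (1280,  720, 60, 1.0),
    }

    for name, (cols, rows, rate, fraction) in formats.items():
        per_frame = cols * rows
        per_second = int(per_frame * fraction * rate)
        print(f"{name}: {per_frame:,} pixels per full frame, "
              f"{per_second:,} pixels sent per second")

Run it and you'll see why 720p/60 and 1080i/60 land in roughly the same ballpark of pixels per second: they just split the difference between spatial and temporal resolution in opposite ways.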

1080p/60, with 60 full frames every second, is another flavor of 1080p. Some current and/or soon-to-come "1080p" TVs apparently will accept 1080p/60 input, which can happen if the DVD player converts 1080p/24 to 1080p/60.

(Today, it's not easy to find out such precise frame-rate details about TVs and DVD players. Here's hoping that changes soon.)


What if the TV can only display 1080p at 30 fps or 60 fps, not at 24 fps, and what's on the disc is 1080p/24? Then either the player or the TV (typically the player) must perform a conversion. Again, as with any type of scan conversion, there is a potential for visible artifacts to result.

In this situation, the main reason for artifacting is that 60 is not an even multiple of 24. Neither is 30, for that matter. If the 1080p/24 on the disc is converted to 1080p/60, 1080p/30, or even 1080i/60, some frames or fields in the output will necessarily be interdigitated hybrids of two source frames, which can lead to ragged edges on moving objects. Or else the 24 input frames per second will be parceled out unevenly among the output frames (alternately three and two apiece in the 1080p/60 case, the familiar "3:2 pulldown" cadence), making for slightly jerky motion.
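
If you'd like to see that uneven parceling-out spelled out, here's a quick Python sketch (mine, purely illustrative) of the 24-to-60 case:

    # 60/24 = 2.5, so 24 source frames per second can't all be held for the
    # same number of output frames. The usual "3:2 pulldown" cadence holds a
    # frame for three outputs, the next for two, and so on.
    def cadence(src_fps, dst_fps, n_src_frames):
        """Map each output frame back to the source frame it repeats."""
        n_out = n_src_frames * dst_fps // src_fps
        return [out * src_fps // dst_fps for out in range(n_out)]

    print(cadence(24, 60, 4))
    # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] -- frame 0 held 3 times, frame 1 only 2 ...

That alternating 3-2-3-2 rhythm is exactly what makes a smooth camera pan look ever so slightly lurchy.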

To avoid such motion artifacts, the TV ought to be able to operate at a 1080p frame rate that is an exact multiple of what's on the disc: say, 72 fps. 48 fps would work, too. And incidentally, the rate at which the TV "paints" frames on the screen is its "refresh rate," and is stated in Hertz or cycles per second. 48 frames per second is 48 Hz. 72 fps is 72 Hz.

A 24-Hz refresh rate with a one-to-one correspondence of output frames to incoming 1080p/24 frames would not work well, unfortunately. It would produce annoying flicker on any bright video display screen. (The reasoning is similar to why motion pictures are projected with each frame illuminated twice in 1/24 of a second.)

Ideally, the player would supply the TV with 1080p/24, and the TV could convert it to, say, 1080p/72 (or 1080p/48) to avoid the flicker common when bright video displays use a 24-fps refresh rate. That conversion is straightforward (each film frame is simply shown three times at 72 fps, or twice at 48) and does not produce visible motion artifacts.
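
Feeding the 72-Hz and 48-Hz cases to the same little cadence helper from the sketch above shows why they're the easy ones:

    print(cadence(24, 72, 4))
    # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3] -- every frame held exactly 3 times
    print(cadence(24, 48, 4))
    # [0, 0, 1, 1, 2, 2, 3, 3]             -- or exactly twice at 48 Hz

Every source frame gets the same amount of screen time, so motion stays as even as it was on film.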


Mind you, just learning such details about 1080p-native HDTVs and the initial crop of HD DVD and Blu-ray players ain't easy. And let's face it, the ideal hookup as I envision it will look only a smidgen better than converting the 1080p/24 on the disc to 1080i/60 for HDMI transmission to the TV, assuming the TV could handle it, and then to (say) 1080p/30 for display on the TV screen at its native resolution. You'd have to be some kind of purist to even care, right?

Well ... as the incipient high-definition DVD format war heats up, we shall see whether people really do care, shall we not?

After all, the early adopters who spend the big bucks now, at the dawn of hi-def DVD, may find later that waiting a few months, while all the pieces fall into place for an end-to-end 1080p DVD experience without unnecessary artifacts, would have gotten them a better picture for little if any extra cost.

They're sure to be miffed, right?

Don't say I didn't warn them.
