Sunday, July 02, 2006

Pixel-Perfect 1080p from DVD?

Although I've recently come very close to deciding on a Pioneer Elite plasma TV (see My Bedroom: Crying Out for HDTV? and More on Pioneer's Elite PRO-1130HD), now I feel I ought to wait. The reason is that this $5,500 Pioneer model has only a 1280x768p screen. It will never yield pixel-perfect pictures from 1080p HD DVD or Blu-ray discs. The 50" Pioneer Elite PRO-FHD1, which fully supports 1080p, will do so, but it's a brand-new model just being introduced at a way-high price of $10,000!

What do I mean by pixel-perfect pictures from high-definition DVDs? Both HDs (HD DVDs) and BDs (Blu-ray discs) apparently do/will contain video encoded at 1080p: progressively scanned, non-interlaced frames containing 1,080 lines at 1,920 pixels per line. Pixel-perfect images can result if those frames are transmitted from the DVD player to the TV as such, and displayed as such by the TV.

Instead, the initial crop of HD DVD players convert the 1080p frames on the discs to 1080i for transfer to the TV.

Specifically, they take a 1080p/24 image (in the case of a movie shot at the standard frame rate of 24 film frames per second) and make from it interlaced 1080i/60 video, delivered at 60 fields, or half-frames, per second.

The first of the two half-frames in each 1080i frame contains just the odd-numbered lines of pixels extracted from a given 1080p frame. The second contains just the even-numbered lines, taken from either the same 1080p frame or an adjacent one in the original sequence. So not every 1080i/60 frame represents a single 1080p frame, even though each 1080p frame does, of course, represent a single film frame.

That's because 1080i/60 video, with 60 fields per second, has 30 2-field frames each second, and 30 is not evenly divisible by 24. In converting 1080p/24 video to 1080i/60, a technique called 2:3 pulldown is used. One 1080p/24 frame is used to generate two 1080i/60 fields, then the next 1080p/24 frame spawns three fields, then back to two, then three, and so on. So some of the 1080i/60 frames (specifically, two frames out of every five) turn out to be interlaced hybrids of two 1080p/24 progressive frames.
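
Just to make that cadence concrete, here's a tiny Python sketch of my own (the frame labels and function name are invented for illustration; a real player splits actual 1,920 x 1,080 pixel data, not letters):

    def pulldown_2_3(progressive_frames):
        """Return (source_frame, field_parity) pairs in a 2-3 cadence."""
        fields = []
        for i, frame in enumerate(progressive_frames):
            count = 2 if i % 2 == 0 else 3   # alternate: 2 fields, then 3
            for _ in range(count):
                # Field parity alternates continuously across the whole stream:
                # odd lines, even lines, odd lines, ...
                parity = "odd" if len(fields) % 2 == 0 else "even"
                fields.append((frame, parity))
        return fields

    # Four film frames (1/6 second at 24 fps) become ten fields (1/6 second at 60i).
    for source, parity in pulldown_2_3(["A", "B", "C", "D"]):
        print(source, parity)
    # A odd, A even, B odd, B even, B odd, C even, C odd, D even, D odd, D even

Pairing those ten fields off into 1080i frames, the first two frames are pure A and pure B, the third mixes B and C, the fourth mixes C and D, and the fifth is pure D: two hybrids out of every five frames, just as described above.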

When those hybrid frames are displayed on a CRT-based HDTV, which is designed to handle interlaced video, everything is fine, since the two fields in each frame are temporally separated by 1/60 second. But a fixed-pixel HDTV such as a plasma flat panel has to re-integrate the two fields of each interlaced frame to come up with a full progressive frame each time. To do that properly when 2:3 pulldown has been done, its deinterlacing circuits have to be smart enough to detect the 2-3 field cadence and perform what is known as inverse telecine on it.

That name comes from the device that has traditionally been responsible for doing film-to-video transfers, the telecine — though today the old-fashioned telecine is being replaced by the digital film scanner. The telecine (or scanner) is what normally does the 2:3 pulldown (so called because the film is jerkily pulled down in a cadence that holds each film frame still for two, then three, then two video fields, and so on) when the video is to be recorded or transmitted in interlaced frames. When the video is being recorded in progressive frames, as on HD DVD or Blu-ray discs, no 2:3 pulldown is needed.

By the way, 2:3 pulldown is often referred to as 3:2 pulldown, since a 3-2 field cadence is basically the same as a 2-3 cadence, once you get started.

It's easy for a 1080p HDTV with the right internal smarts to invert the 2-3 (or 3-2) cadence and reconstruct the original 1080p/24 video — provided it knows that's what it must do. It can then deliver the reconstructed 1080p/24 frames to its 1,920 x 1,080 screen at that frame rate, or, to avoid flicker, at 48 or 72 frames per second, flashing each frame two or three times in succession.
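
Continuing the toy example, here is roughly what the inversion amounts to — again just an illustrative sketch of mine, which assumes the cadence is already known (a real TV has to detect it from the fields themselves, as discussed next):

    def inverse_telecine(fields, cadence=(2, 3)):
        """Collapse a known 2-3 field cadence back into progressive frames."""
        frames, i, step = [], 0, 0
        while i < len(fields):
            count = cadence[step % len(cadence)]
            group = fields[i:i + count]
            # Every field in this group was cut from the same film frame, so
            # one odd-line field plus one even-line field rebuilds it exactly.
            frames.append(group[0][0])
            i += count
            step += 1
        return frames

    def flash_for_refresh(frames, times=3):
        """Show each 24p frame 3 times for 72 Hz (or twice for 48 Hz)."""
        return [f for f in frames for _ in range(times)]

    fields = [("A", "odd"), ("A", "even"),
              ("B", "odd"), ("B", "even"), ("B", "odd"),
              ("C", "even"), ("C", "odd"),
              ("D", "even"), ("D", "odd"), ("D", "even")]
    print(inverse_telecine(fields))        # ['A', 'B', 'C', 'D']
    print(flash_for_refresh(["A", "B"]))   # ['A', 'A', 'A', 'B', 'B', 'B']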

Problems can arise, however, from the fact that the TV's internal circuits must inspect the incoming 1080i/60 fields to see if they possess telltale signs of the 2-3 cadence. The TV can be fooled briefly by a sequence of fields that don't lend themselves to that internal inspection logic, such as when there is a jump cut in the editing of the film, causing the TV to lose faith in the previously detected 2-3 cadence. When that happens, the TV is apt to stumble for a few frames and knit together two fields per frame that don't belong together. Result: unnecessarily jagged vertical and diagonal edges on objects that are in motion horizontally with respect to the borders of the picture.
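
To see why the detection is fragile, consider the kind of check a deinterlacer might make. This is strictly a toy of my own devising, not any real TV's algorithm; it boils each field down to a single number and looks for the near-duplicate field that 2:3 pulldown leaves in every group of five:

    # In 2:3 pulldown, one field out of every five is a near-duplicate of the
    # same-parity field two positions earlier (both were cut from the same film
    # frame). A jump cut or noisy source breaks that pattern, and the detector
    # loses its lock on the cadence.

    def find_repeat_fields(fingerprints, threshold=0.5):
        """Indices where a field nearly matches the field two slots before it."""
        return [i for i in range(2, len(fingerprints))
                if abs(fingerprints[i] - fingerprints[i - 2]) < threshold]

    def cadence_locked(repeat_positions):
        """True if the near-repeats arrive on a steady five-field cycle."""
        return len(repeat_positions) >= 2 and all(
            b - a == 5 for a, b in zip(repeat_positions, repeat_positions[1:]))

    # Ten film-sourced fields; the source frames run A A B B B C C D D D.
    clean = [10, 11, 20, 21, 20, 31, 30, 41, 40, 41]
    repeats = find_repeat_fields(clean)
    print(repeats, cadence_locked(repeats))   # [4, 9] True -- cadence detected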


Another alternative is to let the TV off the hook and have it display 1080i/60 at 60 frames per second, by generating a full 1080p/60 video frame from each and every incoming 1080i/60 field. It can do that by one of several methods, the simplest of which is to replicate each line in each field, line by line. For example, in a field where just the odd-numbered lines are present, line 1 also becomes line 2 of the newly generated full frame, line 3 becomes line 4, and so on.
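
In miniature, with each field represented as a list of pixel rows (a real 1080i field has 540 rows of 1,920 pixels), that simplest method looks something like this sketch of mine:

    # Line replication at its crudest: build a full progressive frame out of a
    # single field by writing every field row twice.

    def line_double(field_rows):
        frame = []
        for row in field_rows:
            frame.append(row)        # the line the field actually carries
            frame.append(list(row))  # a copy standing in for the missing line
        return frame

    field = [[0] * 1920 for _ in range(540)]   # one 1080i field: 540 rows
    frame = line_double(field)
    print(len(frame))                          # 1080 rows, but only 540 distinct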

There are accordingly no jaggies that result from inappropriately knit-together fields, but an unfortunate side effect is that vertical resolution is effectively halved, from 1,080 lines to 540. Another unfortunate side effect is that a herky-jerky judder is introduced into the image. Objects in motion across the screen don't move at a constant rate, owing to the fact that information from each original group of 24 frames is being parceled out at the rate of 60 frames per second, and 60 is not an even multiple of 24.
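
The arithmetic behind that judder is easy to show. Here's my own illustration of the frame-repetition pattern, not any particular TV's scheme:

    # 60 / 24 = 2.5, so 24p frames shown at 60 frames per second can't all be
    # repeated the same number of times. The usual compromise is 3, 2, 3, 2, ...
    # which means on-screen motion advances in an uneven long-short rhythm.

    def repeats_at_60(num_film_frames):
        return [3 if i % 2 == 0 else 2 for i in range(num_film_frames)]

    counts = repeats_at_60(24)
    print(sum(counts))   # 60 -- one second of film exactly fills one second of video
    print(counts[:6])    # [3, 2, 3, 2, 3, 2] -- hence the herky-jerky motion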

Other, more complex methods of creating 1080p/60 from 1080i/60 are possible. In fact, some such method is absolutely necessary for the 1080p HDTV to perform when the original signal is 1080i/60 — say, a hi-def sports broadcast on NBC — and is not the result of 2:3 pulldown from film. It is even possible for some of these methods to smooth out the herky-jerky judder problem entirely. However, few if any of today's HDTVs use such advanced, processing-intensive methods yet.


So when the source is film, wouldn't it be nice if the 1080p/24 material on HD DVD or Blu-ray disc did not have to be subjected to the vagaries of on-the-fly 2:3 pulldown in the player and consequent inverse-telecine processing in the TV?

Then there would be no herky-jerky judder. No jaggies. No halving of vertical resolution. And none of the other so-called "interlace artifacts" that interlaced video — especially when it's not properly deinterlaced — is prone to.


You'd then have pixel-perfect 1080p from DVD.

To get it, first, you'd need a player that can be set to output 1080p/24 on an HDMI digital hookup to the TV. None of the initial HD DVD player models can do that, though it's promised for follow-on models. (The first Blu-ray players, for which aficionados are now anxiously waiting, are said to be able to output 1080p/24 on HDMI.)

You'd also need a TV that can both receive 1080p/24 over HDMI and display it as such (though possibly at 2X/3X frame rates for flicker avoidance) on its screen. Many of the so-called 1080p HDTVs being sold right now can't input 1080p at any frame rate. They have 1080p screens, but no 1080p inputs. (That's what I mean by saying the $10,000 Pioneer Elite PRO-FHD1 plasma "fully supports" 1080p. It has both the 1080p screen resolution and the 1080p input capability — at multiple frame rates, not just 24 fps.)


You might wonder how the DVD player would know to output 1080p/24 to the TV, by the way, since in some cases the original video would have been shot at, say, 1080i/60. Wouldn't the player be just as inclined to get mixed up by these various frame-rate/interlacing combinations as the TV?

Thankfully, no. As long as the DVD has been mastered/authored properly — which unfortunately is not a given, with standard-def DVDs today — there are flags and other information stored with the digitized video on even a standard-definition DVD which can clue the player in as to what kind of video source material it is dealing with.

Those flags have always been optional with standard-definition DVDs. When used at all, they have sometimes been misused. Which means progressive-scan DVD players, whose duty it is to deinterlace the video on the disc and send it in progressive form to the TV, sometimes get fooled and render smooth edges as jagged, etc. (When that happens, the cognoscenti say there is "flagging" in the picture, since the erroneously comb-like vertical edges resemble the stripes on an American flag.)

But that shouldn't be a problem with hi-def DVDs/players produced by either the HD DVD camp or the Blu-ray camp. The hi-def discs that are encoded at 1080p/24 have to be identified as such for the player to be able to work right at all. The only question is whether the player will convert their contents to 1080i/60 for transmission to the TV or leave them at 1080p/24 for a pixel-perfect image on the screen.
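
Put another way, once the disc is honestly labeled, the player's choice is trivial. Something like this purely hypothetical logic (the names are mine, not anything from the HD DVD or Blu-ray specs):

    # Hypothetical sketch of the player's output decision. The disc tells the
    # player its video is 1080p/24; whether the viewer gets it untouched
    # depends only on what the player is willing (or set) to send over HDMI.

    def choose_output(disc_format, player_can_send_1080p24):
        if disc_format == "1080p/24" and player_can_send_1080p24:
            return "1080p/24 passed straight through"      # the pixel-perfect path
        if disc_format == "1080p/24":
            return "2:3 pulldown applied, 1080i/60 sent to the TV"
        return disc_format                                 # native interlaced material

    print(choose_output("1080p/24", False))   # what the first HD DVD players do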

As I say, such pixel-perfect 1080p images can happen only if the DVD player and the HDTV both do their part. The player has to be able to output 1080p/24 (at least as a selectable option), and the 1080p TV must accept it and display it at a user-selectable 24, 48, or 72 frames per second.

Right now, there are zero HD DVD players that output 1080p/24 ... as we still await the first Blu-ray players. Though there are some HDTVs that input and display 1080p/24 at its native frame rate or a multiple thereof, they tend to be brand new, pricey models such as the Pioneer Elite PRO-FHD1, not models for the masses.

Another thing: it's devilishly hard to find out what the input and frame-rate capabilities of a so-called 1080p HDTV are. Information such as this tends to be buried in owner's manuals, which can sometimes be obtained in PDF form online and inspected, but it is often not revealed in the technical specifications published on the manufacturer's web site.


That situation is sure to change as consumers grow more savvy. A year or so from now, enough people will have become aware that pixel-perfect, movie-style images from 1080p/24-native DVDs are in store for them if they only get the right equipment and use it properly. They will demand such equipment, and demand that it be conspicuously labeled as having the end-to-end 1080p/24 capability needed for pixel-perfect hi-def pictures.

Moreover, prices on such "advanced" gear will come down from their present stratosphere. Not that the most bargain-priced TVs will support the "24p" (or "48p" or "72p") 1,920 x 1,080 frame rate; it takes extra smarts to do that while still supporting the more standard "60i" or "60p" rate. Extra smarts cost extra bucks. But the price premium for the extra smarts will shrink as manufacturers build those smarts into more and more models.


So I think I've changed my mind yet again about 1080p. I said in My Bedroom: Crying Out for HDTV? that "the magic distance for a 50-inch 1080i/p 16:9 TV is about 6 to 7 feet. Any seating distance beyond that loses effective resolution. Somewhere between 9 and 10 feet, you can no longer tell the difference between 1080i/p and 720p." By that logic, 1080p didn't seem to hold that big an attraction for me.

That's still true. I'd be sitting (actually, reclining) at least 10 feet from my anticipated 50" bedroom screen, and my eyes wouldn't be able to see the difference between 1080p and 720p, resolution-wise. But I think I'd be able to see the artifacts that 1080i conversion in the hi-def DVD player and subsequent deinterlacing in the HDTV might introduce, and I figure why put up with them. No, better to wait until all the puzzle pieces are in place for an affordable, pixel-perfect 1080p/24 picture on my bedroom screen.
