Wednesday, May 31, 2006

More Plasma Anomalies

As I said in an earlier post, my 32" Hitachi plasma can make black-and-white material look faintly greenish. That's not its only oddity, though. The amount of faux greenishness increases with decreasing settings of the color control (as I also reported). And reds tend to look orangish — not a true, satisfying red.

An article by HDTV expert Peter H. Putman, "The Plasma Doctor Is in the House," may explain this last anomaly. Putman says plasma panels, otherwise known in the trade simply as "glass," emit light when their constituent color phosphors are "tickled" by bursts of ultraviolet energy. The UV bursts come with an extra dollop of blue, UV's next-door neighbor in the color spectrum. To counteract the skew toward blue, plasma panels incorporate "capsulated color filters" designed to restore a semblance of CRT-like hue.

"Other schemes have been tried to produce CRT-like phosphor response," Putman writes, "but the effects of UV color shift are still apparent with reds (they appear orange), greens (more of a lime green than a kelly or hunter green), and yellows (frequently shifting to a lemon, rather than an amber color)."

I'm not clear on whether the orangish reds come directly from the UV shift or from the wee color filters inserted in the light path to compensate for that shift. Whichever it is, it's clear that color on plasma TVs is a complex beast indeed. We're lucky it looks as much like CRT color as it does. (A CRT, for those not in the know, is a "cathode ray tube" — an old-fashioned picture tube, in other words.)


Putman also says that plasmas use "pulse-width modulation" to control how much light each phosphor emits: "a technique in which rapid on-off cycles can determine levels of luminance. The ratio of on cycles to off cycles within a given time interval translates into a specific luminance level."

(Here, "a given time interval" means a very tiny fraction of a second. Your eye can't actually see the on and off cycles, rest assured.)

Hence, says Putman, "on some panels, you may observe a color shift as brightness levels increase. The PWM method of simulating analog response works pretty well — my new GE electric range uses it to provide more control over the heating elements — but even PWM has its limits."

Translation: plasma display panels exhibit "non-linear response to changes in luminance levels. While a CRT is a purely linear display (small changes in driving voltage result in equivalent changes in anode current and brightness), a PDP is not." That's why it's apparently necessary for the user to lower the PDP's contrast setting below what the eye might ordinarily prefer. Otherwise, "you may observe a color shift as brightness levels increase."

I didn't mention it in my earlier article, but on my Hitachi plasma, the greenish tinge I see on B&W material seems to be more noticeable in bright scenes than in relatively dark ones.


None of this explains why my Hitachi's intrinsic grayscale calibration seems to vary with different settings of its color control. It does imply, though, that with all the tweaking done in designing a plasma TV to get its hues even close to CRT-like, it's no surprise there would be unsuspected interactions among user settings like contrast and color.

Another thing the Putman article reveals is that the process of calibrating a plasma's grayscale properly requires (a) a lot of hard-won expertise, compared with standard CRT calibrations, and (b) "a color analyzer with look-up tables for the specific phosphors used in each panel."

The specific-look-up-tables part seems to mean you can't use the generic look-up tables that normally come with a color analyzer, which is an instrument that objectively measures colored light sources.

"Chances are," writes Putman, "your panel came from one of these places: NEC, the Fujitsu-Hitachi plasma factory, Pioneer or Panasonic. (Although there aren't a lot of them out there yet, you will soon see panels coming from LG/Zenith and Samsung, and these will require their own phosphor look-up tables.) My FSR color analyzer is loaded with specific phosphor tables for each of the models listed (even the different phosphors in the Pioneer PDP-502 and PDP-503), thanks to Cliff Plavin of Progressive Labs, who took the individual measurements."


The expertise part comes in especially handy in performing the initial setup for the calibration process proper, in which you have to adjust the set's brightness and contrast controls to bypass the nonlinearities in light output discussed earlier.

Both parts seem to suggest that, for us plasma TV owners, the idea of having our sets "professionally calibrated" — at no insignificant cost to us, I might add — may be fraught with danger. What if we happen to get a cocksure calibrator who's blissfully unaware of the pitfalls Putman has laid out? Or what if the calibrator's ideas of proper brightness and contrast settings disagree with our own, such that when we readjust those controls after he leaves, his carefully metered grayscale goes totally kerflooey?

At any rate, I'm not sure I'd even want a calibrator who doesn't have Cliff Plavin's home number in the directory on his cell phone.

Sunday, May 28, 2006

1080p from High-Definition DVDs?

Now debuting: two, count 'em, two, mutually incompatible formats for high-definition DVDs: HD DVD and Blu-ray. These are the first two commercially available video-disc standards whose players (new machines, not the old-style DVD players we have now) can feed a suitable HDTV all the resolution the set can reproduce.

Unless, that is, the HDTV uses the gold standard of high-definition television display: 1080p resolution. The initially available HD DVD players don't output 1080p, and it's not absolutely clear which, if any, of the Blu-ray players we're eagerly awaiting over the next few months will deliver it.

In the last year or so, TVs whose "native" resolution is 1080p have sprung up, big as life, and taken a noticeable slice of the market. Their screens offer fully 1,080 rows of 1,920 pixels each, yielding the best spatial resolution in consumer video history: roughly two million pixels on the screen.

1080i, the standard in use by many over-the-air HDTV broadcasters, has the same number of pixels as 1080p, but only half of them actually change with each screen update: first the odd-numbered rows, and then, a fraction of a second later, the even-numbered rows. This alternation of scan lines is "interlaced scanning," the origin of the "i" in 1080i.

In 1080p — "p" for "progressive scanning" — the entire pixel array gets updated, every time. Motion is smoother. Jaggies at the edges of moving objects, "twittering" scan lines, and now-you-see-it-now-you-don't tiny details — symptoms experts call "interlace artifacts" — are pretty much history.
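
For the code-minded, here's a toy sketch of the difference between the two scanning schemes, scaled way down from 1,080 rows to six so the pattern is easy to see:

```python
# A toy model of interlaced vs. progressive scanning, shrunk to a
# six-row "screen." Row numbering is 1-based, as scan lines usually are.

ROWS = 6

def interlaced_updates(frame_pairs):
    """1080i-style: odd-numbered rows on one update, even-numbered on the next."""
    for _ in range(frame_pairs):
        yield [r for r in range(1, ROWS + 1) if r % 2 == 1]  # odd field
        yield [r for r in range(1, ROWS + 1) if r % 2 == 0]  # even field

def progressive_updates(frames):
    """1080p-style: every row refreshed on every update."""
    for _ in range(frames):
        yield list(range(1, ROWS + 1))

print("interlaced: ", list(interlaced_updates(1)))   # [[1, 3, 5], [2, 4, 6]]
print("progressive:", list(progressive_updates(2)))  # two full [1, 2, 3, 4, 5, 6] frames
```

Same pixel count either way; the difference is how many of them are fresh on any given update.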


So, do HD DVD and Blu-ray support 1080p?

The answer is, alas, complicated. In order to get 1080p to your eyeballs, you need at least four things (a toy sketch of this weakest-link chain follows the list):

  • a disc encoded at 1080p
  • a DVD player that can output 1080p to the TV
  • a TV that can receive 1080p from the player
  • 1080p native resolution at the TV screen
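
Here's that sketch, my own illustration and not anything from either disc spec. The chain delivers 1080p only if every stage supports it:

```python
# A toy "weakest link" check: you see 1080p only if the disc, the
# player's output, the TV's input, and the TV's native screen all
# support it. (My own illustration, not anything from either spec.)

def delivered_resolution(disc, player_output, tv_input, tv_native):
    """Each argument names the best format that stage can handle."""
    chain = [disc, player_output, tv_input, tv_native]
    return "1080p" if all(stage == "1080p" for stage in chain) else "less than 1080p"

# A typical early-2006 setup: 1080p disc, 1080i-max player output,
# 1080i-max TV input, 1080p-native screen.
print(delivered_resolution("1080p", "1080i", "1080i", "1080p"))  # -> less than 1080p
```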

Reportedly, most HD DVD and Blu-ray discs encode their main content, usually a movie, at 1080p, so no worry there.

Any TV that is advertised as 1080p-capable must have that as its native screen resolution, so this is not a huge problem (as long as you happen to own such a TV).

But most of the initial crop of "1080p" TVs cannot actually receive 1080p signals; 1080i is their maximum input capability. (That situation may have changed with the most recently introduced 1080p TVs, however.)

What's more, few if any of the initial HD DVD players actually output 1080p. (That may not be as true for Blu-ray; see below.) They convert the 1080p on the disc to 1080i and output that. The 1080p-native TV receives the signal as 1080i and deinterlaces it for display on the screen. The interlace-deinterlace sequence can introduce pesky artifacts.


Foolish, right, for HD DVD players not to support full-fledged 1080p output from the get-go? Well, part of the reason for the foolishness is that 1080p must travel between the player and the TV, if at all, along an HDMI cable. HDMI is a standard for the transmission of video data, audio, and other digital goodies between source devices such as DVD players and TVs. It uses HDCP copy protection to keep anyone from intercepting the digital stream and diverting it to their own (illegal) advantage — so it's very, very complicated.

But the ability to transmit and receive 1080p, originally strictly optional in HDMI/HDCP, is only now appearing in consumer electronics gear. The initial HD DVD players are meanwhile sticking with an older, non-1080p implementation of HDMI.

For instance, one of the very first HD DVD players is the Toshiba HD-A1. According to Amazon.com, it can output either 720p or 1080i at its HDMI connection. No 1080p.

As for Blu-ray, it looks as if at least some of the initial player models will output 1080p on HDMI. For example, the yet-to-arrive (as of early June, 2006) Sony BDP-S1 will reportedly do so.


Another complicating factor: there are different flavors of 1080p, distinguished by their frame rates. How many frames of video per second are going to be transmitted? There are at least three popular answers: 24, 30, and 60.

24 fps (frames per second) matches the rate at which movies are shot. Hence, most HD DVD and Blu-ray discs are encoded at 1080p/24.

30 fps is typical of television broadcasts, both standard definition and high. If they're 1080i hi-def, as opposed to 720p, they are more precisely 1080i/60, not 1080i/30, since there are two interlaced "fields" per frame, one for the odd-numbered scan lines and one for the even. With interlaced transmission the number after the '/' is the field rate, not the frame rate ... and by the way, often the '/' is omitted: you'll see "1080p24," "1080i60," etc., instead of designations with slashes.

So the frame rate of 1080i/60 is actually 30 fps. 720p high-definition television (1,280 pixels across the screen by 720 vertically) doubles that frame rate to 60 fps: 720p/60. In 720p there are fewer pixels per frame than in either 1080i or 1080p, but they're updated twice as often as in 1080i/60. 720p is excellent for fast-action sports.
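
If the notation makes your eyes cross, here's a little helper of my own devising that turns a designation into a true frame rate:

```python
# Parse designations like "1080i60" or "1080p/24" into a true
# frames-per-second figure. For interlaced formats the trailing number
# is the field rate, and two fields make one full frame.

import re

def true_frame_rate(designation):
    lines, scan, rate = re.fullmatch(r"(\d+)([ip])/?(\d+)", designation).groups()
    return int(rate) / 2 if scan == "i" else int(rate)

for fmt in ("1080i60", "1080p24", "720p60", "1080i/60"):
    print(f"{fmt}: {true_frame_rate(fmt):g} full frames per second")
# 1080i60: 30, 1080p24: 24, 720p60: 60, 1080i/60: 30
```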

1080p/60, with 60 full frames every second, is another flavor of 1080p. Some current and/or soon-to-come "1080p" TVs apparently will accept 1080p/60 input, which can happen if the DVD player converts 1080p/24 to 1080p/60.

(Today, it's not easy to find out such precise frame-rate details about TVs and DVD players. Here's hoping that changes soon.)


What if the TV can only display 1080p at 30 fps or 60 fps, not at 24 fps, and what's on the disc is 1080p/24? Then either the player or the TV (typically the player) must perform a conversion. Again, as with any type of scan conversion, there is a potential for visible artifacts to result.

In this situation, the main reason for artifacting is that 60 is not an even multiple of 24. Neither is 30, for that matter. If 1080p/24 on DVD is converted to 1080p/60, or 1080p/30, or even 1080i/60, some frames in the output video will necessarily be interdigitated hybrids of two source frames, which can lead to ragged edges on moving objects. Or else the 24 input frames per second will be parceled out to varying numbers of output frames (at 60 fps, some to three output frames and others to two; at 30 fps, some to two and others to one), making for jerky motion.
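
Here's a quick sketch of the arithmetic, my own and purely illustrative. For each output frame we figure out which source frame it must come from, then count how long each source frame stays on screen:

```python
# Count how many output frames each 24-fps source frame occupies at
# various output rates. Uneven hold times are what the eye sees as judder.

from collections import Counter

def cadence(source_fps, output_fps, frames=6):
    """Hold count for each of the first few source frames over one second."""
    holds = Counter(i * source_fps // output_fps for i in range(output_fps))
    return [holds[f] for f in range(frames)]

for out_fps in (60, 30, 48, 72):
    print(f"24 fps -> {out_fps} fps: source frames held for {cadence(24, out_fps)} output frames")
```

The 60- and 30-fps patterns come out lumpy ([3, 2, 3, 2, 3, 2] and [2, 1, 1, 1, 2, 1]); the 48- and 72-fps patterns are perfectly even ([2, 2, 2, 2, 2, 2] and [3, 3, 3, 3, 3, 3]), which is the whole case for the exact-multiple refresh rates discussed next.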

To avoid such motion artifacts, the TV ought to be able to operate at a 1080p frame rate that is an exact multiple of what's on the disc: say, 72 fps. 48 fps would work, too. And incidentally, the rate at which the TV "paints" frames on the screen is its "refresh rate," and is stated in Hertz or cycles per second. 48 frames per second is 48 Hz. 72 fps is 72 Hz.

A 24-Hz refresh rate with a one-to-one correspondence of output frames to incoming 1080p/24 frames would not work well, unfortunately. It would produce annoying flicker on any bright video display screen. (The reasoning is similar to why motion pictures are projected with each frame illuminated twice in 1/24 of a second.)

Ideally, the player would supply the TV with 1080p/24, and the TV could convert it to, say, 1080p/72 (or 1080p/48) to avoid the flicker common when bright video displays use a 24-fps refresh rate. The conversion from 24 to 72 frames per second is straightforward and does not produce visible artifacts.


Just learning such details about 1080p-native HDTVs and the initial crop of HD DVD and Blu-ray players ain't easy. And let's face it, the ideal hookup as I envision it will look only a smidgen better than converting 1080p/24 on the disc to 1080i/60 for HDMI transmission to the TV (assuming the TV could handle it) and then to, say, 1080p/30 for display on the TV screen at its native resolution. You'd have to be some kind of purist to even care, right?

Well ... as the incipient high-definition DVD format war heats up, we shall see whether people really do care, shall we not?

After all, the early adopters who spend the big bucks now, at the dawn of hi-def DVD, may find later that waiting a few months, while all the pieces fall into place for an end-to-end 1080p DVD experience without unnecessary artifacts, would have given them a better picture for little if any extra cost.

They're sure to be miffed, right?

Don't say I didn't warn them.