Sunday, December 31, 2006

More on Contrast

The topic before the house is HDTV contrast — see Contrast Ratios, Pt. I and Contrast Ratios, Pt. II. In the first of those posts I discussed sequential contrast ratio, the ratio of a TV's luminance when reproducing a full-screen peak white signal to that for a full-screen reference black signal. When peak white appears alongside reference black in a single checkerboard image, metering the two gives the more realistic simultaneous contrast ratio, also called ANSI contrast.

ANSI or simultaneous contrast is always much lower than sequential contrast. For instance, the Sony Bravia KDL-40XBR2 HDTV that I recently purchased has (Sony says here) a sequential contrast ratio of 7000:1. Yet the ANSI checkerboard method measures its simultaneous contrast ratio at just 1300:1.

The second post cited Charles Poynton's book Digital Video and HDTV Algorithms and Interfaces to show that a simultaneous contrast ratio as low as 100:1 will produce a fine picture. The eye at any particular moment, adapted as it is to ambient lighting conditions, responds to at most a 1000:1 luminance range. Yet the compression of luminance highlights in a video signal (an aspect of what Poynton calls "gamma correction," "nonlinear image coding," or "tone scale alteration") makes 100:1 fine for TV viewing.

Those two contrast ratios, sequential and simultaneous, are objectively measurable. But some aspects of TV contrast are not as precisely quantifiable; they're subjective, although the science of human vision does give us some handle on them. Some of these aspects have to do with the effects of ambient lighting conditions on how we perceive a TV picture.


Just as the eye can adapt to different lighting conditions, HDTVs must cope with them: watched in pitch-black conditions at night, then again with daylight streaming in the windows the next morning. That their full on/full off contrast ratios vastly outstrip the eye's own 1000:1 contrast latitude helps make these TVs adaptable to such widely different conditions.

When you turn the lights out and watch TV in the dark, you can obtain a 100:1 contrast ratio by simply:

  • turning down the TV's brightness control such that image information just above reference black remains barely visible, and
  • turning down the TV's contrast or picture control until peak white is displayed at a luminance that happens to be 100 times brighter than the reference black level

Later, when there's a lot of ambient light in the TV room, the same 100:1 simultaneous contrast ratio can be obtained by boosting both the TV's brightness control and contrast/picture control in tandem.


The subjective contrast of the result, though, will not necessarily be that of the picture under pitch-black viewing conditions!

One reason for the difference is the so-called surround effect (p. 82). The human eye's sensitivity to small brightness variations increases, says Poynton, "when the area of interest is surrounded by bright elements." Conversely, when viewed in a so-called "dark surround," a scene's apparent contrast subjectively flattens out — as does the apparent vividness of the colors.

Moreover, luminance levels produced by a typical TV screen are much, much lower than those in real-life scenes. That, too, causes contrast and color to seem lower than in the original scene ... unless the TV signal is pre-compensated, in the camera, for the effects of low display luminance.

That's one of several reasons why TV signals are subjected to "nonlinear image coding." During so-called "gamma correction," by introducing what Poynton calls "tone scale alteration" into the camera's output signal, nonlinear image coding compensates for, among other things, the effects of dim or dark viewing conditions and of low display luminances on the human visual system.


The eye's capacity for adapting to the level of light surrounding an object compensates for the tendency of a "bright surround" to spill light into the interior details of a scene and wash them out, in both contrast and color. When our caveman ancestors were spotting game in a patch of shade surrounded by noonday glare, that surround-effect adaptation was a big help.

These days, when a TV is being watched with the lights off, its screen has in effect a "dark surround." As a result, apparent contrast in the image is lower than actual, measurable contrast. Likewise, the apparent strength of colors is reduced below their objective level in the original scene.

If the same TV is later watched in a brightly lit room, raising its brightness and contrast/picture controls can restore the visibility of just-above-black scene elements while holding the image's actual contrast ratio at 100:1.

Yet the presence of a bright surround now makes apparent contrast higher than before — and colors seem more vivid, too.


This is why Poynton puts such great emphasis on "rendering intent." Among other things, the rendering intent at the time the image is encoded takes into account how dark or bright the ambient lighting is expected to be in the room for the display device.

If images are to be viewed on a computer display in an office with bright fluorescent lights, they will be encoded for only moderate contrast depth, since the eye will furnish its own apparent contrast due to the surround effect.

Images to be displayed in dark, movie-theater-like surroundings need correspondingly more contrast depth in their nonlinear video encodings.

Images intended for dim-but-not-dark surrounds need to be encoded with intermediate contrast depth.

Here, contrast depth refers not to the ratio of peak white to reference black at the extremes of the TV's tonal scale, but to how luminance levels between reference black and peak white are presented, relative to one another.

In the two pictures below, contrast depth is higher in the one on the left than in the one on the right. This is so even though the darkest parts of the two images are equally dark and the lightest portions are equally light.

Contrast depth higher
Contrast depth lower


Contrast depth depends mainly on the "end-to-end power function" used to accomplish so-called "tone-scale alteration" in the encoding of a video signal or computer image, and the restoration of the original tone scale in the image's eventual display (see pp. 83-86).

The luminances between reference black and peak white in the signal must be altered as the image is encoded as, say, an HDTV signal or a JPEG file. Then they must be altered again as the TV or computer decodes the signal.

The latter duty is accomplished by the TV's or computer's "gamma" function. The former happens in the video camera, scanner, etc. When the two tone-scale alterations are combined, the resultant end-to-end power function makes the picture look right under assumed lighting conditions.
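That two-stage pipeline can be sketched in a few lines. This is a toy model only, assuming a pure power law at each end (real encodings such as Rec. 709 add a linear segment near black); the point is that the camera's encoding exponent and the display's gamma combine into a single end-to-end power.

```python
# Toy model of the camera-to-display tone-scale pipeline.
# Pure power laws only; real standards add a linear segment near black.

def camera_encode(luminance, encoding_exponent=1/2.2):
    """Gamma correction in the camera: boost dark tones, compress highlights."""
    return luminance ** encoding_exponent

def display_decode(signal, display_gamma=2.5):
    """The display's gamma: compress dark tones, expand highlights."""
    return signal ** display_gamma

# Normalized scene luminance (0.0 = reference black, 1.0 = diffuse white)
scene = 0.25
displayed = display_decode(camera_encode(scene))

# The net effect is a single power function with exponent 2.5/2.2
end_to_end = scene ** (2.5 / 2.2)
assert abs(displayed - end_to_end) < 1e-9
```

Because the exponents simply multiply, everything that follows about "end-to-end power" reduces to arithmetic on those two numbers.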


Specifically, a video camera subjects the various luminance values in the incoming image to a "power function" whose exponent boosts lower luminances (corresponding to darker scene elements) at the expense of higher, brighter luminances. This is gamma correction. Looked at another way, higher, brighter luminances are heavily "scrunched" or compressed, while darker portions of the scene are not as compressed.

The TV's own internal "power function" has an exponent as well: gamma. Gamma causes the lower-level luminances to be compressed more than the higher-level ones — the opposite of what happens in the camera.

If the camera's gamma-correction exponent is the exact inverse of the TV's gamma exponent, the end-to-end exponent (the product of the two) is 1.0. That would provide too little apparent contrast under most conditions. The denominator of the camera's gamma-correction exponent must accordingly be decreased to provide more contrast depth on the TV screen and more apparent contrast at the eye.

For example, the standard value of a TV's gamma exponent is often taken to be 2.5. If the exponent in the camera is 1/2.5, the end-to-end exponent is exactly 1.0. That's too low. If the exponent in the camera is adjusted to, say, 1/2.2, the end-to-end exponent is now 2.5/2.2, or about 1.14. That gives more contrast depth and more apparent contrast under typical lighting conditions.

Precisely how much the denominator of the camera's encoding exponent (a.k.a. "encoding gamma") ought to be decreased depends on the intended lighting conditions under which the image is to be viewed. For pitch-black surrounds, for example, overcoming the apparent contrast deficit owing to the surround effect necessitates an end-to-end power of fully 1.5. For bright surrounds, Poynton says, 1.125 works best, and for dim-but-not-dark rooms, 1.25 (see table on p. 85).
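Given a display gamma of 2.5, those target end-to-end powers translate directly into camera encoding exponents. A quick sketch of the arithmetic (the surround labels and end-to-end targets are Poynton's; the derived denominators are just division):

```python
DISPLAY_GAMMA = 2.5

# Target end-to-end powers per surround condition (Poynton, table p. 85)
targets = {"dark surround": 1.5, "dim surround": 1.25, "bright surround": 1.125}

for surround, end_to_end in targets.items():
    # end_to_end = DISPLAY_GAMMA * encoding_exponent, so:
    encoding_exponent = end_to_end / DISPLAY_GAMMA
    denominator = 1 / encoding_exponent
    print(f"{surround}: encode with exponent 1/{denominator:.2f}")
# dark surround: encode with exponent 1/1.67
# dim surround: encode with exponent 1/2.00
# bright surround: encode with exponent 1/2.22
```

Note how the darker the intended surround, the smaller the denominator, i.e. the less highlight compression the camera applies.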


The intended lighting conditions for TV watching are usually considered to be a room with dim, but not pitch-black, ambient lighting. If the lights are turned all the way off, as they are in many home theaters today, then the usual end-to-end power function of 1.25 is typically too low to produce enough contrast depth on the display and enough apparent contrast at the eye.

Luckily (see pp. 84-85) the fact that the TV's brightness control is (presumably) turned down below its usual daytime level when the room lights are turned off helps compensate for this problem. When a TV's brightness control is lowered, the TV's gamma — its tendency to boost contrast depth — effectively goes up. That in turn raises the effective end-to-end power function of the signal path from camera to screen, producing more apparent contrast at the eyeball.


All that tweaking of the TV's brightness control is fine if the "night" setting of that control — the one you use when all room lighting is off — produces an ideal picture under such conditions. But often it doesn't.

Lowering the brightness control can increase the TV's effective gamma exponent. That's good; it boosts contrast depth by raising the exponent of the end-to-end power function, thereby enhancing the eye's perception of apparent contrast.

But the contrast-depth boost can come at the cost of hiding, or "swallowing," shadow detail. Elements of the image whose luminance is just above reference black can, in effect, disappear when the brightness control is turned too far down.

This is one reason why the latest HDTVs now feature a gamma control. It allows you to manipulate the TV's effective gamma exponent at will, to boost or reduce contrast depth without affecting black level in the way that the brightness control does.

The gamma control, unlike either the brightness control or the contrast/picture control, has no effect on the TV's simultaneous contrast ratio. You can tweak brightness and contrast/picture to (1) set black level and (2) adjust simultaneous contrast, the ratio of peak white luminance to black-level luminance. Then you can dial in the appropriate contrast depth with the gamma control. The result can honor the rendering intent of the program's author — even if you aren't viewing the program under intended lighting conditions.


The availability of a gamma control or other controls that affect contrast depth is a currently underappreciated feature of today's digital HDTVs. TV owners have more familiarity with the brightness control, which sets black level, and with the contrast/picture control, which sets the level of peak white. The latter control also proportionately raises or lowers the output level of all luminances below peak white and above reference black — but it does this without altering gamma/contrast depth.

These other, newfangled controls can tailor contrast depth per se, without altering the luminance levels of reference black or peak white. They, along with the familiar brightness and contrast/picture controls, give us greater flexibility to adapt a digital HDTV's measurable sequential and simultaneous contrast ratios and to provide optimal contrast depth under actual viewing conditions.

Saturday, December 30, 2006

A New Bedroom HDTV for Me, Part VIII

I reported in A New Bedroom HDTV for Me, Part VII and previous posts that my new Sony KDL-40XBR2 1080p LCD TV has a problem with faint vertical bands that show up in some program content and are easiest to spot in gray test patterns. I have now discovered the same sort of thing complained of by owners of recent Sharp AQUOS 1080p LCDs.

In the Official Sharp Aquos 42D62U /46D62U /52D62U Owner's Thread at the AVS Forum I found post #192 to have this photo as its third attachment. It is of a Sharp AQUOS LC-52D62U 52" 1080p LCD panel.

In my judgment, the kind of vertical banding here is the same as my Sony exhibits. There may be less of it on view, but that may have to do with TV and camera settings, not with any intrinsic difference in the problem itself.

The first post in the AVS Forum thread contains a FAQ about the banding issue, adorablerocket's UNOFFICIAL SHARP D62U "BANDING" FAQ (ADDED 11/30/06). It says in part:

1. What is “banding”?

Banding is [a] name for a defect that occurs on *some* of Sharp’s d62u LCD TVs.

In the AVS forum this defect has been reported on 42”, 46” and 52” size TVs.

Banding is best described as stripes or “bands” of reduced brightness a few inches wide running across the display. These bands can run horizontally or vertically.

The bands range from very severe in some sets to quite subtle and may not be visible under most viewing circumstances. In general bands will be most notable when viewing a solid grey image.

Obviously a flat grey box is not something most people watch on their TVs. However the more an image resembles a flat grey field, the more it will display this defect if present. Slowly moving images of low contrast and saturation, such as may be found when playing some video games (for example turning around slowly in a room made of concrete) or watching some movies (for example in Master and Commander when the camera is focused on the subtle outline of a ship appearing through grey fog) will display the bands. Depending on the severity of the banding present on any given TV, more common images (for example a blank wall behind an actor) may also display the banding.

Generally if the banding is visible during normal use, viewers have found the banding to be an unacceptable defect of the TV. On the other hand viewers with sets that do display banding under the ‘grey screen test’ described above, or on occasional scenes, have said that the banding is livable, though many expect a repair or replacement from Sharp. Of course many other viewers have reported no banding or defects at all on their TVs.

You can search the thread for photos posted by people for whom the banding is clearly visible on a test screen ...

...

12. What causes banding?

Nobody knows.

There has been speculation that the banding is part of the backlight process. There has also been speculation that it might be a defect in the LCD because it’s similar to a defect found in some LCD projectors.

If I had to [guess], I’d vote for micro misalignment on the horizontal and vertical backlight diffusers ...


There seem to be two different banding issues with the Sharps, one vertical like mine and the other horizontal. "To visualize [the horizontal bands]," one poster writes, "if you drew two horizontal lines across the screen to split it into almost-equal thirds (the middle third being just a touch larger), the bands are right on those lines." I have the feeling (as do many other posters) that some of the horizontal bands are caused by a form of interference, perhaps power-line interference. But, maybe not (see photo below).

There also seem to be posters who simply cannot see the faint vertical banding in the "problem" pictures posted in the thread by other participants. I am put in mind of the Sony-authorized service technician who swapped in a replacement LCD panel for my set. He couldn't see the bands either. Yet my HDTV installer who came back to do extra work for me said he could see them. This seems to be an issue for some eyeballs and not for others.

At right is another photo which shows vertical banding, from post #687. To my eyes, the lack of luminance uniformity is obvious. I also see what I would call horizontal banding as well.

Skipping ahead, in post #5699 an AVS Special Member named "mark_1080p" reproduces my own photo shown above of vertical banding revealed in window glare. (It's nice to be famous.) The poster says, "This would be simple to test for the Sharps, if you see banding in the glare or shine a very bright flashlight on it and see banding, then we know it is not the backlight but rather the coating. If you do not see it, I guess the cause is indeterminate." I'm not sure what he means by "the coating," however. I would imagine it to mean something on the very surface of the front of the LCD panel. To my eye, however, there seems to be a striated irregularity in the front surface that results from a physical unevenness — almost a pleated, corrugated effect — further down in the LCD "sandwich." That is, the problem doesn't seem to be confined to just the outer surface at all.

I get the feeling that the banding issue mushroomed in the Sharp thread over time. The thread was supposed to be about everything on these Sharp TVs, but a quick survey shows that something like one post in ten mentions banding, with the ratio getting higher the further along you go in the 190+ page thread. It's as if people became increasingly sensitized to this very faint source of imperfection by reading about it and seeing pictures of it.

In fact, I'd have to say the "banding issue" has become for Sharp AQUOS 1080p's what the "cloudy backlight issue" has for Sony BRAVIA XBRs like mine: something of a widespread scandal. It's odd that I have a Sony with a "Sharp issue," while I've yet to find a Sharp owner complaining about "clouds" in their backlight.

Friday, December 29, 2006

Contrast Ratios, Pt. II

Contrast is important to getting the best possible picture on an HDTV. In Contrast Ratios, Pt. I I introduced the concept of an HDTV's contrast ratio: for any given HDTV, the ratio between the measured luminances of the brightest and darkest images or portions thereof that the TV can display.

When the luminance produced by the TV for a peak white image occupying the entire screen is compared with that for a reference black full-screen image, the TV's full on/full off contrast ratio can be derived — also called its sequential or dynamic contrast ratio.

The sequential contrast ratio is usually much higher than the TV's simultaneous or static contrast ratio. That lower but more realistic figure is obtained using the ANSI standard of measurement. The light coming from black and white rectangles, displayed on the screen all at once in a 4x4 checkerboard pattern, is metered and compared.

The Sony Bravia KDL-40XBR2 HDTV that I recently purchased has, according to Sony, a sequential contrast ratio of 7000:1. Yet the ANSI checkerboard method measures the simultaneous contrast ratio at just 1300:1. Which brings up the question ...


How much contrast does the eye really need? It can utilize at most a 1000:1 contrast ratio, according to video expert Charles Poynton in Digital Video and HDTV Algorithms and Interfaces.

Because the luminances associated with the brightest glints and highlights in a real-life scene are generally compressed as a TV image is being created, Poynton says the TV itself need only have a 100:1 contrast ratio, not 1000:1.

Other authorities, be it noted, say TV sets profit by having higher contrast ratios than 100:1. Some even say ratios of 1200:1 and up are not too much.


Per Poynton, the eye "can discern different luminances across about a 1000:1 range" (p. 197). That is, in any particular state of adaptation to ambient light, the eye responds to no more than a 1000:1 ratio between the brightest and dimmest luminances present in any real-life scene.

The highest contrast ratio actually usable by the eye at any given moment is thus 1000:1. Even so, the ratio between diffuse white and reference black in a TV signal need be no higher than 100:1.

That, says Poynton, is because highlights are artificially compressed in TV signals in order to "make effective use of luminance ranges [contrast ratios] that are typically available in image display systems" (p. 83).


Peak white luminance, as measured with a full-field white test signal, thus corresponds to diffuse white in an actual TV image: the white of a brightly lit piece of paper, say. It does not represent the luminance of the brightest highlights in the original scene.

In a real scene, something like strong sunlight gleaming off the bumper of a car can produce luminance ten times brighter than the same sunlight reflected off a piece of paper. Since highlights like bumper gleams are not encoded in a TV image at a mathematically correct 10:1 ratio to diffuse white, a 1000:1 contrast ratio is not needed in a TV display. A 100:1 contrast ratio is fully sufficient for a TV to reproduce the luminance range used when any given image or scene is encoded into a video signal.
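One way to picture that highlight compression is as a "knee" that squeezes everything above diffuse white into a small bit of headroom. The sketch below is a toy model (the knee slope and headroom figures are illustrative, not from any broadcast standard); it shows a 10x specular gleam landing only slightly above diffuse white in the signal:

```python
def compress_highlights(scene_luminance, diffuse_white=1.0, knee_slope=0.01):
    """Toy highlight knee: linear up to diffuse white, heavily compressed above.

    scene_luminance is relative to diffuse white = 1.0. The slope here is
    illustrative only, not taken from any broadcast standard.
    """
    if scene_luminance <= diffuse_white:
        return scene_luminance
    return diffuse_white + (scene_luminance - diffuse_white) * knee_slope

paper = compress_highlights(1.0)    # diffuse white stays put: 1.0
bumper = compress_highlights(10.0)  # a 10x specular gleam lands at 1.09

# The 10:1 scene ratio survives as only a ~1.09:1 ratio in the signal,
# which is why the display never needs a 1000:1 range to reproduce it.
```

That is the sense in which a 100:1 display can do justice to a scene whose raw luminance range was ten times wider.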


That contrast ratio in a TV need be no higher than 100:1 is indeed fortunate, because it's hard to get any TV to render truly inky blacks. Says Poynton (p. 197), "In practical imaging systems many factors conspire to increase the luminance of black, thereby lessening the contrast ratio and impairing picture quality. On an electronic display or in a projected image, simultaneous contrast ratio is typically less than 100:1 owing to spill light (stray light) in the ambient environment or flare in the display system."

That is, even with total darkness in the viewing room, the TV itself will produce spill light or stray light as its luminance output reflects back onto its screen. That spill or stray light — plus optical flare being bounced around within the innards of the display — will inexorably lighten the TV's blacks and lower its measured simultaneous contrast.

Surprisingly, Poynton says (see table 19.1, p. 198) that a movie theater, supposedly the gold standard of image display, will typically furnish a simultaneous contrast ratio of just 80:1! Sequential contrast ratios can reach fully 10000:1 when different film images/scenes are compared. Yet when projected in a theater, a single frame on film will typically exhibit a much lower simultaneous contrast ratio.

Meanwhile, Poynton says, a typical TV in a typical living room will sometimes provide a simultaneous contrast ratio of just 20:1!


Just 20:1? What happened to the already-low 100:1 baseline? Basically, it got swallowed up in the ambient light of the typical living room. According to this online article:

To better understand the impact of the presence of light in a room on the contrast ratio performance of an imaging device, it is sufficient to realize that with the light emitted by just one candle in a room [1 lux] there would not be any difference between a 500:1 and a 5000[:1] or even a 10,000:1 contrast ratio!

Increase the level of light in the room to just 30 lux — that's equivalent to a dimly lit room — and contrast ratio figures above 50:1 would turn out to be simply academic even in the case of video projectors with relatively high brightness rating (2000/2500 lumens and above).
(Lux and lumen, by the way, are measures of light. They are related to the candela.)


The point here is that even dim lighting in a TV viewing room cuts significantly into Poynton's 100:1 contrast-ratio norm.

Why? Mainly because the eye adapts to the room's lighting rather than to the TV screen's much lower luminance. It accordingly can't see details in the darker portions of images on screen unless the TV's brightness control is boosted.

Boosting the TV's brightness control to offset ambient room lighting raises the black level of the TV, while it does nothing to change peak white level. Since contrast ratio is the ratio between peak white luminance and reference black luminance, the effective contrast ratio is reduced well below the maximum ratio that the TV could otherwise produce.

Whether the effective simultaneous contrast ratio is 20:1 or 50:1, it is way lower than the best-case sequential contrast ratio the TV could produce, if properly adjusted and viewed in a completely darkened room.
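The way ambient light erodes contrast comes down to one line of arithmetic: stray light reflected off the screen adds to both the white and the black, and the black suffers far more. (The luminance figures below are illustrative, not measurements of any particular set.)

```python
def effective_contrast(white, black, stray):
    """Contrast ratio once stray/ambient light (same units) lands on the screen."""
    return (white + stray) / (black + stray)

white, black = 100.0, 0.05  # cd/m2, illustrative values

print(effective_contrast(white, black, 0.0))  # 2000:1 in a pitch-black room
print(effective_contrast(white, black, 2.0))  # drops to roughly 50:1 with
                                              # a little room light reflected
                                              # off the screen
```

A tiny addition to the denominator devastates the ratio, which is why the dark-room and bright-room numbers for the same set differ so wildly.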

That's too bad, because according to Poynton (p. 197), "Contrast ratio is a major determinant of subjective image quality, so much so that an image reproduced with a high simultaneous contrast ratio may be judged sharper than another image that has higher measured spatial frequency content." The "measured spatial frequency content" amounts to its "real" sharpness.

Spatial frequency is what you are paying for if you buy a 1080p HDTV rather than a 720p. It is, however, not the only thing which determines "subjective image quality" as it relates to perceived sharpness. You also need sufficient simultaneous contrast if you want the image to impress the eye as being ultra-sharp.

Contrast Ratios, Pt. I

If you follow HDTV reviews in magazines and online, you come across terms like black level and contrast ratio. They affect how dark the dark parts of the picture seem, and how bright the bright parts seem. But what do these terms really mean?

The eye-affecting power of light coming from a TV screen (technically, it's called luminance: the luminous intensity per unit area of light radiated in a particular direction) can be measured in either foot-Lamberts (ft-L) or candelas per square meter (cd/m2), also called nits. One ft-L equals 3.4262591 cd/m2.

Ideally, when a TV displays a full-field black input signal, it ought to have zero luminance. No TV can do that, though, not even a studio-monitor CRT. When its brightness control is properly adjusted, a TV will exhibit the lowest black level it can, consistent with retaining the visibility of near-black shadow detail. That minimum luminance produced by a properly adjusted TV when it receives a reference black input signal and displays it in a darkened room is its measured black level.


Once you know a TV's black level measured in ft-L or cd/m2, you theoretically know how inky its blacks can get. But you also need to know how bright its whites can be. Peak white luminance is measured in the same way as reference black luminance, except that a full-field white test signal is used instead.

You can then derive the TV's contrast ratio from those two numbers. It's the peak white luminance divided by the reference black luminance, expressed in the form of a ratio. For instance, if peak white luminance is 40 ft-L and the black level is 0.02 ft-L, the contrast ratio is 2000:1. A TV's contrast ratio is sometimes referred to as its dynamic range.
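The arithmetic is just a division, with one unit conversion worth knowing; note that the ratio itself is unit-free:

```python
FTL_TO_NITS = 3.4262591  # 1 foot-Lambert in candelas per square meter (nits)

peak_white_ftl = 40.0    # the example figures from the text
black_level_ftl = 0.02

contrast_ratio = peak_white_ftl / black_level_ftl  # 2000.0, i.e. 2000:1

# Converting both readings to nits leaves the ratio unchanged.
assert (peak_white_ftl * FTL_TO_NITS) / (black_level_ftl * FTL_TO_NITS) == contrast_ratio
```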

The full on/full off contrast ratio is a misleading figure, though, for at least two reasons. One, most HDTVs can't match such a best-case contrast ratio in any single program scene. Two, the human eye at any given instant can't handle anything like that nominal 2000:1 contrast ratio — much less the 10000:1-and-higher figures being claimed by some HDTV makers.


What's really important, scene by scene, is not the full on/full off contrast ratio that is obtained when peak white and reference black signals are input one at a time and then compared. It's the ratio between peak white and reference black as reproduced in a single scene that counts.

The standard way to measure this is to input a four-by-four checkerboard pattern with alternating white and black rectangles. The ratio between the luminances of the white and black rectangles gives you the so-called ANSI contrast ratio.
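A sketch of the ANSI-style calculation, assuming a light meter has already produced sixteen readings from the 4x4 checkerboard (the readings below are made-up numbers): average the eight white rectangles, average the eight black ones, and divide.

```python
def ansi_contrast(readings):
    """ANSI contrast from 16 checkerboard luminance readings, row-major order.

    In a 4x4 checkerboard, the rectangle at (row, col) is white when
    (row + col) is even (or odd; the convention just has to be consistent).
    """
    whites = [lum for i, lum in enumerate(readings) if (i // 4 + i % 4) % 2 == 0]
    blacks = [lum for i, lum in enumerate(readings) if (i // 4 + i % 4) % 2 == 1]
    return (sum(whites) / len(whites)) / (sum(blacks) / len(blacks))

# Illustrative meter readings in cd/m2: whites near 130, blacks near 0.1
readings = [130,  0.1,  128,  0.11,
            0.09, 131,  0.1,  129,
            132,  0.1,  130,  0.1,
            0.1,  129,  0.11, 131]

print(f"{ansi_contrast(readings):.0f}:1")  # roughly 1280:1
```

Averaging before dividing means one unusually leaky black rectangle drags the whole figure down, which is part of why ANSI numbers run so far below sequential ones.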

This ANSI measurement gives the "simultaneous" contrast ratio of the TV, whereas the full on/full off measurement tells you its "sequential" contrast ratio. The simultaneous contrast ratio is sometimes called the "static" contrast of the TV, and the sequential contrast ratio its "dynamic" contrast.

For various reasons, ANSI contrast ratios are typically much lower than full on/full off contrast ratios. A TV that has a nominal 2000:1 full on/full off contrast ratio may have only, say, a 144:1 ANSI ratio.

Or, to take a real-world example, the Sony Bravia KDL-40XBR2 HDTV that I recently purchased has, according to Sony, a best-case contrast ratio of 7000:1. That figure represents its dynamic or sequential contrast ratio. Yet, says Sony, "a more stringent method that measures the amount of black and white levels that can appear on the screen at the same time" — i.e., the ANSI static or simultaneous method — measures the contrast ratio at 1300:1.

PC Magazine's review of the Sony KDL-40XBR2 measured the latter figure at 1205:1. When the backlight of the TV was dropped from its maximum to the minimum level and the TV's peak white levels were "reduced for dark-room viewing," that figure became "a still impressive 550:1."

Why is the ANSI contrast so much lower than the full on/full off contrast? Unavoidably, light from bright areas of the screen will pollute dark areas. That can happen in any display technology, due to light "flare" within the image-forming device or reflections in the optics between it and the viewing screen — not to mention light bouncing back to the display screen from the room itself.


In the next part of this series on contrast ratios, I'll discuss why they're so important.

Saturday, December 23, 2006

A New Bedroom HDTV for Me, Part VII

I've been complaining (most recently in A New Bedroom HDTV for Me, Part VI) about vertical striations (as I now intend to call them) in the picture on my new Sony KDL-40XBR2 1080p LCD TV.


Today I happened to look at the TV while it was off and noticed that its screen was reflecting glare from one of the windows in the room. There was a striation pattern in the reflected glare! I was able to capture it on my digital camera.

The glare-striations are not in the glare light itself. They stay still on the screen when I move my head slightly so as to reposition the glare "hot spot" with respect to the borders of the screen.

Although I can't really match the two banding patterns striation for striation, I'm convinced these two phenomena are actually one! That is, whatever causes the window glare to be non-uniform in its reflection from the screen is also what causes light from the LCD panel, when it is on, to be non-uniform in its distribution.

Why do I think this? Both patterns of non-uniformity involve vertical bands (striations) of various arbitrary widths. The widths are all fairly broad, on the order of an inch or so. They don't seem to appear on either end of the screen, just in the middle 50% or so. There are no horizontal striations, just vertical ones. The striations are quite faint in both cases. Their "edges" are indistinct and gradual in both.

The glare-striations are clearly not produced by the TV's electronics, since the TV is off. They are not due to backlight non-uniformity or leakage through the LCD panel, since there is no backlighting.


What exactly does produce them is hard to say. My best guess is that it's something physical, or mechanical, or geometric — whatever word you want to use for something not perfectly smooth and flat. Possibly the front, outer surface of the LCD panel is rippled. I notice that when I apply light finger pressure (using a tissue), the front surface gives a bit.

Or possibly the ripples are in a deeper layer of the LCD "sandwich."

An LCD panel in an HDTV is made of several layers, the first one (counting from the rear forward) being the backlight. The backlight simply produces light, which then passes through a vertical polarizing filter (the next layer) that polarizes it — makes sure all the light waves are aligned in one direction, the vertical one.

Then comes the liquid crystal layer, sandwiched between two glass plates whose grooves cause the intervening sequence of liquid crystal molecules to be helically twisted in alignment. The light passing through the liquid crystal layer has its polarization twisted because the liquid crystal molecules are themselves thus arranged. So the light from the backlight will be able to pass right on through the next layer, a horizontal polarizing filter ... and the pixel will light up.

The color of the pixel — red, green, or blue — is produced by a tiny colored filter in the light path. The colored filters are yet another layer of the LCD sandwich.

The first glass plate in the sandwich houses an array of electrodes, one electrode for each pixel. Each electrode contains a transistor. If any given pixel's transistor has voltage applied to it, it will untwist the liquid crystal material in proportion to the applied voltage. The higher the voltage, the more the crystal untwists.

The result is that less of the light passing through the liquid crystal will be able to penetrate the horizontal polarizing filter. The pixel will darken.

If the maximum voltage is applied to the transistor for a given pixel, the liquid crystal will be fully untwisted, and the pixel will be black (or, at least, it will be as dark as it can be).
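That voltage-versus-light behavior can be sketched numerically. The following is a toy model of a normally-white twisted-nematic pixel: below some threshold voltage the full twist lets all the light through, above a saturation voltage the twist is gone and the pixel goes dark, with a smooth roll-off in between. The specific voltages and the cosine curve are illustrative assumptions, not real panel specs.

```python
import math

def tn_transmission(voltage, v_threshold=1.0, v_saturation=3.0):
    """Toy model of a normally-white twisted-nematic pixel.

    Below v_threshold the 90-degree twist is intact and light passes
    the front polarizer fully; above v_saturation the twist is fully
    undone and the pixel is as dark as it gets. In between,
    transmission falls off smoothly (a raised cosine here).
    All numbers are illustrative, not actual panel specifications.
    """
    if voltage <= v_threshold:
        return 1.0          # fully twisted -> pixel fully lit
    if voltage >= v_saturation:
        return 0.0          # fully untwisted -> pixel dark
    frac = (voltage - v_threshold) / (v_saturation - v_threshold)
    return 0.5 * (1.0 + math.cos(math.pi * frac))

for v in (0.5, 1.5, 2.0, 2.5, 3.5):
    print(f"{v:.1f} V -> {tn_transmission(v):.2f} of backlight passes")
```

The more voltage, the less light: exactly the "higher voltage, more untwisting, darker pixel" relationship described above.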

But if the liquid crystal "leaks," the pixel won't get that dark. It leaks when:
  • the applied voltage is too low
  • the appropriate maximum voltage doesn't fully untwist the liquid crystal
  • there are impurities in the liquid crystal
  • there are defects in the alignment of the various parts
  • etc.
If a lot of leaks are present in one area of the screen, clouds appear. There have been a lot of complaints about "cloudy" or uneven backlighting with this TV and its Sony siblings. See Official Sony 46" XBR LCD Uneven Backlight/Cloudy Thread for more on this. Luckily, I don't seem to have the cloudy backlighting problem. This is the problem I spoke of as "mura" in earlier posts.

The problem I have — vertical striations — could however be related to defects in the mechanical alignment of things on the various inner layers of the LCD sandwich, such that when light enters the panel from the front (as with the window glare) it is not reflected back evenly.


I have had not one but two LCD panels in my Sony: the original one and a replacement panel installed by an authorized technician after I complained of the striations. As far as I can tell, nothing whatsoever changed when the replacement panel was put in.

To my eye, that is, the exact same striations beset the second panel as beset the first. They appear to be in precisely the same locations on the screen. It is enough to make me wonder if they might be the result of physical irregularities in the front frame of the TV, against which the edges of the LCD panel are pressed when the necessary screws are tightened during assembly of the TV. If so, a replacement panel would be distorted in the same way as the original was distorted.

So that's where I stand on the striations issue at present. I am becoming convinced that there are mechanical distortions in the LCD panel which cause them to show up in pictures being produced on the screen and in glare reflected from the screen.

Thursday, December 21, 2006

A New Bedroom HDTV for Me, Part VI

In A New Bedroom HDTV for Me, Part II through A New Bedroom HDTV for Me, Part V I spent a lot of time kvetching about the problem that my new Sony KDL-40XBR2 1080p LCD TV seems to exhibit: vertical bands that are a little bit darker than their surroundings (and have a slight color tinge). These faint bands show up most prominently in gray full-field test patterns from the Avia calibration DVD, such as the one shown here.

I imagined that this was a case of "mura": "low-contrast imperfections that are larger than a single pixel, and are visible when the display is driven at a constant gray level ... caused by non-uniform distribution or impurities of liquid crystal or mechanical imperfections in the display assembly," according to this PDF. Now I'm not so sure it's mura after all.

Mura is supposed to be a problem with the liquid-crystal display panel, not the whole TV. You replace the panel, you cure the mura. Or you get different mura. You don't get the same exact mura in the second panel.

Yesterday a service technician replaced under warranty the LCD panel in my Sony. The faint vertical bands remained exactly as they were before.


The above photo exaggerates the problem somewhat, I admit. It's not really all that obvious in person. In fact, the service guy couldn't see it at all.

On actual programs it's even less visible. Last night I watched Memoirs of a Geisha on DVD and saw little or none of it. I was watching in the dark, so I turned the TV's Backlight control down to zero and the Picture (contrast) control up to its maximum of 100. I fiddled with the Black Corrector and Advanced C. E. (advanced contrast enhancer) controls in Picture Settings->Advanced Settings to get the deepest, darkest blacks I could. The picture I saw was phenomenal! And as I say, I could detect little, if any, of my "faux mura" problem.

So my problem might best be described as phantom faux mura.


Turning on Clear White in the Advanced Settings, I have found, is also a good idea. It seems to get around the need to fiddle with adjusting the TV's color temperature color-by-color using the White Balance settings. I had originally done the latter as a means of getting rid of non-neutral color tinges in black & white pictures, and it worked well. But it's a pain to do. I now find that Clear White does roughly the same thing, and equally successfully, with just a single button press.

I'm not sure how Clear White really operates or what it does, technically speaking. It doesn't really seem to change the color temperature, which determines how warm-to-cool or reddish-to-bluish the overall picture looks.

Color temperature, measured in kelvins, determines the "color of white" ... and of all grays from black to white. It also affects colors per se. The standard is 6504 K (the D65 white point), which gives a fairly warm picture, at every brightness level from white to black. Most TVs default to a higher, bluer color temperature, which facilitates a brighter image, since the light source of most TVs puts out more blue light than red.

But the Sony, like most modern TVs, gives you several choices of color temperature. I'm using Warm 1.

Turning on Clear White doesn't seem to alter its warmth. The manual says Clear White "emphasizes white and light colors." If by that is meant whites and pastels are boosted in brightness, no, that doesn't seem to be the case. Are they, perhaps, washed out? No, not that either ... not visibly, at least. My best guess is that maybe, just maybe, colors very near to a neutral white or gray (including very dark grays) are moved closer to white/gray, thereby removing faint tinctures of color ... like laundry detergent is supposed to remove grass stains.

Call it video bleach ... except that I cannot tell that it causes any "fading" of the overall picture. Not a bit of it. Colors stay rich and pleasing — in a color picture, of course. In a black & white picture, the picture looks just ... black & white. Not greenish. Not bluish. Not reddish. Just B&W.

For the life of me, I don't see how Clear White does it. It's as if the Sony sees there is no color input signal, just B&W ... which in itself is no big mystery. Any TV can do that. But it's also as if the Sony somehow knows it's about to add a false tinge of color to a B&W image, and it resists the temptation to do so. That's a mystery. How does it know that its internal processing would otherwise result in a smidgen of hue where no hue belongs? How does it know how to correct for it? And if it does know how to correct for it, why doesn't it correct for it all the time?


Now, back to my phantom faux mura. Its cause is also something that presently baffles me. The service tech suggested it might be the fault of the DVD player that sends the TV the gray-field pattern. But I can tweak the TV into producing the same phantom mura on a blank — hence black — video input, as well as on an inactive cable channel from my TiVo.

There's another mystery associated with that last situation. When I first installed the TV and TiVo, it seems to me that inactive channels put up a gray screen that showed the despised vertical bands fairly easily. Now these channels put up a black screen, the phantom bands are harder to see, and I don't know what happened to account for the difference. With a black screen, I have to maximize Backlight, Brightness, and Picture on the TV to bring out the phantom vertical bands.

But they're definitely there.

At least, my eyes think they're there. I've even begun to wonder if somehow a magnetic field causes them ... but LCDs aren't supposed to respond to magnetism, the way CRTs do. Plus, it doesn't look like a magnetism type of problem. It looks too regular, too ruler-straight, for that.

Tuesday, December 19, 2006

A New Bedroom HDTV for Me, Part V

As in A New Bedroom HDTV for Me, Part IV, I continue to discuss the "mura" problem my Sony KDL-40XBR2 seems to exhibit.

I've found yet another PDF that describes the problem. This one describes "mura" this way:
One large category of defect is called “mura,” derived from the Japanese word for blemish. Mura are typically low-contrast imperfections that are larger than a single pixel, and are visible when the display is driven at a constant gray level. They may be caused by non-uniform distribution or impurities of liquid crystal or mechanical imperfections in the display assembly.
The article, "Flat Panel Display Defect Measurement Using a Human Vision Model," by NASA scientist Andrew Watson, was mentioned in this post in the Official Sony 46" XBR LCD Uneven Backlight/Cloudy Thread, at www.avsforum.com.

A longer article by Watson, "The Spatial Standard Observer: A Human Vision Model for Display Inspection," can be found here. This article brings home to me the fact that mura defects exist in a range of severities measurable in units of just-noticeable difference (JND). A 1-JND mura is barely noticeable when the display is driven at a uniform gray level; it may well be undetectable when a regular picture is on the screen. On the other hand, a mura with a measure of 4 JND would be quite detectable. It would, I assume, affect a regular picture on the screen.
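Watson's Spatial Standard Observer is a full human-vision model, but the basic idea of grading a mura in JND units can be sketched with a toy Weber-contrast calculation. The roughly-1%-contrast-per-JND threshold below is my illustrative assumption, not Watson's actual metric.

```python
def mura_jnd(l_mura, l_background, jnd_contrast=0.01):
    """Toy severity estimate: Weber contrast of a mura patch against
    its background, expressed in just-noticeable-difference units.
    The ~1% contrast-per-JND figure is an illustrative assumption,
    not the Spatial Standard Observer's real, spatially weighted metric.
    """
    weber = abs(l_mura - l_background) / l_background
    return weber / jnd_contrast

# A band 1% darker than a 100 cd/m^2 gray field sits right at threshold;
# a band 4% darker would be plainly visible on a uniform gray screen.
print(mura_jnd(99.0, 100.0))   # about 1 JND
print(mura_jnd(96.0, 100.0))   # about 4 JND
```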

Some illustrations in these articles depict TV screens with faint vertical-band mura defects that look like mine do. But they also have small "blobs," as I call them, while mine has no blobs. It is the nonuniform blobs which are of concern to the article's author. The faint vertical bands are apparently considered insignificant.

A New Bedroom HDTV for Me, Part IV

In A New Bedroom HDTV for Me, Part III I described further the "mura" problem of my new Sony Bravia KDL-40XBR2 1080p LCD HDTV, originally mentioned in A New Bedroom HDTV for Me, Part II.

Since I discovered the problem, I've been hanging out in the Official Sony 46" XBR LCD Uneven Backlight/Cloudy Thread. It represents the problems of hundreds of owners of flawed 46" and 40" XBR LCDs who have responded to a poll. The poll asks about Sonys with "clouds" of extra brightness on portions of the screen.

At present writing, 283 responders have said, "YES, clouds can be seen when dark colors are displayed or when switching inputs." Only 133 say, "NO, my screen has a perfect, fully even backlight."


Any problem with the picture uniformity of an LCD panel can be called a mura defect. Good picture uniformity means the panel can display full-field test patterns of various shades of gray, ranging from reference black to peak white, with no blemishes or signs of unevenness. Mura is different from dead or stuck pixels, problems which can also beset LCD panels.

Most mura problems involve variations in brightness, not color, but sometimes color is not uniform over the screen as well. The variations, whether of brightness or color, are often subtle. A brightness variation, for instance, will typically be of such low contrast that one simply cannot see it with ordinary program material. That's where full-field test patterns come in handy.

A poor man's test pattern for the kind of "cloudy backlighting" mura most posters to the thread complain of is to tune the Sony to an unused video input. That turns the screen black, and any clouds can readily be seen. Another way to get the same result is to cue up program material with a black background, such as the end credits of a movie.

To the right is a photo of one poster's problem "clouds." It comes from this album, which contains several other shots posted to the thread by various Sony customers.




Notice that the "clouds" problem, which is called "cluster mura," looks different from the "vertical-band mura" my set has (see image at right). My problem shows up on light-gray fields. The "clouds" show up on dark fields.


I have also seen reports of horizontal banding and of light "bleeding" at the very edges of the screen.


The Sonys aren't the only LCDs with "picture uniformity" problems. I just got my January 2007 issue of Sound & Vision. It tests the Toshiba Regza 47-inch 1080p LCD HDTV, saying:

... I noticed that the sky on either side of the screen seemed a little lighter than the center. Earlier, I had observed that on dark gray test patterns, the Toshiba's image was slightly brighter in its far left and right zones than in the middle, and now here it was in program material. This kind of imperfection is not unknown to LCDs. (JVC's 46-inch LT-46FN97, reviewed in December, suffered the same fault.) It may have been endemic only to my sample ...

and

Full-field gray patterns below 50 IRE were slightly darker in the middle than on the sides.

From the review of the Sharp LC-52D62U 52-inch 1080p LCD TV:

... when looking at solid gray test patterns, I noted a distinct pattern of horizontal dark bands across the TV's screen. The bands weren't as visible with regular programs, although they did show up pretty clearly on occasion. ... Fortunately, a second review sample from Sharp had markedly better picture uniformity, with almost no banding visible on regular programs or test patterns. (Sharp is aware of the issue and said anyone encountering this problem should call 800 BE SHARP [800-237-4277].)

The Sony VPL-VW50 1080p SXRD front projector, using the LCD variant known as Liquid Crystal on Silicon, also had uniformity problems:

Picture uniformity was poor, with pink and green tinting visible on both gray full-field patterns and program material, although the amounts varied on my two test samples.

and

... the top and bottom edges of the frame had a distinct pink tint and the center looked comparatively green — a problem that was even more apparent on test patterns ... I saw this same issue on the black-and-white stills shown in a PBS documentary ... although it was harder to detect on full-color programs. A second sample from Sony showed a reduced level of pink discoloration, and no greenish tint.

Eventually, Sony blamed this on sample variation that falls within range of its manufacturing tolerances, and said "an experienced technician" can enter the service mode and adjust for uniformity using the projector's multipoint gamma control. I was able to remedy the problem using this adjustment, and chances are you can hire an ISF technician to do the same. However, this calibration would not be covered under warranty.


There are, as I say, many types of mura. And they are hard to measure objectively.

This PDF gives some technical insight into the phenomenon, as do this one and this one.

The first PDF shows pictures of the different mura types on page 2. The thrust of the article is: what kind of algorithm can be used to evaluate muras? Among "cluster muras" alone, the category into which the "clouds" complained of in the thread fall, there are "round-type" and "line-type." How can one algorithm evaluate both?

The other two PDFs also make it clear that automated mura detection and measurement is not easy — yet the human eye picks picture nonuniformity up in a flash.

These PDFs are aimed at improving quality control — weeding out bad LCD panels before the customer sees them. The panels made for our problem Sonys as well as for a number of other LCD TV brands weren't duly weeded out, obviously.

Muras aren't binary, on-off, yes-no things like dead pixels. A pixel is either dead or it isn't. But a "cloud" can be a puffy cumulus or a faint haze. Maybe if the participants in this thread turn them all into thunderclouds, Sony will get struck by lightning and change its QC ways!

It begins to look to me as if whatever QC worked for lesser LCD panels is failing with the newer, larger, 1080p panels.

Here's an analogy: In the world of DLP-based 1080p rear projectors, there are two kinds. One uses a 1,920 x 1,080-pixel chip, the other a 960 x 1,080-pixel chip. The latter uses "wobulation" — a swiveling mirror in the light path — to double the effective number of horizontal pixels.

The chips are "digital micromirror devices" or DMDs, not computer chips with transistors, but the principle is the same. The more micromirrors or transistors on a chip of a given size, the denser the chip. The denser the chip, the lower the yield of good chips in the manufacturing process. The lower the yield, the more chips have to be thrown away — and the greater the unit cost of good chips, since the cost of making the bad chips gets piggybacked onto the good ones.
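The yield-and-cost logic above can be put into numbers with the classic Poisson defect model from chip manufacturing. This is a generic textbook sketch with made-up figures, not actual DMD or LCD production data.

```python
import math

def poisson_yield(defects_per_cm2, area_cm2):
    """Classic Poisson yield model: the probability that a device of
    the given defect-sensitive area contains zero randomly scattered
    fatal defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

def cost_per_good_unit(batch_cost, units_per_batch, yield_fraction):
    """Bad units get thrown away, so their cost is piggybacked
    onto the good ones."""
    return batch_cost / (units_per_batch * yield_fraction)

# Illustrative numbers only: doubling the defect-sensitive area
# (a stand-in for a denser device) drops yield and raises unit cost.
d = 0.1  # assumed defects per cm^2
for area in (1.0, 2.0):
    y = poisson_yield(d, area)
    print(f"area {area:.0f} cm^2: yield {y:.1%}, "
          f"cost per good unit ${cost_per_good_unit(1000, 50, y):.2f}")
```

The exponential means yield falls off fast as devices get denser or larger, which is exactly why the bad-unit cost burden grows so quickly.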

Using the lower-density DMDs along with wobulation holds the line on the price of the TV. True 1080p DLPs cost a lot more.

It stands to reason that the same logic applies to display panels. The denser the panel — and 1080p panels are more than twice as dense as 720p panels of equal size — the lower the yield of good panels, whose manufacturing cost accordingly becomes a huge issue. But there's no workaround such as wobulation — what you see is what you get. It appears to me that Sony is lowering standards on what constitutes a "good" panel in order to keep overall costs down and remain competitive.

Wednesday, December 13, 2006

A New Bedroom HDTV for Me, Part III

In A New Bedroom HDTV for Me, Part II I complained about a minor defect in the Sony KDL-40XBR2 40" LCD panel I bought for my bedroom: "an irregular swath, predominantly vertical, sitting slightly to the left of the center of the screen in roughly the top two-thirds of the image." A portion of a solid gray test pattern showed up as slightly too dark in the swath in question.

Further experimentation now reveals the swath is actually more regular than I originally thought. It runs from the top to the bottom of the screen and seems to be part of a series of other vertical brightness "troughs" and "crests" that flank it to either side. It looks like the supposedly perfectly flat LCD panel is faintly "crumpled," in the way a rectangular rug that has been bunched up in one direction can look crumpled.

Another way to describe it is that there is a subtle "accordion" effect in the LCD panel. The "pleats" are, however, not sharp-edged but gradual.

The main "trough," or "accordion pleat," the one almost at screen center, even has a very slight pinkish tinge, where the test pattern is supposedly a uniform light gray.


I've managed to discover that the type of defect I have in my Sony is one that is well-known to afflict LCDs. It is called mura, the Japanese word for blemish, unevenness, or inconsistency. This PDF gives some technical insight into the phenomenon, as do this one and this one. The image at right was taken from the first of these three. It shows one of the several kinds of mura defects in LCDs, the so-called vertical-band or v-band mura. Notice the darker vertical stripe a bit to the right of center. (Other mura types include cluster muras, rubbing muras, and light-leak muras.)

The v-band mura shown here resembles the main brightness "trough" on my Sony screen, except that mine is slightly to the left of center, while this one is slightly to the right. (This example also manifests a central, over-bright "hot spot." I assume that this is an artifact in the example graphic and is not relevant to the overall mura defect issue. My Sony certainly displays no such hot spot.)

This example doesn't show the same flanking "troughs" and "crests" to either side of the main "trough" as my Sony exhibits. Nor does the example demonstrate the ultra-slight pinkish tinge near my main "trough." Yet the example above gives you a pretty good idea of what you would see if you looked at my TV while it was rendering a light gray test image.


Breaking out my digital camera, I took this picture of my Sony displaying a solid gray 30 IRE test screen. (The IRE scale from 0 IRE to 100 IRE measures shades of gray from pure black to full white. 30 IRE is a dark medium gray.) There is a vertical stripe just a tad to the left of the center mark as indicated by the Sony logo. There is lesser non-uniformity of illumination (fainter stripes, "crests," "troughs") surrounding it, all vertically oriented. In fact, you might consider the relatively bright "wings" of the image to be overly illuminated, as is a "crest" to the right of the center line.
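For anyone curious how IRE values relate to the numbers inside a digital video signal, the mapping is simply linear. Here's a sketch using the standard 8-bit "studio swing" convention, with reference black at code 16 and peak white at code 235; the old NTSC 7.5-IRE black-setup wrinkle is ignored.

```python
def ire_to_8bit(ire):
    """Map 0-100 IRE linearly onto 8-bit studio-swing video levels:
    reference black at code 16, peak white at code 235 (the Rec. 601
    convention; NTSC's 7.5-IRE setup would shift this slightly)."""
    return round(16 + (ire / 100.0) * (235 - 16))

print(ire_to_8bit(0))    # 16  (reference black)
print(ire_to_8bit(30))   # 82  (the dark medium gray in my photo)
print(ire_to_8bit(100))  # 235 (peak white)
```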

In this picture you can't see the pinkish color tinge that I see with my eyes, the one associated with the main vertical band or "trough."


While I was experimenting with the above, I chanced upon the gamma test pattern of the Avia DVD and decided to play around with the Sony's gamma setting. I don't have a lot of confidence in this test pattern giving accurate numbers with non-CRT displays ... but for what it's worth, with the TV's gamma setting Off, the Avia gamma figure read 2.2. Each "higher" Sony gamma setting — there are four choices — decreased Avia gamma by 0.1. So at the maximum setting, Avia reported a gamma figure of 1.8.

Gamma affects the apparent contrast in the image. The higher the gamma figure — the "lower" the Sony setting for gamma — the more contrast there appears to be. When gamma is 2.2, there appears to be more contrast than when gamma is 1.8. 2.2 is considered fairly standard ... though, again, I won't vouch for the precision of the Avia numbers on an LCD TV, or on any TV where "gamma" is generated digitally rather than being an intrinsic property of a cathode ray tube's electron gun.
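The gamma-versus-contrast relationship is easy to see with a simple power-law display model. This is an idealized sketch, not a model of the Sony's actual internal processing.

```python
def relative_luminance(signal, gamma):
    """Relative display luminance (0-1) for a normalized signal
    level, using a simple power-law model of display gamma."""
    return signal ** gamma

# A 50%-signal midtone comes out darker at gamma 2.2 than at 1.8.
# Darker midtones against the same peak white read as deeper shadows,
# i.e., more apparent contrast.
for g in (1.8, 2.2):
    print(f"gamma {g}: mid-gray luminance {relative_luminance(0.5, g):.3f}")
```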

Tuesday, December 12, 2006

A New Bedroom HDTV for Me, Part II

In A New Bedroom HDTV for Me, Part I I crowed about my new 1080p HDTV, the Sony KDL-40XBR2 40" LCD panel.

In a minute I'll talk about a minor problem I've discovered. But first, I again want to state how much better this HDTV's picture is than that of the other two HD sets I have, a 32" Hitachi plasma and a 61" Samsung DLP rear-projection unit, both of them of three-year-old design. Yesterday I tweaked the Sony's picture with the Digital Video Essentials test DVD and found that it basically needed very little tweaking!

I had already adjusted grayscale by eye — the Sony offers user-accessible "white balance" adjustments of bias and gain in each of the three color channels (red, green, blue) that combine to make up white or shades of gray. I had chosen the Warm 2 setting for color temperature and the default setting of Low for gamma. I turned off the lights, setting the Sony's backlight to minimum and picture level to maximum. Then I popped in the test DVD and went through its adjustment procedure. The net result was that I needed to drop the brightness level to 29 and the color level to 30. The hue adjustment remained centered at 0. That was all there was to it!

The result was a stunning picture that displayed color footage exquisitely and handled monochrome test patterns equally well.


But, as I say, a minor problem also showed up. It looks as if the backlight behind the LCD image isn't perfectly even in its illumination. There's an irregular swath, predominantly vertical, sitting slightly to the left of the center of the screen in roughly the top two-thirds of the image. In solid light gray test patterns it's barely visible as a slightly darker gray. It doesn't have a distinct edge or border. It grades very gradually into the lighter shades of gray to either side of it.

It shows up only when the image on the screen is of a certain uniform lightness. Most actual programs' images don't reveal it. Or at least, I haven't noticed it except when the screen is basically a uniform light gray.

Backlight unevenness or "cloudiness" or splotchiness is, I've managed to discover, a much-discussed issue with Sony Bravia KDL-nnXBR HDTV owners. This thread at the AV Science Forum deals with it: the Official Sony 46" XBR LCD Uneven Backlight/Cloudy Thread. It's mostly about KDL-46XBR TVs, not KDL-40XBRs, since the uneven backlighting issue seems to beset larger LCD panels more than smaller ones. But 40" owners have reported similar, if less pronounced, problems.

Most of the complaints deal with light blotches in dark images. I'm seeing a dark blotch in a light image. Despite the apparent difference in symptoms, I think the underlying cause is the same: uneven backlighting.

At least one poster to the thread has suggested that the unevenness is "the result of thermal induced stresses on the LCD panel" — "warping of the panel due to non uniform expansion in response to the increase in temperature" — and says his 46-in. 1080p Samsung European-model LCD "also presents the same type of clouding on the right hand side (top and bottom)."

Intuitively, I expect this is the correct explanation. It might also account for why this problem, which as far as I know is new to the world of LCD HDTVs, besets the newer, larger panels: the bigger the TV, the more prone it is to the problem.

But it doesn't quite explain why owner scuttlebutt has it that Sonys built before August 2006 seem to be immune. It seems to be only the more recent units that exhibit the problem.

More on my experiences with this otherwise magnificent HDTV in Part III!

Sunday, December 10, 2006

A New Bedroom HDTV for Me, Part I

A new Sony Bravia KDL-40XBR2 now graces my home's master bedroom. This 40" LCD flat panel with 1080p resolution has received glowing coverage in the enthusiast press and is ranked tops in its category by Consumer Reports. I had intended to buy a 1080p Sharp AQUOS LC-37D90U 37" LCD, but was put off when I found Best Buy doesn't carry it. I switched allegiance to the Sony ... then ended up buying it online at Abe's of Maine, rather than Best Buy!

I chose Abe's for its lowball price. You can pay as much as $3,100 for this set. I paid $2,339 for my Sony at Abe's on 11/28/06, with free shipping and zero tax. My set is currently (12/11/06) listed at Abe's at $2,315. You can probably find even lower online prices at www.shopping.com.

Online prices on this and other HDTVs change seemingly daily. For instance, the day before I placed my order at Abe's I found a price of $2,539.99 at BestBuy.com. The very next day the price had gone up by some $400, I think. Today, as I write this, it's all the way down to $2,489.99. (To see the BestBuy.com sale price, you have to put the TV in your shopping cart. Otherwise, you see the list price of $3,099.99.)

Buying an HDTV online was a bit harrowing. I fretted about getting a maybe illegal "gray market" item, possibly with no U.S. warranty ... about shipping taking forever ... about the difficulty of getting the heavy, bulky item into my house ... etc., etc. Complicating the situation was the fact that I was also ordering several other items at the same time, from several vendors.

Abe's, in shipping the two items I ordered from them, was not as quick as the other three vendors. It took eight days to receive the final item from Abe's, the TV itself ... which arrived with much external damage to the carton and the styrofoam packaging inside shattered into chunks, as if the box had been dropped at some point. But the TV looks and works fine. Draw your own conclusions.

None of the other items I ordered arrived with apparent damage. All worked fine, and all seemed to be legitimate, non-gray market items with actual manufacturers' U.S. warranties inside the boxes. None were packed in anything other than the usual full-fledged original manufacturers' cartons, with all the usual logos, manuals, etc. Aside from the damage to the TV's packaging and the inexplicable slight delay in getting Abe's to ship the TV and the TiVo, I personally had no bad experiences with buying home entertainment gear online at deep-discount prices.


I did have some problems getting all the stuff installed.

I bought a Tech-Craft TRK50 TV stand from AudioVideoFurniture.com for $415.00. I put it together all by myself without a hitch. It has a rear pillar with mounting hardware for just about any flat panel TV of moderate size. That makes it capable of elevating the panel to a respectable height for bedroom viewing without resorting to a wall mount.

I had to wait two or three weeks for this stand to be "in stock" at the vendor's warehouse. I could find no other vendor that had it ... though many listed it, and for lower prices.

Figuring that I had just about maxed out my own handyman capabilities assembling the stand, when I finally got it, I went to ServiceMagic.com to locate an installer. They referred me to Al, whose last name I'll withhold, a skilled and competent home theater installer in my area. Al was very easy to contact and work with, and he did install all my gear, including wall-mounting the Yamaha sound projector, for $250.

But the YSP-800 wall-mounted digital sound projector brought woe. It's an array of tweeters and woofers that, through the magic of bouncing sound waves off walls and controlling them with digital signal processing, can make audio emanating from the area of the TV sound like it's coming from five speakers positioned around the room. Add a subwoofer and you have a virtual home theater.

Woe came when Al was trying to mount the sound projector on the wall above the TV. He shifted it just a tad to one side ... and it fell on his head.

This happened while I was out fetching sandwiches for lunch. When I got back Al told me, "I had a little problem while you were gone ... ." I looked and saw dings in the front grille. (Al said he had a headache.) The next day I noticed dents in the bottom plate of the base.

Al and I are now working together to figure out whether and how the cosmetic damage can be repaired at reasonable cost to Al, who accepts that he is liable for the damage. The grille can be replaced fairly cheaply. It looks like the metal plate with the dents in it will be a bit tougher; it seems to be, as a part of a larger main base assembly, unavailable separately. Replacing the whole main base assembly would possibly cost big bucks and would involve tearing into the unit's innards ... the idea of which doesn't thrill me.

As I say, Al and I are exploring various alternatives. Just as a guess, I imagine I may wind up with a replacement grille, two unrepaired dents in the base, and a refund of the $250 I paid Al for the whole job. Stay tuned.


This is my third HDTV, to go with a 32" Hitachi plasma and a 61" Samsung DLP, both of three-year-old vintage. Neither of the others is 1080p. In my early experience with the Sony, it renders a noticeably better picture.

It's better not just at 1080p resolution, which I've yet to spend much time with. Even when watching DVDs and standard-def cable, there's a big improvement.

The colors, for one thing, seem more accurate. My Hitachi tends to make reds orangish, while the Samsung biases them toward a cherry hue. The Sony's reds are, in my judgment, spot on.

Then there is the better black level. The Sony's blacks, while not coal-black, are closer to it than either the Hitachi's or the Samsung's. Meanwhile, the Sony manages to produce a wealth of shadow detail. The Hitachi "swallows" an abundance of same, rendering it totally invisible, while the Samsung seems to do something odd to it: the color of very dark portions of the image seems to fade unnaturally into bland grayness.

The Sony does better with grays, too, as with a black & white movie. Last night I watched the 1955 film noir classic Kiss Me Deadly on TCM on analog cable. I used the Sony's White Balance adjustments, found under Advanced Picture Settings — but only when using Custom Picture Mode, not Vivid or Normal — to change a greenish image to one that was pretty darn neutral in hue.

The White Balance adjustments modify Bias and Gain for each of the three color primaries, red, green, and blue. Bias is like a Brightness control, but for just one color. Gain is like a Contrast or Picture control, again just for a single color. Tweak the tri-colored Bias settings along with the three Gain settings, and you are in effect doing a grayscale calibration of the TV.
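The Bias/Gain arithmetic can be sketched in a few lines of Python. This is a simplified model of what such controls do, not Sony's actual internal processing, and the function name is my own invention:

```python
def adjust_channel(value, bias, gain):
    """Apply a Bias (offset) and a Gain (multiplier) to one color channel.

    value: input level from 0.0 (reference black) to 1.0 (peak white).
    Bias shifts the whole range up or down, so it is felt most near black;
    Gain scales the range, so it is felt most near white. The result is
    clamped to the displayable 0.0-1.0 range.
    """
    return max(0.0, min(1.0, gain * value + bias))

# Taming a greenish mid-gray: pull green's Gain down slightly while
# leaving red and blue alone.
r = adjust_channel(0.5, bias=0.0, gain=1.0)
g = adjust_channel(0.5, bias=0.0, gain=0.96)
b = adjust_channel(0.5, bias=0.0, gain=1.0)
```

Because Bias dominates near black and Gain near white, a calibrator typically alternates between the two until grays look neutral from shadows all the way up to highlights.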

More in A New Bedroom HDTV for Me, Part II!

Sunday, December 03, 2006

Two Useful HDTV Primers

If you're looking to buy your first HDTV for Christmas 2006, you may find you're overwhelmed. You're asked to choose among flat panels, direct-view picture tubes, rear projectors, and front projectors. Resolutions include 720p, 1080i, and 1080p. Imaging technologies at your beck and call include:

  • LCD (liquid crystal display)
  • PDP (plasma display panel), or just "plasma"
  • DLP (digital light processing)
  • LCoS (liquid crystal on silicon)
  • SXRD (Sony's "Silicon X-tal Reflective Display" take on LCoS)
  • D-ILA (JVC's "Digital Direct Drive Image Light Amplifier" LCoS version)
  • ... and the old-fashioned "picture tube," or CRT (cathode-ray tube).


What to buy? How to decide? How to figure it all out?

You can begin by checking out this HDTV Theater Feature at Discovery.com. The guide's first section, "HDTV Benefits," tells you stuff you probably already know, such as how much wider the HDTV screen is — a 16:9 aspect ratio, to SDTV's 4:3 — and how much better multichannel surround sound is than stereo. And yes, you can get 4.5 times the number of pixels of standard-def TV — up to 2 million of them — but what's a pixel? Click on the word "pixels" and see!

The real meat-and-potatoes of the tutorial begins with the "Purchase Guide." Here the major types of HDTVs are described: traditional direct-view CRTs; flat panel LCDs and plasmas; and rear projectors using CRT, DLP, LCD, and LCoS technologies. (Front projectors aren't covered.) Notice that LCD technology appears in both flat panels and rear projectors.

The guide's "HD Setup" section is worth its weight in gold. Just click on "Basic Setup" or "Advanced Setup" and you'll see an easy-to-use interactive guide to all the confusing connections on the backs of HDTVs and other gear in your system. You can skip the "Get Discovery HD Theater" section unless you want to find out how to receive that high-def channel. The "Glossary/FAQ" section will get you started talking the talk if not walking the walk in HDTV land.


Another fine primer comes in the form of a pair of articles from the technology writer of The Baltimore Sun, Mike Himowitz.

The first, "Plasma? LCD? What to look for in HDTV," makes the point that "we've reached the point of diminishing returns on procrastination." Wal-Mart is supposed to be slashing HDTV prices this holiday season, with other vendors following suit. Why wait to buy?

Another key point: "HD manufacturers give you 11 percent less TV than you got for the same diagonal measurement in the old days." The "wider" 16:9 picture is, in a way, a squashed picture. To get the same screen area as with an SDTV you have to size up. So you may need a pricey new TV cabinet or stand.
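That 11 percent figure is easy to verify yourself. For a given diagonal and a width:height aspect ratio, screen area follows from the Pythagorean theorem; here's a quick check in Python (my own throwaway helper, not from the article):

```python
import math

def screen_area(diagonal, aspect):
    """Screen area for a given diagonal and width:height aspect ratio."""
    height = diagonal / math.sqrt(1 + aspect**2)
    width = aspect * height
    return width * height

a_4x3 = screen_area(40, 4/3)     # classic SDTV shape
a_16x9 = screen_area(40, 16/9)   # HDTV widescreen shape
print(round(100 * (1 - a_16x9 / a_4x3)))  # prints 11 (percent less area)
```

At any common diagonal the ratio is the same, so a 16:9 set always gives you about 11 percent less glass than a 4:3 set of the "same" size.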

That brings up an important point. Where are you going to put your DVD player, cable box, A/V receiver, etc., etc., etc.? With a conventional set, these pieces of gear often found their way to space above or to one side of the TV. No longer. The new sets usually don't have tops to set things on, and they intrude into the side space so much that you can kiss that goodbye. So, even if you wall-mount your new HDTV, you'll need a stand with plenty of shelf space under the TV.


Himowitz advises, and I concur, that CRT-based HDTVs no longer make sense. They're too deep, too heavy, and take up too much of the space beneath their screens. The best flat panels now provide just as good an image, cost less, and use less space. In fact, according to the December 2006 issue of The Perfect Vision, Sony will cease production of direct-view CRT HDTVs and CRT-based rear projectors.

Between the two major flat panel types, LCDs and plasmas, Himowitz advises favoring plasmas in larger sizes (over 40") and LCDs in smaller ones. But I'd say the gap has narrowed in the crucial size range of 40-42" to the point where you should let your eyes and pocketbook be the final judge.

Plasma pluses: really bright pictures, excellent colors, fairly deep blacks. Plasma minuses: they're heavy, hot, electricity hogs with a potential for burn-in of static images; 1080p models are as yet few and pricey. (Manufacturers now say they've licked the burn-in tendency.)

LCD pluses: 1080p resolution available for moderate prices; they reflect room lights less than plasmas; they're light in weight; no burn-in. LCD minuses: moving images can blur due to response-time lags; narrower viewing angles; blacks can be less deep than on plasmas. (Note that the very latest LCD panels pretty much conquer all these minuses.)

Himowitz makes it sound like rear projectors (as opposed to flat-panel HDTVs) are as much for techies as front projectors, but it just ain't so. Models of the LCD, DLP, LCoS, and SXRD varieties, the so-called "microdisplays" with magnifying optics inside, are no harder to set up and use than flat panels — though you ought to avoid the now-passé CRT projectors which require "convergence" adjustments — and they give you the most bang for the buck in screen sizes over 50". The downside: they do need an expensive new bulb every few years, off-angle viewing may be problematic, DLP sets make some people see tri-colored rainbows ... and they can't be mounted on a wall.


The second Himowitz piece, "HDTV specs matter, but what you see counts most," deals first with that confusing HDTV nomenclature. I'd say it's the best quick intro yet to terms like 720p, 1080i, and 1080p.

I disagree slightly with Mr. H's description of 1080p as "overkill," though. True, at present the only way to input 1080p to an HDTV is with a Blu-ray disc player, but it won't be long until HD DVD players and home computers can feed 1080p into HDTVs as well.

Also true, you need to be sitting fairly close to a reasonably large HDTV to see the difference between 1080i and 1080p. If your TV is of modest size compared with your viewing distance, your eyes will blur away much of the extra definition.
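You can put a rough number on that. A common rule of thumb in vision science is that a normal (20/20) eye resolves detail down to about one arcminute; beyond the distance at which a single pixel subtends that angle, the extra resolution is wasted. A quick estimate in Python (my own back-of-the-envelope helper, using the one-arcminute rule of thumb, not a figure from the articles):

```python
import math

def max_full_detail_distance(diagonal_in, rows=1080, aspect=16/9):
    """Farthest viewing distance (inches) at which one pixel row still
    subtends about 1 arcminute -- a rough limit of 20/20 acuity."""
    height = diagonal_in / math.sqrt(1 + aspect**2)  # screen height
    pixel_pitch = height / rows                      # one pixel's height
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

print(round(max_full_detail_distance(40) / 12, 1))            # ~5.2 feet, 1080 rows
print(round(max_full_detail_distance(40, rows=720) / 12, 1))  # ~7.8 feet, 720 rows
```

In other words, on a 40" set you'd have to sit within roughly five feet to get everything 1080 lines can offer; from a typical bedroom or den distance, 720p and 1080p can look much alike.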

I'd still say that if you ever envision having a 1080p source such as Blu-ray, you ought to consider investing in a 1080p TV very, very seriously.

Then again, according to the December '06 issue of Consumer Reports, "DisplaySearch predicts that the average price of a 50-inch 1080p plasma TV will drop from about $5,400 this December to $3,100 by late next year." So if the words "plasma" and "1080p" are at the top of your HDTV shopping list, waiting another year may make sense after all.

As for 720p, Mr. Himowitz is right in calling it "entry level" HDTV these days. It has only 720 lines or rows of pixels, not 1,080 — though it wholly avoids the "i" vs. "p" debate; it's always "p."

Here's where things get truly confusing. I've talked of 720p, 1080i, and 1080p as if you understood those terms. I realize some explanations are due. I'll give them in another post, 1080p, 1080i, and 720p.


I would echo Mr. H. in advising you to find a store where you can view an HDTV in dim surroundings, from your actual anticipated seating distance. In my area, Tweeter is usually best for this. Circuit City and Best Buy too often crowd you in toward their TVs on display and/or use overbright fluorescent lighting.

Bring along a favorite DVD (ideally, one with a lot of dark scenes and bright scenes with a lot of action) and ask to view it with a remote control for the HDTV in your hand. You need to know:

  • How black do the blackest areas of the image get with BRIGHTNESS set suitably low?
  • How much "shadow detail" is there with BRIGHTNESS set that low?
  • How high can you set CONTRAST without "crushing" bright highlights on the screen?
  • Do colors look pleasing and natural?
  • Do faces look sunburned or flushed unless you set COLOR too low?
  • Do you see "false contours" in supposedly smoothly graded picture areas?
  • Do you see jagged edges on objects in moving images?


Another tip: bring along a black & white DVD! The picture should look a neutral gray. If it doesn't, the TV may need a costly "grayscale calibration" to perform its best in your home.


The Himowitz article goes on to describe the various connections you may need to make to use your HDTV. To what he says I would add that you definitely ought to favor High-Definition Multimedia Interface (HDMI) connections over all others. He doesn't mention it, but HDMI carries audio as well as video signals, so if your source device outputs HDMI, one cable from it to the HDTV can do it all.

Digital Visual Interface (DVI) inputs are the equal of HDMI, video-wise, but they're so two-years-ago. Don't put much stock in them today; their connectors are big and clumsy, and they don't carry audio. If you have a source device that outputs DVI but not HDMI, you can get a DVI-to-HDMI cable or converter for it ... but you'll need to route audio separately.

Of course, your HDTV ought to have enough HDMI ports to handle all your inputs. If it doesn't, you can get an A/V receiver that accepts multiple HDMI inputs and sends just the selected input signal to the TV, again on HDMI. Or you can buy a special HDMI switcher, but that's geeky to the max. Or you can fall back on routing high-def video from the source device to the TV via analog component-video wires, alias YPbPr.

In the latter scenario, or if you're forced to use DVI, audio is carried on its own wire(s). Himowitz is dead wrong to say that optical and coaxial audio connections are for "tweaks" and buffs. They're the best way (short of HDMI) to route multichannel Dolby Digital or DTS surround sound to an A/V receiver or a home-theater-in-a-box.

Types of connections to be avoided, if possible, include:

  • S-video or composite video connections (use HDMI/DVI or component video)
  • 75-ohm coaxial RF connections, except from an antenna or cable wire
  • Stereo analog audio via right-left RCA audio ports (they're non-multichannel)

Finally, it's good advice to consider getting your HDTV and whatever gear you buy with it professionally installed. It ain't cheap, but the job can require skill, expertise, and tools you probably don't have ... and heavy lifting you may not be prepared for.

Tuesday, November 28, 2006

Ripping: The ARccOS Problem

In To Rip, Perchance to Burn and its prior posts in this series I talked about the various pieces of Mac software that can help you make an archival copy of a commercial DVD you own. Among these was MacTheRipper, an actual DVD extractor. It generally works OK, I find, but it so happens that the third or fourth DVD I tried to rip with it, Inside Man, gave it a bellyache.

About midway through the rip, MacTheRipper complained that bad sectors had deliberately been coded into a VOB (Video OBject) on the DVD and asked me whether to delete or pad them. I chose pad. But MTR immediately got stuck, and no more progress was made. I had to quit MTR and start it again.

Before I did, I Googled "DVD VOB deliberate bad sectors MacTheRipper" and found some forum posts about the problem. One poster recommended turning on "ARccOS" in MTR. That, I found, is done in the Mode panel (as opposed to the Disc panel) of the MTR window: you simply switch from "Full Disc Extraction" to "Full Disc (ARccOS) Extraction."

What is ARccOS? This Wikipedia article says it's an extra dollop of copy protection some DVDs have. Which DVDs? See this list. (Yes, Inside Man is on the list.)


Unfortunately, although that strategy allowed MTR to finish the rip, at the end MTR still complained of bad sectors and suggested the resulting rip might be unplayable. And so it was. I'm trying the rip again, but I have little hope that MTR can cope with this DVD alone.

It's all about an arms race between the studios and the rest of the world. The rest of the world insists it has the right to make an archival copy of a DVD it legitimately possesses, in case the original becomes unplayable. The studios say that opens the door to piracy and add ever-new layers of copy protection. The purveyors of software like MacTheRipper try to overcome the new protection. In this case, with this DVD, it looks like MTR — at least, the version I have — hasn't quite succeeded.

The MTR version I have is the latest "official" release, 2.6.6 ... but it dates back to early 2005. The guru behind MTR is currently doing a version 3.0, which is in beta testing. Apparently the only way to obtain it is to donate money ... which I might do at some point. But not yet. I can wait.


Meanwhile, as a workaround I followed one forum poster's advice and fed MTR's output into DVD2OneX. The latter is software that, mainly, further compresses a DVD so it will fit on a single-layer recordable disc. DVD2OneX also has the ability to create a dual-layer version of the original, but that is an option I haven't tried yet.

I told DVD2OneX to create a file set rather than a disc image — much less actually burn a disc — and it gave me a folder with VIDEO_TS and AUDIO_TS subfolders, just like on an actual DVD. At the end it warned me that it had detected 15,765 "mastering errors" in the input data, errors which it had "corrected" in the output — but that the result might still not "work properly."

I pointed Apple's DVD Player app at the VIDEO_TS produced by DVD2OneX and found that it can begin playing the movie seemingly without a problem.


What happened next was that I decided to copy the file set DVD2OneX had produced to a DVD-R and try playing that. Burning to DVD was a mistake, since the file set did not constitute a playable DVD.

The file set included two folders, VIDEO_TS and AUDIO_TS, but they alone don't seem to be enough to make a DVD playable as such. True, I could point Apple's DVD Player app at the disc's VIDEO_TS folder as "DVD Media" and the app would "play" the media files. Otherwise, though, I had a data-only DVD that would not play in a real DVD player.


So, next I tried using DVD2OneX to make a "disc image" on my hard drive. I chose the option to make a disc image for a dual-layer DVD, just to see what would happen. This resulted in an icon on the desktop that, when double-clicked, places a virtual disk volume on the desktop. That volume plays successfully in DVD Player as if it were an actual DVD in the optical drive.

The disk image icon apparently contains 6.85GB. So does the volume it mounts on the desktop when it is double-clicked. Strangely, this is the size of the VIDEO_TS folder alone. The AUDIO_TS folder appears to be empty! I can't explain that.

If I had a SuperDrive that would burn a dual-layer disc — which I don't — I could theoretically now use Disk Utility to burn my disc image into an actual disc. But of course if I had such a drive, I could have burned the disc directly from DVD2OneX.


As far as the playability of the result of using MacTheRipper and then DVD2OneX on Inside Man, there do seem to be minor issues. I'm only part way through watching the movie, but so far there seem to be at least two glitches in the playback. They're minor, as I say ... it's as if some frames were missing. I do find, though, that the scan forward and backward functions of the Apple DVD Player software don't work right with this disc image.

Accordingly, I have to conclude that the result of the rip is pretty good, but not perfect. Perhaps if I were to get the latest beta version of MacTheRipper, I could do better with Inside Man.