Saturday, June 25, 2005

HDTV Quirks, Part III, Contrast Ratio

I've admittedly been going on and on about gamma, the TV display characteristic which determines how brightly all the various levels of luminance appear on the screen. (My most recent installment in that vein was Tweaking Gamma on My Samsung DLP, Part II.) Higher gamma makes shadows seem deeper, while lower gamma opens them up for easier inspection — that's about the simplest way I know to explain the gamma concept.
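
For anyone who likes to see the math, here's a tiny Python sketch of the idea (the gamma values and the 20-percent signal level are purely illustrative — nothing pulled from my Samsung):

```python
# A minimal sketch of how gamma affects shadow rendering: relative light
# output is modeled as the normalized signal level raised to the power gamma.

def displayed_luminance(signal, gamma):
    """Relative light output (0.0-1.0) for a normalized signal level."""
    return signal ** gamma

dark_signal = 0.2  # a fairly dark gray, about 20 percent of peak signal

for gamma in (1.8, 2.2, 2.5):
    lum = displayed_luminance(dark_signal, gamma)
    print(f"gamma {gamma}: the 20% signal displays at {lum:.3f} of peak white")

# The same dark gray comes out noticeably dimmer as gamma rises -- higher
# gamma deepens shadows, lower gamma opens them up.
```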

The reason I investigated gamma at all is that I was dissatisfied with how my Samsung 61" DLP-based HDTV, a rear-projection unit, displayed low-luminance signals, by which I mean low-lit images that are much, much closer in overall luminance to the 0-IRE "black level" of a video signal than to 100-IRE peak white. In other words, there was something uninspiring about how the picture looked when there was nothing particularly bright in the scene.

So I assumed that the proper remedy might be changing the GAMMA parameter in the Samsung's service menu. I proceeded to play with the GAMMA settings and to learn quite a bit about the gamma concept in theory and practice ... only to find the original "factory" setting, GAMMA 4, to be the best of the bunch in the end. Even though GAMMA 4 gave me the lowest "true gamma" of any available setting save GAMMA 3 (apparently intended for letting external gear determine the gamma curve), and even though that gamma value was, as best I could tell, far lower than what the "experts" say is correct, it was what I liked best.

It seemed best, that is, once I had boosted sub-contrast in the service menu up to its absolute practical maximum, meaning that peak video white was now being output with the greatest light intensity the TV was capable of. Though that had no effect on the TV's 0-IRE black level, it did have a proportional effect on every luminance level above 0 IRE. Which means that video information at a relatively dark 20 IRE, say, was being displayed brighter than before.

GAMMA 4 really shone with boosted sub-contrast when I watched the Star Wars I: The Phantom Menace DVD. In fact, I'd say that was the most thrilling home video experience I've had to date. There were lots of dark scenes which looked just fine, and lots of bright ones that pretty nearly seared my retinas, looking very cinema-like indeed. Scenes with bright pinpoints of stars set against black sky looked truly convincing. And so on.

Which seems to mean that what resembles a "gamma problem" can really have a "contrast-ratio solution."


A TV's contrast ratio is the ratio between the luminance at which it displays 100-IRE peak video white and the luminance at which it renders 0-IRE true video black. When the main contrast control, the one in the ordinary user-accessible menu, is raised to its maximum setting of 100, the TV's entire available contrast ratio is put to use — assuming, that is, that the service menu's sub-contrast setting isn't holding it back.

By boosting sub-contrast, I made the Samsung's effective contrast ratio equal its maximum available contrast ratio. The percentage by which I boosted sub-contrast and thus the luminance of peak white was, I'm guessing, 30 percent. (It's hard to tell exactly what the percentage gain was, because the numbers corresponding to sub-contrast settings don't necessarily track linearly with luminance output.) Taking that number at face value, that meant that every luminance level above 0 IRE was also rendered 30 percent brighter than before. Even, say, lowly 10 IRE went from being super-super-dim to just plain super-dim.
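
Here's a back-of-the-envelope Python sketch of that proportional effect. All of the luminance numbers are invented, and the curve is a crude gamma-2.2 approximation, but it shows how everything above 0 IRE gets lifted while black stays put:

```python
# Sketch of the sub-contrast boost: the light the TV adds above its fixed
# 0-IRE black level scales by the boost factor. All numbers are invented.

boost = 1.30        # assumed ~30% gain in peak-white output
black = 0.10        # ft-L the set still emits at 0 IRE (unchanged by the boost)
old_peak = 30.0     # ft-L at 100 IRE before the boost

for ire in (10, 20, 50, 100):
    fraction = (ire / 100.0) ** 2.2            # crude gamma-2.2 curve
    old = black + (old_peak - black) * fraction
    new = black + (old_peak - black) * fraction * boost
    print(f"{ire:3d} IRE: {old:5.2f} ft-L -> {new:5.2f} ft-L")

# 0 IRE stays put at 0.10 ft-L, while the light above black at every other
# level scales up by the full 30 percent -- super-super-dim becomes merely
# super-dim.
```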

(At this point in the discussion, I am going to start putting some of the text of this post as originally written in red. That means I changed something later on which made what the text says obsolete. For example, the first sentence of the next paragraph says I use, in effect, my Samsung's Dynamic-mode user preset. That was true when the sentence was written. It is no longer true.)

Along with the change to sub-contrast I also switched to employing, in effect, the TV's user-preset mode labeled Dynamic, which meant that the color saturation was boosted well above what the Avia test disc says is right. And I started turning Samsung's proprietary DNIe signal processing chip on, rather than leaving it off — or, in the case of the Samsung's DVI digital input, I came to realize that DNIe was always effectively on and could never be switched off for that input. Along with learning to appreciate DNIe's good points, I also learned, to my surprise, that a turned-down user sharpness control could nicely ameliorate DNIe's tendency to make the picture look "overenhanced."

Furthermore, I came to grips with the fact that my TV looks better in a well-lit room than in a dimly lit one, owing in part to its best imitation of "black" really being only a dark gray. Along these lines, I note for what it's worth that I was watching Star Wars I with the brightness or black level control left at a setting of 50, though Avia wants me to raise it to the high 50's to keep 10-IRE dark gray from being "swallowed" by 0-IRE black. (Perhaps later on I will try increasing the brightness/black level to see how that affects my subjective evaluation of the picture.)


What is it about a higher contrast ratio, though, that makes the Samsung's picture look so excellent? The answer is probably not a totally objective one; it's partly "psychovisual," if there is such a word. Admittedly, too, I don't fully know whereof I speak, but I'm led to believe that the perceptual apparatus of human vision is capable of ratcheting itself up and down, depending on how much illumination it's getting. It "wants" to interpret the contrast ratio of any photograph, computer graphic, film image, or TV image as being like that typically found in the real world, though for technical reasons no reproduced image can even come close. But the "ratchetability" of vision makes up for that.

Projected photographic images typically handle contrast better than "digital" TVs do. The contrast ratios of the best "digital" TVs usually aren't as impressive as those found in most cinemas, owing to their unprepossessing black levels. CRT-based displays usually have the best contrast ratios. Though they can't produce huge amounts of peak-white output, their black levels are so low — to the point of unmeasurability, in some cases — that their contrast ratios can be superb. They can even exceed the 10000:1 contrast ratio of the best cinema projection (see the article by Raymond Soneira downloadable here), reaching a 30000:1 contrast ratio or "dynamic range" in some instances.

If the eye couldn't ratchet, i.e., adapt to the dark, all TVs would look terribly dim and washed out. But the eye's response can ratchet down such that it "thinks" a TV's unimpressive-by-the-numbers contrast ratio looks pretty "okey-day," to quote Jar Jar Binks in Star Wars. That is, it does so as long as the TV's actual, objective, numerical contrast ratio isn't too weak.

When the TV's black-level output isn't all that correct, as happens with not only DLP-based TVs but also LCD, plasma, and other "digital" panels, then its peak-white output needs to be concomitantly high. That's what can keep the contrast ratio in the right ballpark for a convincing on-screen image.


According to Charles Poynton in Digital Video and HDTV Algorithms and Interfaces, the eye "can discern different luminances across about a 1000:1 range" (p. 197). That is, the highest contrast ratio actually "needed" by the eye is 1000:1. But the ratio between diffuse white and the black of a TV image need be no higher than 100:1.

The "decade" (order of magnitude) of contrast ratio separating 100:1 and 1000:1 in a TV image apparently is there in part to allow very localized, directionally reflected gleams — the so-called "specular" highlights in an image — to look properly bright, compared with the look of diffuse white that is spread over larger areas of the image (see p. 83). If specular highlights were reproduced in a TV image at the proper 10:1 ratio vis-à-vis diffuse white, then the 1000:1 ratio would be relevant to TV specification. They aren't, though, with little apparent ill effect — which is why diffuse white isn't encoded for television at a mere 10 IRE!

Thus 100:1 is the figure of merit for a TV's contrast ratio, says Poynton. (Other sources up that conservative figure to "several hundred to one.") That the figure is so low is indeed fortunate because, he says, "In practical imaging systems many factors conspire to increase the luminance of black, thereby lessening the contrast ratio and impairing picture quality. On an electronic display or in a projected image, simultaneous contrast ratio [that evidenced in any one frame of the image] is typically less than 100:1 owing to spill light (stray light) in the ambient environment or flare in the display system."

Says Poynton, accordingly, cinema can furnish a "simultaneous" contrast ratio — that in a single frame of film, as opposed to successive or "sequential" frames — of 80:1. (Sequential film contrast ratios can reach 10000:1.) Meanwhile, a typical TV in a typical living room can have a simultaneous contrast ratio of just 20:1 (see Table 19.1, p. 198).


The contrast ratio of a TV can be measured in several ways, I have learned from various home-theater enthusiast magazines. Most of these ways report ratios much higher than Poynton's conservative 20:1 or 100:1.

One way to measure contrast ratio in a TV display is "full-on/full-off," the principal method used by Home Theater Magazine. A brief discussion of the method can be found here. The basic idea is that a full-field 100-IRE test pattern — i.e., every pixel is at "peak white" — is displayed and the amount of luminance produced by the screen is measured. Then the TV's luminance output for a full-field 0-IRE pattern — every pixel at darkest "black level" — is metered. The two measured luminances, expressed in foot-Lamberts (ft-L or fL), are used to form the contrast ratio.

For instance, HT measured the InFocus 7205 DLP Projector's 100-IRE output at 22.37 ft-L (see this web page). 0 IRE was at 0.022 ft-L. The contrast ratio of 22.37 to 0.022 equals approximately 1,017:1.

Some equipment testers measure luminance not in foot-Lamberts but in candelas per square meter (cd/m²). To convert ft-L to cd/m², multiply by 3.43. For example, 22.37 ft-L times 3.43 equals roughly 77 cd/m².
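
If you want to check the arithmetic yourself, here it is in a few lines of Python (the 3.43 factor is the usual rounding of 3.426):

```python
# Full-on/full-off contrast ratio from HT's InFocus 7205 readings, plus the
# foot-Lambert to candela-per-square-meter conversion.

FTL_TO_CDM2 = 3.426     # 1 ft-L is about 3.426 cd/m^2 (often rounded to 3.43)

white_ftl = 22.37       # full-field 100-IRE reading, in ft-L
black_ftl = 0.022       # full-field 0-IRE reading, in ft-L

print(f"full-on/full-off contrast ratio: {white_ftl / black_ftl:.0f}:1")  # ~1017:1
print(f"peak white: {white_ftl * FTL_TO_CDM2:.1f} cd/m^2")                # ~76.6 cd/m^2
```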

More modest measured contrast ratios are typically obtained by means of the ANSI contrast-ratio measurement, in which eight 0-IRE "video black" rectangles are checkerboarded among eight 100-IRE "peak white" rectangles on screen, and the average luminance of the white rectangles is divided by the average luminance of the black ones to form the ratio. The ANSI ratio measured by HT for the InFocus 7205 DLP Projector was 355:1 — still a lot higher than 20:1 or 100:1.
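
The bookkeeping for an ANSI-style measurement looks something like this — the sixteen meter readings below are invented just to show how the averaging works:

```python
# ANSI checkerboard contrast: meter the eight white rectangles and the eight
# black rectangles, average each group, and form the ratio of the averages.
# These readings are invented purely to illustrate the arithmetic.

white_readings_ftl = [21.9, 22.4, 22.1, 22.6, 21.8, 22.3, 22.0, 22.5]
black_readings_ftl = [0.061, 0.064, 0.062, 0.066, 0.060, 0.063, 0.062, 0.065]

avg_white = sum(white_readings_ftl) / len(white_readings_ftl)
avg_black = sum(black_readings_ftl) / len(black_readings_ftl)

print(f"ANSI contrast ratio: {avg_white / avg_black:.0f}:1")  # ~353:1 here
```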

Most of the newer "digital" TV technologies appear to exceed 20:1 or 100:1 quite easily. Amazon.com's description of my particular TV model, Samsung's HLN617W, located here, mentions a 1000:1 contrast ratio. Another review, located here, says my TV's contrast ratio is fully 1500:1. Neither of these figures is apt to represent the more conservative ANSI measurement, be it noted. Even so, I wouldn't be surprised if the Samsung's measurable ANSI contrast ratio is way up there in the high triple digits.


I don't necessarily agree with HT that full-on/full-off contrast ratios are all that useful. I think the ANSI method is much more useful. A third possible method is to replace the full-on 100-IRE field with a 100-IRE window test pattern, in which the middle 25% of the screen is occupied by a rectangle at peak white, surrounded by nothing but video black at 0 IRE. The luminance at which the 100-IRE window is displayed is used as the top term in forming the contrast ratio. The bottom term is again derived from a 0-IRE full field.

The reason either the ANSI method or the 100-IRE window is better, in my opinion, is that some displays and projectors are designed to dim luminance somewhat for full fields that contain a lot of very bright pixels. Plasma flat panels, for instance, throttle back on electrical power usage for ultra-bright scenes, in order to restrain the operating temperatures of the panel. And many front projectors have an automatic iris or aperture control which does the same sort of thing in order to avoid blinding viewers with sudden brightness increases.
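
Here's a toy Python model of why that throttling skews the full-on/full-off number — the dimming factor and the luminances are pure inventions on my part:

```python
# Toy model of a display that dims itself on very bright full fields (as
# plasmas and auto-iris projectors do). All numbers are invented.

window_white_ftl = 30.0    # 100-IRE window pattern: no throttling kicks in
full_field_factor = 0.60   # assume output drops to 60% on a full 100-IRE field
black_ftl = 0.03           # full-field 0-IRE reading

full_on_ftl = window_white_ftl * full_field_factor

print(f"full-on/full-off ratio: {full_on_ftl / black_ftl:.0f}:1")       # 600:1
print(f"window/full-off ratio:  {window_white_ftl / black_ftl:.0f}:1")  # 1000:1

# The window-based figure better reflects how a bright object set against a
# dark background will actually be displayed, which is why I prefer it (or
# the ANSI method).
```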

The eye, after all, adapts to the amount of light it's seeing by changing the diameter of its pupil, by modifying the amounts of the various pigments present in the cells of the retina, and by reconfiguring "neural mechanisms in the visual pathway," Poynton says (p. 196). Its 1000:1 or 100:1 usable contrast ratio is thus like an elevator that can "ascend" or descend the "shaft" of visible luminance.

This shaft visits eight "floors" above the "basement" level where light is just too dim to be seen at all. Each floor represents a tenfold increase in luminance, so the top (eighth) floor represents luminance that is 100 million (10^8) times that of the basement.

As the eye is exposed to different levels of luminance, it adapts: the elevator, which itself represents only a 100:1 contrast range, ascends or descends. Ignoring, accordingly, the upper "decade" of the 1000:1 contrast ratio the eye is capable of accepting, and restricting attention to the 100:1 ratio he claims is really what's important in television, Poynton makes this point: "Loosely speaking, luminance levels less than 1% of peak [I assume, diffuse] white appear just 'black'" (p. 197).

Once the eye adapts to a TV's inherent dimness, a TV screen "looks" just as bright as nature would outside the window, if the curtains were opened. That is, once the eye has adapted, any image source that provides a 100:1 contrast ratio (Poynton's figure) or better turns luminances less than 1 percent of the maximum into "black."

It would seem to follow that my boosting sub-contrast, which raised the applicable maximum light-output capability of my Samsung TV, turned its very darkest grays into pitch black, as far as my eyes were concerned. And a lot of slightly lighter grays moved closer to the magic 1% "black" cutoff, as it were, and appeared darker to my eyes.

Put another way, the sub-contrast boost I made effectively sent my eyes' two-decade luminance elevator "up a floor" (or perhaps just part of a floor). The number of low-IRE levels of dark gray which my eyes could not distinguish from true black thereby increased.
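
A quick Python sketch of that "up a floor" effect, using Poynton's rough one-percent rule — the peak-white and near-black luminances are made up, continuing the guesses from the earlier sketch:

```python
# Sketch of Poynton's rough rule: once the eye is adapted, luminances below
# about 1% of peak white just read as "black." All numbers are invented.

old_peak, new_peak = 30.0, 39.0   # ft-L peak white before/after the ~30% boost

def looks_black(level_ftl, peak_ftl):
    """True if a level falls below roughly 1% of the adapted peak white."""
    return level_ftl < 0.01 * peak_ftl

# A handful of made-up near-black luminances the set might produce:
for level in (0.25, 0.32, 0.36, 0.45):
    before = "black" if looks_black(level, old_peak) else "dark gray"
    after = "black" if looks_black(level, new_peak) else "dark gray"
    print(f"{level:.2f} ft-L: before the boost -> {before}, after -> {after}")

# Raising peak white from 30 to 39 ft-L moves the 1% cutoff from 0.30 up to
# 0.39 ft-L, so a few formerly distinguishable dark grays now read as black.
```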


This effect may have been abetted by what Poynton calls the eye's "contrast sensitivity." This, the lowest discernible "ratio of luminances between two adjacent patches of similar luminance," represents a "threshold of discrimination" of human vision (see pp. 198ff.).

If two fairly dark patches of similar, but not quite equal, luminance are set against a fairly light background, the relatively bright "surround luminance level ... fixes the observer's state of adaptation." When the surrounding luminance falls in a broad range that is fairly bright but not too bright, the eye can distinguish between dark grays that are just 1% apart in luminance level. That might correspond to the difference between 10 and 10.1 IRE in a video signal. Put another way, the critical contrast ratio is just 1.01:1.

But if the surrounding luminance is made either very low or very high, the eye's ability to distinguish among subtly different dark grays diminishes a bit. I'm not sure this subtle change in contrast sensitivity with very low or very high surround luminance is enough to explain anything I happen to be "seeing," but I mention it just in case it does.

Still, the fact remains that there is a limit to human contrast sensitivity, such that subtly different luminances in the range of 1% of the white level to which the eye is adapted cease to be distinguishable as the white "surround" level is raised. When I raised the TV's white level by boosting its service-mode sub-contrast setting, and when I also turned on the room lights, I made formerly distinguishable dark grays look black. This effect was only enhanced by my keeping user brightness "too low" at 50 instead of, say, 58.


What I seem to be saying is that a TV such as mine with a questionable (because way too high) inherent black level looks best when one turns that "lemon" into lemonade.

When the black level of a "digital," non-CRT-based TV isn't anything to write home about, as it usually isn't, try using various strategies to get the TV to "swallow" the most subtle of its shadow details:

(1) Raise user-accessible contrast and service-menu sub-contrast as high as possible, short of "crushing" peak whites. If you can't distinguish 98 IRE from 100 IRE, you've got crushed whites. The relevant Avia test patterns are set up to reveal white crush, if it occurs. On my Samsung, luckily, there appears to be no contrast level whatsoever at which white crush actually does enter the picture.

(2) Keep the viewing room relatively brightly lit, not darkened as you would normally expect. But avoid having so much ambient light that it bounces off the TV screen and impairs black levels even further. Some aficionados recommend putting indirect lighting behind the TV for this purpose.

(3) Experiment with user brightness settings that are lower than Avia recommends. If brightness is set "too low," anything below (say) 10 IRE will look like 0 IRE — but the overall picture will probably look like it has more contrast, which pleases the eye.


Another factor in my current situation is that I am using a color saturation setting of 65, as in my Samsung's default Dynamic mode ... where Avia "recommends" just 42.

This "wrong" setting of my TV's color control is intimately related to the "wrong" sub-contrast and user brightness settings I am using, as well as the "wrong" amount of ambient room lighting. How it is related is suggested by something Dr. Raymond Soneira writes in the September 2004 issue of Widescreen Review. His article, "Display Technology Shoot-Out, Comparing CRT, LCD, Plasma, & DLP — Part I: The Primary Specs," may be downloaded here.

Dr. Soneira is head honcho at DisplayMate, marketers of computer software which generates test patterns for the scientific measurement of television displays' capabilities using professional instruments. He knows whereof he speaks.

With respect to the topic of achievable black level, he calls it "the [TV's] capability of suppressing light output." He says of a "poor black level" that it "lifts the bottom end of the display's intensity scale and introduces errors in both intensity and color throughout the entire lower end of the scale, not just at the very bottom. All displays produce some light in the form of a very dark-gray when asked to produce a black. This background light is added to all of the colors and intensities that the display is asked to produce. This washes out the dark grays and also the dark colors. For example, dark reds will appear as shades of pink."

Taking that logic a step further, we can conclude that shades that start out at a light-to-moderate pink — such as many flesh tones — can become downright colorless when they are cast in deep shadow. Skin in low-light situations can turn gray, if the TV's inherent black level is too high.

And that's one of the main things that was bothering me in the first place about my Samsung's shaky low-light, near-black renditions. Certain portions of certain faces were just too gray.

I think that's why I'm presently enjoying a color saturation setting — with my user color control set at 65 — that's nominally way too high. Put briefly, it offsets the "gray-face problem." That is, it balances out the tendency of my set's inadequate black level to wash out color in low-light situations, forestalling an effect which I find personally quite objectionable. Even if the color picture is nominally too saturated at higher IRE levels, I don't find that fact particularly objectionable, or even noticeable.


All of which points up an interesting overarching principle: the rules be damned. The rules prescribe buying a TV whose black level is so low it's hard to measure with instruments, and then watching it in a pitch-black environment (or nearly so). The TV's brightness or black level control needs to be set ever so carefully ... of course, after the TV has had its grayscale and gamma curve professionally set up. Then the TV's color and tint controls need to be tweaked to scientific perfection. And all of the TV's "signal-improving" circuits along the lines of Samsung's DNIe must, of course, be disabled. When all of that has been done, you may be lucky enough to obtain a dynamite picture.

But what happens if the TV can't suppress its light output all that well, for purposes of rendering "correct" video black? What if the TV is placed in a multipurpose viewing environment that is fairly well-lit? What if those two compromises call for yet others, such as turning DNIe on or resorting to "incorrect" Dynamic-mode user settings?

Then the rules be damned, I'm slowly learning. Trust your eyes, Luke!

***

I turned the preceding material red because I now want to reconsider it. Last night I put on the Episode II: Attack of the Clones DVD. (I'm on a Star Wars kick.) This "film" is not actually a film at all, since George Lucas "filmed" it in HDTV!

That is, he shot it in a digital video format called 1080p24, which means his camera created progressively scanned (not interlaced) video frames, 1,920 pixels across by 1,080 pixels high, at a rate of 24 frames per second (the traditional rate for film frames). He recorded what he was shooting on some kind of digital video recorder or computer, not on celluloid. The only time celluloid entered the picture was after the movie was completed. The movie's video frames, in the reverse of the customary process, were transferred to film for projection in traditional cinemas. Some cinemas, however, used digital video projectors, avoiding film altogether.

When it came time to make the Clones DVD, the original 1080p24 video was simply downconverted to the necessary 480i (or is it 480p?) and then MPEG-2 encoded. The result is a superb, reference-quality DVD with virtually no video artifacts of any kind, stunning colors, deep blacks, excellent contrast, etc. My only complaint is that the total absence of film grain is downright eerie!

But as I was first watching Attack of the Clones, I initially felt dissatisfied with the image ... until I bethought me to try the Avia-recommended settings for brightness and color. I turned color down from Dynamic mode's rather burly 65 to 47, and I turned brightness up from 50 to 58. That made the picture about as close to perfect as I ever expect to see!

So I eat the words I wrote above, now crimsoned in shocking red. Maybe the "rules" for achieving a good video image ought not be cast aside quite so blithely.
