Thursday, June 30, 2005

HDTV, or EDTV?

In Circuit City's flier for its Fourth of July 2005 weekend sale, a brand-name-unidentified ("models may vary by store") 42" plasma TV is offered for just $1,499.99, after $300 "instant savings." It's shown as a Widescreen Flat Panel Plasma Enhanced Definition TV. That term, "enhanced definition" — what's up with it?

Enhanced definition television, also known as "ED television" or EDTV, is a level of picture detail less than true high definition but more than standard definition. Simple, huh?

The basic measure of picture detail is how many scan lines there are, plus how often they're updated or refreshed. In 720p HDTV, there are 720 horizontal lines on the screen, and they're all updated at once, every 1/60 second. The fact that they're updated all at once is called "progressive scanning." So HDTV is at least 720p, where 720 gives the number of scan lines and "p" (for "progressive") gives the method of scanning them.

Alternatively, HDTV can be 1080i: 1,080 scan lines, using "interlaced scanning." In interlaced scanning, the odd-numbered scan lines are refreshed in the first 1/60 second and the even-numbered in the second 1/60 second. Then the scanning process begins all over again.

Many of today's TVs use a number of scan lines between 720 and 1,080, such as 768. They convert 720p and 1080i to a scanning format compatible with that particular number of scan lines. In so doing, they generally wind up with a progressive-scan picture. Though they don't use either 720p or 1080i "natively," they're still high-definition, since their screen output format – call it, for instance, "768p" — meets or exceeds 720p.


Enhanced-definition TVs also use progressive scanning, à la 720p, but they use a number of scan lines less than 720. The typical number of scan lines is 480, so most EDTVs natively are 480p TVs.

Not coincidentally, 480p is the type of signal furnished by a progressive-scan DVD player. DVDs are encoded at 480i, which means there are 480 scan lines in the video frame, but they're scanned in the odd-even interlaced pattern, just like standard-def TV and 1080i HDTV. The progressive-scan player can "deinterlace" the two interlaced fields that make up each video frame. Deinterlacing works especially well when the DVD is based on movie material, since the player can basically reconstruct the original film's frames one at a time, in all their glory.

So an EDTV that uses 480 scan lines is a perfect partner for a progressive-scan DVD player. It can give you all the picture detail that's present on a DVD.

An EDTV can also accept and display 720p and 1080i signals. It's just that these truly high-def formats are "downconverted" or "scaled" to 480p by the EDTV, which means they lose a noticeable fraction of fine picture detail.

An EDTV can even accept 720p/1080i signals directly from an over-the-air antenna if it contains a built-in digital tuner. Such a tuner-equipped EDTV is sometimes called "ED built-in." Or, it can be called an "integrated EDTV." An "HD built-in" TV set or "integrated HDTV" is just the same, except that its output definition meets or exceeds 720p. Get it?


The second important aspect of enhanced or high definition television, vis-à-vis standard definition TV, is the widescreen, 16:9 aspect ratio. Standard-def TV is stuck with a squarish, uncinematic 4:3 aspect ratio.

Often, an EDTV is not identified as such in advertising, by any of its usual monikers. All you know from the ad, sometimes, is that it's widescreen, and that it's not labeled HD. If you look at the fine print, you may be lucky enough to find out what its screen resolution is: say, 852 x 480.

The first number is (usually, but not always) the number of pixels (picture elements) across the screen in each scan line. The second number is the number of pixels high the screen image is. That number, not surprisingly, equals the number of scan lines. (In some TV models, sadly, I've seen the two numbers reversed. That can be confusing indeed.)

If you multiply the second number by the widescreen aspect ratio — 16:9, or 1.78 — you should get (roughly) the first number. This is because the pixels are usually square. Their width is the same as their height.
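If you want to check that arithmetic yourself, here's a quick sketch in Python (the helper function name is mine, just for illustration):

```python
# Sanity-check "widescreen" resolutions: with square pixels, the width
# should be roughly the height (scan-line count) times the 16:9 aspect ratio.

def implied_width(height, aspect=16 / 9):
    """Pixel width implied by a scan-line count, assuming square pixels."""
    return round(height * aspect)

for width, height in [(852, 480), (1280, 720), (1920, 1080)]:
    print(f"{width} x {height}: 16:9 and square pixels imply ~{implied_width(height)} across")
```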


So an EDTV represents a viable option for folks who simply don't want to shell out the really big bucks for a pricey high-definition TV set just yet.

As I write this, Circuit City is selling a Panasonic 42" Plasma EDTV, model TH-42PD50U, for $2,250. This model is a step up in quality from the brand-unidentified one I mentioned earlier. Its 852 x 480 flat-panel screen can do wonders for DVD viewing at full resolution, and it can also permit pretty-darn-good HDTV viewing at reduced resolution.

This plasma flat panel has a built-in digital tuner for over-the-air reception. This particular tuner is called an "ATSC tuner," in the jargon, since standard-def reception is always done with an "NTSC tuner." (The Panasonic also has one of those built in.) Notice the changeover from the first letter being "N," for standard-def analog TV, to the first letter "A" for extended- or high-def digital TV.

The Panasonic TH-42PD50U even is digital-cable-ready, since it also has a built-in "QAM tuner," the type of tuner that is used for digital cable as opposed to digital over-the-air reception. (In order to take full advantage of the QAM tuner, you will probably have to rent a credit-card-size "CableCard" from your local cable-TV company and insert it into a slot in the TV. Without the CableCard, you'll be able to receive unscrambled digital cable channels, but to view scrambled ones, you'll need the card.)

Oh, and here's one more little secret. If you buy an EDTV like the Panasonic TH-42PD50U, your friends and neighbors don't ever have to know it's not true hi-def!

Sunday, June 26, 2005

Luminance Contouring

My latest magnum opus, HDTV Quirks, Part III, Contrast Ratio, said a lot about contrast ratios in HDTV displays. In it, I mentioned that I boosted sub-contrast in the service menu of my Samsung DLP rear projector, thereby increasing its effective contrast ratio: the luminance produced for peak video white divided by that for a 0-IRE, black-level signal. In so doing, I noted that I ameliorated a problem that I had originally associated with too low gamma, the exponent which bends the output-luminance-vs.-input-voltage curve for the TV such that shadows are rendered deeper, or shallower, than they otherwise would be.

Contrast ratios and gamma exponents are part of a much broader subject one might dub "luminance contouring." "Contrast tailoring" would be an equally valid name for it.

The basic idea is that each pixel of digital video (ignoring associated "color difference" components) has a certain numerical level indicating intended luminance: a coded number in the range of 16 through 235 — or 0 to 255, for computers; I'm going to stick to the 16-235 digital video range.

Luminance code 16 corresponds to "black"; in terms of analog signal voltage, which usually is expressed not in pure volts but in IRE units, code 16 is 0 IRE.

Digital luminance code 235 corresponds to the maximal, 100-IRE analog voltage level: "peak" white. Peak white is also called "reference" white.

Digital codes below 16 constitute the so-called "footroom" reserved for "blacker-than-black" signals; those over 235 are "headroom" for "whiter-than-white" signals. (The 0-255 system used for computers doesn't allow for headroom or footroom.)

All codes between 16 and 235 accordingly represent different shades of gray. Code 17 stands for the darkest possible gray (i.e., that nearest to black), code 234 for the lightest (i.e., the one nearest white).

Both the 16-235 system and the 0-255 system are 8-bit end-user or video interface systems, in which each digital code for luminance uses a single 8-bit byte. Studio or video processing applications often use a 10-bit system for greater precision.

(To be quite exact, what is encoded in this way is not luminance but luma, video geek-speak for luminance signals that have been "gamma-corrected." This means that luminance has had its range of intended shades of gray precompensated for display on a CRT, whose inherent gamma exponent would otherwise render shadows too dark and deep. Luma is what is actually represented by codes from 16-235. I'm going to continue to say "luminance" when I really mean "luma," though it's not something a purist would appreciate.)
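To make the 16-235 encoding a bit more concrete, here's a minimal Python sketch that maps a video-range luma code to its approximate IRE level (the function name is my own, not anything from a video standard):

```python
def code_to_ire(code):
    """Approximate IRE level for an 8-bit video-range luma code.

    Code 16 is black (0 IRE); code 235 is reference white (100 IRE).
    Codes below 16 are "footroom," codes above 235 are "headroom."
    """
    return (code - 16) / (235 - 16) * 100

for code in (16, 17, 126, 234, 235):
    print(f"code {code:3d} -> {code_to_ire(code):6.1f} IRE")
```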


Now that the basic idea of luminance encoding has been laid out, a good question is: at what actual level of output luminance will a particular TV render each of these codes?

Ideally, it will render code level 16 with no light output whatsoever. In the real world, no TV can do that, however. The best it can do will be some very low level of luminance. The exact level it will use is under control of the "brightness" setting chosen by the user. If "brightness" is set too high, the TV will emit too much light at its so-called "black level."

If, on the other hand, "brightness" is set too low, some range of codes above 16 will look just as "black" as code 16 looks. Such a TV is said to "swallow" its shadow detail.

Once "brightness" is properly set, the luminance that the TV will produce for code-235 reference white is adjusted via the "contrast" or "picture" control. The higher "contrast" is set, the greater the TV's luminance output for a code-235 (or 100-IRE) signal.

But a digital TV's "contrast" must not be set so high that whites are "crushed." A 98-IRE stripe (at, say, code 230) must continue to be distinguishable by the eye from a 100-IRE (code-235) background.

On a CRT, too-high "contrast" can cause general geometric distortion and/or "blooming" (enlarging) of white areas on the screen. It can also turn pure whites to brownish shades.


That pretty much covers setting up a TV's levels for black and for reference white. What about all the shades of gray in between?

This is where gamma, assuming it's selectable or adjustable, comes in. The higher the gamma exponent we choose, the more "reluctant" the TV becomes to display "dark" IRE levels, or low-numbered digital codes, with copious luminance. Shadows stay deeper/darker longer as the general level of ambient lighting in the scene goes up.

Say a pixel has digital code 46, making it just 30 of a possible 219 steps up from "black" at code 16. It's supposed to be dark ... but how dark? If gamma is relatively low — say, less than 1.8 — it will show up as a lighter shade of gray (ignoring hue) than it would if gamma were, let's say, 2.5 or over.
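Putting rough numbers on that, assuming a simple power-law gamma (which, as I discuss below, a digital set only approximates):

```python
def relative_output(code, gamma, black=16, white=235):
    """Relative light output (0.0 to 1.0) for a luma code under a pure power-law gamma."""
    signal = (code - black) / (white - black)   # normalize the code to 0.0-1.0
    return signal ** gamma

for gamma in (1.8, 2.2, 2.5):
    print(f"gamma {gamma}: code 46 displays at about {relative_output(46, gamma):.1%} of peak white")
```

The lower the exponent, the more light that "dark" pixel gets.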


The kicker here is that digital TVs don't "do gamma" as a single exponent that applies at all code levels between 16 and 235. As the code level ascends from 16 to 235, gamma may change. This is because gamma, which is an inherent characteristic of a CRT, is simulated on a digital display.

That is, it's computed.

Or, rather, the output luminance level for each possible input code is computed, in advance, and stored in a lookup table. When an input pixel arrives, its code is looked up in the table, and the output luminance value associated with that pixel is located and used.

When you change the TV's "contrast" setting, the table is recomputed. Ditto, when you change user "brightness."
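Here's a crude sketch of what such a lookup table might look like, purely as a mental model. I have no idea how Samsung actually builds its tables; the gain/offset/gamma arithmetic below is my own assumption:

```python
def build_lut(contrast=1.0, brightness=0.0, gamma=2.2):
    """Hypothetical lookup table: output luminance (0.0-1.0) for every input code 16-235.

    Assumed model: normalize the code, apply a power-law gamma, scale by
    "contrast" (gain), offset by "brightness," and clip to the displayable range.
    """
    lut = {}
    for code in range(16, 236):
        signal = (code - 16) / 219
        out = brightness + contrast * signal ** gamma
        lut[code] = min(max(out, 0.0), 1.0)
    return lut

table = build_lut(contrast=0.85, brightness=0.02)
print(f"code  40 -> {table[40]:.4f}")
print(f"code 216 -> {table[216]:.4f}")

# Changing "contrast" or "brightness" means recomputing the whole table.
table = build_lut(contrast=1.0, brightness=0.02)
print(f"code 216 after raising contrast -> {table[216]:.4f}")
```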

If I go into the service menu of my Samsung and change its GAMMA parameter, the lookup table is in effect recomputed then as well.

If I turn on DNIe — a function which may be turned on or off under user control, but not for DVI/digital video input — the table is again in effect recomputed. (In actuality, there is a lot more processing of the input data going on than that. Before the table lookup occurs, there is a lot of comparing of pixels, one to another, preliminary to "enhancing" the picture. Each input code is apt to be replaced by DNIe with a "better" one. Then the table lookup occurs.)

My intention here, be it understood, is not to boggle anyone's mind. It is rather to make clear that a digital TV like my Samsung DLP can choose any output luminance it damn well pleases for any particular input code it receives!


That fact is important because it has to do with how my TV "tailors" its contrast or "contours" its luminance response. Specifically, it tailors its various output levels of black, gray, and white in ways that may very well remain completely inscrutable to anyone such as I who is not "smarter than the average bear."

To wit, I went into the Samsung's service menu and boosted "sub-contrast" — actually, by name, S_CT(DDP) — a parameter that nominally works much like a user "contrast" control: it elevates the luminance output for reference white at code 235. In proportional fashion, nominally, it also raises luminance output for all codes from 17 through 234. In so doing — again, I say, nominally — it can be thought to have no effect on gamma.

Yet to my eyes it seems to have increased the underlying gamma exponent of the computation by which the TV derives its output luminance levels from input code levels.

As bright parts of scenes got yet brighter, because of the boosted sub-contrast, dark parts stayed relatively dark. The contrast ratio between, say, "bright" code-200 pixels and "dark" code-40 pixels seemed to have been stretched more than I would have anticipated. And I couldn't really say how or why.


The explanation may well have to do with the quirky way the eye responds to various stimuli. If the overall scene brightness is high, the eye adapts upward. Its ability to "see into shadow" goes way down.

But I don't think that's the whole explanation. If it were, then an extended dim or dark scene would let my eye adapt to it, and it wouldn't "stay dark" in my estimation for long. Notably, the eye adapts faster to the dark than it does to the light, which is why it takes so little time to get accustomed to a dark movie theater, yet it's downright painful, for quite a while, to re-emerge from the darkened theater into the sunlight.

But I don't think I'm seeing darker shadows than I did before simply because I'm forcing my eyes to adapt to higher average levels of light in the image. I think the sub-contrast boost has, all by itself, changed the way my Samsung computes output luminances based on input code values.

Fundamentally, this computation is a mapping operation. Each input code (after being twiddled by DNIe, etc.) is mapped to an output luminance by virtue of (I continue to assume) a table lookup. I'm using a sub-contrast setting of 150, which seems to be the highest setting that actually boosts light output from the screen. My main "contrast" control is set at 100, its maximum. Somehow, the 150 and the 100 get combined together to determine how a code-216 input pixel gets displayed.

Somehow, the 150 and the 100 also work together to determine at what luminance level, say, a relatively dark code-40 input pixel is displayed.

The function by which such determinations are made just may be complex enough that changing the 150 or the 100 will have surprising effects on the underlying (ersatz) gamma of my DLP TV.

These surprising effects on gamma may, furthermore, be different for sub-contrast changes than for "main contrast" changes.

And that's about as far as I think I can take this subject of "contrast tailoring" or "luminance tailoring" as it applies to my Samsung DLP-based HDTV. Without actually using test signals and expensive instruments to find out what's really going on, I may never know.

***

I'd now like to make some additions. First of all, last night after I'd written the above, I watched the second half of Star Wars II: Attack of the Clones. I decided the picture was improved just a tad by lowering the sharpness setting all the way down to 0 (it had been at 20 for Star Wars I: The Phantom Menace). That kept various edges from looking "too hard," given that DNIe was on. It may also have been responsible for keeping highlight details such as the whites of eyes from looking "too bright."

And I reduced the main "contrast" setting from 100 to 85 to keep from searing my eyeballs. I found, a bit to my surprise, that this second adjustment had no apparent effect on the TV's luminance contour at the low end of the IRE range. The apparent "gamma" of the image remained deep and dark. The color saturations in dark scenes and in the dark parts of bright scenes remained satisfying.

Here, then, is a table summarizing all I have done:


SETTING          BEFORE                   AFTER
GAMMA            4                        4 (UNCHANGED)
SUB-CONTRAST     115 (ORIGINALLY 90)      150
DNIe             OFF                      ON
MAIN CONTRAST    100                      85-100
BRIGHTNESS       50                       58
SHARPNESS        IRRELEVANT               0-20
COLOR            65                       47

Strictly speaking, these values apply only to 480p input from my DVD player. There are subtle differences between that picture and the ones I get via 720p/component and 720p/DVI from my cable box. Still, even if some of the actual settings differ, the general ideas are the same:

First, as a group these controls and settings interact in rich and surprising ways.

Second, working with them all as a group can effect what amounts to a change in the TV's "gamma curve." (Even though GAMMA itself is left untouched.)

Third, in general, sub-contrast must be maxed out.

Fourth, with sub-contrast maxed out, main contrast can be used to rein in the candlepower of the display for different source material.

Fifth, DNIe is our friend. Use it.

Sixth, to keep DNIe from being too assertive, turn sharpness all the way down. (For some source material, upping it from 0 to a low value — say, 20 — may be indicated.)

Seventh, "standard" settings to color and brightness, à la what Avia recommends, work fine.

Note that GAMMA and sub-contrast are accessible only in the service menu. The others are accessible only in normal user mode. That's why it's hard to test the unexpected interactions among various combinations of these settings ... especially since going into the service menu bypasses the user settings until you go back into normal user mode and switch from Dynamic mode (the default while in the service menu) back to Custom settings.

So to a certain extent there is a limit to how scientific I can get in detailing (much less understanding) how these items interact. But interact they do, I firmly believe, in unexpected ways covered in no textbook or enthusiast magazine that I know of.

Saturday, June 25, 2005

HDTV Quirks, Part III, Contrast Ratio

I've admittedly been going on and on about gamma, the TV display characteristic which determines how brightly all the various levels of luminance appear on the screen. (My most recent installment in that vein was Tweaking Gamma on My Samsung DLP, Part II.) Higher gamma makes shadows seem deeper, while lower gamma opens them up for easier inspection — that's about the simplest way I know to explain the gamma concept.

The reason I investigated gamma at all is that I was dissatisfied with how my Samsung 61" DLP-based HDTV, a rear-projection unit, displayed low-luminance signals, by which I mean low-lit images that are much, much closer in overall luminance to the 0-IRE "black level" of a video signal than to 100-IRE peak white. In other words, there was something uninspiring about how the picture looked when there was nothing particularly bright in the scene.

So I assumed that the proper remedy might be changing the GAMMA parameter in the Samsung's service menu. I proceeded to play with the GAMMA settings and to learn quite a bit about the gamma concept in theory and practice ... only to find the original "factory" setting, GAMMA 4, to be the best of the bunch in the end. Though GAMMA 4 gave me the lowest "true gamma" of any available setting save GAMMA 3 (apparently intended for letting external gear determine the gamma curve), and even though that gamma value was, as best I could tell, far lower than what the "experts" say is correct, it was what I liked best.

It seemed best, that is, once I had boosted sub-contrast in the service menu up to its absolute practical maximum, meaning that peak video white was now being output with the greatest light intensity the TV was capable of. Though that had no effect on the TV's 0-IRE black level, it did have a proportional effect on every luminance level above 0 IRE. Which means that video information at a relatively dark 20 IRE, say, was being displayed brighter than before.

GAMMA 4 really shone with boosted sub-contrast when I watched the Star Wars I: The Phantom Menace DVD. In fact, I'd say that was the most thrilling home video experience I've had to date. There were lots of dark scenes which looked just fine, and lots of bright ones that pretty nearly seared my retinas, looking very cinema-like indeed. Scenes with bright pinpoints of stars set against black sky looked truly convincing. And so on.

Which seems to mean that what resembles a "gamma problem" can really have a "contrast-ratio solution."


A TV's contrast ratio is the ratio between how bright 100-IRE peak video white is displayed and how dark 0-IRE true video black is rendered. When the main contrast control, the one in the ordinary user-accessible menu, is raised to its maximum setting of 100, the TV's entire available contrast ratio is put to use — assuming, that is, that the service menu's sub-contrast setting isn't holding it back.

By boosting sub-contrast, I made the Samsung's effective contrast ratio equal its maximum available contrast ratio. The percentage by which I boosted sub-contrast and thus the luminance of peak white was, I'm guessing, 30 percent. (It's hard to tell exactly what the percentage gain was, because the numbers corresponding to sub-contrast settings don't necessarily track linearly with luminance output.) Taking that number at face value, that meant that every luminance level above 0 IRE was also rendered 30 percent brighter than before. Even, say, lowly 10 IRE went from being super-super-dim to just plain super-dim.
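The arithmetic behind that, as a back-of-the-envelope sketch (the luminance figures are hypothetical, not measurements from my set):

```python
def contrast_ratio(peak_white, black_level):
    """Full-on/full-off style ratio: peak-white luminance over black-level luminance."""
    return peak_white / black_level

black = 0.05                        # hypothetical black level, arbitrary units
white_before = 30.0                 # hypothetical peak white before the boost
white_after = white_before * 1.30   # the guessed 30 percent sub-contrast boost

print(f"before: {contrast_ratio(white_before, black):.0f}:1")
print(f"after:  {contrast_ratio(white_after, black):.0f}:1")
```

Since the black level stays put, the whole 30 percent shows up in the ratio.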

(At this point in the discussion, I am going to start putting some of the text of this post as originally written in red. That means I changed something later on which made what the text says obsolete. For example, the first sentence of the next paragraph says I use, in effect, my Samsung's Dynamic-mode user preset. That was true when the sentence was written. It is no longer true.)

Along with the change to sub-contrast I also switched to employing, in effect, the TV's user-preset mode labeled Dynamic, which meant that the color saturation was boosted well above what the Avia test disc says is right. And I started turning Samsung's proprietary DNIe signal processing chip on, rather than leaving it off — or, in the case of the Samsung's DVI digital input, it meant realizing that DNIe was always effectively on and could never be switched off for that input. Along with learning to appreciate DNIe's good points, I also learned, to my surprise, that a turned-down user sharpness control could nicely ameliorate DNIe's tendency to make the picture look "overenhanced."

Furthermore, I came to grips with the fact that my TV looks better in a well-lit room than in a dimly lit one, owing in part to the fact that its best imitation of "black" is really only a dark gray. Along these lines, I note for what it's worth that I was watching Star Wars I with the brightness or black level control left at a setting of 50, though Avia wants me to raise it to the high 50's to keep 10-IRE dark gray from being "swallowed" by 0-IRE black. (Perhaps later on I will try increasing the brightness/black level and seeing how that affects my subjective evaluation of the picture.)


What is it about a higher contrast ratio, though, that makes the Samsung's picture look so excellent? The answer is probably not a totally objective one; it's partly "psychovisual," if there is such a word. Admittedly, too, I don't fully know whereof I speak, but I'm led to believe that the perceptual apparatus of human vision is capable of ratcheting itself up and down, depending on how much illumination it's getting. It "wants" to interpret the contrast ratio of any photograph, computer graphic, film image, or TV image as being like that typically found in the real world, though for technical reasons no reproduced image can even come close. But the "ratchetability" of vision makes up for that.

Projected photographic images typically handle contrast better than "digital" TVs. The contrast ratios of the best "digital" TVs usually aren't as impressive as those found in most cinemas, owing to their unprepossessing black levels. CRT-based displays usually have the best contrast ratios. Though they can't produce huge amounts of peak-white output, their black levels are so low — to the point of unmeasurability, in some cases — that their contrast ratios can be superb. They can even exceed the 10000:1 contrast ratio of the best cinema projection (see the article by Raymond Soneira downloadable here), having a 30000:1 contrast ratio or "dynamic range" in some instances.

If the eye couldn't ratchet, i.e., adapt to the dark, all TVs would look terribly dim and washed out. But the eye's response can ratchet down such that it "thinks" a TV's unimpressive-by-the-numbers contrast ratio looks pretty "okey-day," to quote Jar Jar Binks in Star Wars. That is, it does so as long as the TV's actual, objective, numerical contrast ratio isn't too weak.

When the TV's black-level output isn't all that low, as happens with not only DLP-based TVs but also LCD, plasma, and other "digital" panels, then its peak-white output needs to be concomitantly high. That's what can keep the contrast ratio in the right ballpark for a convincing on-screen image.


According to Charles Poynton in Digital Video and HDTV Algorithms and Interfaces, the eye "can discern different luminances across about a 1000:1 range" (p. 197). That is, the highest contrast ratio actually "needed" by the eye is 1000:1. But the ratio between diffuse white and the black of a TV image need be no higher than 100:1.

The "decade" (order of magnitude) of contrast ratio separating 100:1 and 1000:1 in a TV image apparently is there in part to allow very localized, directionally reflected gleams — the so-called "specular" highlights in an image — to look properly bright, compared with the look of diffuse white that is spread over larger areas of the image (see p. 83). If specular highlights were reproduced in a TV image at the proper 10:1 ratio vis-à-vis diffuse white, then the 1000:1 ratio would be relevant to TV specification. They aren't, though, with little apparent ill effect — which is why diffuse white isn't encoded for television at a mere 10 IRE!

Thus 100:1 is the figure of merit for a TV's contrast ratio, says Poynton. (Other sources up that conservative figure to "several hundred to one.") That the figure is so low is indeed fortunate because, he says, "In practical imaging systems many factors conspire to increase the luminance of black, thereby lessening the contrast ratio and impairing picture quality. On an electronic display or in a projected image, simultaneous contrast ratio [that evidenced in any one frame of the image] is typically less than 100:1 owing to spill light (stray light) in the ambient environment or flare in the display system."

Says Poynton, accordingly, cinema can furnish a "simultaneous" contrast ratio — that in a single frame of film, as opposed to successive or "sequential" frames — of 80:1. (Sequential film contrast ratios can reach 10000:1.) Meanwhile, a typical TV in a typical living room can have a simultaneous contrast ratio of just 20:1 (see Table 19.1, p. 198).


The contrast ratio of a TV can be measured in several ways, I have learned from various home-theater enthusiast magazines. Most of these ways report ratios much higher than Poynton's conservative 20:1 or 100:1.

One way to measure contrast ratio in a TV display is "full-on/full-off," the principal method used by Home Theater Magazine. A brief discussion of the method can be found here. The basic idea is that a full-field 100-IRE test pattern — i.e., every pixel is at "peak white" — is displayed and the amount of luminance produced by the screen is measured. Then the TV's luminance output for a full-field 0-IRE pattern — every pixel at darkest "black level" — is metered. The two measured luminances, expressed in foot-Lamberts (ft-L or fL), are used to form the contrast ratio.

For instance, HT measured the InFocus 7205 DLP Projector's 100-IRE output at 22.37 ft-L (see this web page). 0 IRE was at 0.022 ft-L. The contrast ratio of 22.37 to 0.022 equals approximately 1,017:1.

Some equipment testers measure luminance not in foot-Lamberts but in candelas per square meter (cd/m2). To convert ft-L to cd/m2, multiply by 3.43. For example, 22.37 ft-L times 3.43 equals roughly 77 cd/m2.
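Here are those two little computations repeated in Python, using the figures HT reported above:

```python
FTL_TO_CDM2 = 3.43       # approximate conversion: foot-Lamberts to candelas per square meter

peak_white_ftl = 22.37   # InFocus 7205, full-field 100 IRE (per HT)
black_ftl = 0.022        # full-field 0 IRE

print(f"full-on/full-off contrast ratio: {peak_white_ftl / black_ftl:,.0f}:1")
print(f"peak white: {peak_white_ftl * FTL_TO_CDM2:.0f} cd/m2")
```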

More modest measured contrast ratios are typically obtained by means of the ANSI contrast-ratio measurement, in which eight 0-IRE "video black" rectangles are checkerboarded among eight 100-IRE "peak white" rectangles on screen, and the average difference in luminance output is metered and converted into a ratio. The ANSI ratio measured by HT for the InFocus 7205 DLP Projector was 355:1 — still a lot higher than 20:1 or 100:1.

Most of the newer "digital" TV technologies appear to exceed 20:1 or 100:1 quite easily. Amazon.com's description of my particular TV model, Samsung's HLN617W, located here, mentions a 1000:1 contrast ratio. Another review, located here, says my TV's contrast ratio is fully 1500:1. Neither of these figures is apt to represent the more conservative ANSI measurement, be it noted. Even so, I wouldn't be surprised if the Samsung's measurable ANSI contrast ratio is way up there in the high triple figures.


I don't necessarily agree with HT that full-on/full-off contrast ratios are all that useful. I think the ANSI method is much more useful. A third possible method is to replace the full-on 100-IRE field with a 100-IRE window test pattern, in which the middle 25% of the screen is occupied by a rectangle at peak white, surrounded by nothing but video black at 0 IRE. The luminance at which the 100-IRE window is displayed is used as the top term in forming the contrast ratio. The bottom term is again derived from a 0-IRE full field.

The reason either the ANSI method or the 100-IRE window is better, in my opinion, is that some displays and projectors are designed to dim luminance somewhat for full fields that contain a lot of very bright pixels. Plasma flat panels, for instance, throttle back on electrical power usage for ultra-bright scenes, in order to restrain the operating temperatures of the panel. And many front projectors have an automatic iris or aperture control which does the same sort of thing in order to avoid blinding viewers with sudden brightness increases.

For the eye adapts to the amount of light it's seeing by changing the diameter of its pupil, by modifying the amounts of the various pigments present in the cells of the retina, and by reconfiguring "neural mechanisms in the visual pathway," Poynton says (p. 196). Its 1000:1 or 100:1 usable contrast ratio is thus like an elevator that can "ascend" or descend the "shaft" of visible luminance.

This shaft visits eight "floors" above the "basement" level where light is just too dim to be seen at all. Each floor represents a tenfold increase in luminance, so the top (eighth) floor represents luminance that is 100 million (10^8) times that of the basement.

As the eye is exposed to different levels of luminance, it adapts: the elevator, which itself represents only a 100:1 contrast range, ascends or descends. Ignoring, accordingly, the upper "decade" of the 1000:1 contrast ratio the eye is capable of accepting, and restricting attention to the 100:1 ratio he claims is really what's important in television, Poynton makes this point: "Loosely speaking, luminance levels less than 1% of peak [I assume, diffuse] white appear just 'black'" (p. 197).

Once the eye adapts to a TV's inherent dimness, a TV screen "looks" just as bright as nature would outside the window, if the curtains were opened. That is, once the eye has adapted, any image source that provides a (in Poynton's book) 100:1 contrast ratio or better turns luminances less than 1 percent of the maximum into "black."

It would seem to follow that my boosting sub-contrast, which raised the applicable maximum light-output capability of my Samsung TV, turned its very darkest grays into pitch black, as far as my eyes were concerned. And a lot of slightly lighter grays moved closer to the magic 1% "black" cutoff, as it were, and appeared darker to my eyes.

Put another way, the sub-contrast boost I made effectively sent my eyes' two-decade luminance elevator "up a floor" (or perhaps just part of a floor). The number of low-IRE levels of dark gray which my eyes could not distinguish from true black thereby increased.
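To put that 1% cutoff in IRE terms, here's a rough sketch; the power-law gamma is only an assumption, since (as discussed elsewhere) my set's real response is table-driven:

```python
def ire_at_fraction_of_peak(fraction, gamma=2.2):
    """IRE level whose output is a given fraction of peak white, assuming a pure power-law gamma."""
    return 100 * fraction ** (1 / gamma)

cutoff = ire_at_fraction_of_peak(0.01)
print(f"With gamma 2.2, levels below about {cutoff:.0f} IRE emit under 1% of peak white "
      "and tend to read as plain 'black'.")
```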


This effect may have been abetted by what Poynton calls the eye's "contrast sensitivity." This, the lowest discernible "ratio of luminances between two adjacent patches of similar luminance," represents a "threshold of discrimination" of human vision (see pp. 198ff.).

If two fairly dark patches of similar, but not quite equal, luminance are set against a fairly light background, the relatively bright "surround luminance level ... fixes the observer's state of adaptation." When the surrounding luminance falls in a broad range that is fairly bright but not too bright, the eye can distinguish between dark grays that are just 1% apart in luminance level. That might correspond to the difference between 10 and 10.1 IRE in a video signal. Put another way, the critical contrast ratio is just 1.01:1.

But if the surrounding luminance is made either very low or very high, the eye's ability to distinguish among subtly different dark grays diminishes a bit. I'm not sure this subtle change in contrast sensitivity with very low or very high surround luminance is enough to explain anything I happen to be "seeing," but I mention it just in case it does.

Still, the fact remains that there is a limit to human contrast sensitivity, such that subtly different luminances in the range of 1% of the white level to which the eye is adapted cease to be distinguishable as the white "surround" level is raised. When I raised the TV's white level via boosting its service-mode sub-contrast setting, and when I also turned on the room lights, I made formerly distinguishable dark greys look like they were black. This effect was only enhanced by my keeping user brightness "too low" at 50 instead of, say, 58.


What I seem to be saying is that a TV such as mine with a questionable (because way too high) inherent black level looks best when one turns that "lemon" into lemonade.

When the black level of a "digital," non-CRT-based TV isn't anything to write home about, as it usually isn't, try using various strategies to get the TV to "swallow" the most subtle of its shadow details:

(1) Raise user-accessible contrast and service-menu sub-contrast as high as possible, short of "crushing" peak whites. If you can't distinguish 98 IRE from 100 IRE, you've got crushed whites. The relevant Avia test patterns are set up to reveal white crush, if it occurs. On my Samsung, luckily, there appears to be no contrast level whatsoever at which white crush actually does enter the picture.

(2) Keep the viewing room relatively brightly lit, not darkened as you would normally expect. But avoid having so much ambient light that it bounces off the TV screen and impairs black levels even further. Some aficionados recommend putting indirect lighting behind the TV for this purpose.

(3) Experiment with user brightness settings that are lower than Avia recommends. If brightness is set "too low," anything below (say) 10 IRE will look like 0 IRE — but the overall picture will probably look like it has more contrast, which pleases the eye.


Another factor in my current situation is that I am using a color saturation setting of 65, as in my Samsung's default Dynamic mode ... where Avia "recommends" just 42.

This "wrong" setting of my TV's color control is intimately related to the "wrong" sub-contrast and user brightness settings I am using, as well as the "wrong" amount of ambient room lighting. How it is related is suggested by something Dr. Raymond Soneira writes in the September 2004 issue of Widescreen Review. His article, "Display Technology Shoot-Out, Comparing CRT, LCD, Plasma, & DLP — Part I: The Primary Specs," may be downloaded here.

Dr. Soneira is head honcho at DisplayMate, marketers of computer software which generates test patterns for the scientific measurement of television displays' capabilities using professional instruments. He knows whereof he speaks.

With respect to the topic of achievable black level, he calls it "the [TV's] capability of suppressing light output." He says of a "poor black level" that it "lifts the bottom end of the display's intensity scale and introduces errors in both intensity and color throughout the entire lower end of the scale, not just at the very bottom. All displays produce some light in the form of a very dark-gray when asked to produce a black. This background light is added to all of the colors and intensities that the display is asked to produce. This washes out the dark grays and also the dark colors. For example, dark reds will appear as shades of pink."

Taking that logic a step further, we can conclude that shades that start out at a light-to-moderate pink — such as many flesh tones — can become downright colorless when they are cast in deep shadow. Skin in low-light situations can turn gray, if the TV's inherent black level is too high.

And that's one of the main things that was bothering me in the first place about my Samsung's shaky low-light, near-black renditions. Certain portions of certain faces were just too gray.

I think that's why I'm presently enjoying a color saturation setting — with my user color control set at 65 — that's nominally way too high. Put briefly, it offsets the "gray-face problem." That is, it balances out the tendency of my set's inadequate black level to wash out color in low-light situations, forestalling an effect which I find personally quite objectionable. Even if the color picture is nominally too saturated at higher IRE levels, I don't find that fact particularly objectionable, or even noticeable.


All of which points up an interesting overarching principle: the rules be damned. The rules prescribe buying a TV whose black level is so low it's hard to measure it with instruments, and then watching it in a pitch-black environment (or nearly so). The TV's brightness or black level control needs to be set ever so carefully ... of course, after the TV has had its grayscale and gamma curve professionally set up. Then the TV's color and tint controls need to be tweaked to scientific perfection. And all of the TV's "signal-improving" circuits along the lines of Samsung's DNIe must, of course, be disabled. When all of that has been done, you may be lucky enough to obtain a dynamite picture.

But what happens if the TV can't suppress its light output all that well, for purposes of rendering "correct" video black? What if the TV is placed in a multipurpose viewing environment that is fairly well-lit? What if those two compromises call for yet others, such as turning DNIe on or resorting to "incorrect" Dynamic-mode user settings?

Then the rules be damned, I'm slowly learning. Trust your eyes, Luke!

***

I turned the preceding material red because I now want to reconsider it. Last night I put on the Episode II: Attack of the Clones DVD. (I'm on a Star Wars kick.) This "film" is not actually a film at all, since George Lucas "filmed" it in HDTV!

That is, he shot it in a digital video format called 1080p24, which means his camera created progressively scanned (not interlaced) video frames, 1,920 pixels across by 1,080 pixels high, at a rate of 24 frames per second (the traditional rate for film frames). He recorded what he was shooting on some kind of digital video recorder or computer, not on celluloid. The only time celluloid entered the picture was after the movie was completed. The movie's video frames, in the reverse of the customary process, were transferred to film for projection in traditional cinemas. Some cinemas, however, used digital video projectors, avoiding film altogether.

When it came time to make the Clones DVD, the original 1080p24 video was simply downconverted to the necessary 480i (or is it 480p?) and then MPEG-2 encoded. The result is a superb, reference-quality DVD with virtually no video artifacts of any kind, stunning colors, deep blacks, excellent contrast, etc. My only complaint is that the total absence of film grain is downright eerie!

But as I was first watching Attack of the Clones, I initially felt dissatisfied with the image ... until I bethought me to try the Avia-recommended settings for brightness and color. I turned color down from Dynamic mode's rather burly 65 to 47, and I turned brightness up from 50 to 58. That made the picture about as close to perfect as I ever expect to see!

So I eat the words I wrote above, now crimsoned in shocking red. Maybe the "rules" for achieving a good video image ought not be cast aside quite so blithely.

Monday, June 20, 2005

Tweaking Gamma on My Samsung DLP, Part II

In Tweaking Gamma on My Samsung DLP, Part I, I discussed changing the GAMMA parameter in the service-mode menu of my Samsung 61" DLP rear-projection TV. This in turn was based on my general discussion of (shall I call it) the gamma "concept" in HDTV Quirks, Part II, Gamma. Now I'd like to extend my earlier remarks to show why gamma in practice is much more complex than anything I've said thus far.

As a concept, gamma ought to be renamed "contrast tailoring," or, better still, "luminance tailoring." Once you've set the "black level" at which your TV reproduces a minimum-luminance 0-IRE input signal, and once, by means of the contrast control, you've set the "peak brightness" used for a maximum-luminance 100-IRE signal, the question becomes: at what levels will the TV reproduce all the various shades of gray in between?

Gamma is nominally the greater-than-1.0 exponent in a simple input-voltage-to-output-luminance function which determines the TV's response to various IRE levels, its "shades of gray" as it were:

(output luminance) = (input voltage)^gamma

And that is what gamma is, on an old-fashioned "picture tube" — a CRT-based TV — with no digital circuitry.

But today's fancy TVs have digital circuitry. Whether they be plasma panels, LCDs, DLPs — or even CRT-based HDTVs, nowadays — they don't content themselves with such simple gamma functions as that one. They use their digital circuits to alter the "gamma curve" and thereby to custom-tailor the luminance response of their screens.

Boiling that down: any two pixels in a digital video image have discrete luminance (or Y) values. These can be stated either as volts or in IRE units between 0 IRE and 100 IRE. Let's say Pixel #1 has 20-IRE luminance, and Pixel #2 has 40-IRE luminance. How bright will the pixels appear to be on the screen, relative to one another and also to the 0-IRE and 100-IRE extremes?

We can ask the same sort of question when the two pixels are at, say, 70 IRE and 80 IRE, or at 45 IRE and 60 IRE. In fact, any two pixel luminance levels could be arbitrarily selected: 5 IRE and 95 IRE; 50 IRE and 51 IRE, etc. How such broad or subtle contrast gradations would be rendered on a TV screen was in the good old days mainly a function of one simple exponent called gamma. Now they're more likely a function of complex luminance tailoring.
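In the good old days, answering that question was a one-line calculation. A quick sketch, assuming a pure power-law gamma with no digital tailoring:

```python
def output_luminance(ire, gamma):
    """Relative output (0.0 to 1.0) for an input level in IRE under a simple power-law gamma."""
    return (ire / 100) ** gamma

gamma = 2.2
for low, high in [(20, 40), (70, 80), (45, 60)]:
    ratio = output_luminance(high, gamma) / output_luminance(low, gamma)
    print(f"{high} IRE comes out about {ratio:.1f} times as bright as {low} IRE (gamma {gamma})")
```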


When it comes to Samsung displays like my own, luminance tailoring is done in part by a digital circuit or chip called DNIe™. DNIe ("Digital Natural Image engine") is Samsung's elaborate melange of digital video signal processing algorithms. DNIe throws "multiple optimizers and enhancers" at the input signal, one of which is its vaunted "contrast enhancer."

The "Compare DNIe with Conventional" portion of the Samsung online presentation about DNIe which you can find here shows a simplified graph which seems to imply that the gamma curve selected via the service menu GAMMA parameter is intentionally "bent" by DNIe (unless, of course, the DNIe function is turned off in the user menu). According to the graph, light output is reduced at the lower IRE levels. At (let's say) 50 IRE, light output returns to "normal." Then at higher IRE levels light output is increased.

My interpretation: a 20-IRE signal is lowered to have a luminance output of maybe what a 10-IRE signal would ordinarily have. Meanwhile, 80-IRE output is boosted to the level of perhaps 90 IRE. This exaggerates highlights and compresses lowlights in the picture, and gives the picture more "snap."
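Purely to illustrate that interpretation (this is my guess at the shape of the curve, not anything Samsung publishes), here's a toy S-curve remap that pivots around 50 IRE:

```python
def s_curve(ire, strength=0.5):
    """Toy "contrast enhancer": push levels below 50 IRE down and levels above it up.

    Only a guess at the kind of bending the DNIe graph implies, not Samsung's algorithm.
    """
    x = ire / 100                                   # normalize to 0.0-1.0
    bent = x + strength * (x - 0.5) * 4 * x * (1 - x)
    return max(0.0, min(100.0, bent * 100))

for ire in (20, 50, 80):
    print(f"{ire} IRE in -> {s_curve(ire):.0f} IRE out")
```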


DNIe also changes measurable gamma, at least insofar as gamma can be gauged by means of the Avia gamma chart, one of the test patterns provided on that handy setup-assist DVD. This chart juxtaposes a range of gray swatches of different IRE values against a background of fine black and white horizontal lines. When you squint, the lines merge into a level of gray, and you simply decide which gray swatch matches it in brightness. Then you read off the gamma value printed over the swatch.

With an old-style tube-type TV, that tells you the gamma of the picture tube ... which is all you really need to know.

With modern "digital" TVs, though, think again. When digital video processing makes the TV's "gamma curve" adhere to something other than the simple formula shown above, it's not quite clear what you're actually measuring with the Avia chart.


Be that as it may, I find I can make this Avia test pattern do some interesting "tricks" by tweaking various user adjustments on my Samsung DLP. What's more, the chart does more tricks when DNIe is on than when DNIe is off. In fact, Avia reports much higher gamma when DNIe is on than when it is off.

First of all, gamma as measured by the Avia chart goes up when the user brightness setting is lowered — and this is so whether or not DNIe is on. On a "regular" CRT, changes in brightness make no difference to gamma.

Next, lowering the user contrast setting increases Avia-measured gamma. CRT gammas don't change with altered contrast-control settings. This oddity likewise occurs independently of whether DNIe is on or off.

If and only if DNIe is on, increasing the user sharpness control on the Samsung causes Avia-measured gamma to go up a tad. Decreasing sharpness lowers measured gamma somewhat. (The sharpness control does not affect the dedicated Avia sharpness pattern, however, in that it doesn't visibly increase the detail of the picture. Nor does it affect measured gamma at all when DNIe is off. Go figure.)

All of the above applies to all of the Samsung's inputs — composite video, S-video, component video — except DVI. For its DVI input alone, one apparently can't turn off DNIe ... and so I find that Avia-measurable gamma changes not only with brightness and contrast adjustments, but also with sharpness tweaks in DVI mode.


I hasten to emphasize that it's not really clear what meaning, if any, Avia-measured gamma has under such circumstances. When the input-voltage-to-output-luminance function is not a simple curve with a single exponent called gamma, all bets are off. In a "digital" TV, the "gamma curve" is really a lookup table (or tables) used to translate any given input luminance value into an associated output luminance. Such an ersatz, table-based "gamma curve" may have all sorts of oddities which the designers of the Avia chart never envisioned.

Still and all, it's obvious that changes to user brightness, user contrast, and (with DNIe on) user sharpness do something to the way the Samsung tailors its contrast or luminance response. (And, by the way, changes to user color and tint controls have no such side effects.)

That means, for example, that if I go into the service menu and change GAMMA 4, the factory setting, to what I imagine is the "higher" true-gamma value associated with GAMMA 5, and then compensate for the change somewhat by increasing user brightness above the middle (50) position which I use with GAMMA 4, the increased brightness setting may lower overall gamma back closer to what it had been before.

If I go ahead and change contrast and/or sharpness, too — assuming DNIe is on — I now know that that will affect measurable gamma, too. Or, as I'd rather put it now, it will alter the way my TV tailors its contrast or luminance response.

It's all quite perplexing and bewildering, isn't it?

Furthermore, it makes me wonder about all those enthusiasts who post to various forums how they swear by GAMMA 0, or GAMMA 5, or GAMMA whatever, for Samsung DLPs. Well, what type of input connections were they using? What were the contrast, brightness, and sharpness settings? Was DNIe on? Though they don't affect Avia-measurable gamma, what color and tint settings were in use?


It gets even weirder. After much futzing around with service-menu GAMMA and the various user-menu settings, both with DNIe on and with DNIe off, I find I have something important to add to the above. I'm totally lost!

Well, not quite lost, but close to it. The problem is that changing GAMMA also changes the overall brightness or dimness of the picture. I'm not just talking about those parts of the picture at relatively low IRE levels. I'm talking about the whole picture, from video black at 0 IRE right on up to 100-IRE peak white.

That's not "supposed" to happen. Changing the "true gamma" exponent of a TV's luminance-response curve is supposed to revise the extent to which the curve bends or sags at luminance levels between 0 IRE and 100 IRE. It's supposed to leave the TV's light output levels at those two endpoints of the luminance-response curve alone.

In reality, on a digital TV like my Samsung, that's not what happens.

The effect of altered GAMMA on light output at 0 IRE is slight, thank goodness. But the effect at the peak-white, 100-IRE level is pronounced. In general, changing to a service-menu GAMMA setting that gives a higher Avia-measured gamma reading lowers peak-white output. Changing to a service-menu GAMMA setting that gives a lower Avia-measured gamma reading raises peak-white output.

Thus, GAMMA 4 has a relatively low Avia gamma reading and a relatively high peak-white output. Changing to GAMMA 2 with its increased Avia gamma reading reduces the TV's peak-white luminance output. It also reduces its luminance output across the entire tonal scale from 0 IRE on up. That's why the picture gets so much dimmer overall.


There is a way to counter that source of added confusion. I haven't tried it yet on a systematic basis, but I have fiddled with it briefly. In the service menu along with GAMMA are two settings that control the TV's "sub-brightness" and "sub-contrast." In theory, when I change GAMMA, I could also change S_BR(DDP) (for "sub-brightness") and S_CT(DDP) (for "sub-contrast").

Sub-brightness and sub-contrast do in the service menu roughly what the "main" brightness and contrast controls in the user menu do. Respectively, they set black level and peak-white level (as "gain"). Once black level is anchored aright via main or sub-brightness, raising and lowering 100-IRE peak white via main or sub-contrast pulls all intermediate luminance levels up and down proportionately.

The main and sub- controls interact with one another, naturally. If you raise sub-contrast, for example, you can if you choose offset that by lowering main contrast. (But you do the former in the service menu; the latter is a user-menu function that cannot be accessed from the service menu.)

I gather that, just as you cannot raise main contrast above 100, there is some numeric value beyond which you cannot raise sub-contrast. I believe the top value may be 512, but I'm not sure of this. It may not matter, actually, because there is bound to be a practical top limit to peak-white output that you run into before reaching the theoretical limit.

This is because the "light engine" in a DLP-based TV can produce only so much light. Furthermore, the spectral composition of the illumination shed by the TV's internal lamp will dictate that pushing light output up too high will unbalance the TV's high-IRE grayscale, as one primary color (red, green, or blue) "runs out of steam" before the others do. So all I would really hope to do would be to recover the peak-white output sacrificed by changing to (say) GAMMA 2.


I may try that. Then again, I may not. Yesterday I experimented with setting GAMMA back to its factory value, 4, turning out the lights, and watching the Samsung in near-total darkness. Again, I was watching Vertigo ... it helps to be watching something you're so familiar with, you can't get distracted by the plot. This time I was watching it on DVD, since I was trying to calibrate my user settings with the Avia disc.

After twiddling with contrast, brightness, sharpness, and color settings (tint is unavailable on a component video input such as I use from my DVD player) I came to certain conclusions.

One, it's pretty easy to set the TV's color intensity by eyeball alone, Avia be damned. The point, after all, is to please the eye, not some engineer's notion of "correct."

Two, things look much better with DNIe on than with it off. It makes "specular highlights" in the scene jump out at you in a way I consider comparable to those in the theatrically projected film I recently saw, Ladies in Lavender. It also enhances detail and visual sharpness in a mostly unobtrusive way, though it can sometimes make vertical (and horizontal?) edges a tad too "hard." And it makes colors somewhat more vivid, but not garish.

Three, the Samsung's sharpness control does nothing at all. With DNIe on, it made no difference to anything I could detect with my eye ... much less gamma per se. That it changes Avia-measured gamma remains something of a mystery, for it does nada to shadow detail in an actual picture.

Four, the absence of a component-video tint control on the TV presents no problem whatever.

Five, getting the brightness control set just right is extremely important. I find the Avia test patterns inadequate for this, for some reason. It's easier to have my DVD player "pillarbox" a 4:3 DVD that contains a "letterboxed" widescreen image — à la Vertigo. Then I find a place in the movie where there is a complete fade to black, and I hit pause to freeze it on the screen. Finally, I adjust the set's brightness until the inset 4:3 "black" frame just blends into the level of the black pillarboxes on the screen. At this point, when I resume playing the movie, the letterboxing bars above and below the actual image cannot be distinguished from the pillarbox bars at its sides. This procedure is extremely easy to do exactly right when I am watching the TV in near-total darkness.

Six, the contrast control can be set at its maximum value, 100 ... or it can be reduced somewhat if bright scenes hurt the eyes.

Doing the above results in what I think is a very good, quite cinema-like picture using the nominal GAMMA 4 setting in the service menu.


Since I wrote the above, I've decided to try GAMMA 0 with boosted service-menu sub-contrast ... and it seems to be giving me the best picture yet.

The rap against GAMMA 0 before was that it dims the picture, overall, even as it raises "true gamma." This was because it lowers the level at which the Samsung TV displays a peak-white, 100-IRE signal, thereby reducing luminance output at all other levels, too.

Raising sub-contrast — called S_CT(DDP) in the service menu — offsets that. I raised sub-contrast to 150, the point at which further increases seem to have no effect. It was at 115 for 480p component video input, 102 for DVI input, since some time ago I raised the two values from their respective "factory" settings of 90 and 82.

So raising sub-contrast to its effective maximum restored the "snap" or "punch" to a GAMMA 0 picture.

What is the "true gamma" of GAMMA 0? It's hard to say, since it's not clear that the Avia Gamma chart reports true gamma for this TV (see above). I can say that toggling among the various GAMMA settings while in the service menu reveals that they do affect Avia-reported gamma.

Remember that when the service mode is active, you are effectively using the TV's Dynamic user preset mode with DNIe on. Given that, GAMMA 0 gives the highest Avia gamma reading, followed by (in descending order) GAMMA 2, GAMMA 1, GAMMA 5, GAMMA 4 (the "factory" setting, giving the set's lowest true gamma over 1.0) ... and, bringing up the rear, GAMMA 3, which I believe implements a true gamma of 1.0.

When I exit the service menu and go back to user mode, GAMMA 0 registers at 1.8 on the Avia gamma chart with DNIe off, 2.8 with it on. These measurements were taken after I had changed away from Dynamic into the Custom preset, set the user contrast control at its maximum of 100 (where it already was), and set user brightness up to about 58 out of 100. The lower (50) brightness of Dynamic mode gives higher Avia gamma readings.

Because user contrast, user brightness, and the enabling/disabling of DNIe change the Avia gamma readings, I imagine the actual readings are unreliable with this TV. Still, I think the direction in which the readings move as service-menu GAMMA is changed is meaningful. I therefore feel safe in concluding that the factory GAMMA 4 setting gives a true gamma much lower than most experts would consider "correct." I imagine moving from GAMMA 4 to 5, then to 1, then to 2, and finally to 0 successively raises true gamma step by step. I'm thinking that GAMMA 0 gets you nearest of all to the "industry standard."

I say that bearing in mind that the TV's luminance response, especially with DNIe on, manifestly distorts what I assume to be an underlying true-gamma curve, à la:

(output luminance) = (input voltage)^gamma

There's no other way to explain why user contrast and brightness settings, not to mention toggling DNIe, alter the Avia gamma reading. A distorted underlying curve does not, however, mean that there's no underlying curve at all. Changing GAMMA in the service menu does in fact change the gamma exponent of the underlying curve — that's something I'm taking on faith.

Changing from GAMMA 4 to GAMMA 0 accordingly raises underlying gamma. When sub-contrast is boosted to compensate for the overall dimming of the picture, the result is a picture with deeper shadows ... though it has just as much shadow detail, if user brightness is set properly.


Is there a downside? And why doesn't Samsung set the TV up this way to begin with? These are questions I can't answer definitively. But I'll try anyway ...

Why doesn't Samsung set the TV up this way to begin with? Maybe the "too low" factory gamma choice is intended to open up shadow detail in brightly lit video stores where customers' eye pupils are contracted against all the high-wattage fluorescent lighting. Or something.

As for a technically compelling downside to what I've done, I haven't found one yet. You might wonder (I did) whether boosting sub-contrast so high "crushes" near-peak whites. The relevant Avia test patterns say no. You might also wonder whether it tints the grayscale, particularly in the high-IRE range near peak white. Again, the relevant Avia test patterns say no.

Does it boost video noise? No, not that I can see.

Does it have any negative side effect whatsoever on the picture? I haven't found one as yet. But, as usual, stay tuned.

***

After all that, I just looked at the Star Wars I: The Phantom Menace DVD ... and hopped right back to GAMMA 4.

All the diddling and twiddling I described above seems to have optimized gamma for a dark room, at night, with no outside or inside illumination. Only trouble is, I was just watching the Star Wars DVD in the afternoon, with lots of light coming in through the windows.

GAMMA 0, with its relatively high "true gamma," wasn't cutting it. Too much of what George Lucas put on the screen was getting lost in shadow. My pupil-contracted eyes couldn't see it.

So I went back into the service menu and summoned up the original factory gamma setting, GAMMA 4. At the same time, I left sub-contrast effectively maxed out at 150. Then I exited the service menu and watched Star Wars I in Dynamic mode, with color saturation set significantly higher than Avia says it ought to be.

Dy-no-mite!

Then I made things even better by tweaking the user sharpness setting down from 57 or whatever to 20. Yes ... sharpness, that user-menu control I said before had no visible effect! Turns out it does: a subtle one which tones down the "busyness" that appears at sharp edges when DNIe is on and the sharpness control is set fairly high.

I noticed this "busyness" on the ultra-bright scenes in which Qui-Gon Jinn (Liam Neeson) and company have landed on Tatooine and are walking across the arid landscape toward the city. Their figures as set against the backdrop of the sky had "busy" outlines, possibly related in part to the "edge enhancement" that has been applied to the image on DVD. Phooey on that, I said, and bethought me to try lowering sharpness to get rid of it. I set sharpness at 0, and the "busyness" miraculously disappeared!

The picture even looked a little too soft at that point, so I boosted sharpness back up to 20. The result was about as perfect an image as you could get from a DVD. The pod race sequence looked fantastic.


So. I seem to be proving (at least, to myself) that the true gamma which the textbooks say is "right" — the one associated with GAMMA 0 is pretty close to it — is indeed right for a darkened viewing environment. But GAMMA 4 is right for a sunlit room.

Notably, my Samsung DLP-based rear-projector's Achilles' heel is its relatively high minimum black level. A 0-IRE signal can't produce a truly black screen, only dark gray. This has to do with how the DLP light engine works. White light from an internal lamp bounces off an array of ultra-tiny "micromirrors." Each micromirror swivels independently of the others. The amount of swivel determines the brightness of the pixel associated with any given micromirror. The reflected light passes through a translucent color wheel which spins super-fast; its red, green, and blue segments color in the image.

Now, if the micromirrors could swivel enough off-axis — which they can't — they might in theory reflect no light whatever, and the image could become truly black. In the real world, some light does bounce off the mirrors and through the color wheel, even for a 0-IRE signal. Plus, the light that hits the swiveled mirrors and doesn't pass through the color wheel scatters, some of it, within the housing of the TV and winds up passing through the viewing screen anyway. That's another reason the screen can't go fully black.

As a result, watching my Samsung DLP RPTV in a fully darkened room shows up its poor-black-level Achilles' heel to an extent not evident when there is a goodly amount of ambient light. With a lot of light elsewhere in the viewing environment, the eye adapts in such a way as to reduce its sensitivity to the small amount of light leakage the TV exhibits with a 0-IRE signal.

In other words, a DLP-based HDTV such as my Samsung doesn't really "want" to be watched in the dark. That may be the real reason why Samsung uses a factory setting, GAMMA 4, whose "true gamma" is unconscionably low. It's as if Samsung is saying, "Forget the old-style rule that TVs ought to be viewed in a dim-lit room. Forget the fact that 'home theater' demands near-total darkness. This TV is for light, airy, multipurpose environments. This TV wants to produce a super-bright, retina-searing, maximally colorful image ... one we think will thrill most consumers, textbooks be damned."

Sunday, June 19, 2005

Tweaking Gamma on My Samsung DLP, Part I

In HDTV Quirks, Part II, Gamma, I tried to explain why today's "digital" HDTVs offer a number of gamma choices, usually hidden in their service menus.

Now I'll go into my experiences fiddling with service-menu GAMMA on my Samsung 61" DLP-based rear-projection HDTV monitor.

My Samsung DLP-based RPTV apparently features some six GAMMA choices, numbered 0-5. The factory setting is GAMMA 4. Right now, I'm experimenting with GAMMA 5 on the DVI input from my hi-def cable box-cum-DVR.

When I'm referring to the Samsung's internal GAMMA numbers, I'll put GAMMA in all-caps to distinguish it from "true gamma" and its values.

These 0-5 GAMMA numbers have nothing to do with mathematical, "true gamma" exponents per se. They're just arbitrary numbers, indexes into locations in the ROM firmware of the TV, I suppose. Nor do higher GAMMA numbers necessarily mean higher, more contrasty gamma. For example, GAMMA 3 obviously yields much lower "true gamma" than GAMMA 2. To my eye, GAMMA 3 probably implements a "true gamma" equal to 1.0 — for when gamma is being set by external gear, I assume.

Nor is it clear that the other GAMMA numbers implement simple
(output luminance) = (input voltage)^gamma

functions. When you think about it, digital signal processing can be done by means of lookup tables, rather than math functions with exponents. Take a digital value of input voltage for a single pixel, look it up in a table, and find the output luminance value. If you plot all the generated output values against all the possible input values on log-log axes, maybe you get not a line but a snake.

If the user changes the brightness or contrast setting, that may completely change the "shape" of the lookup table's snake — i.e., the snake may writhe for you.
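To make that lookup-table idea concrete, here's a minimal sketch in Python. Everything in it — the 8-bit table size, the pure power law used to fill it — is my own assumption for illustration, not anything I know about Samsung's firmware:

# A minimal sketch of gamma-as-a-lookup-table, assuming an 8-bit video pipeline.
# The 256-entry table maps each possible input code to an output code;
# the table need not follow any single power law.

def build_gamma_lut(gamma, size=256):
    """Build a lookup table for a pure power-law response."""
    return [round(((i / (size - 1)) ** gamma) * (size - 1)) for i in range(size)]

def apply_lut(pixels, lut):
    """Map each input code through the table -- no exponent math at display time."""
    return [lut[p] for p in pixels]

lut_2_5 = build_gamma_lut(2.5)          # a CRT-like response
pixels = [0, 32, 64, 128, 192, 255]     # sample input codes
print(apply_lut(pixels, lut_2_5))       # dark codes stay very dark

The point is simply that once the table is built, the display just looks up each pixel; change the table (or let a brightness or contrast tweak rebuild it) and the effective "gamma" changes with it.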

In my very limited experimentation with Samsung's GAMMA choices so far, I find the different GAMMA numbers (excepting GAMMA 3) give subtly different looks to the picture. The differences are so subtle that I can't really decide if, say, GAMMA 1 is really any different from GAMMA 0 — that type of thing.

The situation is aggravated by the fact that, each time I change GAMMA, I can't always remember exactly what the previous picture looked like. It's too bad there isn't a split-screen approach to comparing two different GAMMA settings.

For that matter, it's too bad GAMMA can't be adjusted by means of user-accessible menus, just like contrast and brightness.



Still and all, I seem to be able to hit pause on the DVR, so that I have a constant image to judge by, and when I toggle among the GAMMA numbers I think I can see subjectively meaningful differences in things like how brightly and with how much color a face appears in an overall-dark scene.

That is, in the lower IRE ranges where I'd expect to see the biggest differences, I do see such differences. With the factory GAMMA 4 setting, such images seem to look "flat." Switching to GAMMA 5 increases the apparent density or solidity of a darker, low-IRE image, making it look less flat.

Or so it seems in my early experimentation. More later.

***

After experimenting with GAMMA 5, I decided it wasn't exactly right. It did make mostly dark, low-IRE scenes look less flat than GAMMA 4, but in scenes with a mixture of bright areas and dark, the dark portions still didn't have enough "shadow detail."

So I switched to GAMMA 2. GAMMA 2 appears to implement a lower "true gamma" value than either GAMMA 4 or GAMMA 5, meaning small changes in input signal levels in fairly dark areas of the scene produce more sizable changes in output luminance. This brings details out of the shadows more easily, à la Goldilocks #2, the image on the right above.

The downside is that the image has less "snap" or "punch" overall. What I think is going on here is that snap or punch sells TVs. It wows friends who drop in and glance casually at your TV screen. But for extended viewing, you're better off sacrificing snap for shadow detail. Lowering gamma does that, opening up the shadows in a satisfying way.

I also note that changes to true gamma (via switched GAMMA values in the service menu) seem to mandate altering user-menu brightness and color settings. When I was using GAMMA 5, I felt I had to boost brightness from what had been "correct" with GAMMA 4. But when I switched to GAMMA 2, the "correct" brightness went right back down to where it had been before, and it was color that had to be boosted.

These details of interactions between service-menu GAMMA and the two user-menu settings are probably specific to my particular model of TV. But, as a general rule, I'd say you can expect there to be such interactions.

My Samsung does not make it easy to deal with said interactions, by the way. When you are in the service menu, you don't have access to the user settings — and vice versa. When you go into the service menu, furthermore, the user-menu settings automatically switch over to the TV's so-called Dynamic mode, bypassing whatever Custom settings you may have in force.

Interestingly, it turns out that the default Dynamic mode seems to furnish exactly the right user settings for GAMMA 2, whereas Dynamic mode, with its boosted color level, isn't quite the thing for GAMMA 4 or GAMMA 5.

That makes me think Samsung may have optimized the TV for GAMMA 2/Dynamic mode, but changed to GAMMA 4 for in-store display and the wowing of friends.

***

Another update: I grew dissatisfied with GAMMA 2, for reasons that are hard to put in words, and so I tried GAMMA 0. GAMMA 0 seems to offer a lower "true gamma" value than any of the other GAMMA settings on the Samsung DLP (with the exception of GAMMA 3, which I think has a "true gamma" of 1.0).

Using GAMMA 0, I watched the standard-definition cablecast of Alfred Hitchcock's classic, Vertigo, on TCM last night. This is one of my favorite films; I've watched it countless times. I think I know what it's "supposed to" look like — and that's exactly what it did look like with GAMMA 0. The dark scenes, such as the opening rooftop chase, were dark but accessible to the eye, and the bright, almost washed-out look of the scenes in which James Stewart first begins tailing Kim Novak had, yes, a bright, almost washed-out look. The colors of the flowers in the flower shop where Novak goes to buy her "Carlotta" nosegay were rich ... but not too rich.

Note: I had to boost the user brightness setting from 50 in the default Dynamic mode to 60 to get things looking just right. Again, changes to gamma tend to necessitate changes to brightness, or color, or both.

***

Yet another update: yesterday I went to the movies for the first time in a while, and what I saw on the screen appeared to have much higher gamma than I've been aiming for at home. (I saw Judi Dench and Maggie Smith in Ladies in Lavender, a bit too much on the feminine, emotionally nuanced side for testosterone-poisoned me ... but still a pretty good movie.)

The image on the screen had a lot of dynamic range. The bright stuff was very bright, so much so that a cut to a super-bright scene after an ultra-dark one hurt my eyes, and the dark scenes were really, really dark. In scenes setting backlit darker elements against directly lit bright ones — for example, faces in cast shadows against a bright sky — I had to interrogate the barely lit elements with my eyes to pick up their subtle gradations of shadow detail.

If the cinematic "system gamma" or "end-to-end exponent" (the "inverse power function" of the camera negative times the intrinsic gamma value of the print) had not been so high, this would have been unnecessary. Shadow detail would have popped right out at me.

I sat there thinking my Samsung's contrast ratio probably is incapable of giving me that broad a dynamic range between 100-IRE or "peak" video white and the TV's 0-IRE, minimum-light-output "black level." Given that fact of life, still and all I was strongly reminded of the gamma curve I had witnessed when using the service menu's GAMMA 5 setting.

I also noted that many of the movie's skin tones were typically much deeper — more highly saturated, redder — than I had been targeting at home. When Judi Dench, that excellent actress, went all emotional, her face got as red as a red-hot poker. GAMMA 5 provides exactly such rich skin tones.

So I've switched back to GAMMA 5 ... which makes the TV's dynamic range no different, but compresses shadows and intensifies colors. It gives Hitchcock's Vertigo a different, but not unpleasant, look, not unlike that of Ladies in Lavender in the theater.

***

Let me note yet again that changing "true" gamma (via service-menu GAMMA) typically requires changing the TV's user brightness and color controls afterward. I have a DVD, Avia Guide to Home Theater, which makes that a snap. It's designed to help you set up your video and audio environments to something approximating perfection. At some length, it teaches you everything you need to know about home-theater fundamentals ... but I typically skip all that jazz and go right to the basic test patterns.

These patterns let you adjust contrast, brightness, sharpness, color, and tint. The last two adjustments involve matching the intensity of special patches of color when viewed through a supplied blue filter. On my Samsung, though, the tint control is disabled for hi-def inputs ... no biggie, since the test pattern shows the TV's default tint setting as spot-on, anyhow.

Contrast is set using a pattern that places a moving 98-IRE vertical stripe against a 100-IRE background. Many TVs (but not my Samsung) "crush" such subtle differences in high output brightness when the contrast control is set too high, making it impossible to pick out the stripe. In such cases, you simply back contrast down until the stripe reappears. In the case of my Samsung, however, white detail is never crushed, even with the contrast setting maxed out at 100.

Brightness is set with a moving black stripe at (I think) 2 IRE against a 0-IRE backdrop. You lower the brightness control until the stripe vanishes and then raise it slowly until the stripe barely re-emerges. I find this adjustment gives me roughly the same result as putting a letterboxed 4:3 image on the screen and matching the horizontal black bars above and below the framed image to the vertical black bars at the sides of my 16:9 screen.

Sharpness is set using a standard resolution pattern with groups of alternating white and black lines grouped by increasingly narrow widths. You're supposed to adjust the sharpness control so you can distinguish lines of the finest possible pitch ... but without adding any false "haloes" or "ringing" to the picture. Only problem is, the sharpness control on my Samsung inexplicably seems to have no demonstrable effect whatever!

Color and tint are set using a standard "color bars" pattern, slightly modified to juxtapose extra patches of flickering, complementary colors where they can do the most good. When the color and tint controls are properly set (if the tint adjustment is even available), the flashing patches blend into the background when you look at them through the supplied blue filter.

The main drawback to all this is that it really calibrates the TV only with respect to the DVD player and its associated input to the TV. My Bravo D2 DVD player, by V Inc., has its own brightness, contrast, and color adjustments, adding a measure of confusion to the whole deal.

So what I've done is set up the TV/Bravo D2 interface using Avia, and then play my DVD of Vertigo on the Bravo. Meanwhile, I cue up a scene of the version of Vertigo I recorded on my cable DVR: the shot of the flower shop interior after Scottie opens the door works quite well. I pause the playback on this scene, then switch to the DVD, pausing it (hopefully) at the very same frame. Toggling between the two inputs to the TV, I adjust contrast, brightness, and color on the cable-box input to match that of the DVD input.

Not surprisingly, this gets me a pretty good match. Even if the Vertigo that came into my home via analog cable has a lot more noise and video "trash," it looks just the same as the DVD otherwise. Also unsurprisingly, the actual settings of the relevant controls on the TV are not exactly the same ... close, but different. Different signal sources can be expected to need different settings — which is why I am dubious about people who set up one video source using Avia and then slavishly copy those settings to other sources.

HDTV Quirks, Part II, Gamma

Gamma is a key characteristic of TV displays that is not at all easy to understand. It also applies to graphics displayed on computer monitors. Herein, an attempt to explain gamma, starting with computer applications (briefly), and then moving on to gamma in TV and video.

Here are two images, borrowed from this web page about Adobe Photoshop renderings:

Goldilocks #1
Goldilocks #2


Depending on your type of computer monitor, one or the other of these two renditions may look "right" to you, the other "wrong."

The image on the left, Goldilocks #1, is intended for a Macintosh monitor with a relatively low gamma figure of 1.8. Goldilocks #2 has been encoded to look "correct" on a PC (i.e., non-Macintosh) monitor with a relatively high gamma figure of 2.5.

Broadly speaking, gamma in computer graphics is the relationship between how "contrasty" the image-as-encoded is supposed to be, and how it actually looks on a particular monitor. Gamma can be thought of as the monitor's "depth of contrast."

Notice the shadowy areas in Goldilocks #1. Their relatively deep shades look darker and more contrasty than they do in Goldilocks #2, so you can't see as much shadow detail; Goldilocks #2 shows much more of it.

If you look at Goldilocks #1 on a gamma=2.5 monitor, she'll probably look too contrasty, and Goldilocks #2 will look "just right."

If you look at Goldilocks #2 on a gamma=1.8 monitor, she'll probably look too pale. Goldilocks #1 will look "just right."

On any monitor, though, the image on the left will seem to have more/deeper contrast, along with stronger color, than the one on the right.

These two renditions also serve to illustrate the subject of gamma as it applies to TVs. In video, gamma describes the mathematical function relating the various possible voltage levels of the input luminance signal to the amounts of light we perceive, coming from the TV screen. That is, gamma relates input voltage to the subjective output luminance the display produces.

Each pixel of digital video has an associated luminance level, Y, giving the black-and-white information — a level of gray, ranging from black through white. Each pixel has two other components, Cb and Cr (also known as Pb and Pr) that fill in the color information.

Y, or signal luminance, is composed of a standardized mixture of red, green, and blue color signals: R, G, and B. Cr and Cb are "color difference" signals. For example, Cr is the difference between R (for red) and Y: R - Y. Cb is B (for blue) minus Y. The value of G - Y, and thus G (for green), can be computed when you know Y, Cr, and Cb.

Thus does a color TV transform YCrCb inputs into RGB components and display those. And it's why different gamma values for a display give you more intense or more pastel-like color, as well as greater or less depth of contrast: changing the TV's response to Y implicitly changes its handling of R, G, and B.
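As a toy illustration of that color-difference arithmetic, here's a sketch in Python using the Rec. 601 luma weights and the unscaled definitions Cr = R - Y and Cb = B - Y given above. Real video systems scale and offset these signals, so treat this as the idea rather than any standard's exact math:

# Toy illustration of luma and color-difference signals.
# Assumes Rec. 601 luma weights and unscaled Cr = R - Y, Cb = B - Y.

KR, KG, KB = 0.299, 0.587, 0.114   # weights of R, G, B in the luma signal Y

def to_y_cr_cb(r, g, b):
    y = KR * r + KG * g + KB * b
    return y, r - y, b - y          # Y, Cr, Cb

def to_rgb(y, cr, cb):
    r = y + cr                      # recover red from Y and Cr
    b = y + cb                      # recover blue from Y and Cb
    g = (y - KR * r - KB * b) / KG  # green follows from the luma equation
    return r, g, b

print(to_rgb(*to_y_cr_cb(0.8, 0.4, 0.2)))  # round-trips to (0.8, 0.4, 0.2)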

The input luminance levels represented by Y can be measured in volts. Voltage is an analog concept, of course, but it can be converted to digital form simply by registering the number of volts present at each particular instant of time, where every "instant" corresponds to one pixel.

Input luminance levels can likewise be stated in standard "IRE" units ranging from pure black (0 IRE) through all the dark, medium, and light shades of gray to pure white (100 IRE). There exists a straightforward formula to convert input voltage levels to IRE units and vice versa.

Sometimes black in the video signal is raised from 0 IRE to the level of 7.5 IRE, but this so-called "black-level setup" makes little difference to this discussion.
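For what it's worth, here's the sort of straightforward conversion I mean, sketched in Python under the usual analog convention that 140 IRE (sync tip at -40 up to peak white at +100) spans one volt; the exact millivolt figures are my reading of that convention, not something off this TV's spec sheet:

# Volts-to-IRE conversion, assuming 140 IRE spans 1 volt,
# so 1 IRE is about 7.14 millivolts above blanking (0 IRE).

MV_PER_IRE = 1000.0 / 140.0   # about 7.14 mV per IRE unit

def mv_to_ire(mv_above_blanking):
    return mv_above_blanking / MV_PER_IRE

def ire_to_mv(ire):
    return ire * MV_PER_IRE

print(round(ire_to_mv(100)))      # peak white: about 714 mV above blanking
print(mv_to_ire(ire_to_mv(7.5)))  # black with 7.5-IRE "setup" round-trips to 7.5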

Here are a few gamma curves relating input voltage, shown in IRE units, to output luminance as a percentage of the maximum possible light output the TV is capable of rendering (click on the image to see a larger version of the graph):

The straight blue line illustrates gamma = 1.0, not appropriate for TVs or computer monitors. The magenta curve represents the relatively low gamma of 1.8, common on Macintosh monitors, but not on TVs. The yellow curve is gamma = 2.5, as found on many PCs (and not much higher than on most TVs, whose gamma is usually in the range of 2.2 to 2.5).

Look again at the graph above. Each curve relating output luminance (as a perceived amount of light) to input voltage (as a video signal level in IRE units) rises, from left to right, as input voltage rises — but (except when gamma = 1.0) it doesn't rise at a constant rate. The "gamma curve" is instead bowed; it sags downward.

So the rate of rise of the gamma curve of a TV is relatively slight at lower IRE levels. At higher IRE levels the rate of rise increases, and it keeps increasing until the slope of the "gamma curve" reaches its maximum at 100 IRE.


Mathematically, gamma is actually the exponent of the input voltage-to-output luminance function of a display. Remember your high school math? An exponent of 1.0 would make the display's luminance function a straight line. A positive exponent greater than 1.0 would cause the function to curve, sagging downward. The higher the exponent, the greater the amount of curvature or sag.

The greater the curvature — the higher the gamma — the less the change in output luminance will be for any given change in input luminance at low IRE levels. With high gamma, darker images stay darker longer as the amount of light on the subject is gradually increased. There is less shadow detail.

But at higher IRE levels, as light elements in the B&W image approach pure white, the response of the high-gamma display is quite pronounced. Notice how the whites of the girl's eyes stand out more in the leftmost rendering.

Low-gamma displays, on the other hand, respond more rapidly than high-gamma displays do to increasing input voltages at low IRE levels. They respond less rapidly than high-gamma displays to increasing input voltages at high IRE levels.
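A quick worked example, in Python, makes the difference plain. The input levels and exponents below are arbitrary choices of mine, picked only to show how much harder a high exponent squeezes the low-IRE end of the scale:

# Output luminance (as a fraction of peak) for a few input levels,
# under gamma = 1.0, 1.8, and 2.5.

def output_fraction(ire, gamma):
    return (ire / 100.0) ** gamma

for ire in (10, 20, 50, 90):
    row = [round(output_fraction(ire, g), 3) for g in (1.0, 1.8, 2.5)]
    print(f"{ire:3d} IRE -> gamma 1.0: {row[0]}, 1.8: {row[1]}, 2.5: {row[2]}")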


Keep in mind that gamma is not the same thing as brightness and/or contrast, as affected by the user "brightness" and "contrast" controls of the display. When you set user brightness, you're setting the amount of light the TV will generate when it receives a minimal, 0-IRE, "black" signal.

Once the "black level" is set, changes to the user contrast control can be made to affect video "gain," a linear function of input-signal luminance. The contrast control is misnamed; it's really a "white level" control. It also proportionately affects the levels of all shades of gray above 0 IRE.

For example, the light output for a 50-IRE, medium-gray input signal will nominally be exactly half that for a 100-IRE signal — that is, if you temporarily ignore the nonlinearity of the display's gamma curve, it will be.

Say you adjust user contrast fairly high, such that 100 IRE produces all the light the TV is capable of producing. Ignoring gamma, a 50-IRE signal would be expected to give you exactly half the perceptible luminance of 100 IRE. (Here, when I speak of "perceptible" luminance, I'm intentionally glossing over the fact that the human eye also has a nonlinear response to the TV's light output.)

If you then reduce the contrast control by 10 percent, a 50-IRE signal will have 1/10 less light output than before, just as a 100-IRE signal will have 1/10 less light output than before.

The function of the contrast control is, as I say, linear. But gamma is an exponent, so it makes overall light output nonlinear. For any given black level/brightness setting, the operation of the linear contrast control combines with the display's nonlinear gamma to determine light output from the screen.
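Here's one plausible — and much simplified — model of that combination, sketched in Python. The real signal chain in any given TV is surely more involved, and whether the gain is applied before or after the power function will vary; this just shows why the two user controls can't substitute for changing the exponent itself:

# Simplified model: a linear contrast (gain) control and a black-level
# (brightness) control wrapped around a nonlinear display gamma.

def screen_luminance(ire, contrast=1.0, black_level=0.0, gamma=2.5):
    signal = max(0.0, min(1.0, ire / 100.0))
    return black_level + contrast * (signal ** gamma)  # linear gain on a nonlinear curve

# Cutting contrast by 10% scales every level by the same factor...
print(screen_luminance(50), screen_luminance(50, contrast=0.9))
# ...but only changing gamma alters the *shape* of the response:
print(screen_luminance(50, gamma=1.8), screen_luminance(50, gamma=2.5))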

That's why you can't always get your TV's picture to look contrasty enough (or perhaps the opposite, pale enough) just by tweaking your brightness and contrast controls. You may have to go into the service menu and tweak gamma.


The idea behind nonlinear gamma originated with CRTs. TV displays using picture tubes inherently have a gamma exponent in the numerical range of 2.2 to 2.5. If the TV signal weren't "gamma-corrected" at its source, it would look way too contrasty on a CRT. Ever since the advent of TV, signal originators have applied an inverse gamma function to the signal so that it will look right on a CRT. This is the process known as gamma correction.

Now look again at the two Goldilocks images above. The leftmost image, intended for display on a monitor with gamma = 1.8, has been gamma-corrected expressly for just such a display. The original image of the little girl as captured perhaps by a digital camera was subjected to an appropriate inverse gamma function so that when it is eventually rendered on a gamma=1.8 display, it will look right.

Likewise, the rightmost image was gamma-corrected with a different inverse function to look right on a gamma=2.5 display.
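A minimal sketch of that encode/decode round trip, in Python. The 1/1.8 and 1/2.5 encoding exponents here are the textbook idealization discussed above, not any particular camera's transfer curve:

# Gamma correction: the source applies an inverse power function so that
# the display's own power function undoes it.

def gamma_correct(linear_value, display_gamma):
    return linear_value ** (1.0 / display_gamma)   # done at the source

def display(encoded_value, display_gamma):
    return encoded_value ** display_gamma          # done by the monitor or TV

scene = 0.18   # a mid-gray scene luminance, as a fraction of peak
for g in (1.8, 2.5):
    print(g, display(gamma_correct(scene, g), g))  # round-trips to 0.18 on the matching display

# Mismatch: an image corrected for 1.8 but shown on a 2.5 display comes out darker/contrastier.
print(display(gamma_correct(scene, 1.8), 2.5))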

Nowadays, we have a lot of non-CRT based display technologies: plasma, DLP, LCD, LCoS, D-ILA, etc. Their inherent gamma may be linear (i.e., equal to 1.0, for a straight-line voltage-to-luminance function). Or else, in some other way or ways, their inherent gamma is apt to be other than that of a CRT.

Which means that when they receive a signal that has been gamma-corrected expressly for a CRT, they'd better figure out some way to imitate a CRT. Thanks to digital signal processing, they can indeed take a healthy stab at it.

Theoretically, the engineers who design a TV know exactly what digital function(s) to apply to the input signal to compensate for the non-CRT-like gamma of their display. In practice, though, it's not that simple. For one thing, the inherent gamma function of the display may not be as simple as input voltage raised to a single power:

(output luminance) = (input voltage)^gamma

If it were, when drawn on logarithmic axes, the curve would look like a straight line. But what if the log-log plot is not a straight line? What if it uniformly curves? Or, what if the logarithmic plot is a wriggling snake with a non-uniform curvature?

Even worse, what if it's a writhing snake whose bends and esses change their shape as the user adjusts the brightness and contrast controls?
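The log-log test is easy to sketch in Python. The "measured" responses below are invented purely for illustration; the idea is just that a pure power law gives a constant slope on log-log axes, while a lookup-table "snake" doesn't:

import math

def loglog_slopes(inputs, outputs):
    """Slope between successive points on log-log axes; constant slope => pure power law."""
    pts = [(math.log(x), math.log(y)) for x, y in zip(inputs, outputs) if x > 0 and y > 0]
    return [(y2 - y1) / (x2 - x1) for (x1, y1), (x2, y2) in zip(pts, pts[1:])]

inputs = [0.1, 0.2, 0.4, 0.6, 0.8, 1.0]
pure = [x ** 2.5 for x in inputs]              # ideal gamma = 2.5
bent = [x ** (2.5 - 0.8 * x) for x in inputs]  # a "snake": the exponent drifts with level

print(loglog_slopes(inputs, pure))   # all roughly 2.5
print(loglog_slopes(inputs, bent))   # slope wanders -- no single exponent fits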

And, furthermore, what if the perceptual responses of the human eye are entirely different for an ultra-bright plasma display or LCD panel in a well-lit room than for a relatively dim picture tube in a semi-darkened room?


All these factors play into the gamma we "see" when we turn on our TVs.

To take the last first, the eye responds differently to image contrast in a darkened viewing room than in a brightly lit one. There are a number of factors contributing to this situation, including the so-called "surround effect," which dictates that how light or dark the area around the image is will affect the amount of contrast we see in the image.

Specifically, a completely black surrounding area immediately adjacent to the screen makes all parts of the image on the screen, light or dark, seem lighter. Meanwhile, it decreases the apparent contrast of the overall image.

Conversely, an all-white surround darkens the image and gives it more apparent contrast. Intermediate surrounds, accordingly, have intermediate effects on perceived image lightness/darkness and contrast.

Another factor affecting that gamma we "see" is the eye's adaptation to the dark. If we walk into a movie theater while the movie is showing and the lights are dim, our eyes at first will see an image with "too little" contrast on the screen. As our eyes adapt to the dark, the contrast in the film image will gradually emerge, until we eventually feel that it's "just right."

The same applies to a TV image. If we feel there's too little contrast, we can dim the lights.

Accordingly, TVs that are intended to be used in brightly lit rooms, in which people's pupils are typically contracted and admit less light, need to have the contrast or "white level" control set higher. This is, for instance, why my Hitachi plasma has "Day" and "Night" user settings. The former is for use in a bright environment, the latter in a dim one.

But the brightness of the ambience affects more than just contrast per se. It also affects the gamma we "see": how quickly contrast changes "kick in" at low IRE levels, well below the TV's maximum white level, versus how fast they affect the picture at higher IRE levels.


There is such a thing as "system gamma," also known as the "end-to-end power function" or "end-to-end exponent." A TV image is expected to be decoded with a gamma exponent of (say) 2.5. It is accordingly gamma-corrected (see above for the definition of gamma correction) using an inverse "power function" whose exponent is (say) 0.5. The product of these two exponents, 0.5 and 2.5, is not a linear 1.0, as one might expect, but 1.25 ... about right for TV viewing in a dim environment.

In a really dark environment, such as in a movie theater, the "encoding exponent" (a characteristic of the camera negative film, in this case) is intentionally made higher: 0.6. The color print film has a "decoding exponent" of, again, about gamma = 2.5. The product of the two exponents (0.6 x 2.5 = 1.5) gives proper results in a nearly totally dark environment.

What can be done if the viewing environment is bright? In this case, the end-to-end exponent ought to be about 1.125. If the encoding exponent is 0.5, which we have said is ideal for TV viewed in a dim room, and if the image is viewed instead in bright surroundings on a gamma=2.5 display, the end-to-end exponent will be (as computed above) 1.25: too high, giving too little shadow detail.

There are two possible remedies. One, we can lower the encoding exponent to 0.45, making the end-to-end "system gamma" the desired 1.125. But that implies the program originator knows the viewer will be watching the image in a "too bright" environment ...

... or, two, we can lower the decoding exponent from a nominal gamma=2.5 to, say, 2.25. That makes the end-to-end exponent the desired 1.125 while the encoding exponent remains the standard 0.5.
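The arithmetic behind those examples is trivial, but worth setting down in one place — here as a few lines of Python, using only the exponent values already quoted above:

# End-to-end "system gamma": the encoding exponent times the decoding
# (display) exponent gives the overall power function from scene to screen.

def system_gamma(encoding_exponent, decoding_exponent):
    return encoding_exponent * decoding_exponent

print(system_gamma(0.5, 2.5))    # 1.25  -- about right for a dim room
print(system_gamma(0.6, 2.5))    # 1.5   -- film in a dark theater
print(system_gamma(0.45, 2.5))   # 1.125 -- remedy one: lower the encoding exponent
print(system_gamma(0.5, 2.25))   # 1.125 -- remedy two: lower the display gamma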

For several reasons, the exact exponent values I've used for illustrative purposes in the above examples need to be taken with a heavy grain of salt. For one thing, there's often a slight difference between the actual encoding exponent used in a television system and the "advertised" exponent. For technical reasons, TV cameras and associated broadcast gear may have an advertised exponent of 0.45 when the effective encoding exponent is 0.5. (See Charles Poynton, Digital Video and HDTV: Algorithms and Interfaces, p. 85.) So the calculations I've made may not jibe with similar calculations made by others.

Still and all, the overarching point remains clear: watching a TV image in a relatively bright environment when it was intended to be viewed in a much dimmer one can make the use of a lower-than-standard display gamma a must. When you add in the contrast-boosting effect of a bright surrounding area immediately adjacent to the screen — a.k.a. the "surround effect" — you have yet another reason to lower display gamma.


If you invert the above reasoning and apply it to watching a TV image in a quite dark home-theater setting, you find that display gamma may have to be raised to compensate for the darker-than-expected viewing environment.

We can conclude that a TV intended for a brightly lit viewing room needs to have the ability to produce a lot of light at high IRE levels. Often, the TV's high "contrast ratio" is cited in this regard: the ratio between its peak output brightness, measured in candelas per square meter or in foot-Lamberts, and its black level. It also needs to have a lower-than-standard gamma.

A TV intended for a truly dark (not just dim) viewing environment also needs a goodly contrast ratio, but its peak brightness in absolute terms can be less. Typically, CRT-based TVs have less peak brightness than any of the newer display types: plasma, DLP, LCD, etc. But they also produce darker blacks, so their contrast ratio is just as good.

But in a super-dark viewing room, this display's gamma may need to be raised slightly above what would be optimal in a just-dim room. Otherwise, the image may seem washed out and deficient in contrast, even though its peak-to-black contrast ratio is high.

Complicating all of the above is the fact that today's digital TVs typically don't use a simple gamma curve representing voltage raised to a single power that is equally applicable at every IRE level of input. Here is a quote from a recent HDTV review — "Sharp LC-45GX6U AQUOS™ 45 Inch LCD HDTV" in the July 2005 Widescreen Review — that makes the point:

There are no user gamma settings. The gamma measured approximately 2.3 at 10 and 20 IRE, and then rolled off gradually to about 1.65 at 60 IRE. It stayed almost constant above 60 IRE. The low gamma at higher signal levels compresses brightness levels and makes it harder to discern small differences in bright details.

So gamma was measured by the reviewer's instruments at 2.3 at low IRE levels and dropped to a very low 1.65 by the time Y, the signal luminance value, reached 60 IRE. From 60 IRE to 100 IRE it remained at around 1.65. Yet the only fault the reviewer cited related to the low gamma value at higher IRE levels was "some loss of texture on bright surfaces, which appeared smoother than usual."
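For the curious, here's one common way such per-level gamma figures can be derived — measure luminance at each IRE step, normalize to the 100-IRE reading, and solve for the exponent at that point. I'm assuming this is roughly what the reviewer's instruments do; the sample measurements below are made up, chosen only to reproduce the shape of the "rolling off" track the review describes:

import math

def gamma_at(ire, luminance, peak_luminance):
    """Solve L/Lpeak = (IRE/100)^gamma for gamma at one measurement point."""
    return math.log(luminance / peak_luminance) / math.log(ire / 100.0)

# Made-up luminance readings (arbitrary units) illustrating a gamma track
# that starts near 2.3 at low IRE and rolls off toward 1.65 by 60 IRE:
measurements = {10: 0.5, 20: 2.5, 60: 43.0, 100: 100.0}
for ire, lum in measurements.items():
    if ire < 100:
        print(ire, round(gamma_at(ire, lum, measurements[100]), 2))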

A term sometimes used for the variable gamma exhibited by modern digital displays is "gamma tracking." Most reviews which mention gamma tracking as such seem to assume that gamma ought to be constant from one IRE level to another. If a TV has other than constant gamma, that is considered bad.

Yet, obviously, the designers of the Sharp LCD display mentioned above, whose list price is a whopping $7,500, don't agree. Nor does the Widescreen Review reviewer, Greg Rogers, have a lot to say about the "weird" gamma tracking of this TV, other than "some loss of texture on bright surfaces" and a general wish that Sharp had incorporated user-accessible gamma adjustments in their TV.


I can only conclude, given all the above, that any preconceived notions we may bring to the table about "proper" or "correct" gamma in a TV display are probably questionable. The gamma that is "right" is apt to depend on the viewing circumstances — ambient lighting, surround effects — as well as the gamma correction applied to the transmitted video as it is originally encoded.

Also, modern TVs often have variable gamma figures at different IRE levels. Furthermore, they often have selectable gamma curves (all of them possibly with variable gamma tracking) which can be accessed either from a user menu or the TV's service menu.

And finally, I would be remiss if I didn't mention subjective user preferences. We all like more "pop" from some signal sources, more shadow detail for others. Tailorable gamma would let us pick the gamma curve we like best, on a source-by-source basis. Very few TVs offer that — except, I believe, for ultra-pricey front projectors. Someday, though, as consumers get more finicky, we may see gamma-tailorable user controls in "everyday" TV sets.