Sunday, June 19, 2005

HDTV Quirks, Part II, Gamma

Gamma is a key characteristic of TV displays that is not at all easy to understand. It also applies to graphics displayed on computer monitors. Herein, an attempt to explain gamma, starting with computer applications (briefly), and then moving on to gamma in TV and video.

Here are two images, borrowed from this web page about Adobe Photoshop renderings:

Goldilocks #1
Goldilocks #2


Depending on your type of computer monitor, one or the other of these two renditions may look "right" to you, the other "wrong."

The image on the left, Goldilocks #1, is intended for a Macintosh monitor with a relatively low gamma figure of 1.8. Goldilocks #2 has been encoded to look "correct" on a PC (i.e., non-Macintosh) monitor with a relatively high gamma figure of 2.5.

Broadly speaking, gamma in computer graphics is the relationship between how "contrasty" the image-as-encoded is supposed to be, and how it actually looks on a particular monitor. Gamma can be thought of as the monitor's "depth of contrast."

Notice the shadowy areas in Goldilocks #1. Their relatively deep shades look darker and more contrasty than they do in Goldilocks #2, and you can't see as much shadow detail; Goldilocks #2 preserves far more of it.

If you look at Goldilocks #1 on a gamma=2.5 monitor, she'll probably look too contrasty, and Goldilocks #2 will look "just right."

If you look at Goldilocks #2 on a gamma=1.8 monitor, she'll probably look too pale. Goldilocks #1 will look "just right."

On any monitor, though, the image on the left will seem to have more/deeper contrast, along with stronger color, than the one on the right.

These two renditions also serve to illustrate the subject of gamma as it applies to TVs. In video, gamma describes the mathematical function relating the various possible voltage levels of the input luminance signal to the amounts of light we perceive, coming from the TV screen. That is, gamma relates input voltage to the subjective output luminance the display produces.

Each pixel of digital video has an associated luminance level, Y, giving the black-and-white information — a level of gray, ranging from black through white. Each pixel has two other components, Cb and Cr (their analog counterparts are labeled Pb and Pr), that fill in the color information.

Y, or signal luminance, is composed of a standardized mixture of red, green, and blue color signals: R, G, and B. Cr and Cb are "color difference" signals. For example, Cr is the difference between R (for red) and Y: R - Y. Cb is B (for blue) minus Y. The value of G - Y, and thus G (for green), can be computed when you know Y, Cr, and Cb.

Thus does a color TV transform YCbCr inputs into RGB components and display those. And it's why different gamma values for a display give you more intense or more pastel-like color, as well as greater or less depth of contrast: changing the TV's response to Y implicitly changes its handling of R, G, and B.
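
To make the relationship concrete, here's a little Python sketch of the idea, using the standard-definition (BT.601) luma weights. Real video also scales and offsets the color-difference signals; I'm ignoring that here so the Y/R/G/B arithmetic stays visible.

    # Minimal sketch of the luma / color-difference idea (BT.601 luma weights).
    # Real systems scale and offset Cb and Cr; that is omitted here.

    def rgb_to_ycc(r, g, b):
        """R, G, B in the range 0.0-1.0 -> (Y, Cb, Cr) as plain differences."""
        y = 0.299 * r + 0.587 * g + 0.114 * b   # standardized mix of R, G, B
        cb = b - y                               # "blue minus luma"
        cr = r - y                               # "red minus luma"
        return y, cb, cr

    def ycc_to_rgb(y, cb, cr):
        """Invert the transform: recover R and B directly, then solve for G."""
        r = cr + y
        b = cb + y
        g = (y - 0.299 * r - 0.114 * b) / 0.587  # G falls out of the luma equation
        return r, g, b

    print(ycc_to_rgb(*rgb_to_ycc(0.8, 0.4, 0.2)))  # roughly (0.8, 0.4, 0.2)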

The input luminance levels represented by Y can be measured in volts. Voltage is an analog concept, of course, but it can be converted to digital form simply by registering the number of volts present at each particular instant of time, where every "instant" corresponds to one pixel.

Input luminance levels can likewise be stated in standard "IRE" units ranging from pure black (0 IRE) through all the dark, medium, and light shades of gray to pure white (100 IRE). There exists a straightforward formula to convert input voltage levels to IRE units and vice versa.

Sometimes black in the video signal is raised from 0 IRE to the level of 7.5 IRE, but this so-called "black-level setup" makes little difference to this discussion.
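
For the curious, here's a tiny sketch of the volts-to-IRE arithmetic, assuming the usual NTSC convention that the full 1-volt peak-to-peak signal spans 140 IRE (sync tip at -40 IRE, blanking at 0 IRE, reference white at +100 IRE), so that 1 IRE is 1/140 of a volt:

    # Volts <-> IRE conversion, assuming the NTSC convention of 140 IRE per volt.

    IRE_PER_VOLT = 140.0

    def volts_to_ire(volts_above_blanking):
        return volts_above_blanking * IRE_PER_VOLT

    def ire_to_volts(ire):
        return ire / IRE_PER_VOLT

    print(ire_to_volts(100))    # about 0.714 volts above blanking for reference white
    print(volts_to_ire(0.357))  # about 50 IRE, a medium gray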

Here are a few gamma curves relating input voltage, shown in IRE units, to output luminance as a percentage of the maximum possible light output the TV is capable of rendering:

The straight blue line illustrates gamma = 1.0, not appropriate for TVs or computer monitors. The magenta curve represents the relatively low gamma of 1.8, common on Macintosh monitors but not on TVs. The yellow curve is gamma = 2.5, as found on many PC monitors; that figure is also close to the gamma of most TVs, which usually falls in the range of 2.2 to 2.5.

Look again at the graph above. Each curve relating output luminance (as a perceived amount of light) to input voltage (as a video signal level in IRE units) rises, from left to right, as input voltage rises — but (except when gamma = 1.0) it doesn't rise at a constant rate. The "gamma curve" is instead bowed; it sags downward.

So the rate of rise of the gamma curve of a TV is relatively slight at lower IRE levels. At higher IRE levels the rate of rise increases, and it keeps increasing until the slope of the "gamma curve" reaches its maximum at 100 IRE.


Mathematically, gamma is actually the exponent of the input voltage-to-output luminance function of a display. Remember your high school math? An exponent of 1.0 would make the display's luminance function a straight line. A positive exponent greater than 1.0 would cause the function to curve, sagging downward. The higher the exponent, the greater the amount of curvature or sag.
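
To put numbers on that, here's a quick sketch that tabulates output luminance (as a percentage of peak) for a few input levels, using the simple power-law model and the same three gamma values plotted above:

    # Tabulate (input IRE) -> (output luminance, % of peak) for a few gammas,
    # using the simple power-law model: output = (input / 100) ** gamma.

    def output_percent(ire, gamma):
        return 100.0 * (ire / 100.0) ** gamma

    for ire in (10, 25, 50, 75, 100):
        row = "  ".join(f"{output_percent(ire, g):6.1f}" for g in (1.0, 1.8, 2.5))
        print(f"{ire:3d} IRE -> gamma 1.0 / 1.8 / 2.5: {row} %")

Note how, at 50 IRE, the gamma=2.5 curve is putting out only about 18 percent of peak light, while the gamma=1.8 curve is already up to about 29 percent. That's the "sag" in action.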

The greater the curvature — the higher the gamma — the less the change in output luminance will be for any given change in input luminance at low IRE levels. With high gamma, dark picture content stays dark longer as the input signal level gradually rises. There is less shadow detail.

But at higher IRE levels, as light elements in the B&W image approach pure white, the response of the high-gamma display is quite pronounced. Notice how the whites of the girl's eyes stand out more in the leftmost rendering.

Low-gamma displays, on the other hand, respond more rapidly than high-gamma displays do to increasing input voltages at low IRE levels. They respond less rapidly than high-gamma displays to increasing input voltages at high IRE levels.


Keep in mind that gamma is not the same thing as brightness and/or contrast, as affected by the user "brightness" and "contrast" controls of the display. When you set user brightness, you're setting the amount of light the TV will generate when it receives a minimal, 0-IRE, "black" signal.

Once the "black level" is set, changes to the user contrast control can be made to affect video "gain," a linear function of input-signal luminance. The contrast control is misnamed; it's really a "white level" control. It also proportionately affects the levels of all shades of gray above 0 IRE.

For example, the light output for a 50-IRE, medium-gray input signal will nominally be exactly half that for a 100-IRE signal — that is, if you temporarily ignore the nonlinearity of the display's gamma curve, it will be.

Say you adjust user contrast fairly high, such that 100 IRE produces all the light the TV is capable of producing. Ignoring gamma, a 50-IRE signal would be expected to give you exactly half the perceptible luminance of 100 IRE. (Here, when I speak of "perceptible" luminance, I'm intentionally glossing over the fact that the human eye also has a nonlinear response to the TV's light output.)

If you then reduce the contrast control by 10 percent, a 50-IRE signal will have 1/10 less light output than before, just as a 100-IRE signal will have 1/10 less light output than before.

The function of the contrast control is, as I say, linear. But gamma is an exponent, so it makes overall light output nonlinear. For any given black level/brightness setting, the operation of the linear contrast control combines with the display's nonlinear gamma to determine light output from the screen.
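
Here's a toy model of that interaction. It assumes the contrast control is a plain linear gain applied to the signal before the display's power-law gamma, and that the brightness control simply sets the light output at black; real TVs are messier than this, but the point about a linear control feeding a nonlinear display comes through:

    # Toy model: brightness sets the light output at 0 IRE, contrast is a linear
    # gain on the signal, and the display then applies its power-law gamma.

    def light_output(ire, gamma=2.5, contrast=1.0, brightness=0.0):
        drive = min(contrast * ire / 100.0, 1.0)   # linear gain, clipped at full drive
        return brightness + (1.0 - brightness) * drive ** gamma

    # With gamma = 1.0, a 10 percent cut in gain is exactly a 10 percent cut in
    # light; with gamma = 2.5, the same cut removes a larger fraction of the light.
    for gamma in (1.0, 2.5):
        full = light_output(100, gamma=gamma, contrast=1.0)
        cut = light_output(100, gamma=gamma, contrast=0.9)
        print(f"gamma {gamma}: 100 IRE light goes from {full:.3f} to {cut:.3f}")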

That's why you can't always get your TV's picture to look contrasty enough (or perhaps the opposite, pale enough) just by tweaking your brightness and contrast controls. You may have to go into the service menu and tweak gamma.


The idea behind nonlinear gamma originated with CRTs. TV displays using picture tubes inherently have a gamma exponent in the numerical range of 2.2 to 2.5. If the TV signal weren't "gamma-corrected" at its source, it would look way too contrasty on a CRT. Ever since the advent of TV, signal originators have applied an inverse gamma function to the signal so that it will look right on a CRT. This is the process known as gamma correction.

Now look again at the two Goldilocks images above. The leftmost image, intended for display on a monitor with gamma = 1.8, has been gamma-corrected expressly for just such a display. The original image of the little girl, as captured perhaps by a digital camera, was subjected to an appropriate inverse gamma function so that when it is eventually rendered on a gamma=1.8 display, it will look right.

Likewise, the rightmost image was gamma-corrected with a different inverse function to look right on a gamma=2.5 display.
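
A minimal sketch of that round trip, assuming pure power laws on both ends (real standards such as Rec. 709 add a short linear segment near black, which I'm ignoring here):

    # Gamma-correction round trip, assuming pure power laws.
    # The source applies the inverse exponent (1/gamma); the display's native
    # response raises the signal back by gamma, restoring linear light.

    DISPLAY_GAMMA = 2.5

    def gamma_correct(linear_light):      # done at the camera / source
        return linear_light ** (1.0 / DISPLAY_GAMMA)

    def crt_response(signal):             # what the picture tube does
        return signal ** DISPLAY_GAMMA

    scene = 0.18                          # a mid-gray in linear light
    encoded = gamma_correct(scene)        # about 0.50: codes near the middle of the range
    print(encoded, crt_response(encoded)) # the second value is about 0.18 again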

Nowadays, we have a lot of non-CRT based display technologies: plasma, DLP, LCD, LCoS, D-ILA, etc. Their inherent gamma may be linear (i.e., equal to 1.0, for a straight-line voltage-to-luminance function). Or else, in some other way or ways, their inherent gamma is apt to be other than that of a CRT.

Which means that when they receive a signal that has been gamma-corrected expressly for a CRT, they'd better figure out some way to imitate a CRT. Thanks to digital signal processing, they can indeed take a healthy stab at it.

Theoretically, the engineers who design a TV know exactly what digital function(s) to apply to the input signal to compensate for the non-CRT-like gamma of their display. In practice, though, it's not that simple. For one thing, the inherent gamma function of the display may not be as simple as input voltage raised to a single power:

(output luminance) = (input voltage) ^ gamma

If it were, when drawn on logarithmic axes, the curve would look like a straight line. But what if the log-log plot is not a straight line? What if it uniformly curves? Or, what if the logarithmic plot is a wriggling snake with a non-uniform curvature?

Even worse, what if it's a writhing snake whose bends and esses change their shape as the user adjusts the brightness and contrast controls?

And, furthermore, what if the perceptual responses of the human eye are entirely different for an ultra-bright plasma display or LCD panel in a well-lit room than for a relatively dim picture tube in a semi-darkened room?


All these factors play into the gamma we "see" when we turn on our TVs.

To take the last first, the eye responds differently to image contrast in a darkened viewing room than in a brightly lit one. A number of factors contribute to this, including the so-called "surround effect": how light or dark the area around the image is affects the amount of contrast we see in the image.

Specifically, a completely black surrounding area immediately adjacent to the screen makes all parts of the image on the screen, light or dark, seem lighter. Meanwhile, it decreases the apparent contrast of the overall image.

Conversely, an all-white surround darkens the image and gives it more apparent contrast. Intermediate surrounds, accordingly, have intermediate effects on perceived image lightness/darkness and contrast.

Another factor affecting that gamma we "see" is the eye's adaptation to the dark. If we walk into a movie theater while the movie is showing and the lights are dim, our eyes at first will see an image with "too little" contrast on the screen. As our eyes adapt to the dark, the contrast in the film image will gradually emerge, until we eventually feel that it's "just right."

The same applies to a TV image. If we feel there's too little contrast, we can dim the lights.

Accordingly, TVs that are intended to be used in brightly lit rooms, in which people's pupils are typically contracted and admit less light, need to have the contrast or "white level" control set higher. This is, for instance, why my Hitachi plasma has "Day" and "Night" user settings. The former is for use in a bright environment, the latter in a dim one.

But the brightness of the ambience affects more than just contrast per se. It also affects the gamma we "see": how quickly contrast changes "kick in" at low IRE levels, well below the TV's maximum white level, versus how strongly they affect the picture at higher IRE levels.


There is such a thing as "system gamma," also known as the "end-to-end power function" or "end-to-end exponent." A TV image is expected to be decoded with a gamma exponent of (say) 2.5. It is accordingly gamma-corrected (see above for the definition of gamma correction) using an inverse "power function" whose exponent is (say) 0.5. The product of these two exponents, 0.5 and 2.5, is not a linear 1.0, as one might expect, but 1.25 ... about right for TV viewing in a dim environment.

In a really dark environment, such as in a movie theater, the "encoding exponent" (a characteristic of the camera negative film, in this case) is intentionally made higher: 0.6. The color print film has a "decoding exponent" of, again, about gamma = 2.5. The product of the two exponents (0.6 x 2.5 = 1.5) gives proper results in a nearly totally dark environment.

What can be done if the viewing environment is bright? In this case, the end-to-end exponent ought to be about 1.125. If the encoding exponent is 0.5, which we have said is ideal for TV viewed in a dim room, and if the image is viewed instead in bright surroundings on a gamma=2.5 display, the end-to-end exponent will be (as computed above) 1.25: too high, giving too little shadow detail.

There are two possible remedies. One, we can lower the encoding exponent to 0.45, making the end-to-end "system gamma" the desired 1.125. But that implies the program originator knows the viewer will be watching the image in a "too bright" environment ...

... or, two, we can lower the decoding exponent from a nominal gamma=2.5 to, say, 2.25. That makes the end-to-end exponent the desired 1.125 while the encoding exponent remains the standard 0.5.
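
The arithmetic behind those numbers fits in a few lines. The exponents below are the same illustrative values used above, not exact standards figures:

    # End-to-end ("system") gamma is simply the product of the encoding
    # exponent and the decoding (display) exponent.

    def system_gamma(encode_exp, decode_exp):
        return encode_exp * decode_exp

    print(system_gamma(0.5, 2.5))    # 1.25  : about right for a dim room
    print(system_gamma(0.6, 2.5))    # 1.5   : film in a nearly dark theater
    print(system_gamma(0.5, 2.25))   # 1.125 : lowered display gamma for a bright room
    print(system_gamma(0.45, 2.5))   # 1.125 : or a lowered encoding exponent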

For several reasons, the exact exponent values I've used for illustrative purposes in the above examples need to be taken with a heavy grain of salt. For one thing, there's often a slight difference between the actual encoding exponent used in a television system and the "advertised" exponent. For technical reasons, TV cameras and associated broadcast gear may have an advertised exponent of 0.45 when the effective encoding exponent is 0.5. (See Charles Poynton, Digital Video and HDTV: Algorithms and Interfaces, p. 85.) So the calculations I've made may not jibe with similar calculations made by others.

Still and all, the overarching point remains clear: watching a TV image in a relatively bright environment when it was intended to be viewed in a much dimmer one can make the use of a lower-than-standard display gamma a must. When you add in the contrast-boosting effect of a bright surrounding area immediately adjacent to the screen — a.k.a. the "surround effect" — you have yet another reason to lower display gamma.


If you invert the above reasoning and apply it to watching a TV image in a quite dark home-theater setting, you find that display gamma may have to be raised to compensate for the darker-than-expected viewing environment.

We can conclude that a TV intended for a brightly lit viewing room needs to have the ability to produce a lot of light at high IRE levels. Often, the TV's high "contrast ratio" is cited in this regard: the ratio between its peak output brightness, measured in candelas per square meter or in foot-Lamberts, and its black level. It also needs to have a lower-than-standard gamma.

A TV intended for a truly dark (not just dim) viewing environment also needs a goodly contrast ratio, but its peak brightness in absolute terms can be less. Typically, CRT-based TVs have less peak brightness than any of the newer display types: plasma, DLP, LCD, etc. But they also produce darker blacks, so their contrast ratio is just as good.

But in a super-dark viewing room, such a display's gamma may need to be raised slightly above what would be optimal in a merely dim room. Otherwise, the image may seem washed out and deficient in contrast, even though its peak-to-black contrast ratio is high.

Complicating all of the above is the fact that today's digital TVs typically don't use a simple gamma curve representing voltage raised to a single power that is equally applicable at every IRE level of input. Here is a quote from a recent HDTV review — "Sharp LC-45GX6U AQUOS™ 45 Inch LCD HDTV" in the July 2005 Widescreen Review — that makes the point:

There are no user gamma settings. The gamma measured approximately 2.3 at 10 and 20 IRE, and then rolled off gradually to about 1.65 at 60 IRE. It stayed almost constant above 60 IRE. The low gamma at higher signal levels compresses brightness levels and makes it harder to discern small differences in bright details.

So gamma was measured by the reviewer's instruments at 2.3 at low IRE levels and dropped to a very low 1.65 by the time Y, the signal luminance value, reached 60 IRE. From 60 IRE to 100 IRE it remained at around 1.65. Yet the only fault the reviewer cited related to the low gamma value at higher IRE levels was "some loss of texture on bright surfaces, which appeared smoother than usual."
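
For what it's worth, a measurement like that amounts to computing an effective exponent at each stimulus level by comparing the normalized input to the normalized measured luminance. Here's a sketch of that calculation; the luminance figures are invented to roughly reproduce the gammas quoted above, not the reviewer's actual data:

    import math

    # Effective gamma at a given stimulus level:
    #     gamma_eff = log(L / L_peak) / log(IRE / 100)
    # where L is the measured luminance and L_peak is the luminance at 100 IRE.

    def effective_gamma(ire, luminance, peak_luminance):
        return math.log(luminance / peak_luminance) / math.log(ire / 100.0)

    # Hypothetical normalized luminance readings (L_peak = 1.0 at 100 IRE).
    measurements = {10: 0.005, 20: 0.025, 60: 0.43}
    for ire, lum in measurements.items():
        print(ire, "IRE -> effective gamma", round(effective_gamma(ire, lum, 1.0), 2))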

A term sometimes used for the variable gamma exhibited by modern digital displays is "gamma tracking." Most reviews which mention gamma tracking as such seem to assume that gamma ought to be constant from one IRE level to another. If a TV has other than constant gamma, that is considered bad.

Yet, obviously, the designers of the Sharp LCD display mentioned above, whose list price is a whopping $7,500, don't agree. Nor does the Widescreen Review reviewer, Greg Rogers, have a lot to say about the "weird" gamma tracking of this TV, other than "some loss of texture on bright surfaces" and a general wish that Sharp had incorporated user-accessible gamma adjustments in their TV.


I can only conclude, given all the above, that any preconceived notions we may bring to the table about "proper" or "correct" gamma in a TV display are probably questionable. The gamma that is "right" is apt to depend on the viewing circumstances — ambient lighting, surround effects — as well as the gamma correction applied to the transmitted video as it is originally encoded.

Also, modern TVs often have variable gamma figures at different IRE levels. Furthermore, they often have selectable gamma curves (all of them possibly with variable gamma tracking) which can be accessed either from a user menu or from the TV's service menu.

And finally, I would be remiss if I didn't mention subjective user preferences. We all like more "pop" from some signal sources, more shadow detail for others. Tailorable gamma would let us pick the gamma curve we like best, on a source-by-source basis. Very few TVs offer that — except, I believe, for ultra-pricey front projectors. Someday, though, as consumers get more finicky, we may see gamma-tailorable user controls in "everyday" TV sets.
