Monday, June 19, 2006

Gamma, Again! (Part I)

In the past I have tried several times to explain gamma, a crucial but hard-to-understand characteristic of TV displays (and also computer monitors) which affects contrast, brightness, hue, and saturation, and thus how the TV picture looks to the eye. See "HDTV Quirks, Part II, Gamma" and "Powerful LUTs for Gamma" for two of my more elaborate attempts. Now I'd like to bring up the subject again and perhaps correct some of the mistaken impressions I left before.

Basically, a TV's gamma is a "transfer function" that decides how much luminance it will produce at various input signal levels:



At every possible level of the input video signal, V, the TV will produce a certain amount of light output. This is its apparent brightness or luminance, L. Mathematically, the functional relationship is:

L = V^γ

where γ, the Greek letter gamma, is the exponent of the V-to-L function.

A (rough) synonym of luminance is intensity. Luminance or intensity may be stated in absolute units such as candelas per square meter (cd/m²) or foot-lamberts (ft-L). In the graph above it is represented on the vertical axis in relative units from 0.0 to 1.0.

The input video signal may be a voltage in millivolts, for analog signals, or an integer code value (also known as a pixel value) for digital signals. The V in the equation above stands for either the voltage or the code value. The code values that determine luminance or intensity are integers in the range 0-255 for computer applications, or 16-235 for video applications. IRE units from 0 (or 7.5) to 100 may also be used to represent either analog or digital signal levels.

Along the horizontal axis I put relative signal-level units from 0.0 to 1.0, and I omit the 0.0 level (pure or reference black, that is) entirely, to avoid trying to take the logarithm of zero later on. 1.0 represents peak white, also known as reference white. Everything between 0.0 and 1.0 is a shade of gray; for now, I'm ignoring color.

Setting aside gamma 1.0 (the red graph), a TV's "gamma curve" is nonlinear. The blue graph, representing gamma 1.8, is in a sense less nonlinear than the orange graph, representing gamma 2.5. The higher the TV's gamma number, the less linear is the TV's luminance output as a function of its input video signal.
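
To put rough numbers on those curves, here is a minimal Python sketch (my own illustration, not anything taken from the graph itself) that evaluates L = V^γ at a few relative signal levels for the three gammas plotted above:

    # Relative luminance L = V**gamma, with V and L both in relative
    # units (0.0 = reference black, 1.0 = peak white).
    def luminance(v, gamma):
        return v ** gamma

    for v in (0.1, 0.25, 0.5, 0.75, 1.0):
        print(v, [round(luminance(v, g), 3) for g in (1.0, 1.8, 2.5)])
    # At v = 0.5, for instance, gamma 1.0 gives 0.5, gamma 1.8 gives
    # about 0.287, and gamma 2.5 gives about 0.177: the higher the
    # gamma, the darker the mid-tones come out relative to peak white.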


The higher the gamma, the more image contrast there seems to be:

[Two sample photos: the one on the left encoded for a display gamma of 1.8, the one on the right encoded for a display gamma of 2.5]


The photo on the left looks best when the monitor's gamma is 1.8 and is "too contrasty" when the monitor's gamma is 2.5. The Goldilocks on the right looks "just right" at gamma 2.5 and looks too washed out at gamma 1.8. So what an image ultimately looks like depends not only on the decoding gamma of the monitor or TV, but also on the encoding gamma that has been used in creating the image.

Encoding gamma needs to bear a roughly inverse relationship to decoding gamma. If the monitor has decoding gamma 2.5, the encoding gamma exponent ought to be roughly 1/2.5, or 0.4. In practice, for technical reasons having to do with "rendering intent," encoding gamma for gamma-2.5 television images is often modified slightly, to 0.5. The process of applying encoding gamma to TV images prior to broadcast is called gamma correction.
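
As a rough sketch of how encoding and decoding gamma offset each other (the 0.5 and 2.5 exponents are the ones mentioned above; the code itself is just my illustration):

    # Gamma-correct a scene value with exponent 0.5, then display it
    # on a gamma-2.5 monitor. The end-to-end exponent is 0.5 * 2.5 =
    # 1.25, the slight overall boost associated with rendering intent.
    def encode(scene, encoding_gamma=0.5):
        return scene ** encoding_gamma

    def display(signal, decoding_gamma=2.5):
        return signal ** decoding_gamma

    for scene in (0.1, 0.5, 0.9):
        print(scene, round(display(encode(scene)), 3))
    # 0.1 -> 0.056, 0.5 -> 0.42, 0.9 -> 0.877: close to, but a bit
    # darker than, the original scene values.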


Although the "gamma curve" of a TV or computer monitor actually curves when plotted on linear axes as above, when it is plotted on log-log axes, it typically becomes a straight line:



The logarithm of the relative input video signal level now appears along the horizontal axis. On the vertical axis, it's the logarithm of the relative luminance output. Switching to log-log plotting allows the gamma "curve" to appear straight. Its mathematical slope is now equal to gamma itself, the exponent of the mathematical function which relates output luminance L to input video level V.
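
A quick way to check that the slope of the log-log line really is gamma itself (a sketch using the same L = V^γ relationship as before):

    import math

    gamma = 2.5
    v1, v2 = 0.2, 0.8                   # any two nonzero signal levels
    l1, l2 = v1 ** gamma, v2 ** gamma   # their luminance outputs

    slope = (math.log10(l2) - math.log10(l1)) / (math.log10(v2) - math.log10(v1))
    print(round(slope, 6))              # prints 2.5: the slope is gamma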


The word "luminance" is unfortunately ambiguous. Under the subtopic "Confusing Terminology" in the Wikipedia article on "Gamma Correction," luminance is defined in two ways:

  • the apparent brightness of an object, taking into account the wavelength-dependent sensitivity of the human eye
  • the encoded video signal ... similar to the signal voltage

I am using the first definition, where the "object" whose "brightness" is in question is part of a scene being rendered by a video camera. Eventually that same object with its associated apparent brightness or luminance is a part of a video image displayed on a TV screen or computer monitor.

I called the encoded video signal V above. It could also be called Y', since, in video, Y is the name given to the luminance of the original scene as picked up by the video camera and converted into an electronic signal, and Y' (wye-prime) is that electronic signal after it has undergone gamma correction and is en route to the receiving TV or monitor. To distinguish the two signals, the first, Y, is called (video) "luminance" and the second, Y', is called luma.

Note that Y' ("luma"), which is derived from Y ("luminance"), transmits to the receiving TV or monitor nothing but colorless shades of gray falling along a continuum extending from reference black to peak white. Even so, Y' is derived from three color signals detected by the video camera: R (for red), G (for green), and B (for blue). Each of these single-color signals is gamma-corrected individually to become, respectively, R', G', and B', and Y' is a weighted sum of the three.

Accordingly, to make monochrome Y' into a color image, it must be accompanied by two color-difference signals. When the signals are analog, these color-difference signals are called Pb and Pr. In digital video, they are Cb and Cr. Pb (or Cb) is the difference between the gamma-corrected blue signal, B', and Y'. Pr (or Cr) is the difference between the gamma-corrected red signal, R', and Y'. The gamma-corrected green signal, G', can be recovered when Y', Pr (or Cr), and Pb (or Cb) are received. A Y'PbPr (or Y'CbCr) signal is often called a "component video" signal.
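
Here is a small sketch of that arithmetic, assuming the standard Rec. 601 luma weightings (the post itself does not commit to a particular standard), with R', G', and B' expressed in relative units:

    # Form luma (Y') and the color-difference signals (Pb, Pr) from
    # gamma-corrected primaries, then recover G' at the receiving end.
    KR, KG, KB = 0.299, 0.587, 0.114    # Rec. 601 luma coefficients

    def rgb_to_ypbpr(r, g, b):
        y  = KR * r + KG * g + KB * b   # luma Y', a weighted sum
        pb = 0.5 * (b - y) / (1.0 - KB) # scaled version of B' minus Y'
        pr = 0.5 * (r - y) / (1.0 - KR) # scaled version of R' minus Y'
        return y, pb, pr

    def recover_green(y, pb, pr):
        b = y + pb * 2.0 * (1.0 - KB)   # undo the scaling to get B'
        r = y + pr * 2.0 * (1.0 - KR)   # ... and R'
        return (y - KR * r - KB * b) / KG

    y, pb, pr = rgb_to_ypbpr(0.75, 0.50, 0.25)
    print(round(recover_green(y, pb, pr), 6))   # prints 0.5: G' recovered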


Assume two TVs or monitors, one using gamma 1.8 and one using gamma 2.5, are adjusted so that they produce the same luminance output for a peak-white input signal. From the log-log plot above, we can see that as the input signal level decreases, the two sets' luminance outputs diverge more and more. Accordingly, differences in gamma show up more noticeably at low input signal levels than at high ones. Levels near peak white are relatively unaffected by gamma differences; levels nearer the signal's reference-black level, on the other hand, are strongly affected.

Qualitatively, this fact means that "gamma deepens shadows." It doesn't really make the luminance of shadow details drop below that of reference black. But it does make it "take longer" for gradual increases in the video signal level, starting at the signal level for reference black and sweeping upward toward that for peak white, to produce concomitant increases in output luminance levels. The image "stays darker longer."

In fact, luminance when gamma is greater than 1.0 only fully "catches up" with luminance for gamma 1.0 at the very top of the scale, at peak white. Every signal level below peak white produces less screen luminance when gamma is higher.

Still, when we see an image displayed at, say, gamma 2.5 side-by-side with the same image at gamma 1.8, we are apt to say that the shadows are "deeper," not that the less dim areas are somehow "brighter." This, again, is because differences in log-log gamma plots are wider, and thus show up more noticeably, at low input signal levels than at high.
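
A small numerical check of that point, treating each set as an ideal power law matched at peak white:

    # Ratio of the gamma-1.8 set's output to the gamma-2.5 set's output
    # at a high, a middle, and a low signal level.
    for v in (0.9, 0.5, 0.1):
        print(v, round((v ** 1.8) / (v ** 2.5), 2))
    # Near white (v = 0.9) the two sets differ by only about 8 percent,
    # but at v = 0.1 the gamma-1.8 set is roughly five times as bright:
    # the divergence is concentrated at the dark end of the scale.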


A log-log gamma plot for a given TV will vary in slope quite a bit depending on how the brightness control is set. A TV's brightness control actually sets its black level, the luminance it will produce for an input video signal level that equates to pure black.

This incoming video level for pure black is, in digital TV, either 0 or 16, depending on whether the useful range is defined as 0-255 or 16-235. In analog video, it is 0 millivolts (or 54 mV, if so-called 7.5% setup is used for the broadcast signal). From now on, I'll refer to it, using the well-known "IRE units," as 0 IRE. (I'll ignore the possibility of 7.5% setup, which would put black at 7.5 IRE.)

The signal's reference white or peak white level can be 255 or 235 in digital video. In analog video, it can be either 700 mV or 714 mV. Whatever units are used, reference or peak white is also said to be at 100 IRE.
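
For what it's worth, here is one way the digital code values and IRE units mentioned above line up, assuming the 16-235 video range and ignoring 7.5% setup (a sketch of my own, not a formula quoted from any standard):

    # Map a video-range code value (16 = reference black, 235 =
    # reference/peak white) onto the 0-100 IRE scale.
    def code_to_ire(code, black=16, white=235):
        return 100.0 * (code - black) / (white - black)

    for code in (16, 126, 235):
        print(code, round(code_to_ire(code), 1))    # 0.0, 50.2, 100.0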

The luminance output level for 100-IRE reference or peak white is set by the TV's contrast control. It ought instead to be called something like video level or gain. It could also appropriately be called "brightness," if the actual brightness control were renamed "black level" and a control which tailors gamma per se were added in the form of a more appropriately named "contrast" control.


Once the white level is set via the contrast control as we know it today, then — assuming nothing is overtly done to change gamma — changes to the setting of the brightness control change gamma anyway. In effect, adjusting the brightness or black level control pivots the log-log gamma plot around its upper end at 100 IRE.

Imagine that the TV's brightness control has been carefully set such that a 0-IRE input signal produces the least amount of luminance the TV is capable of producing, and a 1-IRE input signal produces just enough additional luminance to be visibly lighter in a pitch-black room. You can then, in principle, measure the TV screen's luminance output at various signal levels from 0 IRE to 100 IRE and plot the luminance figures against the input levels on log-log axes. The slope of the resulting plot is, let us say, 2.5, which means the TV is operating at gamma 2.5.

Now, imagine turning up the brightness control. Every luminance figure at every IRE level of input will go up ... but the ones at lower IRE levels will go up more than the ones at higher IRE levels. At 100 IRE, there will be no change in luminance whatsoever. In effect the log-log plot, while remaining a straight line, swings upward. It pivots clockwise around its rightmost end point at 100 IRE.

It therefore has a shallower slope. Instead of 2.5, the slope might (depending on how far the brightness control is turned up) drop to about 2.3 — which means that the TV is now operating at a gamma figure of 2.3.

If, on the other hand, you imagine turning down the brightness control below its carefully chosen optimum, the log-log plot pivots in the other (i.e., counterclockwise) direction; it takes on a steeper slope; the TV's operating gamma goes up to, say, 2.7.
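
One way to see this pivoting numerically is to model a raised black level as a small constant added to the output of a nominal gamma-2.5 display, renormalize at peak white, and then measure the average slope of the resulting log-log plot. The additive model and the particular lift used below are my own assumptions, chosen only to reproduce the 2.3 figure used as an example above:

    import math

    def effective_gamma(black_lift, nominal_gamma=2.5):
        # Luminance with the brightness control turned up: the nominal
        # power law plus a constant lift, rescaled so that 100 IRE
        # still produces a relative luminance of 1.0.
        def lum(v):
            return (v ** nominal_gamma + black_lift) / (1.0 + black_lift)
        # Average slope of the log-log plot between 10 IRE and 100 IRE.
        v_lo, v_hi = 0.1, 1.0
        return ((math.log10(lum(v_hi)) - math.log10(lum(v_lo)))
                / (math.log10(v_hi) - math.log10(v_lo)))

    print(round(effective_gamma(0.0), 2))      # 2.5 (optimum brightness)
    print(round(effective_gamma(0.0019), 2))   # 2.3 (brightness turned up a bit)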

If you turn the brightness control up from its optimum setting, furthermore, deep blacks will be rendered as dark grays, while if you turn it down, low-level video information will be rendered no differently from black, and shadow detail will be "swallowed." So in addition to altering a TV's operating gamma, a misadjusted brightness control has other deleterious effects on the image you see.

So changes to a TV's brightness control can alter its operating gamma (as I'll call it) in either direction away from its nominal gamma.


Whether operating or nominal, gamma is important. It not only affects image contrast — how deep and pervasive its shadows and darker elements appear to be — it also affects overall image brightness as well as the hue and saturation of various colors.

We have already seen that an increase to gamma makes every input video signal level between 0 IRE and 100 IRE appear on screen with less luminance. Everything on the screen appears darker and dimmer — though the effect is greater, the lower the original input signal level. Since most people say they prefer a "brighter" picture, TVs often are designed to operate at a gamma that is lower than they really "ought" to.

At first it seems odd that gamma affects colors, since it seems to be more of a black-and-white thing. But any color containing a red, green, or blue component at a level between 0 and 100 IRE (which is to say, nearly every color other than a pure primary at the maximum 100-IRE level) is indeed affected by gamma.

For example, if a certain color of brown is to be represented, it may (let's say) be made of red at 100 IRE and green at 50 IRE, with blue entirely absent. (This analysis is adapted from Dr. Raymond Soneira's article, "Display Technology Shootout: Part II — Gray Scale and Color Accuracy," in the October 2004 issue of Widescreen Review, which is also available online. Dr. Soneira is the president of DisplayMate Technologies.) The 100-IRE red won't be affected by gamma, but the 50-IRE green will.

If gamma is relatively high, 50 IRE green will be reproduced at the TV screen with lower luminance than if gamma is relatively low. As a result, the hue of brown will appear redder (because green contributes less) at higher gamma and less red (because green contributes more) at lower gamma.
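
Putting rough numbers on that example (treating 100 IRE as a relative level of 1.0 and 50 IRE as 0.5, and again assuming simple power-law displays):

    # Brown built from red at 100 IRE and green at 50 IRE, shown on
    # displays with two different gammas. Red, being at full level, is
    # untouched; green's on-screen contribution shrinks as gamma rises.
    red, green = 1.0, 0.5
    for gamma in (1.8, 2.5):
        print(gamma, round((green ** gamma) / (red ** gamma), 3))
    # Green contributes about 29 percent as much luminance as red at
    # gamma 1.8, but only about 18 percent at gamma 2.5, so the brown
    # shifts toward red as gamma goes up.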

Next, imagine replacing some of the red in the input signal with blue: say, a brown that is 75 IRE red, 50 IRE green, and 25 IRE blue. Now, because all three color primaries are represented, the brown is no longer a fully saturated one. Instead, the 25 IRE of blue combines, in effect, with 25 IRE of the red signal and 25 IRE of the green signal to make a shade of gray.

That leaves 50 IRE of red and 25 IRE of green. Both of these will be affected by gamma, but the latter (because lower in signal level) will be affected more. Just as before, gamma differences will change the hue of brown.

But this time, gamma will also affect the luminance of the shade of gray produced by the combining of 25 IRE worth of red, green, and blue signal. If gamma is relatively high, this gray will have relatively low luminance, and the brown will appear on screen purer and more saturated. If gamma is relatively low, the brown will appear less pure and take on more of a pastel shade.
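
The same kind of arithmetic shows the saturation effect. This is my own back-of-the-envelope version of the analysis above, with the IRE levels scaled to relative units and the 25-IRE gray component treated separately, as in the text:

    # Brown built from 75 IRE red, 50 IRE green, and 25 IRE blue. The
    # 25 IRE common to all three primaries acts as a gray component
    # that dilutes the color.
    for gamma in (1.8, 2.5):
        gray = 0.25 ** gamma    # relative luminance of the gray component
        print(gamma, round(gray, 3))
    # About 0.082 at gamma 1.8 but only about 0.031 at gamma 2.5: with
    # less gray mixed in, the higher-gamma brown looks purer and more
    # saturated, while the lower-gamma brown looks more like a pastel.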

So gamma affects the hue of all colors (not just brown) that mix two or three primaries together in different proportions. It also affects the saturation of all colors that mix all three primaries together in different proportions.
