To recap, gamma is a number, usually in the range of 1.8 to 2.5 (or so), that describes how changes in a display's luminance or light output, L, track with changes in its input video signal level, V. V may be expressed in millivolts, in IRE units from 0 to 100, or in code values from 0 to 255 (or 16 to 235). Luminance can be measured in foot-Lamberts (ft-L) or candelas per square meter (cd/m²).
Specifically, gamma is the exponent or power-of-V in this functional relationship between V and L:
L = V^γ
As such, when this function is plotted on log-log axes, the result is a straight line(!) with gamma as its slope. The larger gamma is, the steeper the graph's slope ... and the greater the "apparent" image contrast on the screen.
I say "apparent" because gamma doesn't make blacks any blacker or whites whiter. But at signal levels in between 0-IRE reference black and 100-IRE peak white, increased display gamma makes us perceive shadows as being "deeper" or "darker." Conversely, if display gamma is decreased, the image appears "flatter," in terms of its "contrast," even as the overall "brightness" of the image goes up.
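The point can be made concrete with a small sketch. Here V is normalized to run from 0.0 (reference black) to 1.0 (peak white); the two gamma values are just illustrations from the range mentioned above.

```python
# A minimal sketch of the gamma relationship L = V^gamma, with the
# video level V normalized to 0.0 (reference black) .. 1.0 (peak white).

def luminance(v: float, gamma: float) -> float:
    """Relative luminance of a display with the given gamma exponent."""
    return v ** gamma

# Black and white are unchanged by gamma...
print(luminance(0.0, 1.8), luminance(0.0, 2.5))   # both 0.0
print(luminance(1.0, 1.8), luminance(1.0, 2.5))   # both 1.0

# ...but a mid-level gray comes out darker at the higher gamma, which
# is why higher gamma reads as greater "apparent" contrast on screen.
print(round(luminance(0.5, 1.8), 3))  # ~0.287
print(round(luminance(0.5, 2.5), 3))  # ~0.177
```

Note that the endpoints are untouched: only the tones in between shift, which is exactly the "apparent contrast" effect described above.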
So what is the "correct" gamma for a display?
That simple question does not, unfortunately, have a simple answer. Rather, there are several possible answers.
The process of answering this question begins with noting that, historically, the concept of display gamma began as a concession to an operating characteristic of the cathode ray tube, or CRT. The familiar color "picture tube" in use today has three electron guns, one for each primary color: red, green, and blue. Each gun fires a beam of electrons at phosphors of the appropriate hue on the inner surface of the face of the tube, exciting the phosphors and making them emit colored light. The more electrons that are fired per unit of time, the brighter the phosphors glow.
An electron gun fires when and if a voltage is applied across its individual cathode and the common, positively charged grid inside the neck of the picture tube. But, crucially, the number of electrons fired per unit of time is not a linear function of the applied voltage. This is where the original idea of a gamma exponent comes in. It characterizes the mathematical relationship that determines the rate of an electron gun's firing as a function of its applied voltage.
This inherent, fixed value of gamma in a modern CRT is typically 2.5, according to Charles Poynton in his book Digital Video and HDTV: Algorithms and Interfaces.
However, writes Poynton (p. 268), "The FCC NTSC standard has, since 1953, specified R'G'B' encoding for a display with a 'transfer gradient (gamma exponent) of 2.2'."
Let's break that down into smaller chunks. First of all, NTSC stands for National Television System Committee. In 1953, under the aegis of the Federal Communications Commission (FCC), the NTSC promulgated the television standard that is still in use today for standard-definition television transmissions in the U.S.A. These SDTV signals, also known in today's parlance as 480i, are in full color, though in 1953 color television was yet a brand new technology, still awaiting implementation.
Gamma correction is applied by broadcasters to the NTSC signal in the following way. Three color primaries, R (for red), G (for green), and B (for blue), are produced by the image sensor of a video camera. These three signals are immediately altered to, respectively, R', G', and B', by applying a mathematical transfer function whose exponent, the encoding gamma, is roughly the reciprocal of the display's inherent decoding gamma.
Later, R', G', and B' are matrixed at the TV studio to form Y', Pb, and Pr. The "black and white" part of the signal is, accordingly, gamma-corrected Y', or luma. Pb is actually (B' - Y'). Pr is (R' - Y'). From these three transmitted video "components" the television receiver will easily be able to recover B' and R'. It can also derive (G' - Y') and thus G' itself.
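The matrixing just described can be sketched as follows. This uses the Rec. 601 luma weights that apply to SDTV; the standard scale factors that a real Pb/Pr encoder applies to the color-difference signals are omitted here so that only the recovery idea shows through.

```python
# A sketch of the matrixing described above. Rec. 601 luma weights;
# the standard Pb/Pr scale factors are deliberately omitted.

def encode(r, g, b):
    """Gamma-corrected R'G'B' -> (Y', Pb, Pr), as unscaled differences."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma Y'
    return y, b - y, r - y                  # Y', (B' - Y'), (R' - Y')

def decode(y, pb, pr):
    """Recover R', G', B' at the receiver from Y', Pb, Pr."""
    r = pr + y                              # (R' - Y') + Y'
    b = pb + y                              # (B' - Y') + Y'
    g = (y - 0.299 * r - 0.114 * b) / 0.587 # solve the luma equation for G'
    return r, g, b

y, pb, pr = encode(0.8, 0.5, 0.2)
print(decode(y, pb, pr))  # approximately (0.8, 0.5, 0.2)
```

Recovering G' by solving the luma equation, rather than transmitting it, is what lets the system get away with sending only three components.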
Once the television receiver has recovered R', G', and B', it uses each as a voltage to drive the CRT's appropriate electron gun directly (assuming, that is, that it is a CRT). Since the picture tube is intrinsically nonlinear, as reflected in its native decoding gamma exponent of 2.5, in effect R', G', and B' are automatically turned back into R, G, and B at the face of the cathode ray tube itself!
But what's this about a "transfer gradient (gamma exponent) of 2.2"? Shouldn't that be 2.5, in view of the fact that "modern CRTs," per Poynton (p. 268), "have power function laws very close to 2.5"?
For the encoding gamma used by broadcast TV stations when they gamma-correct their signals, the reciprocal of the decoding gamma is, supposedly, typically used. So if 2.2 is assumed to be the display's decoding gamma — not 2.5 — the encoding gamma becomes 1/2.2, or 0.45. If a decoding gamma of 2.5 is assumed, on the other hand, the encoding gamma would nominally be 1/2.5, or 0.4.
Poynton, however, advises not taking this discrepancy "too seriously." For one thing, "the FCC statement," says Poynton, "is widely interpreted to suggest that encoding should approximate a power of 1/2.2," in spite of a decoding exponent that may be (and is) other than 2.2.
According to Dr. Raymond Soneira's four-part series, "Display Technology Shootout," published in Widescreen Review magazine, Sept.-Dec. 2004, studio CRT monitors used in tweaking video images before they are broadcast or rendered on DVD typically have decoding gammas of 2.2, not 2.5. "Current CRTs," he writes in the second part of his series (in WR, Oct. 2004, p. 68), "typically have a native gamma in the range of 2.3 to 2.6, so the gamma of 2.20 for Sony (and Ikegami) CRT studio monitors is actually the result of signal processing."
Translation: the original signals, once they've been gamma-corrected as discussed above, are fed to a CRT studio monitor to make sure they look right. This monitor is apt to have a "native gamma" in the range of 2.3 to 2.6 — say, 2.5. But signal processing — probably digital signal processing — that takes place within the monitor's electronics, prior to the final display device, makes the monitor operate as if its decoding gamma is 2.2, not 2.5.
How is this done?
Typically, it's done with "look-up tables." Each possible code value for the input signal — R', B', and G' are treated separately, but alike — can be looked up in a table in the memory of the digital signal processing (DSP) portion of the display's internal circuitry. The code values for each pixel of an input video frame are stored in a "frame buffer." Then they are in effect replaced by new code values that result when the original values are looked up in the look-up table or LUT. (When red, green, and blue primary colors are involved, and not just a grayscale image, the tables are color look-up tables, or CLUTs.)
As a result, each input code value is replaced with an output value in accordance with a mathematical transfer function which (in this case) converts the input signal so as to make the monitor's native gamma, 2.5, look instead like gamma 2.2.
If the resulting image on the studio monitor doesn't, for whatever reason, yield the proper "depth of contrast" (shall we call it), technicians can revise the original signal's encoding gamma to make it look right after all. Maybe the original encoding gamma, 1/2.2 = 0.45, doesn't suit the source material to a T. Tweaking encoding gamma slightly — to, say, 0.44 — may be indicated.
At the end of the tweaking process, the result is an image that looks "perfect" on a monitor whose decoding gamma is 2.2, not 2.5. Why? Because of the decoding gamma of the studio monitor that is used in tailoring the image — given that the studio monitor's native gamma of 2.5 is altered, via DSP look-up tables, to an effective decoding gamma of 2.2.
All of this suggests that, if you have a CRT monitor or TV in your home, you'd like its native gamma of perhaps 2.5 to be digitally altered to an effective value of 2.2 as well. That way, you could at least in theory see the image of, for example, a movie on DVD just the way studio technicians saw it as they were massaging its essential parameters during the process of authoring the DVD.
In fact, even if you have a plasma, LCD, or other non-CRT display, an effective decoding gamma of 2.2 would seem to be the absolute holy grail of faithful image rendition.
Well, maybe it is. Or maybe not.
A lot depends on exactly how bright or dim your viewing environment is, compared with that in which the studio technicians were tweaking the images on their gamma-2.2 monitors.
Suppose you pride yourself on having a home theater in which there is zero illumination other than that produced by the screen itself. What if the studio techs were not working their magic in such a pitch black environment?
As Poynton points out in chapter 9 on "Rendering Intent," what really matters is how well a display system's "end-to-end power exponent" suits the purposes for which the images are intended to be used, in a particular viewing environment.
When you take the encoding exponent used to generate the gamma-corrected video signal and multiply it by the receiving TV's effective decoding gamma exponent, you get the "end-to-end power exponent" of the transmission system as a whole. And — perhaps surprisingly — this end-to-end exponent had better be greater than 1.0!
If the end-to-end exponent were exactly 1.0, the overall system would be perfectly linear, and that's not good.
Why? There are several reasons. One, in video images the amount of luminance is scaled way down from that of the original scene. Barring alteration of the end-to-end exponent that applies to the image, that will make the on-screen version of the image seem to have much less apparent contrast (and also much less colorfulness).
Two, when video is watched in the dark or near-dark, the apparent contrast range of the image decreases, in what is known as the "surround effect" (Poynton p. 82). Normally, the human visual system artificially "stretches" the apparent contrast that exists among objects seen in a bright "surround," as in a brightly lit original scene. When that same interior-of-the-scene detail is arbitrarily framed and viewed on a video screen in a dimly lit room, this surround effect is not triggered, and the image contrast is perceived as unnaturally flat.
Three, the actual contrast ratio of the video or film camera is apt to be much less than the 1000:1 or more found in nature. And the display device's own limitations typically constrain the actual contrast in the viewed image yet further.
Transmission systems whose end-to-end exponent is exactly 1.0 don't take such realities into account. Hence they don't yield images whose "image contrast" or "tone scale" is perceptually correct.
If the decoding gamma is 2.5 and the encoding gamma is effectively 0.5, their product is 1.25 — an end-to-end exponent well-suited for TV images being watched in dim, but not totally dark, surroundings. Or so says Poynton (p. 85). This ideal situation is what can happen if the "advertised exponent" used in the encoding portion of the system — in, say, the television studio or the DVD post-production facility — is 0.45, or 1/2.2.
Here's what that means: you start with an "advertised" encoding gamma exponent of 0.45, or 1/2.2, and then you alter the curve it generates on graph paper so that the portion near its left end is instead a straight line segment, not a curve. This avoids the unwieldy situation of a graph that possesses an infinite slope at its leftmost point.
But it also changes the effective overall encoding gamma of the system as a whole to something more like 0.5.
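Rec. 709's encoding function is the standard example of this construction: an "advertised" 0.45 power law, with a straight-line segment of slope 4.5 substituted below L = 0.018. The sketch below fits a pure power law to that segmented curve to estimate the effective exponent; the exact figure depends on the fitting range and weighting, so treat the fit as illustrative.

```python
import math

# Rec. 709's segmented encoding function: a linear piece near black
# replaces the 0.45 power law, avoiding infinite slope at zero.
def oetf_709(l: float) -> float:
    if l < 0.018:
        return 4.5 * l
    return 1.099 * l ** 0.45 - 0.099

# Fit a pure power law V = L^a (log-log regression through the
# origin) to estimate the *effective* encoding exponent.
samples = [i / 1000 for i in range(20, 1001)]
xs = [math.log(l) for l in samples]
ys = [math.log(oetf_709(l)) for l in samples]
a = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(round(a, 2))  # noticeably above the advertised 0.45, near 0.5
```

The linear toe "pulls" the best-fit exponent up from 0.45 toward roughly 0.5, which is the effective encoding figure the discussion below relies on.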
As a result, when you compute the end-to-end exponent of the system as a whole, you multiply the effective decoding gamma of the TV (Poynton simply assumes it's 2.5) by 0.5, not 0.45. The result is 1.25, an end-to-end exponent which gives you image contrast well-suited to a dim-but-not-pitch-black viewing environment.
But what if, as I asked earlier, your home theater is instead pitch black? In that case, Poynton says, you want an end-to-end exponent of 1.5, not 1.25.
Basically, says Poynton (pp. 84-85), it has to do with the fact that turning the lights all the way off in the viewing environment is apt to prompt you to reduce the display's black level, via its brightness control setting. Otherwise, you may place a "gray pedestal" beneath all of the display's luminance levels, and blacks in particular may look not black but dark gray.
Put the other way around: turning the lights up in a formerly pitch-black TV room typically adds luminance on and reflected by the screen beyond what the TV itself produces. So the TV's brightness control must typically be turned up, so that low-level details in the picture don't get swamped by the ambient illumination.
When you reduce the TV's black level for viewing in a totally dark room, the slope of the display's log-log gamma plot goes up — from 2.5 to 2.7, in Poynton's example. As a result, the end-to-end exponent (still assuming an effective encoding exponent of 0.5) is now 1.5, not 1.25.
Using similar logic, Poynton shows that the brightness or black level setting of the display will typically be boosted above that necessary in a dimly lit environment, if the viewing room is brightly lit. Now the effective decoding gamma drops to about 2.3, for an end-to-end exponent that is, for all intents and purposes, as low as 1.125.
But here's the catch. Video engineers apparently assume that you don't want to know all of the above, that you don't habitually adjust your TV's brightness depending on how many lights are on in your viewing room, and that you have no control whatsoever over gamma. So they impose a "rendering intent" on their images right from the start.
According to Poynton, they assume you'll be watching video on a (gamma 2.5) CRT in a dimly lit room, so they use an effective encoding exponent of 0.5 — or actually 0.51 — based on an "advertised" encoding exponent of 0.45.
The same goes for the producers of the films that eventually find their way to DVD via a so-called "video transfer" process. They use camera negative film and photographic print film that together impose what amounts to an end-to-end exponent of fully 1.5 — suitable for a totally darkened theater.
The creators of computer graphics, on the other hand, assume you'll be viewing them on a screen (again, one that presumably has native gamma 2.5) in a brightly lit office. So they arrange for their encoding gamma to drop to 0.45, based on an "advertised" exponent of 0.42.
This all means that, if you look at images in different (brighter or darker) surroundings than those envisioned by the creators' "rendering intent," you may want to be able to change the effective decoding gamma of your TV or monitor.
Or, if your display has been designed to have an effective decoding gamma of 2.2, not 2.5, you may (or may not) want to be able to change it.
Notice, here, a subtle source of confusion. One authority, Poynton, says video engineers expect your television to decode their signal using an effective gamma of 2.5. Another authority, Soneira, says video sources are tweaked to look right at gamma 2.2, not 2.5, since that is the effective decoding gamma of the typical studio monitor that is used in color- and contrast-balancing the source material.
So, which decoding gamma is "right"? The answer depends on assumptions made by people in far-off studios who don't really know exactly what your viewing environment is like, which effective gamma your TV really has ... and, furthermore, what your own idiosyncratic viewing preferences are, in terms of image contrast and desired colorfulness.
Moreover, when the source material is a movie on film, there is a video-transfer step in the signal delivery chain. In it, a colorist operating a telecine or film scanner decides how best to render an image originally produced on (I assume) camera negative or print film that has its own characteristic "gamma exponent" or "transfer function." The film has been shot and processed with an eye to giving it a "look" that may or may not be at all "natural."
How does the colorist respond? Possibly by choosing an unusual encoding exponent that lets the resulting video output look as "unnatural" as the film seen in theaters. But will that choice stand up to being viewed in your TV room, with your lighting situation?
Again, you might like to be able to take control over your TV display's gamma.
Doing so by means of adjusting brightness is, unfortunately, not good enough. Changing the black level of your TV might be an appropriate response to altered room lighting conditions, as in Poynton's discussion. It has the previously noted side effect on gamma: gamma goes up as black level is reduced, and vice versa. But it does not follow that you'd like to manipulate gamma, as a rule, by manipulating black level.
If you adjust black level properly for a given viewing environment, you will render reference black (0 IRE) in the video signal with minimal luminance on the screen, as is right and proper. Raise black level, and reference black turns into an ever lighter shade of gray. Lower it, and information in low-video-level portions of the image (say, 5 IRE) can't be distinguished from black.
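A hypothetical sketch makes both failure modes visible. Here the brightness control is modeled as a simple offset added to the normalized video signal before the display's 2.5 power law; real controls are more complicated, but the effect on reference black and near-black detail is the same.

```python
# A toy model of a mis-set brightness (black level) control: an offset
# added to the normalized signal ahead of the display's 2.5 power law.

def displayed(v: float, offset: float) -> float:
    return max(v + offset, 0.0) ** 2.5

# Black level set too high: reference black (0 IRE) becomes dark gray.
print(displayed(0.00, +0.05) > 0)  # True: a "gray pedestal"

# Black level set too low: a 5-IRE shadow detail is crushed to black,
# indistinguishable from reference black itself.
print(displayed(0.05, -0.05) == displayed(0.00, -0.05))  # True
```

Either way, the errors land on the tones where shadow detail lives, which is why black level is the wrong knob for gamma adjustments.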
So you don't want to use the TV's brightness or black level control to manipulate gamma, as a general rule.
So how do you manipulate your TV's gamma? That will be the subject of Gamma, Again! (Part III).