Here again are two versions of a single photo:
For gamma = 1.8 (left) | For gamma = 2.5 (right)
The image on the left is intended to be viewed on a Macintosh computer monitor, whose inherent "attitude" toward rendering contrast is to prefer the subtle over the dramatic. The image on the right is aimed at a PC monitor, which takes an intrinsically more dramatic approach to rendering contrast. So, in view of their two different target monitors, the image on the left has more contrast built in than the image on the right. On your computer monitor, no matter which type it is, the image on the left will look more dramatically "contrasty" than the one on the right, simply because it has more contrast built into it.
A monitor's (or TV's) "attitude" toward rendering contrast is betokened by its gamma. A Mac monitor typically has a gamma of 1.8: the level of the input video signal is raised to the 1.8 power to compute how much light, or luminance, the monitor's screen will produce.
A PC monitor typically has a higher gamma figure of 2.5. Its input video signal is raised to the 2.5 power, not the 1.8 power.
Gamma makes the luminance output of a monitor or TV nonlinear with respect to input signal levels. If luminance were linear — i.e., if gamma were 1.0, not 1.8 or 2.5 — the monitor or TV would provide too little contrast.
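To make that power law concrete, here is a minimal Python sketch of the transfer function described above. The function name and the sample input level are my own illustrations; the exponents 1.0, 1.8, and 2.5 come straight from the discussion.

```python
def luminance(signal, gamma):
    """Relative luminance for a relative input signal level (0.0 to 1.0)."""
    return signal ** gamma

# A mid-level input, 40% of maximum (4.0 on this article's 0-to-10 scale):
for g in (1.0, 1.8, 2.5):
    print(f"gamma {g}: relative luminance = {luminance(0.4, g):.3f}")

# gamma 1.0: 0.400  (linear)
# gamma 1.8: about 0.192  (darker; shadows are compressed)
# gamma 2.5: about 0.101  (darker still)
```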
Here is a graph comparing gamma 1.0, gamma 1.8, and gamma 2.5:
The horizontal axis represents the level of the input video signal, relative to an arbitrary maximum of 10.0. This signal can be an analog voltage or a digital code level; it can represent either the black-and-white "luminance" signal per se or any one of the three primary color components, red, green, or blue.
The vertical axis represents the amount of light (or luminance) the monitor or display produces for each input voltage or code level. Again, this value is computed and plotted relative to an arbitrary maximum value — this time, 1.0.
Notice that only the gamma-1.0 plot is linear. Real TVs and real monitors typically have gammas between 1.8 and 2.5, so their gamma plots curve.
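If you'd like to reproduce a graph along these lines yourself, here is a short sketch using matplotlib. The 0-to-10 input scale and the curve labels are simply my rendering of the description above.

```python
import numpy as np
import matplotlib.pyplot as plt

signal = np.linspace(0, 10, 101)          # input level, relative to an arbitrary max of 10.0
for g in (1.0, 1.8, 2.5):
    luminance = (signal / 10.0) ** g      # relative output luminance, max 1.0
    plt.plot(signal, luminance, label=f"gamma = {g}")

plt.xlabel("relative input signal level (max 10.0)")
plt.ylabel("relative luminance (max 1.0)")
plt.legend()
plt.show()
```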
Take the gamma-1.8 plot. It sags downward such that low input levels toward the left side of the graph (say, input signal level 4.0) have luminances far lower than they would be for gamma 1.0. Then, as input signal levels rise toward 10.0, the gamma-1.8 plot gets closer and closer to the gamma-1.0 plot. It never quite catches it until the input level reaches 100% of its maximum, but it's like a racehorse closing fast at the end of the race after falling behind early on.
When the input signal level is low — say, 2.0 — increasing it by 1.0 to the 3.0 level has only a minor effect on luminance output. But going from 7.0 to 8.0, again a boost of 1.0, increases luminance output quite a bit. The fact that the gamma curve sags downward and then hurries upward causes details to emerge more slowly out of shadows than out of more brightly lit parts of the scene.
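To put rough numbers on that (treating the levels as fractions of the 10.0 maximum and using gamma 1.8): going from 2.0 to 3.0 raises relative luminance from about 0.2^1.8 ≈ 0.055 to 0.3^1.8 ≈ 0.115, a gain of only about 0.06. Going from 7.0 to 8.0 raises it from about 0.7^1.8 ≈ 0.53 to 0.8^1.8 ≈ 0.67, a gain of roughly 0.14, more than twice as large.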
What's true of gamma 1.8 is even more true of gamma 2.5. Gamma 2.5's plot sags downward even more than gamma 1.8's, so shadows are yet deeper and, by comparison, highlights are more dramatic with gamma 2.5 than with gamma 1.8.
A gamma curve is actually a straight line when plotted on log-log axes:
Here, I've replaced the original axes with their logarithmic equivalents. That is, I've replaced each tick mark along the horizontal axis with one for its point's logarithm. For instance, instead of plotting 4.0 at 4.0, I've plotted it at log 4.0, which is 0.602. 9.0 is now at log 9.0, or 0.954. 10.0 is at log 10.0, or 1. And so on.
A similar mathematical distortion has been done to the vertical axis. The result: gamma "curves" that are all straight lines!
This is to be expected. If

luminance = (input level)^gamma

then, as math whizzes already know,

log(luminance) = gamma × log(input level)
Math whizzes will also recognize that gamma is the slope of the log-log plot. This slope has a constant value when gamma is the same at every input signal level, which is why the log-log plot is a straight line.
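As a quick sanity check (my own illustrative snippet, not anything taken from the graphs above), fitting a straight line to the log-log data recovers the gamma exponent as the slope:

```python
import numpy as np

gamma = 1.8
signal = np.linspace(0.1, 1.0, 50)      # relative input levels (avoid log of zero)
luminance = signal ** gamma

# Fit log10(luminance) as a linear function of log10(signal):
slope, intercept = np.polyfit(np.log10(signal), np.log10(luminance), 1)
print(slope)   # ~1.8, i.e. the gamma exponent
```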
Why is this important? One reason is that, as with the two versions of the single image shown above, the gamma that was assumed for the eventual display device — and built into the image accordingly — ought truly to be the gamma of the actual display device. Otherwise, the image can wind up with too much or too little contrast.
When we say that the assumed display gamma is "built into" the image, in the world of television and video we're talking about gamma correction. A TV signal is "gamma-corrected" at the camera or source under the assumption that the eventual TV display will have a gamma of 2.5, which is the figure inherent in the operation of each of the three electron guns of a color CRT.
So the original camera signal (or signals, plural, one each for red, green, and blue) gets passed through a transfer function whose exponent is not 2.5 but rather its inverse, 1/2.5, or 0.4. (Actually, for technical reasons having to do with the eye's interpretation of contrast in a presumably dimly lit TV-viewing environment, the camera's gamma-correction exponent is altered slightly from 0.4 to 0.5.)
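Here is a hedged sketch of how that round trip works in principle. The 0.5 encoding exponent and 2.5 display exponent come from the discussion above; the variable names and the sample scene value are purely illustrative.

```python
scene = 0.25                  # relative scene luminance captured by the camera

encoded = scene ** 0.5        # gamma correction at the camera (exponent ~1/2.5, nudged to 0.5)
displayed = encoded ** 2.5    # the CRT's electron guns apply their gamma of 2.5

print(encoded, displayed)
# encoded   = 0.5
# displayed ≈ 0.177, i.e. scene ** 1.25 -- a deliberate end-to-end "system gamma"
# slightly above 1.0, suited to a dim viewing environment
```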
When the display device is not a CRT, it has no electron guns, and its "gamma" is completely ersatz. If the device is digital, a look-up table or LUT is used to instruct the actual display — say, an LCD panel or DLP micromirror device — how much luminance to produce for each digital red, green, or blue component of each pixel. Each color component of each pixel is, usually, an 8-bit digital value from 0 to 255. When that 8-bit code value is looked up in the LUT, another 8-bit value is obtained which determines the actual output luminance.
What will that second 8-bit value be? If we set aside questions about how the TV's contrast, brightness, color, and tint controls work, the answer depends mainly on the ersatz "gamma curve" built into the LUT.
If the LUT's ersatz "gamma curve" is based strictly on the exponent 1.8 or the exponent 2.5, the graphs shown above depict how the digital LUT translates input signal values to output luminances.
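A constant-exponent LUT of that kind is easy to sketch in code. This is only my minimal illustration of the idea, not how any particular TV actually builds its table.

```python
def make_gamma_lut(gamma, size=256):
    """Map each 8-bit input code to an 8-bit output code via a single gamma exponent."""
    return [round(255 * (code / (size - 1)) ** gamma) for code in range(size)]

lut = make_gamma_lut(2.5)
print(lut[64], lut[128], lut[192])   # roughly 8, 46, and 125 for gamma 2.5
```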
But the "gamma" of the LUT may reflect a different exponent, one not found in the graphs, such as 1.6 or 2.2 or 2.6. The TV's designers may have chosen this "gamma curve" for any one of a number of reasons having presumably to do with making the picture more subjectively appealing, or with camouflaging the inadequacies of their chosen display technology.
Or the "gamma" of the LUT may reflect more than one exponent! That is, the "gamma" at one video input level may be different from that at another. At a relatively low 20% of the maximum input level (video mavens call it a 20 IRE signal), the ersatz "gamma" may be, say, 2.5. At 40% (40 IRE), it may be 1.6. At 80% (80 IRE), 2.2.
So there is nothing to keep a digital TV from having a wide range of ersatz "gamma" exponents built into its LUT, such that (at the extreme) a slightly different "gamma" may be applied to each different input code level from 0 through 255 (or each IRE level from 0 to 100).
If the wide-ranging ersatz "gamma" underlying a digital TV's internal look-up table were to be plotted on log-log axes, it would no longer be a straight line! A straight-line log-log gamma curve assumes a single, constant gamma exponent, independent of the input code value (or analog voltage). Thus, when the gamma exponent varies with input level, the log-log gamma plot is no longer a straight line.
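Here is one hedged way such a level-dependent "gamma" might be sketched: simply interpolating an exponent between the illustrative anchor points mentioned above (2.5 at 20 IRE, 1.6 at 40 IRE, 2.2 at 80 IRE). Real sets surely do something more sophisticated; this only shows the principle.

```python
import numpy as np

# Illustrative anchor points: (fraction of max input level, ersatz "gamma" exponent)
levels    = np.array([0.0, 0.2, 0.4, 0.8, 1.0])
exponents = np.array([2.5, 2.5, 1.6, 2.2, 2.2])

codes = np.arange(256)
inputs = codes / 255.0
gamma_at_level = np.interp(inputs, levels, exponents)   # exponent varies with input level
lut = np.round(255 * inputs ** gamma_at_level).astype(int)

# Because the exponent changes with level, log10(output) vs. log10(input)
# is no longer a straight line and has no single slope.
```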
Even a CRT-based TV, if it processes its input signal in the digital domain, can play such games with gamma. True, its electron guns have their built-in "true gamma," but that single, simple figure (say, 2.5) can be distorted at will by means of a digital look-up table. The LUT is more powerful than the electron gun!
In the service menu of any modern television, CRT-based or otherwise, there is apt to be a way to select "gamma" — not the "true gamma" of a CRT electron gun, which can't be changed, but the "ersatz gamma curve" built into an internal look-up table.
On my Samsung DLP rear-projection TV, there is in fact a GAMMA parameter in the service menu. It can be set to values ranging from 0 through 5 (values higher than 5 apparently being meaningless). Each different setting affects the "attitude" the TV has toward contrast by (I assume) re-specifying the "gamma curve" of its internal look-up table.
As best I can tell, few if any of these GAMMA choices implement a single, simple gamma exponent that is constant for every video input level, à la an old-fashioned, "LUT-less" analog CRT. The various log-log curves are not, presumably, straight. Nor do the GAMMA numbers, 0-5, have any relationship to the ersatz "gamma" exponents being selected. For example, GAMMA 2 does not choose an ersatz "gamma" exponent of 2.0.
I have yet to encounter any authoritative discussion of why these alternate GAMMA settings are put there, or how they are intended to be used. Enthusiasts on various home-video forums are wont to claim that, for Samsung DLPs like mine, GAMMA 0, GAMMA 2, or GAMMA 5 is "better" or "more CRT-like" than the original "factory" setting, GAMMA 4.
But my own experiments with these alternate settings have not really borne those claims out.
Even so, I have learned that twiddling with digital, LUT-based "gamma," precisely because it is so ersatz, can alter the "look and feel" of the picture on the screen in myriad ways that may well appeal to you, with your digital TV display, even if not to me with mine. And, in the end, that's why I've gone to the trouble of trying to explain what's going on when you adjust the "gamma" underlying your digital TV's powerful internal LUT!