If gamma is fairly high — say, 2.5 or more — shadows in images may appear too deep and dark. Portions of the TV screen whose pixel values are in the low end of the 16-235 brightness range will be rendered with less luminance than if gamma were relatively low. (In fact, all brightness levels besides pure black at pixel code 16 and pure white at pixel code 235 will be dimmer, the higher gamma is.)
As gamma drops from 2.5 to, say, 2.2 or even 1.8, dark areas of the screen begin to open up to visual inspection. The input pixel values don't change, of course, but their output luminance levels go up. And the effect is greater at the lower end of the brightness range than at the higher.
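Here's a little Python sketch of the power law I'm describing (the function name and the test code of 48 are mine; the luminance scale is normalized, with 1.0 standing for peak white):

```python
# A minimal sketch of the gamma transfer function described above.
# Input: an 8-bit video code in the nominal 16-235 range.
# Output: relative luminance on a normalized 0.0-1.0 scale.

def luminance(code, gamma, black=16, white=235):
    """Relative output luminance for a video code at a given gamma."""
    v = (code - black) / (white - black)   # normalize the signal to 0.0-1.0
    return v ** gamma                      # apply the power-law transfer

# One fairly dark pixel (code 48) rendered at three different gammas:
for g in (2.5, 2.2, 1.8):
    print(f"gamma {g}: luminance = {luminance(48, g):.4f}")

# gamma 2.5: luminance = 0.0082
# gamma 2.2: luminance = 0.0145
# gamma 1.8: luminance = 0.0314
```

Notice that the same dark pixel comes out nearly four times as luminous at gamma 1.8 as at gamma 2.5, while a bright pixel (try code 220) barely budges.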
Note that "brightness range" is a phrase I use loosely to connote the entire possible gamut of input signal levels. For digital video, this gamut is expressed in terms of numeric code values that apply separaately to each pixel contained in the image. These values range from 16 at the low end (black) to 235 at the high end (white). Values between 17 and 234 represent shades of gray.
Or, if just one primary color is being considered — say, red — 16 is (again) black and 235 is red at its maximum possible brightness. Values between 17 and 234 represent the darker reds that are found at the low end of the scale and, at the high end, increasingly brighter reds.
When just one primary is involved in defining a particular pixel of the image, the code value of each of the other two primaries is presumably 16 (that is, zero intensity) for that pixel. In more general cases, the pixel's other two primaries also have code values above 16. If all three primaries have the same code value — say, 16 or 100 or 200 or 235 — they mix to make "colorless" black (code 16), or gray (codes between 17 and 234), or white (code 235).
Code values may also be defined over a 0-255 range, instead of 16-235. 0-255 is the range used for computer graphics. 16-235 is the range nominally used for television video. But some televisions and some video sources such as DVD players prefer the 0-255 brightness range as well.
Analog video uses voltages — numbers of millivolts — to express a signal's brightness range. Also in common use are "IRE units," from 0 to 100, which can apply to either analog or digital signals. Finally, a brightness range is sometimes "normalized" to fit within arbitrary brackets such as 0.0 to 1.0.
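For the curious, here's how these conventions line up arithmetically. The linear mappings below are the standard ones; the Python function names are my own invention:

```python
# Sketch: converting a 16-235 video-level code to the other
# brightness-range conventions mentioned above.

def video_to_full(code):
    """Map a 16-235 'video level' code to the 0-255 computer-graphics scale."""
    return round((code - 16) * 255 / 219)

def video_to_ire(code):
    """Map a 16-235 code to IRE units (black = 0 IRE, white = 100 IRE)."""
    return (code - 16) * 100 / 219

def video_to_normalized(code):
    """Map a 16-235 code to the normalized 0.0-1.0 bracket."""
    return (code - 16) / 219

print(video_to_full(16), video_to_full(235))    # 0 255
print(round(video_to_ire(126), 1))              # 50.2 -- mid-gray, roughly 50 IRE
print(round(video_to_normalized(126), 3))       # 0.502
```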
However it is defined, this brightness range from black to white/maximum brightness is the basis for gamma. A TV's gamma function causes it to reproduce the values in the brightness range in nonlinear fashion: output luminance does not rise in one-to-one lockstep with the input signal. The higher gamma is, the more nonlinear the reproduction. Gamma 2.5 produces a deeper, darker, and more contrasty image than gamma 2.2.
Existing video standards assume that TVs have a gamma of 2.2 ... though cathode-ray tubes or "picture tubes" actually have a native gamma of 2.5! In CRTs today, the difference between gamma 2.5 and gamma 2.2 is the result of digital signal processing. Values from look-up tables (LUTs) buried in the TV's digital circuitry effectively modify the native transfer characteristic of the TV.
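Here's a rough Python sketch of the principle; this is my own simplification, not any manufacturer's actual firmware. Since the tube's native response is a 2.5 power law, pre-shaping each code with the exponent 2.2/2.5 makes the end-to-end response come out at 2.2:

```python
# Sketch: a 256-entry LUT that retargets a native gamma of 2.5 to an
# effective gamma of 2.2. Signal goes in, gets pre-shaped, and the
# CRT's own 2.5 power law then does the rest:
#   (v ** (2.2/2.5)) ** 2.5 == v ** 2.2

NATIVE_GAMMA = 2.5
TARGET_GAMMA = 2.2

def build_lut(black=16, white=235):
    lut = []
    for code in range(256):
        v = min(max((code - black) / (white - black), 0.0), 1.0)
        shaped = v ** (TARGET_GAMMA / NATIVE_GAMMA)   # exponent 0.88
        lut.append(round(black + shaped * (white - black)))
    return lut

lut = build_lut()
print(lut[48])   # 56 -- dark codes get nudged upward a little
```

The same trick generalizes: to retarget any native gamma to any desired one, the LUT's exponent is simply the target divided by the native.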
Non-CRT displays — flat-panel plasmas, LCD panels and rear projectors, DLP rear projectors, etc. — also need specific transfer functions: CRT-like ones, if they are to look like CRTs. But a recent review of a super-pricey plasma HD monitor, the Mitsubishi PD-6130, shows that TV designers can have other ideas in mind. According to reviewer Gary Merson of Home Theater magazine:
"The set has a four-level gamma control that adjusts the logarithmic relationship between input signal level and display level. Good gamma contributes to subtle changes in brightness, and video is ideal at a gamma value of 2.2. The PD-6130's number-four setting had an average value of 2.05 over the entire brightness range. (The value changed in different parts of the range, which is why I specify an average here.) At the number-one setting, the average gamma measured 1.78."
This one short paragraph says a lot:
- Some HDTVs let users choose among various gamma settings.
- The gamma choices they offer can have arbitrary names/numbers, not the actual gamma values themselves.
- Experts like Merson look for a nominally standard gamma setting of 2.2.
- What they actually find can be a much different gamma.
- Some TVs' gamma figures actually change at different levels over the available pixel-brightness range.
- These TVs' actual average gamma figures, at any setting, are apt to be lower than the standard 2.2. (A sketch of how such an average figure gets computed follows this list.)
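How does a gamma that varies over the brightness range boil down to one "average" number like Merson's 2.05? One common approach, sketched here in Python with made-up measurements, is to fit a straight line to the log-log plot of luminance versus signal and report its slope:

```python
# Sketch: estimating an "average" gamma as the least-squares slope of
# log(luminance) vs. log(signal). The measurement pairs are invented
# purely for illustration.

import math

# (normalized signal, measured relative luminance) pairs -- hypothetical:
measurements = [
    (0.10, 0.0110), (0.25, 0.060), (0.50, 0.24),
    (0.75, 0.55),   (0.90, 0.80),
]

xs = [math.log(s) for s, _ in measurements]
ys = [math.log(l) for _, l in measurements]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Least-squares slope of the log-log scatter:
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)

print(f"average gamma = {slope:.2f}")   # 1.95 for these invented numbers
```

If the display truly obeyed a single power law, every point would sit exactly on the fitted line; the more the TV "bends" its gamma, the less that one slope tells you.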
Sub-2.2 gammas make for a brighter image overall, a selling point at the video store. Gamma values that vary with pixel brightnesses let TV makers goose the picture for impressive blacks and snappy highlights, even as the sub-2.2 average gamma boosts the overall image.
Will such an eye-grabbing image satisfy us at home, in the long run? Maybe not. It would be nice if we could switch to a standard gamma of 2.2, straight across the entire brightness range.
But how? There are a great many obstacles. Obstacle #1 is the fact that we as end users generally have no way to measure the gamma of our TVs.
True, Ovation Multimedia's Avia: Guide to Home Theater, a calibration DVD, has among its test patterns a "Gamma Chart." My experience with it, however, is that the result is at best an average number which doesn't tell much about how the TV's gamma varies over the brightness range.
Moreover, the Avia-reported gamma figure can change if the user alters the TV's brightness/contrast settings or turns on such digital enhancement features as "black enhancement," "dynamic contrast," and the like.
Are there other ways to measure gamma? Ovation does also offer Avia: Pro, which boasts a "Gamma Response and Linearity to Light Test" that I unfortunately know little about. This seven-DVD test suite is aimed at professionals and ultra-serious home enthusiasts and costs around $350.
There are also various video setup/calibration products available from DisplayMate Technologies that run on Windows. I, as a Mac user, can't use them.
Other setup/calibration products exist as well, but it's not clear how many of them claim to let end users calibrate gamma.
Then there is the problem that many TVs simply don't provide user-accessible menu options for gamma adjustment. Their gamma adjustment capabilities are typically buried in their so-called service menus, which provide a series of parameters that let professional technicians, often using special instruments, optimize the picture. Most grayscale calibrations, for instance, are done in TV service menus.
Another problem is that, as mentioned, various user-available controls and features can alter gamma as a side effect. So once you have somehow succeeded in measuring gamma, and perhaps even tailoring it to your liking, other tweaks you make can put you right back at square one.
So maybe the best policy is simply to become "gamma-aware": to learn to recognize what our various tweaks and settings might do to the tone scale, as some experts call the gray scale of our TV screens. ("Gray scale" also describes the TV's "color temperature" — e.g., 6500K — and how neutral or tint-free it renders grays of various brightnesses in the image.)
Being gamma-aware might mean that, as we adjust a TV's brightness/black level setting, we simply realize that gamma is going in the opposite direction. Accordingly, if we raise black level, gamma drops. If we lower black level, gamma rises.
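A crude numeric illustration of that seesaw, treating the black-level boost as a small offset added to the normalized signal ahead of a 2.2 power law (a simplification of my own, but it shows the direction of the effect):

```python
# Sketch: a small black-level lift (offset) lowers the effective gamma,
# measured here as the point-wise log-log slope log(output)/log(input).

import math

GAMMA = 2.2
OFFSET = 0.02   # hypothetical black-level lift on the 0.0-1.0 scale

def effective_gamma(signal):
    lifted = min(signal + OFFSET, 1.0)      # raise the black level
    return math.log(lifted ** GAMMA) / math.log(signal)

for s in (0.1, 0.3, 0.5):
    print(f"signal {s}: effective gamma = {effective_gamma(s):.2f}")

# signal 0.1: effective gamma = 2.03
# signal 0.3: effective gamma = 2.08
# signal 0.5: effective gamma = 2.08
```

All three figures land below 2.2, and the dip is biggest in the shadows, which is exactly where a black-level change is most visible.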
Being gamma-aware might mean that we use the "special" features of our "digital" TVs, such as "black enhancement" and "dynamic contrast," advisedly, since these features often affect gamma. In particular, they can make otherwise "straight" gammas "nonlinear."
By that I mean that ordinarily a luminance-vs.-signal plot, drawn on log-log axes, is expected to be a straight line, with gamma its numerically constant slope. But various digital signal processing tricks, such as "black enhancement" and "dynamic contrast" — by whatever names the TV maker chooses to call them — bend the log-log plot. It no longer has a constant slope. Rather, its slope is now a variable function of values in the brightness range.
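To see what I mean, compare the local log-log slope of a straight gamma-2.2 curve with that of a "bent" one. The bent curve below is invented, a crude stand-in for a shadow-lifting "black enhancement" feature:

```python
# Sketch: "gamma-bending" makes the log-log slope vary with signal level.

import math

def straight(v):
    return v ** 2.2                          # constant-gamma reference

def bent(v):
    # hypothetical shadow lift: the exponent eases off at the dark end
    return v ** (2.2 - 0.6 * (1.0 - v))

def local_gamma(curve, v):
    """Log-log slope between two nearby points on the curve."""
    v2 = v * 1.01
    return ((math.log(curve(v2)) - math.log(curve(v)))
            / (math.log(v2) - math.log(v)))

for v in (0.1, 0.5, 0.9):
    print(f"signal {v}: straight = {local_gamma(straight, v):.2f}, "
          f"bent = {local_gamma(bent, v):.2f}")

# signal 0.1: straight = 2.20, bent = 1.52
# signal 0.5: straight = 2.20, bent = 1.69
# signal 0.9: straight = 2.20, bent = 2.09
```

The straight curve's slope is 2.2 everywhere; the bent curve's slope climbs from about 1.5 in the shadows to about 2.1 near white. That varying slope is precisely what I mean by "gamma-bending."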
That's not necessarily all bad, by the way.
Suppose, for instance, you watch TV in a brightly lit room, and because of it you need to boost the TV's brightness above the nominally correct black level setting, in order to see all the shadow detail that is present in the picture. That lowers effective gamma ... with the perhaps unwanted side effect of reducing color saturation. If you find a user-menu setting, such as "black enhancement," that counteracts that side effect by moving the picture's effective gamma back up in the other direction, you may want to turn it on.
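Why would lifting the shadows reduce color saturation in the first place? Because the lift pushes the three primaries' values proportionally closer together. A toy Python example, using a crude HSV-style saturation measure on made-up values:

```python
# Sketch: adding the same black-level lift to all three primaries
# reduces saturation. Values are normalized linear R, G, B, invented
# for illustration.

def saturation(r, g, b):
    """Crude HSV-style saturation: (max - min) / max."""
    return (max(r, g, b) - min(r, g, b)) / max(r, g, b)

dark_red = (0.20, 0.02, 0.02)
lifted = tuple(c + 0.05 for c in dark_red)   # black-level boost

print(f"before lift: saturation = {saturation(*dark_red):.2f}")   # 0.90
print(f"after lift:  saturation = {saturation(*lifted):.2f}")     # 0.72
```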
"Gamma-bending" user options (we might call them) are like the various sound profiles provided by a stereo: "clear," "live," "flat," "beat," "pop," etc. Most of them are by design far from flat or linear in their frequency response. Or, turning on a stereo's "bass boost" button doesn't yield a flat response curve, either ... but it can sure make the music sound subjectively better.
Likewise, "gamma-bending" user options can enhance the TV-viewing experience.
But when "gamma-bending" is built by a TV's manufacturer into each and every luminance response curve the TV offers, with no "straight" gamma-2.2 setting available, things are not so good. As Dr. Raymond Soneira's Grayscale/Color Accuracy Shootout article says, non-standard gamma can shift image contrast, image brightness, hue, and color saturation in ways subtle but real. We should not have to put up with it if we don't want to.
Also, when a TV's so-called "gamma" control simply "stretches" the lower portion of a luminance response curve, says Soneira — rather than change its overall slope, as it should — image artifacts such as banding/false contouring can result. I take "stretching" a portion of the brightness range or luminance response curve as equivalent to what I mean by "gamma-bending." It can give subjectively pleasing results — but at a cost to image accuracy.
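Finally, here is a sketch of why such "stretching" invites banding in an 8-bit pipeline: the stretched shadow region spreads a few input codes across many output codes, so some output codes are never produced, and neighboring shadow levels jump by two codes instead of one. The knee-and-gain curve below is invented purely for illustration:

```python
# Sketch: an 8-bit shadow "stretch" skips output codes, which shows up
# on screen as banding / false contouring in smooth dark gradients.

def stretch(code, knee=64, gain=2.0):
    """Hypothetical shadow stretch: double the slope below the knee."""
    if code < knee:
        return min(255, round(code * gain))
    return min(255, round(knee * gain) + (code - knee))

outputs = [stretch(c) for c in range(256)]
used = set(outputs)
skipped = [v for v in range(256) if v not in used]
print(f"{len(skipped)} of 256 output codes are never produced")   # 64
```

Every skipped code is a rung missing from the tonal ladder; on a smooth dark gradient those missing rungs show up as visible contour lines.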