Contrast ratios and gamma exponents are part of a much broader subject one might dub "luminance contouring." "Contrast tailoring" would be an equally valid name for it.
The basic idea is that each pixel of digital video (ignoring associated "color difference" components) has a certain numerical level indicating intended luminance: a coded number in the range of 16 through 235 — or 0 to 255, for computers; I'm going to stick to the 16-235 digital video range.
Luminance code 16 corresponds to "black"; in terms of analog signal voltage, which usually is expressed not in pure volts but in IRE units, code 16 is 0 IRE.
Digital luminance code 235 corresponds to the maximal, 100-IRE analog voltage level: "peak" white. Peak white is also called "reference" white.
Digital codes below 16 constitute the so-called "footroom" reserved for "blacker-than-black" signals; those over 235 are "headroom" for "whiter-than-white" signals. (The 0-255 system used for computers doesn't allow for headroom or footroom.)
All codes between 16 and 235 accordingly represent different shades of gray. Code 17 stands for the darkest possible gray (i.e., that nearest to black), code 234 for the lightest (i.e., the one nearest white).
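To make the arithmetic concrete, here's a quick sketch (my own illustration, not anything out of a spec sheet) of the linear mapping between digital codes and IRE levels:

```python
def code_to_ire(code: int) -> float:
    """Map an 8-bit digital video luma code onto the analog IRE scale.

    Code 16 is black (0 IRE); code 235 is reference white (100 IRE).
    Codes below 16 (footroom) come out negative; codes above 235
    (headroom) come out over 100.
    """
    return (code - 16) / (235 - 16) * 100

print(code_to_ire(16))   # 0.0   (black)
print(code_to_ire(230))  # ~97.7 (the "98-IRE" stripe mentioned later)
print(code_to_ire(235))  # 100.0 (reference white)
```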
Both the 16-235 system and the 0-255 system are 8-bit end-user or video-interface systems, in which each digital code for luminance uses a single 8-bit byte. Studio and video-processing applications often use a 10-bit system for greater precision.
(To be quite exact, what is encoded in this way is not luminance but luma, video geek-speak for luminance signals that have been "gamma-corrected." This means that luminance has had its range of intended shades of gray precompensated for display on a CRT, whose inherent gamma exponent would otherwise render shadows too dark and deep. Luma is what is actually represented by codes from 16-235. I'm going to continue to say "luminance" when I really mean "luma," though it's not something a purist would appreciate.)
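To put that parenthetical in concrete terms, here's a toy model. I'm assuming, purely for illustration, a CRT that behaves as a pure power law with exponent 2.5; real encoding standards add wrinkles such as a linear segment near black, so treat this as a sketch:

```python
CRT_GAMMA = 2.5  # illustrative exponent; real CRTs and standards vary

def encode_luma(luminance: float) -> float:
    """Precompensate a linear luminance value (0.0-1.0) into luma."""
    return luminance ** (1 / CRT_GAMMA)

def crt_output(luma: float) -> float:
    """What the CRT's inherent gamma then does to that luma signal."""
    return luma ** CRT_GAMMA

# The precompensation and the CRT's response cancel out, so the
# intended shade of gray survives the round trip:
print(crt_output(encode_luma(0.18)))  # ~0.18
```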
Now that the basic idea of luminance encoding has been laid out, a good question is: at what actual level of output luminance will a particular TV render each of these codes?
Ideally, it will render code level 16 with no light output whatsoever. In the real world, however, no TV can do that. The best it can do is some very low level of luminance, and the exact level is controlled by the "brightness" setting the user chooses. If "brightness" is set too high, the TV will emit too much light at its so-called "black level."
If, on the other hand, "brightness" is set too low, some range of codes above 16 will look just as "black" as code 16 looks. Such a TV is said to "swallow" its shadow detail.
Once "brightness" is properly set, the luminance that the TV will produce for code-235 reference white is adjusted via the "contrast" or "picture" control. The higher "contrast" is set, the greater the TV's luminance output for a code-235 (or 100-IRE) signal.
But a digital TV's "contrast" must not be set so high that whites are "crushed." A 98-IRE stripe (at, say, code 230) must continue to be distinguishable by the eye from a 100-IRE (code-235) background.
On a CRT, too-high "contrast" can cause general geometric distortion and/or "blooming" (enlarging) of white areas on the screen. It can also turn pure whites to brownish shades.
That pretty much covers setting up a TV's levels for black and for reference white. What about all the shades of gray in between?
This is where gamma, assuming it's selectable or adjustable, comes in. The higher the gamma exponent we choose, the more "reluctant" the TV becomes to display "dark" IRE levels, or low-numbered digital codes, with copious luminance. Shadows stay deeper and darker, longer, as the general level of lighting in the scene goes up.
Say a pixel has digital code 46, making it just 30 of a possible 219 steps up from "black" at code 16. It's supposed to be dark ... but how dark? If gamma is relatively low — say, less than 1.8 — it will show up as a lighter shade of gray (ignoring hue) than it would if gamma were, let's say, 2.5 or over.
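Here's the back-of-the-envelope arithmetic, assuming a pure power-law response normalized so that code-235 reference white comes out as 1.0:

```python
def relative_output(code: int, gamma: float) -> float:
    """Fraction of peak (code-235) luminance under a pure power law."""
    signal = (code - 16) / 219  # 0.0 at black, 1.0 at reference white
    return signal ** gamma

print(relative_output(46, 1.8))  # ~0.028, i.e. about 2.8% of peak white
print(relative_output(46, 2.5))  # ~0.007, i.e. about 0.7% of peak white
```

So at the same settings, merely picking the steeper exponent cuts that dark pixel's light output by roughly a factor of four.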
The kicker here is that digital TVs don't "do gamma" as a single exponent that applies at all code levels between 16 and 235. As the code level ascends from 16 to 235, gamma may change. This is because gamma, which is an inherent characteristic of a CRT, is simulated on a digital display.
That is, it's computed.
Or, rather, the output luminance level for each possible input code is computed in advance and stored in a lookup table. When an input pixel arrives, its code is looked up in the table, and the output luminance value associated with that code is located and used.
When you change the TV's "contrast" setting, the table is recomputed. Ditto, when you change user "brightness."
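Here's a toy version of what I imagine that precomputation looks like. The real firmware is a black box, and the way I've scaled the parameters here is pure guesswork on my part, but it shows the shape of the thing: brightness, contrast, and gamma jointly determine the table, and changing any one of them means rebuilding it.

```python
def build_lut(brightness: float, contrast: float, gamma: float) -> list[float]:
    """Precompute an output luminance (0.0-1.0) for every 8-bit input code.

    brightness: a small offset that sets the black level
    contrast:   a gain that sets the output for reference white
    gamma:      the simulated CRT exponent
    (All three scalings are guesses; the real firmware is opaque.)
    """
    table = []
    for code in range(256):
        signal = min(max((code - 16) / 219, 0.0), 1.0)  # clip foot/headroom
        output = brightness + contrast * (signal ** gamma)
        table.append(min(max(output, 0.0), 1.0))
    return table

lut = build_lut(brightness=0.005, contrast=0.95, gamma=2.2)
dark_gray = lut[46]    # output level for a dark code-46 pixel
ref_white = lut[235]   # output level for reference white

# Touch "contrast" (or "brightness," or GAMMA) and the whole table
# has to be rebuilt:
lut = build_lut(brightness=0.005, contrast=0.80, gamma=2.2)
```

The appeal of doing it this way is that the expensive math happens just 256 times per settings change, rather than once per pixel per frame.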
If I go into the service menu of my Samsung and change its GAMMA parameter, the lookup table is in effect recomputed then as well.
If I turn on DNIe — a function which may be turned on or off under user control, but not for DVI/digital video input — the table is again in effect recomputed. (In actuality, there is a lot more processing of the input data going on than that. Before the table lookup occurs, there is a lot of comparing of pixels, one to another, preliminary to "enhancing" the picture. Each input code is apt to be replaced by DNIe with a "better" one. Then the table lookup occurs.)
My intention here, be it understood, is not to boggle anyone's mind. It is rather to make clear that a digital TV like my Samsung DLP can choose any output luminance it damn well pleases for any particular input code it receives!
That fact is important because it has to do with how my TV "tailors" its contrast or "contours" its luminance response. Specifically, it tailors its various output levels of black, gray, and white in ways that may very well remain completely inscrutable to anyone such as I who is not "smarter than the average bear."
To wit, I went into the Samsung's service menu and boosted "sub-contrast" — actually, by name, S_CT(DDP) — a parameter that nominally works much like a user "contrast" control: it elevates the luminance output for reference white at code 235. In proportional fashion, nominally, it also raises luminance output for every code between black and reference white. In so doing — again, I say, nominally — it can be thought to have no effect on gamma.
Yet to my eyes it seems to have increased the underlying gamma exponent of the computation by which the TV derives its output luminance levels from input code levels.
As bright parts of scenes got yet brighter, because of the boosted sub-contrast, dark parts stayed relatively dark. The contrast ratio between, say, "bright" code-200 pixels and "dark" code-40 pixels seemed to have been stretched more than I would have anticipated. And I couldn't really say how or why.
The explanation may well have to do with the quirky way the eye responds to various stimuli. If the overall scene brightness is high, the eye adapts upward. Its ability to "see into shadow" goes way down.
But I don't think that's the whole explanation. If it were, then an extended dim or dark scene would let my eye adapt to it, and it wouldn't "stay dark" in my estimation for long. Notably, the eye adapts faster to the dark than it does to the light, which is why it takes so little time to get accustomed to a dark movie theater, yet it's downright painful, for quite a while, to re-emerge from the darkened theater into the sunlight.
But I don't think I'm seeing darker shadows than I did before simply because I'm forcing my eyes to adapt to higher average levels of light in the image. I think the sub-contrast boost has, all by itself, changed the way my Samsung computes output luminances based on input code values.
Fundamentally, this computation is a mapping operation. Each input code (after being twiddled by DNIe, etc.) is mapped to an output luminance by virtue of (I continue to assume) a table lookup. I'm using a sub-contrast setting of 150, which seems to be the highest setting that actually boosts light output from the screen. My main "contrast" control is set at 100, its maximum. Somehow, the 150 and the 100 get combined to determine how a code-216 input pixel gets displayed.
Somehow, the 150 and the 100 also work together to determine at what luminance level, say, a relatively dark code-40 input pixel is displayed.
The function by which such determinations are made just may be complex enough that changing the 150 or the 100 will have surprising effects on the underlying (ersatz) gamma of my DLP TV.
These surprising effects on gamma may, furthermore, be different for sub-contrast changes than for "main contrast" changes.
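If I ever did get hold of proper instruments, there's a standard way to put a number on that ersatz gamma: measure the luminance at two codes and solve the power law for its exponent. A sketch, with made-up meter readings:

```python
import math

def effective_gamma(code_a: int, lum_a: float,
                    code_b: int, lum_b: float) -> float:
    """Solve luminance = signal ** gamma for gamma, given two measurements.

    lum_a and lum_b are luminance readings normalized to peak
    (code-235) white.
    """
    sig_a = (code_a - 16) / 219
    sig_b = (code_b - 16) / 219
    return math.log(lum_a / lum_b) / math.log(sig_a / sig_b)

# Hypothetical readings for the "bright" code-200 and "dark" code-40
# pixels mentioned earlier (numbers invented for the example):
print(effective_gamma(200, 0.62, 40, 0.009))  # ~2.1 with these numbers
```

Repeating that measurement before and after a sub-contrast or main-contrast change would show directly whether the effective exponent had moved.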
And that's about as far as I think I can take this subject of "contrast tailoring" or "luminance tailoring" as it applies to my Samsung DLP-based HDTV. Without actually using test signals and expensive instruments to find out what's really going on, I may never know.
***
I'd now like to make some additions. First of all, last night, after I'd written the above, I watched the second half of Star Wars II: Attack of the Clones. I decided the picture was improved just a tad by lowering the sharpness setting all the way down to 0 (it had been at 20 for Star Wars I: The Phantom Menace). That kept various edges from looking "too hard," given that DNIe was on. It may also have been what kept highlight details such as the whites of eyes from looking "too bright."
And I reduced the main "contrast" setting from 100 to 85 to keep from searing my eyeballs. I found, a bit to my surprise, that this second adjustment had no apparent effect on the TV's luminance contour at the low end of the IRE range. The apparent "gamma" of the image remained deep and dark. The color saturations in dark scenes and in the dark parts of bright scenes remained satisfying.
Here, then, is a table summarizing all I have done:
| SETTING | BEFORE | AFTER |
| GAMMA | 4 | 4 (UNCHANGED) |
| SUB-CONTRAST | 115 (ORIGINALLY 90) | 150 |
| MAIN CONTRAST | 100 | 85-100 |
| BRIGHTNESS | 50 | 58 |
| SHARPNESS | IRRELEVANT | 0-20 |
Strictly speaking, these values apply only to 480p input from my DVD player. There are subtle differences between that picture and the ones I get via 720p/component and 720p/DVI from my cable box. Still, even if some of the actual settings differ, the general ideas are the same:
First, as a group these controls and settings interact in rich and surprising ways.
Second, working with them all as a group can effect what amounts to a change in the TV's "gamma curve." (Even though GAMMA itself is left untouched.)
Third, in general, sub-contrast must be maxed out.
Fourth, with sub-contrast maxed out, main contrast can be used to rein in the candlepower of the display for different source material.
Fifth, DNIe is our friend. Use it.
Sixth, to keep DNIe from being too assertive, turn sharpness all the way down. (For some source material, upping it from 0 to a low value — say, 20 — may be indicated.)
Seventh, "standard" settings for color and brightness, à la what Avia recommends, work fine.
Note that GAMMA and sub-contrast are accessible only in the service menu. The others are accessible only in normal user mode. That's why it's hard to test the unexpected interactions among various combinations of these settings ... especially since going into the service menu bypasses the user settings until you go back into normal user mode and switch from Dynamic mode (the default while in the service menu) back to Custom settings.
So to a certain extent there is a limit to how scientific I can get in detailing (much less understanding) how these items interact. But interact they do, I firmly believe, in unexpected ways covered in no textbook or enthusiast magazine that I know of.