When the luminance produced by the TV for a peak white image occupying the entire screen is compared with that for a reference black full-screen image, the TV's full on/full off contrast ratio can be derived — also called its sequential or dynamic contrast ratio.
The sequential contrast ratio is usually much higher than the TV's simultaneous or static contrast ratio. That lower but more realistic figure is obtained using the ANSI standard of measurement: black and white rectangles are displayed on the screen all at once in a 4x4 checkerboard pattern, the luminance of each rectangle is metered, and the average white luminance is compared with the average black luminance.
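Here, to make the two procedures concrete, is a quick Python sketch of the arithmetic. The meter readings are invented, chosen only to land near the figures quoted below; they are not real measurements of any set:

```python
# Invented meter readings in cd/m^2; chosen to land near the figures below.
full_white = 450.0   # full-screen peak-white luminance
full_black = 0.064   # full-screen reference-black luminance

sequential_cr = full_white / full_black  # "full on/full off"
print(f"Sequential contrast: {sequential_cr:.0f}:1")  # ~7000:1

# ANSI method: meter all 16 rectangles of the 4x4 checkerboard at once,
# then compare average white luminance with average black luminance.
white_patches = [430.0, 428.0, 433.0, 429.0, 431.0, 427.0, 432.0, 430.0]
black_patches = [0.33, 0.34, 0.32, 0.33, 0.34, 0.33, 0.32, 0.33]

ansi_cr = (sum(white_patches) / len(white_patches)) / \
          (sum(black_patches) / len(black_patches))
print(f"ANSI (simultaneous) contrast: {ansi_cr:.0f}:1")  # ~1300:1
```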
The Sony Bravia KDL-40XBR2 HDTV that I recently purchased has, according to Sony, a sequential contrast ratio of 7000:1. Yet the ANSI checkerboard method measures the simultaneous contrast ratio at just 1300:1. Which brings up the question ...
How much contrast does the eye really need? It can utilize at most a 1000:1 contrast ratio, according to video expert Charles Poynton in Digital Video and HDTV Algorithms and Interfaces.
Because the luminances associated with the brightest glints and highlights in a real-life scene are generally compressed as a TV image is being created, Poynton says the TV itself need only have a 100:1 contrast ratio, not 1000:1.
Other authorities, be it noted, say TV sets benefit from contrast ratios higher than 100:1. Some even say ratios of 1200:1 and up are not too much.
Per Poynton, the eye "can discern different luminances across about a 1000:1 range" (p. 197). That is, in any particular state of adaptation to ambient light, the eye responds to no more than a 1000:1 ratio between the brightest and dimmest luminances present in any real-life scene.
The highest contrast ratio actually usable by the eye at any given moment is thus 1000:1. Even so, the ratio between diffuse white and reference black in a TV signal need be no higher than 100:1.
That, says Poynton, is because highlights are artificially compressed in TV signals in order to "make effective use of luminance ranges [contrast ratios] that are typically available in image display systems" (p. 83).
Peak white luminance, as measured with a full-field white test signal, thus corresponds to diffuse white in an actual TV image: the white of a brightly lit piece of paper, say. It does not represent the luminance of the brightest highlights in the original scene.
In a real scene, something like strong sunlight gleaming off the bumper of a car can produce a luminance ten times higher than that of the same sunlight reflected off a piece of paper. Since highlights like bumper gleams are not encoded in a TV image at a mathematically correct 10:1 ratio to diffuse white, a 1000:1 contrast ratio is not needed in a TV display. A 100:1 contrast ratio is fully sufficient for a TV to reproduce the luminance range used when any given image or scene is encoded into a video signal.
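To see what that compression buys, here's a toy Python sketch. The knee function and its numbers are my own invention, not any real camera's transfer curve; the point is just that squeezing the 10:1 highlight range into a little headroom above diffuse white leaves an encoded image that a roughly 100:1 display can carry:

```python
# A toy "knee" (my invention, not a real camera curve). Scene luminance is
# relative to diffuse white = 1.0; reference black sits at 0.01, i.e. 100:1
# below diffuse white. Highlights up to 10x diffuse white are squeezed into
# a small headroom above 1.0 instead of demanding a 1000:1 display.
def compress_highlights(scene, knee=1.0, headroom=0.25, max_scene=10.0):
    if scene <= knee:
        return scene  # the diffuse range passes through untouched
    # map (knee, max_scene] linearly into (knee, knee + headroom]
    return knee + headroom * (scene - knee) / (max_scene - knee)

for label, value in [("reference black", 0.01),
                     ("diffuse white (paper)", 1.0),
                     ("bumper glint, 10x diffuse white", 10.0)]:
    print(f"{label}: scene {value:g} -> encoded {compress_highlights(value):.3f}")

# Encoded range: 0.01 to 1.25, a ratio of 125:1 -- on the order of
# Poynton's 100:1, even though the scene itself spanned 1000:1.
```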
That contrast ratio in a TV need be no higher than 100:1 is indeed fortunate, because it's hard to get any TV to render truly inky blacks. Says Poynton (p. 197), "In practical imaging systems many factors conspire to increase the luminance of black, thereby lessening the contrast ratio and impairing picture quality. On an electronic display or in a projected image, simultaneous contrast ratio is typically less than 100:1 owing to spill light (stray light) in the ambient environment or flare in the display system."
That is, even with total darkness in the viewing room, the TV itself will produce spill light or stray light as its luminance output reflects back onto its screen. That spill or stray light — plus optical flare being bounced around within the innards of the display — will inexorably lighten the TV's blacks and lower its measured simultaneous contrast.
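Here's a crude way to model that, with assumed numbers: treat flare as a small fixed fraction of the white luminance spilled onto the dark patches, and watch the measured simultaneous contrast fall far below what the panel could intrinsically do:

```python
# Crude flare model with assumed numbers: a small, fixed fraction of the
# white luminance is scattered onto the dark patches of the checkerboard.
panel_white = 430.0      # cd/m^2 from the white rectangles
panel_black = 0.06       # cd/m^2 the panel could manage with no flare
flare_fraction = 0.0006  # fraction of white luminance spilled onto black

flare = flare_fraction * panel_white
measured_black = panel_black + flare

print(f"Without flare: {panel_white / panel_black:.0f}:1")     # ~7170:1
print(f"With flare:    {panel_white / measured_black:.0f}:1")  # ~1350:1
```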
Surprisingly, Poynton says (see table 19.1, p. 198) that a movie theater, supposedly the gold standard of image display, will typically furnish a simultaneous contrast ratio of just 80:1! Film's sequential contrast ratio can reach fully 10,000:1 when frames from different scenes are compared; yet a single frame, projected in a theater, will typically exhibit a much lower simultaneous contrast ratio.
Meanwhile, Poynton says, a typical TV in a typical living room will sometimes provide a simultaneous contrast ratio of just 20:1!
Just 20:1? What happened to the already-low 100:1 baseline? Basically, it got swallowed up in the ambient light of the typical living room. According to this online article:
To better understand the impact of the presence of light in a room on the contrast ratio performance of an imaging device, it is sufficient to realize that with the light emitted by just one candle in a room [1 lux] there would not be any difference between a 500:1 and a 5000[:1] or even a 10,000:1 contrast ratio!

Increase the level of light in the room to just 30 lux — that's equivalent to a dimly lit room — and contrast ratio figures above 50:1 would turn out to be simply academic even in the case of video projectors with relatively high brightness rating (2000/2500 lumens and above).

(Lux and lumen, by the way, are measures of light. They are related to the candela.)
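That claim is easy to sanity-check with a rough model. All the numbers below are assumptions, and the Lambertian-screen approximation is crude, but the trend is the article's point: ambient light reflecting off the screen lifts the black level and buries the difference between a modest and a spectacular native contrast ratio:

```python
import math

# Rough sanity check with assumed numbers and a crude Lambertian-screen
# approximation: illuminance E (lux) falling on a screen of gain g comes
# back as roughly E * g / pi cd/m^2, lifting the black level.
def effective_cr(white, native_black, lux, screen_gain=0.9):
    reflected = lux * screen_gain / math.pi
    return white / (native_black + reflected)

white = 150.0  # cd/m^2 off the screen; a bright home-theater projector
for native_cr in (500, 10_000):
    native_black = white / native_cr
    for lux, room in [(0, "pitch black"), (1, "one candle"), (30, "dimly lit")]:
        print(f"{native_cr}:1 projector, {room} ({lux} lux): "
              f"about {effective_cr(white, native_black, lux):.0f}:1")
```

At one candle's worth of light, the 10,000:1 projector is already down to roughly what the 500:1 projector delivers in total darkness; at 30 lux the two are indistinguishable at about 17:1.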
The point here is that even dim lighting in a TV viewing room cuts significantly into Poynton's 100:1 contrast-ratio norm.
Why? Mainly because the eye adapts to the room's lighting rather than to the TV screen's much lower luminance. It accordingly can't see details in the darker portions of images on screen unless the TV's brightness control is boosted.
Boosting the TV's brightness control to offset ambient room lighting raises the black level of the TV, while it does nothing to change peak white level. Since contrast ratio is the ratio between peak white luminance and reference black luminance, the effective contrast ratio is reduced well below the maximum ratio that the TV could otherwise produce.
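In toy numbers (assumed, not measured):

```python
# Toy numbers: raising the brightness control lifts reference black but
# leaves peak white alone, so the contrast ratio collapses.
peak_white = 100.0             # cd/m^2, unchanged by the control
for black in (0.1, 2.0, 5.0):  # black level as brightness is turned up
    print(f"black at {black} cd/m^2 -> {peak_white / black:.0f}:1")
# 0.1 -> 1000:1, 2.0 -> 50:1, 5.0 -> 20:1
```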
Whether the effective simultaneous contrast ratio is 20:1 or 50:1, it is way lower than the best-case sequential contrast ratio the TV could produce, if properly adjusted and viewed in a completely darkened room.
That's too bad, because according to Poynton (p. 197), "Contrast ratio is a major determinant of subjective image quality, so much so that an image reproduced with a high simultaneous contrast ratio may be judged sharper than another image that has higher measured spatial frequency content." The "measured spatial frequency content" amounts to the image's "real" sharpness.
Spatial frequency is what you are paying for if you buy a 1080p HDTV rather than a 720p model. It is not, however, the only thing that determines "subjective image quality" as it relates to perceived sharpness: you also need sufficient simultaneous contrast if you want the image to strike the eye as ultra-sharp.