Such anomalies can often be laid at the doorstep of an inaccurate "grayscale"; indeed, that's the first reasonable explanation. Grayscale calibration means using test signals and expensive instruments to eliminate the inaccuracy by balancing the three primary colors — red, green, and blue — at every possible brightness level of the TV picture, so that the entire grayscale of the TV is pleasingly neutral in color.
Beware: grayscale calibration, if needed, is best done by a professional technician using state-of-the-art instruments.
For reasons I won't go into here, I had a bad experience trying to get professional grayscale calibration for my TVs. It never happened.
At any rate, recently I chanced to watch perhaps the first program I've ever seen on a high-definition channel in black and white: "Roy Orbison and Friends: A Black and White Night." (The show was recorded perhaps a year before the great singer-songwriter's untimely death from a heart attack in the late 1980s. The "friends" included a young Bruce Springsteen, Elvis Costello, Bonnie Raitt, k. d. lang, and a number of other music biz luminaries. Apparently it was taped in an early hi-def format. Great stuff technically, musically, and nostalgically.)
So this piece of hi-def, if B&W, gold showed up on my Samsung's screen with nary a trace of green!
That's chapter one. Chapter two: the same show was broadcast by my local PBS station during a pledge drive a few weeks ago. It was in standard definition this time. I watched it on my basement plasma, not my DLP, and it looked distressingly, disappointingly green.
Fortunately — and this is chapter three — it was even more recently shown yet again on INHD (or was it INHD2?) in hi-def, and when I watched that transmission on my plasma, no green tinge was apparent. The TV's grayscale appeared to be spot on, with no tint in sight.
Thus when a B&W show comes into my digital-cable DVR box over a standard-def channel, such as that of my local PBS affiliate, it can betray a green tincture. When the same show comes into that same DVR box over a hi-def channel such as INHD or INHD2, there can be no tincture. (In both cases my DVR box sends its output signal to my plasma TV over a digital HDMI/DVI connection, by the way, so the difference has nothing to do with the signal pathway.)
Oddly enough, I think I can explain this anomaly.
The explanation has to do with how TV signals are packaged. With the advent of color TV way back in the 1950s, there had to be a trick by which the three color primaries of red, green, and blue could be mixed into a single signal that made a suitable black and white picture, if only for the benefit of the many existing non-color TV sets. That weighted-sum signal was called "luminance," or Y. In practice the sum is computed from gamma-corrected versions of the primaries, and the result is properly called "luma," or Y' ("Y-prime").
Luma, or Y', is the sum of fixed proportions of the three gamma-corrected primary signals R', G', and B'. They in turn are derived from R, G, and B — shorthand for red, green, and blue.
The trouble today is, the fixed proportions of R', G', and B' that are used to compute Y' are different for HDTV than for standard-def TV.
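As a concrete illustration (a Python sketch of the arithmetic, not anything a TV actually runs), here are the two sets of coefficients and the weighted sum each standard specifies:

```python
# Luma (Y') is a fixed weighted sum of the gamma-corrected primaries
# R', G', B'. The weights differ between the SDTV and HDTV standards.
REC_601 = (0.299, 0.587, 0.114)     # Kr, Kg, Kb per ITU-R BT.601 (SDTV)
REC_709 = (0.2126, 0.7152, 0.0722)  # Kr, Kg, Kb per ITU-R BT.709 (HDTV)

def luma(rgb, coeffs):
    """Y' = Kr*R' + Kg*G' + Kb*B', with each component in [0, 1]."""
    return sum(k * c for k, c in zip(coeffs, rgb))

# For a neutral gray (R' = G' = B') the two standards agree, because
# each set of weights sums to 1.0 ...
print(luma((0.5, 0.5, 0.5), REC_601), luma((0.5, 0.5, 0.5), REC_709))

# ... but for a saturated color they do not: pure green contributes far
# more to luma under Rec. 709 than under Rec. 601.
print(luma((0.0, 1.0, 0.0), REC_601), luma((0.0, 1.0, 0.0), REC_709))
```

Notice that green dominates the sum under both standards, but much more heavily under Rec. 709.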
Other things being equal, an HDTV set that expects any signal it receives to conform to the new HDTV luma-encoding standard will do strange things when the signal was actually encoded for SDTV-style luma.
In order to drive its tri-color screen, the HDTV will combine the luma component with the two "color difference" components it also receives, R' - Y' and B' - Y', to recover R', G', and B'. But if the luma was encoded with SDTV's version of the numerical coefficients, the HDTV will forward too much of the received luma component to the green sub-image of the overall picture, and too little to red and blue. Or so my reasoning goes.
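Here is a small Python sketch of that reasoning: encode a pixel to luma plus the two color-difference signals using Rec. 601's coefficients, then decode with Rec. 709's, and watch the recovered primaries drift. (Illustrative arithmetic only; real equipment works on scaled, quantized versions of these signals.)

```python
# Encode R'G'B' to luma plus scaled color-difference signals (Cb and Cr
# are scaled versions of B' - Y' and R' - Y'), then decode, each step
# with its own set of luma coefficients (Kr, Kg, Kb).

def encode(rgb, k):
    kr, kg, kb = k
    r, g, b = rgb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2 * (1 - kb))   # scaled B' - Y'
    cr = (r - y) / (2 * (1 - kr))   # scaled R' - Y'
    return y, cb, cr

def decode(ycbcr, k):
    kr, kg, kb = k
    y, cb, cr = ycbcr
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg  # solve the luma equation for G'
    return r, g, b

REC_601 = (0.299, 0.587, 0.114)
REC_709 = (0.2126, 0.7152, 0.0722)

pixel = (0.6, 0.5, 0.4)  # a slightly warm near-gray
# Matched coefficients round-trip cleanly:
print(decode(encode(pixel, REC_601), REC_601))
# Mismatched coefficients (601-encoded, 709-decoded) skew the primaries:
print(decode(encode(pixel, REC_601), REC_709))
```

For this particular pixel the mismatched decode nudges green up and blue down. Interestingly, a perfectly neutral gray, whose color-difference signals are zero, survives the mismatch intact in this idealized math, so the green tinge may also depend on the chroma channels not being exactly zero.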
That can turn Peter Lorre's frightened face in Casablanca a tad greenish. It can make Roy Orbison's famously tinted glasses on "Roy Orbison and Friends: A Black and White Night" really tinted.
(For techies, the difference in the two forms of luma encoding, one for SDTV and one for HDTV, can be researched by looking up "Rec. 601," a nickname for ITU-R Recommendation BT.601, the international standard for television studios' non-HDTV digital signals, and "Rec. 709," a nickname for ITU-R Recommendation BT.709, the international standard for television studios' HDTV digital signals. These two standards specify other things besides techniques for luma encoding, but luma encoding is one of the biggies. You can begin your research with the Wikipedia articles on Rec. 601 and Rec. 709. Note that the relevant SDTV standard is sometimes called by an earlier name, "CCIR 601.")
It is apparently easy, given equipment that can accomplish the task, to convert between the two luma encodings. That's how the same program could wind up with two different encodings, one on the high-definition channel and one on the standard-def channel. Engineers at the TV studio might apply the requisite conversion matrix — mathematics embodied in electronics — and voilà.
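In matrix terms, that conversion can be expressed as a single 3×3 matrix: the product of the target standard's RGB-to-Y'CbCr matrix and the inverse of the source standard's. A sketch, assuming ideal un-quantized signals:

```python
import numpy as np

def rgb_to_ycbcr_matrix(kr, kg, kb):
    """RGB -> Y'CbCr matrix implied by luma coefficients Kr, Kg, Kb."""
    return np.array([
        [kr, kg, kb],                                        # Y' row
        [-kr / (2 * (1 - kb)), -kg / (2 * (1 - kb)), 0.5],   # Cb row
        [0.5, -kg / (2 * (1 - kr)), -kb / (2 * (1 - kr))],   # Cr row
    ])

M601 = rgb_to_ycbcr_matrix(0.299, 0.587, 0.114)
M709 = rgb_to_ycbcr_matrix(0.2126, 0.7152, 0.0722)

# One fused matrix: decode as Rec. 601 back to RGB, re-encode as Rec. 709.
convert_601_to_709 = M709 @ np.linalg.inv(M601)

# A pure-luma (neutral gray) signal passes through unchanged ...
print(convert_601_to_709 @ np.array([0.5, 0.0, 0.0]))
# ... while the matrix as a whole is decidedly not the identity.
print(convert_601_to_709)
```

The fact that a neutral gray passes through unchanged is worth noting: a correctly applied conversion leaves B&W material looking B&W.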
My plasma TV seems to be capable of reversing the process — but only when the signal reaches it in analog form via its component video or "YPbPr" input. In that case alone, the TV offers a user menu item which purports to let me switch between luma encodings manually.
Unfortunately, the same is not true of the digital connection I'm actually using between the cable DVR box and the TV. There, the signal comes out of the cable box's HDMI (High-Definition Multimedia Interface) port, passing through an HDMI-to-DVI adapter into the realm of the Digital Visual Interface, or DVI, the digital video format my two-year-old TV model is actually capable of receiving. DVI carries digital video signals exactly like those used for HDMI.
Such all-digital signal pathways into the TV simply assume the signal needs no conversion — or if it does, it takes place in the source device, in this case the cable box, prior to HDMI or DVI transmission. That's apparently why the TV, in its user menu, offers no color-encoding selection option for DVI.
* * *
Since filing the above I have done a little more experimenting with my Hitachi plasma TV. I now find that there undoubtedly exists more than one reason why black and white material can appear greenish.
I've recently purchased a four-DVD collection of the old British mystery movies starring Margaret Rutherford as Agatha Christie's amateur sleuth, Miss Marple. From the early 1960s, they're all in B&W. Played into the Hitachi via YPbPr (i.e., component video) from my Bose Lifestyle system's control unit, or via S-video from my Samsung DVD player, the first movie in the series, Murder She Said, exhibits an interesting anomaly. In the middle of one particular scene, at a transition to a new camera shot, the B&W picture changes from not greenish at all to just faintly greenish!
I'm not sure I can confirm this effect on my other HDTV, a Samsung DLP rear projector, using a different DVD player. It is, after all, quite a subtle effect. But it does show up big as life, in my iMac's DVD Player software, so I don't think I'm imagining it.
My best explanation: I suspect there is information recorded on the DVD which triggers a change in (I'm guessing) the so-called color space (Rec. 601 vs. Rec. 709) the DVD player is supposed to use when it decodes the disc. The digital picture information on a DVD is, I'm aware, accompanied by a raft of on-or-off bits which tell how the video was recorded, how it should be played back, etc. Maybe during the authoring of this DVD a crucial flag was changed at the transition in question. Maybe some DVD players take the change into account and some don't.
As I said earlier, my Hitachi plasma's user menu allows me to change color spaces only for its YPbPr input, not for DVI or S-video input. In general I find that forcing its YPbPr color-space decoding method to that for Rec. 601, or SDTV, does indeed make a very slight difference in a B&W DVD being sent to the TV from the Bose player. It adds quite marginally to the greenishness ... but not enough to account for all the greenishness I see!
I also find that, strangely enough, adjusting the Hitachi user menu's color control affects the greenishness of a B&W picture! When the color setting is lowered, the greenishness of B&W increases. When the color setting is raised, the greenishness (almost) goes away. Furthermore — and this is really strange — this color-setting dependency applies to YPbPr when the Pb and Pr input cables are both disconnected(!), such that the only input the TV receives is the supposedly colorless Y, or luma, signal.
Which suggests that the TV's internal color decoding algorithms do some odd things. You would think — obviously erroneously — that the luma signal would be treated the same, no matter what setting the color control has, since color (chroma; Pb and Pr) and black-and-white (luma; Y') are nominally independent. So if the B&W picture were on the greenish side with one color setting, it would be equally on the greenish side with another. But no! There's a clear-cut difference in the greenishness at different color settings.
That suggests that maybe I've been too hasty in dismissing the possibility (see above) that I ought to have my Hitachi professionally calibrated.
Yet, if my Hitachi is capable of, in effect, changing its calibration settings when the user color setting is altered, I wonder how much good professional calibration would do. I envision the calibrator being totally stumped by the TV's complexity — "They sure didn't tell me it could do that in calibrating school" — and telling me that compromises must inevitably be made. Live with it.
By the way, I think I can conclude from the foregoing that the Murder She Said DVD, at least prior to the camera-shot transition mentioned above, causes signal information to be put out on the Pb and Pr (i.e., chroma) channels such that the Y or luma signal is modified to offset what would otherwise be a greenish tinge. After the transition, perhaps the compensating Pb/Pr signals disappear, and there is greenishness. As I say, this change happens also in my computer software DVD player.
So there may be several possible sources of greenishness in a B&W picture:
- the DVD player or other source device using mismatched color-space decoding coefficients, e.g., Rec. 601 where Rec. 709 is expected, or vice versa
- inaccurate TV grayscale calibration
- other oddities in the TV's color-decoding methodology
- oddities in how the DVD was authored or the source program was broadcast
- and perhaps many others
If that's so, I can only conclude that TV in the digital age is so complex that it's almost impossible to get a "perfect" picture, if by "perfect" you mean a B&W rendition that is totally free of tint.