What's on HDTV?

A blog about video (and, occasionally, audio) in the HDTV age.

Location: Catonsville, Maryland, United States

I'm Eric Stewart, a 66-year-old baby boomer, Catholic, single, no kids, two cats, retired as a computer analyst with the Social Security Administration (me, not the cats).

Friday, November 16, 2007

1080p, 1080i, and 720p

(This is an update of an earlier post, now deleted.)

Trying to figure out the actual resolution of a picture you see on an HDTV's screen is a complex affair.

The story of HDTV resolution begins with the terms 1080p, 1080i, and 720p. These terms give a number of rows of pixels on the screen (1,080 or 720, for high-definition TV) and a letter "p" or "i" that tells whether or not every pixel row is lit each time the image on the screen is refreshed.

If the designation is "p," for "progressive," all pixel rows are lit each time. Simple as that.

If "i," for "interlaced," first only the odd-numbered rows of pixels are lit. That's the first screen "refresh" operation in each pair of two such sequential operations. Then, 1/60 second later, only the even-numbered pixel rows are lit. That's the second screen refresh in the pair. After that, it's back to the odd-numbered rows for the next refresh, the first of two screen refresh operations in a new pair. And so on. (Why "interlaced"? Think of how your fingers look when you fold your two hands together: they're "interlaced.")
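The odd/even split can be sketched in a few lines of Python. This is a toy illustration (the function name is mine, and a frame is just a list of rows), not how a TV actually stores its image data:

```python
# Sketch of odd/even interlacing: one frame is split into two fields,
# each holding every other pixel row. Rows are numbered from 1 here,
# to match the "odd-numbered rows first" convention in the text.
def split_into_fields(frame):
    """Split a frame (a list of rows) into (odd_field, even_field)."""
    odd_field = frame[0::2]   # rows 1, 3, 5, ... (1-based)
    even_field = frame[1::2]  # rows 2, 4, 6, ...
    return odd_field, even_field

frame = [f"row {n}" for n in range(1, 9)]  # a toy 8-row frame
odd, even = split_into_fields(frame)
print(odd)   # ['row 1', 'row 3', 'row 5', 'row 7']
print(even)  # ['row 2', 'row 4', 'row 6', 'row 8']
```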

A pixel? It's a "picture element": a dot on the screen that is independent of every other dot, in terms of what color it is and how light or dark it is.

A "pixel row"? It's a horizontal array of pixels which corresponds to a "scan line" on an old tube-type TV. If you peer at one of those old picture tube TVs up close, you can see the scan lines.

1080p? That's jargon for a high-def picture made of 1,080 rows of pixels with 1,920 pixels in each row. All the rows of pixels are lit up on each and every screen refresh operation. Every pixel on the screen is refreshed each 1/60 second.

1080i? It likewise has 1,080 rows of pixels with 1,920 pixels in each row. Just the odd-numbered rows are lit up on the first screen refresh; then, on the second, just the even-numbered rows. It takes two 1/60-second refreshes to illuminate all the pixels with 1080i. Accordingly, all the pixels are refreshed each 1/30 — not 1/60 — second. 1080p gives you better moving images than 1080i by updating every pixel each time the screen is refreshed.

Both 1080p and 1080i use pixel rows containing 1,920 pixels each. Accordingly, both put over 2 million pixels on the screen. 1080p simply updates them twice as often as 1080i, for smoother moving images.

And 720p? That's like 1080p except that each "frame" of the displayed image is made up of 720 pixel rows, not 1,080. Each row of pixels in the 720p format contains 1,280 pixels, not the 1,920 pixels of 1080i or 1080p.

So 720p puts over 900,000 pixels on the screen. They're all updated as often as with 1080p, every 1/60 second. The whole screen is refreshed twice as often as with 1080i, but there aren't nearly as many pixels on the screen, so the image has much less static detail. However, if your screen is relatively small compared with how far you sit from it, you may not be able to tell the difference.
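The pixel arithmetic behind "over 2 million" and "over 900,000" is easy to check yourself (a quick Python sketch):

```python
# Total pixel counts for the three HD formats discussed above.
formats = {
    "1080p": (1920, 1080),  # pixels per row x number of rows
    "1080i": (1920, 1080),
    "720p":  (1280, 720),
}
for name, (width, height) in formats.items():
    print(f"{name}: {width * height:,} pixels")
# 1080p: 2,073,600 pixels
# 1080i: 2,073,600 pixels
# 720p: 921,600 pixels
```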

Keep in mind that 720p, 1080i, and 1080p HDTVs all use screen dimensions in which the ratio of the width to the height is 16:9. For standard-definition TV, the screen's "aspect ratio" is a squarish 4:3.


Terms like 720p, 1080i, and 1080p are ambiguous. They can refer to the native screen resolution of the HDTV itself, or, quite separately, to the format of the signals you feed into the HDTV.

As these terms apply to input TV signals, 1080p always refers to signals whose individual "frames" contain exactly 1,080 horizontal rows of exactly 1,920 pixels each, intended to be displayed progressively.

1080i signals have image frames with exactly 1,080 horizontal rows of exactly 1,920 pixels each, intended for interlaced display. With 1080i signals, the rows of each image frame are divided into two successive "fields," using odd-even interlace. The first in each pair of fields contains just the odd-numbered pixel rows, and the second field in each pair contains just the even-numbered rows. It takes two fields, 1/60 second apart, to make one frame.

720p signals' image frames always have exactly 720 horizontal rows of exactly 1,280 pixels each, displayed progressively.


As the terms 1080p, 1080i, and 720p apply to high-definition TV sets, they refer to the "native" screen resolution of the TV. As they refer to high-definition TV signals, they refer to the formats the signals are in.

The native resolution of an HDTV is the resolution at which all input signals, whatever their format, will eventually be displayed by the HDTV. Every HDTV converts 720p, 1080i, and 1080p input formats to its native screen resolution. Signal processing done to match the resolution of the input signal to the native resolution of the HDTV is often called "scaling" or "format conversion."
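What "mapping an input grid onto the native grid" means can be shown with a toy nearest-neighbor scaler in Python. Real TVs use far more sophisticated filtering; this sketch (the function name is mine) only illustrates the idea:

```python
def scale_to_native(frame, native_h, native_w):
    """Rescale a frame (a list of pixel rows) to the TV's native
    resolution by nearest-neighbor mapping: each output pixel is a
    copy of the input pixel whose position it corresponds to.
    Real scalers use much better filters."""
    src_h, src_w = len(frame), len(frame[0])
    return [
        [frame[y * src_h // native_h][x * src_w // native_w]
         for x in range(native_w)]
        for y in range(native_h)
    ]

# A 2 x 2 toy "frame" scaled up to 4 x 4: each input pixel now
# covers a 2 x 2 block of output pixels.
print(scale_to_native([[1, 2], [3, 4]], 4, 4))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```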

(1) Format conversion for 1080p HDTVs:

First, let's say the native screen resolution of the HDTV is 1080p. If it gets a 1080p input signal, fine; no conversion is needed.

If the input signal to a 1080p HDTV is 720p, that format gets upconverted to 1080p. The TV computes how each individual pixel in the 720p input signal ought to be spread out into adjacent pixels in the same 1080p row — and also into nearby pixels in the adjoining rows — of the output image to be displayed on the screen.

Finally, if a 1080p HDTV's input signal is 1080i, the pixels that are missing in each field of the input signal are filled in by a kind of computational guesswork, taking into account the contents of pixels in the neighboring pixel rows which are actually present in the input field. Also taken into account are the actual contents of the same pixel in the previous field and in the next field in the input stream.
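A very crude version of that guesswork, using only the rows within a single field, might look like the Python sketch below. Real deinterlacers also consult the previous and next fields, as just described; this intra-field version is the simplest possible illustration, and the function name is mine:

```python
def deinterlace_field(field):
    """field: one interlaced field, as a list of pixel rows (lists of
    numbers). Returns a full frame: rows that are present are kept,
    and each missing row is guessed as the average of its vertical
    neighbors."""
    frame = []
    for i, row in enumerate(field):
        frame.append(row)
        # the row below, or the same row again at the bottom edge
        below = field[i + 1] if i + 1 < len(field) else row
        frame.append([(a + b) / 2 for a, b in zip(row, below)])
    return frame

print(deinterlace_field([[0, 0], [10, 10]]))
# [[0, 0], [5.0, 5.0], [10, 10], [10.0, 10.0]]
```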

(2) Format conversion for 720p HDTVs:

720p input signals, first of all, are displayed without any scaling or format conversion.

1080p and 1080i input signals are both downconverted to 720p such that each output pixel combines information from more than one 1080p/i input pixel, thereby sacrificing some of the detail present in the original image.

If the input is 1080i, not 1080p, then before the downconversion to 720p takes place, the missing pixel rows in each input field are filled in by computational guesswork, just as when 1080i input is being converted to 1080p.
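The "each output pixel combines information from more than one input pixel" idea can be sketched as a crude box filter, here applied to a single pixel row (a toy Python illustration, not a real scaler's algorithm; the function name is mine):

```python
def downscale_row(row, out_width):
    """Shrink one pixel row with a crude box filter: each output pixel
    averages the span of input pixels it covers, so information from
    more than one input pixel is merged, and some detail is lost."""
    in_width = len(row)
    out = []
    for x in range(out_width):
        start = x * in_width // out_width
        stop = max((x + 1) * in_width // out_width, start + 1)
        span = row[start:stop]
        out.append(sum(span) / len(span))
    return out

print(downscale_row([0, 2, 4, 6], 2))  # [1.0, 5.0]
```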

(3) Format conversion for 1080i HDTVs:

1080i native HDTV resolution is rare. It can be used by HDTVs that produce their images using old-fashioned "picture tubes" or CRTs — either direct-view or rear projection. It can also be used by plasma flat panels made by Hitachi or Fujitsu. Most HDTVs made today do not use CRTs and do not have 1080i native screen resolution. Instead, most HDTVs made today have 1080p or 720p native resolution. (However, all HDTVs accept 1080i input signals.)

On a 1080i-native HDTV, a 1080i input signal will be rendered just as it is, without conversion.

A 1080p input signal will be downconverted to 1080i by having half of its pixel updates thrown away every 1/60 second. That is, in the first 1/60 second all the even-numbered pixel rows of the input signal's frame will be ignored. Then, in the second 1/60 second, the odd-numbered rows of the next frame will be ignored. The result will be equivalent to having received a 1080i signal, not a 1080p signal, in the first place. (1080p input signals are not widely available, since over-the-air HDTV broadcasts are either 1080i or 720p. Blu-ray and HD DVD players do output 1080p signals, however.)
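The row-discarding just described can be sketched in a few lines of Python (a toy illustration; row numbering in the comments is 1-based to match the text, and the function name is mine):

```python
def progressive_to_interlaced(frames):
    """Turn a progressive frame sequence into interlaced fields by
    throwing away half the rows of each frame, alternating between
    keeping the odd-numbered rows and the even-numbered rows."""
    fields = []
    for n, frame in enumerate(frames):
        if n % 2 == 0:
            fields.append(frame[0::2])  # keep rows 1, 3, 5, ...
        else:
            fields.append(frame[1::2])  # keep rows 2, 4, 6, ...
    return fields

frames = [["a1", "a2", "a3", "a4"], ["b1", "b2", "b3", "b4"]]
print(progressive_to_interlaced(frames))
# [['a1', 'a3'], ['b2', 'b4']]
```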

A 720p input signal will be upconverted for display at 1080i. Each of its input pixels will be spread out over more than a single screen pixel. That effectively converts the 720p input to 1080p. Then, half of the derived pixel rows will be discarded, just as when converting 1080p to 1080i.


Most digital HDTVs available today have a native screen resolution listed as either 720p or 1080p. The image is typically progressively displayed, not interlaced, and it has either 720 pixel rows (720p) or 1,080 rows (1080p). Flat-panel HDTVs (LCD, plasma) and rear-projection "microdisplays" (LCD, DLP, LCoS, SXRD) typically fall into this category of HDTVs whose displays are natively progressive.

Some so-called "720p" flat panels actually give you a screen resolution using, say, 768 pixel rows. Every digital HDTV is built with a fixed number of pixels in each row appearing on the screen (call it X) and a fixed number of rows (call it Y). For plasma and LCD flat panels, the number Y is sometimes (for whatever reason) 768.

If you input a 720p signal to a flat panel whose native resolution uses 768 rows, you'll see only 720 rows' worth of definition, since the internal upconversion to 768 displayed rows doesn't add any extra picture detail. If you input a 1080i or 1080p signal to this 768-row flat panel, the internal downconversion will give you 768 visible rows of detail, but not the full 1,080 rows present in the input signal.

As for X — how many pixels per row there are in a "720p" or "1080i/p" flat-panel plasma or LCD HDTV — you are apt to find numbers like 1,366 or 1,024, instead of 1,280. 1,366 is a number typically found in flat panels of the LCD variety, whatever their diagonal size. It is also typical of the larger plasma HDTVs, 50 inches and up. Smaller plasmas (but not smaller LCDs) often use just 1,024 pixels per row.

The number of pixels per row of the native screen resolution accordingly need not match the number of pixels per row of the input signal, which will always be 1,280 pixels for 720p or 1,920 pixels for either 1080i or 1080p. Scaling down of the input image can be required, sacrificing horizontal detail.

That's right: depending on the input signal type and the screen resolution, scaling can reduce the amount of horizontal detail in the picture. For example, if the input is 1080p (1,920 x 1,080) and the screen resolution is 1,366 x 768, the effective number of pixels per row is reduced from 1,920 to 1,366, while the number of pixel rows is also reduced from 1,080 to 768.

But scaling in the upward direction cannot increase the amount of visible horizontal detail. For instance, if the screen is 1,366 x 768 and the signal is 720p (1,280 x 720), scaling 1,280 pixels per row to 1,366 pixels per row will not add to the amount of detail your eye can see. It will simply spread what detail there is across a larger number of screen pixels. Scaling the input signal accordingly can lower the effective resolution of the picture, but it cannot raise it.
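To see why upconversion adds no detail, here's a toy nearest-neighbor stretch of one pixel row in Python (the function name is mine; real TVs use smarter interpolation, but the principle is the same):

```python
def upscale_row(row, out_width):
    """Nearest-neighbor stretch: every output pixel is a copy of some
    input pixel, so no new detail appears. The same values are just
    spread over more screen pixels."""
    in_width = len(row)
    return [row[x * in_width // out_width] for x in range(out_width)]

stretched = upscale_row([10, 20, 30], 4)
print(stretched)  # [10, 10, 20, 30]
# Every value in `stretched` already existed in the input row.
```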


The same is true of any sort of format conversion the TV does — say, from 720p as an input format to 1080p as a display format, or from 1080i to 720p. If the conversion is a downconversion (1080i/p to 720p), detail is lost. If the conversion is an upconversion (720p to 1080i/p), no extra detail is gained. Seems unfair, doesn't it?

Furthermore, scaling or format conversion can jettison detail in either of two directions: horizontal detail, vertical detail ... or both. Doubly unfair, no?

That's why I believe most people with sufficient cash in hand will want to buy a 1080p HDTV of the type marketed as "Full HD." Its screen will have fully 1,080 pixel rows and — because it is "Full HD" — fully 1,920 pixels in each row. An HDTV whose native screen resolution is 1080p and which is marketed as "Full HD" will never jettison image detail during format conversion or scaling.

On the other hand, HDTVs that are marketed as "720p" — even if their screens actually use, say, 768 rows — will discard vertical and horizontal image detail for all 1080i and 1080p input signals. They can even reduce horizontal detail for 720p inputs — if their horizontal resolution is less than 1,280 pixels.

Furthermore, HDTVs that are marketed as 1080p but without the "Full HD" designation will discard horizontal detail from a 1080i/p input signal, though no vertical detail will be lost.


Over-the-air, cable, and satellite HDTV channels provide input in either the 1080i or 720p format, never in 1080p. Still, many 1080p HDTVs today can accept a true 1080p input signal from, say, a Blu-ray or HD DVD player. If they are "Full HD" sets, they can display all the horizontal resolution (i.e., 1,920 pixels per row) on the Blu-ray or HD DVD disc.

But beware: some early-model 1080p HDTVs still being sold may not accept 1080p input from a disc player! Oddly, their high-def inputs are limited to 1080i and 720p.

Keep in mind also that scaling and format conversion are always done with standard DVDs that are input to an HDTV. I'm talking about "regular" DVDs, not HD DVDs or Blu-ray discs. Regular, standard DVDs use 480i images: 480 pixel rows in two interlaced fields per frame.

The number of pixels per row on a DVD can be 704 or 720. Either way, the amount of horizontal detail is less than either 720p or 1080i/p nominally offers. Furthermore, it can be shoehorned into either a squarish 4:3 box or spread over a wider 16:9 aspect ratio. (Notice, accordingly, that pixels on a standard DVD are not "square," with width and height being exactly the same. All the pixels I have been talking about up to now are square.)
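As a rough illustration of non-square pixels, here's the pixel-aspect-ratio arithmetic for a 720 x 480 grid shown at 4:3 versus 16:9. This is a simplified Python calculation of my own that ignores the small active-area adjustments the actual DVD standards make:

```python
# Pixel aspect ratio: how much wider than tall each displayed pixel is.
# A square pixel would have a ratio of exactly 1.
def pixel_aspect_ratio(cols, rows, display_w, display_h):
    return (display_w / display_h) / (cols / rows)

print(round(pixel_aspect_ratio(720, 480, 4, 3), 3))   # 0.889 (tall, narrow pixels)
print(round(pixel_aspect_ratio(720, 480, 16, 9), 3))  # 1.185 (short, wide pixels)
```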

Let's say a 480i image on a standard DVD is fed into a 1080p HDTV. First it has to go from "i" to "p": interlaced to progressive. This "deinterlacing" of a DVD image is a complex subject in itself, since it is done differently for film-based material than for video-based images. For the former, the reversing of what is called "3:2 pulldown" is required. 3:2 pulldown allows film projected at 24 frames per second to be "scanned" for DVD video at 30 frames (60 fields) per second. Reversing the 3:2 pulldown, done either by the DVD player or the TV, restores the original film frames at 24 fps.

Reverse 3:2 pulldown is also called "inverse telecine." A so-called "progressive" DVD player can do it, or it can be done in the HDTV itself. Reverse 3:2 pulldown turns 480i into 480p, which then has to be upconverted to (say) 1080p. Again, the scaling and format conversion that are required can be done in the DVD player or in the HDTV. The result will be a picture with no more visible detail than on the DVD itself; it can also contain distracting "artifacts" if the reverse 3:2 pulldown is not done perfectly. Still, the image can be quite good ... even though it's not true high-def.
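The 3:2 cadence itself is simple to sketch: each film frame alternately contributes three fields and then two, which is how 24 frames become 60 fields every second (a toy Python illustration; the function name is mine):

```python
def three_two_pulldown(film_frames):
    """Spread 24 fps film frames across 60 fields/s video: frames
    alternately yield 3 fields and 2 fields (hence "3:2").
    Reversing this pattern ("inverse telecine") recovers the
    original film frames."""
    fields = []
    for n, frame in enumerate(film_frames):
        copies = 3 if n % 2 == 0 else 2
        fields.extend([frame] * copies)
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(fields)       # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields))  # 10 fields for 4 film frames, i.e. 24 fps -> 60 fields/s
```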


Scaling and format conversion are also done by a high-definition TV for standard-definition TV channels that its onboard tuner receives, or that are input to it from outboard devices such as a cable box or satellite receiver. Standard-def signals are 480i, like DVDs, but they have lower horizontal resolution than a DVD does. Standard-def over-the-air TV (or cable or satellite) signals can be transmitted in digital form or in analog form — it doesn't matter which. After being converted from analog to digital, if need be, they're still converted and scaled to the native screen resolution of the HDTV.

Any HDTV, whether 720p or 1080i/p, can theoretically do complete justice to the detail in standard-def TV images and in 480i images input from regular DVDs, provided that the deinterlacing, scaling, and format conversion (whether performed by the HDTV or by a source device such as a DVD player) are done well.


As a practical matter, then, when you shop for an HDTV you are generally asked to pick from two native screen resolutions: "1080p" or "720p." The former is more expensive, but it never sheds image detail with any signal source. The latter is more affordable, but it can shed picture detail.

If you read the fine print, some "720p" HDTVs (usually flat panels) actually have "extra" vertical resolution: 768 rows. Some also have "extra" horizontal resolution: more than 1,280 pixels per row. Other "720p" HDTVs (usually rear projectors) have exactly 1,280 pixels per row. Some "720p" flat panels, unfortunately, offer less than the regulation 1,280 pixels per row. From up close, they're less sharp.

Most or all current-model 1080p or "Full HD" HDTVs offer full-fledged 1920 x 1080 native resolution, and most or all accept 1080p inputs from external devices. (Avoid non-current models; they may accept only 720p and 1080i inputs.) Some 1080p HDTVs that lack the "Full HD" designation will reduce the horizontal detail in a 1080i/p input signal.


There are plasma HDTV models that actually offer 1080i native resolution, even though they're advertised as 1080p. Hitachi plasmas come to mind; Fujitsu plasmas use the same "ALiS" (Alternating Lighting of Surfaces) plasma technology.

ALiS plasma display panels, unlike other types, don't have gaps separating one pixel row from the next. More of the screen surface can be lit up, but only every other pixel row of a 1080p video frame containing 1,080 rows can be displayed at any one time. 1/60 second later, the remaining pixel rows are lit up, slightly offset vertically on the screen. In effect, in each 1/60-second frame of 1080p input, either the odd-numbered pixel rows or the even-numbered pixel rows are suppressed. (Notice that no pixel rows need to be suppressed for 1080i input. But pixel-row suppression does take place for 720p input after internal scaling to 1080p.)

For the same reason, ALiS plasmas advertised as "720p" (and actually displaying, typically, 768 rows) are actually "720i" (or "768i") displays. In each 1/60-second frame of 720p input, either the odd-numbered pixel rows or the even-numbered pixel rows are suppressed. Again, this does not adversely affect the already-interlaced video in a 1080i input signal.

Many of the larger ALiS plasmas have only 1,366-pixel horizontal resolution. Some smaller models offer only 1,024-pixel horizontal resolution.

On the other hand, ALiS plasmas offer bright screens with less "screen door effect" than other plasmas have: the ability to see the gaps between the pixels when you sit close to the screen. Also, pixel-row suppression means each pixel is lit only half the time, resulting in longer panel life.


Aside from ALiS plasmas, the only HDTVs I know to use a natively interlaced screen format (1080i, for example) are CRT-based. (A CRT? It's a "cathode ray tube": an old fashioned picture tube.)

There are a few direct-view CRT-based HDTVs still sold, and until recently rear-projection CRT-based HDTVs were also on the market, though I believe they're no longer made. They have in common that an electron beam sweeps across the CRT screen once for each scan line, the equivalent of a pixel row. The lines are scanned in an odd-even, interlaced fashion, no matter what format the input signal uses.


To try to reduce all the above to a few sentences:

  • Prefer "1080p" HDTVs if you can afford them.
  • Buy "720p" HDTVs if you're on a budget or for screen sizes less than 37".
  • Be aware that some plasmas suppress pixel rows for input signals with a progressive format.
  • Check the fine print to determine actual vertical and horizontal resolution.

Best of luck to you in your HDTV hunt!


10 Comments:

Anonymous davedit said...

This is an excellent, thorough post that has really helped my understanding of the subject. Thank you very much!

December 8, 2007 at 4:24 AM  
Anonymous h2opolo16 said...

Thanks a lot! Great post. Learned a great deal on HDTV up and down converting.Keep it up.

December 28, 2007 at 1:34 AM  
Anonymous Anonymous said...

Thank you for this substantial explanation! I just upgraded to a 52" 1080p Vizio LCD from a 50" 720p Vizio plasma (quit after 2 years), and am disappointed over an apparent loss of resolution with SD tv (1080i signal from cable box) and 720p DVD input. HD tv appears at best the same, if not slightly worse...what could explain this?

July 2, 2008 at 1:31 PM  
Blogger eric said...

Anonymous,

You said: I just upgraded to a 52" 1080p Vizio LCD from a 50" 720p Vizio plasma (quit after 2 years), and am disappointed over an apparent loss of resolution with SD tv (1080i signal from cable box) and 720p DVD input. HD tv appears at best the same, if not slightly worse...what could explain this?

One possibility is that the LCD has motion blur, the result of lags in updating pixels that affect LCD and not plasma TVs. Does the picture seem to get sharper when you freeze the frame?

Or, the settings you are using for the way the new TV is processing the signal internally could be the culprit. Try to find a mode in which internal signal processing is completely off, and see how the picture looks.

It might be a good idea to get one of the DVDs that help you set up your TV for optimal performance, such as Digital Video Essentials or the Avia Guide to Home Theater.

Also, if you are ready to spring for a Blu-ray player, you might try to see how the new TV handles true 1080p material. It is not unusual for HDTVs to produce much inferior pictures when the input has less resolution than the actual screen. So your 1080p set might be compromised picture-wise with a 1080i input, owing to the unavoidable internal format conversion, which might not be done as well as one would hope. The same thing might apply even more with SD input. It's an expensive fix, but there are external video signal processors that do a much better job at format conversions.

If the new TV is still brand new, you may want to consider returning it and buying another brand and model.

Good luck in solving your problem!

July 2, 2008 at 4:57 PM  
Anonymous Raj said...

Great Article and very thorough one. thanks for explaining all this

October 19, 2008 at 11:25 AM  
Blogger TrantaLocked said...

This is an extremely helpful page, one of dozens I've read through that finally adequately explained interlaced signals to me and how televisions de-interlace those signals.

September 28, 2011 at 9:45 PM  
Blogger eric said...

Tranta,

I'm glad to be of help. Cheers!

Eric

September 30, 2011 at 12:23 PM  
Blogger Dane said...

Is a pixel on a 37" 1080P monitor smaller than a pixel on a 55" 1080P monitor?

November 12, 2011 at 3:33 PM  
Blogger eric said...

Dane said...

Is a pixel on a 37" 1080P monitor smaller than a pixel on a 55" 1080P monitor?

Dane,

Yes, a pixel on a 37" 1080P monitor is smaller than a pixel on a 55" 1080P monitor.

The width of each pixel will be 37/55 that of a pixel on the larger monitor. Likewise, the height will be 37/55 of that on the larger monitor. So the ratio of the pixels' areas will be the square of 37/55, or about 0.45. So the pixel on the 55" monitor will be a bit over twice as large!

Which is what you'd expect, since the surface area of the 55" screen is over twice as large.

However, if you sit (a bit over) twice as close to the 37", it will seem as big as the 55" when seen from the greater distance.

The key here is that both monitors are 1080p, so both will give you the exact same resolution, and the amount of resolution you perceive will depend on both the monitor's size and the seating distance.

Best,
Eric

November 13, 2011 at 10:33 AM  
