Why You Should Check out the Latest in HDTV Technology

By: David Johnson

When the first high-definition television (HDTV) sets hit the market in 1998, movie buffs, sports fans and tech aficionados got pretty excited, and for good reason. Ads for the sets hinted at a television paradise with superior resolution and digital surround sound. With HDTV, you could also play movies in their original widescreen format without the letterbox "black bars" that some people find annoying.

But for a lot of people, HDTV hasn't delivered a ready-made source for transcendent experiences in front of the tube. Instead, people have gone shopping for a TV and found themselves surrounded by confusing abbreviations and too many choices. Some have even hooked up their new HDTV sets only to discover that the picture doesn't look good.

Fortunately, a few basic facts easily dispel all of this confusion. In this article, we'll explain the acronyms and resolution levels and give you the facts on the United States' transition to all-digital television. We'll also tell you exactly what you need to know if you're thinking about upgrading to HDTV.

For years, watching TV has involved analog signals and cathode ray tube (CRT) sets. The signal is made of continually varying radio waves that the TV translates into a picture and sound.

An analog signal can reach a person's TV over the air, through a cable or via satellite. Digital signals, like the ones from DVD players, are converted to analog when played on traditional TVs.

This system has worked pretty well for a long time, but it has some limitations:

a) Conventional CRT sets display around 480 visible lines of pixels. Broadcasters have been sending signals that suit this resolution for years, and an analog signal simply can't carry enough detail to fill a huge television screen with a sharp picture.

b) Analog pictures are interlaced -- a CRT's electron gun paints only half the lines for each pass down the screen. On some TVs, interlacing makes the picture flicker.

c) Converting video to analog format lowers its quality.
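To picture what interlacing actually does, here's a small illustrative sketch. It models a 480-line frame being split into two "fields" of alternating lines and then woven back together. The function names and the line-by-line model are hypothetical simplifications; a real CRT paints lines with analog timing, not code.

```python
# Illustrative sketch of interlaced scanning: one frame is delivered as two
# "fields" (half-frames), each carrying every other line of the picture.

VISIBLE_LINES = 480  # roughly the visible line count of a conventional CRT

def interlaced_fields(lines):
    """Split a full frame into an odd-line field and an even-line field."""
    odd_field = lines[0::2]   # lines 0, 2, 4, ... painted on the first pass
    even_field = lines[1::2]  # lines 1, 3, 5, ... painted on the second pass
    return odd_field, even_field

def weave(odd_field, even_field):
    """Recombine the two fields into a single full (progressive) frame."""
    frame = [None] * (len(odd_field) + len(even_field))
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

frame = list(range(VISIBLE_LINES))   # stand-in for 480 lines of pixels
odd, even = interlaced_fields(frame)
assert len(odd) == len(even) == 240  # each pass paints only half the lines
assert weave(odd, even) == frame     # the two fields together restore the frame
```

Because each pass paints only half the lines, the eye has to merge two slightly offset half-pictures, which is why some viewers perceive flicker on interlaced sets.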

United States broadcasting is currently changing to digital television (DTV). A digital signal transmits the information for video and sound as ones and zeros instead of as a wave. For over-the-air broadcasting, DTV will generally use the UHF portion of the radio spectrum with a 6 MHz bandwidth, just like analog TV signals do.

Digital broadcasting has several advantages over analog:

a) The picture, even when displayed on a small TV, is better quality. A digital signal can support a higher resolution, so the picture will still look good when shown on a larger TV screen.

b) The video can be progressive rather than interlaced -- the screen shows the entire picture for every frame instead of every other line of pixels.

c) TV stations can broadcast several signals using the same bandwidth. This is called multicasting.

d) If broadcasters choose to, they can include interactive content or additional information with the DTV signal.

e) It can support high-definition (HDTV) broadcasts.

DTV also has one really big disadvantage: Analog TVs can't decode and display digital signals. When analog broadcasting ends, you'll only be able to watch TV on your trusty old set if you have cable or satellite service transmitting analog signals or if you have a set-top digital converter.
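To see why multicasting works, a bit of back-of-the-envelope arithmetic helps. The figures below are approximate assumptions for illustration: the ATSC over-the-air standard fits roughly 19.4 megabits per second of program data into one 6 MHz channel, and a standard-definition subchannel might use around 4 megabits per second.

```python
# Rough multicasting arithmetic. Both figures are approximations used for
# illustration, not exact values from any particular broadcaster.

CHANNEL_CAPACITY_MBPS = 19.4  # approx. payload of one 6 MHz ATSC channel
SD_STREAM_MBPS = 4.0          # rough bit rate of one standard-definition program

subchannels = int(CHANNEL_CAPACITY_MBPS // SD_STREAM_MBPS)
print(subchannels)  # about 4 standard-definition programs fit in one channel
```

The same channel could instead carry a single high-definition program at a much higher bit rate, which is the trade-off broadcasters weigh when deciding whether to multicast.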

This brings us to the first big misconception about HDTV. Some people believe that the United States is switching to HDTV, that all they'll need for HDTV is a new TV and that they'll automatically have HDTV when analog service ends. Unfortunately, none of this is true.
