ATI and NVIDIA: Quick Look at HDTV Overscan Compensation
by Andrew Ku on August 25, 2004 12:00 PM EST
It has been a while since ATI released their HDTV dongles, which provided HDTV output support for most of their Radeon line. In fact, we probably first experimented with their HDTV dongle back in July or August of 2002. Back then, HDTV output support was plagued by the overscan issue.
For those of you unfamiliar with "overscan", it is simply the part of the picture that is cropped. Depending on whom you ask, it is also described as the space that bleeds or "scans" beyond the edges of the visible area of the screen. Typical televisions can lose up to 20% of the image to cropping; this lost portion is what is commonly known as overscan. Technically speaking, the "lost picture" information is not actually lost; it simply falls outside the visible area of your TV screen. A similar situation on the computer side is viewing a picture at 100% scaling on a monitor set to a lower resolution than the picture, i.e. a 1600 x 1200 picture on a 1280 x 1024 desktop. The difference is that on a computer, you can move the picture around to see the portions cut off by the visible area of the monitor.
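To make the "up to 20%" figure concrete, here is a minimal sketch of how edge cropping translates into total image loss. It assumes, purely for illustration, that each edge of the picture loses the same fraction; real sets crop unevenly.

```python
def visible_fraction(overscan_per_edge: float) -> float:
    """Fraction of the source image that survives cropping when each
    edge loses `overscan_per_edge` of its dimension (e.g. 0.05 = 5%).

    Both width and height shrink by twice the per-edge loss (one crop
    on each side), so the visible area shrinks quadratically.
    """
    visible_axis = 1.0 - 2 * overscan_per_edge
    return visible_axis * visible_axis

# Cropping just 5% from every edge leaves 0.90 * 0.90 = 81% of the
# frame visible -- a 19% loss, close to the "up to 20%" figure above.
print(f"{1 - visible_fraction(0.05):.0%} of the image is lost")
```

The point of the quadratic is that a seemingly small per-edge crop compounds across both axes, which is why the total loss figure sounds so large.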
We should clarify that overscan is not necessarily a bad thing. It is implemented deliberately on TV sets because of the different video input formats (composite, s-video, etc.), all of which the TV needs to support. If overscan were not implemented to accommodate these different formats, there would likely be underscanning of varying degrees on different TV sets, due to the different protocols and inherently different signals that the TV must handle. (Underscanning is the opposite of overscanning: the image is smaller than the area on which it is displayed.) It would be tantamount to zooming out when you look at a picture, though in the case of TV sets, the space that doesn't reach the edges of the display would be black. Deliberate overscanning allows the screen to be completely filled, as opposed to underscanning, which could leave underscanned margins of varying size.
The reason why we notice overscan more on a computer is that we already know what the signal is supposed to look like: where the start menu button should be, where the clock should be, where the corners of the screen should be in relation to our desktop shortcuts. A DVD played on a DTV or even a regular TV usually encounters some measure of overscan, though we hardly notice it because we aren't used to its native signal. One way that DVD player or TV manufacturers compensate for this is to provide a zoom-out function, which essentially tells the system to underscan. This is also why we go crazy when we notice overscan from an Xbox rather than from a DVD signal: we know what the game menu is supposed to look like.
In theory, if an HDTV were designed only for HDTV signals, there would be no overscan from component computer video output. The main issue is that DTVs are built to handle more than just DTV signals. They accept many legacy sources: camcorders, s-video, composite, etc. All of this means that there must be cross-platform support for all formats, and the only way to achieve that is to either overscan or underscan. Underscanning would be more frustrating to the consumer, since the signal would be smaller than the display area, with black bars surrounding the image. Overscan ensures that the video signal always fills the screen, though this becomes increasingly frustrating when you hook up an Xbox or output video from your computer and the signal is overscanned.
And as Keith Rochford (Chief Engineer of eVGA) explained, when you switch to a DTV, you are now talking about a high-resolution display, and backwards engineering a pixel technology to a line-scan technology isn't a simple task. This backwards engineering, or conversion, is what leads to the large 10% to 15% overscan margins that we have grown accustomed to when outputting from a computer to a DTV. For those who own something like a plasma display that accepts direct computer video output via VGA or DVI, this obviously isn't an issue, since no backwards conversion is needed. Such a display is essentially a really big computer monitor, since it keeps the video card's native output.
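As an illustration of what driver-side overscan compensation has to do with those 10% to 15% margins, here is a hypothetical sketch that shrinks the rendered frame so the whole desktop survives the TV's cropping. The function name, the symmetric-margin assumption, and the rounding policy are all illustrative assumptions, not how ATI's or NVIDIA's drivers actually work.

```python
def compensated_resolution(width: int, height: int,
                           overscan_margin: float) -> tuple:
    """Return an underscanned render size so that, after the TV crops
    `overscan_margin` from each edge, the full desktop stays visible.

    Assumes (for illustration) the same margin on all four edges.
    """
    scale = 1.0 - 2 * overscan_margin
    # Snap to even dimensions, as video modes usually require.
    return (int(width * scale) // 2 * 2, int(height * scale) // 2 * 2)

# A 1920 x 1080 frame with a 5%-per-edge margin would be rendered at:
print(compensated_resolution(1920, 1080, 0.05))  # -> (1728, 972)
```

The trade-off sketched here is the same one the article describes: the compensated image fills less than the full panel, but nothing important falls off the edges.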
In the most practical sense, overscan is something you don't want to have, or at least want to minimize. Using your HDTV set as a substitute for your monitor can be awesome, but having part of the picture cropped becomes a major deterrent, especially when you want to play games, surf the web, or watch videos on that nice big screen.
There are more than just ATI and NVIDIA cards on the market, but most of us are still going to end up with one or the other, in which case you are most likely going to get some degree of overscan. Keep in mind that we can't track down every, or even most, DTV sets and check their degree of overscan; and even if we could, overscan varies between TV sets because of each manufacturer's design, which has no bearing on the video card. For these practical reasons, we are going to focus primarily on how ATI and NVIDIA approach HDTV overscan compensation.