We always complain about relatively minor differences in performance between chipsets, motherboards and even CPUs, yet very little attention is paid to one of the most important aspects of computing: video output quality.

Over the past few years, as 19" and 21" monitors have become more common, users have begun noticing that the output from their video cards isn't as clear as it could be. Issues such as overly blurry text and an inability to read smaller fonts appeared, and because they showed up when running normal applications in Windows, they were incorrectly referred to as poor "2D image quality." We're not exempt from guilt here, as we have performed our own subjective "2D image quality" tests of various graphics cards in the past - but bear in mind that these image quality issues affect all output from your video card, not just 2D windows.

To understand why this occurs, you must understand that the connection between your video card and most monitors is still an analog connection. What do we mean by "analog?" While it is true that the underlying basis of all digital circuitry is a collection of analog components, a digital system only understands two discrete values. When you transmit a 1 digitally, you'll get a 1 as your output regardless of voltage fluctuations or any other phenomena that occur during transmission, so long as the digital components are functioning properly. With an analog system, however, a 1 could end up looking like a 0.935 or a 1.062, introducing a level of uncertainty: the picture your video processor outputs won't necessarily be the same as what you see on your monitor.
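To make the distinction concrete, here is a minimal Python sketch of the idea. The noise figure and the 0.5 threshold are our own illustrative assumptions, not a model of any real video link; the point is simply that a digital receiver recovers the original bits exactly, while an analog receiver passes the perturbed levels straight through.

```python
# Minimal sketch: why digital transmission tolerates noise that analog passes through.
import random

def transmit(value, noise=0.08):
    """Add a small random perturbation, as an analog wire might."""
    return value + random.uniform(-noise, noise)

bits = [1, 0, 1, 1, 0]

# Digital receiver: anything above 0.5 reads back as a 1, so the
# original data survives the noise intact.
digital_out = [1 if transmit(b) > 0.5 else 0 for b in bits]

# Analog receiver: the perturbed level *is* the signal, so a 1 may
# arrive as 0.935 or 1.062 and the error shows up in the picture.
analog_out = [round(transmit(b), 3) for b in bits]

print("sent:   ", bits)
print("digital:", digital_out)   # matches the original bits
print("analog: ", analog_out)    # e.g. [1.062, 0.041, 0.935, ...]
```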

For example, imagine an analog connection between your keyboard and your computer. If the analog-to-digital converter on the computer's side misinterpreted the signal coming from your keyboard, then typing an 'h' could very well come out as a 'j' on your screen. Similarly, the blurriness that may be present at higher resolutions isn't what your graphics chip is actually outputting. The data to be displayed on your screen leaves the video card's frame buffer (memory) as digital data, but before it can leave the card it must pass through a RAMDAC. The RAMDAC (Random Access Memory Digital-to-Analog Converter) converts the digital data into an analog signal, and not too long ago it was the main culprit behind poor image quality. Today's RAMDACs offer much higher bandwidth and are of considerably higher quality, making quality loss in the RAMDAC less of an issue than it once was.
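For illustration, here is a rough sketch of what a RAMDAC does conceptually: look up each pixel's digital value and scale it to an analog voltage. The 0.7 V full-scale swing roughly matches the nominal VGA video level, but the 8-bit depth and the identity palette are simplifying assumptions on our part.

```python
# Conceptual sketch of a RAMDAC: lookup RAM (palette) plus a digital-to-analog converter.
FULL_SCALE_V = 0.7          # approximate peak analog level on a VGA line (assumption)
BIT_DEPTH = 8               # 8 bits per colour channel (assumption)

# The "RAM" part: a lookup table (palette). Here it is just the identity mapping.
palette = list(range(2 ** BIT_DEPTH))

def ramdac(pixel_value: int) -> float:
    """Convert one digital colour value into an analog voltage."""
    level = palette[pixel_value]
    return FULL_SCALE_V * level / (2 ** BIT_DEPTH - 1)

# A handful of frame-buffer values for one channel of a scanline.
scanline = [0, 64, 128, 192, 255]
voltages = [round(ramdac(p), 3) for p in scanline]
print(voltages)   # [0.0, 0.176, 0.351, 0.527, 0.7]
```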

After being processed by the RAMDAC, the analog signal leaves the video card, travels through your VGA cable (another source of signal quality loss) and enters your monitor. The loss is compounded even further if you have a digital flat panel instead of a conventional analog CRT, as the degraded analog signal is then converted back into digital form. It is this last stage that makes very little sense, because just a few steps earlier we were dealing with a completely digital signal leaving your video card's frame buffer; this is where DVI comes in.
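Putting the whole chain together, the sketch below models the round trip a digital flat panel is forced through - digital frame buffer to RAMDAC to VGA cable to the panel's own analog-to-digital converter - and shows how cable noise can nudge the recovered values away from the originals. The noise level, bit depth and voltage range are again illustrative assumptions rather than measured figures.

```python
# Toy model of the analog round trip a digital flat panel endures over VGA.
import random

FULL_SCALE_V = 0.7          # approximate peak analog level (assumption)
LEVELS = 256                # 8 bits per colour channel (assumption)

def dac(value: int) -> float:
    """The RAMDAC: digital frame-buffer value to analog voltage."""
    return FULL_SCALE_V * value / (LEVELS - 1)

def cable(voltage: float, noise=0.005) -> float:
    """The VGA cable: adds a small amount of induced noise."""
    return voltage + random.uniform(-noise, noise)

def adc(voltage: float) -> int:
    """The flat panel's converter: analog voltage back to a digital level."""
    level = round(voltage / FULL_SCALE_V * (LEVELS - 1))
    return max(0, min(LEVELS - 1, level))

original = [0, 64, 128, 192, 255]
recovered = [adc(cable(dac(v))) for v in original]
print(original)    # what left the frame buffer
print(recovered)   # what the panel displays, e.g. [0, 63, 129, 192, 255]
```

A fully digital link such as DVI skips the dac/cable/adc steps entirely, which is exactly why the last conversion stage above makes so little sense.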

Today we'll talk about the Digital Visual Interface (DVI) and how it is shaping up to eliminate these transmission problems when it comes to PC monitors. We'll also be talking about DVI implementations in currently available video cards, as well as how to improve your present-day analog video output if it's not so hot.
