Video Immersion II

One of the most ignored but useful features of the original Radeon was its impressive video de-interlacing.  To understand what de-interlacing is, you have to understand how conventional, interlaced televisions work.  Inside your TV there is something called a CRT, or Cathode Ray Tube.  The CRT is home to electron guns that fire electrons at the face of the tube, the end that you view the images on.  These electrons excite a phosphor coating on the inside of the screen and cause colors to appear, depending on a variety of factors.  The way these electron guns paint a picture on your screen is by scanning from the left side of your screen to the right, shooting electrons along the way.  A complete left-to-right pass of the gun(s) is known as a scanline.  NTSC TV has a vertical resolution of roughly 480 visible scanlines (480 horizontal lines going from top to bottom).

In order to conserve signal bandwidth, conventional interlaced TVs only scan every other scanline.  So if your scanlines were numbered 0 through 479, the electron gun(s) would scan 0, 2, 4, 6, 8, and so on.  Then, after the guns reached the bottom of the CRT, they would return to the top and fill in the blanks, so to speak: scanlines 1, 3, 5, 7, etc…  Each of these half-pictures is called a field, and NTSC draws roughly 60 fields per second.  This all happens so quickly that your eyes are tricked into thinking that the full picture is being presented in front of you, when in actuality only half of it (alternating scanlines) is present at any given time.
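If you want to picture the mechanics, here is a minimal sketch in Python/NumPy (the function name and the 640x480 dimensions are purely illustrative) of how a full frame splits into the two fields an interlaced set actually draws:

import numpy as np

def split_into_fields(frame):
    # `frame` is a (height, width) array of pixels; each row is a scanline.
    # An interlaced set draws the even-numbered scanlines as one field,
    # then returns to the top and draws the odd-numbered scanlines.
    even_field = frame[0::2]  # scanlines 0, 2, 4, ...
    odd_field = frame[1::2]   # scanlines 1, 3, 5, ...
    return even_field, odd_field

# A 480-line frame yields two 240-line fields.
frame = np.arange(480 * 640, dtype=np.uint32).reshape(480, 640)
even, odd = split_into_fields(frame)
assert even.shape == (240, 640) and odd.shape == (240, 640)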

Computer monitors are non-interlaced, meaning that the electron guns in the CRT don't scan every other line; instead, they scan every line of the picture.  Newer HDTVs can also accept non-interlaced signals; these are often referred to as progressive scan signals and carry names like 480p, 720p, etc…

Some TVs can take an interlaced signal and convert it to a non-interlaced/progressive scan signal.  These TVs implement what is known as de-interlacing, more commonly referred to as line doubling.  De-interlacing is the process by which the odd scanlines present in interlaced video are blended with their even counterparts and displayed all at once, thus giving you a non-interlaced video output.  If done properly, de-interlacing can make video look a lot better; however, it also has some unfortunate side effects.
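Continuing the Python/NumPy sketch from above, here is a rough illustration of the two classic approaches; the function names and the line-averaging "blend" are our own simplification, not a description of any particular TV's circuitry:

import numpy as np

def weave(even_field, odd_field):
    # Weave: interleave the two fields back into one full-height frame.
    # Perfect for still scenes, but moving edges show comb artifacts
    # because the two fields were captured at different instants.
    frame = np.empty((2 * even_field.shape[0], even_field.shape[1]),
                     dtype=even_field.dtype)
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

def blend(even_field, odd_field):
    # Blend: average each even line with its odd counterpart, then
    # line-double the result.  This hides combing on motion at the
    # cost of vertical softness.
    avg = (even_field.astype(np.float32) + odd_field.astype(np.float32)) / 2
    return np.repeat(avg.astype(even_field.dtype), 2, axis=0)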

A poor de-interlacing algorithm will result in artifacts in the video itself.  ATI combated this with their Adaptive De-Interlacing on the Radeon.  This technology dynamically chose between bob & weave algorithms to produce the best quality de-interlacing of any 3D graphics card.  The Radeon 8500 takes it one step further by introducing what ATI calls temporal de-interlacing.  Although we didn't have enough time to test it thoroughly, temporal de-interlacing supposedly offers superior blending between the odd and even lines in order to reduce de-interlacing artifacts.
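ATI's exact algorithm is proprietary, so the following is only a generic sketch of the adaptive idea: choose per pixel between bob and weave based on a crude inter-field motion estimate.  The "threshold" value is purely illustrative, not an ATI parameter:

import numpy as np

def adaptive_deinterlace(even_field, odd_field, threshold=16):
    # Where the two fields agree (little motion), weaving preserves
    # full vertical detail; where they differ beyond `threshold`,
    # fall back to line-doubling (bob) to avoid comb artifacts.
    h, w = even_field.shape
    # Weave: interleave the fields into a full-height frame.
    woven = np.empty((2 * h, w), dtype=even_field.dtype)
    woven[0::2] = even_field
    woven[1::2] = odd_field
    # Bob: line-double one field to full height (motion-safe, softer).
    bobbed = np.repeat(even_field, 2, axis=0)
    # Crude motion estimate: absolute difference between the fields,
    # expanded to full frame height.
    motion = np.repeat(
        np.abs(even_field.astype(np.int32) - odd_field.astype(np.int32)),
        2, axis=0)
    return np.where(motion > threshold, bobbed, woven)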

Also, in order to further cater to the home theater PC crowd, ATI informed us that the Radeon 8500 will be compatible with a very cheap ($10 - $40) DVI-I to component output adapter.  This is very important for those that have component (YPbPr) inputs but no VGA input on their TVs.

HydraVision

This is the same dual display technology that was introduced with the Radeon VE.  For more information on how it stacks up against the competition, check out our Dual Display comparison.
