What the R200 contributes

We've talked about all of the features that the All-in-Wonder series has given to this product, but what about the R200 core that allows this card to carry the Radeon 8500 name? 

All of the architectural features of the Radeon 8500 are also present on the AIW Radeon 8500DV, but a few of them in particular make this card even more attractive to the home theater enthusiast crowd. ATI groups these features under the term Video Immersion II, which we described in our original Radeon 8500 preview:

One of the most overlooked but useful features of the original Radeon was its impressive video de-interlacing.  To understand what de-interlacing is, you have to understand how conventional, interlaced televisions work.  Inside your TV is something called a CRT, or Cathode Ray Tube.  The CRT houses electron guns that fire electrons at the face of the tube, the end you view the images on.  These electrons excite a phosphor coating on the CRT and cause colors to appear on your screen depending on a variety of factors.  The electron guns paint a picture on your screen by scanning from the left side of the screen to the right, shooting electrons along the way.  A complete left-to-right line of electrons shot from the gun(s) is known as a scanline.  NTSC TV has 480 visible scanlines (480 horizontal lines stacked from top to bottom).

In order to keep manufacturing costs low, conventional interlaced TVs would only scan every other scanline.  So if your scanlines were numbered 0 – 479, the electron gun(s) would scan 0, 2, 4, 6, 8, and so on.  Then, after the guns reached the bottom of the CRT, they would go back and fill in the blanks, so to speak: scanlines 1, 3, 5, 7, and so on would be drawn.  This all happens so quickly that your eyes are tricked into thinking that the full picture is being presented in front of you, when in actuality only half of it (alternating scanlines) is present at any given time.
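
To make the even/odd split concrete, here is a minimal sketch in Python with NumPy. It is purely illustrative; the frame array and its dimensions are our own assumptions, not anything a TV or ATI's hardware actually computes:

```python
# Illustrative only: split a stored 480-line frame into the two interlaced fields.
import numpy as np

frame = np.zeros((480, 640), dtype=np.uint8)  # hypothetical 480 x 640 frame

even_field = frame[0::2, :]  # scanlines 0, 2, 4, ... drawn on the first pass
odd_field  = frame[1::2, :]  # scanlines 1, 3, 5, ... filled in on the second pass

print(even_field.shape, odd_field.shape)  # each field holds half the lines: (240, 640)
```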

Computer monitors are non-interlaced, meaning that the electron guns in the CRT don't scan every other line; rather, they scan every line of the resolution.  Newer HDTVs can also accept non-interlaced signals; these are referred to as progressive scan signals and carry designations like 480p, 720p, and so on.

Some TVs can take an interlaced signal and convert it to a non-interlaced/progressive scan signal.  These TVs implement what is known as de-interlacing, more commonly referred to as line doubling.  De-interlacing is the process by which the odd scanlines in interlaced video are blended with their even counterparts and displayed all at once, giving you a non-interlaced video output.  If done properly, de-interlacing can make video look a lot better; however, it also has some unfortunate side effects.
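
As a rough illustration of the simplest approach, often called "weave," here is a minimal sketch that interleaves the two fields back into one full-height frame. The function name and array layout are our own assumptions for illustration, not ATI's implementation:

```python
# Minimal "weave" de-interlacing sketch: interleave two fields into one frame.
import numpy as np

def weave(even_field: np.ndarray, odd_field: np.ndarray) -> np.ndarray:
    height = even_field.shape[0] + odd_field.shape[0]
    frame = np.empty((height, even_field.shape[1]), dtype=even_field.dtype)
    frame[0::2, :] = even_field  # even scanlines go back to rows 0, 2, 4, ...
    frame[1::2, :] = odd_field   # odd scanlines fill rows 1, 3, 5, ...
    return frame
```

The catch is that the two fields were captured a fraction of a second apart, so anything that moved between them shows jagged "combing" edges when the fields are simply woven together; this is one of the side effects mentioned above.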

A poor de-interlacing algorithm will result in artifacts in the video itself.  ATI combated this with Adaptive De-Interlacing on the original Radeon; this technology dynamically chose between the bob and weave algorithms to produce the best quality de-interlacing of any 3D graphics card.  The All-in-Wonder Radeon 8500DV takes it one step further by introducing what ATI calls temporal de-interlacing, which syncs the input video source to the refresh rate of your monitor.  ATI claims that without their temporal filtering algorithms, text would appear to hop across the screen instead of scrolling smoothly.
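
To show the idea behind choosing between the two algorithms, here is a naive sketch: "bob" rebuilds a full frame from a single field by doubling its lines, and a crude motion estimate decides which algorithm to use for a given frame. The threshold, the decision rule, and the function names are purely our assumptions; ATI has not disclosed how its Adaptive De-Interlacing actually makes this choice.

```python
# Naive adaptive bob/weave sketch; not ATI's Adaptive De-Interlacing algorithm.
import numpy as np

def weave(even_field, odd_field):
    # Same interleaving helper as in the previous sketch.
    frame = np.empty((even_field.shape[0] * 2, even_field.shape[1]), dtype=even_field.dtype)
    frame[0::2, :] = even_field
    frame[1::2, :] = odd_field
    return frame

def bob(field):
    # Rebuild a full-height frame from one field by repeating every scanline.
    return np.repeat(field, 2, axis=0)

def adaptive_deinterlace(even_field, odd_field, motion_threshold=12.0):
    # Crude motion estimate: average absolute difference between the two fields.
    difference = np.mean(np.abs(even_field.astype(np.int16) - odd_field.astype(np.int16)))
    if difference > motion_threshold:
        return bob(even_field)            # lots of motion: weaving would comb, so bob
    return weave(even_field, odd_field)   # little motion: weaving keeps full detail
```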

Also, to further cater to the home theater PC crowd, ATI informed us that the All-in-Wonder Radeon 8500DV will be compatible with an inexpensive ($30 - $50) DVI-I to Component output connector.  This is very important for those whose TVs have Component (YPbPr) inputs but no VGA input.  The connector should be available either in December or early next year, directly from ATI's website.
