Introduction

ATI is no doubt enjoying the limelight lately with the release of the X1900, which is no surprise considering the new card puts ATI back in the lead over NVIDIA in performance. With some form of the 7800 GTX sitting at the top of that list for so long, it's a little strange to see an ATI card take its place, especially given the timing of the release and the short lifespan of the X1800 series. Now that ATI has these incredibly powerful cards available to consumers, we're interested to see how future games will put them to good use.

ATI has been successful with their All-In-Wonder line of multimedia graphics cards and has been good about offering it across a wide range of their GPUs. We recently reviewed the AIW 2006, which was based on the X1300, and ATI has apparently wasted no time in coming out with an All-In-Wonder card based on their newest part, the X1900.

The All-In-Wonder X1900 is the latest offering from ATI, and it promises the same (if not better) video and multimedia features we've seen before, on a card with top-end gaming performance. Until now, the All-In-Wonder X1800 XL was the most powerful card in the line, and we were impressed by its performance. However, the X1900 AIW will obviously be better for gaming, even if it's not quite as powerful as a stock X1900 XT.

Many of the multimedia/video features remain the same as in previous versions, so we will only touch briefly on each feature for those who aren't familiar with what the AIW offers. Because the GPU is clocked lower than a standard X1900, gaming performance will be something to look at closely, and we will be testing the AIW X1900 against a wide range of cards, including the standard X1900 XT and XTX. We've already reviewed a few different All-In-Wonder graphics cards, but for this one, there will be a slight change of focus.

Video quality is something we sometimes take for granted when talking about digital video. Different DVD players can differ greatly in image quality, and these differences are not all subjective. We will be looking closely at how ATI's latest drivers handle digital video processing by running the HQV benchmark on this card, and we'll also see how it compares to NVIDIA's decoding. ATI has made some changes to address past decoding problems, and we're interested to see how well they've done. For now though, let's look at the All-In-Wonder X1900.

The Card and The Features

43 Comments

  • bldckstark - Friday, February 10, 2006 - link

The example shown does not change the overall score of the card. The example shown is for the reader's reference to the test, and is not what the test was scored from. There may be other reasons someone might not give these tests merit, but this is not one of them. You could maybe rank on the author for this, but not the tests.
  • mpeavid - Friday, February 10, 2006 - link

    The example shown does not change the overall score of the card.

But how do we know that? Take example cadence 2224: according to the text, the same item is being compared, yet different frames are clearly shown. Even if their methodology is consistent, their text is not.

You have to be clear about this or it misleads your readers. It's like doing a 3D test using two different scenes to render. AnandTech uses the same 3D scenes to render, right?

  • rjm55 - Friday, February 10, 2006 - link

    Other sites did AIW 1900 reviews on January 31st. Why so long for AT? Did ATI pass you over on sending a sample?
  • fishbits - Friday, February 10, 2006 - link

    quote:

    If performance continues to increase at the rate that it has been, we aren't sure how game software will be able to keep up.

    By adding more polys, textures, particles, lights, shadows and shaders. You really didn't know this? Call any respected game dev house and ask them if they could possibly come up with a use for more GPU horsepower. The answer will be "Of course genius, we've got code and models we're waiting for capable hardware to run on, it's been that way for years. We'll take every bit of it we can get." Tell Anand I want you to spend this weekend benching EQ2 maxed out and tell us Monday if "we" still "aren't sure."

Anyhow, sounds like a nice card, but I'd rather have a more dedicated gaming card and a separate TV tuner solution.
  • Griswold - Monday, February 13, 2006 - link

Of course they want more power so they don't have to write efficient and optimized code. Your EQ2 example especially comes to mind. There are far too few companies that come up with highly optimized code that runs top notch on current hardware and provides extra eye candy on future generations.
  • Backslider - Friday, February 10, 2006 - link

    The 7800GT used in the test must be stock. The one I purchased came overclocked and performs much better than what the benchmarks are showing.

ATI is still too pricey at the moment. I looked up and down for an X1800 XL that could come within the price range of the 7800 GT that I purchased, and I couldn't find one. I wasn't going to pay $60 more when they perform so similarly. The prices were approximately:

    X1800XL 256 Stock $330
    7800GT 256 OC $270

ATI, get those prices down.
  • tuteja1986 - Friday, February 10, 2006 - link

I sold my 7800 GTX, bought an X1900 XT, and I couldn't be happier! If G71 fixes some issues like IQ and HDR with AA, then I will sell my X1900 XT and buy a 7900 GTX :) or else wait for R6xx and G8x.
  • Backslider - Friday, February 10, 2006 - link

Having owned an X800 XL and a 7800 GT, I honestly didn't see an IQ difference. The whole HDR with AA thing, well, you must play a lot of Far Cry.

Good luck with keeping up with the latest and greatest though; it's almost a game in itself. If you sell at the right times, you can upgrade for very little and still have the newest toys.

    Happy gaming
  • MrKaz - Friday, February 10, 2006 - link

I have an ATI 9700 and a GeForce 6600 GT, and the ATI's rendering looks better.

There are some annoying layers/plates in the NVIDIA rendering that I don't like.

    And just one note: the display is the same on both cards.
  • DeathByDuke - Friday, February 10, 2006 - link

I'd certainly buy one if it were around $299-349, considering it performs close to the much more expensive X1800 XT.
