Final Words

With very few exceptions, the GeForce 7950 GX2 leads in the single card department, and with similarly few exceptions, the X1950 XTX leads among single GPU cards. These are the two top performers in the graphics market right now. With the X1950 XTX priced much lower than the 7950 GX2 (if ATI's suggested pricing holds), whether the added performance is worth the premium is up to the buyer, but the 7950 GX2 offers an intriguing middle ground between single card and multi card setups in both performance and cost. At the ultra high end, X1950 CrossFire gets a bigger boost over X1900 CrossFire because the CrossFire card's core clock is higher in addition to the increased memory bandwidth offered by 2GHz data rate GDDR4. Compared to the 7950 GX2 and 7900 GTX SLI, X1950 CrossFire does very well.

The new X1900 XT 256MB comes in at the bottom of our high end tests but runs near the top of the heap in our midrange tests. This card will be an excellent value if it's available for $280, as ATI is suggesting. We know ATI will sell it at stock prices, but we've also heard from at least one vendor indicating they will lead with a higher price. Regardless, the X1900 XT 256MB is a well positioned product for its market. We did notice that the overclocked EVGA 7900 GT KO SuperClocked performed nearly the same as the 256MB card for just about the same cost. This puts them on equal footing in our book, and which purchase you make comes down to personal preference and feature requirements. If the X1900 XT 256MB does retail for $280, we can easily recommend it alongside overclocked 7900 GT cards at its price point.

On the power front, ATI has reduced load power significantly on the X1950 XTX compared to the X1900 XTX, and GDDR4 has officially made its debut. Today's tests have really been all about the memory: size, type, and speed. Of course, this is a better approach than simply renaming products.

Unfortunately, ATI decided that playing the name game is still a good idea. Maybe it makes sense from a marketing standpoint, but renaming the X1600 Pro to X1300 XT isn't going to make it a better card. And 10MHz beyond the X1600 XT is barely enough to warrant a different pair of letters after the model number, let alone a whole new series starting with the X1650 Pro. On the bright side, the name game does come with lower prices for the same performance, which is never a bad thing. We should be receiving our X1650 Pro and X1300 XT samples as this article goes live, so expect a follow up showcasing the latest at the low end in the near future.

We will also be revisiting multi-GPU performance with NVIDIA's 7950 GX2 Quad SLI. Like many others, we have had some difficulty getting Quad SLI to behave properly, but hopefully the biggest hurdles are behind us.

Availability is an issue, especially as we have seen quite a few hard launches over the past couple of years. It is very difficult for us to make a proper recommendation without real prices to guide us. While ATI is touting some pretty aggressive prices, we just aren't sure retail prices will hit the target. HIS and PowerColor have confirmed that they will at least be in the neighborhood, but we are hearing from other sources that prices may be much higher. ATI did try to push this launch back to the 14th of September to wait for availability, so it seems they realize their error; hopefully they won't repeat the mistake in their next major launch. We really want to hold off on purchasing recommendations until we know what these cards will cost, but at ATI's suggested prices, many of our recommendations would turn red.

Before we close, one reminder to people who really want the X1950 XTX: don't buy it. Pick up the X1950 CrossFire instead. For the same price and performance you get a much more versatile solution. If you really need both DVI outputs, the CrossFire dongle supports that as well, so all you're doing is adding a small amount of cable clutter. Basically, there's little point in not getting the CrossFire card -- assuming prices stay equal, of course.

74 Comments

  • JarredWalton - Wednesday, August 23, 2006 - link

    We used factory overclocked 7900 GT cards that are widely available. These are basically guaranteed overclocks for about $20 more. There are no factory overclocked ATI cards around, but realistically don't expect overclocking to get more than 5% more performance on ATI hardware.

    The X1900 XTX is clocked at 650 MHz, which isn't much higher than the 625 MHz of the XT cards. Given that ATI just released a lower power card but kept the clock speed at 650 MHz, it's pretty clear that their GPUs are close to topped out. The RAM might have a bit more headroom, but memory bandwidth already appears to be less of a concern, as the X1950 isn't tremendously faster than the X1900.
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think it's obvious why ATI is selling their cards for less now, and that reason is that a lot of 'tech savvy' users are waiting for Direct3D 10 to be released and want to buy a capable card. This is probably to try and entice some people into buying technology that will be 'obsolete' when Direct3D 10 is released.

    Supposedly Vista will ship with DirectX 9L and DirectX 10 (Direct3D 10), but I've also read the contrary: that Direct3D 10 won't be released until sometime after Vista ships. Personally, I couldn't think of a better time to buy hardware, but a lot of people think that waiting, and just paying through the nose for a video card later, is going to save them money. *shrug*
  • Broken - Wednesday, August 23, 2006 - link

    In this review, the test bed was an Intel D975XBX (LGA-775). I thought this was an ATI Crossfire only board and could not run two Nvidia cards in SLI. Are there hacked drivers that allow this, and if so, is there any penalty? Also, I see that this board is dual 8x pci-e and not dual 16x... at high resolutions, could this be a limiting factor, or is that not for another year?

  • DerekWilson - Wednesday, August 23, 2006 - link

    Sorry about the confusion there. We actually used an nForce4 Intel x16 board for the NVIDIA SLI tests. Unfortunately, it is still not possible to run SLI on an Intel motherboard. Our test section has been updated with the appropriate information.

    Thanks for pointing this out.

    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    as we all should know by now, Nvidia's default driver quality setting is lower than ATi's, and makes a significant difference in the framerate when you use the driver settings to match the quality settings. your "The Test" page does not indicate that you changed the driver quality settings to match.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Drivers were run with default quality settings.

    Default driver settings between ATI and NVIDIA are generally comparable from an image quality standpoint unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

    At the same time, during our Quad SLI followup we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of anisotropic filtering and in Chuck patch cases, and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).


    If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

    Thanks,
    Derek Wilson
  • ElFenix - Wednesday, August 23, 2006 - link

    could you run each card with the quality slider turned all the way up, please? i believe that's the default setting for ATi, and the 'High Quality' setting for nvidia. someone correct me if i'm wrong.

    thanks!

    michael
  • yyrkoon - Wednesday, August 23, 2006 - link

    I think as long as all settings from both offerings are as close as possible per benchmark, there is no real gripe.

    Although, some people seem to think it necessary to run AA at high resolutions (1600x1200+), but I'm not one of them. It's very hard for me to notice jaggies even at 1440x900, especially when concentrating on the game instead of standing still and looking for jaggies with a magnifying glass . . .
  • mostlyprudent - Wednesday, August 23, 2006 - link

    When are we going to see a good number of Core 2 Duo motherboards that support Crossfire? The fact that AT is using an Intel made board rather than a "true enthusiast" board says something about the current state of Core 2 Duo motherboards.
  • DerekWilson - Wednesday, August 23, 2006 - link

    Intel's boards are actually very good. The only reason we haven't been using them in our tests (aside from a lack of SLI support) is that we have not been recommending Intel processors for the past couple years. Core 2 Duo makes Intel CPUs worth having, and you definitely won't go wrong with a good Intel motherboard.
