The New Improved CrossFire

With the X1800 series of CrossFire cards, we will finally be able to test resolutions above 1600x1200. When it comes to multi-GPU solutions, the more flexibility the user gets, the better. It is difficult to justify dropping over a thousand dollars on a setup with hard limitations, and the move from single-link to dual-link TMDS receivers definitely makes this version of CrossFire a more viable solution than its first incarnation in the X800 series.
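
To put rough numbers behind that resolution ceiling, the short Python sketch below estimates the pixel clock a display mode needs and compares it against the nominal 165MHz single-link TMDS limit. The blanking overheads and mode list are our own ballpark assumptions rather than figures from ATI.

    # Rough sketch: why single-link TMDS tops out around 1600x1200 at 60Hz.
    # Blanking overheads and link limits are nominal approximations.

    SINGLE_LINK_MHZ = 165.0              # DVI single-link TMDS pixel clock ceiling
    DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ  # two TMDS links working in parallel

    def pixel_clock_mhz(width, height, refresh_hz=60, h_blank=0.30, v_blank=0.04):
        """Estimate the pixel clock a mode needs, including blanking."""
        total_pixels = width * (1 + h_blank) * height * (1 + v_blank)
        return total_pixels * refresh_hz / 1e6

    for width, height in [(1600, 1200), (2048, 1536)]:
        clk = pixel_clock_mhz(width, height)
        if clk <= SINGLE_LINK_MHZ:
            verdict = "fits on a single link"
        elif clk <= DUAL_LINK_MHZ:
            verdict = "needs dual-link"
        else:
            verdict = "beyond even dual-link"
        print(f"{width}x{height}@60Hz ~ {clk:.0f} MHz -> {verdict}")

With typical blanking, 1600x1200 at 60Hz fits just under the single-link ceiling, while 2048x1536 needs roughly 250MHz and simply cannot be driven over a single link, which is exactly the wall the original X800 CrossFire master cards ran into.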



The addition of dual-link TMDS receivers on the master card marks the biggest upgrade in the X1800 version of CrossFire. With a new dongle connector and the improved bandwidth of dual-link DVI, ATI has given its customers what they need to drive very high resolutions. Performance will still vary on a game-by-game basis, as alternate frame rendering (AFR) remains the most efficient (and most restrictive) multi-GPU mode. While ATI's Scissor and SuperTiling modes offer some flexibility, and the extended SuperAA modes offer an alternate way to add value (enhancing quality rather than performance), there are some caveats we will mention in our performance analysis (in particular with Black & White 2).
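
For readers who want a concrete picture of how those modes divide the work, here is a minimal conceptual sketch in Python. It is purely an illustration of the idea, not ATI's driver logic: AFR hands whole frames to alternating GPUs, while Scissor mode splits each frame into regions.

    # Conceptual sketch of two multi-GPU work-splitting schemes.
    # This is an illustration only, not ATI's driver implementation.

    def afr_assign(frame_index, gpu_count=2):
        """Alternate Frame Rendering: whole frames alternate between GPUs."""
        return frame_index % gpu_count

    def scissor_assign(scanline, frame_height, split=0.5):
        """Scissor mode: each frame is cut into top/bottom regions, one per GPU.
        A real driver could move the split point to balance the load."""
        return 0 if scanline < frame_height * split else 1

    # AFR: GPU 0 renders frames 0, 2, 4...; GPU 1 renders frames 1, 3, 5...
    print([afr_assign(f) for f in range(6)])                     # [0, 1, 0, 1, 0, 1]
    # Scissor: top half of a 1200-line frame to GPU 0, bottom half to GPU 1
    print(scissor_assign(100, 1200), scissor_assign(900, 1200))  # 0 1

The sketch also hints at why AFR is both the fastest and the most restrictive mode: each GPU renders complete frames on its own, so scaling is close to linear, but any effect that depends on the previous frame's output forces the cards to synchronize and share data.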



The basic features of CrossFire haven't changed from the initial design. For a refresher on CrossFire, check out our previous articles on the subject. Aside from the TMDS upgrades, ATI has refreshed its compositing engine with a larger FPGA from Xilinx, which allows it to composite the larger images made possible by dual-link DVI input. Most of the rest of the CrossFire hardware is either unchanged or only slightly altered. From a board layout standpoint, it would certainly make more sense for ATI to build GPU-to-GPU communication into its parts as NVIDIA has done with SLI. Incorporating a silicon version of the compositing engine into the GPU would save board space and could improve performance even further.
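
As a rough illustration of what that compositing step conceptually does with the two cards' output, consider the toy Python sketch below. The real work is done in hardware by the FPGA on the master card; this is only meant to show the idea behind the Scissor and SuperAA cases.

    # Toy sketch of frame compositing for two CrossFire modes.
    # The real compositing engine does this in the FPGA, not in software.

    def composite_scissor(top_half, bottom_half):
        """Stitch the two GPUs' scissor regions back into one frame."""
        return top_half + bottom_half            # lists of scanlines

    def composite_superaa(frame_a, frame_b):
        """Average two differently-sampled frames for higher effective AA."""
        return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

    print(composite_scissor(["top0", "top1"], ["bot0", "bot1"]))
    # ['top0', 'top1', 'bot0', 'bot1']
    print(composite_superaa([10, 40], [30, 60]))  # [20.0, 50.0]

In other words, Scissor and SuperTiling ask the engine to stitch regions back together, while SuperAA asks it to blend two differently-sampled frames into a single higher-quality image before scanout.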

At a high level, the X1800 CrossFire Edition can be paired with either a regular X1800 XT or an X1800 XL. There is no cheaper CrossFire card to match the less expensive X1800 XL, and in order to get CrossFire up and running with the X1800 XL, half of the RAM on the card must be disabled. Disabling the RAM requires a reboot, whereas CrossFire itself can otherwise be enabled and disabled without rebooting. While adding a CrossFire card to an existing X1800 XL setup will definitely increase performance, it ends up delivering even less for the money than pairing it with an X1800 XT. The price tag is already a bit hefty, and we expect that most people who want this card will be those who need the absolute maximum performance possible. CrossFire won't really be worth it as an upgrade unless the price of the X1800 CrossFire card comes down quite a bit.

Comments

  • almvtb - Tuesday, December 20, 2005 - link

    Has anyone ever compared SLI and crossfire performance using a dual core compared to just a single core cpu? I mean if there is enough overhead for sli or crossfire a dual core chip could improve performance.
  • kristof007 - Tuesday, December 20, 2005 - link

    I don't know if that dual core thing would work. I mean it might but the two slower CPUs would not help in my opinion. Games are single threaded so the multi CPU wouldn't take off the overhead .. at least that's my knowledge of it.
  • almvtb - Tuesday, December 20, 2005 - link

    See, I thought that was a big deal with one of the latest Nvidia driver releases: that it was made multithreaded so that in a situation such as SLI, or any other kind of driver overhead, the extra work would be taken care of by a second core if one existed. I do not know; it was just a thought that I had never seen discussed, so I thought I would ask.
  • bob661 - Tuesday, December 20, 2005 - link

    That was an ATI driver release that had the multithreading stuff, I think.
  • kilkennycat - Tuesday, December 20, 2005 - link

    We shall soon find out whether Crossfire is serious or just an ATi marketing straw-grabbing ploy to get some suckers (er, "enthusiasts") not to buy SLI. If the compositor is fully integrated into EVERY R580 GPU (thus never requiring a master board, and implementing the board communications via a passive bridge a la nVidia), then we shall finally know that ATI is serious about Crossfire. It was probably a stupid cheese-paring management decision not to integrate the Crossfire functionality fully into the R520 GPU, or else Crossfire does not have enthusiastic support from ATI engineering and is purely an ATi marketing ploy anyway. The R580 details will reveal the truth.
  • Spacecomber - Tuesday, December 20, 2005 - link

    What changed since the Battlefield 2 GPU Performance Analysis article (http://www.anandtech.com/video/showdoc.aspx?i=2466...)? It seemed like you were able to demonstrate the advantages of SLI in those benchmarks.

    Space
  • bob661 - Tuesday, December 20, 2005 - link

    I think AT has a different benchmark now for BF2.
  • Spacecomber - Thursday, December 22, 2005 - link

    As far as I know, the only things that have changed along the way are the addition of BF2 patches (according to the overclocking the Athlon X2 article, they are up to using the 1.03 patch) and newer nvidia drivers. I believe they are still creating a demo and running it with the timedemo option. With this being such a popular game, it seems like it would be worthwhile to confirm whether SLI/Crossfire does or does not offer significant improvements for BF2.
  • ViRGE - Wednesday, December 21, 2005 - link

    Ya, DICE seems to screw up demos with new BF2 patches.
  • ElFenix - Tuesday, December 20, 2005 - link

    I wonder if you can change B&W2's name to make the score go up as well. Maybe there is poor optimization going on in the Catalyst AI?
