This week, we were very lucky to get our hands on a CrossFire motherboard and a CrossFire master card from Gigabyte.

We have previously covered CrossFire, so please check out that article for more details. In short, CrossFire is ATI's answer to SLI: a master/slave card combination in which the master card recombines the frame data produced by both GPUs and outputs the result to the display. The cards communicate over PCI Express and through a dongle that plugs into the slave card's DVI port. And today, we have the pleasure of talking about performance.

While we did get the chance to take an early look at CrossFire during Computex, we recently learned that what we saw wasn't actually full CrossFire. This time, we have an actual master card in our hands and we'll put ATI's answer to SLI to the test. Of course, due to the very prerelease nature of these products, our tests were not without some bumps and detours.

We had some trouble getting CrossFire up and running due to a combination of factors. The first monitor we tried wouldn't work on the master card's dongle once the drivers were installed. We weren't sure which slot the master card needed to occupy (we hear that it won't matter in the final product), and we didn't know which of the slave card's DVI ports the dongle should plug into. After a bout of musical monitors, slots, and ports finally produced a functional setup, we still had to spend some time wrestling the driver into submission.

After getting the driver issues squared away, we got down to testing. Our first disappointment came when we realized that the CrossFire AA modes were not quite finished. Enabling these modes drops performance much more than we would expect, and it looks as though the frames rendered by each GPU are out of sync with those of the other card. We can't be totally sure what's going on here, but it's clear that some work remains to be done.

One thing that works well right now is SuperTiling. Aside from some random display corruption when switching modes, SuperTiling looked fine and ran at good speed.

Note that under SuperTiling, each GPU renders alternating 32x32-pixel blocks (256 adjacent quads, for those keeping track).
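This screen division is easy to picture in code. Below is a hypothetical sketch in Python; the actual tile-assignment pattern is internal to ATI's driver and not publicly documented, so a simple checkerboard split between the two GPUs is assumed here:

```python
# Hypothetical sketch of SuperTiling's screen division. A checkerboard
# assignment of 32x32-pixel tiles between the two GPUs is assumed; the
# real pattern is internal to ATI's driver.

TILE = 32  # tile edge in pixels

def gpu_for_pixel(x, y):
    """Return which GPU (0 or 1) renders the tile containing pixel (x, y)."""
    tile_x, tile_y = x // TILE, y // TILE
    return (tile_x + tile_y) % 2  # alternate tiles like a checkerboard

# Each 32x32 tile covers 256 of the 2x2-pixel "quads" that a GPU shades
# at a time: (32/2) * (32/2) = 16 * 16 = 256.
quads_per_tile = (TILE // 2) ** 2
```

Under an assignment like this, each GPU handles half the tiles of every frame, which is why SuperTiling tends to balance load well when scene complexity is spread evenly across the screen.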

The only random quirk of the sort that we would expect to find in prerelease hardware, and which actually got in our way, is that setting standard AA modes in the control center didn't seem to enable antialiasing at all. Games that expose in-game AA adjustments worked well. Unable to use a good CRT monitor, we had to resort to our 1600x1200 LCD for testing, limiting our maximum resolution. Below 16x12, many games are not GPU limited under CrossFire, and even at 16x12 with no AA, some games could clearly be pushed much further.

This brings up a point that we will be making more and more as GPUs continue to gain power. Large LCDs are very expensive, and CRTs are on their way out. Buying a second 6800 Ultra (or, soon, adding a CrossFire master to an X850) doesn't make much sense if you can't run above 1280x1024, and we would really recommend being able to run at least 1600x1200 for these kinds of setups.

Let's take a closer look at the setup before we get to benchmarks.

The System
Comments

  • Hasse - Wednesday, July 27, 2005 - link

    Sigh.. Read comment to first post..
  • DerekWilson - Thursday, July 28, 2005 - link

    We're all still getting used to the new design :-)

    To actually reply -- the CrossFire card can work alone; you don't need another card. I know this because it was running by itself while I was trying various methods to coax it into submission :-) ... Also, even if Gigabyte's hardware is close to complete, ATI is in charge of the driver for the mobo and the graphics card, as well as getting a solid video BIOS out to their customers. These things are among the last in the pipe to be fully optimized.

    I don't think I could give up the F520 even for desk space. I'd rather type on an LCD panel any day, but if I am gonna play or test games, there's just no substitute for a huge CRT.
  • DanaGoyette - Monday, July 25, 2005 - link

    Want to know why 1280x1024 always looks wrong on CRTs? It's because IT IS wrong! 800x600, 1024x768, 1600x1200 -- all are 4:3 aspect ratio. 1280x1024? That's 5:4! To get 4:3, you need to use 1280x960! That should be benchmarked too, if possible.

    Whoever invented that resolution should be forced to watch a CRT flickering at 30 Hz interlaced for the rest of his life! Or at least create more standard resolutions with the SAME aspect ratio!

    I could rant on this endlessly....
  • DerekWilson - Tuesday, July 26, 2005 - link

    There are a great many LCD panels that use 1280x1024 -- both for desktop systems and on notebooks.

    The resolution people play games on should match the aspect ratio of the display device. At the same time, there is no point in testing both 1280x960 and 1280x1024 because their performance would be very similar. We test 1280x1024 because it is the larger of the two resolutions (and, by our calculations, the more popular).

    Derek Wilson
  • Hasse - Wednesday, July 27, 2005 - link


    I'd have to agree with Derek. Most people that play at 1280 use 1280x1024, but it is also true that 960 gives the true 4:3... While I don't know why there are two versions of 1280 (probably something related to video or TV), I also know that the performance difference is almost zero; try it for yourself. Running Counter-Strike showed no difference on my computer.

  • giz02 - Saturday, July 23, 2005 - link

    Nothing wrong with 16x12 and SLI'd 7800s. If your monitor can't go beyond that, then surely it would welcome the 8x and 16x AA modes that come for free, besides.
  • vision33r - Saturday, July 23, 2005 - link

    For the price of $1000+ to do Crossfire, you could've just bought the 7800GTX and you're done.
  • coldpower27 - Saturday, July 23, 2005 - link

    Well, multi-GPU technology to me makes sense simply when you can't get the performance you need out of a single card. I would rather have the 7800 GTX than the dual 6800 Ultra setup, for reduced power consumption.

    You gotta hand it to NVIDIA for being able to market their SLI setup down to the 6600 GT, and soon the 6600 LE line.
  • karlreading - Saturday, July 23, 2005 - link

    I have to say I'm running an SLI setup.
    I went from an AGP-based (nF3, Socket 754 A64 3200+ Hammer) 6800 GT system to the current SLI setup (nF4 SLI, Socket 939 3200+ Winchester) with dual 6600 GTs.

    To be honest, SLI ain't all that. To use my dual desktop (I have two 20-inch CRTs), I have to turn SLI off in the drivers, which involves a reboot, so that's a hassle. Yes, the dual GTs beat even a 6800 Ultra at non-AA settings in Doom 3, but they get creamed with AA on. Even the 6600 GTs I have are feeling CPU strangulation, so you need a decent CPU to feed them, and not all games are SLI ready. Apart from the nice view of two cards in the case (looks trick), I would rather have a decent single-card setup than an SLI setup.
    CrossFire is looking strong though! ATI needed it and now they have it; the playing field is much more level! karlos!
  • AnnoyedGrunt - Saturday, July 23, 2005 - link

    Also, as far as XFire competing with SLI, it seems to offer nothing really different from SLI, and in that sense only competes by being essentially the same thing but for ATI cards. However, NV is clearly concerned about it (as evidenced by their AA improvements, the reduction in SLI chipset prices, and the improvements in default game support). So, ATI's entry into the dual-card arena has really improved the features for all users, which is why this heated competition between the companies is so good for consumers. Hopefully ATI will get their XFire issues resolved quickly, although I agree with the sentiment that the R520 is far more important than XFire (at least in the graphics arena - in the chipset arena ATI probably can't compete without XFire).
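As a footnote to the aspect-ratio discussion in the thread above, the arithmetic is easy to verify. A few lines of Python (illustrative only) reduce each resolution to its simplest ratio and show why benchmarking 1280x960 alongside 1280x1024 adds little:

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest integer aspect ratio."""
    g = gcd(width, height)
    return (width // g, height // g)

# The classic CRT resolutions are all 4:3; 1280x1024 is the 5:4 odd one out.
print(aspect_ratio(1024, 768))    # (4, 3)
print(aspect_ratio(1280, 960))    # (4, 3)
print(aspect_ratio(1600, 1200))   # (4, 3)
print(aspect_ratio(1280, 1024))   # (5, 4)

# Why 1280x960 and 1280x1024 perform almost identically: the total
# pixel counts differ by only 6.25%.
pixel_gap = 1 - (1280 * 960) / (1280 * 1024)  # 0.0625
```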

