This week, we were very lucky to get our hands on a CrossFire motherboard and a CrossFire master card from Gigabyte.

We have previously covered CrossFire, so please check out that article for more details. In short, CrossFire is ATI's answer to SLI: a master/slave card pairing in which the master card recombines the frames rendered by both GPUs and outputs the result to the display. Communication between the cards happens over PCI Express and over a dongle that plugs into the slave card's DVI port. And today, we have the pleasure of talking about performance.

While we did get the chance to take an early look at CrossFire during Computex, we recently learned that what we saw wasn't actually full CrossFire. This time, we have an actual master card in our hands and we'll put ATI's answer to SLI to the test. Of course, due to the very prerelease nature of these products, our tests were not without some bumps and detours.

We had some trouble getting CrossFire up and running due to a combination of factors. The first monitor we tried would not work through the master card's dongle once the drivers were installed. We weren't sure which slot the master card needed to occupy (we hear that this shouldn't matter once the final product ships), and we didn't know which of the slave card's DVI ports the dongle should plug into. After a bout of musical monitors, slots, and ports that finally produced a functional setup, we still had to spend some time wrestling the driver into submission.

After getting the driver issues squared away, we got down to testing. Our first disappointment came when we realized that the CrossFire AA modes were not quite finished. Enabling these modes drops performance far more than we would expect, and it appears that the frames each GPU renders are out of sync with those of the other card. We can't be totally sure what's going on here, but it's clear that some work remains to be done.

One thing that works well right now is SuperTiling. Aside from some random display corruption when switching modes, SuperTiling looked fine and ran at good speed.

Note that each GPU renders 32x32-pixel blocks (256 adjacent quads, for those keeping track).
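As a rough illustrative sketch (not ATI's actual implementation), the load balancing here can be modeled as a checkerboard: the frame is divided into 32x32-pixel tiles, with alternating tiles assigned to each GPU so that the work splits roughly in half regardless of where the complex geometry lands on screen. The function names below are our own, hypothetical ones:

```python
TILE = 32  # SuperTiling works on 32x32-pixel tiles (256 quads of 2x2 pixels each)

def gpu_for_tile(tx, ty):
    """Checkerboard assignment: tiles alternate between GPU 0 and GPU 1."""
    return (tx + ty) % 2

def tiles_per_gpu(width, height):
    """Count how many tiles each GPU renders for a given frame size."""
    counts = [0, 0]
    for ty in range((height + TILE - 1) // TILE):      # tile rows, rounding up
        for tx in range((width + TILE - 1) // TILE):   # tile columns, rounding up
            counts[gpu_for_tile(tx, ty)] += 1
    return counts

# At our 1600x1200 test resolution, the split comes out perfectly even:
print(tiles_per_gpu(1600, 1200))  # [950, 950]
```

The appeal of such a fine-grained split is that neither GPU can get stuck with a disproportionately expensive region of the screen, unlike a simple top/bottom split.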

One quirk of the prerelease hardware that did get in our way: setting standard AA modes in the control center didn't actually seem to enable antialiasing. Games with in-game AA settings worked fine. Without a good CRT on hand, we had to resort to our 1600x1200 LCD for testing, which limited our maximum resolution. Below 16x12, many games are not GPU limited under CrossFire, and even at 16x12 with no AA, we see some games that could be pushed much further.

This brings up a point that we will be making more and more as GPUs continue to gain power. Large LCDs are very expensive, and CRTs are on their way out. Buying a second 6800 Ultra, or soon adding a CrossFire master card to an X850, doesn't make much sense unless you can run above 1280x1024; for these kinds of setups, we would really recommend being able to run at least 1600x1200.

Let's take a closer look at the setup before we get to benchmarks.

The System


Comments

  • CP5670 - Friday, July 22, 2005 - link

    It depends on the person. I can notice a considerable amount of ghosting in a fast paced game like UT2004 even on an 8ms LCD.

    I would never go with an LCD for any kind of gaming for a number of reasons, but anyway I got a top notch CRT a month ago and don't need to worry about LCD limitations for a couple of years at least.
  • Guspaz - Friday, July 22, 2005 - link

    I'm a laptop gamer. I've got an older Mobility Radeon 9700 Pro, so I can't run everything at the native screen size of 1400x1050. As such there are many games that I play at 1024x768, or even 800x600 in the more extreme cases.

    Games look just fine with modern LCD scaling. That is to say, a 1024x768 game looks very good scaled up to 1400x1050. When ATI was designing their scaling algo, they obviously focused on making the pixels look square, rather than just doing a simple bilinear transform.

    The point of scaling on LCDs may be moot to serious gamers though, as modern desktop cards often don't need to run games at below native res. And when they do, I can report they still look good scaled up.

    I can also report that the response rate ("motion blur") is totally overblown. I've got a laptop, which obviously means I've got a pretty high response time (Probably 25ms to 30ms). Motion blur is noticeable, but isn't really distracting except in very bright areas. Desktop LCDs have improved a great deal beyond this with significantly lower response times to the point where it isn't an issue at all. I understand contrast ratios have also improved. All that is really left is colour saturation/accuracy (as a thing that CRTs are better at).

    Most gamers I know with modern PCs have LCDs now. CRTs are dying, slowly but surely, much like DVDs slowly replaced VHS. LCDs already outsell CRTs, and adoption among gamers is possibly even higher than the regular computer buying public.
  • jkostans - Friday, July 22, 2005 - link

    #23 no serious gamer would play in a window, and how can you not see the drawback of shrinking the screen size to get a lower resolution? Why turn a 19" display into a 14" display? I'm sorry, but my 21" CRT can do any resolution while still using all of the screen space and without interpolating. LCDs can't, period, and that's a big deal for 99% of gamers. Oh, and #21, CRTs will be around until there is a technology fit to take their place. LCD panels are not even close to the quality of a good CRT display.
  • blwest - Friday, July 22, 2005 - link

    Can we do this benchmark on a non-ATI chipset?
  • Samadhi - Friday, July 22, 2005 - link

    This article shows the need for a lot more reviews of high-resolution LCDs from a gaming perspective.
  • fungmak - Friday, July 22, 2005 - link

    Anand or Derek - two quick questions

    What's the configuration of the slave and master cards? Is it an XT as the master and an XT PE as the slave?

    Obviously, the speedup for CrossFire would look a little larger if the single-card comparison used a regular XT instead of an XT PE, since the pair is effectively two CrossFired XTs.

    Finally, in the Far Cry benchmarks, the single ATi card is labelled as the XT, but in the others, the single ATi card is an XT PE. Is this a typo?

  • Sea Shadow - Friday, July 22, 2005 - link

    Seems like most of the benches were CPU limited; even 1600x1200 isn't demanding that much from the 6800 Ultras, X850s, and 7800s in SLI.
  • BuddyHolly - Friday, July 22, 2005 - link

    This is not quite a paper launch, but more like a paper review. Almost useless in helping me decide what my next video purchase will be other than to show that Crossfire does in fact work.
    How about the important stuff? When will it ship? How much? And where is the next generation ATI card and when will it ship?
    I want more competition so I can retire my 9800 Pro and get a PCIe card, motherboard, and processor without having to mortgage my house or go without eating for a month...
  • Shodan2k - Friday, July 22, 2005 - link

    To Derek Wilson

    Which ATi driver did you use for the CrossFire setup?
  • yacoub - Friday, July 22, 2005 - link

    So the point of this is that CrossFire with two last-gen GPUs (X850 XTs) doesn't perform quite as highly as SLI with two current-gen GPUs (7800s)? That makes sense. If/when the next-gen ATI cards are released, I would expect them to do as well or better when CrossFired compared to SLI'd 7800 GTXs. We shall see. :)
