Performance

ATI outfitted three motherboard manufacturers with fully functional CrossFire demo systems to show off at the event. The systems featured an ATI CrossFire reference board and a pair of graphics cards: a Radeon X850 XT and a CrossFire Radeon X850 XT.

The CrossFire X850 XT had a two-port DVI dongle: one port connected to the monitor, while the other connected to a DVI cable fed from the DVI output of the regular X850 XT card.

Even in CrossFire mode, the two graphics cards appear independently in Device Manager, which may allow for multi-monitor operation while CrossFire is enabled:



Enabling CrossFire is done from within the ATI control panel, and unlike NVIDIA's SLI, no reboot is required:



With CrossFire enabled, the new AA modes are available for user selection:



Armed with one of these machines that ATI sent to their partners, we managed to get some benchmark time with CrossFire. Unfortunately, we didn't have much time to test, nor did we have a full suite of benchmarks, so all we could run was Doom 3 (it was either Doom 3 or 3DMark05).

The system that we used for testing featured an Athlon 64 FX-53, 512MB of memory and the two X850 XT graphics cards running under Windows XP Professional.

We ran all Doom 3 tests with 4X AA enabled at the High Quality preset in the unpatched retail version of Doom 3.



Even at this early stage, performance and stability were both impressive. The system that we were running had just been assembled hours earlier and didn't crash at all during our testing. In fact, the system was so new that the motherboard manufacturer who let us test with their hardware hadn't even seen it running - it was their first time as well as ours.

The performance of the solution was equally impressive; at 1024x768, the dual GPU CrossFire setup improved performance by 49%. At 1280x1024 and 1600x1200, the performance went up by 72% and 86% respectively. We had our doubts that ATI would be able to offer performance scaling on par with what we've seen on NVIDIA's SLI, but these initial numbers, despite being run on early hardware/drivers, are quite promising.
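To make the scaling figures above concrete, the percentage improvement is just the ratio of dual-GPU to single-GPU frame rates. A minimal sketch of that arithmetic, using hypothetical frame rates for illustration (not the actual test numbers):

```python
def scaling_percent(single_fps: float, dual_fps: float) -> float:
    """Percentage improvement of a dual-GPU result over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical example: 50 fps single-card vs. 86 fps in CrossFire
print(round(scaling_percent(50.0, 86.0)))  # prints 72
```

By this measure, perfect scaling from a second GPU would be 100%, so the 86% figure at 1600x1200 is close to the theoretical ceiling.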


57 Comments


  • Pollock - Tuesday, May 31, 2005 - link

    Looks interesting to me, at least so far. I'll agree that we still have to see how things turn out.
  • Eug - Tuesday, May 31, 2005 - link

    Crossfire? Meh. About 0.1% of the population will buy dual GPU setups. Crossfire is essentially just a marketecture exercise.

    The really interesting part is the H.264 acceleration, which will have much, much more impact for the general computing world than Crossfire.
  • nitromullet - Tuesday, May 31, 2005 - link

    Interesting... They gave AT stock photos of an Intel-based motherboard, but the benchmarking was done on an AMD rig. Anyone know if the chipset(s) support Athlon X2 and/or Pentium D?

    To the person that mentioned that Doom3 is not a good benchmark for ATi: My guess is that Doom3 is probably a good benchmark to use for this purpose. ATi is most assuredly GPU-bound in Doom3, so any increase in GPU power will yield a positive result. Whereas in HL2, where ATi has really strong single-card performance, I would imagine that the Crossfire rig is CPU-limited, so there is not as drastic an increase.

    Either way, Crossfire looks to be pretty interesting. Can't wait to see some in depth benches and some screenies of the super AA modes.

    A request to AT: how about some benches in standard and widescreen resolutions? I know that SLI had compatibility issues with widescreen in the past, and it would be nice to know if those are still around and/or if Crossfire also suffers from this.

    Nice article, especially with the limitation of not being able to run a full suite of benchmarks.
  • jiulemoigt - Tuesday, May 31, 2005 - link

    So what about the fact that the new mode does not work in OpenGL? I happen to love playing with the DirectX API, but knowing that half the engines out there won't be able to use the new filtering? I like the idea of the new chip they are putting on the board, but I'm disgusted that most of the ATI stuff is marketing, not hardware. I have to develop for the hardware in people's machines and the biases that people have toward tech, and I'm getting sick of finding a new way of doing something and not being able to use it because it only works on NVIDIA, while ATI bashes it until they sort of get it working and then claims it's the best thing since sliced bread. Let's try to force the NVIDIA and ATI marketing people to focus on what is there, not what their side has.
  • DerekWilson - Tuesday, May 31, 2005 - link

    I suppose we should add a disclaimer to the statement about Super AA working with everything ...

    From what we *hear* from ATI, all games will work with CrossFire. This means that all games will work with at least one of the performance or quality modes. Even if a game doesn't work under AFR, split, or supertiling, it should work with Super AA ...

    But we will have to test compatibility for ourselves.

    Derek Wilson
  • porkster - Tuesday, May 31, 2005 - link

    I can't see why they can't just use one PCIe card with an extra socket. When you need to upgrade for more power, you buy just the chip and put the extra GPU chip in the socket to make a dual-GPU graphics card.

    SLI is a waste of time in that it's a direction in motherboard layout that isn't going to last; it's a dead-end road for the future. So rather than wasting all the time developing cards that work in tandem, make the card work with more GPU chips on the same daughter board.
  • AdamK47 3DS - Tuesday, May 31, 2005 - link

    "all games will be accelerated under any Super AA mode"

    I hope Anandtech isn't pulling my leg here. I'd love to see Halo PC or Splinter Cell using AA. Currently no form of multisampling allows AA in these games. There are probably more games out there that have the same multisampling limitation, but these are the two I know of.
  • matthieuusa - Tuesday, May 31, 2005 - link

    And should I add that investors seem to feel like ATI is taking its place back. NVIDIA stock already dropped 59 cents with the announcement of ATI CrossFire and R520 playback display...
    #37: totally agree, it is going to be a nice fight! If the R520 is overall better than the G70, NVIDIA will have to worry and counter it as fast as they can (which is going to be great for us).

  • matthieuusa - Tuesday, May 31, 2005 - link

    I agree with #35. It is a little early to know which solution will be considered the best, even if it seems that ATI is bringing some very interesting features along with their Crossfire. Not everybody cares about playing Doom3 at 110 fps. I'd rather play at 80 fps and have all the eye candy, and even more, since they are going to offer heavier FSAA.
    Since SLI and Crossfire will probably come out to be close, it is going to come down to which one of the R520 and G70 is the fastest with the most interesting features. But if ATI did as good a job with the R520 as they seem to have done with Crossfire, they could take back the crown.
    NVIDIA seems in a hurry to put the G70 on the shelves, which seems kind of suspect, since there are no real reasons to do so. They actually do have the most popular cards and the fastest configuration with SLI.
    Did they hear about the ATI R520 specs and fear being at a disadvantage? Do they need two G70s to beat it?
    Wait and See

    #36 2 PCI-E 8X -> that is exactly what SLI is right now with NVIDIA.
    Compositing Engine chip -> Do you remember the discussion about the PCI-E bridge implemented on GeForce 6 cards? Experience has shown no performance drop. Instead, in ATI's "SLI" solution, it seems even better since it is not part of the die -> less heat... It is not new to them, since they are using it in professional products.

  • kyaku00x - Tuesday, May 31, 2005 - link

    I think the next chipset king will be the one producing the best next-gen graphics cards. If the NV70 performs better than the R520, then I wouldn't think people are going to care about going ATI in the mobo department, but if R520 > NV70, then NVIDIA may start losing the chipset market.

    This will be really interesting to watch: the first chipset war determined by graphics cards :P
