ATI has been particularly quiet regarding CrossFire since its launch at Computex back in June. I assumed that we wouldn't see CrossFire boards and cards until August or September at the earliest, and it's really starting to look like that's the case. But I'm not here to talk about CrossFire availability; I'm here to talk about performance.

There have been a number of CrossFire previews published all over the net, including our benchmarks from Computex. But what truly caught my attention were the tests over at HKEPC that showed two X700 cards working in CrossFire mode with nothing more than a simple BIOS update - no master card needed.

Something about that just didn't make much sense; ATI went through all the trouble of putting a compositing chip on its CrossFire master cards, and all you really need is the BIOS from a master card? So we did some digging, and Wesley discovered the truth behind current CrossFire performance.

ATI distributed a special driver to its partners prior to the Computex launch that was designed to simulate CrossFire performance by rendering only odd frames (effectively doubling the frame rate and approximating AFR performance). Although we can't confirm that we ran this driver back at Computex, chances are that we did. More importantly, the reviews you've seen where a pair of slave cards is used aren't actually testing CrossFire; they are simply simulating CrossFire's performance by rendering half the frames.
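
To make the trick concrete, here's a minimal sketch of how rendering only odd frames inflates a benchmark's reported frame rate. This is purely illustrative Python, not anything resembling ATI's actual driver:

```python
import time

def render_frame(frame_index):
    """Stand-in for the real per-frame rendering work."""
    time.sleep(0.01)  # pretend a single GPU needs 10 ms per frame

def measured_fps(num_frames, simulate_afr=False):
    start = time.time()
    for frame in range(num_frames):
        # The simulation driver reportedly skips even frames while the
        # frame counter still advances, so the reported rate doubles.
        if simulate_afr and frame % 2 == 0:
            continue
        render_frame(frame)
    return num_frames / (time.time() - start)

print(f"normal rendering: {measured_fps(100):5.1f} fps")
print(f"simulated 'AFR' : {measured_fps(100, simulate_afr=True):5.1f} fps")
```

Half the work gets done, but the benchmark counts all of the frames - which is roughly what a second card running real AFR would buy you.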

We have learned, however, that the performance of this special driver is virtually identical to AFR performance with CrossFire actually working. It is still important to understand that when CrossFire is eventually released, you won't be able to just flash the BIOS on a slave card and have it work as a master card. And obviously, you won't be able to use just any cards in CrossFire mode; you'll have to stick with an X850 or X800 master card.

Although I have yet to see final benchmarks, my guess right now is that ATI should hold the actual release of CrossFire until R520 boards have shipped. Had NVIDIA not launched the G70, it wouldn't have been as big of an issue, but with a single G70 basically equal in performance to a pair of 6800 Ultras, ATI isn't winning any brownie points by competing with yesterday's GPUs. I think launching and shipping R520 and CrossFire either back to back or simultaneously would put ATI's best foot forward, as CrossFire has lost a bit of its steam by this point.
Comments

  • Richard - Saturday, July 23, 2005 - link

    #14

    Uh, why? I have a mobo that's AGP, and it supports every Athlon 64 Socket 939 chip out today, INCLUDING dual core (I have one in my system right now, overclocked to 2.8GHz). You are wrong about AGP being outdated. The AGP bus wasn't even the limiting factor.
  • Heron Kusanagi - Friday, July 22, 2005 - link

    Anand, what do you think about CrossFire technology now that a system (albeit pre-release) has been tested?

    I really hope ATI's entry-level R520 comes in a CrossFire flavour at launch. Then those who need an X850 XT upgrade can do that, while the hardcore upgraders can get the top-end R520 CrossFire. For me, I would love an R520 CrossFire system, but hopefully ATI doesn't release an entry-level R520 with greatly reduced pipelines (like how the X800 Pro and X850 Pro were killed just because of 4 pipelines).
  • daniel - Wednesday, July 20, 2005 - link

    #32 - one word: overhead. Something like that is going to take up processing time, which reduces performance. It makes more sense when you're talking about computers, because they're not always going to be the same performance, and anyway those are single-threaded tasks. With, say, a 7800GTX, you're talking about 24 pipelines, 16 ROPs, 8 vertex engines - and suddenly it's a lot more complicated to just "execute the next line of code." You need a master telling a slave which ones to execute and which not to (they both have the same thing in memory), and that's going to take away performance; it just doesn't make much sense. SFR is great because each card knows exactly what part of the frame it's going to process and render, and it just does it. There's also some load balancing there; it will change how much goes to each card based on how hard it is. And if for some reason (Doom 3) SFR is not good (I don't understand why myself), AFR works pretty efficiently, too. NVIDIA put millions into SLI research over 3 years. I think they got the best plan ;)
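
The load balancing daniel mentions can be sketched in a few lines. A minimal illustration, assuming the driver feeds back how long each GPU took on its half of the previous frame (the real heuristics are unpublished):

```python
def rebalance(split, ms_top, ms_bottom, height, step=16):
    """Nudge the SFR split line toward the GPU that finished faster,
    so the slower GPU gets fewer scanlines on the next frame."""
    if ms_top > ms_bottom:
        split -= step          # top GPU was slower: shrink its share
    elif ms_bottom > ms_top:
        split += step          # bottom GPU was slower: shrink its share
    return max(step, min(height - step, split))

split = 512                    # start with an even split of a 1024-line frame
for ms_top, ms_bottom in [(22.0, 14.0), (20.0, 15.5), (18.0, 17.5)]:
    split = rebalance(split, ms_top, ms_bottom, height=1024)
    print(f"next frame: top GPU renders lines 0-{split - 1}")
```
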
  • EvolutionBunny - Wednesday, July 20, 2005 - link

    Some people can really think up weird schemes. Both nVidia and ATI try algorithms like even-odd frame rendering, or splitting the part to be rendered in two, to give both cards an equal share of the work.

    There's an even more flexible and possibly faster way to process the scene, and you won't have the trouble of needing two of the same cards (nv) or cards of the same speed (nv). It's called dynamic load balancing (DLB) - commonly used in computer clustering techniques for high-performance computing.

    DLB is easy to understand and has the advantage that the superior processor will do more work instead of being held back to the slower card's pace. DLB doesn't require special algorithms to split up the scene: just start from the beginning, and whichever card is available gets the next line to process.

    Just a thought.
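
For what it's worth, the scheme EvolutionBunny describes boils down to a greedy work queue, which is easy to simulate. The per-line costs below are made up purely for illustration; this shows why the faster card naturally ends up with more scanlines, not how any shipping driver behaves:

```python
def dynamic_load_balance(num_lines, cards):
    """Whichever card is free next grabs the next scanline, so a
    faster card does proportionally more of the frame."""
    free_at = {name: 0.0 for name in cards}    # when each card is next idle
    rendered = {name: 0 for name in cards}
    for _ in range(num_lines):
        card = min(free_at, key=free_at.get)   # first card to become free
        free_at[card] += cards[card]           # it takes the next line
        rendered[card] += 1
    return rendered, max(free_at.values())

# hypothetical per-scanline costs in milliseconds
work, frame_ms = dynamic_load_balance(
    1024, {"fast card": 0.020, "slow card": 0.028})
print(work)                   # the faster card renders more of the 1024 lines
print(f"{frame_ms:.2f} ms")   # total frame time vs. a slower 50/50 split
```

Note that daniel's overhead objection applies, though: every "next line" handoff is a synchronization point between the GPUs, which is exactly the per-frame cost that SFR's single coarse split avoids.
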
  • null_pointer_us - Wednesday, July 20, 2005 - link

    Anand, I would also like to see some LCD TV articles. There is supposed to be a new generation of them due out this fall, based on new technology that significantly beats the current crop. What is the new technology, which displays will have it, and how can I connect them to my PVR's Radeon 9700 Pro? :-)
  • Anonymous - Wednesday, July 20, 2005 - link

    Well, #29, the 6800U tied on BF2 (1600x1200 4xAA, which is what a high-end type will probably use unless you've got a 2048 display, which is unlikely), smashed the 7800GTX on Doom 3, beat it on EverQuest 2, tied it on Guild Wars, tied it on Half-Life 2, got beaten by the 7800 on Splinter Cell, was slightly faster in UT2004, and crushed the 7800 on Wolfenstein. Clearly you didn't read the Anand review.
  • Josh - Wednesday, July 20, 2005 - link

    #26 - clearly you didn't read the Anand review, then, because the 7800GTX generally beats the 6800U SLI setup slightly.
  • haf - Tuesday, July 19, 2005 - link

    #27 - it's a ridiculous statement that the X800 XL does not compete with the 6800GT. Sorry to step on toes, but I buy bang for the buck - and the X800 XL gives you 6800GT performance. The 6800GT does Doom 3 better, but the X800 XL kills the GT in Half-Life 2.
  • Cliff - Tuesday, July 19, 2005 - link

    Anand,

    A new LCD TV just came out this month, and I would love to see a review from you guys on it. It's the Syntax LT32HVE. It's the 32 inch model with their iDEA technology. Until now, only models up to 26 inches had this, and from what some people are saying, it makes a good bit of difference. It's just over $1K, and seems to be a good value for its size and supposed performance. Keep up the good work!
  • Daniel - Tuesday, July 19, 2005 - link

    So it takes a lot of power, but in most cases two 6800Us well outperform a 7800GTX... I would hardly call it a waste of money; sometimes it's better to be able to buy one now and one later.
    Also, a 6800GT will pwn an X800 XL in anything OpenGL and is about tied or a little better in D3D games. And PCIe is definitely faster and better than AGP: twice as fast in sheer bandwidth (even if it's not used yet), and the fact that it has the same speed up or down has made TurboCache/HyperMemory possible, making cheaper cards that can actually do something (as opposed to Intel integrated trash). And what's this about PCIe requiring more power than AGP? Learn to read - it supplies more power.
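
The bandwidth half of that claim checks out on paper. A quick back-of-the-envelope comparison using published peak numbers (sustained throughput is lower on both buses):

```python
# Peak spec bandwidth, not real-world throughput
agp_8x   = 66.66e6 * 8 * 4    # 66 MHz strobe x 8 transfers x 4-byte bus
pcie_x16 = 250e6 * 16         # 250 MB/s per lane x 16 lanes, per direction
print(f"AGP 8x  : {agp_8x / 1e9:.1f} GB/s, one direction at a time")
print(f"PCIe x16: {pcie_x16 / 1e9:.1f} GB/s each way, simultaneously")
```

The symmetric upstream link is what makes TurboCache/HyperMemory workable: the card can treat system RAM as local memory without choking on writeback.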
