Multi-GPU Scaling: Two 3850s = One 8800 GTX?

AMD only sent us a pair of Radeon HD 3850s for this review (believe it or not, we had to beg to get a single 3870), so our only CrossFire numbers come from this setup. That being said, the performance is quite respectable:

Believe it or not, a pair of these $179 Radeon HD 3850s actually delivers the same performance as a single GeForce 8800 GTX.

Multi-GPU Scaling (2560 x 1600)   Radeon HD 3850 CF   GeForce 8800 GT SLI
Oblivion                          1.70x               1.87x
Unreal Tournament 3               1.48x               1.66x

Scaling from the Radeon HD 3850 looks pretty good, but it still falls short of what NVIDIA achieves with the 8800 GT: NVIDIA consistently gets about 11% better scaling from one to two GPUs than AMD does.
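The arithmetic behind that 11% figure is straightforward: the scaling factor is simply dual-GPU performance divided by single-GPU performance, and NVIDIA's advantage is the ratio of the two vendors' scaling factors. A minimal sketch using only the ratios from the table above (the raw frame rates themselves are not reproduced here):

```python
# Scaling factors from the table above (2560 x 1600).
# Scaling factor = dual-GPU fps / single-GPU fps for each setup.
cf = {"Oblivion": 1.70, "Unreal Tournament 3": 1.48}   # Radeon HD 3850 CrossFire
sli = {"Oblivion": 1.87, "Unreal Tournament 3": 1.66}  # GeForce 8800 GT SLI

# NVIDIA's scaling advantage per game, as a percentage:
# how much more of a second GPU the 8800 GT extracts than the 3850 does.
advantage = {game: (sli[game] / cf[game] - 1.0) * 100 for game in cf}

for game, pct in advantage.items():
    print(f"{game}: NVIDIA scales {pct:.0f}% better")
# Oblivion works out to ~10%, UT3 to ~12% -- roughly 11% on average.
```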

The other problem with CrossFire is that it simply doesn't always work, so a pair of 3850s is not necessarily a better option than a single 8800 GT or GTX. Case in point: the two other games we wanted to include here, Quake Wars and Call of Duty 4, both gave us lower frame rates with CrossFire enabled than without. AMD's release notes for the Radeon HD 3800 drivers inform us that some applications may show a performance decrease with CrossFire enabled, so we're not too surprised.

While it'd be nice to be able to purchase two cheap cards and get better performance than the best out there, there are simply too many caveats to really embrace the idea.

117 Comments

  • bbqchickenrobot - Wednesday, May 07, 2008 - link


    But now that new Catalyst drivers have been released, an updated benchmark needs to be completed, as the drivers provide better support for the hardware and thus better performance.

    Also, you used a non-AMD MoBo and chipset. If you went with CrossFire + AMD 790 chipset + Phenom X3/X4 processor (Spider platform) you would have seen better performance as well. There are other benchmarks that are/were done with these components (Spider) and the results weren't nearly as mediocre. Just a little tip...
    Reply
  • Adamseye - Tuesday, February 12, 2008 - link

    I can't see how every review I have read differs from your charts; the 2900 XT can't be faster than the 3850. I mean, I spent a month researching cards and the winner was the 3850, overclocking it to 3870 speeds. To think that AMD spent all that time to make a new 2900 XT and name it the 3850-70 is just foolish. From the benchmarks you provided, only an idiot would buy the new-gen cards for 60-100 bucks more when the 2900 XT is on par. Could you please explain to me how this happened? I feel like ordering a 3850 was a waste of money because the old 2900 is better anyway. Reply
  • aznboi123 - Saturday, February 02, 2008 - link

    Well dang, that bothers me...666...>,< Reply
  • spaa33 - Monday, December 03, 2007 - link

    It looked to me that the biggest complaint on the HD Video Decode article was that the 2600/2900 options did not provide an off switch for the Noise Reduction. Did you notice if this option appeared to be present in the newer drivers of this card (3850)?

    Regards,
    Dan
    Reply
  • emilyek - Tuesday, November 27, 2007 - link

    So AMDTI is still getting stomped by year old hardware?

    That's what I read.
    Reply
  • jpierce55 - Saturday, November 24, 2007 - link

    This is really a good review; some others are very Nvidia-biased. I would like to see you do an update with the new drivers in the near future if possible. Reply
  • gochichi - Friday, November 23, 2007 - link

    Anand,

    First Nvidia with its 8800GT... I clearly recall seeing those at about $200; now they're $300 or more. At least these may come bundled with a game... they also "hold the crown".

    Now the HD 3870 has gone up to $269.99 (at newegg) and availability is every bit as bad as the 8800GT.

    This review assumes that AMD/ATI was going to deliver in volume, at a fixed price and they haven't delivered either. It would be really nice if you could slap their wrists... as individual consumers we are being tossed about and we don't have the "pull" to do anything other than "take it".

    Shouldn't AMD be accountable to deliver on their promises?
    Reply
  • SmoulikNezbeda - Thursday, November 22, 2007 - link

    Dear Anand,

    I would like to ask what exactly the results for individual games represent. Are those average FPS, or something like (min + max + avg)/3 FPS? On one Czech website there were results similar to what was presented here, but they were showing (min + max + avg)/3 FPS, which is complete nonsense, as this would be advantageous for cards with more volatile results. When they compared average FPS, the Radeon had the same results as the GT card. I would also like to ask whether you used the same demo for both cards, or were playing the game freely and therefore testing the game in different situations.

    Thanks in advance

    Petr
    Reply
  • Sectoid - Sunday, November 18, 2007 - link

    If I'm not mistaken, the 8800GT is DX10 only, right? Is DX10.1 so insignificant as to not count in favor of the 3800s over the GTs? Don't get me wrong, I'm not trying to defend AMD; I just want to know if it's a good idea to sell my 8800GTS 320mb while it still fetches a good price (I live in Brazil and they're still pricey here) and buy a 3870 or an 8800GT with 512mb. I recently bought a 22" monitor and the GTS is somewhat disappointing at 1600x1050. Nah, it's just that crappy game World in Conflict. It runs similar to the Crysis demo at max! I have to play at medium, and the textures are really crappy for an 8-month-old high-end PC :(
    Who knows, maybe I'm already CPU or memory bound with a Core 2 Duo 6400@24xxMhz and dual OCZ Platinum 2 1gb 800mhz (2gb total)...
    Thanks in advance for any more input on the qualities of DX10.1 :)
    Reply
