Final Words

It has been far too long since AMD/ATI sat at the top of the performance charts; the crown had been lost on both the CPU and GPU fronts, but today's Radeon HD 3870 X2 introduction begins to change that. The Radeon HD 3870 X2 is the most elegant single-card, multi-GPU design we've seen to date, and its performance is indeed higher than that of any competing single-card NVIDIA solution out today.

AMD is also promising the X2 at a fairly attractive price point; at $449 it is more expensive than NVIDIA's GeForce 8800 GTS 512, but it's also faster in the majority of the titles we tested. If you're looking for something between the performance of an 8800 GTS 512 and an 8800 GT 512 SLI setup, the Radeon HD 3870 X2 is perfect.

Even more appealing is the fact that the 3870 X2 will work in all motherboards: CrossFire support is not required. In fact, during our testing it was very easy to forget that we were dealing with a multi-GPU board since we didn't run into any CrossFire scaling or driver issues. We're hoping that this is a sign of things to come, but we can't help but worry about the future of these multi-GPU cards.

The fact that both AMD and NVIDIA are committed to them is promising, and hopefully that means an even better experience when it comes to compatibility and performance with CrossFire and SLI (single-card or not). We've got no crystal ball, though; only time will tell how driver support evolves.

But today, we have a victory for AMD. The past few months have shown a very different AMD graphics division than we've seen at any point since the first talks of the acquisition. The Radeon HD 2900 XT was a failure, and now AMD arguably has the fastest single-card graphics solution on the market. The only worry we'd have if we were in AMD's shoes is that the 3870 X2 was made by putting a couple of 3870s onto a single board; if AMD can do it, NVIDIA can as well. And we all know how the 3870 vs. 8800 GT matchup turned out.

What AMD really needs is its next-generation high-end GPU; the 3870 X2 will buy it the top performance spot for a little while, but it's R700 that we really need to see.

Comments

  • HilbertSpace - Monday, January 28, 2008 - link

    When giving the power consumption numbers, what is included with that? Ie. how many fans, DVD drives, HDs, etc?
  • m0mentary - Monday, January 28, 2008 - link

    I didn't see an actual noise chart in that review, but from what I understood, the 3870 X2 is louder than an 8800 SLI setup? I wonder if anyone will step in with a decent aftermarket cooler solution. Personally I don't enjoy playing with headphones, so GPU fan noise concerns me.
  • cmdrdredd - Monday, January 28, 2008 - link

    then turn up your speakers
  • drebo - Monday, January 28, 2008 - link

    I don't know. It would have been nice to see power consumption for the 8800GT SLI setup as well as noise for all of them.

    I don't know that I buy that power consumption would scale linearly, so it'd be interesting to see the difference between the 3870 X2 and the 8800GT SLI setup.
  • Comdrpopnfresh - Monday, January 28, 2008 - link

    I'm impressed. Looking at the power consumption figures, and the gains compared to a single 3870, this is pretty good. They got some big performance gains without breaking the bank on power. How would one of these cards overclock though?
  • yehuda - Monday, January 28, 2008 - link

    No, I'm not impressed. You guys should check the isolated power consumption of a single-core 3870 card:

    http://www.xbitlabs.com/articles/video/display/rad...

    At idle, a single-core card draws just 18.7W (or about 23W measured at the wall through an 82% efficient power supply). How is it that adding a second core increases idle power draw by 41W? (See the back-of-envelope sketch after this thread.)

    It would seem as if PowerPlay is broken.
  • erikejw - Tuesday, January 29, 2008 - link

    ATI smokes Nvidia when it comes to idle power draw.
  • Spoelie - Monday, January 28, 2008 - link

    GDDR4 consumes less power than GDDR3, given that the speed difference is not that great.
  • FITCamaro - Monday, January 28, 2008 - link

    Also, you have to figure in the extra hardware on the card itself that links the two GPUs.
  • yehuda - Tuesday, January 29, 2008 - link

    Yes, it could be that. Tech Report said the bridge chip eats 10-12 watts.
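
For anyone checking the arithmetic in yehuda's comment, here is a minimal back-of-envelope sketch. The 18.7W, 82%, and 41W figures are the ones quoted in the thread; treating the 41W delta as a wall-socket measurement is an assumption made purely for illustration.

```python
# Back-of-envelope PSU-efficiency math for the idle power discussion above.
# Figures (18.7W, 82%, 41W) come from the comments; none are new measurements.

def wall_draw(dc_watts: float, psu_efficiency: float) -> float:
    """Power pulled from the wall to deliver dc_watts to the card."""
    return dc_watts / psu_efficiency

single_3870_idle_dc = 18.7   # idle DC draw of one HD 3870 (per the linked xbitlabs test)
psu_efficiency = 0.82        # assumed PSU efficiency from the comment

print(round(wall_draw(single_3870_idle_dc, psu_efficiency), 1))  # ~22.8W at the wall

# The comment's point: if the X2's second GPU adds ~41W of idle draw at the
# wall, that is roughly 41 * 0.82 = ~33.6W of DC draw, far more than a single
# 3870's ~19W idle, which is why PowerPlay looks broken on the X2.
extra_idle_dc = 41.0 * psu_efficiency
print(round(extra_idle_dc, 1))   # ~33.6W
```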
