Last May, AMD introduced its much-delayed Radeon HD 2900 XT at $399. In a highly unexpected move, AMD indicated that it would not be introducing any higher end graphics cards. We discussed this in our original 2900 XT review:

"In another unique move, there is no high end part in AMD's R600 lineup. The Radeon HD 2900 XT is the highest end graphics card in the lineup and it's priced at $399. While we appreciate AMD's intent to keep prices in check, the justification is what we have an issue with. According to AMD, it loses money on high end parts which is why we won't see anything more expensive than the 2900 XT this time around. The real story is that AMD would lose money on a high end part if it wasn't competitive, which is why we feel that there's nothing more expensive than the 2900 XT. It's not a huge deal because the number of people buying > $399 graphics cards is limited, but before we've started the review AMD is already giving up ground to NVIDIA, which isn't a good sign."

AMD has since released even more graphics cards, including the competitive Radeon HD 3870 and 3850, but it still lacks a high end offering. The end of 2007 saw a slew of graphics cards that brought GeForce 8800 GTX performance to the masses at lower price points, but nothing any faster. Considering we have yet to achieve visual perfection in PC games, there's still a need for even faster hardware.

At the end of last year both AMD and NVIDIA hinted at bringing back multi-GPU cards to help round out the high end. The idea is simple: take two fast GPUs, put them together on a single card and sell them as a single faster video card.

These dual GPU designs are even more important today because of the SLI/CrossFire limitations that exist on various chipsets. With few exceptions, you can't run SLI on anything other than an NVIDIA chipset, and unless you're running an AMD or Intel chipset, you can't run CrossFire. These self-contained SLI/CrossFire graphics cards, however, will work on any platform.

AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870s on a single card.



The card is long: at 10.5" it's the same length as a GeForce 8800 GTX or Ultra. AMD is particularly proud of its PCB design, which is admittedly quite compact despite featuring more than twice the silicon of a single Radeon HD 3870.

On the board we've got two 3870 GPUs, separated by a 48-lane PCIe 1.1 bridge (no 2.0 support here guys). Each GPU has 16 lanes going to it, and then the final 16 lanes head directly to the PCIe connector and out to the motherboard's chipset.
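
To put those numbers in perspective, here's a quick back-of-the-envelope sketch (ours, not AMD's) of how the bridge's 48 lanes are divided and what each x16 link is theoretically good for, using the standard PCIe 1.x figure of 250MB/s per lane in each direction:

```python
# Rough sketch of the X2's internal PCIe topology as described above.
# Assumes the usual PCIe 1.x figure of 250 MB/s per lane, per direction.

PCIE1_MBS_PER_LANE = 250  # MB/s each way for PCIe 1.x

links = {
    "GPU 0 <-> bridge": 16,
    "GPU 1 <-> bridge": 16,
    "bridge <-> motherboard": 16,
}

assert sum(links.values()) == 48  # all 48 lanes of the bridge accounted for

for name, lanes in links.items():
    gbs = lanes * PCIE1_MBS_PER_LANE / 1000
    print(f"{name}: x{lanes} link, ~{gbs:.0f} GB/s per direction")
```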


Two RV670 GPUs surround the PCIe bridge chip



Thanks to the point-to-point nature of the PCI Express interface, that's all you need for this elegant design to work.


The 3870's board layout (left) is far simpler than the X2's (right), thanks to less stringent power requirements and the lack of a bridge chip and second GPU

Each GPU has its own 512MB frame buffer, but the power delivery on the board has been reworked to deal with supplying two 3870 GPUs.


The Radeon HD 3870 X2 is built on a 12-layer PCB, compared to the 8-layer design used by the standard 3870. The more layers you have on a PCB, the easier routing and ground/power isolation become; AMD says this is why it is able to run the GPUs on the X2 faster than on the single-GPU board. A standard 3870 runs its GPU at 775MHz, while both GPUs on the X2 run at 825MHz.

Memory speed is reduced, however: the Radeon HD 3870 X2 uses slower but more readily available GDDR3 in order to keep board cost under control. While the standard 3870 uses 2.25GHz data rate GDDR4, the X2 runs its GDDR3 at a 1.8GHz data rate.
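
For context, here's a rough sketch of what those clock changes work out to. The 256-bit per-GPU memory bus is our assumption (it's what the standard 3870 uses) rather than something stated here:

```python
# Back-of-the-envelope numbers for the clock differences described above.
# Assumption: each GPU on the X2 keeps the 3870's 256-bit memory bus.

BUS_WIDTH_BITS = 256  # assumed per-GPU memory bus width

def mem_bandwidth_gbs(data_rate_ghz: float) -> float:
    """Peak per-GPU memory bandwidth in GB/s: data rate x bus width / 8 bits per byte."""
    return data_rate_ghz * BUS_WIDTH_BITS / 8

core_gain = 825 / 775 - 1
print(f"Core clock gain per GPU: +{core_gain:.1%}")                      # ~ +6.5%
print(f"3870, 2.25GHz GDDR4:      {mem_bandwidth_gbs(2.25):.1f} GB/s")   # 72.0
print(f"X2 per GPU, 1.8GHz GDDR3: {mem_bandwidth_gbs(1.80):.1f} GB/s")   # 57.6
```

So each GPU gains a bit over 6% in core clock but gives up roughly 20% of its memory bandwidth relative to a standalone 3870, assuming that bus width.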


The 3870 has 8 x 64MB GDDR4 memory devices on one side of the PCB; the X2 has 4 x 64MB GDDR3 devices per GPU on each side of the PCB, for a total of 16 GDDR3 memory devices, or 1GB of memory
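
The capacity math is easy to sanity check; this trivial sketch just multiplies out the device counts from the caption above:

```python
# Sanity check of the X2's memory configuration described in the caption above.
devices_per_gpu_per_side = 4
sides = 2
gpus = 2
device_size_mb = 64  # 64MB GDDR3 devices

total_devices = devices_per_gpu_per_side * sides * gpus  # 16 devices
total_mb = total_devices * device_size_mb                # 1024MB = 1GB in total
per_gpu_mb = total_mb // gpus                            # 512MB per GPU
print(total_devices, total_mb, per_gpu_mb)               # 16 1024 512
```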

AMD expects the Radeon HD 3870 X2 to be priced at $449, which is actually cheaper than a pair of 3870s - making it sort of a bargain high end product. We reviewed so many sub-$300 cards at the end of last year that we were a bit put off by the roughly $450 price tag at first; then we remembered how things used to be, and it seems that the 3870 X2 will be the beginning of a return to normalcy in the graphics industry.


One GPU on the Radeon HD 3870


Two GPUs on the Radeon HD 3870 X2


74 Comments

  • HilbertSpace - Monday, January 28, 2008 - link

    When giving the power consumption numbers, what is included with that? Ie. how many fans, DVD drives, HDs, etc?
  • m0mentary - Monday, January 28, 2008 - link

    I didn't see an actual noise chart in that review, but from what I understood, the 3870GX2 is louder than an 8800 SLI setup? I wonder if anyone will step in with a decent after market cooler solution. Personally I don't enjoy playing with headphones, so GPU fan noise concerns me.
  • cmdrdredd - Monday, January 28, 2008 - link

    then turn up your speakers
  • drebo - Monday, January 28, 2008 - link

    I don't know. It would have been nice to see power consumption for the 8800GT SLI setup as well as noise for all of them.

    I don't know that I buy that power consumption would scale linearly, so it'd be interesting to see the difference between the 3870 X2 and the 8800GT SLI setup.
  • Comdrpopnfresh - Monday, January 28, 2008 - link

    I'm impressed. Looking at the power consumption figures, and the gains compared to a single 3870, this is pretty good. They got some big performance gains without breaking the bank on power. How would one of these cards overclock though?
  • yehuda - Monday, January 28, 2008 - link

    No, I'm not impressed. You guys should check the isolated power consumption of a single-core 3870 card:

    http://www.xbitlabs.com/articles/video/display/rad...

    At idle, a single-core card draws just 18.7W (or 23W if you look at it through an 82% efficient power supply). How is it that adding a second core increases idle power draw by 41W?

    It would seem as if PowerPlay is broken.
  • erikejw - Tuesday, January 29, 2008 - link

    ATI smokes Nvidia when it comes to idle power draw.
  • Spoelie - Monday, January 28, 2008 - link

    GDDR4 consumes less power than GDDR3, given that the speed difference is not that great.
  • FITCamaro - Monday, January 28, 2008 - link

    Also you figure the extra hardware on the card itself to link the two GPUs.
  • yehuda - Tuesday, January 29, 2008 - link

    Yes, it could be that. Tech Report said the bridge chip eats 10-12 watts.
