Single-board CrossFire

The Radeon HD 3870 X2 features a single CrossFire connector at the top of the PCB, meaning you'll eventually be able to pair it with a second card to enable 3-way or 4-way CrossFire modes (3-way if you add a single 3870, 4-way if you add another 3870 X2).

Unfortunately, driver support for ATI's CrossFireX technology isn't quite there yet, although AMD tells us to expect something in the March timeframe. Given that CeBIT falls at the beginning of March, we're guessing we'll see it at the show.

As we alluded to earlier, the fact that the 3870 X2 features two GPUs on a single board means that it doesn't rely on chipset support to enable its multi-GPU functionality: it'll work in any motherboard that would support a standard 3870.

Driver support is also seamless: you don't have to enable CrossFire or fiddle with any settings; the card just works. AMD's Catalyst drivers attempt to force an Alternate Frame Render (AFR) mode whenever possible, but be warned: if a game has issues with the 3870 X2's multi-GPU rendering modes, you may only get single-GPU performance until AMD can fix the problem. In our testing we didn't encounter any such issues, but as new games and OS revisions come out there's always the chance, as we saw with the GeForce 7950 GX2.
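
To make the AFR idea a bit more concrete, here's a minimal, purely illustrative sketch in Python (our own simplification, not AMD's driver code or any real API) of how frames are divided between the card's two GPUs under AFR, and what the single-GPU fallback looks like when a game misbehaves:

# Illustrative only: a toy model of Alternate Frame Rendering (AFR).
# In AFR, successive frames are handed to the GPUs round-robin, so each
# of the 3870 X2's two GPUs renders every other frame. None of these
# names correspond to anything in AMD's actual driver.

NUM_GPUS = 2  # the 3870 X2 carries two GPUs on one board

def assign_frames_afr(frame_count, num_gpus=NUM_GPUS):
    """Map each frame index to the GPU that renders it under AFR."""
    return {frame: frame % num_gpus for frame in range(frame_count)}

def assign_frames_single_gpu(frame_count):
    """Fallback when multi-GPU rendering is disabled: one GPU does everything."""
    return {frame: 0 for frame in range(frame_count)}

if __name__ == "__main__":
    print(assign_frames_afr(6))         # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
    print(assign_frames_single_gpu(6))  # every frame lands on GPU 0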

AMD insists that by releasing a multi-GPU card it will encourage developers to take CrossFire more seriously. It is also committed to releasing future single-card, multi-GPU solutions, but we'll just have to wait and see how true that is.

Last Minute Driver Drop: Competitive Crysis Performance

Today's launch was actually supposed to happen last week, on January 23rd. At the last minute we got an email from AMD stating that the embargo on 3870 X2 reviews had been pushed back to the 28th and that we'd receive more information soon enough.

The reason for the delay was that over the weekend, before the planned launch on the 23rd, AMD was able to fix a number of driver issues that significantly impacted performance with the 3870 X2. The laundry list of fixes is as follows:

• Company of Heroes DX10 – AA now working on R680; up to 70% faster at 2560x1600 with 4xAA
• Crysis DX10 – improves up to ~60% on R680 and up to ~9% on RV670 in the Island GPU test, at resolutions up to 1920x1200
• Lost Planet DX10 – 16xAF scores on R680 improved by ~20% or more; AF scores were horribly low before, when they should have been very close to the no-AF scores
• Oblivion – fixed random texture flashing
• Call of Juarez – no longer randomly drops to a black screen after the DX10 benchmark run
• World in Conflict – 77% increase at 2560x1600 (32-bit) with 0xAA/16xAF and quality set to high
• Fixed random World in Conflict crashes to desktop
• Fixed CrossFire scaling in Colin McRae: DiRT, Tiger Woods 08, and Blazing Angels 2
• Fixed smeared text in World in Conflict under DX9

With a list like that, we can understand why AMD pushed the NDA back - but most importantly, the Radeon HD 3870 X2 went from not scaling at all in Crysis to actually being competitive.

The Radeon 3800 series has always lagged behind NVIDIA in Crysis performance, and with the old driver Crysis was a black eye on an otherwise healthy track record for the 3870 X2. The new driver improved Crysis performance by around 44–54% at high quality defaults, depending on resolution. The driver update doesn't make Crysis any more playable at very high detail settings, but it makes the X2's launch a lot smoother than it would've been.

According to AMD, the driver fix that so positively impacted Crysis performance had to do with the resource management code. Apparently some overhead in the Vista memory manager had to be compensated for, and without the fix AMD was seeing quite poor scaling when moving to the 3870 X2.

The Test

Test Setup

CPU: Intel Core 2 Extreme QX9650 @ 3.00GHz
Motherboard: EVGA nForce 780i SLI (power measurements done on an ASUS P5E3 Deluxe)
Video Cards: ATI Radeon HD 3870 X2
             ATI Radeon HD 3870
             NVIDIA GeForce 8800 GTX
             NVIDIA GeForce 8800 GTS 512
             NVIDIA GeForce 8800 GT (512MB)
Video Drivers: ATI 8-451-2-080123a
               NVIDIA 169.28
Hard Drive: Seagate 7200.9 300GB 8MB cache 7200RPM
RAM: 4x1GB Corsair XMS2 DDR2-800 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit

 

Comments

  • HilbertSpace - Monday, January 28, 2008 - link

    When giving the power consumption numbers, what is included with that? Ie. how many fans, DVD drives, HDs, etc?
  • m0mentary - Monday, January 28, 2008 - link

    I didn't see an actual noise chart in that review, but from what I understood, the 3870GX2 is louder than an 8800 SLI setup? I wonder if anyone will step in with a decent after market cooler solution. Personally I don't enjoy playing with headphones, so GPU fan noise concerns me.
  • cmdrdredd - Monday, January 28, 2008 - link

    then turn up your speakers
  • drebo - Monday, January 28, 2008 - link

    I don't know. It would have been nice to see power consumption for the 8800GT SLI setup as well as noise for all of them.

    I don't know that I buy that power consumption would scale linearly, so it'd be interesting to see the difference between the 3870 X2 and the 8800GT SLI setup.
  • Comdrpopnfresh - Monday, January 28, 2008 - link

    I'm impressed. Looking at the power consumption figures, and the gains compared to a single 3870, this is pretty good. They got some big performance gains without breaking the bank on power. How would one of these cards overclock though?
  • yehuda - Monday, January 28, 2008 - link

    No, I'm not impressed. You guys should check the isolated power consumption of a single-core 3870 card:

http://www.xbitlabs.com/articles/video/display/rad...

    At idle, a single-core card draws just 18.7W (or 23W if you look at it through an 82% efficient power supply). How is it that adding a second core increases idle power draw by 41W?

    It would seem as if PowerPlay is broken.
  • erikejw - Tuesday, January 29, 2008 - link

    ATI smokes Nvidia when it comes to idle power draw.
  • Spoelie - Monday, January 28, 2008 - link

    GDDR4 consumes less power than GDDR3, given that the speed difference is not that great.
  • FITCamaro - Monday, January 28, 2008 - link

    Also you figure the extra hardware on the card itself to link the two GPUs.
  • yehuda - Tuesday, January 29, 2008 - link

    Yes, it could be that. Tech Report said the bridge chip eats 10-12 watts.
