Final Words

It has been far too long since AMD/ATI sat at the top of the performance charts; the crown had been lost on both the CPU and GPU fronts, but today's Radeon HD 3870 X2 introduction begins to change that. The Radeon HD 3870 X2 is the most elegant single-card, multi-GPU design we've seen to date and the performance is indeed higher than any competing single-card NVIDIA solution out today.

AMD is also promising the X2 at a fairly attractive price point; at $449 it is more expensive than NVIDIA's GeForce 8800 GTS 512, but it's also consistently faster in the majority of titles we tested. If you're looking for something in between the performance of an 8800 GTS 512 and an 8800 GT 512 SLI setup, the Radeon HD 3870 X2 is perfect.

Even more appealing is the fact that the 3870 X2 will work in all motherboards: CrossFire support is not required. In fact, during our testing it was very easy to forget that we were dealing with a multi-GPU board since we didn't run into any CrossFire scaling or driver issues. We're hoping that this is a sign of things to come, but we can't help but worry about the future of these multi-GPU cards.

The fact that both AMD and NVIDIA are committed to them is promising, and hopefully that means an even better experience when it comes to compatibility and performance with CrossFire and SLI (single-card or not), but we've got no crystal ball - only time will tell how the driver support evolves in the future.

But today, we have a victory for AMD. The past few months have shown a very different graphics division of AMD than we've seen since the first talks of the acquisition. The Radeon HD 2900 XT was a failure, and now AMD arguably has the fastest single graphics card on the market. The only worry we'd have if we were in AMD's shoes is that the 3870 X2 was made by putting a couple of 3870s onto a single board; if AMD can do it, NVIDIA can as well. And we all know how the 3870 vs. 8800 GT matchup turned out.

What AMD really needs is its next-generation high-end GPU; the 3870 X2 will buy the top performance spot for a little while, but it's R700 that we really need to see.

Comments

  • boe - Monday, January 28, 2008 - link

    I really appreciate this article.

    The thing I'd really like to see in the next one is FEAR benchmarks added.

    I'd also appreciate a couple of older cards added for comparison like the 7900 or the x1900.
  • Butterbean - Monday, January 28, 2008 - link

    "And we all know how the 3870 vs. 8800 GT matchup turned out"

    Yeah it was pretty close except for Crysis - where Nvidia got busted not drawing scenes out so as to cheat out an fps gain.
  • Stas - Monday, January 28, 2008 - link

    Conveniently, the tests showed how *2* GTs are faster in most cases than the X2. The power consumption test only shows a *single* GT on the same chart with the X2.
  • geogaddi - Wednesday, January 30, 2008 - link

    Conveniently, most of us can multiply by 2.
  • ryedizzel - Monday, January 28, 2008 - link

    in the 2nd paragraph under 'Final Words' you put:

    Even more appealing is the fast that the 3870 X2 will work in all motherboards:

    but i think you meant to say:

    Even more appealing is the fact that the 3870 X2 will work in all motherboards:

    you are welcome to delete this comment when fixed.
  • abhaxus - Monday, January 28, 2008 - link

    I would really, really like to see a crysis benchmark that actually uses the last level of the game rather than the built in gpu bench. My system (q6600@2.86ghz, 2x 8800GTS 320mb @ 620/930) gets around 40fps with 'high' defaults (actually some very high settings turned on per tweakguides) on both of the default crysis benchies, but only got around 10fps on the last map. Even on all medium, the game only got about 15-20 fps on the last level. Performance was even lower with the release version, the patch improved performance by about 10-15%.

    Of course I'd be interested to see how 2 of these cards do in crysis :)
  • Samus - Monday, January 28, 2008 - link

    Even though Farcry is still unplayable at 1900x1200 (<30FPS), it's really close. My 8800GT only manages 18FPS on my PC using the same ISLAND_DEM benchmark AT did, so to scale, the 3870 X2 will do about 27FPS for me. Maybe with some overclocking it can hit 30FPS. $450 to find out is a bit hard to swallow though :\
  • customcoms - Monday, January 28, 2008 - link

    Well, if you're still getting <30 fps on Far Cry, I think your PC is a bit too outdated to benefit from an upgrade to an HD3870 X2.

    I assume you meant Crysis. This game is honestly poorly coded, with graphical glitches in the final scenes. With my 8800GT (and a 2.6ghz Opteron 165, 2gb ram), I pull 50 FPS in the Island_Dem benchmark at 1680x1050 with a mixture of medium-high settings, 15 fps more than the 8800GTS 320mb it replaced. However, when you get to some of the final levels, those frames drop to more like 30 or less, and I am forced to drop to medium or a combination of medium-low settings at that point (perhaps my 2.6ghz dual core cpu isn't up to 3ghz C2D snuff).

    Clearly, the game was rushed to market (not unlike Far Cry). Yes, the visuals are stunning, and the storyline is decent, but I much prefer Call of Duty 4, where the visuals are SLIGHTLY less appealing, but the storyline is better, the game is more realistic, the controls are better, and I can crank everything to max without worrying about slowdowns in later levels. It's the only game I have ever immediately started replaying on veteran.

    The point is, no card(s) appears to be able to really play Crysis with everything maxed at 1900x1200 or higher, and in my findings, the built-in time demos do not realistically simulate the game's full demands in later levels.
  • swaaye - Wednesday, January 30, 2008 - link

    Yeah, armchair game developer in action!

    In what way is CoD4 realistic, exactly? I suppose it does portray actual real military assets. It doesn't portray them in a realistic way, however.

    Did you notice that Crysis models bullet velocity? It's not just graphical glitz.
  • Griswold - Monday, January 28, 2008 - link

    Ahh, gotta love the arm chair developers who can see the game code just by looking at the DVD.
