Final Words

I've never felt totally comfortable with single-card multi-GPU solutions. While AMD reached new levels of seamless integration with the Radeon HD 3870 X2, there was always the concern that the performance of your X2 would either be chart-topping or merely midrange depending on how good AMD's driver team was that month. The same is true for NVIDIA GPUs: most games we test have working SLI profiles, but there's always the concern that one won't. That's not such a big deal for us when benchmarking, but it is a big deal if you've just plopped down a few hundred dollars and expect top performance across the board.

Perhaps I'm being too paranoid, but the CrossFire Sideport issue highlighted an important, um, issue for me. I keep getting the impression that multi-GPU is great for marketing but not particularly important when it comes to actually investing R&D dollars into design. With every generation, especially from AMD, I expect to see a much more seamless use of multiple GPUs, but instead we're given the same old solution - we rely on software profiles to ensure that multiple GPUs work well in a system, rather than a hardware solution where two GPUs truly appear and behave as one to the software. Maybe it's not in the consumer's best interest for the people making the GPUs to be the same people making the chipsets; it's too easy to use multi-GPU setups to sell more chipsets when the focus should really be on making multiple GPUs more attractive across the board, and just... work. But I digress.

The Radeon HD 4870 X2 is good: it continues to be the world's fastest single-card solution, provided you're running a game with CrossFire support. AMD's CrossFire support has been quite good in our testing, scaling well in all but Assassin's Creed. That title is a doubly bitter pill for AMD when combined with the removal of DX10.1 support in the latest patch (which we did test with here). The patch has nothing to do with CrossFire support, of course, but the lack of scaling, combined with the fact that 4xAA has the potential to be free on AMD hardware but isn't, doesn't stack up well in that test.

In addition to being the fastest single-card solution, the 4870 X2 in CrossFire is also the fastest two-card solution at 2560x1600 in every test we ran but one (once again, Assassin's Creed). It is very important to note that 4-way CrossFire was not the fastest solution nearly as often at resolutions below 2560x1600. This is generally because the added overhead of 4-way CrossFire becomes the major bottleneck in performance at lower resolutions. It isn't that the 4870 X2 in CrossFire is unplayable at lower resolutions; it's just a waste of money there.

We have yet to test 3-way SLI with the newest generation of NVIDIA hardware, and three GTX 260s may indeed give two 4870 X2 cards a run for their money. We also have no doubt that a 3x GTX 280 solution will be the highest-performing option available (though we lament that anyone would spend so much money on so much power that is, at this point in time, unnecessary).

For now, AMD and NVIDIA have both gone all in on this generation of hardware. AMD may not have the fastest single GPU, but it has done a good job of shaking up NVIDIA's initial strategy and forcing it to adapt its pricing to keep up. Right now, the consumer can't go wrong with a current-generation solution for less than $300 in either the GTX 260 or the HD 4870. These cards compete very well with each other, and gamers will really have to pay attention to which titles they want better performance in before they buy.

The GTX 280 is much more reasonable at $450, but you are still paying a premium for the fastest single-GPU solution available. Despite a price roughly 150% that of the GTX 260 and the 4870, you just don't get that return in performance. It is faster than the GTX 260, and most of the time it is faster than the 4870 (though there are times when AMD's $300 part outperforms NVIDIA's $450 part). The bottom line is that if you want performance above the $300 price point in this generation, you're going to get less performance per dollar.

When you start pushing up over $450 and into multi-GPU solutions, you do have to be prepared for even more diminished returns on your investment, and the 4870 X2 is no exception. Though it scales well in most cases and leads the pack in single-card performance when it scales, there is no guarantee that scaling will be there, let alone good, in every game you want to play. AMD is putting a lot into this, and you can expect us to keep pushing them to get performance improvements as near to linear as possible with multi-GPU solutions. But until we have shared framebuffers and real cooperation on rendering frames from a multi-GPU solution, we just aren't going to see the kind of robust, consistent results most people will expect when spending over $550 on graphics hardware.

Comments

  • Greene - Wednesday, August 13, 2008 - link

    Wow. Lots of this and that in here :-)

    No Hardware Info...
    No Driver Info...

    Did we lose a Page ?

    I'm also curious why Assassin's Creed wasn't tested with the different versions?
    There was such a big stink back in 99/2000 when ATI fudged drivers to get better FPS scores, as well as the stink back when NVIDIA did the same with 3DMark (what was it, 05)?
    And here the "creed" developers drop some sort of support for ATI
    and the authors skip over it, and leave the different versions out of the test.

    Did you guys draft this article 2 weeks ago and forget to revise it ?

    Did you hire fox news editors ?

    I've really trusted and valued Anandtech's articles in the past.

    This just seems sloppy, incomplete and rushed... and I dropped out of college! :-)
  • Arbie - Wednesday, August 13, 2008 - link

    Every bar graph has the cards in a different order. This makes it impossible to scan the graphs and see how a card does overall, across a range of games. And there is no compensating benefit. If I want to know which card is fastest in Crysis, I can clearly see which bar is longer! It DOESN'T HAVE TO BE THE TOP BAR ON THE GRAPH.

    So... you won't do that again.

    Next: everyone should just go out and buy a 4850. It will do all you want for now. Let all these X2 kludges and 65nm dinosaurs pound each other into landfill. Check back again in 6-8 months.

    Arbie
  • hooflung - Wednesday, August 13, 2008 - link

    The numbers were not bad. They speak for themselves. However, the tone of this review was horrible. It is the fastest card in your review and has exactly what people want out of a multi-GPU setup: one slot, a full gig of RAM, it smashes the competition's closest competitor that costs more, it only costs $100 above the best single-GPU solution, and it doesn't require a new motherboard.

    Yet, Nvidia can't do any wrong. ATI decides its Sideport isn't needed and disables it, which is a cardinal sin it seems. It still cost $100 LESS than Nvidia's GTX 280 when it first came out.

    The mixed signals coming from this review could make a cake if baked.
  • drank12quartsstrohsbeer - Wednesday, August 13, 2008 - link

    This article had the feel like the authors were annoyed that they had to write it. I certainly feel annoyed after reading it...
  • just4U - Wednesday, August 13, 2008 - link

    From my perspective this was a very valid and honest review that zones in on key issues that affect the majority of our GPU buying decisions. Yeah, they're getting some tough-love feedback from it, but that's to be expected as well.
  • Keldor314 - Wednesday, August 13, 2008 - link

    750 watts for the X2 in CrossFire?! You'd better think of having an electrician come by and upgrade your home's power grid! Seriously, though, in my house I can't run a single 8800 GTX at the same time as a space heater without tripping the circuit breakers in the garage. True, the heater in question is rated at 1500 watts. The total wattage to trip the circuit breaker is thus probably less than 2000 watts, since I've also seen the heater trip it when only accompanied by a lamp (no computer on). Given that X2 CrossFire will probably, after counting the rest of the computer, push energy usage to over 1000W at load, there's a very real chance that such a computer would periodically cause your power to go out, especially if, god forbid, someone tried to turn on the room's lights.

    Upgrading a power supply is cheap. Rewiring your house to handle the higher wattage is not.
  • CK804 - Sunday, August 17, 2008 - link

    Actually, the power consumption numbers are for the entire system and not just the graphics cards alone. Still, it's amazing how much power these cards draw. My jaw dropped when I saw that the power consumption of a system with these cards under load exceeded 700 watts. When X-bit labs did a roundup of 1000 watt power supplies, the first thing they concluded was that there was no need for power supplies over 600-700 watts for any setup unless some sort of exotic cooling was to be used. I can attest to that statement, having run 4 first-gen 74GB Raptors in RAID 0 coupled with 2 7900GTs in SLI and an AMD X2 4800+ on a Zalman 460 watt PSU.
  • animaniac2k8 - Wednesday, August 13, 2008 - link

    I've been a reader of AnandTech's articles for many years and I have owned exclusively Nvidia cards since 2001.

    This is easily one of the worst and most biased articles I've ever read on AnandTech. Very disappointed to have wasted my time reading this. I'll be looking elsewhere for quality reviews from now on.
  • CyberHawk - Wednesday, August 13, 2008 - link

    Same here. Reader since 2001, registered later.

    I always liked the articles here. English is my second language, and I liked that from time to time I found a new word that made me look into the dictionary.

    But, this article is a bunch of bull. One more like this and I am out of here. Not that this means the end of anandtech but anyway.
  • helldrell666 - Wednesday, August 13, 2008 - link

    Where's the system setup?
    Why does the poster hate AMD that much?
    This is the worst review of the 4870 X2 I've checked yet.

    The review at techreport.com is much better.
