Final Words

I've never felt totally comfortable with single-card multi-GPU solutions. While AMD reached new levels of seamless integration with the Radeon HD 3870 X2, there was always the concern that the performance of your X2 would be either chart-topping or merely midrange depending on how good AMD's driver team was that month. The same is true for NVIDIA GPUs: most games we test have working SLI profiles, but there's always the concern that one won't. It's not such a big deal for us when benchmarking, but it is a big deal if you've just plopped down a few hundred dollars and expect top performance across the board.

Perhaps I'm being too paranoid, but the CrossFire Sideport issue highlighted an important, um, issue for me. I keep getting the impression that multi-GPU is great for marketing but not particularly important when it comes to actually investing R&D dollars into design. With every generation, especially from AMD, I expect to see much more seamless use of multiple GPUs, but instead we're given the same old solution: we rely on software profiles to ensure that multiple GPUs work well in a system, rather than having a hardware solution where two GPUs truly appear, behave, and act as one to the software. Maybe it's not in the consumer's best interest for the people making the GPUs to be the same people making the chipsets; it's too easy to use multi-GPU setups to sell more chipsets when the focus should really be on making multiple GPUs more attractive across the board, and just... work. But I digress.
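
To make the "software profiles" point concrete, here is a purely illustrative Python sketch of how per-game profile lookup behaves from the user's perspective; the game names, mode names, and function are hypothetical and are not AMD's or NVIDIA's actual driver logic:

```python
# Hypothetical illustration of per-game multi-GPU profiles: the driver only
# splits work across GPUs when it recognizes the application; otherwise it
# silently falls back to a single GPU.

PROFILES = {
    "crysis.exe": "AFR",    # alternate-frame rendering
    "grid.exe": "AFR",
    "oldgame.exe": "SFR",   # split-frame rendering
}

def select_render_mode(executable: str, gpu_count: int) -> str:
    """Pick a multi-GPU mode only if a profile exists for this game."""
    if gpu_count > 1 and executable in PROFILES:
        return PROFILES[executable]
    # No profile (or a broken one): the second GPU sits idle and you get
    # single-GPU performance despite paying for two.
    return "SINGLE"

print(select_render_mode("grid.exe", 2))        # AFR -> scaling
print(select_render_mode("newrelease.exe", 2))  # SINGLE -> no scaling
```

The point of the sketch is simply that the burden sits in software: a brand-new title has no entry until someone writes one, which is exactly the month-to-month driver dependence described above.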

The Radeon HD 4870 X2 is good; it continues to be the world's fastest single-card solution, provided that you're running a game with CrossFire support. AMD's CrossFire support has been quite good in our testing, scaling well in everything but Assassin's Creed. Of course, that one is a doubly bitter pill for AMD when combined with the removal of DX10.1 support in the latest patch (which we did test with here). The patch has nothing to do with CrossFire support, of course, but the lack of scaling, combined with the fact that 4xAA had the potential to be essentially free on AMD hardware and now isn't, means the card doesn't stack up well in that test.

In addition to being the fastest single-card solution, the 4870 X2 in CrossFire is also the fastest two-card solution at 2560x1600 in every test we ran but one (once again, Assassin's Creed). It is important to note, however, that 4-way CrossFire was not the fastest solution nearly as often at resolutions below 2560x1600. This is generally because the additional overhead of 4-way CrossFire becomes the major bottleneck in performance at lower resolutions. It isn't that the 4870 X2 in CrossFire is unplayable at lower resolutions; it's just a waste of money there.
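
A back-of-the-envelope model helps illustrate why this happens; the timings below are invented purely to show the shape of the effect, not measured values:

```python
# Toy Amdahl-style model: per-frame time = fixed overhead (CPU, driver,
# inter-GPU synchronization) + GPU work that splits across cards.
# All numbers are made up solely to illustrate the trend.

def fps(gpu_work_ms: float, overhead_ms: float, gpus: int) -> float:
    frame_time = overhead_ms + gpu_work_ms / gpus
    return 1000.0 / frame_time

for label, gpu_work in [("2560x1600", 40.0), ("1680x1050", 12.0)]:
    one = fps(gpu_work, overhead_ms=5.0, gpus=1)
    four = fps(gpu_work, overhead_ms=8.0, gpus=4)  # more GPUs, more sync cost
    print(f"{label}: 1 GPU ~{one:.0f} fps, 4 GPUs ~{four:.0f} fps "
          f"({four / one:.1f}x)")
```

With these assumed numbers the 4-way setup scales about 2.5x at the high resolution but only about 1.5x at the low one, because the fixed per-frame cost doesn't shrink as the GPU workload does.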

We have yet to test 3-way SLI with the newest generation of NVIDIA hardware, and three GTX 260s may indeed give two 4870 X2 cards a run for their money. We also have no doubt that a 3x GTX 280 setup is going to be the highest-performing option available (though we lament the fact that anyone would spend so much money on so much power that is, at this point in time, unnecessary).

For now, AMD and NVIDIA have gone all in on this generation of hardware. AMD may not have the fastest single GPU, but they have done a good job of shaking up NVIDIA's initial strategy and forcing them to adapt their pricing to keep up. Right now, the consumer can't go wrong with a current-generation solution for less than $300 in either the GTX 260 or the HD 4870. These cards compete very well with each other, and gamers will really have to pay attention to which titles they want better performance in before they buy.

The GTX 280 is much more reasonable at $450, but you are still paying a premium for the fastest single-GPU solution available. Despite the fact that its price is roughly 150% of the GTX 260's or the 4870's, you just don't get that kind of return in performance. It is faster than the GTX 260, and most of the time it is faster than the 4870 (though there are times when AMD's $300 part outperforms NVIDIA's $450 part). The bottom line is that if you want performance above the $300 price point in this generation, you're going to get less performance per dollar.
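
To put the performance-per-dollar point in rough numbers: only the $450 and $300 prices come from above; the GTX 280's relative performance here is an assumed placeholder for illustration, not a measured result:

```python
# Performance-per-dollar comparison. The GTX 280's relative performance
# (1.25x a GTX 260) is a hypothetical placeholder; the prices are the
# street prices discussed above.

cards = {
    "GTX 260": {"price": 300, "relative_perf": 1.00},
    "HD 4870": {"price": 300, "relative_perf": 1.00},
    "GTX 280": {"price": 450, "relative_perf": 1.25},  # assumed, not measured
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['price'] * 1000:.2f} perf per $1000")

# For the GTX 280 to match the $300 cards on perf/dollar it would need to be
# 450 / 300 = 1.5x as fast across the board.
```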

When you start pushing past $450 and into multi-GPU solutions, you have to be prepared for even more diminished returns on your investment, and the 4870 X2 is no exception. Though it scales well in most cases and leads the pack in single-card performance when it does scale, there is no guarantee that scaling will be present, let alone good, in every game you want to play. AMD is putting a lot into this, and you can expect us to keep pushing them to get performance improvements as near to linear as possible with multi-GPU solutions. But until we have shared framebuffers and real cooperation on rendering frames between GPUs, we just aren't going to see the kind of robust, consistent results most people will expect when spending $550 or more on graphics hardware.

93 Comments


  • pattycake0147 - Tuesday, August 12, 2008 - link

    I've noticed the same bias recently. I've only been a member for a little over a year now, and even in that short time the site has gone downhill.
  • sweetsauce - Tuesday, August 12, 2008 - link

    Translation: I like ATI and you don't, so I'm going to bitch. Even though my name is tech guy, I obviously have ovaries. I'm going to go cry now on ATI's behalf.
  • jnmfox - Tuesday, August 12, 2008 - link

    Get over yourself. Pointing out facts isn't taking pot-shots.

    This is just what I was looking for in a review of the X2. The numbers tell the story. In the majority of cases the X2 isn't worth it, and until AMD & NVIDIA get proper hardware implementations of multi-GPU solutions, it will most likely continue to be the case.

    Too little performance increase for the large increase in cost.
  • skiboysteve - Tuesday, August 12, 2008 - link

    I completely agree with Anand on this article. The lack of innovation from a company supposedly focusing on multi-chip solutions is stupid.

    Although, yes, it is really fast.

    And why can't they clock it lower at idle?
  • astrodemoniac - Tuesday, August 12, 2008 - link

    ... reviews I have ever seen here @ Anand's. I am extremely disappointed with this so-called "Review" ... hell, I have seen PREVIEWS that would put it to shame.

    Oh, and what in the hell did AMD do to you that you're so obviously pissed off at them? Are you annoyed they didn't give you preferential treatment to release the review earlier? Man, just go back to the unbiased reviews; we're buying graphics cards, not brands.

    It's like the guys writing the reviews are not gamers any more o_0

    /rant
  • Halley - Wednesday, August 13, 2008 - link

    It's no secret that AnandTech is "managed by Intel," as a user put it. Of course everyone must have some source of income to support themselves and their families, but it's pathetic to show such blatant bias.
  • TheDoc9 - Tuesday, August 12, 2008 - link

    AnandTech isn't about gaming anymore; it's about photography and home theater. And the occasional newest Intel Extreme CPU.

    I think DailyTech and the forums carry AnandTech these days...

  • DigitalFreak - Tuesday, August 12, 2008 - link

    "AMD decided that since there's relatively no performance increase yet there's an increase in power consumption and board costs that it would make more sense to leave the feature disabled. "

    In other words, it's broken in hardware and we couldn't get it working, so we "disabled" it.
  • NullSubroutine - Tuesday, August 12, 2008 - link

    You didn't even include test system specs or driver versions.
  • CreasianDevaili - Tuesday, August 12, 2008 - link

    I wanted to know why you didn't retest the 4870 CF setup when you obviously had some issues with it before in GRID. I noticed the GTX 280 setup was retested, which resulted in higher FPS. Based on running the game at 2560x1600 on my FW900, and on other reviews, I feel you had an issue with CrossFire not working at that resolution. The single 4870 shouldn't be getting better FPS by that degree at 2560x1600, because it also has 512MB of VRAM.

    So I just wanted to know why the GTX 280 was special enough to retest when this review was about the 4870 X2. If it was to show a good comparison, then why wasn't the 4870 CF, which many have and want to see, retested as well?
