After our initial launch article on the Radeon HD 3870 X2, we decided to run a quick sanity check and make sure that our positive experience wasn't simply an artifact of the games we tested. We wanted to confirm that this card really does act like a single card, as our first look seemed to indicate.

After a few more days with the card, playing with various features and alternate games we didn't address in our first article, we've been able to draw a few more conclusions about the hardware. Our initial assessment certainly wasn't without merit. Again, we didn't have any compatibility or driver problems, and CrossFire was completely transparent. Not only were there no physical switches to worry about (as on the 7950 GX2), but there were no driver settings to mess with either (unlike NVIDIA's solution). We didn't have to fiddle with anything; it just worked.

Single cards aren't just about gaming and physical hardware; we would also expect a single card to behave like one in all usage situations. To that end, we wanted to make sure that multi-monitor support was on par with a single card. In the past, multiGPU solutions have required that only one monitor be enabled while multiGPU rendering was active. With no driver setting to toggle, does this mean no multi-monitor support, or would one display blank when CrossFire kicked in?

AMD has addressed these issues very well. Games are fully multiGPU accelerated even when multiple monitors are enabled, so there is no separate CrossFire switch to flip and no display blanking.
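
If you want to double-check this behavior on your own system, one quick test is to confirm that Windows still reports every display as active while a game is rendering. Below is a minimal C++ sketch built on the standard Win32 EnumDisplayMonitors API; it's a hypothetical helper of our own for illustration, not an AMD or driver tool:

```cpp
// display_check.cpp - hypothetical helper for illustration; not an AMD tool.
// Lists every display Windows considers active. Run it, launch a game,
// then alt-tab back and run it again: if both monitors are still listed,
// nothing was blanked when CrossFire engaged.
// Build (Visual Studio command prompt): cl display_check.cpp user32.lib
#include <windows.h>
#include <cstdio>

// Invoked once per active monitor by EnumDisplayMonitors.
static BOOL CALLBACK OnMonitor(HMONITOR monitor, HDC, LPRECT, LPARAM count)
{
    MONITORINFOEXA info;
    info.cbSize = sizeof(info);
    if (GetMonitorInfoA(monitor, reinterpret_cast<LPMONITORINFO>(&info))) {
        std::printf("%s: %ldx%ld%s\n", info.szDevice,
                    info.rcMonitor.right - info.rcMonitor.left,
                    info.rcMonitor.bottom - info.rcMonitor.top,
                    (info.dwFlags & MONITORINFOF_PRIMARY) ? " (primary)" : "");
        ++*reinterpret_cast<int*>(count);
    }
    return TRUE; // keep enumerating
}

int main()
{
    int count = 0;
    EnumDisplayMonitors(NULL, NULL, OnMonitor, reinterpret_cast<LPARAM>(&count));
    std::printf("%d active display(s)\n", count);
    return 0;
}
```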

Just to confirm, we hooked up a second monitor and turned it on while we played Gears of War. Our framerates were on par with what we saw when only one monitor was enabled. To show you what this looks like, we ran the game in a window to grab this screenshot.

It is worth mentioning that running in a window did hurt framerate compared to running the game full screen. AMD has touted its new hardware's ability to fully accelerate windowed 3D based on how it manages clock speed with respect to workload, so we aren't quite sure why we are seeing this behavior. The important thing is that gamers no longer need to disable secondary monitors to play their games on a multiGPU solution.
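
For anyone who wants to quantify the windowed-versus-full-screen gap on their own hardware, the basic technique is simply to time a fixed number of frames in each mode and compare the averages. Here's a minimal C++ sketch of that idea; RenderFrame() is a hypothetical stand-in for the per-frame work of whatever you're benchmarking, and this is not the harness behind our numbers:

```cpp
// fps_probe.cpp - a minimal sketch; not the benchmark harness we used.
// Times a fixed number of frames so a windowed run and a full-screen run
// can be compared directly.
#include <chrono>
#include <cstdio>

// Hypothetical stand-in: replace with the real per-frame draw call.
static void RenderFrame()
{
    // ... render one frame here ...
}

int main()
{
    using clock = std::chrono::steady_clock;
    const int kFrames = 1000; // fixed sample size keeps runs comparable

    const auto start = clock::now();
    for (int i = 0; i < kFrames; ++i)
        RenderFrame();
    const std::chrono::duration<double> elapsed = clock::now() - start;

    // Average frame rate over the run; do one windowed pass and one
    // full-screen pass, then compare the two results.
    std::printf("%.1f fps average over %d frames (%.2f s)\n",
                kFrames / elapsed.count(), kFrames, elapsed.count());
    return 0;
}
```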

Comments

  • SoCalBoomer - Wednesday, February 6, 2008 - link

    It is a valid observation - the 7950 dual-GPU-on-a-card solution is old. It's what I was thinking. At $400+, the card is a high-end card and really doesn't do $150 worth more than, say, the 8800GT - or does it? It's an opinion thing I guess.

    That's not to say that this is not a good card - on the contrary, it's WONDERFUL to see AMD/ATI (both of whom I'm a huge fan) coming out with products that are competitive, smooth, and quality!

    It's also great to see AMD come out with something that has good and smooth driver support - that's something I've always been frustrated about with the older ATI cards... loved what they did and loved them after I got them configured the way I wanted - hated the road to get there though.
  • Viditor - Thursday, February 7, 2008 - link

    "the 7950 dual GPU on a card solution is old"

    But a smooth and effective implementation of it, like this one, is quite new...

    "the card is a high-end card and really doesn't do $150 worth more than, say, the 8800gt - or does it?"

    Interesting question... that $150 extra represents a 37.5% price premium ($150 on a $400 base). Plus, of course, there are the epenis bragging rights of having the fastest card made (though I never really understood that one).

    To put that in perspective though, the 8800GT is 32% more expensive than the HD3850...

    I guess if you're a gamer, then the X2 makes perfect sense, but if you make little use of graphics, then the 3850 makes more sense. In between the two is the 8800GT, so it will depend on what you do.
  • ChronoReverse - Wednesday, February 6, 2008 - link

    By that argument, neither does the 8800 Ultra which is MORE expensive...
  • SoCalBoomer - Thursday, February 7, 2008 - link

    Out of curiosity, I looked to see where ANYONE mentioned an 8800 Ultra in this article...

    guess what, nobody did. Hmmmm, fancy that.

    The topic here was comparing a no-longer-available previous-generation NVIDIA card to a top-of-the-line, just-released ATI card.

    THAT is not a valid comparison.
  • Gilgamesj - Wednesday, February 13, 2008 - link

    Neither did anyone mention an 8800GT.

    Point is that it's up to the buyer to decide whether this is worth it. MultiGPU solutions have always been a questionable and subjective thing. The important objective thing that can be said about this card is that it takes one disadvantage away: there's no need for a CrossFire motherboard.

    I think ATI put something interesting on the market.
  • Ryanman - Sunday, February 10, 2008 - link

    It wasn't comparing it in terms of performance, bro. It was comparing it in terms of driver support and form factor. That type of comparison is legit no matter how old a card is and you know it.

    In terms of performance, this was tested against an 8800GT SLI config ($500+) and performed just under it (at $400-450), and it outperforms the Ultra for obvious reasons. THAT'S the performance comparison. Quit whining and being a fanboy and think before you post.
  • kextyn - Friday, February 8, 2008 - link

    "It is a valid observation - the 7950 dual GPU on a card solution is old. It's what I was thinking. At $400+, the card is a high-end card and really doesn't do $150 worth more than, say, the 8800gt - or does it? It's an opinion thing I guess. "

    "By that argument, neither does the 8800 Ultra which is MORE expensive... "

    "Out of curiosity, I looked to see where ANYONE mentioned an 8800 Ultra in this article. . .
    guess what, nobody did. Hmmmm, fancy that.
    Topic of this was comparing a non-available previous-generation nVidia card to a top-of-the-line, just-released ATI.
    THAT is not a valid comparision. "

    So why did you mention the 8800GT in your other post, hypocrite?

    ChronoReverse's comparison was just as valid as yours.
  • schmunk - Wednesday, February 6, 2008 - link

    I think it is fair. The article's focus is showing AMD's implementation of the dual-GPU card, and how seamlessly it can be done as far as settings, dual screens, and drivers go. I don't think the point is that NVIDIA did it first or faster. I think we all benefit from something like this.
  • schmunk - Wednesday, February 6, 2008 - link

    "AMD has pushed the fact that their new hardware is capable of fully accelerating windowed 3D based on how it manages clock speed with respect to work load, so we aren't quite sure why we are seeing this behavior."

    Could it be that in windowed mode the desktop is showing and Aero Glass is a hog? :)
  • schmunk - Wednesday, February 6, 2008 - link

    Meant to say Aero. Seriously though, doesn't it have some sort of virtual memory footprint for the Aero interface?
