After our initial launch article on the Radeon HD 3870 X2, we decided to run a quick sanity check to confirm that our positive experience wasn't limited to the games we tested. We wanted to verify that this card really acts like a single card, as our first account seemed to indicate.

After a few more days with the card, playing with various features and alternate games that we didn't address in our first article, we've been able to draw a few more conclusions about the hardware. Our initial assessment certainly wasn't without merit. Again, we didn't run into any compatibility or driver problems, and CrossFire was completely transparent. Not only were there no physical settings to worry about (just as with the 7950 GX2), but there were no driver settings to mess with either (unlike the NVIDIA solution). We didn't have to fiddle with anything; it just worked.

Single cards aren't just about gaming and physical hardware. We would also expect a single card to behave like one in all usage situations. To that end, we wanted to make sure that multi-monitor support was on par with a single card's. In the past, multi-GPU solutions have required that only one monitor be enabled while multi-GPU rendering is active. With no driver setting to toggle, does this mean no multi-monitor support, or that one display would blank when CrossFire kicked in?

AMD has addressed these issues well in this case. Games are fully multi-GPU accelerated even with multiple monitors enabled, so there is no separate CrossFire switch and no display blanking.

Just to confirm, we hooked up a second monitor and turned it on while playing Gears of War. Our frame rates were on par with what we saw with only one monitor enabled. In order to show you what it looks like, we ran the game in a window to grab this screenshot.

It is worth mentioning that running in a window did hurt frame rate compared to running the game full screen. AMD has pushed the fact that its new hardware is capable of fully accelerating windowed 3D based on how it manages clock speed with respect to workload, so we aren't quite sure why we're seeing this behavior. The important thing is that gamers no longer need to disable secondary monitors in order to play their games with a multi-GPU solution.


22 Comments


  • flyingalan - Monday, February 11, 2008 - link

    As a member of a large flightsim community, may I thank the testers for including one as part of the games tested on this graphics card.
    I would plead that it be included as one of the tests in any future CPU/graphics reviews; not all of us are fans of war/shooter games, and some variety would be nice.
  • CDex - Thursday, February 07, 2008 - link

    "its sad that it takes 2 of their cards to beat 1 of nvidias"

    It's sad that the fan~boi mind cannot appreciate cool technology without feeling insecure about the manufacturer of the hardware in their own precious machine. Do you think the corporate entity that you seem to love so much would show you the same loyalty?

    Fan~boi~ism is psychologically no different than the schoolyard clique mentality, which stems from insecurity and immaturity. Grow up and be an independent thinker. It will benefit both your mind and your wallet.
  • diablosinc - Thursday, February 07, 2008 - link

    i'll build the chapel, brother...and you preach it!
  • knowom - Thursday, February 07, 2008 - link

    I realize they're different markets, but it's ironic to me that AMD gave Intel flak about its dual-die approach for CPUs and then comes out with a dual-chip approach for GPUs. I think Nvidia should give them shit about it and then just come out with a dual-die approach haha.
  • MrKaz - Thursday, February 07, 2008 - link

    Nope. You are mixing things up.
    AMD did the dual-CPU platform first with 4x4/Quad FX; Intel followed with an equally worthless implementation called Skulltrail.

    As for dual die, who knows? Maybe ATI will do it first too, since they already have the 512-bit memory interface from the R600 design (Nvidia doesn't have it). Just glue two 256-bit RV670s together ;) and there you have a much more elegant and cheaper 3870 X2.

    I think you all whine whether AMD delivers or not. I smell a fanatic.
  • DigitalFreak - Thursday, February 07, 2008 - link

    Dual die does not equal dual CPU, smacktard. 4x4/Skulltrail has nothing to do with what he was talking about.
  • diablosinc - Thursday, February 07, 2008 - link

    ahh, but it all has to do with all of our manufacturers, when we allow our threads to degenerate into fanboi p**sing matches. but i think i've hit upon a formula that explains it.
    intel = 5 letters, amd = 3. intel wins.
    nvidia = 6 letters, ati = 3. nvidia has twice the performance (unless you want to argue that it's amd/ati = 6 letters, one punctuation mark, giving them a slight edge).

    what? then you tell me what everyone is on about!! all i'm saying is that, compared with the logic behind many arguments which spring up when these threads go fanboi...my formula almost makes sense!

    remember to read it all and laugh, folks. ultimately...you're right about what you want. that's what it all comes down to...you are right. whatever your opinion, whatever your rationale...you are right. not these other people who want to tell you what and how to think.

    amd/ati, intel, nvidia = at the buyer's discretion, based on his/her needs.
    okay? okay...
    now can we move along to some real issues here?
  • michal1980 - Wednesday, February 06, 2008 - link

    unfair to compare a new solution to a card that by nvidia standards is now nearly 3 generations old?

    and while I understand the need to praise amd for finally getting performance back, it's sad that it takes 2 of their cards to beat 1 of nvidia's
  • Amiga500 - Thursday, February 07, 2008 - link

    "its sad that it takes 2 of their cards to beat 1 of nvidias"



    Its two chips on one card...


    Seriously, how many people are choosing to see it as two cards? The PC sees it as one card, no crossfire drivers within the OS are necessary.
    Reply
  • bigboxes - Wednesday, February 06, 2008 - link

    You sound like a fanboy with your comments. Does it really matter if this is company X? Judge it on its merits alone. Most of us will buy the card that performs the best for the $$.

    This product is a successful implementation of dual cores in a single-card solution. nVidia's attempt at such a card was poorly executed. This review is just comparing the last attempt by nVidia at a dual-GPU card. Yes, it took two AMD cores to outperform one of nVidia's latest and greatest, but it's in a single card. Very promising without breaking the bank, especially if one does not have two PCI Express slots or the skills to set up SLI/CrossFire properly.

    It also looks like AMD's new card does all this without much of a hiccup. This is good news for us all. Don't worry, all of you who care which maker is on top: nVidia will come out with their entry in due time. We will have to wait and see if their entry will be as seamless as this AMD setup. Maybe, just maybe, AMD can do something right now and then.
