There are two things that become very clear when looking at our data for the 5970

  1. It’s hands down the fastest single card on the market
  2. It’s so fast that it’s wasted on a single monitor

AMD made a good choice in enabling Crossfire Eyefinity for the 5970, as they have made a card so fast that it basically shoots past everything on the market that isn’t Crysis. All of our action games that aren’t CPU limited do better than 100fps at 2560x1600, and RTSes are doing just under 60fps. The 5970 is without a doubt Overkill (with a capital O) on a single monitor. This will likely change with future games (e.g. STALKER), but on today’s games it’s more power than is necessary to drive even the largest single monitor. The 5970 still offers a good performance boost over the 5870 even with a single monitor, but with the 5870’s outstanding performance, it’s not $200 better.

So that leaves us with Eyefinity. So long as GPUs are outpacing games, AMD needs something to burn up extra performance to give faster cards a purpose, and that’s Eyefinity. Eyefinity is a strain: even three smaller monitors can result in more pixels being pushed than a single 2560x1600 display. Having Crossfire Eyefinity support gives an AMD card the breathing room it needs to offer Eyefinity at playable framerates across a wider spectrum of monitors and games. Given that the price of three 20”+ monitors is going to approach, if not exceed, the $600 price of the card, the 5970 is the perfect match for Eyefinity gaming at this time.
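To put those pixel counts in perspective, here’s a quick back-of-the-envelope comparison. The 1680x1050 resolution is just a common example of a smaller panel, not a specific test configuration:

```python
# Back-of-the-envelope pixel counts: why a 3x1 Eyefinity group of modest
# monitors can out-demand a single 30" display. The resolutions below are
# common examples, not a claim about any particular test setup.
def pixels(width, height, count=1):
    """Total pixels pushed per frame for `count` identical monitors."""
    return width * height * count

single_30in = pixels(2560, 1600)             # one 30" panel
eyefinity_3x1 = pixels(1680, 1050, count=3)  # three 20-22" panels side by side

print(f"Single 2560x1600:       {single_30in:,} pixels")
print(f"3x 1680x1050 Eyefinity: {eyefinity_3x1:,} pixels")
print(f"Ratio: {eyefinity_3x1 / single_30in:.2f}x")
```

Even three modest panels come out to roughly 5.3 million pixels per frame against about 4.1 million for a single 30” display, which is why Eyefinity soaks up GPU power so readily.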

When AMD originally told us about this card, I was surprised to see that they slapped only a $600 price tag on it. As the fastest of the fast cards, AMD can basically charge up to 2x the price of a 5870 for it, and they didn’t. After seeing the performance data, I understand why. In our benchmarks the 5970 is practically tied with the 5850CF, and a pair of such cards would sell for $600 at this time. I still expect that we’re going to see a performance gap emerge between the cards (particularly if the 5970 is held back by drivers) but right now the $600 price tag is appropriate.

What this does call into question, though, is which is better to have: a pair of 5800 series cards, or a 5970. If we assume that the 5970 is equal to a 5850CF in performance and in price, then the differences come down to three matters: heat/noise, power, and Crossfire Eyefinity. The 5970 enjoys lower power usage and doesn’t need a power supply with 4 PCIe plugs, but the cost of compacting this into one card is that it’s hotter and louder than a 5850CF (which really, is true of all dual-GPU cards). The biggest advantage of the 5970 right now is that it’s the only card to support Crossfire Eyefinity, which means it’s the only card to even consider if you are going to use Eyefinity right now. Ultimately, if you can run 2 cards and will only be driving a single monitor, go with the 5850CF; otherwise go with the 5970. And if it’s 2010 and you’re reading this article, check and see if AMD has enabled Crossfire Eyefinity for the 5850CF.

Next, we’re left with the prospects of overclocking the 5970. Only one of our two cards even runs at 5870 speeds (850MHz/1200MHz), and while we're willing to entertain the idea that our one cranky card is a fluke, we can't ignore the fact that neither of our cards can run a real application at 5870 speeds without throttling. Ultimately, our experience with the working card has called into question whether the VRMs on the card are up to the task. Since this is a protection mechanism there’s no risk of damage, but it also means that the card is underperforming. Overclock your 5970 to 5870 speeds if you can bear the extra power/heat/noise, but don’t expect 5870CF results.
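For readers who want to check their own cards, throttling like this shows up as the core clock repeatedly dipping below its target while under load; we spotted it by watching the Overdrive pane. Here is a minimal sketch of that check, assuming you can poll the core clock by some means; the sample values and the 5% tolerance are illustrative choices, not AMD’s actual mechanism:

```python
# Flag throttling in a series of sampled core clocks (MHz). The samples
# would come from periodically polling a monitoring tool such as the
# Overdrive pane; here they are just a hard-coded list for illustration.
# The 5% tolerance is an arbitrary illustrative threshold.
def throttle_events(samples_mhz, target_mhz, tolerance=0.05):
    """Return (index, clock) pairs where the core clock dipped more
    than `tolerance` below the target clock."""
    floor = target_mhz * (1 - tolerance)
    return [(i, mhz) for i, mhz in enumerate(samples_mhz) if mhz < floor]

# Example: a card set to 850MHz that bounces downward under load
samples = [850, 850, 725, 850, 600, 850]
print(throttle_events(samples, target_mhz=850))  # [(2, 725), (4, 600)]
```

If the list comes back empty over a long gaming session, the card is holding its clocks; repeated entries are the same clock bouncing we saw on our sample.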

Last, that leaves us with the 5870CF and the 5970CF. Thanks to VRM throttling, there’s still a place in this world for the 5870CF. For a 2-GPU setup it’s still the best way to go, but keep in mind it comes at a $200 premium and lacks Crossfire Eyefinity support. As for the 5970CF, while we didn’t get a chance to test it today, we can safely say that it’s entirely unnecessary for a single-monitor setup. There’s a market out there for $1200 in video cards, but you had better be running 3 30” monitors in Eyefinity mode to make use of it.

Comments

  • Ryan Smith - Wednesday, November 18, 2009 - link

It's possible, but the 850TX is a very well regarded unit. If it can't run a 5970 overclocked, then I surmise that a lot of buyers are going to run into the same problem. I don't have another comparable power supply on hand, so this isn't something I can test with my card.

    Anand has a 1K unit, and of course you know how his turned out.

To be frank, we likely would never have noticed the throttling issue if it wasn't for the client. It was only after realizing that it was underperforming by about 10-20% that I decided to watch the Overdrive pane and saw it bouncing around. These guys could be throttling too, and just not realize it.
  • Silverforce11 - Wednesday, November 18, 2009 - link

Seems iffy then, since most reviews put it at 900 core and 5GHz+ on the RAM, with only a modest overvolt to 1.16. I would think ATI wouldn't bother putting in 3 high quality VRMs and Japanese capacitors if they didn't test it thoroughly at the specs they wanted it to OC at.

My old PSU is the bigger bro of this guy, being the 750 version, and it had issues with the 4870X2. Got a better "single rail" PSU and it ran fine and OC'd well.
  • Silverforce11 - Wednesday, November 18, 2009 - link

ATI went all out building these 5970s; the components are top notch. The chips are the best of the bunch. I'm surprised they did this, as they are essentially selling you 2x 5870 performance (IF your PSU is good) at $599 when 2x 5870 in CF would cost $800. They have no competitor at the top, so why not price this card higher, or why even bother putting in quality parts to almost guarantee 5870 clocks?

I believe it's ATI's last nail in the nV coffin, and they hammered it really hard.
  • ET - Wednesday, November 18, 2009 - link

Too much discussion about adapters for the mini-DisplayPort. The 27" iMac has such an input port and a resolution of 2560 x 1440, and it seems a sin not to test them together. (Not that I'm blaming AnandTech or anything, since I'm sure it's not that easy to get the iMac for testing.)
  • Taft12 - Wednesday, November 18, 2009 - link

Why would they bother using a computer with an attached monitor when they could instead use the larger, higher-res, and CHEAPER Dell 3008WFP?
  • Raqia - Wednesday, November 18, 2009 - link

Look at all the fingerprint smudges on the nice card! I've started to notice the hand models that corporations use to hold their products. The hands holding the iPods on the Apple site? Flawless, perfect nails and cuticles. Same w/ the fingers grasping the Magny Cours chip.
  • NullSubroutine - Wednesday, November 18, 2009 - link

Hilbert @ Guru3D got the overclocking working with a 900MHz core speed (though it reached 90C).

I was impressed with some of the Crossfire benchmarks actually showing improvement. If Eyefinity works with the 5970, does it work with the card in Crossfire?
  • Ryan Smith - Wednesday, November 18, 2009 - link

Bear in mind that it also took him 1.3V to get there; the AMD tool doesn't go that high. With my card, I strongly suspect the issue is the VRMs, so more voltage wouldn't help.

    And I'm still trying to get an answer to the Eyefinity + 5970CF question. The boys and girls at AMD went home for the night before we realized we didn't have an answer to that.
  • Lennie - Wednesday, November 18, 2009 - link

I thought everyone knew about Furmark and ATi by now. It used to be like this on the 4870 series too.

It went like this: at first there were a few reports of 4870(X2) cards dying when running Furmark. Further investigation showed that it was indeed Furmark causing the VRMs to heat up to insane levels and eventually killing them. Word reached ATi, and from that point on ATi has intentionally throttled their cards when detecting Furmark to prevent the damage.

Yeah, in fact the amount of heat load Furmark puts on the VRMs is unrealistic; no game is able to heat up the VRMs to the level Furmark does. OCCT used the same method (or maybe even integrated Furmark) to test for stability (in their own opinion ofc).

    So beware about Furmark and OCCT if you have HD4K or 5K.

The term "Hardware Virus" is rightfully applicable to Furmark when it comes to HD4K (and perhaps HD5K).
  • strikeback03 - Wednesday, November 18, 2009 - link

The article stated that they encountered throttling in real games, not Furmark.
