Conclusion

There are two things that become very clear when looking at our data for the 5970:

  1. It’s hands down the fastest single card on the market
  2. It’s so fast that it’s wasted on a single monitor

AMD made a good choice in enabling Crossfire Eyefinity for the 5970, as they have made a card so fast that it basically shoots past everything on the market that isn’t Crysis. All of our action games that aren’t CPU-limited do better than 100fps at 2560x1600, and RTSes are doing just under 60fps. The 5970 is without a doubt Overkill (with a capital O) on a single monitor. This will likely change for future games (e.g. STALKER), but on today’s games it’s more power than is necessary to drive even the largest single monitor. The 5970 still offers a good performance boost over the 5870 even with a single monitor, but with the 5870’s outstanding performance, it’s not $200 better.

So that leaves us with Eyefinity. So long as GPUs are outpacing games, AMD needs something to burn up the extra performance and give faster cards a purpose, and that something is Eyefinity. Eyefinity is a strain: even three smaller monitors can result in more pixels being pushed than a single 2560x1600 display. Having Crossfire Eyefinity support gives an AMD card the breathing room it needs to offer Eyefinity at playable framerates across a wider spectrum of monitors and games. Given that the price of three 20”+ monitors will approach, if not exceed, the $600 price of the card, the 5970 is the perfect match for Eyefinity gaming at this time.
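
To put the pixel-count comparison in concrete terms, here is a quick back-of-the-envelope sketch; the specific monitor resolutions below are assumed as typical examples rather than taken from our test setup.

    # Rough pixel counts: one 30" panel vs. common 3-monitor Eyefinity groups.
    # Resolutions here are typical examples, not figures from our testing.
    def megapixels(width: int, height: int, count: int = 1) -> float:
        return width * height * count / 1_000_000

    print(f"1x 2560x1600: {megapixels(2560, 1600):.1f} MP")     # ~4.1 MP
    print(f"3x 1680x1050: {megapixels(1680, 1050, 3):.1f} MP")  # ~5.3 MP
    print(f"3x 1920x1080: {megapixels(1920, 1080, 3):.1f} MP")  # ~6.2 MP

Even three 20-inch-class panels work out to roughly 30% more pixels than a single 30-inch monitor, and three 1080p panels to roughly 50% more, which is why the extra GPU has room to stretch its legs here.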

When AMD originally told us about this card, I was surprised to see that they slapped only a $600 price tag on it. As the fastest of the fast cards, AMD can basically charge up to 2x the price of a 5870 for it, and they didn’t. After seeing the performance data, I understand why. In our benchmarks the 5970 is practically tied with the 5850CF, and a pair of such cards would sell for $600 at this time. I still expect that we’re going to see a performance gap emerge between the cards (particularly if the 5970 is held back by drivers) but right now the $600 price tag is appropriate.

What this does call into question, though, is which is better to have: a pair of 5800 series cards, or a 5970. If we assume that the 5970 is equal to a 5850CF in performance and in price, then the differences come down to three matters: heat/noise, power, and Crossfire Eyefinity. The 5970 enjoys lower power usage and it doesn’t need a power supply with 4 PCIe plugs, but the cost is that by compacting all of this into one card it’s hotter and louder than a 5850CF (which, really, is true for all dual-GPU cards). The biggest advantage of the 5970 right now is that it’s the only card to support Crossfire Eyefinity, which means it’s the only card to even consider if you are going to use Eyefinity right now. Ultimately, if you can run two cards and will only be driving a single monitor, go with the 5850CF; otherwise go with the 5970. And if it’s 2010 and you’re reading this article, check and see if AMD has enabled Crossfire Eyefinity for the 5850CF.
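
To summarize that recommendation, here is a minimal sketch of the decision as described above; the helper function and its inputs are purely illustrative assumptions, not anything AMD actually ships.

    # Illustrative only: encodes the buying advice from this conclusion, as of late 2009.
    def pick_card(monitors: int, can_run_two_cards: bool) -> str:
        """Return the recommended ~$600 Radeon setup."""
        if monitors >= 3:
            # Only the 5970 supports Crossfire Eyefinity right now.
            return "Radeon HD 5970"
        if can_run_two_cards:
            # Single monitor, with the slots and PSU plugs for two cards.
            return "2x Radeon HD 5850 in CrossFire"
        return "Radeon HD 5970"

    print(pick_card(monitors=1, can_run_two_cards=True))  # 2x Radeon HD 5850 in CrossFire
    print(pick_card(monitors=3, can_run_two_cards=True))  # Radeon HD 5970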

Next, we’re left with the prospects of overclocking the 5970. Only one of our two cards even runs at 5870 speeds (850MHz/1200MHz), and while we're willing to entertain the idea that our one cranky card is a fluke, we can't ignore the fact that neither of our cards can run a real application at 5870 speeds without throttling. Ultimately our experience with the working card has called into question whether the VRMs on the card are up to the task. Since this is a protection mechanism there’s no risk of damage, but it also means that the card is underperforming. Overclock your 5970 to 5870 speeds if you can bear the extra power/heat/noise, but don’t expect 5870CF results.

Last, that leaves us with the 5870CF and the 5970CF. Thanks to VRM throttling, there’s still a place in this world for the 5870CF: for a 2-GPU setup it’s still the best way to go, but keep in mind that it comes at a $200 premium and lacks Crossfire Eyefinity support. As for the 5970CF, while we didn’t get a chance to test it today, we can safely say that it’s entirely unnecessary for a single-monitor setup. There’s a market out there for $1200 in video cards, but you had better be running three 30” monitors in Eyefinity mode to make use of it.

Comments

  • tcube - Monday, March 1, 2010 - link

    Well my thought is that if AMD would release this card on SOI/HK-MG in a 32/28nm MCM config at 4 GHz it would leave NVIDIA wondering what it did to deserve it. And I wonder... why the heck not? These cards would be business-grade-able and with decent silicon they could ask for enormous prices. Plus they could probably stick the entire thing in dual config on one single card (possibly within the 300 W PCIe v2 limit)... that would be a 40-50 TFLOPS card and with the new GDDR5 (5GHz+) it should qualify as a damn monster. I would also expect ~$2-3k per such beast but I think it's worth it. 50 TFLOPS/card, 6 cards/server... 3 PFLOPS/cabinet... hrm... well... it wouldn't be fair to compare it to general purpose supercomputers... but you could definitely ray trace render Avatar directly into HD 4x 3D in realtime and probably make it look even better in the process...
  • srikar115 - Sunday, December 6, 2009 - link

    I agree with this review. Here's a complete summary I found that is also interesting:
    http://pcgamersera.com/2009/12/ati-radeon-5970-rev...
  • srikar115 - Tuesday, April 20, 2010 - link

    http://pcgamersera.com/ati-radeon-5970-review-suma...
  • xpclient - Friday, November 27, 2009 - link

    What, no test to check video performance/DXVA? DirectX 11/WDDM 1.1 introduced DXVA-HD (accelerated HD/Blu-ray playback).
  • cmdrdredd - Sunday, November 22, 2009 - link

    Clearly nobody buying this card is going to put Crysis on "Gamer Quality"; they'll put it on the max it can go. Why is AT still the only tech site in the whole world that is using "Gamer Quality" with a card that has enough power to run a small town?
  • AnnonymousCoward - Sunday, November 22, 2009 - link

    Why does the 5970 get <= 5850 CF performance, when it has 3200 Stream Processors vs 2880?
  • araczynski - Saturday, November 21, 2009 - link

    I look forward to buying this, in a few years.
  • JonnyDough - Friday, November 20, 2009 - link

    Why didn't they just call it the 5880?
  • Paladin1211 - Friday, November 20, 2009 - link

    Ryan,

    I'm a professional L4D player, and I know the Source engine puts out very high frame rates on today's cards. The test becomes silly because there is no difference at all between 60 and 300+ fps. So, it all comes down to minimum fps.

    I suggest that you record a demo in map 5 of Dead Air, with the 4 Survivors defending with their backs to the boundary of the area around the first crashed plane. The main player for the recording gets vomited on by a Boomer, another player throws a pipe bomb near him, and another throws a Molotov near him as well. The full force of zombies (only 30), 2 Hunters, 1 Smoker, and 1 Tank are attacking. (When a player becomes the Tank, the boss he was controlling becomes a bot and keeps attacking the Survivors.)

    This is the heaviest practical scene in L4D, and it just makes sense as a benchmark. You don't really need 8 players to arrange the scene; I think using cheats is much easier.

    I know it will take time to re-benchmark all of those cards for the new scene, but I don't think it will be too much. Even if you can't do this, please reply to me.

    Thank you :)
  • SunSamurai - Friday, November 20, 2009 - link

    You're not a professional FPS player if you think there is no difference between 60 and 300fps.
