The catch however is that what we don’t have is a level of clear domination when it comes to single-card solutions. AMD was shooting to beat the GTX 295 with the 5870, but in our benchmarks that’s not happening. The 295 and the 5870 are close, perhaps close enough that NVIDIA will need to reconsider their position, but it’s not enough to outright dethrone the GTX 295. NVIDIA still has the faster single-card solution, although the $100 price premium is well in excess of the <10% performance premium.

-From Our Radeon 5870 Review, On The GTX 295 vs. The 5870

Let’s get straight to the point, shall we? Today AMD is launching the 5970, the dual-GPU card that finishes building out their technical domination of the high-end market. With it AMD delivers the absolute victory over NVIDIA’s GTX 295 that the Radeon 5870 couldn’t quite achieve, and at the same time sets a new high-water mark for single-card performance.

This also marks the last AMD product introduction of the year. The rest of the Evergreen series, comprising the sub-$100 low-end parts, will be launching next year.

                      | AMD Radeon HD 5970          | AMD Radeon HD 5870              | AMD Radeon HD 5850
Stream Processors     | 2x1600                      | 1600                            | 1440
Texture Units         | 2x80                        | 80                              | 72
ROPs                  | 2x32                        | 32                              | 32
Core Clock            | 725MHz                      | 850MHz                          | 725MHz
Memory Clock          | 1GHz (4GHz data rate) GDDR5 | 1.2GHz (4.8GHz data rate) GDDR5 | 1GHz (4GHz data rate) GDDR5
Memory Bus Width      | 2x256-bit                   | 256-bit                         | 256-bit
Frame Buffer          | 2x1GB                       | 1GB                             | 1GB
Transistor Count      | 2x2.15B                     | 2.15B                           | 2.15B
TDP                   | 294W                        | 188W                            | 151W
Manufacturing Process | TSMC 40nm                   | TSMC 40nm                       | TSMC 40nm
Price Point           | $599                        | $400                            | $300

The 5970 serves as the now-obligatory dual-GPU part: two Cypress dice mounted on a single, dual-slot video card. AMD clocks it at 725MHz for the core and 1GHz (4GHz effective) for the GDDR5 memory. The card comes equipped with 2GB of GDDR5 split between the two GPUs; since each GPU keeps its own copy of the data it works on, the effective memory capacity is 1GB. The card will be selling for $600, at least so long as vendors and retailers hold the line on the MSRP.

In practice this makes the card something between a 5850 in Crossfire mode and a 5870 in Crossfire mode. The clocks are the same as the 5850’s, but here all 20 SIMD units are enabled. That still leaves a 15% clockspeed gap between the 5970 and the 5870CF, so officially the 5870CF will continue to be the faster setup. However, as we’ll see in a bit, looking at the stock 5970 can be a bit deceiving.
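To put that in rough numbers, here’s a back-of-the-envelope comparison of theoretical shader throughput using the figures from the spec table. This is purely illustrative arithmetic – it assumes the usual 2 FLOPs per stream processor per clock and says nothing about real-world Crossfire scaling – but it shows where the stock 5970 sits relative to the two Crossfire setups.

```python
# Theoretical peak shader throughput from the spec table's clocks and shader
# counts. Illustrative only; real games scale far less cleanly than this.
cards = {
    "5970":    {"gpus": 2, "shaders": 1600, "clock_mhz": 725},
    "5870 CF": {"gpus": 2, "shaders": 1600, "clock_mhz": 850},
    "5850 CF": {"gpus": 2, "shaders": 1440, "clock_mhz": 725},
}

for name, c in cards.items():
    # Each stream processor can issue one multiply-add (2 FLOPs) per clock.
    gflops = c["gpus"] * c["shaders"] * 2 * c["clock_mhz"] / 1000
    print(f"{name:8s} {gflops:5.0f} GFLOPS")

# 5970: 4640, 5870 CF: 5440 (~17% ahead), 5850 CF: 4176 (~10% behind)
```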

This also brings up the matter of the card’s name. We asked AMD what happened to the X2 tag, and the answer is that they didn’t want to use it since the card is configured like neither a 5850 nor a 5870 – it’s closer to a mythical 5860. So rather than give it an odd (or worse yet, wrong) name, AMD simply assigned it a new model number entirely. We suspect AMD also wanted to be rid of the X2 name – their processors go up to X4, after all – but that’s the official reason. Either way, it looks like special multi-GPU tags are now gone in both the NVIDIA and AMD camps.

Moving on to power, the 5970 uses an 8-pin and a 6-pin power connector (although the 6-pin sits on top of a spot silk-screened for another 8-pin). The TDP is 294W, bringing it in just under the 300W ATX limit. Idle power is 42W, thanks to the aggressive power optimizations present across the entire 5000 series.
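For those wondering how the card squeezes under that limit, the budget comes straight from the PCIe power-delivery spec: 75W from the slot, 75W from the 6-pin connector, and 150W from the 8-pin connector. A trivial sketch of the math:

```python
# Spec-compliant power delivery for the 5970's connector layout (PCI-SIG limits).
PCIE_SLOT_W = 75   # PCIe x16 slot
SIX_PIN_W   = 75   # 6-pin PCIe power connector
EIGHT_PIN_W = 150  # 8-pin PCIe power connector

budget_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300W ceiling
tdp_w = 294
print(f"{tdp_w}W TDP vs. {budget_w}W budget -> {budget_w - tdp_w}W of headroom")
```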

As some of you may have noticed, in spite of the fact that this card is at least a pair of 5850s, it consumes less than the 302W (2x151W) such a setup would. In order to meet the 300W limit, AMD binned Cypress chips specifically for the 5970, looking for chips that could operate at 725MHz at only 1.05v (the 5850 runs at 1.088v). Given the power creep coming from the 4800 series, binning for the best chips was the only way AMD could get a 300W card out.
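As a rough sanity check on what that binning buys, dynamic power scales roughly with the square of voltage at a fixed clock, so the drop from 1.088v to 1.05v is worth on the order of 7% per GPU. Treat the sketch below as a first-order estimate of the GPU cores’ dynamic power only; memory, VRM losses, and leakage make up the rest of the 294W board figure.

```python
# First-order estimate: dynamic power scales roughly with V^2 at a fixed clock.
v_5850 = 1.088   # stock 5850 core voltage
v_5970 = 1.05    # binned 5970 core voltage

scale = (v_5970 / v_5850) ** 2
print(f"~{(1 - scale) * 100:.0f}% lower dynamic power per GPU")   # ~7%

# Applied naively to a pair of 151W 5850-class GPUs:
print(f"~{2 * 151 * scale:.0f}W for the GPUs alone, vs. 302W at stock voltage")
```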

AMD’s official guidance calls for a 650W power supply at minimum, with a 750W power supply recommended. That recommendation will become more important later on when we talk about overclocking.

Finally, AMD is also launching Crossfire Eyefinity support with the 5970, and thus far only the 5970. Currently Eyefinity doesn’t work in Crossfire mode on any of AMD’s cards due to driver limitations. The drivers the 5970 will be shipping with enable Crossfire Eyefinity on the 5970 for 22 games – for now AMD is whitelisting games and enabling them on a case-by-case basis. Crossfire Eyefinity will make its way into the mainstream Catalyst drivers and be enabled for other cards early next year.
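To be clear about what whitelisting means in practice, the driver simply checks the running game against a list of validated titles and only exposes the combined mode for those. The sketch below is purely hypothetical – the names and structure are ours, not AMD’s – but it captures the case-by-case nature of the current support.

```python
# Hypothetical illustration of per-game whitelisting; not AMD's actual driver code.
CROSSFIRE_EYEFINITY_WHITELIST = {
    "game_a.exe",
    "game_b.exe",
    # ...placeholders for the 22 titles AMD has validated so far
}

def crossfire_eyefinity_allowed(game_executable: str) -> bool:
    """Expose Crossfire + Eyefinity only for titles validated case-by-case."""
    return game_executable.lower() in CROSSFIRE_EYEFINITY_WHITELIST
```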


114 Comments


  • SJD - Wednesday, November 18, 2009 - link

    Thanks Anand,

    That kind of explains it, but I'm still confused about the whole thing. If your third monitor supported mini-DP then you wouldn't need an active adapter, right? Why is that, when mini-DP and regular DP are the 'same' apart from the actual plug size? I thought the whole timing issue was only relevant when wanting a third 'DVI' (/HDMI) output from the card.

    Simon
  • CrystalBay - Wednesday, November 18, 2009 - link

    WTH is really up at TSMC?
  • Jacerie - Wednesday, November 18, 2009 - link

    All the single-game tests are great and all, but for once I would love to see AT run a series of video card tests where multiple instances of games like EVE Online are running. While single-instance tests are great for the FPS crowd, all us crazy high-end MMO players need some love too.
  • Makaveli - Wednesday, November 18, 2009 - link

    Jacerie, the problem with benching MMOs, and why you don't see more of them, is all the other factors that come into play. You now have to deal with server latency, and you have no control over how many players are on the server at any given time when running benchmarks. There are just too many variables, which makes the benchmarks unrepeatable and invalid for comparison purposes!
  • mesiah - Thursday, November 19, 2009 - link

    I think what he is more interested in is how well the card can render multiple instances of the game running at once. This could easily be done with a private server or even a demo written with the game engine. It would not be real-world data, but it would give an idea of performance scaling when multiple instances of a game are running. Being an occasional "dual-boxer" myself, I wouldn't mind seeing the data.
  • Jacerie - Thursday, November 19, 2009 - link

    That's exactly what I was trying to get at. It's not uncommon for me to be running at least two instances of EVE with an entire assortment of other apps in the background. My current 3870X2 does the job just fine, but with Windows 7 out and DX11 around the corner I'd like to know how much money I'm going to need to stash away to keep the same level of usability I have now with the newer cards.
  • Zool - Wednesday, November 18, 2009 - link

    The "so fast" is only because 95% of the games are DX9 Xbox ports. Crysis has been the most demanding game out there for quite a while now (it needs to be added that it has a very lazy engine). In Age of Conan the difference between DX9 and DX10 is more than half the framerate (with plenty of those effects on screen, even a drop to 1/3). Those advanced shader effects they show in demos are actually much more demanding on the GPU than the DX9 shaders; they just don't mention it. It will be the same with DX11. A full DX11 game with all those fancy shaders will be on the level of Crysis.
  • crazzyeddie - Wednesday, November 18, 2009 - link

    ... after their first 40nm test chips came back as being less impressive than **there** 55nm and 65nm test chips were.
  • silverblue - Wednesday, November 18, 2009 - link

    Hehe, I saw that one too.
  • frozentundra123456 - Wednesday, November 18, 2009 - link

    Unfortunately, since playing MW2, my question is: are there enough games that are sufficiently superior on the PC to justify the initial expense and power usage of this card? Maybe that's where Eyefinity for AMD and PhysX for nVidia come in: they at least differentiate the PC experience from the console.
    I hate to say it, but to me there just do not seem to be enough games optimized for the PC to justify the price and power usage of this card, that is unless one has money to burn.
