Radeon HD 5970 Eyefinity on 3 x 30" Displays: Playable at 7680 x 1600

TWiT's Colleen Kelly pestered me on Twitter to run tests on a 3 x 30" Eyefinity setup. The problem with such a setup is twofold:

1) I was worried even a 5970 wouldn't be powerful enough to drive all three displays at their full resolution (a total of 7680 x 1600) in a modern game, and

2) The Radeon HD 5970's third video output is mini-DP only.

The second issue is bigger than you might think: there are currently no 30" displays that accept a mini DP input, only regular DP. And to convert miniDP to DP or dual-link DVI, you need an active adapter, which is a bit more expensive than a standard converter cable. Apple makes such an adapter and sells it for $99. The local store had one in stock, so I hopped in the batmobile and got one. Update: AMD tells us that in order to use all three outputs, regardless of resolution, you need an active adapter on the miniDP output because the card runs out of timing sources.

Below we have a passive mini-DP to single-link DVI adapter. This is only capable of driving a maximum of 1920 x 1200:

This cable works fine on the Radeon HD 5970, but I wasn't about to have one of my displays running at a non-native resolution.

Next is the $99 mini DP to dual-link DVI adapter. This box can drive a panel at full 2560 x 1600:

User reviews are pretty bad for this adapter, but thankfully I was using it with a 30" Apple Cinema Display. Unlike those who attempt to use it with non-Apple monitors, my experience was flawless. In fact, I had more issues with one of my Dell panels than with the Apple. It turns out that one of my other DVI cables wasn't rated for dual-link operation, and I got a bunch of vertical green lines whenever I tried to run the panel at 2560 x 1600. Check your cables if you're setting up such a beast; I caused the problem myself by accidentally grabbing a DVI cable meant for a 24" monitor.
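For reference, the dual-link requirement falls out of simple bandwidth math: a single TMDS link tops out at a 165 MHz pixel clock. Here's a quick sketch; the ~12% blanking overhead is an assumed reduced-blanking approximation, so treat the numbers as ballpark:

```python
# Rough check of why 2560 x 1600 needs dual-link DVI: a single TMDS link
# tops out at a 165 MHz pixel clock. The 12% blanking overhead is an
# assumed CVT reduced-blanking approximation, so these are ballpark numbers.
SINGLE_LINK_LIMIT_MHZ = 165

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.12):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for width, height in [(1920, 1200), (2560, 1600)]:
    clock = pixel_clock_mhz(width, height, 60)
    verdict = "single-link OK" if clock <= SINGLE_LINK_LIMIT_MHZ else "needs dual-link"
    print(f"{width} x {height} @ 60Hz: ~{clock:.0f} MHz -> {verdict}")
```

At 60Hz, 1920 x 1200 comes in around 155 MHz and squeaks under the single-link limit, while 2560 x 1600 needs roughly 275 MHz and has to be split across both links.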

Windows detected all of the monitors; I used the Eyefinity configuration tool to arrange them properly, grabbed a few 7680 x 1600 wallpapers, and was looking at a super wide Windows desktop.



The usual Eyefinity complaints apply here. My Start menu was around 3 feet to the left of my head, and the whole setup required nearly 6 feet of desk space.



In-game menus and cutscenes are also mostly borked. They are fixed resolution/aspect ratio and end up getting stretched across all three 30" panels. A case in point is Call of Duty: Modern Warfare 2:



While most games will run at the 7680 x 1600 resolution enumerated by ATI's driver, they don't know how to deal with the 48:10 aspect ratio of the setup (3 x 16:10 displays) or apply the appropriate field of view (FOV) adjustments. The majority of games will simply try to stretch 16:10 content to the wider aspect ratio, resulting in a lot of short, fat characters on screen (or stretched characters in your periphery). Below is what MW2 looks like by default:



All of the characters look like they have legs that start at their knees. Thankfully there's a little tool out there that lets you automatically correct aspect ratio errors in some games. It's called Widescreen Fixer and you can get it here.
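For the curious, the fix amounts to standard Hor+ FOV scaling: hold the vertical field of view constant and widen the horizontal field of view to match the new aspect ratio. Here's a minimal sketch of the arithmetic; the 65-degree base FOV is an assumed typical single-screen value, not any particular game's default:

```python
import math

# Hor+ scaling: hold the vertical FOV constant and widen the horizontal FOV
# to match the new aspect ratio. The 65-degree base is an assumed typical
# single-screen value, not pulled from any particular game's config.
def widen_hfov(hfov_deg, old_aspect, new_aspect):
    half = math.radians(hfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half) * (new_aspect / old_aspect)))

print(widen_hfov(65, 16 / 10, 48 / 10))  # ~125 degrees across 3 x 16:10 panels
```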

Select your game and desired aspect ratio, and just leave it running in the background while you play. Hitting semicolon enables/disables the aspect ratio correction, resulting in a totally playable, much less vomit-inducing gaming experience. Below we have MW2 with the correct aspect ratio/FOV:



Take note of how much more you can see, as well as how normal the characters now look. Unfortunately, Widescreen Fixer only supports 11 games as of today, and four of them are Call of Duty titles:

Battlefield 2
Battlefield 2142
BioShock
Call of Duty 2
Call of Duty 4: Modern Warfare
Call of Duty: World at War
Call of Duty: Modern Warfare 2
Darkest of Days Demo
SEGA Rally Revo
Unreal Tournament
Wolfenstein
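
Under the hood, tools like this typically work by finding the game's FOV value in process memory and overwriting it while the game runs. The sketch below illustrates that general technique with Python's ctypes on Windows; the address, value, and float assumption are purely hypothetical, and this is not Widescreen Fixer's actual code:

```python
import ctypes
import struct

# Hypothetical illustration of memory-patching an FOV value. A real tool
# locates the correct address per game and version; 0x0DECADE0 is made up.
PROCESS_VM_WRITE = 0x0020
PROCESS_VM_OPERATION = 0x0008

def write_float(pid, address, value):
    """Write a 32-bit float into another process's memory (Windows only)."""
    kernel32 = ctypes.windll.kernel32
    kernel32.OpenProcess.restype = ctypes.c_void_p  # avoid handle truncation
    handle = kernel32.OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION, False, pid)
    if not handle:
        raise OSError("could not open target process")
    try:
        buf = struct.pack("f", value)  # assumes the game stores FOV as a float
        written = ctypes.c_size_t(0)
        ok = kernel32.WriteProcessMemory(ctypes.c_void_p(handle), ctypes.c_void_p(address),
                                         buf, len(buf), ctypes.byref(written))
        if not ok:
            raise OSError("write failed")
    finally:
        kernel32.CloseHandle(ctypes.c_void_p(handle))

# write_float(game_pid, 0x0DECADE0, 124.7)  # hypothetical address, corrected FOV
```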

There's absolutely no reason for ATI not to have done this on its own. There's a donate link on the Widescreen Fixer website; the right thing for ATI to do would be to pay this developer for his work. He's doing the job ATI's Eyefinity engineers should have done from day one. Kudos to him, shame on ATI.

Performance with 3 x 30" displays really varies depending on the game. I ran through a portion of the MW2 single player campaign and saw an average frame rate of 30.9 fps, with a minimum of 10 fps and a maximum of 50 fps. It was mostly playable on the Radeon HD 5970 without AA enabled, but not buttery smooth. Turning on 4X AA made Modern Warfare 2 crash, in case you were wondering. The fact that a single card is even remotely capable of delivering a good experience at 7680 x 1600 is impressive.
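To put that resolution in perspective, here's the raw pixel arithmetic (a trivial sketch):

```python
# Raw pixel counts: the Eyefinity desktop is three times the pixels of a
# single 30-inch panel and over five times a 1920 x 1200 display.
base = 2560 * 1600
for name, (w, h) in {"1920 x 1200": (1920, 1200),
                     "2560 x 1600": (2560, 1600),
                     "7680 x 1600": (7680, 1600)}.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.2f}x a single 30-inch panel)")
```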

I also noticed a strange issue where I couldn't get my video to sync after any soft reboot. I'd need to shut down the entire system and turn it back on to see anything on screen after a restart.

With corrected aspect ratios/FOV, gaming is ridiculous on such a wide setup. You really end up using your peripheral vision the way it was intended. The experience, even in an FPS, is much more immersive. I do stand by my original take on Eyefinity, though: the most engulfing gameplay comes when you find yourself running through an open field, not in games that deal with more close-quarters action.

Just for fun I decided to hook my entire 3 x 30" Eyefinity setup to a power meter and see how much power three 30" displays and a Core i7 system with a single Radeon HD 5970 would consume.

Under load while playing Modern Warfare 2, the entire system only used 517W, which brings us to the next issue with Eyefinity on the 5970: most games will only use one of the GPUs. Enabling Eyefinity with CrossFire requires a ridiculous amount of data to be sent between the GPUs thanks to the ultra high resolutions supported, and that isn't easy given some design tradeoffs made with Cypress (more on this in an upcoming article). Currently, only a limited number of titles support the 5970 running in dual-GPU mode with Eyefinity. Dual-card owners (e.g. 5870 CF) are out of luck; the current drivers do not allow CF and Eyefinity to work together. This will eventually get fixed, but it's going to take some time. With a second GPU running you can expect total system power consumption, including displays, to easily break 600W for the setup I used here.
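For a feel of the data volumes involved, here's a simplified back-of-envelope model of alternate frame rendering (AFR) traffic, where the second GPU's completed frames have to be moved for display; the 32-bit color and 60 fps figures are assumptions, not AMD's numbers:

```python
# Simplified AFR model: the second GPU renders every other frame, and each
# of those frames (32-bit color assumed) must be transferred for scan-out.
def afr_frame_traffic_gb_s(width, height, fps, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * (fps / 2) / 1e9

print(afr_frame_traffic_gb_s(2560, 1600, 60))  # single 30" panel: ~0.49 GB/s
print(afr_frame_traffic_gb_s(7680, 1600, 60))  # 3 x 30" Eyefinity: ~1.47 GB/s
```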

The difficulties most games have with such a wide setup prevent 3 x 30" Eyefinity (or any triple-monitor configuration) from being a real recommendation. If you have three monitors, sure, why not, but I don't believe it's anywhere near practical, at least not until ATI gets a lot of the software compatibility issues taken care of.

114 Comments

  • Ryan Smith - Wednesday, November 18, 2009 - link

    It's possible, but the 850TX is a very well regarded unit. If it can't run a 5970 overclocked, then I surmise that a lot of buyers are going to run into the same problem. I don't have another comparable power supply on hand, so this isn't something I can test with my card.

    Anand has a 1K unit, and of course you know how his turned out.

    To be frank, we likely would never have noticed the throttling issue if it weren't for the Distributed.net client. It was only after realizing that it was underperforming by about 10-20% that I decided to watch the Overdrive pane and saw it bouncing around. These guys could be throttling too and just not realize it.
  • Silverforce11 - Wednesday, November 18, 2009 - link

    Seems iffy then, since most reviews put it at 900 core and 5GHz+ on the RAM with only a modest overvolt to 1.16V. I would think ATI wouldn't bother putting in three high-quality VRMs and Japanese capacitors if they didn't test it thoroughly at the specs they wanted it to OC at.

    My old PSU is the bigger bro of this guy, being the 750W version:
    http://anandtech.com/casecoolingpsus/showdoc.aspx?...
    It had issues with the 4870X2. Got a better "single rail" PSU and it ran fine and OC'd well.
  • Silverforce11 - Wednesday, November 18, 2009 - link

    ATI went all out building these 5970s; the components are top notch and the chips are the best of the bunch. I'm surprised they did this, as they are essentially selling you 2x 5870 performance (IF your PSU is good) at $599 when 2x 5870 CF would cost $800. They have no competitor at the top, so why not price this card higher, and why bother putting in quality parts that all but guarantee 5870 clocks?

    I believe it's ATI's last nail in the nV coffin, and they hammered it really hard.
  • ET - Wednesday, November 18, 2009 - link

    Too much discussion about adapters for the mini DisplayPort. The 27" iMac has such an input port and a resolution of 2560 x 1440, and it seems a sin not to test them together. (Not that I'm blaming AnandTech or anything, since I'm sure it's not that easy to get an iMac for testing.)
  • Taft12 - Wednesday, November 18, 2009 - link

    Why would they bother using a computer with an attached monitor when they could instead use the larger, higher-res, and CHEAPER Dell 3008WFP?
  • Raqia - Wednesday, November 18, 2009 - link

    Look at all the fingerprint smudges on the nice card! I've started to notice the hand models that corporations use to hold their products. The hands holding the iPods on the Apple site? Flawless, perfect nails and cuticles. Same w/ the fingers grasping the Magny-Cours chip.
  • NullSubroutine - Wednesday, November 18, 2009 - link

    Hilbert @ Guru3D got the overclocking working at a 900MHz core speed (though it reached 90C).

    http://www.guru3d.com/article/radeon-hd-5970-revie...

    I was impressed with some of the CrossFire benchmarks actually showing improvement. If Eyefinity works with the 5970, does it work with the card in CrossFire?
  • Ryan Smith - Wednesday, November 18, 2009 - link

    Bear in mind that it also took him 1.3V to get there; the AMD tool doesn't go that high. With my card, I strongly suspect the issue is the VRMs, so more voltage wouldn't help.

    And I'm still trying to get an answer to the Eyefinity + 5970CF question. The boys and girls at AMD went home for the night before we realized we didn't have an answer to that.
  • Lennie - Wednesday, November 18, 2009 - link

    I thought everyone knew about Furmark and ATi by now. It used to be like this on the 4870 series too.

    It went like this: at first there were a few reports of 4870 (and X2) cards dying when running Furmark. Further investigation showed that it was indeed Furmark causing the VRMs to heat up to insane levels and eventually killing them. Word reached ATi, and from that point on ATi has intentionally throttled their cards when detecting Furmark to prevent the damage.

    In fact, the heat load Furmark puts on the VRMs is unrealistic; no game is able to heat up the VRMs to the level Furmark does. OCCT used the same method (or maybe even integrated Furmark) to test for stability (in their own opinion, ofc).

    So beware of Furmark and OCCT if you have an HD4K or 5K card.

    The term "hardware virus" is rightfully applicable to Furmark when it comes to the HD4K (and perhaps the 5K).
  • strikeback03 - Wednesday, November 18, 2009 - link

    The article stated that they encountered throttling in real games, not Furmark.
