Radeon HD 5970 Eyefinity on 3 x 30" Displays: Playable at 7680 x 1600

TWiT's Colleen Kelly pestered me on Twitter to run tests on a 3 x 30" Eyefinity setup. The problem with such a setup is twofold:

1) I was worried even a 5970 wouldn't be powerful enough to drive all three displays at their full resolution (a total of 7680 x 1600) in a modern game, and

2) The Radeon HD 5970's third video output is mini DisplayPort (mini DP) only.

The second issue is bigger than you might think: there are currently no 30" displays that accept a mini DP input, only regular DP. And to convert mini DP to DP or dual-link DVI, you need an active adapter, which is a bit more expensive than a standard converter cable. Apple makes such an adapter and sells it for $99. The local store had one in stock, so I hopped in the batmobile and got one. Update: AMD tells us that in order to use all three outputs, regardless of resolution, you need an active adapter on the mini DP output because the card runs out of timing sources.

Below we have a passive mini DP to single-link DVI adapter. It is only capable of driving a maximum of 1920 x 1200:

This cable works fine on the Radeon HD 5970, but I wasn't about to run one of my displays at a non-native resolution.
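Why the single-link ceiling? One DVI link is capped at a 165 MHz pixel clock, and dual-link doubles that by adding a second set of TMDS pairs. Here's a rough sketch of the arithmetic in Python, using approximate reduced-blanking overheads rather than exact monitor timings:

    # Approximate pixel clock required for a display mode, using rough
    # CVT reduced-blanking overheads (not exact monitor timings).
    def approx_pixel_clock_mhz(width, height, refresh_hz=60,
                               h_blank=160, v_blank=30):
        total_pixels = (width + h_blank) * (height + v_blank)
        return total_pixels * refresh_hz / 1e6

    for w, h in [(1920, 1200), (2560, 1600)]:
        clock = approx_pixel_clock_mhz(w, h)
        verdict = "fits single-link (165 MHz cap)" if clock <= 165 else "needs dual-link"
        print(f"{w} x {h} @ 60 Hz: ~{clock:.0f} MHz, {verdict}")

1920 x 1200 lands around 154 MHz, just under the cap; 2560 x 1600 needs roughly 266 MHz, which is why driving a 30" panel at its native resolution takes dual-link DVI (or DisplayPort).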

Next is the $99 mini DP to dual-link DVI adapter. This box can drive a panel at full 2560 x 1600:

User reviews for this adapter are pretty bad, but thankfully I was using it with a 30" Apple Cinema Display. My experience, unlike that of those who attempt to use it with non-Apple monitors, was flawless. In fact, I had more issues with one of my Dell panels than with the Apple. It turns out that one of my other DVI cables wasn't rated for dual-link operation, and I got a bunch of vertical green lines whenever I tried to run the panel at 2560 x 1600. Check your cables if you're setting up such a beast; I had accidentally grabbed a DVI cable meant for one of my 24" monitors and caused the problem myself.

Windows detected all of the monitors. I used the Eyefinity configuration tool to arrange them properly, grabbed a few 7680 x 1600 wallpapers, and found myself looking at a super wide Windows desktop.


The usual Eyefinity complaints apply here. My Start menu was around 3 feet to the left of my head, and the whole setup required nearly 6 feet of desk space.


In-game menus and cutscenes are also mostly borked. They're rendered at a fixed resolution/aspect ratio and end up getting stretched across all three 30" panels. Case in point, Call of Duty: Modern Warfare 2:


While most games will run at the 7680 x 1600 resolution enumerated by ATI's driver, they don't know how to deal with the 48:10 aspect ratio of the setup (3 x 16:10 displays) and apply the appropriate field of view adjustments. The majority of games will simply try to stretch the 16:10 content to the wider aspect ratio, resulting in a lot of short, fat characters on screen (or stretched characters in your periphery). Below is what MW2 looks like by default:


All of the characters look like they have legs that start at their knees. Thankfully there's a little tool out there that lets you automatically correct aspect ratio errors in some games. It's called Widescreen Fixer and you can get it here.
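For the curious, the correction these tools apply comes down to simple trigonometry: hold the vertical field of view constant and widen the horizontal field of view to match the new aspect ratio (so-called Hor+ scaling). A minimal sketch in Python; the 90-degree figure is just an illustrative value, not MW2's actual FOV:

    import math

    # Hor+ scaling: keep the vertical FOV fixed and widen the
    # horizontal FOV in proportion to the change in aspect ratio.
    def corrected_hfov(base_hfov_deg, base_aspect, target_aspect):
        half = math.radians(base_hfov_deg) / 2
        new_half = math.atan(math.tan(half) * (target_aspect / base_aspect))
        return math.degrees(2 * new_half)

    # A 90-degree horizontal FOV on a single 16:10 panel becomes...
    print(corrected_hfov(90.0, 16 / 10, 48 / 10))  # ~143.13 degrees across 48:10

Note that the widening is nonlinear: tripling the screen width doesn't triple the viewing angle, which is exactly why naive stretching looks so wrong.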

Select your game and desired aspect ratio, then just leave it running in the background while you play. Hitting semicolon enables/disables the aspect ratio correction, resulting in a totally playable, much less vomit-inducing gaming experience. Below we have MW2 with the correct aspect ratio/FOV:


Take note of how much more you can see, as well as how normal the characters now look. Unfortunately, Widescreen Fixer only supports 11 games as of today, and four of them are Call of Duty titles:

Battlefield 2
Battlefield 2142
BioShock
Call of Duty 2
Call of Duty 4: Modern Warfare
Call of Duty: World at War
Call of Duty: Modern Warfare 2
Darkest of Days Demo
SEGA Rally Revo
Unreal Tournament
Wolfenstein

There's absolutely no reason for ATI not to have done this on its own. There's a donate link on the Widescreen Fixer website; the right thing for ATI to do would be to pay this developer for his work. He's doing the job ATI's Eyefinity engineers should have done from day one. Kudos to him, shame on ATI.

Performance with 3 x 30" displays really varies depending on the game. I ran through a portion of the MW2 single player campaign and saw an average frame rate of 30.9 fps, with a minimum of 10 fps and a maximum of 50 fps. It was mostly playable on the Radeon HD 5970 without AA enabled, but not buttery smooth. Turning on 4X AA made Modern Warfare 2 crash, in case you were wondering. The fact that a single card is even remotely capable of delivering a good experience at 7680 x 1600 is impressive.

I also noticed a strange issue where I couldn't get video to sync after a soft reboot; I'd need to shut the entire system down and turn it back on to see anything on screen after a restart.

With corrected aspect ratios/FOV, gaming is ridiculous on such a wide setup. You really end up using your peripheral vision for what it was intended. The experience, even in an FPS, is much more immersive. Although I do stand by my original take on Eyefinity: the most engulfing gameplay comes when you're running through an open field, not in games built around close-quarters action.

Just for fun, I decided to hook my entire 3 x 30" Eyefinity setup up to a power meter and see how much power three 30" displays and a Core i7 system with a single Radeon HD 5970 would consume.

Under load while playing Modern Warfare 2, the entire system only used 517W. That brings us to the next issue with Eyefinity on the 5970: most games will only use one of the two GPUs. Enabling Eyefinity with CrossFire requires a ridiculous amount of data to be sent between the GPUs thanks to the ultra high resolutions supported, and doing so isn't easy given some design tradeoffs made with Cypress (more on this in an upcoming article). Currently, only a limited number of titles support the 5970 running in dual-GPU mode with Eyefinity. Dual-card owners (e.g. 5870 CF) are out of luck; the current drivers do not allow CF and Eyefinity to work together. This will eventually get fixed, but it's going to take some time. With a second GPU running you can expect total system power consumption, including displays, to easily break 600W for the setup I used here.
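To put that inter-GPU traffic in perspective, here's a back-of-the-envelope sketch; the buffer format and frame rate are my assumptions, not AMD's figures:

    # Rough estimate of CrossFire frame traffic at 7680 x 1600,
    # assuming a 32-bit front buffer and alternate-frame rendering
    # (each GPU renders every other frame, so half the frames must
    # cross over to the GPU that owns the displays).
    width, height = 7680, 1600
    bytes_per_pixel = 4
    target_fps = 60
    frames_crossing_per_sec = target_fps / 2

    gb_per_sec = width * height * bytes_per_pixel * frames_crossing_per_sec / 1e9
    print(f"~{gb_per_sec:.1f} GB/s of finished frames alone")

That works out to roughly 1.5 GB/s of completed frames at 60 fps, about triple what a single 2560 x 1600 panel would demand, and that's before counting any other inter-GPU communication.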

The difficulties most games have with such a wide setup prevent 3 x 30" Eyefinity (or any triple-monitor configuration, for that matter) from being a real recommendation. If you already have three monitors, sure, why not, but I don't believe it's anywhere near practical. Not until ATI gets a lot of the software compatibility issues taken care of.

Comments

  • palladium - Wednesday, November 18, 2009

    Since AMD is binning their chips to get the 5970 within spec, I suppose it wouldn't make sense to make a 5950 SKU since a 5850 is simply a re-harvested 5870 (which failed the initial binning process), and 2x5850 would be out of the ATX spec anyway.

    Anyway, a great card for those who can afford it, and have the proper case and PSU to handle it.
  • Paladin1211 - Wednesday, November 18, 2009

    With 512 SPs, just 6.67% more than a GTX 295 has, I don't see Fermi having any chance of beating the 5970. nVidia will need a dual Fermi to dethrone the 5970, and that's not happening until Q3 or Q4 2010.

    nVidia has targeted the wrong, niche market rather than gamers. Sooner or later monitors without bezels will come out, and then Eyefinity will make much more sense. It's really funny that it's RV770s, aka HD 4870s, sitting in one of the five fastest supercomputers, and not Tesla.

    They have taken a long, deep sleep after the 8800GTX and now they're paying for it.
  • cmdrdredd - Wednesday, November 18, 2009

    Unfortunately, PC gaming is almost dead. Look at Call of Duty's release. Look at Dragon Age, which is also available on consoles. Sure, the PC version might look a bit better, but when you spend as much on a video card as someone does on an entire system that can download movies and demos, act as a media box, and play Blu-rays... you get the point.
  • Lurker0 - Wednesday, November 18, 2009

    Unfortunately, PC gaming has been declared "nearly dead" for decades. It hasn't died, and as much as console fanboys will rage on hearing this, it isn't going to either.

    PC gaming is a niche industry; it always has been and always will be. Yes, console games do tend to be more profitable, which means that most games will be developed for consoles first and then ported to the PC. That doesn't mean there will never be games developed for the PC first (or even exclusively), or that there's no profit to be had in PC games.

    Yes, it can be cheaper to get a console than a mid-level gaming PC, just like it can be cheaper to just buy some econobox off the lot than to buy or build your own hot rod. Sure, one market is much larger and more profitable than the other, but there's still money to be made off of PC games and gearheads alike, and so long as that's true neither will be going away.
  • DominionSeraph - Thursday, November 19, 2009

    PC gaming is no longer an isolated economy, though. That changes things. With most games being written with consoles in mind, there isn't the broad-based software push for hardware advancement that there was at the dawn of 3D acceleration.
    I could give you dozens of great reasons to have upgraded from an NV4 to an NV15 back in the day, but the upgrade from a G80 to a 5970 today? ~$800 when you factor in the PSU, and for what? Where's the must-have game that needs it? TNT to GeForce 2 was two years; it's now been 3 since the release of the 8800, and there's been no equivalent of a Half-Life, Quake 2, Deus Ex, Homeworld, Warcraft III, or WoW.
  • GourdFreeMan - Thursday, November 19, 2009

    Unfortunately, this is precisely the problem. Looking at AAA (large budget) games: six years ago PC game sales were predominantly PC exclusives, with some well-known console ports (e.g. Halo, Morrowind). Twelve years ago PC game sales were almost entirely exclusives. Today console ports are approaching a majority of high-profile PC titles.

    Being multiplatform isn't necessarily a detriment for a console game. After all, having a larger budget allows more money to be spent on art and on polishing the code to get the best performance out of console hardware. The PC version of a multiplatform title, however, is almost always an afterthought. Little to no effort is spent redesigning the user interface and rebalancing game play for the different controls. Shaders are almost never rewritten when porting to take advantage of effects that could only be accomplished with the power of the graphics cards in modern PCs. At most we seem to get better textures at somewhat higher resolutions.

    The biggest problem with multiplatform development, however, is that multiplatform games are almost always aimed at the lowest common denominator in terms of both technology and content. All this does is produce the same game over and over again -- the clichéd rail shooter in a narrow environment with a macho/reluctant superhuman protagonist thrown against hordes of respawning mooks.

    Based on the quarterly sales revenue reports from the major publishers (EA, Activision and Ubisoft), PC game sales are comparable to PS3 game sales. The PS3, however, has several more exclusives because Sony owns several game studios and forces them to release exclusives. AMD and nVIDIA do not, much to PC gaming's detriment.
  • mschira - Wednesday, November 18, 2009

    Hehe, 5970 CF to power three screens, now that sounds like a killer setup.
    Besides, that would be burning 600+ watts for the graphics alone. What's the CPU supposed to live on? The BIOS battery?
    M.
  • monomer - Wednesday, November 18, 2009

    Wouldn't it be possible to run six screens using a 5970 CF setup, or are there other limitations I'm unaware of?
  • Fleeb - Wednesday, November 18, 2009

    600W for the whole setup. :S
  • maximusursus - Wednesday, November 18, 2009

    It really seems weird... :( I've seen some reviews that got way better overclocks than the standard 5870 clocks, and their tests seemed to be fine without any "throttling" problems.

    For example:

    Techspot: 900/1250
    HotHardware: 860/1220
    Tom's Hardware: 900/1300
    HardOCP: 900/1350 (!)
    Guru3D: 900/1200

    HardwareZone, however, had a similar problem to you guys; could it really be the PSU?
