Radeon HD 5970 Eyefinity on 3 x 30" Displays: Playable at 7680 x 1600

TWiT's Colleen Kelly pestered me on Twitter to run tests on a 3 x 30" Eyefinity setup. The problem with such a setup is twofold:

1) I was worried even a 5970 wouldn't be powerful enough to drive all three displays at their full resolution (a total of 7680 x 1600) in a modern game, and

2) The Radeon HD 5970's third video output is mini-DP only.

The second issue is bigger than you might think: there are currently no 30" displays that accept a mini DisplayPort input, only regular DisplayPort. And to convert mini DP to DP or dual-link DVI, you need an active adapter, which is a bit more expensive than a standard converter cable. Apple makes such an adapter and sells it for $99. The local store had one in stock, so I hopped in the batmobile and got one. Update: AMD tells us that in order to use all three outputs, regardless of resolution, you need to use an active adapter for the mini DP output because the card runs out of timing sources.

Below we have a passive mini-DP to single-link DVI adapter. This is only capable of driving a maximum of 1920 x 1200:

This cable works fine on the Radeon HD 5970, but I wasn't about to run one of my displays at a non-native resolution.

Next is the $99 mini DP to dual-link DVI adapter. This box can drive a panel at full 2560 x 1600:

User reviews are pretty bad for this adapter, but thankfully I was using it with a 30" Apple Cinema Display. Unlike those who try to use it with non-Apple monitors, I had a flawless experience. In fact, I had more issues with one of my Dell panels than with this Apple. It turns out that one of my other DVI cables wasn't rated for dual-link operation, and I got a bunch of vertical green lines whenever I tried to run the panel at 2560 x 1600. Check your cables if you're setting up such a beast; I accidentally grabbed one of the DVI cables from a 24" monitor and that's what caused the problem.

Windows detected all of the monitors; I used the Eyefinity configuration tool to arrange them properly, grabbed a few 7680 x 1600 wallpapers, and found myself staring at a super wide Windows desktop.


The usual Eyefinity complaints apply here. My Start menu was around 3 feet to the left of my head, and the whole setup required nearly 6 feet of desk space.


In-game menus and cutscenes are also mostly borked. They are rendered at a fixed resolution/aspect ratio and end up getting stretched across all three 30" panels. A case in point is Call of Duty: Modern Warfare 2:


While most games will run at the 7680 x 1600 resolution enumerated by ATI's driver, they don't know how to deal with the 48:10 aspect ratio of the setup (3 x 16:10 displays) and don't apply the appropriate field of view adjustments. The majority of games will simply stretch the 16:10 content to the wider aspect ratio, resulting in a lot of short, fat characters on screen (or stretched characters in your periphery). Below is what MW2 looks like by default:


All of the characters look like they have legs that start at their knees. Thankfully there's a little tool out there that lets you automatically correct aspect ratio errors in some games. It's called Widescreen Fixer and you can get it here.
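
For the curious, the math behind this kind of correction is simple: keep the vertical field of view constant and widen the horizontal field of view to match the new aspect ratio (so-called Hor+ scaling). Below is a minimal sketch of that formula in Python; it is not Widescreen Fixer's actual code, and the 65 degree base FOV is just an assumed example value, not necessarily what MW2 uses.

    import math

    def corrected_hfov(base_hfov_deg, base_aspect, target_aspect):
        # Hor+ scaling: hold the vertical FOV constant and widen the horizontal FOV.
        base_hfov = math.radians(base_hfov_deg)
        target_hfov = 2 * math.atan(math.tan(base_hfov / 2) * (target_aspect / base_aspect))
        return math.degrees(target_hfov)

    # Example: a game tuned for a 65 degree horizontal FOV at 16:10 needs
    # roughly 125 degrees at 48:10 (three 16:10 panels side by side).
    print(round(corrected_hfov(65.0, 16.0 / 10.0, 48.0 / 10.0), 1))

Going from 65 degrees to roughly 125 degrees of horizontal FOV gives you a sense of just how much extra scene the side panels are supposed to show, and why simply stretching the original image looks so wrong.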

Select your game and desired aspect ratio, then just leave it running in the background while you play. Hitting semicolon will enable/disable the aspect ratio correction and result in a totally playable, much less vomit-inducing gaming experience. Below we have MW2 with the correct aspect ratio/FOV:


Take note of how much more you can see, as well as how normal the characters now look. Unfortunately, Widescreen Fixer only supports 11 games as of today, and four of them are Call of Duty titles:

Battlefield 2
Battlefield 2142
BioShock
Call of Duty 2
Call of Duty 4: Modern Warfare
Call of Duty: World at War
Call of Duty: Modern Warfare 2
Darkest of Days Demo
SEGA Rally Revo
Unreal Tournament
Wolfenstein

There's absolutely no reason for ATI not to have done this on its own. There's a donate link on the Widescreen Fixer website; the right thing for ATI to do would be to pay this developer for his work. He's doing the job ATI's Eyefinity engineers should have done from day one. Kudos to him, shame on ATI.

Performance with 3 x 30" displays really varies depending on the game. I ran through a portion of the MW2 single player campaign and saw an average frame rate of 30.9 fps, with a minimum of 10 fps and a maximum of 50 fps. It was mostly playable on the Radeon HD 5970 without AA enabled, but not buttery smooth. Turning on 4X AA made Modern Warfare 2 crash, in case you were wondering. The fact that a single card is even remotely capable of delivering a good experience at 7680 x 1600 is impressive.
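
If you want to pull similar numbers out of your own runs, here's a minimal sketch that computes them from a frame time log. It assumes a plain text file with one frame time in milliseconds per line; the file name is a placeholder, and you may need to massage your benchmarking tool's output into that shape first.

    # Compute min/average/max fps from per-frame render times (in milliseconds).
    def fps_stats(path):
        with open(path) as f:
            frame_times_ms = [float(line) for line in f if line.strip()]
        fps = [1000.0 / ms for ms in frame_times_ms]
        # The average is total frames over total time, not the mean of the
        # per-frame fps values, which would over-weight the fast frames.
        avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
        return min(fps), avg_fps, max(fps)

    print("min/avg/max fps: %.1f / %.1f / %.1f" % fps_stats("frametimes.txt"))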

I also noticed a strange issue where I couldn't get my video to sync after a soft reboot. I'd have to shut down the entire system and turn it back on to see anything on screen after a restart.

With corrected aspect ratios/FOV, gaming is ridiculous on such a wide setup. You really end up using your peripheral vision for what it was intended. The experience, even in an FPS, is much more immersive. That said, I do stand by my original take on Eyefinity: the most engulfing gameplay comes when you find yourself running through an open field, not in games that deal with more close-quarters action.

Just for fun I decided to hook my entire 3 x 30" Eyefinity setup up to a power meter and see how much power three 30" displays and a Core i7 system with a single Radeon HD 5970 would consume.

Under load while playing Modern Warfare 2, the entire system only used 517W, which brings us to the next issue with Eyefinity on the 5970: most games will only use one of the GPUs. Enabling Eyefinity with CrossFire requires a ridiculous amount of data to be sent between the GPUs thanks to the ultra high resolutions involved, and that isn't easy given some design tradeoffs made with Cypress (more on this in an upcoming article). Currently, only a limited number of titles will support the 5970 running in dual-GPU mode with Eyefinity. Dual-card owners (e.g. 5870 CF) are out of luck; the current drivers do not allow CrossFire and Eyefinity to work together. This will eventually get fixed, but it's going to take some time. With a second GPU running, you can expect total system power consumption, including displays, to easily break 600W for the setup I used here.

The difficulties most games have with such a wide setup prevent 3 x 30" Eyefinity (or any triple-monitor configuration, for that matter) from being a real recommendation. If you already have three monitors, sure, why not, but I don't believe it's anywhere near practical, at least not until ATI gets a lot of the software compatibility issues taken care of.

Comments

  • GourdFreeMan - Friday, November 20, 2009 - link

    Having not bought MW2, I can say conversely that the lack of differentiation between console and PC features hurts game sales. According to news reports, in the UK PC sales of MW2 account for less than 3% of all sales. This is neither representative of the PC share of the gaming market (which should be ~25% of all "next-gen" sales based on quarterly reports of revenue from publishers), nor the size of the install base of modern graphics cards capable of running MW2 at a decent frame rate (which should be close to the size of the entire console market based on JPR figures). Admittedly the UK has a proportionately larger console share than the US or Germany, but I can't imagine PC sales of MW2 are much better globally.

    I am sure executives will be eager to blame piracy for the lack of PC sales, but their target market knows better...
  • cmdrdredd - Wednesday, November 18, 2009 - link

    [quote]Unfortunately, since playing MW2, my question is: are there enough games that are sufficiently superior on the PC to justify the initial expense and power usage of this card? Maybe that's where eyefinity for AMD and PhysX for nVidia come in: they at least differentiate the PC experience from the console.
    I hate to say it, but to me there just do not seem to be enough games optimized for the PC to justify the price and power usage of this card, that is unless one has money to burn.[/quote]

    Yes, this is exactly my thinking. They can tout DX11, fancy schmancy Eyefinity, PhysX, everything except a free lunch, and it doesn't change the fact that the lineup for PC gaming is bland at best. It sucks, I love gaming on PC but it's pretty much a dead end at this time. No thanks to every 12 year old who curses at you on Xbox Live.
  • The0ne - Wednesday, November 18, 2009 - link

    My main reason to want this card would be to drive my 30" LCDs. I have two Dells already and will get another one early next year. I don't actually play games much, but I like having the desktop space for my work.

    -VM's at higher resolution
    -more open windows without switching too much
    -watch movie(s) while working
    -bigger font size but maintaining the aspect ratio of programs :)

    Currently I have my main system hooked up to one 30" and to my 73" TV. The TV is only 1080p so space is a bit limited. Plus working on the TV sucks big time :/
  • shaolin95 - Wednesday, November 18, 2009 - link

    I am glad ATI is able to keep competing as that helps keep prices at a "decent" level.
    Still, for all of you so amazed by Eyefinity, do yourselves a favor and try 3D Vision with a big-screen DLP; then you will laugh at what you thought was cool and "3D" before.
    You can have 100 monitors but it is still just a flat world... time to join REAL 3D gaming, guys!
  • Carnildo - Wednesday, November 18, 2009 - link

    Back in college, I was the administrator for a CAVE system. It's a cube ten feet on a side, with displays on all surfaces. Combine that with head tracking, hand tracking, shutter glasses, and surround sound, and you've got a fully immersive 3D environment.

    It's designed for 3D visualization of large datasets, but people have ported a number of 3D shooters to the platform. You haven't lived until you've seen a life-sized opponent come around the corner and start blasting away at you.
  • 7Enigma - Wednesday, November 18, 2009 - link

    But Ryan, I feel you might need to edit a couple of your comparison comments between the 295 and this new card. Based on the comments on several previous articles, quite a few readers do not look at (or understand) the charts and instead rely on the commentary below them. Here are some examples:

    "Meanwhile the GTX 295 sees the first of many falls here. It falls behind the 5970 by 30%-40%. The 5870 gave it a run for its money, so this is no surprise."

    This one for Stalker is clear and concise. I'd recommend you repeat this format for the rest of the games.

    "As for the GTX 295, the lead is only 20%. This is one of the better scenarios for the GTX 295."

    This comment was for Battleforge and IMO is confusing. To someone not reading the chart it could be viewed as saying the 295 has a 20% advantage. Again I'd stick with your Stalker comment.

    "HAWX hasn’t yet reached a CPU ceiling, but it still gets incredibly high numbers. Overclocking the card gets 14% more, and the GTX 295 performance advantage is 26%."

    Again, this could be seen as the 295 being 26% faster.

    "Meanwhile overclocking the 5970 is good for another 9%, and the GTX 295 gap is 37%."

    This one is less confusing as it doesn't mention an advantage but should just mention 37% slower.


    Finally I think you made a typo in the conclusion where you said this:

    "Overclock your 5970 to 5870 speeds if you can bear the extra power/heat/noise, but don’t expect 5970CF results."


    I think you meant 5870CF results...


    Overall, though, the article is really interesting as we've finally hit a performance bottleneck that is not so easily overcome (due to power draw and ATX specifications). I'm very pleased, however, that you mention first in the comments that this truly is a card meant for multi-monitor setups only, and even then, may be bottlenecked by design. The 5870 single card setup is almost overkill for a single display, and even then most people are not gaming on >24" monitors.

    I've said it for the past 2 generations of cards, but we've pretty much maxed out the need for faster cards (for GAMING purposes). Unless we start getting some super-hi-res goggles that are reasonably priced, there just isn't much further to go due to display limitations. I mean honestly, are those slightly fuzzy shadows worth the crazy performance hit in an FPS? I honestly am having a VERY difficult time seeing a difference in the first set of pictures of the soldier's helmet. The pictures are taken slightly off angle from each other and even then I don't see what the arrow is pointing at. And if I can't see a significant difference in a STILL shot, how the heck am I to see a difference in-game!?

    OK enough rant, thanks for the review. :)
  • Anand Lal Shimpi - Wednesday, November 18, 2009 - link

    Thanks for the edits, I've made some corrections for Ryan that will hopefully make the statements more clear.

    I agree that the need for a faster GPU on the desktop is definitely minimized today. However I do believe in the "if you build it, they will come" philosophy. At some point, the amount of power you can get in a single GPU will be great enough that someone has to take advantage of it. Although we may need more of a paradigm shift to really bring about that sort of change. I wonder if Larrabee's programming model is all we'll need or if there's more necessary...

    Take care,
    Anand
  • 7Enigma - Wednesday, November 18, 2009 - link

    Thank you for the edits and the reply Anand.

    One of the main things I'd like to see GPU drivers implement is an artificial framerate cap option. These >100fps results in several of the tests at insane resolutions are not only pointless, but add unnecessary heat and stress to the system. Drop back down to the normal resolutions that >90% of people have and it becomes even more wasteful to render 150fps.

    I always enable V-sync in my games for my LCD (75Hz), but I don't know if this is actually throttling the GPU so that it doesn't render more than 75fps. My hunch is that in the background it's rendering at its max but only showing frames on screen at the refresh rate.
  • Zool - Wednesday, November 18, 2009 - link

    I tried out full screen FurMark with V-sync on and off (at 640x480) and the difference was 7 degrees Celsius. I have a custom cooler on the 4850 and a 20cm side fan on the case, so that's quite a lot.
  • 7Enigma - Thursday, November 19, 2009 - link

    Thanks for the reply Zool, I was hoping that was the case. So it seems like if I make sure V-sync is on, I'm at least limiting the GPU to the refresh rate of the LCD. Awesome!
