Radeon HD 5970 Eyefinity on 3 x 30" Displays: Playable at 7680 x 1600

TWiT's Colleen Kelly pestered me on Twitter to run tests on a 3 x 30" Eyefinity setup. The problem with such a setup is twofold:

1) I was worried even a 5970 wouldn't be powerful enough to drive all three displays at their full resolution (a total of 7680 x 1600) in a modern game, and

2) The Radeon HD 5970's third video output is mini-DP only.

The second issue is bigger than you'd think: there are currently no 30" displays that accept a mini DisplayPort input, only regular DisplayPort. And to convert mini DP to DP or dual-link DVI, you need an active adapter, which is a bit more expensive than a standard converter cable. Apple makes one and sells it for $99. The local store had one in stock, so I hopped in the batmobile and got one. Update: AMD tells us that in order to use all three outputs, regardless of resolution, you need an active adapter on the mini DP output because the card runs out of timing sources.

Below we have a passive mini-DP to single-link DVI adapter. This is only capable of driving a maximum of 1920 x 1200:

This adapter works fine with the Radeon HD 5970, but I wasn't willing to run one of my displays at a non-native resolution.

Next is the $99 mini DP to dual-link DVI adapter. This box can drive a panel at full 2560 x 1600:

User reviews of this adapter are pretty bad, but thankfully I was using it with a 30" Apple Cinema Display. Unlike those who have tried it with non-Apple monitors, my experience was flawless. In fact, I had more issues with one of my Dell panels than with the Apple. It turns out that one of my other DVI cables wasn't rated for dual-link operation, and I got a bunch of vertical green lines whenever I tried to run that panel at 2560 x 1600. Check your cables if you're setting up such a beast; I accidentally grabbed a DVI cable meant for a 24" monitor and caused the problem myself.

Windows detected all of the monitors; I used the Eyefinity configuration tool to arrange them properly, grabbed a few 7680 x 1600 wallpapers, and was greeted by a super wide Windows desktop.


[Screenshot: the 7680 x 1600 Windows desktop spanning all three 30" panels]

The usual Eyefinity complaints apply here. My start menu was around 3 feet to the left of my head and the whole setup required nearly 6 feet of desk space.


[Image: the three 30" displays set up side by side]

In-game menus and cutscenes are also mostly borked. They're rendered at a fixed resolution and aspect ratio, and end up stretched across all three 30" panels. Case in point: Call of Duty: Modern Warfare 2:


[Screenshot: the MW2 menu stretched across all three panels]

While most games will run at the 7680 x 1600 resolution enumerated by ATI's driver, they don't know how to deal with the 48:10 aspect ratio of the setup (3 x 16:10 displays) and don't apply the appropriate field of view adjustments. The majority of games will simply stretch the 16:10 content to the wider aspect ratio, resulting in a lot of short, fat characters on screen (or stretched characters in your periphery). Below is what MW2 looks like by default:


[Screenshot: MW2 at default settings, its 16:10 image stretched to 48:10]

All of the characters look like they have legs that start at their knees. Thankfully there's a little tool out there that automatically corrects aspect ratio errors in some games. It's called Widescreen Fixer and you can get it here.
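
The fix itself is just geometry: keep the vertical field of view the game was designed for and widen the horizontal field of view to cover the 48:10 span (commonly called Hor+ scaling). Here's a minimal Python sketch of that correction; the 65-degree base FOV is purely illustrative, and this is the general technique, not necessarily how Widescreen Fixer is implemented internally.

```python
import math

def horplus_hfov(base_hfov_deg, base_aspect, target_aspect):
    """Hor+ scaling: hold the vertical FOV constant and widen the
    horizontal FOV to suit the new, wider aspect ratio."""
    base_hfov = math.radians(base_hfov_deg)
    # Vertical FOV implied by the base horizontal FOV at the base aspect ratio
    vfov = 2 * math.atan(math.tan(base_hfov / 2) / base_aspect)
    # Horizontal FOV that preserves that vertical FOV at the target aspect ratio
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * target_aspect))

# Illustrative numbers: a game tuned for a 65-degree horizontal FOV on one
# 16:10 panel (2560 x 1600), corrected for three such panels side by side.
single = horplus_hfov(65, 2560 / 1600, 2560 / 1600)   # 65.0 degrees
triple = horplus_hfov(65, 2560 / 1600, 7680 / 1600)   # ~124.8 degrees
print(f"16:10 hFOV: {single:.1f} deg -> 48:10 hFOV: {triple:.1f} deg")
```

That much wider horizontal FOV is exactly why the corrected MW2 shot further down shows so much more of the scene.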

Select your game and desired aspect ratio, then just leave it running in the background while you play. Hitting semicolon enables/disables the aspect ratio correction, resulting in a totally playable, much less vomit-inducing gaming experience. Below we have MW2 with the correct aspect ratio/FOV:


[Screenshot: MW2 with corrected aspect ratio/FOV at 7680 x 1600]

Take note of how much more you can see, as well as how normal the characters now look. Unfortunately, Widescreen Fixer only supports 11 games as of today, and four of them are Call of Duty titles:

Battlefield 2
Battlefield 2142
BioShock
Call of Duty 2
Call of Duty 4: Modern Warfare
Call of Duty: World at War
Call of Duty: Modern Warfare 2
Darkest of Days Demo
SEGA Rally Revo
Unreal Tournament
Wolfenstein

There's absolutely no reason for ATI not to have done this on its own. There's a donate link on the Widescreen Fixer website; the right thing for ATI to do would be to pay this developer for his work. He's doing the job ATI's Eyefinity engineers should have done from day one. Kudos to him, shame on ATI.

Performance with 3 x 30" displays really varies depending on the game. I ran through a portion of the MW2 single player campaign and saw an average frame rate of 30.9 fps, with a minimum of 10 fps and a maximum of 50 fps. It was mostly playable on the Radeon HD 5970 without AA enabled, but not buttery smooth. Turning on 4X AA made Modern Warfare 2 crash, in case you were wondering. The fact that a single card is even remotely capable of delivering a good experience at 7680 x 1600 is impressive.
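
To put those frame rates in perspective, here's a quick Python conversion of the numbers above into per-frame render times and raw pixel throughput at 7680 x 1600 (just arithmetic on the figures quoted, nothing measured separately):

```python
# Convert the MW2 frame rates quoted above into frame times and raw pixel
# throughput at the 7680 x 1600 Eyefinity resolution.
res_x, res_y = 7680, 1600
megapixels = res_x * res_y / 1e6              # ~12.3 MP per frame

for label, fps in (("average", 30.9), ("minimum", 10.0), ("maximum", 50.0)):
    frame_time_ms = 1000 / fps                # time budget per frame
    throughput = megapixels * fps             # megapixels rendered per second
    print(f"{label:>7}: {fps:4.1f} fps -> {frame_time_ms:5.1f} ms/frame, "
          f"~{throughput:4.0f} MP/s")
```

The 10 fps minimum works out to 100 ms per frame, which is exactly why the experience is mostly playable rather than buttery smooth.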

I also noticed a strange issue where the video wouldn't sync after any soft reboot; I had to shut down the entire system and turn it back on to see anything on screen after a restart.

With corrected aspect ratios/FOV, gaming is ridiculous on such a wide setup. You really end up using your peripheral vision for what it was intended. The experience, even in an FPS, is much more immersive. That said, I stand by my original take on Eyefinity: the most engulfing gameplay comes when you find yourself running through an open field, not in games built around close-quarters action.

Just for fun, I decided to hook my entire 3 x 30" Eyefinity setup up to a power meter and see how much power three 30" displays and a Core i7 system with a single Radeon HD 5970 would consume.

Under load while playing Modern Warfare 2, the entire system used only 517W, which brings us to the next issue with Eyefinity on the 5970: most games will only use one of the two GPUs. Enabling Eyefinity with CrossFire requires a ridiculous amount of data to be sent between the GPUs thanks to the ultra high resolutions involved, and that isn't quite so easy given some design tradeoffs made with Cypress (more on this in an upcoming article). Currently, only a limited number of titles will run on the 5970 in dual-GPU mode with Eyefinity. Dual-card owners (e.g. 5870 CF) are out of luck; the current drivers do not allow CrossFire and Eyefinity to work together. This will eventually get fixed, but it's going to take some time. With a second GPU running, you can expect total system power consumption, including displays, to easily break 600W for the setup I used here.
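
To get a feel for why the resolution makes dual-GPU Eyefinity painful, here's a back-of-the-envelope Python sketch. It assumes the usual alternate frame rendering split and a 32-bit final framebuffer; both are illustrative assumptions rather than figures from AMD.

```python
# Rough estimate of completed-frame traffic from the second GPU to the GPU
# that actually drives the displays, under alternate frame rendering (AFR).
# The 32-bit colour depth and frame rate targets are assumptions.
res_x, res_y = 7680, 1600
bytes_per_pixel = 4                                    # 32-bit colour, assumed
frame_mib = res_x * res_y * bytes_per_pixel / 2**20    # ~46.9 MiB per frame

for total_fps in (30, 60):
    # In AFR the second GPU renders every other frame, so it ships half of
    # the displayed frames across to the primary GPU.
    traffic_gib_s = frame_mib * (total_fps / 2) / 1024
    print(f"{frame_mib:.1f} MiB/frame at {total_fps} fps total -> "
          f"~{traffic_gib_s:.2f} GiB/s of inter-GPU frame traffic")
```

Even this simplified view, which ignores intermediate render targets entirely, lands at well over a gigabyte per second at 60 fps.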

The difficulties most games have with such a wide setup keep 3 x 30" Eyefinity (or any triple-monitor configuration, for that matter) from being a real recommendation. If you already have three monitors, sure, why not, but I don't believe it's anywhere near practical, at least not until ATI gets a lot of the software compatibility issues taken care of.

Comments

  • Paladin1211 - Saturday, November 21, 2009 - link

    To be precise, anything above the monitor refresh rate is not going to be recognizable. Mine maxed out at 60Hz 1920x1200. Correct me if I'm wrong.

    Thanks :)
  • noquarter - Saturday, November 21, 2009 - link

    If you read AnandTech's 'Triple Buffering: Why We Love It' article, there is a very slight advantage at more than 60fps even though the display is only running at ~60Hz. If the GPU finishes rendering a frame immediately after the display refresh then that frame will be 16ms stale by the time the display shows it as it won't have the next one ready in time. If someone started coming around the corner while that frame is stale it'd be 32ms (stale frame then fresh frame) before the first indicator showed up. This is simplified as with v-sync off you'll just get torn frames but the idea is still there.

    To me, it's not a big deal, but if you're looking at a person with quick reaction speed of 180ms, 32ms of waiting for frames to catch up could be significant I guess. If you increase the fps past 60 you're more likely to have a fresh frame rendered right before each display refresh.
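
For what it's worth, the arithmetic noquarter describes can be sketched out in a few lines of Python; this assumes an idealised 60 Hz display and ignores tearing and triple buffering:

```python
# Worst case for what's on screen: an event happens just after a frame was
# rendered (so it misses that frame by up to one render interval), and the
# next frame that contains it still has to wait for a display refresh.
refresh_hz = 60
refresh_ms = 1000 / refresh_hz                 # ~16.7 ms between refreshes

for fps in (30, 60, 120, 300):
    render_ms = 1000 / fps                     # time between rendered frames
    worst_case_ms = render_ms + refresh_ms     # missed frame + refresh wait
    print(f"{fps:3d} fps on a 60 Hz panel: up to ~{worst_case_ms:.0f} ms "
          f"before the event shows up")
```

At 60 fps that worst case is roughly the 32 ms figure mentioned above; pushing the frame rate higher only shrinks the first term.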
  • T2k - Friday, November 20, 2009 - link

    Seriously: is he no more...? :D
  • XeroG1 - Thursday, November 19, 2009 - link

    OK, so seriously, did you really take a $600 video card and benchmark Crysis Warhead without turning it all the way up? The chart says "Gamer Quality + Enthusiast Shaders". I'm wondering if that's really how you guys benchmarked it, or if the chart is just off. But if not, the claim "Crysis hasn’t quite fallen yet, but it’s very close" seems a little odd, given that you still don't have all the settings turned all the way up.

    Incidentally, I'm running a GeForce 9800 GTX (not plus) and a Core2Duo E8550, and I play Warhead at all settings enthusiast, no AA, at 1600x900. At those settings, it's playable for me. People constantly complain about performance on that title, but really if you just turn down the resolution, it scales pretty well and still looks better than anything else on the market IMHO.
  • XeroG1 - Thursday, November 19, 2009 - link

    Er, oops - that was supposed to say "E8500", not "E8550", since there is no 8550.
  • mapesdhs - Thursday, November 19, 2009 - link


    Carnildo writes:
    > ... I was the administrator for a CAVE system. ...

    Ditto! :D


    > ... ported a number of 3D shooters to the platform. You haven't
    > lived until you've seen a life-sized opponent come around the
    > corner and start blasting away at you.

    Indeed, Quake2 is amazing in a CAVE, especially with both the player
    and the gun separately motion tracking - crouch behind a wall and be
    able to stick your arm up to fire over the wall - awesome! But more
    than anything as you say, it's the 3D effect which makes the experience.

    As for surround-vision in general... Eyefinity? Ha! THIS is what
    you want:

    http://www.sgidepot.co.uk/misc/lockheed_cave.jpg

    270 degree wraparound, 6-channel CAVE (Lockheed flight sim).

    I have an SGI VHS demo of it somewhere, must dig it out sometime.


    Oh, YouTube has some movies of people playing Quake2 in CAVE
    systems. The only movie I have of me in the CAVE I ran was
    a piece taken of my using COVISE visualisation software:

    http://www.sgidepot.co.uk/misc/iancovise.avi

    Naturally, filming a CAVE in this way merely shows a double-image.


    Re people commenting on GPU power now exceeding the demands for
    a single display...

    What I've long wanted to see in games is proper modelling of
    volumetric effects such as water, snow, ice, fire, mud, rain, etc.
    Couldn't all this excess GPU power be channeled into ways of better
    representing such things? It would be so cool to be able to have
    genuinely new effects in games such as naturally flowing lava, or
    an avalanche, or a flood, tidal wave, storm, landslide, etc. By this
    I mean it being done so that how the substance behaves is governed
    by the environment in a natural way (physics), not hard coded. So far,
    anything like this is just simulated - objects involved are not
    physically modelled and don't interact in any real way. Rain is
    a good example - it never accumulates, flows, etc. Snow has weight,
    flowing water can make things move, knock you over, etc.

    One other thing occurs to me: perhaps we're approaching a point
    where a single CPU is just not enough to handle what is now possible
    at the top-end of gaming. To move them beyond just having ever higher
    resolutions, maybe one CPU with more & more cores isn't going to
    work that well. Could there ever be a market for high-end PC
    gaming with 2-socket mbds? I do not mean XEON mbds as used for
    servers though. Just thoughts...

    Ian.

  • gorgid - Thursday, November 19, 2009 - link

    WITH THEIR CARDS ASUS PROVIDES THE SOFTWARE WHERE YOU CAN ADJUST CORE AND MEMORY VOLTAGES. YOU CAN ADJUST CORE VOLTAGE UP TO 1.4V

    READ THAT:
    http://www.xtremesystems.org/forums/showthread.php...

    I ORDERED ONE FROM HERE:

    http://www.provantage.com/asus-eah5970g2dis2gd5a~7...


  • K1rkl4nd - Wednesday, November 18, 2009 - link

    Am I the only one waiting for TI to come out with a 3x3 grid of 1080p DLPs? You'd think if they can wedge ~2.2 million mini-mirrors on a chip, they should be able to scale that up to a native 5760x3240. Then they could buddy up with Dell and sell it as an Alienware premium package of display + computer capable of using it.
  • skrewler2 - Wednesday, November 18, 2009 - link

    When can we see benchmarks of 2x 5970 in CF?
  • Mr Perfect - Wednesday, November 18, 2009 - link

    "This means that it’s not just a bit quieter to sound meters, but it really comes across that way to human ears too"

    Have you considered using the dBA filter rather than just raw dB? dBA is weighted to measure the tones that the human ear is most sensitive to, so noise-oriented sites like SPCR use dBA instead.
