Power Consumption

AMD did list a slight increase in power consumption for the 5870 Eyefinity 6. In real-world usage it amounts to a 6-7W increase both at idle and under load. Hardly anything to be concerned about.

It's worth mentioning that these power numbers were obtained in a benchmark that showed no real advantage to the extra 1GB of frame buffer. It is possible that under a more memory-intensive workload (say, driving 6 displays) the 5870 E6 would draw much more power than a hypothetical 6-display 1GB 5870.

Power Consumption Comparison (Total System Power)

                         Radeon HD 5870 1GB    Radeon HD 5870 E6 2GB
Idle                     179.1W                186.0W
Load (Crysis Warhead)    290.0W                296.0W

If you are power conscious, however, an Eyefinity 6 setup may not be right for you. Our six 22" Dell displays consumed 114W by themselves while playing Crysis. That's the power consumption of an entire Core i5 system under load, just for your displays!
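To put those numbers in perspective, here's a quick back-of-envelope sketch of what the measured deltas work out to over a year. The wattages are the measurements from above; the electricity rate and daily usage are assumptions for illustration.

```python
# Back-of-envelope yearly energy cost. Wattages are measured values from
# the table above; the rate and hours of use are assumptions.
RATE_PER_KWH = 0.12    # assumed electricity rate, $/kWh
HOURS_PER_DAY = 8      # assumed daily usage

def yearly_cost(watts):
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * RATE_PER_KWH

delta_card = 296.0 - 290.0   # E6 vs. 1GB 5870 under load
displays = 114.0             # six 22" Dell panels while playing Crysis

print(f"Extra cost of the E6 card: ${yearly_cost(delta_card):.2f}/yr")
print(f"Cost of the six displays:  ${yearly_cost(displays):.2f}/yr")
```

At those assumptions the card's extra draw costs about $2 a year; the displays are the real expense.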

Final Words

I spoke with Carrell Killebrew a few days ago (no, not for the RV970 story) and our conversation drifted over to future applications for GPUs. When Carrell first introduced me to Eyefinity he told me that this was the first step towards enabling a Holodeck-like environment in about 6 years. 

Carrell envisions a world where, when you want to watch a football game with your friends or just hang out and play video games, you'll do so in a virtual room inside your home. You'll have a display occupying the majority, if not all, of your vision. On it will be fully rendered, lifelike models of your friends, which you can interact with in real time. After all, sending model data requires far less bandwidth than exchanging high-resolution encoded video between a dozen people in a room.
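A rough comparison shows why model data wins. Every figure below (skeleton size, update rate, video bitrate) is an illustrative assumption, not anything AMD specified:

```python
# Rough bandwidth comparison: animated model data vs. encoded video.
# All figures are illustrative assumptions.
JOINTS = 60              # assumed skeleton size per avatar
BYTES_PER_JOINT = 16     # quaternion rotation: 4 x 32-bit floats
UPDATE_HZ = 60           # pose updates per second
PEOPLE = 12              # "a dozen people in a room"

pose_bps = JOINTS * BYTES_PER_JOINT * 8 * UPDATE_HZ * PEOPLE
video_bps = 8_000_000 * PEOPLE   # assumed ~8 Mbps per 1080p H.264 stream

print(f"Pose data for {PEOPLE} avatars: {pose_bps / 1e6:.2f} Mbps")  # ~5.5 Mbps
print(f"Video for {PEOPLE} streams:     {video_bps / 1e6:.0f} Mbps")  # 96 Mbps
```

Even with generous pose data, the model approach comes in more than an order of magnitude under the video streams.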

Sound will have to be calculated on a per-person basis. Existing surround sound setups work well for a single user, but not for multiple people spread out across a virtual room. The GPU will not only have the task of rendering the characters in the room, but also of calculating phase- and position-accurate sound for everyone.
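To make that concrete, here's a minimal sketch of the per-listener math: distance-based arrival delay for phase and inverse-square attenuation for loudness. A real pipeline would add head-related transfer functions and reflections on top, but this is the computation that has to be repeated for every listener:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def render_source(source_pos, listener_positions):
    """Compute the per-listener delay and gain for one sound source.

    A toy model: propagation delay gives the phase, inverse-square
    falloff gives the loudness. Repeat for every source and listener.
    """
    results = []
    for pos in listener_positions:
        dist = math.dist(source_pos, pos)
        delay_s = dist / SPEED_OF_SOUND       # arrival time / phase
        gain = 1.0 / max(dist, 1.0) ** 2      # clamped to avoid blow-up
        results.append((delay_s, gain))
    return results

# Example: one source, three listeners spread around a virtual room.
listeners = [(0.0, 0.0, 0.0), (2.0, 0.0, 1.0), (-3.0, 0.0, 2.0)]
for delay, gain in render_source((1.0, 1.7, 4.0), listeners):
    print(f"delay = {delay * 1000:.1f} ms, gain = {gain:.3f}")
```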

Today we play games like Rock Band or Guitar Hero facing a screen. In Carrell's world, 6 years from now we'll be facing a crowd of fans and it'll look, feel, and sound like we're on stage performing. There's a lot that has to be solved between now and then, but in Carrell's eyes this is the beginning. And like most beginnings, this one has its rough patches.

The good news is that a single Radeon HD 5870 Eyefinity 6 Edition card can drive a total of six displays. That's something that we couldn't have imagined from a consumer card even just a couple of years ago. If you've ever found yourself wanting 6 monitors for a particular application, workload or even game - this is your solution. 

As a general gaming card, however, the E6 has definite issues. In existing titles, with 3 or fewer screens, we just didn't see a tremendous performance advantage to the 5870 E6. The larger frame buffer did help raise minimum frame rates, but not enough to positively impact the average frame rates in our tests. Even in triple-display setups we didn't see any reason to get the E6 card.

If you are looking to make the jump to six displays, however, the issues stop being with the card itself and become about what you want to do with the setup. Having two 3x1 groups makes sense. It's a bit pricey, but it works if you like mixing work and pleasure on your desktop. The single 3x2 group is the problematic configuration. For games you play in the third person it's great; for first person shooters, however, playing on an Eyefinity 6 setup puts you at a disadvantage due to the crosshair problem. What AMD really needs to do here is enable a 5x1 configuration for folks serious about FPSes.
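The crosshair problem falls straight out of the geometry: in a 3x2 grid the center of the combined surface, where a shooter places its crosshair, lands on the horizontal bezel between the two rows, while any odd-by-one layout puts it in the middle of the center panel. A quick sketch, assuming 1920x1080 panels and ignoring bezel width:

```python
# Where does the screen center (the crosshair) land for a given grid?
# Panel resolution is an assumption for illustration; bezels are ignored.
PANEL_W, PANEL_H = 1920, 1080

def center_on_seam(cols, rows):
    cx, cy = cols * PANEL_W / 2, rows * PANEL_H / 2
    # The center sits on a seam whenever it falls on a panel boundary.
    return cx % PANEL_W == 0, cy % PANEL_H == 0

for cols, rows in [(3, 2), (3, 1), (5, 1)]:
    on_v, on_h = center_on_seam(cols, rows)
    where = "on a bezel" if (on_v or on_h) else "mid-panel"
    print(f"{cols}x{rows}: crosshair is {where}")
```

Any even number of rows or columns puts the crosshair on a seam, which is exactly why a 5x1 group would suit FPS players better.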

The bigger problem is simply the state of game support for Eyefinity. Many titles, even new ones shipping today, have gross incompatibilities with Eyefinity setups. AMD is hard at work on making this better, but it means you can't plop down $1500 for six monitors and two stands, drop another $900 on a pair of video cards, and have it work perfectly in everything you'd ever want to play. It's a tough pill to swallow.

If you want an immersive gaming experience and you've got the wall space, you're better off buying a 720p projector, building a screen (or painting one on the wall) and calling it a day. On the other hand, if you just need more desktop resolution then a 30" monitor is probably in your future. If you must combine the two needs and have them serviced by a single setup, that's where Eyefinity 6 can offer some value.

Comments

  • frenchfrog - Wednesday, March 31, 2010 - link

    It would be so nice:

    -3 monitors for left-center-right views
    -1 monitor for rear view
    -2 monitors for gauges/GPS/map/flight controls
  • vol7ron - Wednesday, March 31, 2010 - link

    I'm not sure why a "wall" was created. Your problem with FOV is that you have too much of a 2D setup, rather than an easier-to-view 3D one.

    Suggestion: 3 stands.

    Center the middle pair to your seat.
    Adjust the right and left pair so they're at a 15-25 degree slant, as if you were forming a hexadecagon (a 16-sided polygon @ 22.5 degrees).

    vol7ron
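vol7ron's geometry works out neatly in numbers; a quick sketch, assuming exactly one hexadecagon facet (360/16 = 22.5 degrees) of toe-in per outer pair:

```python
# Three pairs of panels arranged as facets of a hexadecagon.
# The 22.5-degree facet angle comes from the comment above.
FACET_DEG = 360 / 16   # 22.5 degrees per facet

pairs = {
    "left pair":   -FACET_DEG,   # toed in toward the viewer
    "center pair":  0.0,         # square to the seat
    "right pair":  +FACET_DEG,
}
for name, yaw in pairs.items():
    print(f"{name:12s} yaw = {yaw:+.1f} degrees")
```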
  • cubeli - Wednesday, March 31, 2010 - link

    I cannot print your reviews anymore. Any help would be greatly appreciated!
  • WarlordSmoke - Wednesday, March 31, 2010 - link

    I still don't understand the point of this card by itself, as a single card (no CF).

    It's too expensive and too gaming-oriented to be used in the workplace where, as someone else already mentioned, there have been cheaper and more effective solutions for multi-display setups for years.
    It's too weak to drive the 6 displays it's designed for when gaming. Crysis (I know it's not a great example of an optimized engine, but give me a break here) is a 3-year-old game and it isn't playable at < 25fps, and I can't imagine the next generation of games around the corner being more forgiving.

    My point is, why build a card to drive 6 displays when you could have 2 cards that can drive 3 displays each and be more effective for gaming. I know this isn't currently possible, but that's my point, it should be, it's the next logical step.

    Instead of having 2 cards in crossfire, where only one card has display output and the other just tags along as extra horsepower, why not use the cards in parallel: split the scene in two and use two framebuffers (one card with the upper 3 screens and the other card with the lower 3 screens), practically making crossfire redundant (or just using it for synchronizing the rendering).

    This should be more efficient on so many levels. First, the obvious: half the screens => half the area to render => better performance. Second, if the scene is split in two, each card could load different textures, so less memory should be wasted than in crossfire mode where all cards need to load the same textures.
    I'm probably not taking the synchronization issues that could appear between the cards seriously enough, but they should be less obvious between distinct rows of displays, especially if they have bezels.

    Anyway, this idea of 2 cards with 3 screens each would have been beneficial both to ATI (sales of more cards) and to gamers: buy a card and three screens now, and maybe later, if you can afford it, buy another card and another three screens. Not to mention that ATI has several distinct models of cards that support 3 displays, so they could have made 6-display setups possible even for lower budgets.

    To keep a long story short(er), I believe ATI should have worked to make this possible in their driver and scrapped this niche 6-display card idea from the start.
  • Bigginz - Wednesday, March 31, 2010 - link

    I have an idea for the monitor manufacturers (Samsung). Just bolt a magnifying glass to the front of the monitor that is the same width and height (bezel included). I vaguely remember some products similar to this for the Nintendo Game Boy & DS.

    Dell came out with their Crystal LCD monitor at CES 2008. Just replace the tempered glass with a magnifying glass and your bezel problem is fixed.
    http://hothardware.com/News/Dell_Crystal_LCD_Monit...
  • Calin - Thursday, April 1, 2010 - link

    A magnifying glass for such a large surface would be thick and heavy (and probably prone to cracking), and "thin" variations have image artefacts. I've seen a magnifying "glass" usable as a bookmark, and the image was good, but it definitely had issues.
  • imaheadcase - Wednesday, March 31, 2010 - link

    As much R&D as they invested in this, it seems better to put it towards making their own monitors that don't have bezels. The extra black line is a major downside to this card.

    ATI monitors + video setup would be ideal. After all, when you are going to drop $1500 plus a video card setup, what is a little more in price for streamlined monitors?
  • yacoub - Wednesday, March 31, 2010 - link

    "the combined thickness of two bezels was annoying when actually using the system"

    Absolutely!
  • CarrellK - Thursday, April 1, 2010 - link

    There are a fair number of comments to the effect of "Why did ATI/AMD build the Six? They could have spent their money better elsewhere..." To those who made those posts, I respectfully suggest that your thoughts are too near-term, that you look a bit further into the future.

    The answers are:

    (1) To showcase the technology. We wanted to make the point that the world is changing. Three displays wasn't enough to make that point, four was obvious but still not enough. Six was non-obvious and definitely made the point that the world is changing.

    (2) To stimulate thinking about the future of gaming, all applications, how interfaces *will* change, how operating systems *will* change, and how computing itself is about to change and change dramatically. Think Holodeck folks. Seriously.

    (3) We wanted a learning vehicle for ourselves as well as everyone else.

    (4) And probably the biggest reason of all: BECAUSE WE THOUGHT IT WOULD BE FUN. Not just for ourselves, but for those souls who want to play around and experiment at the edges of the possible. You never know what you don't know, and finding that out is a lot of fun.

    Almost every day I tell myself and anyone who'll listen: If you didn't have fun at work today, maybe it is time to do something else. Go have some fun folks.
  • Anand Lal Shimpi - Thursday, April 1, 2010 - link

    Thanks for posting Carrell :) I agree with the having fun part, if that's a motivation then by all means go for it!

    Take care,
    Anand
