Triple Display Performance

In its initial press presentations on the Radeon HD 5870 Eyefinity 6 Edition, AMD argued that the larger frame buffer would come in handy with triple-display configurations. To test this theory, I ran through our suite with three 1080p displays at a combined resolution of 5760 x 1080.

For most games I had to disable AA at this resolution; it's simply too much for a single card to drive. Even at high detail settings, I saw no real advantage to the 2GB 5870 E6 over the standard 1GB 5870.
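
To put that resolution in perspective, here is a quick back-of-the-envelope sketch in Python of the per-frame pixel load (simple arithmetic for illustration, not benchmark data):

    # Per-frame pixel load: one 1080p panel vs. three side by side (5760 x 1080).
    single = 1920 * 1080          # 2,073,600 pixels per frame
    triple = (3 * 1920) * 1080    # 6,220,800 pixels per frame
    print(f"single 1080p: {single:,} pixels")
    print(f"5760 x 1080:  {triple:,} pixels ({triple / single:.0f}x the shading work)")

Three times the pixels per frame means roughly three times the shading and bandwidth demand before AA multiplies it further, which is why a single card struggles here.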

 

[Benchmark charts: Crysis Warhead, HAWX, Left 4 Dead, Battlefield: Bad Company 2 (Chase and Waterfall benches), S.T.A.L.K.E.R.: Call of Pripyat, and DiRT 2, all at 5760 x 1080 across three displays]

Comments

  • frenchfrog - Wednesday, March 31, 2010 - link

    It would be so nice:

    -3 monitors for left-center-right views
    -1 monitor for rear view
    -2 monitors for gauges/GPS/map/flight controls
  • vol7ron - Wednesday, March 31, 2010 - link

    I'm not sure why a "wall" was created. Your problem with FOV is that you have too much of a 2D setup rather than an easier-to-view 3D one.

    Suggestion: 3 stands.

    Center the middle pair on your seat.
    Adjust the right and left pairs so they're at a 15-25 degree slant, as if you were forming a hexadecagon (a 16-sided polygon turns 360°/16 = 22.5° per side).

    vol7ron
  • cubeli - Wednesday, March 31, 2010 - link

    I cannot print your reviews anymore. Any help would be greatly appreciated!
  • WarlordSmoke - Wednesday, March 31, 2010 - link

    I still don't understand the point of this card by itself, as a single card (no CF).

    It's too expensive and too gaming-oriented to be used in the workplace where, as someone else already mentioned, there have been cheaper and more effective solutions for multi-display setups for years.
    It's too weak to drive the 6 displays it's designed for when gaming. Crysis (I know it's not a great example of an optimized engine, but give me a break here), a 3-year-old game, isn't playable at < 25fps, and I can't imagine the next generation of games around the corner being more forgiving.

    My point is, why build a card to drive 6 displays when you could have 2 cards that drive 3 displays each and be more effective for gaming? I know this isn't currently possible, but that's my point: it should be; it's the next logical step.

    Instead of having 2 cards in crossfire, where only one card has display output and the other just tags along as extra horsepower, why not use the cards in parallel: split the scene in two and use two framebuffers (one card driving the upper 3 screens and the other the lower 3), making crossfire practically redundant (or just using it to synchronize the rendering)?

    This should be more efficient on so many levels. First, the obvious: half the screens => half the area to render => better performance. Second, if the scene is split in two, each card could load different textures, so less memory should be wasted than in crossfire mode, where all cards need to load the same textures.
    I'm probably not taking the synchronization issues that could appear between the cards seriously enough, but they should be less obvious when they fall between distinct rows of displays, especially ones with bezels.

    Anyway, this idea of 2 cards with 3 screens each would have been beneficial both to ATI (sales of more cards) and to gamers: buy a card and three screens now, and maybe later, if you can afford it, buy another card and another three screens. Not to mention that ATI has several distinct models of cards that support 3 displays, so they could have made 6-display setups possible even on lower budgets.

    To keep a long story short(er), I believe ATI should have worked to make this possible in their drivers and scrapped this niche 6-display card idea from the start.
  • Bigginz - Wednesday, March 31, 2010 - link

    I have an idea for the monitor manufacturers (Samsung). Just bolt a magnifying glass of the same width and height as the monitor (bezel included) to its front. I vaguely remember some products similar to this for the Nintendo Game Boy & DS.

    Dell came out with their Crystal LCD monitor at CES 2008. Just replace the tempered glass with a magnifying glass and your bezel problem is fixed.
    http://hothardware.com/News/Dell_Crystal_LCD_Monit...
  • Calin - Thursday, April 1, 2010 - link

    A magnifying glass for such a large surface would be thick and heavy (and probably prone to cracking), and "thin" variations have image artefacts (I've seen a magnifying "glass" usable as a bookmark, and the image was good, but it definitely had issues).
  • imaheadcase - Wednesday, March 31, 2010 - link

    As much R&D as they invested in this, it seems better to use it towards making their own monitors that don't have bezels. The extra black line between displays is a major downside to this card.

    ATI monitors + video card setup would be ideal. After all, when you are going to drop $1500+ on a video card setup, what is a little more in price for streamlined monitors?
  • yacoub - Wednesday, March 31, 2010 - link

    "the combined thickness of two bezels was annoying when actually using the system"

    Absolutely!
  • CarrellK - Thursday, April 1, 2010 - link

    There are a fair number of comments to the effect of "Why did ATI/AMD build the Six? They could have spent their money better elsewhere..." To those who made those posts, I respectfully suggest that your thoughts are too near-term, that you look a bit further into the future.

    The answers are:

    (1) To showcase the technology. We wanted to make the point that the world is changing. Three displays wasn't enough to make that point, four was obvious but still not enough. Six was non-obvious and definitely made the point that the world is changing.

    (2) To stimulate thinking about the future of gaming, all applications, how interfaces *will* change, how operating systems *will* change, and how computing itself is about to change and change dramatically. Think Holodeck folks. Seriously.

    (3) We wanted a learning vehicle for ourselves as well as everyone else.

    (4) And probably the biggest reason of all: BECAUSE WE THOUGHT IT WOULD BE FUN. Not just for ourselves, but for those souls who want to play around and experiment at the edges of the possible. You never know what you don't know, and finding that out is a lot of fun.

    Almost every day I tell myself and anyone who'll listen: If you didn't have fun at work today, maybe it is time to do something else. Go have some fun folks.
  • Anand Lal Shimpi - Thursday, April 1, 2010 - link

    Thanks for posting, Carrell :) I agree with the having-fun part; if that's a motivation then by all means go for it!

    Take care,
    Anand
