Wanna see what 24.5 million pixels looks like?

That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft, and the man crouched in front of the setup is Carrell Killebrew; his name may sound familiar.

Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters: there's a single GPU driving all of these panels. The resolution actually being rendered is 7680 x 3200; WoW got over 80 fps with the details maxed. This is the successor to the RV770. We can't talk specs, but at today's AMD press conference two details became public: 2.15 billion transistors and over 2.5 TFLOPs of performance. As expected, but nice to know regardless.
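The pixel math is easy to sanity check; here's a quick sketch using the grid size and per-panel resolution from the demo:

```python
# Six 2560 x 1600 panels arranged in a 3 x 2 grid, as in the demo
cols, rows = 3, 2
panel_w, panel_h = 2560, 1600

total_w, total_h = cols * panel_w, rows * panel_h
pixels = total_w * total_h

print(total_w, total_h)  # 7680 3200
print(pixels)            # 24576000, i.e. the ~24.5 million pixels above
```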

The technology being demonstrated here is called Eyefinity and it actually all started in notebooks.

Not Multi-Monitor, but Single Large Surface

DisplayPort is gaining popularity. It's a very simple interface and you can expect to see mini-DisplayPort on notebooks and desktops alike in the very near future. Apple was the first to embrace it but others will follow.

The OEMs asked AMD for six possible DisplayPort outputs from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook, and up to two for use via a docking station. To meet these needs AMD had to build six lanes of DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today.

Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.
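A toy way to think about the constraint described above (nothing here corresponds to a real AMD API; the numbers come from the article):

```python
# Illustrative sketch: each display engine can drive any two outputs,
# so the maximum display count scales with the number of engines on the chip.
OUTPUTS_PER_ENGINE = 2

def max_displays(engines: int) -> int:
    """Upper bound on simultaneously driven displays for a given engine count."""
    return engines * OUTPUTS_PER_ENGINE

print(max_displays(1))  # 2 - how cards work today, one engine driving two outputs
print(max_displays(3))  # 6 - enough engines for the six-panel demo
```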

At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.


AMD's software makes the displays appear as one. This works in Vista and Windows 7, as well as Linux.

The software layer makes it all seamless. The displays appear independent until you turn on SLS (Single Large Surface) mode. When it's on, they appear to Windows and its applications as one large, high-resolution display. There's no multi-monitor mess to deal with; it just works. This is the way to do multi-monitor, both for work and games.


Note the desktop resolution of the 3x2 display setup

 

I played Dirt 2, a DX11 title, at 7680 x 3200 and frame rates were definitely playable. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.


Left 4 Dead in a 3 monitor configuration, 7680 x 1600


If a game pulls its resolution list from Windows, it'll work perfectly with Eyefinity.

With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to give gamers (and others in need of a single, high resolution display) the ability to piece together a display with more resolution, and more immersion, than anything on the market today. The idea isn't to pick up six 30" displays, but perhaps to add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (apparently ideal for first person shooters, since you get a nice wrap-around effect) still nets you an 8400 x 1050 display. If you want more vertical real estate, switch to a 3x2 setup and you're at 5040 x 2100. That's more resolution for less than most high end 30" panels.
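The arithmetic is the same for any grid; a quick sketch (the function name is mine, not AMD's):

```python
def sls_resolution(cols: int, rows: int, panel=(1680, 1050)):
    """Effective resolution of a cols x rows Single Large Surface grid."""
    w, h = panel
    return cols * w, rows * h

print(sls_resolution(5, 1))                      # (8400, 1050) wrap-around FPS wall
print(sls_resolution(3, 2))                      # (5040, 2100) more vertical real estate
print(sls_resolution(3, 2, panel=(2560, 1600)))  # (7680, 3200) the demo rig
```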

Any configuration is supported, and you can even group displays together. So you could turn a set of six displays into a group of four and a group of two.

It all just seems to work, which is arguably the most impressive part of it all. AMD has partnered with at least one display manufacturer to sell displays with thinner bezels and without distracting LEDs on the front:


A render of what the Samsung Eyefinity optimized displays will look like

We can expect brackets and support from more monitor makers in the future. Building a wall of displays isn't exactly easy.

Comments

  • formulav8 - Friday, September 11, 2009 - link

    I'm not sure why you think AMD's CPUs stink. All the benches I've seen show they can run any game out there at more than 30 fps, and in games not limited by the video card they push frames to the point where the video card ends up the bottleneck. No?

    Even compared to the i5, the PII 9650 held its own quite well in a lot of areas.

    For the past few years Intel has definitely had the trashiest iGPUs and probably will for the foreseeable future. And I wouldn't count on Larrabee to change that all that much by the time it comes out. You can have the strongest CPU in the world, but if you have GPU trash like Intel's, you can't game at good resolutions and speeds anyway, so you can't make good use of the fastest CPU in the world.

    Just my humble opinion :)


    Jason

  • TA152H - Friday, September 11, 2009 - link

    I think people sometimes instinctively balance things out, and forget they are even doing it.

    Keep in mind that Phenom II processors are the same size as the Nehalems, and you're forced to compare the low end of a brain-damaged line with the highest-end AMD CPU, and AMD still generally comes out the loser. That's not a good CPU design, if you think about it.

    I don't agree with your remarks about great processors not mattering if you don't have the GPU to go with them. Stuff like that just makes you sound irrational and too pro-AMD. Not everyone needs powerful 3D graphics, but more to the point, you can get an ATI card for your Intel motherboard. So, sorry, Intel doesn't need to make a good video card for you to benefit from its CPUs.
  • Kary - Thursday, September 10, 2009 - link

    That might have looked really cool using projectors since that would get rid of your borders...REALLY EXPENSIVE, but cool :)

    Maybe on the side of a building
  • anandreader - Thursday, September 10, 2009 - link

    The Apple store on Michigan in Chicago is using something very similar to what's being shown here. If I recall correctly, they had 8 panels arranged 2 across by 4 down. They were running an animation showing all the iPhone apps that were available. I noticed the display from the other side of Michigan and was impressed enough to cross the street to see how they did it.

    The device was probably purpose-built by the LCD manufacturer, as the seams were about as wide as a single thin bezel on Samsung monitors.
  • Holly - Thursday, September 10, 2009 - link

    hmm, running that is pretty nice, especially with the frame rates given, but god save us from the electricity bills... screens of this size with good colour reproduction take at least 80 watts each, giving half a kilowatt on screens alone.

    Anyway, I'm eager to see a detailed review soon... and hopefully an NVIDIA counterpart as well... $1000+ graphics cards are only nice to see in benchmarks, so fingers crossed for fast competition...
  • teldar - Thursday, September 10, 2009 - link

    They are actually going to be $450 cards for the initial 5870s, with 2GB of 1300MHz GDDR5 on board.
    So they are probably reasonable for what they offer.
  • imaheadcase - Thursday, September 10, 2009 - link

    I've yet to see someone use a monitor setup like that for gaming. It looks terrible with the lines between monitors. Video card manufacturers have been trying that for ages; Matrox already tried and failed at it. Let it die, AMD.

    Mission control stuff, a large desktop for apps, sure, that'd be nice. But to game on? Yuck.

    Case in point: look at the WoW screenshot... the screen is cut right in half where the character is located. lol

    Now if some monitor company decides to make a monitor that lets you remove the bezel around the edges for a smooth setup, then we can talk.
  • Havor - Saturday, September 12, 2009 - link

    Apparently you've never played on a 3-monitor setup. (Yeah, a 6-monitor setup would suck.)

    I can tell you from personal experience that it's awesome.
    I have three 22" widescreens in a Matrox TripleHead2Go setup, so I have no problem with bezels in the middle of my screen.

    Yeah, of course the bezels bug me, but only about as much as the roof pillars do when I'm driving my car.
    And yeah, a 6-monitor setup just doesn't work in most games (except games like C&C); it would be like putting a band of duct tape at eye height around the car.

    If you want to see for yourself, go over to Matrox's gaming site to see some screen dumps from, for example, WoW,
    or see how it looks on YouTube.

    NFS is also awesome on three monitors.

    And with the low prices of monitors, anyone can now have a 3x 22" setup.
  • Havor - Saturday, September 12, 2009 - link

    Darn shitty quote and link system on Anandtech

    Matrox game site: http://www.matrox.com/graphics/surroundgaming/en/h...
    WoW site: http://www.matrox.com/graphics/surroundgaming/en/g...
    YouTube movies: http://www.youtube.com/results?search_query=triple...
  • Zingam - Friday, September 11, 2009 - link

    What actually counts is that the graphics card is powerful enough to support the high resolution!!! And this is just the first-generation DX11 card. I doubt we'll see DX12 anytime soon, so I guess the third-generation DX11 cards will rock the earth and break down walls!
