Wanna see what 24.5 million pixels looks like?

That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft, and the man crouched in front of the setup is Carrell Killebrew; his name may sound familiar.

Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters: there's a single GPU driving all of these panels. The actual rendered resolution is 7680 x 3200; WoW ran at over 80 fps with the details maxed. This is the successor to the RV770. We can't talk specs, but at today's AMD press conference two details became public: 2.15 billion transistors and over 2.5 TFLOPS of performance. As expected, but nice to know regardless.
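The headline pixel count is easy to sanity check; this is pure arithmetic from the 3x2 grid of 2560 x 1600 panels described above, nothing else assumed:

```python
# Six 2560 x 1600 panels arranged in a 3-wide by 2-tall grid.
panel_w, panel_h = 2560, 1600
cols, rows = 3, 2

surface_w = panel_w * cols        # 7680
surface_h = panel_h * rows        # 3200
total_pixels = surface_w * surface_h

print(f"{surface_w} x {surface_h} = {total_pixels:,} pixels")
```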

The technology being demonstrated here is called Eyefinity and it actually all started in notebooks.

Not Multi-Monitor, but Single Large Surface

DisplayPort is gaining popularity. It's a very simple interface and you can expect to see mini-DisplayPort on notebooks and desktops alike in the very near future. Apple was the first to embrace it but others will follow.

The OEMs asked AMD for six possible DisplayPort outputs from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook, and up to two for use via a docking station. To meet these needs AMD had to build six lanes of DisplayPort output into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today.

Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.

At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.


AMD's software makes the displays appear as one. This will work in Vista, Windows 7, and Linux.

The software layer makes it all seamless. The displays appear independent until you turn on SLS (Single Large Surface) mode; when enabled, they appear to Windows and its applications as one large, high-resolution display. There's no multi-monitor mess to deal with; it just works. This is the way to do multi-monitor, both for work and for games.
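To illustrate what "one large surface" means in practice, here's a hypothetical sketch (not AMD's actual implementation) of how a coordinate on the combined 7680 x 3200 surface could map back to an individual panel in the 3x2 grid:

```python
# Hypothetical mapping from a Single Large Surface pixel to a panel.
# Panel size and grid layout match the 3x2 demo setup in this article.
PANEL_W, PANEL_H = 2560, 1600
COLS, ROWS = 3, 2  # 3 wide, 2 tall -> 7680 x 3200 surface

def panel_for(x, y):
    """Return (panel index, local x, local y) for a surface pixel."""
    col, row = x // PANEL_W, y // PANEL_H
    index = row * COLS + col  # panels numbered left-to-right, top-to-bottom
    return index, x % PANEL_W, y % PANEL_H

print(panel_for(4000, 2000))  # lands on panel 4: second row, middle column
```

The point of the sketch is that applications never see the seams; the driver does this kind of routing for them.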


Note the desktop resolution of the 3x2 display setup

 

I played Dirt 2, a DX11 title, at 7680 x 3200 and the frame rates were definitely playable. I then played Left 4 Dead, and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.


Left 4 Dead in a 3 monitor configuration, 7680 x 1600


If a game pulls its resolution list from Windows, it'll work perfectly with Eyefinity.

With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead, the idea was to give gamers (and others in need of a single, high resolution display) the ability to piece together a display that offers more resolution and is more immersive than anything on the market today. The idea isn't to pick up six 30" displays, but perhaps to add a third 20" panel to your existing setup, or to buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (apparently ideal for first person shooters, since you get a nice wrap-around effect) still nets you an 8400 x 1050 display. If you want more vertical real estate, switch over to a 3x2 setup and you're at 5040 x 2100. That's more resolution for less money than most high end 30" panels.
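The effective resolutions quoted above fall straight out of the grid math; here's a quick check of the configurations mentioned (the panel sizes are the article's examples, the helper function is mine):

```python
# Effective single-surface resolution for a few example Eyefinity grids.
def surface(panel_w, panel_h, cols, rows):
    """Combined resolution of a cols x rows grid of identical panels."""
    return panel_w * cols, panel_h * rows

configs = {
    "5x1 of 1680 x 1050": surface(1680, 1050, 5, 1),  # wrap-around FPS setup
    "3x2 of 1680 x 1050": surface(1680, 1050, 3, 2),  # more vertical space
    "3x2 of 2560 x 1600": surface(2560, 1600, 3, 2),  # the six-30" demo
}
for name, (w, h) in configs.items():
    print(f"{name}: {w} x {h}")
```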

Any configuration is supported, and you can even group displays together: a set of six displays could be split into a group of four and a group of two.

It all just seems to work, which is arguably the most impressive part. AMD has partnered with at least one display manufacturer to sell displays with thinner bezels and without distracting LEDs on the front:


A render of what the Samsung Eyefinity optimized displays will look like

We can expect brackets and support from more monitor makers in the future. Building a wall of displays isn't exactly easy.

137 Comments

  • MadMan007 - Friday, September 11, 2009 - link

    Yes it's nuts and you can thank ATi for finally upping the texture units and ROPs. I think they've been the same in number, although they've gotten faster, since the x1k series!
  • Golgatha - Thursday, September 10, 2009 - link

    No, I would agree that this resolution with those kinds of framerates is just nuts.
  • skrewler2 - Thursday, September 10, 2009 - link

    Yeah, seriously, the performance is unreal. I'm wondering if the settings were really maxed out..
  • AznBoi36 - Thursday, September 10, 2009 - link

    Probably were. But look, it's a lvl 1 toon wandering the world. I wanna see the performance in a 25 man raid or simply wandering around Dalaran. Oh and the shadows better be maxed too!
  • theslug - Thursday, September 10, 2009 - link

    I could see it being practical if the bezel of each monitor wasn't visible. They would need the actual LCD panels attached to one another instead.
  • HelToupee - Thursday, September 10, 2009 - link

LCDs require control electronics around all 4 sides, making the bezel a necessity. It could easily be 1/4 the width of current monitors. I messed around with stitching the images from 3 rear-mounted projectors together. The image was seamless, but the price would be astronomical. That, and you have to have a VERY good screen to project onto, or all your wonderful resolution gets muddied.
  • USRFobiwan - Friday, September 11, 2009 - link

    How about the Samsung 460UTn with just 4mm bezels...
  • mczak - Friday, September 11, 2009 - link

    Or the Nec X461UN, which looks very similar (btw you don't need the 460UTn, the 460UT would do as there's no use for the built-in PC in this scenario)
    Those are really expensive (>5000 USD), are huge and low-res (1366x768). That's really for big video walls, not suitable for some monster gaming setup. But really, it shouldn't be much of a problem manufacturing 24 inch or so tfts with similar slim bezels. There just hasn't been a market for this up to now...
  • snakeoil - Thursday, September 10, 2009 - link

    wow this is spectacular.
    intel is in big trouble because intel graphics are pretty much garbage
    while amd's graphics are real gems.
  • TA152H - Thursday, September 10, 2009 - link

    You make a good point, but the other side of the coin is also true - Intel processors are very strong, and AMD processors suck by comparison.

    It's a pity ATI stopped making chipsets for Intel motherboards. They'd make money, Intel would still sell processors, and the only real loser would be NVIDIA. It's surprising how many chipsets they sell. I don't know many people who would buy NVIDIA chipsets, like most people, but it seems they sell them well with HP and Dell, where no one asks or knows the difference. ATI should really make chipsets for the Atom too. That would be a great combination.
