After a bit of ballyhoo and a bit more of a delay, NVIDIA is finally ready to launch their competitor to AMD’s triple-monitor Eyefinity technology: 3D Vision Surround.

As a quick refresher, in September of 2009 AMD launched their multi-monitor Eyefinity technology alongside the Radeon HD 5000 series. With Eyefinity AMD could present a Single Large Surface to games and applications, allowing them to draw to 3 monitors as if they were a single monitor. It allowed for computing and gaming at a very wide field of view approaching the limits of human vision.
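
To put some rough numbers behind that, here is a minimal sketch of the arithmetic, assuming a hypothetical setup of three 1920×1080 landscape panels; the panel dimensions and viewing distance below are illustrative assumptions, not figures from AMD or NVIDIA.

```python
import math

# Assumed example: three identical 1920x1080 panels in landscape.
# These numbers are illustrative; any common panel size works the same way.
panel_w, panel_h = 1920, 1080
panels = 3

# The "single large surface" the game sees: the widths add up, the height stays the same.
surface_w = panel_w * panels   # 5760
surface_h = panel_h            # 1080
print(f"Single large surface: {surface_w}x{surface_h} "
      f"({surface_w / surface_h:.2f}:1 aspect ratio)")

# Rough horizontal field of view for a flat row of monitors,
# assuming ~52 cm wide panels viewed from ~60 cm away (both assumed values).
panel_width_cm = 52.0
viewing_distance_cm = 60.0
total_width_cm = panel_width_cm * panels
hfov = 2 * math.degrees(math.atan((total_width_cm / 2) / viewing_distance_cm))
print(f"Approximate horizontal FOV: {hfov:.0f} degrees")
```

Angling the two side monitors inward widens the effective field of view further still, which is where the "approaching the limits of human vision" claim comes in.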

Not to be left out, NVIDIA decided to counter with their own take on the technology: 3D Vision Surround. We first learned about 3D Vision Surround at CES 2010, where NVIDIA officially announced the technology and offered both public and private demonstrations of it. At the time they had it running on both GTX 200 series cards and what would become the GTX 400 series. 3D Vision Surround was to be NVIDIA's competition to AMD's Eyefinity technology and then some: not only would NVIDIA match AMD's triple-monitor Eyefinity capabilities in the 2D space, but they would extend the concept by merging it with their 3D Vision technology for 3D Vision Surround.

NVIDIA previously told us that they had been sitting on the concept for some time with no apparent market for it; the success of Eyefinity and Matrox's TripleHead2Go finally motivated NVIDIA to move forward with the technology. The result of this delayed plan is an interesting technology that in many ways is NVIDIA's version of Eyefinity, and in other ways is entirely different. In a nutshell: it's not just 3D Eyefinity.

Today’s Launch

This morning NVIDIA is launching their 258.69 beta driver, the first public driver to offer 3D Vision Surround functionality. NVIDIA did not sample this driver to the general press ahead of time, so we will not be publishing a review today. We will have our own review in the coming weeks, as we're still working on acquiring a complete set of 120Hz LCD monitors to properly test both NVIDIA Surround (2D) and 3D Vision Surround (3D).

In lieu of that, NVIDIA has already given us a technical briefing on the technology, which for the time being allows us to answer some of the biggest questions we had going into today's launch.

 

3D Vision Feature Support
                  2-Way SLI   3-Way SLI   NVIDIA Surround   3D Vision Surround
GTX 400 Series    Yes         Yes         Yes               Yes
GTX 200 Series    Yes         No          Yes               Yes

First and foremost, as we've discussed in previous articles, NVIDIA is technically launching two different technologies today. The first is NVIDIA Surround, the name NVIDIA is giving to their Eyefinity-like 2D multi-monitor technology. The second is 3D Vision Surround, which is the infusion of 3D Vision into NVIDIA Surround. Admittedly the naming could use some work ("NVIDIA Surround" does not roll off the tongue quite like "Eyefinity"), but it's fairly straightforward in conveying which one is for 3D. For the sake of simplicity, in this article we'll be referring to the overall technology as NVIDIA Single Large Surface (NVSLS) when discussing matters that apply to both NVIDIA Surround and 3D Vision Surround.

Although NVIDIA may have been sitting on NVSLS for quite some time, the fact of the matter is that by the time they decided to launch it, they were already too far along in the design process of GF100 to do anything about it on the hardware level. Whereas AMD could make hardware changes to facilitate Eyefinity – primarily by enabling more display outputs – NVIDIA could not. This has some drawbacks and some benefits.


GTX 480: Only 2 outputs can be used at once, requiring SLI for NVSLS

In terms of drawbacks, the lack of dedicated hardware means that virtually none of NVIDIA’s cards have enough display outputs for NVSLS. With the exception of a single model of the GeForce GTX 295 that has an HDMI output on the daughter card, 2+ cards operating in SLI are required to take advantage of NVSLS. This is due to the fact that the second card’s display outputs are needed to drive the 3rd monitor. This gives NVSLS a higher setup cost than Eyefinity, which can be done for up to 6 monitors on a single card. Along those lines is NVIDIA’s other current limitation: they can only do 3 monitors right now while AMD can do 6.

However, there are also benefits to NVIDIA's software implementation. While AMD relied on hardware and limited Eyefinity to the Radeon HD 5000 series as a result, a pure software solution allows the technology to be backported to older cards. Along with the GTX 400 series, the last-generation GTX 200 series will also be gaining NVSLS capabilities today – this covers both NVIDIA Surround and 3D Vision Surround. There are a couple more limitations at the moment (3-way SLI is not supported on the GTX 200 series), but the fundamental technology is there. Furthermore, in this brute-force manner NVIDIA also tidily bypasses any reliance on DisplayPort, so unlike Eyefinity, NVSLS will work without an active DP-to-DVI adapter.

The biggest remaining question right now is whether a pure-software approach differs from AMD's hardware + software approach in terms of performance and game compatibility. NVIDIA's own internal benchmarks have an SLI GTX 480 setup beating a CrossFire Radeon HD 5870 2GB setup, but the GTX 480 is already faster than the Radeon HD 5870, so this wouldn't be wholly surprising. As for compatibility, we do know that NVIDIA is still wrestling with the issue much like AMD has been, as NVIDIA is suggesting the use of the third-party Widescreen Fixer utility to fix the aspect ratio of several games.

Comments

  • killerclick - Tuesday, June 29, 2010 - link

    Too bad they won't ever line up.
  • newparad1gm - Tuesday, June 29, 2010 - link

    Yes they will, that's exactly what bezel management is for.
  • james.jwb - Tuesday, June 29, 2010 - link

    He meant the borders of the windows on the car and the bezels of the monitors.

    He has a point, bezels do need to -- and will -- become smaller in the future. Until then there will definitely be a chunk of people willing to hold off.
  • DanNeely - Tuesday, June 29, 2010 - link

    Some of the control circuitry for an LCD has to be on the side. Backlights don't have to be, but until Joe Moron is convinced that bezels are more of a problem than thickness, they're going to continue to end up there on anything except high-end displays (edge backlighting is less even than backlights behind the panels).
  • Ninjahedge - Tuesday, June 29, 2010 - link

    I think 3 is fine. 2 is where the problem has been. When using 2 monitors, most games are designed to have your FOV centered on the screen, not off to one side or another (even in RTS games, you can "center view").

    I forget that Magic RTS, "Magic V" maybe? Trying that game was irritating with the separation of screens right there in front of you.

    What needs to be done is simple. 3 screens work, but getting 3 screens of the same size is difficult, especially with the HDTV craze (very short, wide screens). 3 square screens, or two square screens and one wide screen, would work well.

    The next is that some games need to WORK with it. You need to have the game developers take advantage of the screen. What might work even with the dual setups that many have now is a rendered screen and a menu screen.

    HELL, they were doing that 15 years ago with AutoCAD! What, they do not think you want your HUD on one monitor and the FOV on the other?

    :p

    Three screens on one card is the next step. Not many people have $1000+ to get a single 2560x1600 display for their desktop...
  • CptTripps - Tuesday, June 29, 2010 - link

    Time to get some new material; you posted the same "exact" thing at Tom's. Have you ever played on a three-monitor setup?
  • mindbomb - Tuesday, June 29, 2010 - link

    It seems completely inferior to, and much less practical than, gaming on a big-screen LCD TV.
    Just because NVIDIA and AMD are pushing it doesn't mean it's a good idea all of a sudden.
  • killerclick - Tuesday, June 29, 2010 - link

    Yeah, the first time was on 3 huge screens at the Trocadero in London in '93. It sucked back then, just as it has every time I've tried it since.
  • ghitz - Tuesday, June 29, 2010 - link

    agreed.
  • frozentundra123456 - Tuesday, June 29, 2010 - link

    Seems like cool technology. Now all you need is a dedicated power plant for the power and loads of money to buy the equipment.
    Seriously, either this or Eyefinity seems like a lot of money to spend on playing a game. But I guess if you can afford it, more power to you. Personally, this is way out of my league unless the prices come down a lot, especially considering most games coming out are sequels or console ports.
