3D Vision Surround: NVIDIA’s Eyefinity

During our meeting with NVIDIA, they were also showing off 3D Vision Surround, which was announced at their press conference at the start of CES. 3D Vision Surround is not inherently a GF100 technology, but since its release is being timed alongside the GF100 cards, we’re going to take a moment to discuss it.

If you’ve seen Matrox’s TripleHead2Go or AMD’s Eyefinity in action, then you know what 3D Vision Surround is. It’s NVIDIA’s implementation of the single large surface concept so that games (and anything else for that matter) can span multiple monitors. With it, gamers can get a more immersive view by being able to surround themselves with monitors so that the game world is projected from more than just a single point in front of them.
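To make the “single large surface” idea a bit more concrete, here is a minimal, hypothetical C++ sketch (not NVIDIA’s implementation) that uses the standard Win32 calls EnumDisplayMonitors and GetMonitorInfo to walk the attached displays and compute the bounding rectangle a spanned surface would have to cover; for example, three 1920x1080 panels side by side report a 5760x1080 surface. The actual spanning is done inside the display driver, not by application code like this, so treat it purely as an illustration of the concept.

```cpp
// Illustration only: compute the rectangle a single large surface would span
// across all attached displays, using standard Win32 monitor enumeration.
#include <windows.h>
#include <climits>
#include <cstdio>
#include <algorithm>

static BOOL CALLBACK AccumulateMonitor(HMONITOR monitor, HDC, LPRECT, LPARAM param)
{
    RECT* total = reinterpret_cast<RECT*>(param);
    MONITORINFO info = {};
    info.cbSize = sizeof(info);
    if (GetMonitorInfo(monitor, &info))
    {
        // Grow the running bounding box to include this monitor's desktop area.
        total->left   = std::min(total->left,   info.rcMonitor.left);
        total->top    = std::min(total->top,    info.rcMonitor.top);
        total->right  = std::max(total->right,  info.rcMonitor.right);
        total->bottom = std::max(total->bottom, info.rcMonitor.bottom);
    }
    return TRUE; // keep enumerating
}

int main()
{
    RECT total = { LONG_MAX, LONG_MAX, LONG_MIN, LONG_MIN };
    EnumDisplayMonitors(nullptr, nullptr, AccumulateMonitor,
                        reinterpret_cast<LPARAM>(&total));

    // e.g. three 1920x1080 panels side by side -> "5760 x 1080"
    std::printf("Spanned surface: %ld x %ld pixels\n",
                total.right - total.left, total.bottom - total.top);
    return 0;
}
```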

NVIDIA tells us that they’ve been sitting on this technology for quite some time but never saw a market for it. With the release of TripleHead2Go and Eyefinity it became apparent to them that this was no longer the case, and they dusted the technology off. Whether this is true or a sudden reaction to Eyefinity is immaterial at the moment, as it’s coming regardless.

This triple-display technology will go by two names. When it’s used on its own, NVIDIA is calling it NVIDIA Surround; when it’s used in conjunction with 3D Vision, it’s called 3D Vision Surround. Obviously NVIDIA would like you to use it with 3D Vision to get the full effect (and to require a more powerful GPU), but 3D Vision is by no means required to use it. It is, however, the key differentiator from AMD, at least until AMD’s own 3D efforts get off the ground.

Regardless of how much of this is a sudden reaction to Eyefinity on NVIDIA’s part, ultimately this is something that was added late in the design process. Unlike AMD, who designed the Evergreen family around it from the start, NVIDIA did not, and as a result they did not give GF100 the ability to drive more than 2 displays at once. The shipping GF100 cards will have the traditional 2-monitor limit, meaning that gamers will need 2 GF100 cards in SLI to drive 3+ monitors, with the second card providing the 3rd and 4th display outputs. We expect the next NVIDIA design will include the ability to drive 3+ monitors from a single GPU; for the moment this limitation precludes doing Surround on the cheap.


GTX 280 with 2 display outputs: GF100 won't be any different

As for some good news, as we stated earlier, this is not a technology inherent to GF100. NVIDIA can do it entirely in software, and as a result they will be backporting this technology to the GT200 (GTX 200 series). The drivers that get released for GF100 will allow GTX 200 cards to do Surround in the same manner: with 2 cards, you can run a single large surface across 3+ displays. We’ve seen this in action and it works, as NVIDIA was demoing a pair of GTX 285s running in NVIDIA Surround mode at their CES booth.

The big question of course is going to be what this does for performance on both the GF100 and GT200, along with compatibility. That’s something that we’re going to have to wait on the actual hardware for.

Comments

  • chizow - Monday, January 18, 2010 - link

    Looks like Nvidia G80'd the graphics market again by completely redesigning major parts of their rendering pipeline. Clearly not just a doubling of GT200, some of the changes are really geared toward the next-gen of DX11 and PhysX driven games.

    One thing I didn’t see mentioned anywhere was HD sound capabilities similar to AMD’s 5 series offerings. I’m guessing that since they didn’t mention it, it’s not going to be addressed.
  • mm2587 - Monday, January 18, 2010 - link

    for nvidia to "g80" the market again they would need parts far faster than anything amd had to offer, and to maintain that lead for several months. The story is in fact reversed. AMD has the significantly faster cards and has had them for months now. gf100 still isn't here, and the fact that nvidia isn't singing the praises of its performance up and down the streets is a sign that it's acceptable at best. (acceptable meaning faster than a 5870, a chip that's significantly smaller and cheaper to make)
  • chizow - Monday, January 18, 2010 - link

    Nah, they just have to win the generation, which they will when Fermi launches. And by "generation", I mean the 12-16 month cycles dictated by process node and microarchitecture. It was similar with G80: R580 had the crown for a few months until G80 obliterated it. Even more recently with the 4870X2 and GTX 295: AMD was first to market by a good 4 months, but Nvidia still won the generation with the GTX 295.
  • FaaR - Monday, January 18, 2010 - link

    Win schmin.

    The 295 ran extremely hot, was much MUCH more expensive to manufacture, and the performance advantage in games was negligible for the most part. No game is so demanding the 4870 X2 can't run it well.

    The geforce 285 is at least twice as expensive as a radeon 4890, its closest competitor, so how you can say Nvidia "won" this round is beyond me.

    But I suppose with fanboy glasses on you can see whatever you want to see. ;)
  • beck2448 - Monday, January 18, 2010 - link

    It's amazing to watch ATI fanboys revise history.

    The 295 smoked the competition and ran cooler and quieter. Fermi will inflict another beatdown soon enough.
  • chizow - Monday, January 18, 2010 - link

    Funny, the 295 ran no hotter (and often cooler), with a lower TDP than the 4870X2, in virtually every review that tested temps, and it was faster as well. Also, the GTX 285 didn't compete with the 4890; the 275 did, in both price and performance.

    It's obvious Nvidia won the round, as these points are historical facts based on mounds of evidence. I suppose with fanboy glasses on you can see whatever you want to see. ;)
  • Paladin1211 - Monday, January 18, 2010 - link

    Hey kid, sometimes less is more. You don't need to post that much just to say "nVidia wins, and will win again". This round AMD has won, with 2mil cards drying up the graphics market. You can't change this, and neither could nVidia.

    Just come out and buy a Fermi, which is 15-20% faster than a HD 5870, for $500-$600. You only have to wait 3 months, and can save some bucks until then. I have a HD 5850 here and I'm waiting for a Tegra 2 based smartphone, not Fermi.

  • Calin - Tuesday, January 19, 2010 - link

    Both Tegra 2 and Fermi are extraordinary products - if what NVidia says about them is true. Unfortunately, it doesn't seem like either of them is a perfect fit for the gaming desktop.
  • Calin - Monday, January 18, 2010 - link

    You don't win a generation with a very-high-end card - you win a generation with a mainstream card (as this is where most of the profits are). Also, low-end cards are very high-volume, but the profit from each unit is very small.
    You might win bragging rights with the $600, top-of-the-line, two-in-one cards, but they don't really have any market share.
  • chizow - Monday, January 18, 2010 - link

    But that's not how Nvidia's business model works, for the very reasons you stated. They know their low-end cards are very high-volume and low margin/profit and will sell regardless.

    They also know people buying in these price brackets don't know about or don't care about features like DX11 and as the 5670 review showed, such features are most likely a waste on such low-end parts to begin with (a 9800GT beats it pretty much across the board).

    The GPU market is broken up into 3 parts: high-end, performance, and mainstream. GF100 will cover high-end and the top tier of performance, with GT200 filling in the rest to compete with the lower-end 5850. Eventually the technology introduced in GF100 will diffuse down to lower-end parts in the mainstream segment, but until then, Nvidia will deliver the cutting-edge tech to those who are most interested in it and willing to pay the premium for it: high-end and performance-minded individuals.
