NVIDIA Launches 3D Vision Surround

by Ryan Smith on 6/29/2010 9:00 AM EST



  • Wayne321 - Tuesday, June 29, 2010 - link

    Great, more competition = more innovation. I'm still waiting for quality 120Hz LCDs though, for a non-3D upgrade.
  • Etern205 - Tuesday, June 29, 2010 - link

    Nvidia needs 2 video cards to get 3 screens running while ATi can do the same thing with one (minus the 3D).
  • Death666Angel - Tuesday, June 29, 2010 - link

    They can do 3D, in various formats, for example:
    Stupid nVidia bias.
  • Heatlesssun - Tuesday, June 29, 2010 - link

    While technically true, there are a LOT of caveats to this. First is the resolution. I'm running 5760x1200 and that's really a LOT of pixels to push through on only one card. I'm running 3 480s and they murder a single 5870 at this resolution. So while you need two cards, you WANT two cards at these resolutions unless you are willing to give up a LOT of eye candy.
  • B3an - Sunday, July 04, 2010 - link

    I'd also like to point out that the 1GB 5870 does not have enough RAM either for gaming on multiple displays at this kind of resolution.
    It can cope with older games of course, but on my single 2560x1600 display I can run out of RAM with my 5870s in some games made within the last 3 years, though it usually requires some level of AA. When you hit the RAM limit, though, you go from perfectly smooth into single-digit FPS.

    If you're playing at a higher res, I believe two 480s would be best overall. I know ATI has 2GB 5870s, but they are not as fast.
  • wiak - Monday, July 05, 2010 - link

    AMD can do 6 screens on 2x standard eyefinity cards, or 12 screens on 2x eyefinity6 cards :P

    Didn't AMD show a Linux-based desktop running a flight sim on 24 screens? :D

    that must have been 4x eyefinity6 cards

    Yup, here it is
  • Earballs - Tuesday, June 29, 2010 - link

    Same boat here. Give me a high resolution 120Hz IPS LCD please. I can't upgrade my display in good conscience until then.
  • james.jwb - Tuesday, June 29, 2010 - link

    Exactly the same feeling here. IPS 120Hz, hurry!
  • PPalmgren - Tuesday, June 29, 2010 - link

    Aren't IPS panels already problematic regarding input lag? I want black accuracy as much as you do, but it's not worth the atrocious input lag problems I've experienced on my panels. I have a good 24'' TN on the left now for gaming; on the right I have my expensive 24'' IPS that I'll never play a game on again.

    Considering the main function of this type of system is gaming (what else could you use 3D surround for?), I'd say fixing the input lag issues should take precedence, especially for games that require fast reaction times and accuracy.
  • PPalmgren - Tuesday, June 29, 2010 - link

    Oops... disregard that post. I'm an idiot, it's a PVA panel I have.
  • B3an - Sunday, July 04, 2010 - link

    It's not the type of panel tech that has input lag anyway; it's more to do with the electronics used to convert the signal.

    Personally I'd never use a TN panel for anything. Utter sh*te.
  • Leo V - Tuesday, June 29, 2010 - link

    Amen! Even scrolling at 60Hz gives me a headache.
  • I am as mad as hell - Wednesday, June 30, 2010 - link

    IPS and fast refresh rate needed for 3D shutter tech don't mix!

    IPS panels can't refresh fast enough without causing major ghosting issues in 3D

    Sorry to shatter your dreams for now.
  • electroju - Thursday, July 01, 2010 - link

    "Same boat here. Give me high resolution 120Hz IPS LCD please. I can't upgrade my display in good conscience until then."

    Like others have said, that will not happen because of how IPS panels work. Another panel type is ASV from Sharp. It is said to be capable of fast response times and wide viewing angles, a possible replacement for IPS. I have not checked out any ASV-type LCD panel yet. ASV panels are used mainly for TVs, but they should be usable for computers too. I guess the market has not yet turned on the light bulb when there is a better technology out there. Though Sony signed a contract with Sharp, so Sony could be using ASV panels for their new TVs.

    Call me old-fashioned, or just smart, but I prefer to use a CRT. I dislike LCDs for just about anything.
  • Aezay - Friday, July 02, 2010 - link

    I'm hoping that OLED will fix all of these problems, by having better stats than both TN and IPS/PVA.
  • cactusdog - Wednesday, June 30, 2010 - link

    YAY!! Let's spend a couple of thousand dollars on graphics cards and monitors so we can play the latest games at 30 FPS!!!!
  • Rick83 - Tuesday, June 29, 2010 - link

    I've been missing playing Lock-On in dual-screen mode (yes, it broke a few things in the tutorials), as I could back in the WinXP days...
    Luckily AMD went ahead and brought multiscreening back to Vista, and forced nVidia to follow suit...
    Only took 4 years to raise this feature from the dead -.-
  • Rick83 - Tuesday, June 29, 2010 - link

    Actually, having installed the driver (x64 on Server 2008), I cannot seem to find the relevant settings. I only have a single GTX 260 though - still, with two monitors attached it should offer me some kind of surround option?
  • Rick83 - Tuesday, June 29, 2010 - link

    So it seems this caters to the SLI crowd only - rendering across two screens (in full-screen mode) is still a big no-no.

    For shame nVidia, for shame....
  • Rick83 - Wednesday, June 30, 2010 - link

    And I got the confirmation from nvidia...

  • cknobman - Tuesday, June 29, 2010 - link

    ...if you have to spend double the cash to do the same thing (or even less) as your competitor can do.
  • beginner99 - Tuesday, June 29, 2010 - link

    Not an issue IMHO. Only enthusiasts use this, and they tend to have multi-GPU setups already. Not to mention that playing at these resolutions with high settings probably needs 2 GPUs anyway.
  • DanNeely - Tuesday, June 29, 2010 - link

    The only common 3-monitor setup that's really viable on a single GPU is 3x1280x1024, which is about the same number of pixels as a 30" 2560x1600 display. Going higher without turning off most of the eye candy isn't really practical in newer games without dual GPUs, so this is more of a problem for marketing than for the real world.
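    For what it's worth, the pixel arithmetic in that comment checks out; a quick sketch (using only the resolutions quoted above):

```python
# Total pixels: three 1280x1024 panels vs. one 30" 2560x1600 panel
triple_sxga = 3 * 1280 * 1024   # 3,932,160 pixels
single_30in = 2560 * 1600       # 4,096,000 pixels

# The triple-SXGA setup pushes almost exactly as many pixels as the 30" display
ratio = triple_sxga / single_30in
print(f"{ratio:.2%}")  # 96.00%
```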
  • Rick83 - Tuesday, June 29, 2010 - link

    There is a software solution called SoftTH which does triple-head rendering completely in software, without the need for any specific driver. It works only with DX8 and DX9 as far as I know. It will even work with cheap cards as secondary outputs, because all rendering is performed on the main card.
  • Lord 666 - Tuesday, June 29, 2010 - link

    Similar processing as a 480 using the Fermi core, but it will have different outputs?
  • DanNeely - Tuesday, June 29, 2010 - link

    I wouldn't hold my breath. The chip in the 460 is aimed at the middle range of the GPU market and was probably almost as far along as the chip in the 465/470/480. We'll almost certainly have to wait until the 5xx series chips to get a 3rd concurrent output.
  • eddyg17 - Tuesday, June 29, 2010 - link

    Wow, just wow.
    My boss and I were just talking about how cool it would be to have surround 3D gaming; enter nVidia. Can't wait to hear the reviews. Gonna have to start saving money.
  • xrolando - Tuesday, June 29, 2010 - link

    Do the glasses restrict your field of view enough to prevent you from seeing 100% of the two extra monitors? I just remember my movie theater experiences not being very pleasant with the glasses on.
  • killerclick - Tuesday, June 29, 2010 - link

    Wait, what are those lines between the screens? Oh yeah, it's the black plastic the monitors are made of. Three monitor gaming = FAIL!
  • infodan - Tuesday, June 29, 2010 - link

    Yeah, it's not realistic at all. It's not like you'd find anything like that in real-life situations that games are commonly modelled on - like driving, for example.
  • killerclick - Tuesday, June 29, 2010 - link

    Too bad they won't ever line up.
  • newparad1gm - Tuesday, June 29, 2010 - link

    Yes they will, that's exactly what bezel management is for.
  • james.jwb - Tuesday, June 29, 2010 - link

    He meant the borders of the windows on the car and the bezels of the monitors.

    He has a point, bezels do need to -- and will -- become smaller in the future. Until then there will definitely be a chunk of people willing to hold off.
  • DanNeely - Tuesday, June 29, 2010 - link

    Some of the control circuitry for an LCD has to be on the side. Backlights don't have to be, but until Joe Moron is convinced that bezels are more of a problem than thickness, that's where they're going to end up on anything except high-end displays (edge backlighting is less even than backlights placed behind the panel).
  • Ninjahedge - Tuesday, June 29, 2010 - link

    I think 3 is fine. 2 is where the problem has been. When using 2 monitors, most games are designed to have your FOV centered on the screen, not off to one side or another (even in RTS games, you can "center view").

    Trying - I forget that magic RTS, "Magic V" maybe? - trying that game was irritating with the separation of screens right there in front of you.

    What needs to be done is simple. 3 screens works, but getting 3 screens the same size is difficult, especially with the HDTV craze (very short, wide screens). 3 square screens, or two square and a wide screen, would work well.

    The next thing is that some games need to WORK with it. You need to have the game developers take advantage of the screens. What might work even with the duals that many have now is a rendered screen and a menu screen.

    HELL, they were doing that 15 years ago with AutoCAD! What, they do not think you want your HUD on one monitor and the FOV on the other?

    Three screens on one card is the next step. Not many people have $1000+ to get a single 2560x1600 for their desktop....
  • CptTripps - Tuesday, June 29, 2010 - link

    Time to get some new material, you posted the same "exact" thing at Toms. Have you ever played on a three monitor setup?
  • mindbomb - Tuesday, June 29, 2010 - link

    It seems completely inferior to, and much less practical than, gaming on a big-screen LCD TV.
    Just because nVidia and AMD are pushing it doesn't mean it's a good idea all of a sudden.
  • killerclick - Tuesday, June 29, 2010 - link

    Yeah, the first time was on 3 huge screens at the Trocadero in London in '93. It sucked back then too, just as it has every time I've tried it since.
  • ghitz - Tuesday, June 29, 2010 - link

    Agreed.
  • frozentundra123456 - Tuesday, June 29, 2010 - link

    Seems like cool technology. Now all you need is a dedicated power plant and loads of money to buy the equipment.
    Seriously, either this or Eyefinity seems like a lot of money to spend on playing a game. But I guess if you can afford it, more power to you. Personally, this is way out of my league unless the prices come down a lot, especially considering most games coming out are sequels or console ports.
  • hackztor - Tuesday, June 29, 2010 - link

    Over the weekend I was at the DesertBash LAN in Arizona and they had a 3D Vision Surround setup, and it was pretty cool.
  • Sp12 - Tuesday, June 29, 2010 - link

    Honestly, I think requiring two cards is a significant increase in the barrier to entry, and frankly gives ATI a significant advantage.

    I am not an enthusiast buyer. I buy for value, yet I have an Eyefinity setup, and my cost of entry was $350.

    Granted, I already had two IPS monitors for Photoshop and needed a new video card anyway, but I was able to buy one ATI 5850 for $250 and am able to drive demanding games at 5040x1050 with at least 2xAA at 60 frames.

    I think that's a pretty good value. If I had to upgrade to a dual-GPU setup to do so, I would've ended up buying a new PSU, mobo, and a second video card.
  • seonjie - Friday, October 01, 2010 - link

    A demanding game at 5040x1080 on a single 5850 is a big no-no. At 5040x1080 you can only play older or low-detail games.
  • rgladiator - Tuesday, June 29, 2010 - link

    >>In the meantime stay tuned for our full review of NVIDIA’s 3D Vision Surround later this month.

    So that's tomorrow then? :)
  • RaistlinZ - Tuesday, June 29, 2010 - link

    Why is it that we can develop 3D vision technology but still can't make a freakin' LCD with a thin bezel?
  • Ninjahedge - Tuesday, June 29, 2010 - link

    The problem may not be in the ability to do so, but in protection for the components.

    I wonder how they would be able to protect the edge of the screen from shipping damage (to limit loss numbers) with an ultra-thin edge....

    The only other thing I can think of as a problem would be getting the wiring around the corner to be on the same plane as the LCD screen... Maybe they need to focus more on a thin edge than on thin front-to-back?

    Would that be feasible?

    BTW, WT(H) is up with 3D everything? No doubt it is a neat novelty, but until you can actually reach out and TOUCH something in a game, it may be a wasted effort. Most games do not allow that kind of close contact, and those that do can be hard to control. (Remember Die By The Sword?)

    Oddly enough, racing games may be the best use for this (both 3D AND 3 screens)......

    Ah, I remember the game I was having problems with... Heroes of Might and Magic....
  • dailo23 - Tuesday, June 29, 2010 - link

    Instead of making an ultra-thin edge, I think the solution to the bezel is a removable bezel on the LCD screen. For normal usage you still have the bezel for protection, but if you want to add more monitors, just remove one of the sides (or both), and maybe have some mechanism to lock up with another monitor.
  • TGressus - Tuesday, June 29, 2010 - link

    Great idea!
  • james.jwb - Tuesday, June 29, 2010 - link

    I like :)
  • Death666Angel - Tuesday, June 29, 2010 - link

    Yeah, how could they possibly manage to ship something as fragile as LCD monitors without edges.... Did you ever have something like glass shipped?

    And this solution still sucks; we need either:
  • nubie - Wednesday, June 30, 2010 - link

    Huh, I bet I could make my own light engine and place panels on it.

    1. Find a 40" LCD TV with a cracked LCD, harvest backlight and frame.

    2. Mount 3 LCDs from computer monitors (with bad backlights or inverters) in front of single large backlight.

    3. ??

    4. Profit.

    Sounds great, especially if you can find the parts broken for little money.
  • miahallen - Tuesday, June 29, 2010 - link

    OK, you say that triple SLI doesn't work with GT200-series cards... but will the technology work with Quad-SLI (2x GTX 295)?
  • TinksMeOff - Tuesday, June 29, 2010 - link

    I ordered two Galaxy GTX 465s, as they blow hot air outside the PC case, and two (more) ASUS 25.5" 2ms VW266H LCDs (1920x1200), which should arrive on July 1st. Not too sure what to do with my GTX 285, but right now I am psyched. This set me back $1,100 shopping at all the right places, and I am broke but not broken. The two GTX 465s were $500 shipped and will beat a single GTX 480 in all benchmarks. Two of these cards have nearly the same power requirements as one GTX 480, and the heat from two Galaxy GTX 465s is less than from one GTX 480. For those wanting less power, less heat, and a faster GPU setup, the GTX 465 is sitting pretty if you want NVIDIA Surround.

    I remember the Matrox G200 from working at CompUSA back in the day; we were all jazzed about the three-monitor support. Neverwinter Nights and Diablo were beauts to behold for expanding your viewing scope. Those cards were $500 if I recall, plus the cost of the monitors of course. I never bought into it back then because Matrox was so far behind nVidia and ATI in terms of raw power, and they didn't seem to want to compete in the raw-power arena (and in the end they didn't). Now we have AMD and nVidia both offering powerful cards that can do three monitors. Just lovely! This is the wave of the future, and nVidia offering 3D Vision ups the ante.
  • Setsunayaki - Thursday, July 01, 2010 - link

    I remember when OCP did the review of 3x2 monitors on the ATI cards... and for shooters that did not work, since the targeting reticle is always centered and the bezels get in the way. Nvidia of course has the same problem....

    I'd rather own one LARGE 40-50 inch LCD monitor with a 16:9 aspect ratio (so I can also watch DVDs and Blu-ray disks at the correct aspect ratio) and even play games at the same aspect ratio, simply for synchronization purposes with most media out there...

    ...than have multiple monitors and video cards eating up kW of power, just to find that I can't even maintain enough framerate to render this new technology at max settings, given the heavy graphical requirements of the next generation of games being released....

    Years and years later, we still don't have one video card that can run Crysis at max settings and break 60 FPS - though we do have the first SLI combination that can actually do it. Which means with the next generation of games plus this new Nvidia technology... don't make me laugh: you'll have to wait YEARS to get a worthy 3D gaming experience due to the lack of framerate. If you are willing to go barebones on your graphics, I'm sure you can have some experience - but that's not what video cards are made for, to go barebones :(
  • TinksMeOff - Thursday, July 01, 2010 - link

    Hardware Canucks put out some quick numbers for GTX 480 SLI. I am more interested in 2D Surround than in 3D Surround. The numbers shown aren't depressing me one bit, especially for beta drivers.

  • TinksMeOff - Friday, July 02, 2010 - link

    FYI update: after upgrading my system last night with two Galaxy GTX 465 cards and two more ASUS 25.5" 2ms VW266H LCDs (1920x1200), I got these numbers in the built-in Far Cry 2 benchmark test. I particularly like the minimum-FPS findings.

    Single Monitor SLI benchmarks
    Average - 107.21
    Max - 158.34
    Min - 82.44

    Three Monitor SLI - 2D Surround Benchmarks
    Average - 62.12
    Max - 81.12
    Min - 49.92

    System spec:

    CASE ANTEC 900 (ver 1)
    MB EVGA E760-A1 X58 Classified
    CPU INTEL CORE i7 975 3.33GHz, OC'd @ 30x133 = 4GHz
    TWO Galaxy GTX 465 SLI 1024MB
    Noctua NH-U12P SE1366 120mm
    2 WD Caviar HD WD6401AALS 640GB RAID 0
    1 WD Caviar HD WD6401AALS 640GB data drive
    SB XFi PCI
    THREE ASUS LCD 25.5" 2MS VW266H (1920X1200)
    Windows 7 64bit Premium
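
    As a rough sanity check on those Far Cry 2 numbers (using only the resolutions and averages quoted above): surround triples the pixel count while average FPS only drops to about 58%, so pixels rendered per second actually goes up - a hint that the single-monitor run was at least partly CPU-limited rather than GPU-limited.

```python
# Pixels-per-second comparison: single 1920x1200 vs. 5760x1200 surround
single_px = 1920 * 1200          # one monitor
triple_px = 3 * single_px        # three monitors side by side

single_fps = 107.21              # average FPS, single monitor
triple_fps = 62.12               # average FPS, 2D Surround

throughput_gain = (triple_px * triple_fps) / (single_px * single_fps)
print(f"{throughput_gain:.2f}x")  # 1.74x more pixels/sec in surround mode
```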
  • tnygwek - Friday, July 02, 2010 - link

    "But with 3D Vision, horizontal linear polarization comes in to play: because both the monitor and the glasses are polarized for glare reduction and image blocking respectively, they have to be properly aligned. Anyone who has tilted their head when viewing 3D through a linear system has seen what happens if the screen and glasses are not aligned: the polarization blocks the entire image. As a result 3D Vision Surround is not currently usable in portrait mode when used in conjunction with an LCD monitor – only projectors are supported."

    I thought that Nvidia 3D Vision only used a shutter-glasses technique for image blocking, and no polarization at all.
    Do they really have additional linear polarization for glare reduction? I cannot see anything related to that in any document or article from Nvidia.
    I do not have a 3D Vision kit, so I cannot do the tilt test.
  • hcforde50 - Friday, July 02, 2010 - link

    I have not read through all of the posts, but a major problem I see with 3D as it stands: if one monitor goes down, you are in deep trouble. They seem to change monitor models every 6 months or so. If one went out after a year or so, you may have to pay a lot to find an EXACT replacement, or buy three new monitors.

    Nvidia is going to have to deal with this somehow or face some disappointed customers when a monitor breaks down.
  • TinksMeOff - Friday, July 02, 2010 - link

    Good point. They need to have similar specs, not the exact same brand. Even so, a manufacturer that supplies the FL inverter board (an inexpensive small part that powers the backlight) with an easy replacement slot on the back panel might resolve a lot of the FUD over the issue.

    On the flip side, you'd still have two monitors available until the new/replacement monitor arrives. You may also have one great excuse to the wifey (or yourself) that you need to upgrade all three, LOL.

    Length of warranties, or extended warranties, will be a good factor I would think.

  • Fermion Alpha - Sunday, July 11, 2010 - link

    I tried using the forum to post a question but it won't let me, so I figured I could use this thread since it's 3D related. Basically I am very confused about how to get a 3D setup working. I have a Radeon 5850 and I read somewhere that AMD is now offering third-party 3D solutions, but I can't find a review anywhere, or a list of companies I could get glasses from. The other thing: will AMD 3D "vision" work on my Samsung 120Hz 22in monitor, or is this monitor only good for Nvidia's 3D?

    Finally, I keep hearing you need a 120Hz monitor to play 3D on a computer. Does that mean my game has to play at 120fps? For instance, Battlefield: Bad Company 2 plays at 80 FPS on my computer and sometimes dips to 40 frames. Does this mean the 3D is going to look screwed up on my computer? One more final question: how much extra graphics power does 3D take? I keep reading it takes 2 times the power, since the card has to render 2 views. Does that mean my Battlefield: Bad Company 2 will play between 40 and 20 fps if I use 3D?

    I read your website every day and this place is great for finding answers to technology questions. Thank you for reading.
  • Rukur - Sunday, September 19, 2010 - link

    This tech leads me to think of multiple projectors, and hence no bezel issues. Why hasn't anyone done it yet?
