Final Words

All in all, this is a much more polished version of what we've had since the turn of the century: no flicker, fewer headaches (though there may still be issues for people prone to motion sickness -- I just don't have a large enough sample size to say definitively), and broad game support with less of a performance hit than other solutions. NVIDIA has a very good active shutter stereoscopic solution in GeForce 3D Vision. The problem is that its value still depends heavily on the application(s) the end user wants it for. It works absolutely perfectly for viewing stereo images, 3D movies (which may become more of a factor when those start coming to Blu-ray), and applications built with stereo support. But for games, though it works with 350 titles, it's a little hit or miss.

We really hate to say that, because we love seeing anyone push through the chicken-and-egg problem. NVIDIA getting this technology out there, getting developers excited about it, and getting publishers excited about a new market is ultimately what will make this a reality. But until most developers program in ways that are friendly to NVIDIA's approach to stereo rendering, the gaming experience will vary from title to title, and there's no way of knowing how much any individual title's problems will bother you until you try it. At $200, that's a bit of a plunge for the risk, especially if you don't already have a 120Hz display (which will cost several hundred dollars more).

If you absolutely love a few of the games that work great with it, then it will be worth it. The problem is that NVIDIA's ratings are loose enough that you can't rely on "excellent" actually being excellent. Most of the people who played Left 4 Dead loved it, but one person was really bothered by the floating names being rendered as 2D sprites at screen depth. That is annoying, but the rest of it looked good enough for me not to care (and I'm pretty picky). If NVIDIA wants to play fast and loose with its ratings, that's fine, but we don't have time to test every supported game to confirm their ratings or come up with our own. They really should at least add another rating class called "perfect" for titles with absolutely no issues, where all settings work great and we get exactly what we expect.

Shutter glasses have been around for a long time, and perhaps now the time is right for them to push into the mainstream. But NVIDIA isn't doing the technology any favors if it puts something out there and lets it fail. This technology needs to be developed and needs to become pervasive, because it is just that cool. Until it works perfectly in a multitude of games, or until 3D movies start hitting PCs near you, we have the potential for a setback. If GeForce 3D Vision is successful, however, it will open the door for us to really move forward with stereoscopic effects.

What we really need, rather than a proprietary solution, is stereoscopic support built into DirectX and OpenGL that developers can tap into easily. Relying on NVIDIA to discern the proper information and then render images for both eyes from one scene is fine as a stopgap, just as CUDA was a good interim solution before we had OpenCL. We need the API to detect whether stereo hardware is present and to make it easy to generate images for both eyes while duplicating as little work as possible. Giving developers simple tools to make stereo effects cooler and more convincing, or to embed hints about convergence and separation, would be great as well.
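
To give a concrete picture of what that could look like from the developer's side, here is a minimal sketch in the spirit of OpenGL's existing quad-buffered stereo path (GL_BACK_LEFT / GL_BACK_RIGHT), which some workstation drivers already expose. The renderScene() call, the eyeSeparation and convergence parameters, and the simple "toed-in" camera are our own placeholder assumptions rather than NVIDIA's actual implementation; a production renderer would typically use an off-axis (asymmetric frustum) projection instead.

    // Hedged sketch: per-eye rendering through OpenGL quad-buffered stereo.
    // Everything application-specific here (renderScene, the eye parameters)
    // is hypothetical; only the GL calls themselves are real API.
    #include <GL/gl.h>
    #include <GL/glu.h>

    void renderScene();  // assumed: the game's normal draw path

    void renderStereoFrame(float eyeSeparation, float convergence)
    {
        const float  offsets[2] = { -eyeSeparation * 0.5f, +eyeSeparation * 0.5f };
        const GLenum buffers[2] = { GL_BACK_LEFT, GL_BACK_RIGHT };

        for (int eye = 0; eye < 2; ++eye)
        {
            glDrawBuffer(buffers[eye]);  // select this eye's back buffer
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();
            // Offset the camera sideways for this eye and aim it at the
            // convergence plane, so objects at that depth sit at screen level.
            gluLookAt(offsets[eye], 0.0, 0.0,    // eye position
                      0.0, 0.0, -convergence,    // look-at point
                      0.0, 1.0, 0.0);            // up vector

            renderScene();  // same scene submitted once per eye
        }
        // The buffer swap then presents both views in sync with the glasses.
    }

The point of an API-level hook like this is that the application renders each eye deliberately, so HUD elements, crosshairs, and post-processing can be placed at sensible depths instead of being guessed at by the driver.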

Hopefully GeForce 3D Vision is a real step toward that future, one that can become viable right now. I could see some World of Warcraft devotees being really excited about it. Those out there like me who love 3D technology in every form will be excited by it. People who want to create their own stereo images or videos (there are lenses available for this, and techniques you can improvise to make it work) will like it, but people waiting for 3D movies will need some content available at home first. The group we would most love to see drive adoption of the technology, however, might not be as into it: hardcore gamers looking to upgrade will probably be better served at this point by a high-end graphics card and a 30" display rather than a 120Hz monitor and shutter glasses.

54 Comments

  • Matt Campbell - Thursday, January 8, 2009 - link

    One of my roommates in college had a VR helmet he used to play Descent, and was interning at a company designing (then) state-of-the-art updates to it. It was pretty wild to try, and hysterical to watch the person in the chair dodging and moving as things flew at them. It was really dodgy on support though, and gave most of us a headache after about 10 minutes. Now it's over 10 years later, and it doesn't sound like much has changed.
  • crimson117 - Thursday, January 8, 2009 - link

    VR helmets were more about making your real head's position guide your avatar's head's position than about providing stereoscopic 3D.
  • Holly - Thursday, January 8, 2009 - link

    They did both. It had a tiny screen for each eye...

    .. reminds me lovely days of System Shock :'(
  • Dfere - Thursday, January 8, 2009 - link

    So. Mediocre equipment with mediocre drivers. Gee, why would anyone want us to buy it?

    Am I the only one getting a feeling this is a start of something designed to suck up more GPU power and/or sell SLI as a mainstream requirement? After all, resolutions and FPS increases can't alone fuel the growth Nvidia and ATI would like.
  • PrinceGaz - Thursday, January 8, 2009 - link

    I think you are being hopelessly negative about why nVidia would be doing this.

    What advantage do they gain by a move towards stereoscopic 3D glasses? Okay, increased 3D rendering power is needed as each frame has to be rendered twice to maintain the same framerate, but GPU power is increasing so quickly that it's almost a non-issue, so SLI is irrelevant... NOT.

    The main problem with stereoscopic rendering is that each consecutive frame has to be rendered from a different perspective, and only every second frame is directly related to the one before it. That seems so nicely connected to what SLI AFR mode provides that it is too good to be true. One card does the left eye in SLI AFR, the other the right eye, and with suitably designed drivers you get all the normal effects which rely on access to the previous frame (motion blur etc.), but in a "3D graphics system" that sells twice as many cards, since one card is doing each eye. They're not daft -- stereoscopic display is going to make dual GPU cards not just a luxury for the high-end gamer, but a necessity for normal gamers who want a satisfactory 3D experience.
  • Gannon - Thursday, January 8, 2009 - link

    ... for nvidia to co-operate with monitor manufacturers and implement 3D in the display itself instead of these half-baked attempts at depth. Nobody really wants to wear special glasses so they can have 3D depth perception on their computer.

    The only way you are going to standardize something like this (because people are lazy and ignorant, let's face it) is to do it at the point where everybody gets it so it is standardized - everyone needs a monitor with their computer, so it would make sense to work towards displays that either:

    1) Are natively 3D or
    2) Build the costly stereoscopy into the monitor itself, thereby reducing costs through economies of scale.

    I really think current shutter based stereoscopic 3D is a hold-over until we start to get real 3D displays. If I were nvidia I'd want to do it on the monitor end, not as an after-market part targeted towards gamers at a $200 price point.
  • nubie - Friday, January 9, 2009 - link

    Try Passive glasses, weight is next to nothing, no moving parts, no batteries.

    Just polarization that works off of native LCD tech:


    http://picasaweb.google.com/nubie07/StereoMonitorS...

    nVidia dropped support for this, boo/hiss.
  • rcr - Thursday, January 8, 2009 - link

    Is there the possibility of just using an SLI system to get rid of these visual quality problems? Would it be possible to let each graphics card do the calculations for one eye, so you could get the same quality as on a single card?
  • wh3resmycar - Thursday, January 8, 2009 - link


    what do you guys think? how about ViiBii?
  • JonnyDough - Thursday, January 8, 2009 - link

    No, actually the picture says "AT" just in case anyone couldn't see it. :-)
