Final Words

All in all, this is like a very polished version of what we've had since the turn of the century: no flicker, fewer headaches (though there may still be issues for people prone to motion sickness -- we just don't have a large enough sample size to say definitively), and broad game support with less of a performance hit than other solutions. NVIDIA has a very good active shutter stereoscopic solution in GeForce 3D Vision. The problem is that its value still depends heavily on what the end user wants it for. It works absolutely perfectly for viewing stereo images, for 3D movies (which may matter more once those start coming to Blu-ray), and for applications built with stereo support. But for games, though it works with some 350 titles, it's just a little hit or miss.

We really hate to say that, because we love seeing anyone push through the chicken-and-egg problem. NVIDIA getting this technology out there, getting developers excited about it, and getting publishers excited about a new market is what will ultimately make stereoscopic gaming a reality. But until most developers program in ways that are friendly to NVIDIA's approach to stereo rendering, the gaming experience will be either good or not so good, and there's no way of knowing how much an individual title's problems will bother you until you try it. At $200, that's a bit of a plunge for the risk -- especially if you don't have a 120Hz display, which will cost several hundred more.

If you absolutely love a few of the games that work great with it, then it will be worth it. The problem is that NVIDIA's ratings make it so you can't rely on "excellent" actually being excellent. Most of the people who played Left 4 Dead loved it, but one person was really bothered by the floating names being 2D sprites at screen depth. That is annoying, but the rest of it looked good enough for me not to care (and I'm pretty picky). If NVIDIA wants to play fast and loose with its ratings, that's fine, but we don't have time to test all their games and confirm their ratings or come up with our own. They really should at least add another class of rating, "perfect," for titles with absolutely no issues, where all settings work great and we get exactly what we expect.

Shutter glasses have been around for a long time, and perhaps now the time is right for them to start pushing into the mainstream. But NVIDIA isn't doing the technology any favors if it puts something out there and lets it fail. This technology needs to be developed and needs to become pervasive, because it is just that cool. Until it works perfectly in a multitude of games, or until 3D movies start hitting PCs near you, we risk a setback. If GeForce 3D Vision is successful, however, it will open the door to really moving forward with stereoscopic effects.

What we really need, rather than a proprietary solution, is stereoscopic support built into DirectX and OpenGL that developers can tap into easily. Relying on NVIDIA to discern the proper information and then render images for both eyes from one scene is a fine stopgap, just as CUDA was a good interim solution before we had OpenCL. The API should be able to detect whether stereo hardware is present and make it easy to generate images for both eyes while duplicating as little work as possible. Giving developers simple tools to make stereo effects cooler and more convincing, or to embed hints about convergence and separation, would be great as well.
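OpenGL already hints at what this could look like: quad-buffered stereo, with separate left- and right-eye back buffers, has been in the spec for ages. The sketch below is a minimal illustration rather than production code -- it assumes a GPU and driver that expose stereo visuals, uses GLUT for window setup, and stands in a simple camera translate where a real renderer would use asymmetric frusta; draw_scene() is a placeholder.

```c
/* Minimal quad-buffered stereo sketch in OpenGL/GLUT. Assumes the driver
 * exposes a stereo visual (GLUT_STEREO); draw_scene() is a placeholder. */
#include <GL/glut.h>

static void draw_scene(float eye_offset)
{
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    /* Shift the camera horizontally for this eye. A real renderer would
     * use an asymmetric (off-axis) projection to avoid vertical parallax. */
    glTranslatef(-eye_offset, 0.0f, 0.0f);
    /* ... normal draw calls for the scene go here ... */
}

static void display(void)
{
    const float half_separation = 0.03f; /* half the eye separation, world units */

    glDrawBuffer(GL_BACK_LEFT);          /* render the left-eye image */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(-half_separation);

    glDrawBuffer(GL_BACK_RIGHT);         /* render the right-eye image */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(+half_separation);

    glutSwapBuffers();                   /* present both eye buffers at once */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_STEREO requests a quad-buffered (left/right) framebuffer. */
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("stereo sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

The point is how little the application has to do once the API exposes stereo directly: pick a buffer, render, repeat. A driver-side solution like NVIDIA's has to reconstruct all of this by guessing at the application's camera.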

And hopefully GeForce 3D Vision is a real step toward that future, one that can become viable right now. I could see some World of Warcraft devotees being really excited about it. Those out there like me who love 3D technology in every form will be excited by it. People who want to create their own stereo images or videos (there are lenses available for this, and techniques you can improvise to make it work) will like it, but people waiting for 3D movies will need some content available at home first. The group we would most love to see drive adoption of the technology, though, might not be as into it: hardcore gamers looking to upgrade will probably be better served at this point by a high end graphics card and a 30" display rather than a 120Hz monitor and shutter glasses.

Comments

  • marc1000 - Thursday, January 8, 2009 - link

    I had one of these... and I had it with the 3D glasses!!! It was an 8-bit console with bad-looking games; the 3D glasses were connected to the console via a cable, and the pace of switching between eyes was so slow you could see it if you paid enough attention. But it worked, and it worked with any simple TV out there. It was only fun, though -- no good images in reality... it's nice to see this technology come back to life!
  • JonnyDough - Thursday, January 8, 2009 - link

    60Hz should be the MINIMUM, not the STANDARD. Even at 60Hz you tend to get a good bit of eye strain. I don't know how the monitor/TV industries get away with a mere 60Hz. I for one STILL get headaches. Doesn't anyone else?
  • ViRGE - Thursday, January 8, 2009 - link

    On an LCD? No. Which is why all this talk of strain is silly; the strain was a product of the flickering of a CRT, and there's no reason anyone should be straining on an LCD.
  • PrinceGaz - Thursday, January 8, 2009 - link

    A 120Hz LCD panel is probably enough to explain where your testing went wrong and where your ghosting and other problems began.

    You need a display with near-instant native response, and no LCD panel to date can provide that (regardless of how much overdrive is applied to nasty poor-quality but fast-response TN panels). You should have gone old-school and used a high-quality CRT at a 120Hz refresh rate, like many pro gamers still use, or an OLED display if available, as those could also cope properly with a 120Hz refresh. Hell, I've got an old 15" CRT sitting on my desk capable of 640x480 @ 120Hz that would almost certainly have done a better job of testing your 3D goggles than whichever LCD panel you used.

    Ghosting would almost certainly have been a non-issue with a CRT running at 120Hz; without LCD response-time lag, each eye's image wouldn't still contain remnants of the other eye's image, which would almost certainly have made it look a lot better.
  • DerekWilson - Friday, January 9, 2009 - link

    Not that kind of ghosting ... it didn't have to do with the panel -- everything looked fine on that end. I'm using the samsung 5ms 120Hz 1680x1050 monitor. the image looked smooth.

    after talking with nvidia, it seems the ghosting issues were likely from convergence being off (where the look at points for each eye camera are set) causing problems. convergence can be adjusted with hot keys as well, but i didn't play with this.

    eye strain didn't appear to be from flicker either -- it's more about the exercise of focusing on things that aren't there ... tweaking the depth (separation) and your distance from the monitor can make a big difference here. a CRT would not have made a difference. i do have a sony gdm f520, but it's just not at the lab ...
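    A minimal sketch of what those two controls amount to, with purely illustrative names (this is not NVIDIA's actual driver code): each eye gets a camera offset by half the separation, and both cameras aim at a shared convergence point. Geometry at the convergence depth lands at screen depth; if convergence is set wrong, the two images diverge and you get the kind of double-image ghosting described above.

    ```c
    /* Hypothetical toe-in stereo camera setup. All names are illustrative;
     * real drivers typically use asymmetric frusta rather than toeing in. */
    #include <stdio.h>

    typedef struct { float x, y, z; } vec3;

    static void stereo_cameras(vec3 center, float separation, float convergence,
                               vec3 *left_eye, vec3 *right_eye, vec3 *look_at)
    {
        /* Each eye sits half the separation to either side of the mono camera. */
        *left_eye  = (vec3){ center.x - separation * 0.5f, center.y, center.z };
        *right_eye = (vec3){ center.x + separation * 0.5f, center.y, center.z };
        /* Both eyes converge on a point 'convergence' units in front of the
         * camera; geometry at that depth appears at screen depth, nearer
         * geometry pops out of the screen, farther geometry recedes. */
        *look_at = (vec3){ center.x, center.y, center.z - convergence };
    }

    int main(void)
    {
        vec3 l, r, at;
        stereo_cameras((vec3){ 0.0f, 0.0f, 0.0f }, 0.065f, 2.0f, &l, &r, &at);
        printf("left eye x=%.4f, right eye x=%.4f, converge at z=%.1f\n",
               l.x, r.x, at.z);
        return 0;
    }
    ```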
  • ssiu - Thursday, January 8, 2009 - link

    Yes you can use the NVIDIA glasses with analog CRT monitors with 100Hz-120Hz refresh rate.
  • ssiu - Thursday, January 8, 2009 - link

    Anyone interested in this should also check out and compare it with the competitor solution from iZ3D (http://www.iz3d.com/). The two solutions each have their pros and cons, but iZ3D is significantly cheaper (MSRP $400 versus $600: $200 glasses + $400 120Hz monitor). iZ3D works with both ATI and NVIDIA video cards, and ATI users get an extra $50 rebate.
  • simtex - Thursday, January 8, 2009 - link

    This looks very promising; if NVIDIA really wants to push this rather old technology forward again, I'm sure they can do so.

    OpenGL has had built-in support for the buffers you need to create stereoscopic images for years -- in fact since version 1.1 if I'm not mistaken -- so there is really no excuse for developers not to use them.

    And as for the suggestion that NVIDIA should just make a 3D monitor: what technology are you referring to here? Because as far as I know there is no technology capable of creating 3D images on a traditional flat 2D monitor.
  • crimson117 - Thursday, January 8, 2009 - link

    I can only find one, and it's bundled with these glasses :)

    http://www.tigerdirect.com/applications/searchtool...
  • ssiu - Thursday, January 8, 2009 - link

    The other announced 120Hz monitor is Viewsonic VX2265wm.

    http://www.viewsonic.com/company/news/vs_press_rel...

    http://www.viewsonic.com/products/desktop-monitors...
