Not Just Another Pair of Glasses: GeForce 3D Vision at Work

While the GeForce 3D Vision solution does include some pretty nice glasses, that's not where it ends. We'll start there, though. The glasses are small and lightweight with LCD shutter lenses. Each pair contains a battery, an LCD controller, and an IR receiver. They charge over USB and can stay plugged in to the system whether or not they're in use. We haven't done a full battery life test, but NVIDIA claims 40+ hours of operation on a single charge. We can say that we haven't needed to recharge in all our testing, which amounted to a good handful of hours over a few days.

At first we were a little surprised that NVIDIA went with IR, but it makes a lot of sense from a battery life perspective. Though the glasses need line of sight with the transmitter, you need line of sight to the monitor to see it anyway, so it's not a huge deal. There can be issues with a bunch of people in the same room using the glasses, or with other IR devices transmitting around the room. We didn't have enough equipment to really push it till it broke, but it did stand up to our throwing a Wii and a universal remote at it.

The transmitter connects to the PC via USB, or it can connect to a stereoscopic monitor with a standard stereo connector. The transmitter also has a wheel on it for adjusting the "depth" of the image (what it actually adjusts is the separation of the left and right images). This is fairly convenient: the transmitter can hook into multiple devices and be adjusted for the comfort of the viewer fairly quickly.

But the hardware isn't really the special sauce. The real meat of the solution is in NVIDIA's driver, not the glasses and transmitter. Aside from the fact that you need a 120Hz display, that is.

As we mentioned, either an application needs to be developed for stereoscopic rendering, or it needs some external "help" from software. In third-party solutions this is a wrapper, but NVIDIA built it right into the driver. The advantage NVIDIA has is that they can go really low level if they need to. Of course, this is also their downfall to some degree, but I'll get to that later.

When rendering 3D games, a virtual "camera" is floated around the world at what is considered the viewer's eye position. This camera has a "look at" point that tells us where it's pointed. At the most brute-force level (which really isn't done), we could take the camera for a particular game state and render two frames instead of one. For the first, we could move the camera a little left and the look-at point a little right. For the next frame, we could move the camera a little right and the look-at point a little left. The point where the cameras' lines of sight cross appears at screen depth. The game thinks it has rendered one frame, but rendering has actually happened twice. The problem here is that this relies on a high, consistent frame rate from the game, which just isn't going to happen with modern titles. But it helps illustrate what is going on: these two different camera views have to get to the screen somehow.
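
To make that brute-force version concrete, here is a minimal sketch in C++ of how a left/right camera pair could be derived from the game's mono camera. The vector type and helper names are our own illustration, not NVIDIA's driver code; the separation parameter sets how far apart the virtual eyes sit, and shifting the look-at points the opposite way makes the lines of sight converge at screen depth.

```cpp
#include <cmath>

// Minimal 3-component vector with just the operations we need.
struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

struct Camera { Vec3 eye, lookAt, up; };

// Build a left/right camera pair from the game's single camera. Shifting the
// eye one way and the look-at point the other makes the two lines of sight
// cross at the convergence plane, which the viewer perceives as screen depth.
void makeStereoPair(const Camera& mono, float separation,
                    Camera& left, Camera& right) {
    Vec3 forward   = normalize(mono.lookAt - mono.eye);
    Vec3 rightAxis = normalize(cross(forward, mono.up));

    left = mono;
    left.eye    = mono.eye    - rightAxis * separation;  // camera a little left
    left.lookAt = mono.lookAt + rightAxis * separation;  // look-at a little right

    right = mono;
    right.eye    = mono.eye    + rightAxis * separation; // camera a little right
    right.lookAt = mono.lookAt - rightAxis * separation; // look-at a little left
}
```

A larger separation value exaggerates the perceived depth, which is exactly the quantity the wheel on the transmitter is adjusting.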

Since the scene shouldn't change between rendering for each eye, we have the advantage that the geometry is consistent. All we need to do is render everything from two different positions, which means things can be sped up a good bit: NVIDIA tries to repeat as little work as possible. But they are also very uninterested in sharing exactly how they handle rendering the two images. Once the rendered images have been placed in the frame buffer, the left and right eye images are shown in alternating refreshes. This decouples the display from the game, so a slow frame rate in the game doesn't affect the stereoscopic effect. There is no flickering and none of the instant-headache feeling that older solutions were prone to.
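
Here is a simplified sketch of that decoupling: a display loop that flips between the most recently completed left and right images on every 120Hz refresh, no matter how fast the game itself renders. Everything here (the Framebuffer type, waitForVBlank, presentToDisplay) is a hypothetical stand-in; the real driver does this below the API.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-in for a completed eye image; a real driver would hold
// GPU framebuffer handles here.
struct Framebuffer { /* ... */ };

// The render side publishes its latest finished left/right pair here.
std::atomic<Framebuffer*> latestLeft{nullptr};
std::atomic<Framebuffer*> latestRight{nullptr};

// Stand-in for waiting on the monitor's vertical blank (120 per second).
void waitForVBlank() {
    std::this_thread::sleep_for(std::chrono::microseconds(8333));
}

void presentToDisplay(Framebuffer*) { /* scan-out would happen here */ }

// Flip between the latest finished left and right images once per refresh.
// Because we always show the most recent completed pair, a slow game frame
// rate delays new content but never breaks the left/right alternation that
// the shutter glasses stay synced to over IR.
void displayLoop() {
    bool showLeft = true;
    for (;;) {
        waitForVBlank();  // 120 refreshes/sec -> 60 per eye
        Framebuffer* img = showLeft ? latestLeft.load() : latestRight.load();
        if (img) presentToDisplay(img);
        showLeft = !showLeft;
    }
}
```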

One of the downsides of whatever optimizations NVIDIA is doing is that some effects that don't have readily accessible depth information are not rendered correctly. This includes many post-processing effects like motion blur, corona and bloom effects, some advanced shadowing implementations and certain lighting shaders, and some smoke, fire, water, and other particle effects. Rendering the full scene twice could help alleviate some of these issues, but others just can't be fixed without a little extra help. And this is where NVIDIA comes through.

One of the biggest problems is knowing what settings to run a game at so that the stereoscopic effects look as correct as possible. Because of NVIDIA's extensive game profiling for SLI, they are able to add additional profile information for stereo effects. A little information window pops up to tell you exactly which settings you need to worry about when you run a game; this window is enabled by default until you turn it off for a particular game. NVIDIA has also rated a bunch of games on the quality of their stereo experience, which helps let people know what to expect from a particular title.

Beyond this, another common problem is games rendering crosshairs or other sighting aids as a 2D sprite at the center of the screen rather than at the depth of the object behind it. Many games render the crosshair at object depth, but many others render it at screen depth. Luckily, many games give you a way to disable the in-game crosshairs, and NVIDIA provides its own set of stereoscopic crosshairs that render correctly (sketched below). This is very helpful, as a 2D object at screen depth in the middle of the screen looks the same as if you were looking about 3 feet ahead of you while holding your finger up about 3 inches in front of your nose.
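
A rough sketch of the object-depth crosshair idea: sample the depth buffer under the center of the screen and draw the crosshair at that depth, so both eye views project it onto whatever the player is aiming at. The two helper functions are hypothetical illustrations, not NVIDIA's or any real API.

```cpp
// Hypothetical helpers: readDepthAt would read the depth buffer back from
// the GPU; drawSpriteAtDepth would draw a 2D sprite at a given scene depth
// instead of flat at screen depth. Neither is a real API call.
float readDepthAt(int x, int y);
void  drawSpriteAtDepth(float x, float y, float depth);

void drawStereoCrosshair(int screenW, int screenH) {
    // Depth of whatever sits under the center of the screen, i.e. the target.
    float sceneDepth = readDepthAt(screenW / 2, screenH / 2);

    // Drawing the crosshair at sceneDepth gives it the same stereo parallax
    // as the object behind it, so it fuses with the target instead of
    // floating at screen depth in front of everything.
    drawSpriteAtDepth(screenW / 2.0f, screenH / 2.0f, sceneDepth);
}
```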

Leveraging their extensive work with developers, NVIDIA is also hoping to get new games to better support stereoscopic effects. While some of the anomalies are a result of NVIDIA's method rather than the developer, encouraging and assisting developers in implementing their desired effects in a stereo-friendly way will help pave the way for the future. NVIDIA can even get developers to include information that allows some effects to be rendered out of the screen. And this isn't cheesy theater out-of-screen: this is in-your-face, feels-like-you-can-touch-it out-of-screen. Currently, World of Warcraft: Wrath of the Lich King has some out-of-screen effects that are pretty sweet, but that really just leaves us wanting more.

Comments

  • jkostans - Friday, January 9, 2009 - link

    So how is this different from my ELSA 3D shutter glasses from 1999? The glasses I paid $50 for back then are just as good as this $200 setup in 2009? Great job re-inventing the wheel and charging more for it, nVIDIA.

    There is a reason shutter glasses didn't catch on. Ghosting being the worst problem, along with compatibility, loss of brightness/color accuracy, performance hits, the need for high refresh rate, etc etc etc.

    If you are thinking of buying these, don't. You will use them for a few weeks, then just toss them in a drawer due to lack of game support and super annoying ghosting.
  • nubie - Friday, January 9, 2009 - link

    It is different because these are likely ~$400 - $500 quality glasses.

    Check out my setup with high resolution, no ghosting, high compatibility, minimal performance hit:

    http://picasaweb.google.com/nubie07/StereoMonitorS...

    http://picasaweb.google.com/nubie07/3DMonitor

    Running on iZ3D of course, no need for nVidia at all, buy any card you like, and keep running XP until Microsoft releases another OS worth spending money for.
  • jkostans - Friday, January 9, 2009 - link

    No ghosting?

    http://picasaweb.google.com/nubie07/3DMonitor#5060...

    I can see it there, and that's not even a high-contrast situation.

    Shutter glasses are shutter glasses, they all suck regardless of price.
  • nubie - Saturday, January 10, 2009 - link

    OK, have a closed mind, technology never advances.

    PS, that picture was taken through a linear polarized lens, and I am holding the camera and the glasses, so they may not have been lined up.

    Also the contrast is automatically set by the camera, in person there isn't any ghosting.
  • Shadowdancer2009 - Friday, January 9, 2009 - link

    Can they PLEASE kill this tech soon?
    It was 100% crap the first time, and it won't get better no matter how awesome the drivers are.

    The glasses eat 50% of the brightness when "open" and don't kill 100% when "closed".

    They never did, and your review says the same thing.

    This was crap ten years ago, and it's crap now.

    Give us dual screen highres VR goggles instead.
  • nubie - Friday, January 9, 2009 - link

    Maybe you don't understand the technology, these are ~$400 - $500 glasses, wireless with about a week of li-ion battery power.

    Don't compare them to the $10 ones you can get anywhere, at least try them for yourself.

    There are much better reasons to bash nVidia, like dropping support for 90% of the displays they used to support, and making support Vista only.
  • gehav - Friday, January 9, 2009 - link

    I'm perfectly satisfied with the current refresh rate of LCD panels (60Hz). However, what you forgot is the following: if the 3D glasses open and shut 60 times per second per eye (for a 120Hz panel), the old flicker of CRTs is effectively back. Therefore, raising the refresh rate of the monitor to 240Hz would reduce the per-eye flicker to an acceptable 120Hz. The monitor itself isn't the culprit here; the 3D glasses reintroduce flickering like in the old days of CRTs (and they are directly dependent on the refresh rate of the monitor).

    Georg
  • gehav - Friday, January 9, 2009 - link

    btw: 200Hz displays are already on the way, it seems:
    http://www.engadget.com/2008/09/02/sony-samsung-bo...
  • gehav - Friday, January 9, 2009 - link

    Just a thought I had while reading the article:

    Wouldn't a ray-traced image work far better for stereoscopic viewing? From what I understand, the rasterizing technique used by today's graphics cards uses all kinds of tricks and effects to create the perception of a "real 3D world". That's why the drivers have to be customized for every game.

    Ray tracing uses a far simpler algorithm to get good results. Every light ray is calculated separately and every game that uses ray tracing should therefore - in principle - easily be customizable for stereoscopic viewing.

    I'm thinking of the announced Intel Larrabee which will maybe offer ray tracing acceleration for games and could therefore be much better suited for stereoscopic viewing.

    Not sure if I'm right with these thoughts but it would be interesting to see if games that are already available in a ray tracing version (like Quake 4) could be easily adapted to support stereoscopic viewing and what the result would look like.

    Apart from that I also think we would need faster LCD-panels (240Hz) to get non-flickering pictures for each eye.

    Georg
  • nubie - Friday, January 9, 2009 - link

    Check out some of the other initiatives, notably iZ3D, who have offered a free driver for all AMD products and XP support (double check the nVidia support for XP, non-existent much?)

    nVidia's idea is too little, too expensive, too late. I have built my own dual-polarized passive rig that works great with $3 glasses, unfortunately nVidia has dropped all support (the last supported card is from the 7 series, so "gaming" isn't really an option.)

    Thankfully iZ3D has stepped up to provide drivers, but thanks to nVidia's lack of support I have lost so much money on unsupported 8 series hardware that I haven't looked at a game in a couple years.

    nVidia has killed my will to game. Dropping support of 3D is not the way to sell 3D (do some research, nvidia has dropped XP, supports only vista, and not even any of the cool displays you can cobble together yourself for less than the $200 this stupid package costs.)

    My proof of concept, before nvidia pulled the plug:
    http://picasaweb.google.com/nubie07/3DMonitor#

    My gaming rig, before nvidia dropped support for ~3 years:
    http://picasaweb.google.com/nubie07/StereoMonitorS...

    nVidia needs to do better than this, and they should know better.
