Not Just Another Pair of Glasses: GeForce 3D Vision at Work

While the GeForce 3D Vision solution does include some pretty nice glasses, that's not where it ends. We'll start there, though. The glasses are small and lightweight, with polarized LCD lenses, and they contain a battery, an LCD controller, and an IR receiver. They charge over USB and can stay plugged into the system whether they are in use or not. We haven't done a full battery life test, but NVIDIA claims about 40+ hours of operation on a single charge. We can say that we never needed to recharge in all our testing, which amounted to a good handful of hours over a few days.

We were a little surprised at first that NVIDIA went with IR, but it makes a lot of sense from a battery life perspective. Though the glasses need line of sight to the transmitter, you need line of sight to the monitor anyway, so it's not a huge deal. There can be issues with a bunch of people in the same room using the glasses, or with other IR devices transmitting around the room. We didn't have enough equipment to really push it until it broke, but it did stand up to throwing a Wii and a universal remote at it.

The transmitter connects to the PC via USB, or it can connect to a stereoscopic monitor with a standard stereo connector. The transmitter also has a wheel for adjusting the "depth" of the image (which actually adjusts the separation of the left and right images). This is convenient: the transmitter can hook into multiple devices, and the depth can be adjusted quickly for the comfort of the user.

But the hardware isn't really the special sauce. The real meat of the solution is in NVIDIA's driver, not the glasses (aside from the fact that you need a 120Hz display, that is).

As we mentioned, either an application needs to be developed for stereoscopic rendering, or it needs some external "help" from software. For third-party solutions this help comes as a wrapper, but NVIDIA built it right into the driver. The advantage NVIDIA has is that they can go really low level if they need to. Of course, this is also their downfall to some degree, but I'll get to that later.

When rendering 3D games, a virtual "camera" is floated around the world at what is considered the viewer's eye position. This camera has a "look at" point that tells us where it's pointed. At the most brute force level (which really isn't done), we could take the camera for a particular game state and render two frames instead of one. For the first, we could move the camera a little to the left and the look-at point a little to the right; for the second, move the camera a little to the right and the look-at point a little to the left. The point where the cameras' lines of sight cross sits at screen depth. The game thinks it has rendered one frame, but rendering has actually happened twice. The problem is that this relies on a consistently high frame rate from the game, which just isn't going to happen with modern titles. But it helps illustrate what is going on: these two different camera views have to get to the screen somehow.
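To make the geometry concrete, here is a minimal sketch of that brute-force "toe-in" idea. This is not NVIDIA's code, and all names here are hypothetical; it simply shows how a single game camera could be split into a left/right pair by offsetting each eye along the camera's right vector while both eyes keep converging on the original look-at point, which then lands at screen depth.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 add(Vec3 a, Vec3 b)   { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s){ return {a.x * s, a.y * s, a.z * s}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 normalize(Vec3 a) {
    float len = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    return scale(a, 1.0f / len);
}

struct Camera { Vec3 eye; Vec3 lookAt; Vec3 up; };

// Build a left/right camera pair from a single mono camera.
// 'separation' is the eye-to-eye distance in world units; the original
// look-at point stays fixed, so it ends up at screen depth (zero parallax).
void makeStereoPair(const Camera& mono, float separation,
                    Camera& left, Camera& right) {
    Vec3 forward   = normalize(sub(mono.lookAt, mono.eye));
    Vec3 rightAxis = normalize(cross(forward, mono.up));
    Vec3 halfShift = scale(rightAxis, separation * 0.5f);

    left  = mono; left.eye  = sub(mono.eye, halfShift);  // eye nudged left
    right = mono; right.eye = add(mono.eye, halfShift);  // eye nudged right
    // Both cameras keep mono.lookAt, so their lines of sight cross there.
}

int main() {
    Camera mono = {{0.0f, 1.7f, 0.0f}, {0.0f, 1.7f, -10.0f}, {0.0f, 1.0f, 0.0f}};
    Camera L, R;
    makeStereoPair(mono, 0.065f, L, R);  // ~65mm, a typical interocular distance
    std::printf("left eye x=%.3f  right eye x=%.3f\n", L.eye.x, R.eye.x);
    return 0;
}
```

In practice the separation and convergence would come from the driver's depth setting (the wheel on the transmitter) rather than a fixed constant.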

Since the scene shouldn't change between rendering for each eye, we have the advantage that the geometry is consistent: all we need to do is render everything from two different positions. This means things can be sped up a good bit, and NVIDIA tries to repeat as little work as possible. They are, however, very uninterested in sharing exactly how they handle rendering two images. Once the rendered images have been placed in the frame buffer, the left and right eye images are presented in alternating display refreshes. This decouples the display from the game, so a slow frame rate from the game doesn't break the stereoscopic effect. There is no flickering and none of the instant-headache feeling that older solutions were prone to.
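NVIDIA doesn't disclose its implementation, but the decoupling described above can be illustrated with a toy simulation (all numbers and names here are illustrative assumptions, not NVIDIA's code): the display alternates eyes on every 120Hz refresh, while the game only delivers a new stereo pair when it finishes one, and the last complete pair is simply re-shown in the meantime.

```cpp
#include <cstdio>

// Toy simulation: the display scans out at 120Hz, alternating eyes every
// refresh, while the "game" only completes a new stereo pair every few
// refreshes. The glasses stay in sync with the display, not the game, so a
// slow game frame rate never interrupts the left/right alternation.
int main() {
    int finishedPair  = 0;         // id of the last pair the game completed
    int displayedPair = 0;         // id of the pair currently being shown
    const int refreshes     = 24;  // simulate 24 vblanks (0.2s at 120Hz)
    const int gameFrameTime = 5;   // game needs 5 refreshes per pair (24 fps)

    for (int vblank = 0; vblank < refreshes; ++vblank) {
        if (vblank > 0 && vblank % gameFrameTime == 0)
            ++finishedPair;            // a new left/right pair became available
        displayedPair = finishedPair;  // latch the most recent complete pair
        const char* eye = (vblank % 2 == 0) ? "left " : "right";
        std::printf("vblank %2d: %s eye of pair %d\n", vblank, eye, displayedPair);
    }
    return 0;
}
```

The output shows each pair being shown for several consecutive refreshes, but the left/right cadence never stutters, which is why the glasses don't flicker even when the game chugs.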

One of the downsides of whatever optimizations NVIDIA is doing is that some effects that don't have readily accessible depth information are not rendered correctly. This affects many post-processing effects like motion blur, corona, and bloom; some advanced shadowing implementations and certain lighting shaders; and some smoke, fire, water, and similar effects. Rendering the full scene twice could help alleviate some of these issues, but others just can't be fixed without a little extra help. And this is where NVIDIA comes through.

One of the biggest problems is knowing what settings to run a game at so that the stereoscopic effects look as correct as possible. Because of NVIDIA's extensive game profiling for SLI, they are able to add additional profile information for stereo effects. A little information window pops up when you run a game to tell you exactly which settings you need to worry about; it is enabled by default until you turn it off for that particular game. NVIDIA has also rated a bunch of games on the quality of their stereo experience, which helps let people know what to expect from a particular title.

Beyond this, another common problem is rendering crosshairs or other sighting aids as a 2D sprite at the center of the screen rather than at the depth of the object behind them. Many games render the crosshair at object depth, but many others render it at screen depth. Luckily, many games let you disable the in-game crosshair, and NVIDIA provides its own set of stereoscopic crosshairs that render correctly. This is very helpful, as a 2D object at screen depth in the middle of the screen looks about the same as staring three feet ahead of you while holding a finger up three inches in front of your nose.
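To see why a screen-depth crosshair feels wrong, consider the standard off-axis parallax relation: a point at depth d shows a horizontal parallax of roughly eyeSeparation * (d - convergence) / d at the screen plane. The sketch below (hypothetical names, not NVIDIA's implementation) computes how far a crosshair sprite would need to shift per eye to appear at the depth of the object it is aimed at; a fixed 2D sprite gets a shift of zero no matter what, which is the mismatch described above.

```cpp
#include <cstdio>
#include <initializer_list>

// Horizontal parallax at the screen plane for a point at viewer-space depth
// 'aimDepth', under standard off-axis stereo:
//     parallax = eyeSeparation * (aimDepth - convergence) / aimDepth
// It is zero at the convergence (screen) depth and approaches the eye
// separation for far objects. To draw a crosshair "on" the aimed object,
// each eye's sprite is shifted by half that parallax in opposite directions.
float crosshairHalfShift(float eyeSeparation, float convergence, float aimDepth) {
    float parallax = eyeSeparation * (aimDepth - convergence) / aimDepth;
    return 0.5f * parallax;  // +shift for one eye, -shift for the other
}

int main() {
    const float eyeSep = 0.065f;  // ~65mm interocular distance
    const float screen = 0.7f;    // assumed convergence distance: 0.7m
    for (float depth : {0.7f, 2.0f, 10.0f, 100.0f}) {
        std::printf("object at %6.1fm -> per-eye crosshair shift %+.4fm\n",
                    depth, crosshairHalfShift(eyeSep, screen, depth));
    }
    return 0;
}
```

At the convergence depth the shift is zero, but for distant targets it approaches half the eye separation per eye, which is why a screen-depth crosshair appears to float well in front of whatever you are actually aiming at.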

Leveraging their extensive work with developers, NVIDIA is also hoping to get new games to better support stereoscopic effects. While some of the anomalies are a result of NVIDIA's method rather than the developer, encouraging and assisting developers in implementing their desired effects in a stereo-friendly way will help pave the way for the future. They can even get developers to include information that allows some effects to render out of the screen. And this isn't cheesy theater out-of-screen: this is in-your-face, feels-like-you-can-touch-it out-of-screen. Currently, World of Warcraft: Wrath of the Lich King has some out-of-screen effects that are pretty sweet, but that really just leaves us wanting more.

Comments

  • fishbits - Thursday, January 8, 2009 - link

    "What we really need, rather than a proprietary solution, is something like stereoscopic support built in to DirectX and OpenGL that developers can tap into very easily."
    Would be nice, but whatever works and is actually implemented.

    Nvidia could come up with a "3D glasses thumbs-up" seal of approval for games that get it right, and it could be displayed on the packaging. This would further encourage developers to get on board. Heck, NV could have traveling demo rigs that sit in a Gamestop/Best Buy for a week, playing games that have superior compliance. Good for sales of the game(s), good for sales of the glasses.

    I've done the old shutter glasses, was a neat novelty, but wears thin as Derek says. Sounds like these are currently only a bit better with current titles in most cases. *IF* they get this right and all major titles released support the system well, I'd buy the glasses right away. The new monitor too. But they have to get it right first.

    This might work for the next generation of consoles too, though probably only if hooked up to a high-refresh monitor. It's a great selling point, and another reason to get this right and off the ground. Of course Sony/Nintendo/MS might just make their own solutions, but whatever gets the job done. If only one of them had this feature implemented well, it could be a big tie-breaker in winning sales to their camp.
  • Judgepup - Thursday, January 8, 2009 - link

    Been waiting for the next revolution in gaming and after all the bugs have been worked out, this looks like it could be a contender. I'm typically an early adopter, but I'm fairly happy with a physical reality LCD at this point. Will wait in the wings on this one, but I applaud the Mighty nVidia for taking this tech to the next level.
  • Holly - Thursday, January 8, 2009 - link

    Although I am great supporter of 3Ding of virtual worlds, there are really huge drawbacks in this technology nVidia presented.

    First of all, the reason LCDs did not need as high a refresh rate as CRTs is that LCD intensity doesn't swing from wall to wall - from 100% down to 0% before the next refresh, the way a CRT does. This intensity fluctuation is what hurts our eyes. LCDs keep their intensity much more stable (some say it is totally stable, though I have seen some text describing a minor intensity droop on LCDs as well; can't find it now). Back on topic... we either went 100Hz+ or switched to LCD to save our eyes.

    Even if we ignore software-related problems, there is still a problem... The flickering is back. Even if the screen's picture is intensity-stable, these shutter glasses make the intensity each eye sees swing between 0-100%, and we are back to the days of old 14" screens and a good way to end up with a white cane sooner or later. Even with 120Hz LCDs, every eye only gets 60Hz, pretty much the same as old CRTs. This just won't work. For longer use (gaming etc.) you really need 85Hz+ of flicker to avoid damaging your eyes.

    Another point I am curious about is how GeForce 3D Vision accounts for screen latency. Not that long ago AT presented a review of a few screens with some minor complaints about S-PVA latency going way up to something like 40ms. The thing is, this latency could very easily cause the frame intended for the left eye to be seen by the right eye and vice versa. I can imagine Nausea Superfast (TM) coming out of that (the kind of effect you get when you drink too much and those stupid eyes just can't both focus on one thing).

    I believe this stereoscopy has a future, but I don't believe it will be with shutter glasses or any other method of switching between a 'seeing' eye and a 'blinded' eye.
  • PrinceGaz - Thursday, January 8, 2009 - link

    The answer is simple, move from LCD technology to something faster, like OLED or SED (whatever happened to SED?).

    Both of those technologies are quite capable of providing a true 200hz refresh that truly changes the display every time (not just changes the colour a bit towards something else). A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye).

    I do think 120Hz split between two eyes would quite quickly give me a serious headache. When I used a CRT monitor in the past and had to look at the 60Hz refresh before installing the graphics card driver, it was seriously annoying.
  • Rindis - Friday, January 9, 2009 - link

    "A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye)."

    Almost is the key word here. I'm okay with 75Hz CRTs unless I'm staring at a blank white screen (Word), and by 85 I'm perfectly fine.

    However, my roommate trained as a classical animator (which means hours of flipping through almost identical drawings) and could perceive CRT refresh rates up to about 115Hz. (She needed expensive monitors and graphics cards....) That would demand a 230+Hz rate for this technology.
  • paydirt - Friday, January 9, 2009 - link

    It's strange. I definitely notice when a CRT has a 60 Hz refresh rate. I have been gaming with a 29" LCD with 60 Hz refresh rate for about 4 years now and don't notice the refresh rate.
  • DerekWilson - Friday, January 9, 2009 - link

    That's because the CRT blanks while the LCD stays on. With an LCD panel, on every refresh the color simply changes from what it was to what it now should be. With a CRT, by the time the electron gun comes around every 60Hz, the phosphor has dimmed even if it hasn't gone completely dark. 60Hz on a CRT "flashes," while 60Hz on an LCD only indicates how many times per second the color of each pixel is updated.
  • SlyNine - Thursday, January 8, 2009 - link

    Yes, but LCDs have ghosting; unless you can purge the image completely, the right eye will see a ghost of the left eye's frame. If you've ever looked at stills from LCD ghosting tests, you'll see that even the best LCDs' ghosts last for two frames.

    The best TV I can think of to use this with is the $7,000 laser TV from Mitsubishi.

    Why can't they use dual video cards for this? Have one frame buffer be the left eye and the other the right; then even if a card has yet to finish rendering its image, just flip to the last fully rendered frame.
  • Holly - Thursday, January 8, 2009 - link

    I think the result would be quite bad. You could easily end up in a situation where one eye's card runs at 50 FPS while the other eye's card runs at 10 FPS (even with the same models... the different camera placement might invalidate a big part of the octree, causing the FPS difference). Not sure how the brain would handle such a difference between the two frames, but I think not well...
  • SlyNine - Thursday, January 8, 2009 - link

    You know what, I skimmed everything you wrote, and rereading it I realize the error I made.

    My bad.
