The NVIDIA Experience, Look and Feel

Oh, this is such a mixed bag. Much of it comes down to personal preference, so this feedback is mine combined with that of family and friends who came to check the setup out. I like numbers, but this is really more a question of experience, and I'll do my best with it.

When it works, it works really well and looks simply amazing. The effect is easy to adjust to a level that is comfortable and doesn't cause huge amounts of eye strain. Because you do have to focus on objects at different depths, your eyes work harder than when playing a normal game, and it forces you to actually look at things rather than relying on peripheral vision and reflexes the way I often do when gaming. When it's done right (especially with out of screen effects) it fundamentally changes the experience in a very positive way.

But ...

In many of the games we tested there were serious drawbacks. Even games whose experience NVIDIA rates as "excellent" felt subpar at best to us. Fallout 3, for example, had ghosting effects that we couldn't fix, and it just didn't feel right. Most games with an excellent rating still require reducing some settings, as in Far Cry 2, where the lower quality shadows really take away from the experience. Anyone going out of their way to buy a 120Hz LCD panel, a high end NVIDIA graphics card, and a $200 bundle of active shutter glasses is not going to be happy when told to reduce any quality settings. But that's just how it is right now.

Other games, like Crysis Warhead, that received a rating of "good" were nothing short of unplayable with stereoscopic effects. Even turning down shadows, shaders, postprocessing, and motion blur and using NVIDIA's stereo crosshair didn't help when there was any fire, smoke, explosion, or water anywhere around. When those effects pop up (which is all the time), everything goes to hell and you can't focus on anything. The experience is destroyed, and you've taken an image quality hit for the privilege. A great package.

NVIDIA has said that they are still working on the profiles and with developers to help improve the experience. They have been and are trying to get developers to add stereo friendly effects to their games through patches, but that's just not in the budget for some studios. NVIDIA also needs to be more realistic with its rating system. At this point, we would recommend writing off any game not rated excellent as something that won't offer a good experience. Then take the games rated excellent and assume that in a good many of them you'll either have to disable some effects or live with some minor annoyance. For the ratings to be taken seriously they need to be accurate, and right now they just aren't telling the right story.

RTS games like Age of Empires, or games with a 3/4 view, tend to look the best to me. There is a fixed depth and you don't need to do a lot of refocusing, but the 3D really grabs you. It actually looks a bit like one of my daughter's pop-up books, but infinitely cooler.

First person shooters are hit and miss. One of the best looking games was Left 4 Dead, but large outdoor environments like those in Fallout 3 can degrade the experience because of the huge difference between the actual depth of the world and the stereoscopic depth available at extreme distances: you can only go so far "into" or "out of" the monitor, and big worlds just aren't accommodated.
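
To make that depth ceiling concrete, here is a back-of-the-envelope sketch of the geometry. This is not anything from NVIDIA's driver (which also applies a user-adjustable depth scale); the viewing distance and panel dimensions are assumptions chosen for illustration. The point is that on-screen parallax saturates at the viewer's eye separation, so everything past a few dozen meters gets nearly the same depth cue:

```python
# Illustrative numbers only: a 22" 16:10 panel viewed from 60 cm, with the
# game's world distance mapped straight into viewer space (real stereo
# drivers scale this with a depth slider).
EYE_SEPARATION_CM = 6.5      # typical human interpupillary distance
VIEWER_DISTANCE_CM = 60.0    # assumed eye-to-screen distance
SCREEN_WIDTH_CM = 51.0       # assumed panel width
SCREEN_WIDTH_PX = 1680       # assumed horizontal resolution

def parallax_px(object_distance_cm: float) -> float:
    """Horizontal on-screen parallax for a point at the given distance,
    by similar triangles: p = e * (1 - d / z)."""
    p_cm = EYE_SEPARATION_CM * (1.0 - VIEWER_DISTANCE_CM / object_distance_cm)
    return p_cm * SCREEN_WIDTH_PX / SCREEN_WIDTH_CM

for meters in (1, 5, 20, 100, 1000):
    print(f"{meters:>5} m -> {parallax_px(meters * 100):6.1f} px")
# The parallax climbs quickly, then flattens: 100 m and 1000 m differ by
# about one pixel, so distant terrain reads as a flat backdrop.
```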

Simulation games can look pretty good and Race Driver GRID worked well. It would be nice to keep shadows and motion blur, but the tradeoff isn't bad here. The depth actually helped with judging when to start a turn and just how close other drivers really were.

The two effects that stand out the most right now are the out of screen effects in World of Warcraft and the volumetric smoke and lighting in Left 4 Dead. In L4D, fire the pistol rapidly and you can see the smoke pouring out of the barrel curl around as if it were really floating there. Properly done stereoscopic volumetric effects and out of screen effects add a level of realism that can't be overstated. Combining the two, removing all the problems, and allowing maximum image quality would be truly incredible. Unfortunately, nothing we tested gave us that satisfaction.

We do also need to note that, while no one got an instant headache, everyone who tested our setup felt a little eye strain and slight pressure between the eyes after as little as 15 minutes of play. One of our testers reported nausea following the gaming session; she happens to suffer from motion sickness, so that may have played a part in it, but it's still very relevant information, as no one wants to take Dramamine before gaming.

Comments

  • Matt Campbell - Thursday, January 8, 2009 - link

    One of my roommates in college had a VR helmet he used to play Descent, and was interning at a company designing (then) state-of-the-art updates to it. It was pretty wild to try, and hysterical to watch the person in the chair dodging and moving as things flew at them. It was really dodgy on support though, and gave most of us a headache after about 10 minutes. Now it's over 10 years later, and it doesn't sound like much has changed.
  • crimson117 - Thursday, January 8, 2009 - link

    VR helmets were more about making your real head's position guide your avatar's head's position than about providing stereoscopic 3D.
  • Holly - Thursday, January 8, 2009 - link

    They did both. It had a tiny screen for each eye...

    ... reminds me of the lovely days of System Shock :'(
  • Dfere - Thursday, January 8, 2009 - link

    So. Mediocre equipment with mediocre drivers. Gee, why would anyone want us to buy it?

    Am I the only one getting the feeling this is the start of something designed to suck up more GPU power and/or sell SLI as a mainstream requirement? After all, resolution and FPS increases alone can't fuel the growth Nvidia and ATI would like.
  • PrinceGaz - Thursday, January 8, 2009 - link

    I think you are being hopelessly negative about why nVidia would be doing this.

    What advantage do they gain by a move towards stereoscopic 3D glasses? Okay, increased 3D rendering power is needed as each frame has to be rendered twice to maintain the same framerate, but GPU power is increasing so quickly that it's almost a non-issue, so SLI is irrelevant... NOT.

    The main problem with stereoscopic rendering is that each consecutive frame has to be rendered from a different perspective, and only every second frame is directly related to the one before it. That maps so nicely onto what SLI AFR mode provides that it is too good to be true. One card does the left eye, the other the right eye, and with suitably designed drivers you get all the normal effects which rely on access to the previous frame (motion blur etc.), but in a "3D graphics system" that sells twice as many cards, since one card is handling each eye (see the sketch below the comments). They're not daft-- stereoscopic display is going to make dual GPU cards not just a luxury for the high-end gamer, but a necessity for normal gamers who want a satisfactory 3D experience.
  • Gannon - Thursday, January 8, 2009 - link

    ... for nvidia to co-operate with monitor manufacturers and implement 3D in the display itself instead of these half-baked attempts at depth. Nobody really wants to wear special glasses so they can have 3D depth perception on their computer.

    The only way you are going to standardize something like this (because people are lazy and ignorant, let's face it) is to do it at the point where everybody gets it - everyone needs a monitor with their computer, so it would make sense to work towards displays that either:

    1) Are natively 3D or
    2) Build the costly stereoscopy into the monitor itself, thereby reducing costs through economies of scale.

    I really think current shutter based stereoscopic 3D is a hold-over until we start to get real 3D displays. If I were nvidia I'd want to do it on the monitor end, not as an after-market part targeted towards gamers at a $200 price point.
  • nubie - Friday, January 9, 2009 - link

    Try Passive glasses, weight is next to nothing, no moving parts, no batteries.

    Just polarization that works off of native LCD tech:

    http://picasaweb.google.com/nubie07/StereoMonitorS...

    nVidia dropped support for this, boo/hiss.
  • rcr - Thursday, January 8, 2009 - link

    Would it be possible to just use an SLI system to get rid of these visual quality problems? Could each graphics card do the calculations for one eye, so you would get the same quality as on a single card?
  • wh3resmycar - Thursday, January 8, 2009 - link

    what do you guys think? how about ViiBii?
  • JonnyDough - Thursday, January 8, 2009 - link

    No, actually the picture says "AT" just in case anyone couldn't see it. :-)
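
On PrinceGaz's SLI AFR point above, here is a toy sketch of the scheduling being described: the stereo driver turns every scene frame into two eye views, and AFR round-robins submitted frames across two GPUs, so each eye always lands on the same card. All names here are hypothetical illustrations, not an actual NVIDIA API.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    index: int   # submission order as seen by the driver
    eye: str     # "left" or "right"
    gpu: int     # GPU that renders this frame under AFR

def schedule(num_scene_frames: int) -> list[Frame]:
    """Double each scene frame into two eye views, then assign them to
    GPUs round-robin the way AFR would: even submissions to GPU 0, odd
    to GPU 1, so each eye always lands on the same GPU."""
    frames = []
    for i in range(num_scene_frames):
        frames.append(Frame(index=2 * i, eye="left", gpu=(2 * i) % 2))
        frames.append(Frame(index=2 * i + 1, eye="right", gpu=(2 * i + 1) % 2))
    return frames

for f in schedule(3):
    print(f"frame {f.index}: {f.eye:>5} eye -> GPU {f.gpu}")
# GPU 0 only ever sees left-eye frames and GPU 1 right-eye frames, so
# effects that read the previous frame (motion blur, temporal feedback)
# always find a same-eye predecessor -- the commenter's point about why
# AFR and stereo line up so neatly.
```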
