The NVIDIA Experience, Look and Feel

Oh, this is such a mixed bag. Much of this comes down to personal preference, so this feedback is mine combined with that of family and friends who came by to check it out. I like numbers, but this is really more about the experience than anything measurable, so I'll do my best with it.

When it works, it works really well and looks simply amazing. It's easy to adjust to a level that is comfortable and doesn't cause a huge amount of eye strain. Because you do have to focus on objects at different depths, your eyes work harder than they do in a normal game, and it forces you to actually look at things rather than relying on peripheral vision and reacting, like I often do when gaming. When it's done right (especially with out-of-screen effects) it fundamentally changes the experience in a very positive way.

But ...

In many of the games we tested there were serious drawbacks. Even games NVIDIA rates as "excellent" often felt subpar at best to us. Fallout 3, for example, had ghosting effects we couldn't fix, and it just didn't feel right. Most games with an excellent rating still require dropping some settings to a lower level, like FarCry 2, where the lower quality shadows really take away from the experience. Anyone going out of their way to buy a 120Hz LCD panel, a high end NVIDIA graphics card, and a $200 bundle of active shutter glasses is not going to be happy when told to reduce quality settings. But that's just how it is right now.

Other games that received a rating of "good," like Crysis Warhead, were simply unplayable with stereoscopic effects enabled. Even turning down shadows, shaders, post-processing, and motion blur and using NVIDIA's stereo crosshair didn't help whenever there was fire, smoke, an explosion, or water anywhere around. When those effects pop up (which is all the time), everything goes to hell and you can't focus on anything. It destroys the experience, and you get reduced image quality on top of it. A great package.

NVIDIA has said they are still working on the profiles and with developers to help improve the experience. They have been trying to get developers to add stereo-friendly effects to their games through patches, but that's just not in the budget for some studios. NVIDIA also needs to be more realistic with their rating system. At this point, we would recommend writing off any game not rated excellent as something that won't offer a good experience. Then take the games rated excellent and assume that for a good many of them you'll either have to disable some effects or live with some minor annoyance. For ratings to be taken seriously they need to be accurate, and right now they just aren't telling the right story.

RTS titles like Age of Empires and other games with a 3/4 view tend to look the best to me. The depth is fixed and you don't need to do a lot of refocusing, but the 3D really grabs you. It actually looks a bit like one of my daughter's pop-up books, only infinitely cooler.

First person shooters are hit and miss. One of the best looking games was Left 4 Dead, but large outdoor environments like those in Fallout 3 can degrade the experience because of the huge gap between the actual depth of the scene and the limited stereoscopic depth available at extreme distances: you can only go so far "into" or "out of" the monitor, and big worlds just aren't accommodated.
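
To put a rough number on that depth budget, here is a minimal Python sketch of the basic stereo parallax geometry. This is my own back-of-the-envelope illustration, not anything from NVIDIA's driver; the 65mm eye separation and 70cm viewing distance are assumed values.

```python
# Back-of-the-envelope stereo depth budget. Assumed values: ~65mm eye
# separation and a 70cm viewing distance; these are not figures from NVIDIA.

EYE_SEP_MM = 65.0     # typical interocular distance (assumption)
VIEW_DIST_MM = 700.0  # eyes-to-panel distance (assumption)

def perceived_depth_mm(separation_mm):
    """How far away a point appears, given the horizontal separation between
    its left-eye and right-eye images on the panel (positive = behind the
    screen, negative = popping out in front of it)."""
    remaining = EYE_SEP_MM - separation_mm
    if remaining <= 0:
        return float("inf")  # the eyes would have to diverge; unusable
    return EYE_SEP_MM * VIEW_DIST_MM / remaining

for sep in (0.0, 20.0, 40.0, 60.0, 64.0):
    print(f"{sep:4.1f} mm of separation -> appears {perceived_depth_mm(sep) / 1000:5.2f} m away")

# 0mm sits on the screen plane; by ~60mm a point already reads as ~9m away,
# and everything from there to the horizon has to squeeze into the last few
# millimeters of separation, which is why a huge Fallout 3 vista flattens out.
```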

Simulation games can look pretty good, and Race Driver GRID worked well. It would be nice to keep shadows and motion blur, but the tradeoff isn't bad here. The depth actually helped with judging when to start a turn and how close other drivers really were.

The two effects that stand out most right now are the out-of-screen effects in World of Warcraft and the volumetric smoke and lighting in Left 4 Dead. In L4D, fire the pistol quickly and you can see the smoke pouring out of the barrel curl around as if it were really floating there. Properly done stereoscopic volumetric effects and out-of-screen effects add a level of realism that can't be overstated. Combining them, with no glitches and maximum image quality, would be incredible. Unfortunately, nothing we tested gave us that satisfaction.

We also need to note that, while no one got an instant headache, everyone who tested our setup felt a little eye strain and slight pressure between the eyes after as little as 15 minutes of play. One of our testers reported nausea following the gaming session, though she happens to suffer from motion sickness, which may have played a part. Of course, that's still very relevant information, as no one wants to take Dramamine before gaming.

Comments

  • fishbits - Thursday, January 8, 2009 - link

    "What we really need, rather than a proprietary solution, is something like stereoscopic support built in to DirectX and OpenGL that developers can tap into very easily."
    Would be nice, but whatever works and is actually implemented.

    Nvidia could come up with a "3d glasses thumbs-up" seal of approval for games that get it right, and it could be displayed on the packaging. This would further encourage developers to get on board. Heck, NV could have traveling demo rigs that sit in a Gamestop/Best Buy for a week, playing games that have superior compliance. Good for sales of the game(s), good for sales of the glasses.

    I've done the old shutter glasses; they were a neat novelty, but it wears thin, as Derek says. Sounds like these are currently only a bit better in most cases with current titles. *IF* they get this right and all major titles released support the system well, I'd buy the glasses right away. The new monitor too. But they have to get it right first.

    This might work for the next generation of consoles too, though probably only if hooked up to a high-refresh monitor. Great selling point, and another reason to get this right and off the ground. Of course Sony/Nintendo/MS might just make their own solutions, but whatever gets the job done. If only one of them implemented this feature well, it could be a big tie-breaker in winning sales to their camp.
  • Judgepup - Thursday, January 8, 2009 - link

    Been waiting for the next revolution in gaming, and after all the bugs have been worked out this looks like it could be a contender. I'm typically an early adopter, but I'm fairly happy with a physical-reality LCD at this point. Will wait in the wings on this one, but I applaud the Mighty nVidia for taking this tech to the next level.
  • Holly - Thursday, January 8, 2009 - link

    Although I am a great supporter of making virtual worlds 3D, there are really huge drawbacks in the technology nVidia has presented.

    First of all, the reason LCDs did not need to run at as high a refresh rate as CRTs is that an LCD's intensity doesn't swing from wall to wall - from 100% down to 0% before the next refresh, the way a CRT's does. That intensity fluctuation is what hurts our eyes. LCDs keep their intensity much more stable (some say it is totally stable, though I have seen texts describing some minor intensity drop on LCDs as well; I can't find them now). Back on topic... we either went 100Hz+ or went LCD to save our eyes.

    Even if we ignore the software-related problems, there is still a problem... The flickering is back. Even if the screen's intensity is stable, the shutter glasses make the intensity each eye receives swing between 0 and 100%, and we are back to the days of old 14" screens and a good way to end up with a white cane sooner or later. Even with a 120Hz LCD, each eye only gets 60Hz, pretty much the same as an old CRT. This just won't work. For longer use (gaming, etc.) you really need 85Hz+ of flicker to avoid damaging your eyes.

    Another point I am curious about is how GeForce 3D Vision accounts for screen latency. It wasn't that long ago that AT presented a review of a few screens with some minor complaints about S-PVA latency going as high as 40ms. The thing is, that latency could very easily mean the frame intended for the left eye gets seen by the right eye and vice versa. I can imagine Nausea Superfast(TM) coming out of that (the kind of effect you get when you drink too much and those stupid eyes just can't both focus on one thing).

    I believe stereoscopy has a future, but I don't believe it will be with shutter glasses or any other way of switching between a 'seeing' eye and a 'blinded' eye.
  • PrinceGaz - Thursday, January 8, 2009 - link

    The answer is simple: move from LCD technology to something faster, like OLED or SED (whatever happened to SED?).

    Both of those technologies are quite capable of providing a true 200hz refresh that truly changes the display every time (not just changes the colour a bit towards something else). A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye).

    I do think 120hz split between two eyes would quite quickly give me a serious headache; back when I used a CRT monitor and had to look at the 60hz refresh before installing the graphics card driver, it was seriously annoying.
  • Rindis - Friday, January 9, 2009 - link

    "A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye)."

    Almost is the key word here. I'm okay with 75Hz CRTs unless I'm staring at a blank white screen (Word), and by 85 I'm perfectly fine.

    However, my roommate trained as a classical animator (which means hours of flipping through almost identical drawings) and could perceive CRT refresh rates up to about 115Hz. (She needed expensive monitors and graphics cards....) That would demand a 230+Hz rate for this technology.
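
For what it's worth, the per-eye numbers in this thread are easy to sanity check. A minimal sketch, assuming the glasses simply alternate eyes on every panel refresh:

```python
# Per-eye flicker math for alternate-frame shutter glasses, assuming each eye
# simply sees every other panel refresh.

def per_eye_hz(panel_hz):
    """Each eye is only open for every other refresh, so it flickers at half the panel rate."""
    return panel_hz / 2.0

def panel_hz_needed(per_eye_target_hz):
    """Panel refresh required to give each eye a target flicker rate."""
    return per_eye_target_hz * 2.0

for panel in (120, 170, 200):
    print(f"{panel}Hz panel -> {per_eye_hz(panel):.0f}Hz per eye")

for target in (85, 115):
    print(f"{target}Hz per eye needs a {panel_hz_needed(target):.0f}Hz panel")
```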
  • paydirt - Friday, January 9, 2009 - link

    It's strange. I definitely notice when a CRT has a 60 Hz refresh rate. I have been gaming with a 29" LCD with 60 Hz refresh rate for about 4 years now and don't notice the refresh rate.
  • DerekWilson - Friday, January 9, 2009 - link

    That's because the CRT blanks while the LCD stays lit. With an LCD panel, each refresh simply changes a pixel from its old color to its new one. With a CRT, by the time the electron gun comes back around at 60Hz, the phosphor has dimmed even if it hasn't gone completely dark. 60Hz on a CRT "flashes," while 60Hz on an LCD only indicates how many times per second the color of each pixel is updated.
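
A toy model of the difference Derek describes, with an assumed (not measured) phosphor decay constant, looks something like this:

```python
# Toy model: over one 60Hz refresh interval a CRT pixel is hit once by the
# beam and then decays, while an LCD pixel holds its value until the next
# update. The 3ms decay constant is an illustrative guess, not a measurement.
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.7ms between refreshes at 60Hz
PHOSPHOR_DECAY_MS = 3.0     # assumed persistence, for illustration only

def crt_brightness(t_ms):
    """Relative brightness t_ms after the electron beam sweeps the pixel."""
    return math.exp(-t_ms / PHOSPHOR_DECAY_MS)

def lcd_brightness(t_ms):
    """Sample-and-hold: the pixel keeps its level for the whole refresh."""
    return 1.0

for t in (0.0, REFRESH_MS / 4, REFRESH_MS / 2, REFRESH_MS):
    print(f"t = {t:4.1f}ms   CRT: {crt_brightness(t):.2f}   LCD: {lcd_brightness(t):.2f}")

# The CRT swings from full brightness to nearly dark within every refresh
# (that swing is the 60Hz flicker), while the LCD barely changes at all.
```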
  • SlyNine - Thursday, January 8, 2009 - link

    Yes, but LCDs have ghosting; unless you can purge the image completely, the right eye will see a ghost of the left eye's frame. If you have ever looked at stills from LCD ghosting tests, you will see that even on the best LCDs the ghosts last for two frames.

    The best TV I can think of to use this with is the $7,000 laser TV from Mitsubishi.

    Why can't they use dual video cards for this? Have one frame buffer be the left eye and the other be the right; then, even if a card has yet to finish rendering its image, just flip to the last fully rendered frame.
  • Holly - Thursday, January 8, 2009 - link

    I think the result would be quite bad. You could easily end up in a situation where one eye's card runs at 50 FPS while the other eye's card is at 10 FPS (even with the same models... the different camera placement might invalidate a big part of the octree, causing the FPS difference). Not sure how the brain would handle such a difference between the two frames, but I think not well...
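
A rough simulation of the scenario Holly describes, with purely hypothetical frame times and not how 3D Vision actually works: one card per eye, each eye falling back to its last completed frame whenever the new one isn't ready.

```python
# Hypothetical sketch of one-GPU-per-eye rendering with "show the last fully
# rendered frame" as the fallback. The frame times are made up to show how
# the two eyes drift apart when one card runs much slower than the other.

def simulate(display_interval_ms, left_frame_ms, right_frame_ms, refreshes=6):
    for tick in range(refreshes):
        now = tick * display_interval_ms
        # Each eye shows the newest frame its card has finished by this refresh.
        left_frame = int(now // left_frame_ms)
        right_frame = int(now // right_frame_ms)
        print(f"refresh {tick}: left eye -> frame {left_frame}, "
              f"right eye -> frame {right_frame}")

# 120Hz display (~8.3ms per refresh), left card at 50 fps, right card at 10 fps.
simulate(display_interval_ms=8.3, left_frame_ms=20.0, right_frame_ms=100.0)
```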
  • SlyNine - Thursday, January 8, 2009 - link

    You know what, I skimmed everything you wrote, and rereading it I realize the error I made.

    My bad.
