More 3D than 3D: Stereoscopic Defined

Let's start with reality: we live in a world where things occupy a finite volume of space at any given moment in time... Alright, maybe that's not a good way to explain this. Let me try again. Stuff we see in real life has some width, some height and some depth. Our lives in a 3D world, and our two eyes, give us the ability to quickly and easily judge the position and dimensions of objects. 3D video games try to approximate this by drawing a 2D image that carries many of the same "depth cues" we use to judge position and shape in reality.

A 2D image of something can help us perceive some of the depth we would have seen had we stood where the camera stood: stuff that's farther away appears smaller relative to the foreground, and shadows and lighting falling across objects give us a feel for their dimensions. In video, we would also see parallax in effect, making objects closer to the viewer appear to move faster than objects farther away. Our experience tells us to expect certain constants in our reality, and we pick up on those and use them to judge things that look similar to reality. Video games exploit all of these cues to help tell our brains that there is depth in that monitor. Or maybe we're looking at a video of something that was reality. Either way, something major (aside from actual depth) is missing.
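
To make the "farther looks smaller" cue concrete, here's a minimal sketch (plain C++, with made-up numbers) of how a perspective projection shrinks an object's on-screen size in proportion to its distance from the camera:

    #include <cstdio>

    // The "farther looks smaller" depth cue: under a pinhole/perspective
    // projection, projected size falls off linearly with distance.
    int main() {
        const float objectHeight = 2.0f;  // meters (illustrative value)
        const float focalLength  = 1.0f;  // normalized focal length
        for (float distance = 2.0f; distance <= 16.0f; distance *= 2.0f) {
            // projected height = focal length * real height / distance
            float projected = focalLength * objectHeight / distance;
            std::printf("distance %5.1f m -> projected height %.3f\n",
                        distance, projected);
        }
        return 0;
    }

Double the distance and the projected size halves; that consistency is exactly the kind of constant our brains latch onto.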

Though we can judge three dimensions to a certain extent based on depth cues, having two eyes see objects from two slightly different positions is what really tells our brain that something has depth. The combination of these two slightly different images in our brain delivers tons of information about depth. Trying to play catch with one eye is tough. Just ask your neighborhood pirate.

Seeing two different images with your two different eyes, or rather presenting two different images of the same thing from slightly different positions, is what stereoscopic 3D is. It's right there in the word ... ya know ... stereo ... and scopic. Alright, moving on.
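
In rendering terms, that boils down to drawing the same scene twice from two camera positions separated by roughly the spacing of human eyes. Here's a minimal sketch; the eye-separation value is an assumption for illustration, not anything from NVIDIA's actual driver:

    #include <cstdio>

    // Render the same scene twice, with the virtual camera shifted half
    // the eye separation to each side of the viewer's head position.
    struct Vec3 { float x, y, z; };

    // Typical human interpupillary distance is roughly 0.065 m
    // (an assumed value, purely for illustration).
    constexpr float kEyeSeparation = 0.065f;

    Vec3 eyePosition(Vec3 head, bool leftEye) {
        float offset = (leftEye ? -0.5f : 0.5f) * kEyeSeparation;
        return { head.x + offset, head.y, head.z };
    }

    int main() {
        Vec3 head = { 0.0f, 1.7f, 0.0f };
        Vec3 l = eyePosition(head, true);
        Vec3 r = eyePosition(head, false);
        std::printf("left eye  at (%.4f, %.2f, %.2f)\n", l.x, l.y, l.z);
        std::printf("right eye at (%.4f, %.2f, %.2f)\n", r.x, r.y, r.z);
        return 0;
    }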

If you've ever tried looking at those "magic eye" pictures, you know what impact stereoscopic information alone can have. For those who don't know, a magic eye image is a seemingly random-looking pattern that, when viewed with your eyes focused "through" the image, reveals a hidden 3D picture. Though there is absolutely no other depth information in the picture (no lighting or shadows, no perspective projection, nothing but basic shapes that each eye picks up when you focus through the image), the 3D effect is pronounced and looks "deeper" than any 3D game out there.


This is not a sailboat.

Combining stereoscopic information with all the other depth cues makes for a dramatic effect when done properly. Left and right eye images correctly rendered and presented, with proper 3D projection and lighting, simply look real enough to touch. Viewing a game properly rendered for stereoscopic effects can range from feeling like looking at a shoebox diorama or a pop-up book to looking through a window into the next room.
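
As an aside, "proper 3D projection" for stereo usually means an off-axis (asymmetric) frustum for each eye, so that both views converge at the plane of the physical screen and objects at screen depth show zero parallax. The sketch below is the generic textbook construction with illustrative numbers, not anything specific to NVIDIA's driver:

    #include <cstdio>

    // Off-axis (asymmetric) stereo frustum: both eyes share a convergence
    // plane at the physical screen.
    struct Frustum { float left, right, bottom, top, zNear, zFar; };

    Frustum eyeFrustum(float eyeOffset,  // -separation/2 for left eye, +separation/2 for right
                       float halfW, float halfH,  // half-size of the screen (m)
                       float screenDist,          // viewer-to-screen distance (m)
                       float zNear, float zFar) {
        float s = zNear / screenDist;  // scale screen extents onto the near plane
        return { (-halfW - eyeOffset) * s, (halfW - eyeOffset) * s,
                 -halfH * s, halfH * s, zNear, zFar };
    }

    int main() {
        // Illustrative setup: 52cm-wide screen viewed from 60cm away.
        Frustum l = eyeFrustum(-0.0325f, 0.26f, 0.16f, 0.6f, 0.1f, 100.0f);
        std::printf("left-eye frustum: l=%.4f r=%.4f b=%.4f t=%.4f\n",
                    l.left, l.right, l.bottom, l.top);
        return 0;
    }

Note how the horizontal frustum edges come out asymmetric for each eye; simply "toeing in" two symmetric cameras instead introduces vertical parallax, a classic source of stereo eye strain.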

Hollywood tried stereoscopic 3D with anaglyphs (those red and blue images you need the red and blue glasses for), but it never really took off except as a sort of lame gimmick. Back in the late 90s and early this century, we saw the computer industry test the waters with active shutter glasses, which worked quite a bit better. Rather than displaying a single image with both eye views superimposed and requiring filtering, shutter glasses cover one eye while the entire screen displays an image rendered for the other eye. That eye is then covered while the first is uncovered to see its own full-resolution, full-color image. When done right, this produces amazing effects.

There are a couple of catches though. This process needs to happen super fast and super accurately. Anyone who spent (or spends) hours staring at sub-60Hz CRTs knows that slow flicker can cause problems ranging from eye strain to migraines, so we need at least 60Hz for each eye for a passable experience. We also need to make absolutely certain that one eye doesn't see any of the image intended for the other eye. Thus, when building active shutter glasses, a lot of work needs to go into making both lenses able to turn on and off very fast and very accurately, and we need a display that can deliver 120 frames per second in order to achieve 60 for each eye.
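
Conceptually, the presentation loop looks something like the sketch below. The setShutter/renderScene/presentFrame calls are hypothetical stand-ins for the driver, renderer and IR emitter, not any real API:

    #include <cstdint>

    enum class Eye { Left, Right };

    // Hypothetical stand-ins (not a real API):
    void setShutter(Eye open)  { /* emitter: open one lens, blank the other */ }
    void renderScene(Eye eye)  { /* draw from this eye's camera position */ }
    void presentFrame()        { /* block until the next 120Hz vsync */ }

    void frameSequentialLoop() {
        for (uint64_t frame = 0; ; ++frame) {
            Eye eye = (frame % 2 == 0) ? Eye::Left : Eye::Right;
            setShutter(eye);   // this lens clear, the other opaque
            renderScene(eye);
            presentFrame();    // 120 presents per second => 60 per eye
        }
    }

Any slip in that lockstep, a dropped frame or a lens that switches late, either shows one eye the other eye's image or brings the flicker right back.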

Early shutter glasses and applications could run too slowly, delivering the effect with a side of eye strain, and getting really good results required a CRT that could handle 120Hz plus glasses that could match its pace. It also required an application built for stereoscopic viewing, or a sort of wrapper driver that could make the application render two alternating images every frame. Rendering an extra image per "frame" also required the realtime 3D software to be very fast. These and other technical limitations helped keep stereoscopic 3D on the desktop from taking off.

There is still a market today for active shutter glasses and stereoscopic viewing, though there has been something of a lull between the end of CRT production and the availability of 120Hz LCD panels. And while LCDs that can accept and display a 120Hz signal are just starting to hit the market, it's still a little early for a resurgence of the technology. But for those early adopters out there, NVIDIA hopes to be the option of choice. So what's the big deal about NVIDIA's solution? Let's check it out.

Comments

  • fishbits - Thursday, January 8, 2009 - link

    "What we really need, rather than a proprietary solution, is something like stereoscopic support built in to DirectX and OpenGL that developers can tap into very easily."
    Would be nice, but whatever works and is actually implemented.

    Nvidia could come up with a "3D glasses thumbs-up" seal of approval for games that get it right, and it could be displayed on the packaging. This would further encourage developers to get on board. Heck, NV could have traveling demo rigs that sit in a Gamestop/Best Buy for a week, playing games that have superior compliance. Good for sales of the game(s), good for sales of the glasses.

    I've done the old shutter glasses, was a neat novelty, but wears thin as Derek says. Sounds like these are currently only a bit better with current titles in most cases. *IF* they get this right and all major titles released support the system well, I'd buy the glasses right away. The new monitor too. But they have to get it right first.

    This might work for the next generation of consoles too, though probably only if hooked up to a high-refresh display. Great selling point, and another reason to get this right and off the ground. Of course Sony/Nintendo/MS might just make their own solution, but whatever gets the job done. If only one of them had this feature implemented well, it could be a big tie-breaker in winning sales to their camp.
  • Judgepup - Thursday, January 8, 2009 - link

    Been waiting for the next revolution in gaming and after all the bugs have been worked out, this looks like it could be a contender. I'm typically an early adopter, but I'm fairly happy with a physical reality LCD at this point. Will wait in the wings on this one, but I applaud the Mighty nVidia for taking this tech to the next level.
  • Holly - Thursday, January 8, 2009 - link

    Although I am a great supporter of the 3D-ing of virtual worlds, there are really huge drawbacks in this technology nVidia presented.

    First of all, the reason why LCDs did not need to keep as high a refresh rate as CRTs is that an LCD's intensity doesn't go from wall to wall, 100% intensity to 0% intensity, before another refresh (the way a CRT's does). This intensity fluctuation is what hurts our eyes. LCDs keep their intensity much more stable (some say their intensity is totally stable, though I have seen some text describing a minor intensity falloff with LCDs as well; I can't find it now). Back on topic... we either went 100Hz+ or LCD to save our eyes.

    Even if we ignore software-related problems there is still a problem... The flickering is back. Even if the screen picture is intensity-stable, these shutter glasses make the intensity each eye sees swing between 0% and 100%, and we are back to the days of old 14" screens and a good way to end up with a white cane sooner or later. Even if we have 120Hz LCDs, each eye has to go with 60Hz, pretty much the same as old CRTs. This just won't work. For longer use (gaming etc.) you really need 85Hz+ so the flickering doesn't damage your eyes.

    Another point I am curious about is how GeForce 3D Vision accounts for screen latency. It's not that long since AT presented a review of a few screens with some minor complaints about S-PVA latency running all the way up to around 40ms. The thing is, this latency could very easily cause the frame that was intended for the left eye to be received by the right eye and vice versa. I can imagine Nausea Superfast(TM) out of that (the kind of effect you get when you drink too much and those stupid eyes just can't both focus on one thing).

    I believe this stereoscopy has a future, but I don't believe it will be with shutter glasses or any other method that switches between a 'seeing' eye and a 'blinded' eye.
  • PrinceGaz - Thursday, January 8, 2009 - link

    The answer is simple: move from LCD technology to something faster, like OLED or SED (whatever happened to SED?).

    Both of those technologies are quite capable of providing a true 200Hz refresh that genuinely changes the display every time (not just shifts the colour a bit towards something else). A 200Hz display refresh (and therefore 100Hz per eye) should be fast enough for almost anyone, and most people won't have a problem with a 170Hz display (85Hz flicker per eye).

    I do think 120Hz split between two eyes would quite quickly give me a serious headache; back when I used a CRT monitor and had to look at the 60Hz refresh before installing the graphics-card driver, it was seriously annoying.
  • Rindis - Friday, January 9, 2009 - link

    "A 200hz display refresh (and therefore 100hz per eye refresh) should be fast enough for almost anyone, and most people won't have a problem with 170hz display (85hz flickering per eye)."

    "Almost" is the key word here. I'm okay with 75Hz CRTs unless I'm staring at a blank white screen (Word), and by 85Hz I'm perfectly fine.

    However, my roommate trained as a classical animator (which means hours of flipping through almost identical drawings) and could perceive CRT refresh rates up to about 115Hz. (She needed expensive monitors and graphics cards....) That would demand a 230+Hz rate for this technology.
  • paydirt - Friday, January 9, 2009 - link

    It's strange. I definitely notice when a CRT has a 60Hz refresh rate, yet I have been gaming on a 29" LCD with a 60Hz refresh rate for about 4 years now and don't notice it at all.
  • DerekWilson - Friday, January 9, 2009 - link

    That's because the CRT blanks while the LCD stays on. With an LCD panel, every refresh changes each pixel's color from what it was to what it is. With a CRT, by the time the electron gun comes around again every 60th of a second, the phosphor has dimmed even if it hasn't gone completely dark. 60Hz on a CRT "flashes," while 60Hz on an LCD only indicates how many times per second the color of each pixel is updated.
  • SlyNine - Thursday, January 8, 2009 - link

    Yes, but LCDs have ghosting; unless you can purge the image completely, the right eye will see a ghost of the left eye's frame. If you have ever looked at stills from LCD ghosting tests, you will see that even the best LCDs' ghosts last for two frames.

    The best TV I can think of to use this with is the $7,000 Laser TV from Mitsubishi.

    Why can't they use dual video cards for this? Have one frame buffer be the left eye and the other the right; then, even if the card has yet to finish rendering the other image, just flip to the last fully rendered frame.
  • Holly - Thursday, January 8, 2009 - link

    I think the result would be quite bad. You could easily end up in a situation where one eye's card runs at 50 FPS while the other eye's card is at 10 FPS (even with the same models... the different placement of the camera might invalidate a big part of the octree, causing the FPS difference). Not sure how the brain would handle such a difference between the two frames, but I think not well...
  • SlyNine - Thursday, January 8, 2009 - link

    You know what, I skimmed everything you wrote, and rereading it I realize the error I made.

    My bad.
