Legacy Titles, 'Good' Games: Work with 3D, but Hardly Flawless

The most common problem in older titles was crosshairs rendered at screen depth. This stuck out most in Battlefield: Bad Company 2 and, to a lesser extent, in games running Valve's Source engine.


This crosshair in Day of Defeat: Source sometimes renders at screen depth, other times deep in 3D

To be fair, NVIDIA offers a 3D Laser Sight for certain games, which replaces the distracting screen-depth crosshair.

In general, it's distracting when things render at screen depth. A good example is how in games like TF2 or DOD:S, kill notifications in the top right, chat text, weapon selection, and player names all render in 2D at screen depth. You can get used to it, but it looks out of place. There's also the occasional bit of content that simply isn't rendered in 3D at all.
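The mechanism behind this is straightforward: the stereo driver creates depth by shifting everything horizontally in opposite directions for each eye, with the shift growing as objects sit farther from the convergence point. Flat overlays like crosshairs and HUDs skip the 3D pipeline entirely, get zero shift, and so stay glued to the screen plane. Here's a minimal sketch of the idea - the formula matches what NVIDIA has described for 3D Vision Automatic, but the parameter values are mine, purely for illustration:

```python
# A minimal sketch (my own illustration, not driver code) of the
# separation/convergence formula NVIDIA has described for 3D Vision
# Automatic; the parameter values here are made up.
def stereo_shift(w, separation=0.15, convergence=4.0):
    """Horizontal offset for one eye; the other eye gets -shift."""
    return separation * (1.0 - convergence / w)

# A crosshair or HUD element drawn as a flat 2D overlay never gets a
# clip-space depth, so it receives no shift and sits at screen depth,
# while the wall it should sit on gets a real, nonzero offset.
for w in (1.0, 4.0, 50.0):
    print(f"depth w = {w:5.1f} -> shift = {stereo_shift(w):+.3f}")
```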

When you fire up a title, NVIDIA gives you an overlay in the bottom right with information about the game's compatibility, along with which settings should be enabled or disabled for optimal 3D quality. The interesting thing about 3D is that you can really get a feel for when game engines are doing hacky things like rendering HDR at screen depth instead of in 3D - all these little flaws show up in 3D mode on older titles.

The other problem is simple - hold a weapon up to a wall, and you'll get the perception that your gun is sinking into the wall, even though the wall is clearly closer to you. This is the age-old clipping problem rearing its ugly head, now in 3D. It's difficult to describe, but weapons appear to dive into materials they shouldn't be able to penetrate. In 2D this was never much of a problem, but the result looks distinctly off in 3D.
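Part of the reason this persists: the classic 2D-era fix is to render the first-person viewmodel into a squashed slice of the depth buffer so it always draws on top of the world, and that hack falls apart in stereo, where the weapon's apparent depth comes from per-eye parallax rather than draw order. A minimal sketch of that trick, assuming PyOpenGL (illustrative only, not what any particular engine actually does):

```python
# Hypothetical sketch of the classic viewmodel depth trick: draw the
# first-person weapon into a compressed slice of the depth buffer so it
# always wins the depth test and never visibly pokes through walls.
# In stereo 3D the trick breaks down, because the weapon still gets
# real per-eye parallax and the interpenetration becomes visible again.
from OpenGL.GL import glDepthRange  # pip install PyOpenGL

def draw_frame(draw_world, draw_viewmodel):
    glDepthRange(0.0, 1.0)    # normal depth range for the scene
    draw_world()
    glDepthRange(0.0, 0.1)    # squash the viewmodel into the nearest 10%
    draw_viewmodel()          # of depth so it can never lose to a wall
    glDepthRange(0.0, 1.0)    # restore for post/HUD passes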


See how the MG42 goes into the wood? It's even weirder when it appears to have depth.


The reality is that Excellent- and Good-rated titles work very well, but they simply weren't designed for 3D. The result is that while the game content is 3D and looks beautiful, things like menus, crosshairs, and HUD information at the edges make the experience a bit jarring.

NVIDIA 3D Vision Ready: Play These in 3D

But what about NVIDIA's 3D Vision Ready titles - new games designed for 3D? I decided to try a little experiment: play Metro 2033 through, beginning to end, entirely in 3D, then repeat the playthrough in 2D and compare.

I want a 120Hz 3D monitor and kit of my own now.

The difference in the 3D experience here compared to 'good' or even 'excellent' titles is mind-blowing. What makes Metro 2033 a particularly good example is that everything is 3D, right from the initial menu screen. There isn't a jarring divide between content that obviously was never intended to be viewed in 3D and the rest of the game - in Metro 2033 it all just works.


Metro 2033 in 3D: Everything is 3D in this menu

Things like dust are entirely volumetric, not just 2D speckles. There's depth and detail on weapons, objects, and textures. The game is immersive in a completely different way - it's difficult to explain just how much more engaging it feels in 3D compared to 2D. Suffice it to say that things like the very final level, where you're running on floating platforms or through a maze away from the dark ones, or up in Ostankino tower, are amazingly different and trippy. Honestly, Metro 2033 in 3D is close to, if not outright, NVIDIA 3D Vision's killer app. 3D Vision does exact a considerable price in framerate, though, since the scene is rendered once per eye. With Metro 2033 I settled on 1680x1050 at High with the DX10 codepath to get playable framerates; pushing the GTX 470 much harder dropped FPS too far.

The only caveats that remain are the same as with other shutter-glasses systems - you lose some brightness, and you've got to wear glasses, which is annoying if you already wear glasses to correct your vision.


Losing brightness through the shutter glasses

As I’ll show in a second, the VG236 is indeed a very bright display, but you really need every last nit to keep things lit up when in 3D mode. Just by nature of using shutter glasses, everything gets dimmer. I wouldn’t say it was a problem when playing 3D games, but I did increase gamma in Metro 2033 just a bit because it’s such a dark game most of the time.
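Some rough arithmetic makes clear why every nit matters. Each eye's shutter is open at most half of the time, and liquid crystal shutters absorb a good chunk of light even when open. The numbers below are assumptions for illustration, not measurements:

```python
# Rough, illustrative arithmetic - assumed values, not measurements.
panel_nits = 400            # the VG236H's claimed peak brightness
duty_cycle = 0.5            # each eye's shutter is open at most half the time
open_transmission = 0.4     # LC shutters absorb light even when open (assumed)

per_eye = panel_nits * duty_cycle * open_transmission
print(f"Perceived brightness per eye: ~{per_eye:.0f} nits")  # ~80 nits
```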

For me, the bottom line is this: virtually every game benefits from a 120Hz panel, because tearing doesn't become readily visible until your framerate climbs past 120 FPS. In older games where you're well over 100 FPS even with everything maxed, it's nice to actually see some of those frames - games genuinely do appear smoother to me. For 3D content, 3D Vision Ready titles are a whole different level of immersion compared to older titles that, while they work, have distracting 2D elements. With time, hopefully more games will be developed with 3D in mind and lose the distracting bits that otherwise diminish the experience.
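The arithmetic behind the 120Hz argument is simple enough to sketch:

```python
# Illustrative arithmetic behind the 120Hz argument.
for hz in (60, 120):
    print(f"{hz:3d} Hz panel: a refresh slot every {1000 / hz:.1f} ms")
# A game rendering 100 fps produces a frame every 10 ms. A 60 Hz panel
# (16.7 ms slots) can display at most 60 of them each second; a 120 Hz
# panel (8.3 ms slots) has a slot for every one.
```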
 

Comments

  • B3an - Sunday, August 8, 2010 - link

    Is 120Hz possible on a 2560x1600 monitor? That res is the highest a dual-link DVI cable can handle, and I'm not sure if the latest DisplayPort or HDMI specs have enough bandwidth for 120Hz at this res. Anyone know?
  • mac2j - Sunday, August 8, 2010 - link

    Is 120Hz possible on a 2560x1600?

    Yes. And personally I agree that would be my dream also ... although we're probably talking ~$2000. The most likely place to look would be Dell's next revision of the 2008WFP.

    The other consideration is you'd need a serious graphics card to drive 3D at that resolution with that framerate .... really with the current offerings you're probably looking at needing the top 1 or 2 models in SLI for good performance.

    I have a rudimentary understanding of where monitors excel in relationship to TVs in this area but can anyone tell me what kind of performance/picture you could expect using one of the new 240Hz 3D TVs as a monitor?
  • mac2j - Sunday, August 8, 2010 - link

    Ugh, typo - I meant 3008WFP ... need edit button...
  • DarkUltra - Sunday, August 8, 2010 - link

    No, you need DisplayPort to get 2560x1600 at 120Hz. Dual-link DVI maxes out at about 1300p 120Hz. If you have such a high resolution and lack the 3D performance, why not run games at half, say 1280x800? Fonts in Windows look real nice in high DPI on my CRT (1530p, 134dpi)
  • mac2j - Monday, August 9, 2010 - link

    OK here's the breakdown as far as I can tell:

    Regular DVI & HDMI <1.3 max out at 1920x1200x60Hz.

    Dual-link DVI maxes out at 1920x1200x120Hz

    HDMI 1.3 & DisplayPort 1.0 max out at 1680x1050x240Hz

    1920x1200x140Hz or 2560x1600x120Hz would require DisplayPort 1.2 or HDMI B (which may become 1.5)

    Nothing I've heard of can handle 2560x1600x240Hz as far as I know (it would require ~24 Gbit/s capacity)
  • mac2j - Monday, August 9, 2010 - link

    It's worth mentioning that as far as I know the first commercial cards to support DisplayPort 1.2 will be the ATI 6000 series late this year, but I could be wrong.
  • B3an - Monday, August 9, 2010 - link

    I've got two 5870s and they run pretty much everything at 2560x1600 no problem, even with AA + AF... not really anything these days that stresses cards like games used to, too much console-port crap. Also had a single GTX 480 and that could get way over 60 FPS at this res with 98% of games.

    So after looking into it... mac2j is right, DisplayPort 1.2 should definitely be able to do 2560x1600 @ 120Hz.

    Just hope the 3008 replacement can do 120Hz, but I highly doubt it will, these monitors are not really for gamers, even though it would benefit other things too...
  • ralgha2001 - Monday, July 11, 2011 - link

    I know this may be a dumb question.. but could I use a Samsung 40" Full HD TV (S-PVA, supposedly 4 ms response) @ 120Hz for gaming and get 1920x1080 @ 120Hz (it has HDMI and RGB inputs)? I don't much care about 3D, but I'd like to know if I could game on this and skip buying a new monitor for now.

    I'm actually building a new rig from the ground up. I'm thinking of NVIDIA's GTX 580 and a motherboard for the Intel 1155 socket (maybe with the 2500K to save some bucks over the 2600K). Detailing the other components probably isn't needed, so I'm stopping here.

    But I'm not sure if I should go for the AW2310 or another current monitor, since I might still be able to use my HDTV and save a bunch. It doesn't seem like an option anyone considers, since nobody seems to mention it and everyone is still stuck on monitors @ 60Hz.

    Thanks!
  • TareX - Sunday, August 8, 2010 - link

    Shutter-glasses systems are not the future.

    The future is autostereoscopic lenticular-lens 3D screens.
  • bill4 - Sunday, August 8, 2010 - link

    I think you guys play fast and loose with the input lag tests. It's great that you do them though, don't get me wrong.

    For one thing, as far as I know, turning almost any processing on only increases lag. For that reason I'm rather doubtful that turning overdrive on reduced lag. I mean, think about it: if the display has to process the image in any way, you're adding lag.

    Next you mention some monitor that you claim has no scaler and no lag. Well, again, as far as I know ANY LCD display inherently has lag. So again I'm rather dubious.

    In total, it reads like you wanted this monitor to have low lag, because you liked it so much, so you sort of brushed aside evidence otherwise.

    I don't understand how in the same article you run a test apparently showing it to have 14 ms lag, then later claim it has 3.9 ms by comparing it with some third monitor. It just doesn't make sense, and is confusing at the least. Which test do you consider definitive? And if this third LCD has no lag, why didn't you test it versus a CRT? Simply having no scaler is not proof it has no lag.

    I mention this because in the HDTV lag thread at AVS forums, it's a generally accepted tenet that 120Hz displays have more lag than 60Hz ones. That's why I would expect this 120Hz display to have relatively more lag, as the first test seemed to hint at.

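As a closing footnote to the bandwidth discussion above (mac2j's breakdown in particular), here is a quick back-of-the-envelope check. This is my own arithmetic under the stated assumptions - uncompressed 24-bit RGB with blanking overhead ignored - not vendor figures:

```python
# Sanity check of the interface bandwidth figures quoted in the
# comments. Assumes uncompressed 24-bit RGB and ignores blanking
# overhead, so real link requirements run roughly 20-25% higher.
def gbit_per_s(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for w, h, hz in [(1920, 1200, 120), (2560, 1600, 120), (2560, 1600, 240)]:
    print(f"{w}x{h} @ {hz} Hz: {gbit_per_s(w, h, hz):5.1f} Gbit/s")
# ~6.6, ~11.8, and ~23.6 Gbit/s respectively. For comparison, dual-link
# DVI carries about 7.9 Gbit/s of pixel data and DisplayPort 1.2 about
# 17.3 Gbit/s, which lines up with the limits quoted above.
```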