Legacy Titles, 'Good' Games: Work with 3D, but Hardly Flawless

The most common problem in older titles was that crosshairs rendered at screen depth. This really stuck out in Battlefield: Bad Company 2 and, to a lesser extent, in games running Valve’s Source engine.


The crosshair in Day of Defeat: Source renders at screen depth some of the time, and in deep 3D at others

To be fair, NVIDIA offers the ability to enable a 3D Laser Sight in certain games, which replaces the distracting screen-depth crosshair.

In general, it’s distracting when things are rendered at screen depth. A good example is how in games like TF2 or DOD:S, kill notifications in the top right, chat text, weapon selection, and player names all render in 2D at screen depth. You can get used to it, but it looks out of place. There’s also the occasional bit of content that simply isn’t rendered in 3D at all.
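If you’re wondering why a HUD element sits at ‘screen depth’ in the first place, it falls out of the stereo projection math. Below is a minimal sketch of the parallax formula as the stereo community usually writes it - the separation and convergence values here are illustrative assumptions, not the driver’s actual defaults:

```c
#include <stdio.h>

/* Horizontal screen-space parallax for a point at eye-space depth w:
   parallax = separation * (1 - convergence / w)
   This is the formula commonly attributed to NVIDIA's stereo driver;
   treat the constants below as assumed example values. */
static float stereo_parallax(float w, float separation, float convergence)
{
    return separation * (1.0f - convergence / w);
}

int main(void)
{
    const float separation  = 0.05f; /* assumed eye-separation scale  */
    const float convergence = 5.0f;  /* assumed screen-plane distance */

    /* A HUD quad drawn at w == convergence gets zero parallax, so both
       eyes see it in the same place: exactly at screen depth. */
    printf("HUD at w=5:   parallax = %+.4f\n",
           stereo_parallax(5.0f, separation, convergence));

    /* World geometry behind the screen plane separates between the
       eyes, which is what gives the scene its depth. */
    printf("Wall at w=50: parallax = %+.4f\n",
           stereo_parallax(50.0f, separation, convergence));
    return 0;
}
```

Anything drawn at the convergence plane collapses onto the pane of the monitor, which is exactly where 2D HUD elements end up.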

When you fire up a title, NVIDIA gives you an overlay in the bottom right with information about the game’s compatibility, as well as what settings should be enabled or disabled for optimal 3D quality. The interesting bit about 3D is that you can really get a feel for when game engines are doing hackety things, like rendering HDR effects at screen depth instead of in 3D - all these little flaws show up in 3D mode on older titles.

The other problem is simple - hold a weapon up to a wall, and you’ll get the perception that your gun is going into the wall, even though the weapon is actually closer to you. This is that age-old clipping problem rearing its ugly head, now in 3D. It’s difficult to describe, but weapons will appear to dive into materials they couldn’t possibly penetrate. In 2D this was never really a problem, but the result just looks off in 3D.


See how the MG42 goes into the wood? It's even weirder when it appears to have depth.
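Part of why the gun can interpenetrate a wall at all without being culled is that many Quake-lineage engines deliberately draw the first-person weapon with a compressed depth range so it always wins the depth test. Here’s a rough sketch of that classic trick - draw_world() and draw_view_model() are hypothetical placeholders, not any engine’s real functions:

```c
#include <GL/gl.h>

static void draw_world(void)      { /* level geometry (omitted)      */ }
static void draw_view_model(void) { /* first-person weapon (omitted) */ }

void render_frame(void)
{
    glDepthRange(0.0, 1.0);  /* the world uses the full depth range */
    draw_world();

    /* Squash the weapon into the nearest slice of the depth buffer so
       it always passes the depth test, even when its polygons really do
       pass through a wall. In 2D this hides the clipping; in stereo 3D
       the parallax cue reveals where the geometry actually is, so the
       weapon appears to dive into the surface. */
    glDepthRange(0.0, 0.3);
    draw_view_model();

    glDepthRange(0.0, 1.0);  /* restore for the next frame */
}
```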


The reality is that Excellent and Good rated titles work very well, but just haven’t been designed for 3D. The result is that while the game content is 3D and looks beautiful, things like menus, crosshairs, and status information at the edges make the experience a bit jarring.

NVIDIA 3D Vision Ready: Play These in 3D

But what about NVIDIA’s 3D Vision Ready titles - new games designed for 3D? I decided to try a little experiment: I would play Metro 2033 through, beginning to end, entirely in 3D, then repeat the same playthrough in 2D and see what I thought.

I want a 120Hz 3D monitor and kit of my own now.

The difference in the 3D experience here compared to ‘good’ or even ‘excellent’ titles is mind-blowing. What makes Metro 2033 a particularly good example is that everything - starting with the initial menu screen - is rendered in 3D. There isn’t a jarring difference between content that obviously was never intended to be viewed in 3D and the rest of the game - in Metro 2033 it just works.


Metro 2033 in 3D: Everything is 3D in this menu

Things like dust are entirely volumetric, not just 2D speckles. There’s depth and detail on weapons, objects, and textures. The game is completely immersive in a different way, and it’s difficult to explain just how much more engaging it feels in 3D compared to 2D. Suffice it to say that things like the very final level - where you’re running on floating platforms, or through a maze away from the dark ones, or up in Ostankino Tower - are amazingly different and trippy. Honestly, Metro 2033 in 3D is close to, if not entirely, NVIDIA 3D Vision’s killer app.

3D Vision does exact a considerable price on framerate, though. With Metro 2033 I settled on 1680x1050 at High with the DX10 codepath to get playable framerates; pushing the GTX 470 much further reduced FPS too much.

The only caveat that remains is the same one found in other 3D systems - you do lose some brightness, and you’ve got to wear glasses, which is annoying if you already wear glasses to correct your vision.


Losing brightness through the shutter glasses

As I’ll show in a second, the VG236H is indeed a very bright display, but you really need every last nit to keep things lit up in 3D mode. Just by the nature of using shutter glasses, everything gets dimmer. I wouldn’t say it was a problem when playing 3D games, but I did increase gamma in Metro 2033 just a bit because it’s such a dark game most of the time.
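To put rough numbers on the dimming - these are back-of-the-envelope assumptions, not measurements of the 3D Vision glasses - each eye’s shutter is open at most half the time, and the LC shutter absorbs some light even when it is open:

```c
#include <stdio.h>

int main(void)
{
    const double panel_nits   = 400.0; /* a bright TN panel, roughly VG236H-class */
    const double duty_cycle   = 0.5;   /* each eye sees at most half the frames   */
    const double transmission = 0.7;   /* assumed open-shutter transmission       */

    /* Perceived brightness per eye drops to a fraction of panel output. */
    printf("~%.0f nits per eye, down from %.0f\n",
           panel_nits * duty_cycle * transmission, panel_nits);
    return 0;
}
```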

For me, the bottom line is this: virtually every game benefits from a 120Hz panel, because tearing doesn’t become readily visible until your framerate climbs past 120 fps. In older games where, even with everything maxed, you’re well over 100 fps, it’s nice to actually see some of those frames; to that extent, games do appear visually smoother to me. For 3D content, 3D Vision Ready titles are a whole different level of immersion compared to older titles that - while they do work - have distracting 2D elements. With time, hopefully more games will be developed with 3D in mind and lose the distracting bits that otherwise diminish the experience.
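Coming back to the refresh rate point, the underlying arithmetic is simple. Here’s a quick sketch, where the 110 fps figure is just an assumed example of an older title running maxed out:

```c
#include <stdio.h>

int main(void)
{
    const double refresh_rates[] = { 60.0, 120.0 };
    const double game_fps = 110.0; /* assumed: an older game, maxed settings */

    for (int i = 0; i < 2; i++) {
        double hz = refresh_rates[i];
        /* The panel caps how many distinct frames can ever reach your eyes. */
        double shown = game_fps < hz ? game_fps : hz;
        printf("%3.0f Hz: %.1f ms per refresh, ~%.0f of %.0f rendered fps visible\n",
               hz, 1000.0 / hz, shown, game_fps);
    }
    return 0;
}
```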
 

Comments

  • DarkUltra - Thursday, August 19, 2010

    PS: I use http://www.lagom.nl/lcd-test/black.php and http://www.lagom.nl/lcd-test/white.php, and a few new 23" 1080p LCDs we have at work are really bad at these tests. My dad's BenQ G2400W is good, so TN can perform, and I hope the LG W2363D is similar.
  • AllenP - Wednesday, September 8, 2010

    Hey, I had a question about this statement:

    "It just so happens that it’s pulling one frame behind, which on average worked out to a lag of 1.9 ms."

    Sorry, I don't quite understand where those numbers are coming from... what do you mean by "one frame"? One frame is 8.3ms at 120 Hz or 16.7ms at 60Hz... To be honest, I guess it really doesn't matter: the graphics card ends up being the one that decides how much latency exists between outputs, especially when it's working with two signals at different refresh rates.

    I would recommend using a simpler method of testing that always tries to get the exact same frame, refresh rate, and resolution to both monitors after it leaves the graphics card through one DVI-I port... This will eliminate all the confusion about how much latency the graphics card has when reading from the frame buffer to two different DVI ports. You'll need a really good CRT for that kind of test (one that can support like 1080P+ at 120Hz+), but I'm sure they exist. This way you can just split the DVI-I to a DVI and VGA using a passive component like this: http://sewelldirect.com/gefen-dvi-dvi-and-vga.asp -- Then the only element introducing discrepancies would be the DAC inside your graphics card that is used on the DVI-I (which I would assume happens /after/ the frame is read from the frame buffer with most graphics cards that support this type of simultaneous output).
  • v12v12 - Tuesday, September 14, 2010

    You know, when I first read the review of this, I thought to myself: hrmmm, could I too be "wrong" about this whole 3D-imagery nonsense... Am I missing out on something really good because I'm biased towards CRTs? Thankfully the answer is NO! I'm not missing anything lol!

    I love my habit of always reading through the comments section; you REALLY can segregate the meat from the fat when you go through pages of comments. Comments with BRUTAL honesty and usually spot on; what you don't find in Klug's fanboy-hyped review. Seriously I started questioning my judgment(s) about 3D, 120hz, and (omg) TN-panels, LMFAO! TN—really (?!), for all this money and supposed advancement?!
    __Thanks to the numerous comments about these and many more "overlooked" features/technological implements, I've slammed the gavel on this ridiculous review; GUILTY! This monitor is nothing more than old technology, souped-up with some racing stripe, coffee-can exhaust like "advances," repackaged for the sheepish plebs. Yep I said Plebs; the sheep that will do EXACTLY what the OP has suggested; go out and get this monitor right now! ORLY? Then you read the comments and see all the major flaws of this ricer "technology..." I've been saying this for YEARS when I saw LCDs starting to take a market foot-hold; high-tech MARKETING is selling low-end technology as STANDARD FOR US ALL! WTF?
    Haha, I love having a marketing background; makes seeing all these smoke and mirror, Vegas light and dazzle shows so much easier... But nope, plenty of sheep out there that will be saving up or going out right now to support this con-artist marketing of low-tech "advancement(s)," which hurts the real technological enthusiast or just simply someone that knows the TRUTH about how most folks are being duped on the daily, which sets market precedent for EVERYONE to get with, or pay much more for what we all should demand!

    TN? Glossy? 1/2 arse 3D? No game port? Lack of real lag testing? 16:9???! PRICE? Yeah MARKETING folks...

    I'll PASS!
  • Zoeff - Friday, September 17, 2010

    Brian, what would be your recommended settings for this monitor when using it for both playing games and some photo editing?

    Thanks!
  • Deusfaux - Monday, October 4, 2010

    a 16:10 monitor won't allow you to better see/hunt down enemies in a videogame, as you imply, with its increased height.

    Almost all widescreen-compatible games are Horizontal+, and the vertical FOV is constant. You're actually seeing less with a narrower 16:10 display than with a wider 16:9 one.
  • NiteTortoise - Wednesday, October 6, 2010

    Hey Brian Klug -

    I think you have the wrong model # for the display without the glasses. You list it as VG246HE, but it's actually VG236HE per Asus's website: http://usa.asus.com/product.aspx?P_ID=RiEoeerrSbel...

    I spent a couple hours trying to find the display without the glasses, and I'm sure others have been in the same situation!

    Thanks!
  • snuuggles - Monday, November 22, 2010

    I've been gaming on an older 32" 720p lcd tv for about 3 years. In a pinch I can use it as a monitor for work or browsing email, but of course the resolution is a bit low for that stuff...

    I'd really like to replace it with a higher resolution monitor, but I don't want to go too much smaller, and I'm *very* sensitive to input lag/low fps/ghosting/motion issues.

    Basically for gaming, the things that matter (to me) are:

    - input lag
    - size of screen
    - native resolution that is a good compromise between sharp/useful-for-work and not-a-frame-rate-killer (hellooooo 1080p, booooo 2560x1600)
    - minimal ghosting/trailing/whatever motion artifacts

    not important:
    - viewing angle (it's just me!)
    - color reproduction (I don't edit photos)
    - energy consumption (unless it starts costing dollars per day, PC gaming is just going to cost real money, and my electric bill is just not an issue compared to all the other costs, even if it *doubles*)

    Seems like a large-format (27"+), 1080p TN+Film 120hz monitor for 500-600 would be something I should expect to be able to buy. Why is there no such thing? For *any* price?!?!

    Can anyone say *for sure* that any of the new 3d TVs actually accept 120hz input? In other words, I know at least some (most? all?) of them use a funny "frame packing" method to get the two frames to the tv, basically using the same 60hz input with a funny resolution that the tv then just splits and displays one after the other. Are there *any* 3d tvs that actually accept a true 120hz 1080p input that I could use as a large monitor? Anything in the next year?

    Should I just say screw it and get the zr30w and game at 1280x800 and work at full rez? There's no way I'm dropping 600 every year just to have a video card that can play the latest games at full rez.

    Seems really really lame that the only two 27" 120hz monitors I can find listed anywhere on google have no release date. What is going on, should I just wait, or is there some reason that the things I care about just don't seem to be something people want to deliver a product for?
  • snuuggles - Monday, December 6, 2010

    Anyone (Brian) looking at this thread anymore? I know I'm basically posting on a dead article, but there's no 'display' section of the forums, and I haven't seen the answer to my questions anywhere (here, TFT reviews, avs forums)!

    My questions consolidated for convenience:

    - When are larger-format (27"+) 120hz input monitors coming out? Is there some reason they aren't out already?

    - Assuming the above is going to take a while (6+ months), are there any TVs out there now that will take a 120hz input and display it at 120hz?

    - Assuming there are no tvs currently available that can do that, can anyone say with confidence that 2011 will bring some (given that they'll all have HDMI 1.4, which supposedly would support 1080p@120hz input)? Or will they continue to insist on the ridiculous "frame packing" bs, and not even allow 120hz input?

    It just seems strange that I can't get what I want, which is a large (27-42"), low lag, 120hz input display at 1080p.

    Thanks again! Sorry if I'm posting in an inappropriate place.
  • Onslaught2k3 - Saturday, December 4, 2010

    Every single display won't have everything you need. People bashing 120hz calling it a gimmick are just silly. Gamers spend hundreds finding the best mouse around that'll move at 5000+ DPI. After buying this monitor, going over 2500 DPI on a laser gaming-grade mouse is unnecessary because the 120hz display effectively doubles the DPI @ 60Hz. You can use a cheaper mouse with this monitor and save money on what you would otherwise be spending on a freaking peripheral! That on its own is pretty good. I have a Samsung T260 paired with this monitor (I paid MORE for the T260 back in May of 2008 than I did for the VG236HE - the one without the NVIDIA 3D kit, since I use an AMD graphics card) and the difference is plain stunning. Since I focus on what's on screen and not on my reflection, I find that the glossy screen resembles a CRT with regards to colour reproduction. But as people have said here, I know for a fact that CRTs are far better than LCDs in almost every regard, picture- and colour-wise. My next monitor purchase will most likely be a 120hz IPS panel around 27". That probably won't happen for another 3-5 years.
  • MLSCrow - Monday, December 6, 2010

    The title of the article is "...120Hz is the future", but I think what you really should say is "...240Hz is the future".

    Reason(s) being:

    In the introduction of your article, you wrote, "I spent the first half hour seriously just dragging windows back and forth across the desktop - from a 120Hz display to a 60Hz, stunned at how smooth and different 120Hz was. Yeah, it’s that different."

    With that said, I agree: 120Hz is amazing in comparison to 60Hz. I've been noticing the difference ever since I tried noticing the difference, back in the CRT days. However, once you activate 3D mode, you cut that value in half for each eye. So, as you said, 120Hz becomes 60Hz per eye in 3D mode. To have the same smoothness during 3D mode that you saw at 120Hz before it, you would need a 240Hz display, and considering that 240Hz 3D-capable displays (120Hz per eye in 3D mode) are currently available, I'm sure you'd agree that it really is 240Hz that is the future.

    Cheers.

    -Fan since genesis.
