BenQ says that the maximum light output of the XL2720T is 300 nits, but the most I could record is 252 nits without totally crushing whites. It might be possible to set the contrast to 100 and record a higher number, but with that setting the top white shades, from around 230 to 255, all blend into a single shade. No one would ever use a display this way, so measuring it that way would be impractical. The lowest white level I could get was 93 nits, which helps to explain the contrast ratio issue on the sRGB test. If we are targeting 80 nits and the lowest level the display can natively do is 93 nits, then the video card LUT has to be lowered to bring that brightness down. Any time you lower the peak of the LUT, you start losing contrast range, which is why we try to get the monitor set as close as possible before calibration, or use DDC to have it adjusted automatically.
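As a rough sketch of why a LUT-based brightness reduction costs contrast (the numbers come from the measurements above, but the math is illustrative, not the review's actual calibration procedure): the LUT scales the white point down, while the panel's black level stays where the backlight puts it.

```python
# Illustrative sketch: scaling the video card LUT to hit an 80-nit
# target when the panel's native minimum white is 93 nits.
native_white = 93.0                # nits, lowest the backlight can go
native_black = 93.0 / 820.0        # nits, implied by the ~820:1 native ratio
target_white = 80.0                # nits, our calibration target

# The LUT pulls the white point down, but the panel's black level is
# fixed by the backlight/LCD, so the contrast range shrinks:
lut_scale = target_white / native_white
effective_contrast = target_white / native_black

print(f"LUT peak scaled to {lut_scale:.0%}")            # ~86%
print(f"Effective contrast: {effective_contrast:.0f}:1")  # ~705:1
```

In practice the black level through the LUT doesn't change at all (a zero input is still a zero input), which is why the lost ratio comes entirely out of the white end.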

White Level - XR Pro, Xrite i1D2 and XR i1DPro

The black level was fine on the XL2720T, though not exceptional as TN isn’t known for being great at blocking out light. It isn’t a bad number, but black level is really all about contrast ratio and on its own doesn’t mean much unless we also know the white value.

Black Level - XR Pro, Xrite i1D2 and XR i1DPro

The contrast ratio on the BenQ comes out at right around 820:1 at both the maximum and minimum light levels. The lower number we saw on the sRGB calibration was due to targeting a light output level below what the monitor can do natively. As long as you aren't trying to go below the 93 nits minimum I saw in the white level testing, you will get a contrast ratio right around 820:1 from the XL2720T.

Contrast Ratio - XR Pro, Xrite i1D2 and XR i1DPro

The BenQ XL2720T also does well on power usage, consuming very little at either maximum or minimum backlight levels. Other than the Acer touchscreen model, the BenQ is the most efficient monitor for which I have calculated the candelas-per-watt number, which takes into account screen size, power use, and light output to normalize the data. I have a feeling the BenQ benefits from having a lower-resolution panel than most of the 27” monitors I’ve tested, but with the small number of data points I have so far, that is conjecture at the moment. Whatever the cause, the power use of the BenQ is very low.
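The normalization described above can be sketched as follows. The formula (luminance in nits converted to total candelas via screen area, then divided by wall power) and the 35 W wattage figure are assumptions for illustration, not the review's published method or measurement.

```python
import math

def screen_area_m2(diagonal_in, aspect_w=16, aspect_h=9):
    """Area of a panel in square meters, from its diagonal in inches."""
    diag_m = diagonal_in * 0.0254
    width = diag_m * aspect_w / math.hypot(aspect_w, aspect_h)
    height = diag_m * aspect_h / math.hypot(aspect_w, aspect_h)
    return width * height

def candelas_per_watt(nits, diagonal_in, watts):
    """Nits are cd/m^2, so nits * area gives total candelas; divide by power."""
    return nits * screen_area_m2(diagonal_in) / watts

# Hypothetical numbers for a 27" 16:9 panel at 252 nits drawing 35 W:
print(round(candelas_per_watt(252, 27, 35.0), 2))
```

The point of dividing by area-scaled output rather than raw nits is that a large dim panel and a small bright one can draw the same power while emitting very different total light.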

LCD Power Draw (Kill-A-Watt)

Candelas per Watt

79 Comments

  • mdrejhon - Tuesday, June 18, 2013 - link

    The XL2720T has better color quality than the VG248QE.
    Someone owns both monitors, and reported this.
    The VG278H is actually pretty competitive to the XL2720T, despite its age.

    What makes them really worth it, is the LightBoost.
  • Death666Angel - Tuesday, June 18, 2013 - link

    Wow, this review badly needs a table of the specs on the first page.
  • brandonicus - Tuesday, June 18, 2013 - link

    I hate to be "that guy" but I found it really annoying you assumed we knew what the resolution was... unless I'm blind the only place it was mentioned was in the "Posted in" header and the seventh and eighth page. I feel like something that important should be mentioned upfront.
  • blackoctagon - Tuesday, June 18, 2013 - link

    Thanks for the review, Chris, but WHY exactly did you choose to measure input lag using the Leo Bodnar test? Apart from the fact that it cannot measure the screen's performance at 120Hz (the refresh rate at which this screen is designed to be played), the test itself seems to not have undergone the same verification as, say, PRAD.de's use of an oscilloscope has...for a review that starts out with a discussion about input lag, and even mentions that you were "still in search of" the ideal test, I expected to hear your reasoning for choosing this methodology over others.
  • cheinonen - Tuesday, June 18, 2013 - link

    I actually talked to TFT Central about this, as they have an oscilloscope method as well (which is beyond my means, unfortunately). They've tested multiple ways and feel the Leo Bodnar winds up being the most accurate option out there right now, short of a scope method. SMTT was working relatively well, but it has a license, and he stopped selling them. Our license expired, so I can't use it anymore.

    Searching for a totally accurate, and affordable, lag measurement device continues. I'll look into the Audrino solution that was mentioned here and see how that looks.
  • blackoctagon - Wednesday, June 19, 2013 - link

    Thank you for the reply. Looking forward to seeing where this search leads you
  • mdrejhon - Wednesday, June 19, 2013 - link

    I'm the inventor of the Arduino Input Lag Tester, which runs via a USB cable connected to the computer.

    It features:
    - Sub-millisecond accuracy
    - Works at all computer resolutions and refresh rates.
    - USB cable latency compensation (subtracts calculated USB cable latency)
    - Costs only $40 to build.

    It's currently undergoing beta testing, with custom software I have created.
    Contact me at mark[at]blurbusters.com for more information about the Arudino Input Lag Tester.
  • blackoctagon - Thursday, June 20, 2013 - link

    Interesting. But is it 'Audrino,' 'Arduino' or 'Arudino' test? :)
    I see all three (mis-?)spellings on this page
  • mdrejhon - Thursday, June 20, 2013 - link

    Apologies. It's a hard word sometimes.
    The correct spelling is Arduino, which refers to an easy-to-program hobbyist microcontroller:
    http://www.arduino.cc/

    It's a home made input lag meter involving (1) Almost any Arduino with a USB port, (2) a photodiode, (3) a resistor, and (4) some wires. It's an open source input lag circuit I've developed that is very easy to build (easier than building a computer -- no soldering iron required!). I'll be publishing some software that makes everything run as an accurate input lag tester (including USB cable latency/jitter compensation), since the assembly is connected to a PC displaying flashing squares.
  • Pastuch - Tuesday, June 18, 2013 - link

    Honestly, this review is a huge let down. When I started reading this website 10 years ago the articles were always informed and well researched. This review is sorely lacking in that regard. The only reason people are still buying 120hz displays is for Lightboost capable 2d gaming. The CS, BF and Quake communities LOVE the CRT like motion response of Lightboost and this is one of the better models to have that capability. http://www.blurbusters.com/ has all the relevant info, Mark is an invaluable resource and I implore you to contact him for more info.

    You complain loudly about IPS color quality in a gaming review but you admit yourself that gaming isn't a hobby you’re interested in. Your conclusion argues that the money could be better spent on an IPS 2560 display. Do you know how many video cards it takes to run Planetside 2 at 2560 at 80FPS+? You need two Geforce 780s! Can I borrow $1200?

    I used to own a 2560x1440 IPS for desktop work but I couldn’t play CS on it due to slow pixel response and horrible input lag. Once you try lightboost there is no going back. The motion clarity at 120fps + on a LB capable display genuinely changes the gameplay experience. I don't own a LB display yet but I've tried it at a lan party. I was blown away and I was hoping that Anand would provide a comprehensive review of the Benq 2720T. With the latest Nvidia drivers and LB enabled, gamers are reporting almost 1000 contrast ratio on the 2720 which is better than any other LB monitor. Lightboost is a genuine boon to the gaming market, there are Sony FW900 owners that say the motion clarity of LB is BETTER than the FW900. Do you have any idea how amazing that is? People have been waiting 10 years for a monitor that can replace the FW900 for twitch-gaming.

    If you want to read solid monitor reviews go to http://www.tftcentral.co.uk/
