Physical Design and Features

Starting with appearances, the design of the PG27UQ is, to put it colloquially, quite thick. Industrial design would typically be an area where companies differentiate otherwise similar products, in this case Asus and Acer, but the dimensions here are a consequence of the AUO LCD/backlight unit, the G-Sync HDR module, and the heatsink/fan (HSF) assembly.

And yes, a heatsink and fan. Topping off the bulkiness is the aforementioned active cooling, with the fan located behind the stand/VESA mount point. The fan's behavior is not well documented, but it runs during standby and sometimes when the monitor is powered off; the latter had me slightly confused on first use, as the fan spun up as soon as I plugged in the monitor. Thankfully, the noise levels are low enough that it should only be a concern for fanless/silent configurations.


PCPerspective's photo of the FPGA module and fan

A teardown by PCPer revealed an Altera Arria 10 GX 480 FPGA paired with 3GB of DDR4-2400 RAM, a substantial upgrade from the original G-Sync module's Altera Arria V GX FPGA and 768MB of DDR3L. NVIDIA stated that, like previous iterations, the module does not replace the TCON, and that it does not support VESA Display Stream Compression (DSC). DSC has been suggested as a solution to the bandwidth limitations of combining high resolution, high refresh rate, and HDR, and DisplayPort 1.4 does include the DSC standard. Implementing DSC for G-Sync may or may not add latency, but NVIDIA presumably explored that option before settling on the current implementation of chroma subsampling past 98Hz.
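As a back-of-the-envelope check on those bandwidth limits (a sketch using nominal figures, ignoring blanking intervals and other protocol overhead), DisplayPort 1.4 over HBR3 carries roughly 25.92 Gbit/s of effective payload after 8b/10b encoding, which is not enough for 4K at 144Hz with full 10-bit RGB:

    # Rough DisplayPort 1.4 link-budget check (nominal figures, blanking ignored)
    H, V = 3840, 2160
    EFFECTIVE_GBPS = 32.4 * 8 / 10  # HBR3 raw 32.4 Gbit/s; 8b/10b -> 25.92 Gbit/s

    def required_gbps(refresh_hz, bits_per_pixel):
        # Payload rate for an H x V image at a given refresh and pixel depth
        return H * V * refresh_hz * bits_per_pixel / 1e9

    print(required_gbps(144, 30))  # 10-bit RGB @ 144 Hz:   ~35.8 Gbit/s (doesn't fit)
    print(required_gbps(144, 20))  # 10-bit 4:2:2 @ 144 Hz: ~23.9 Gbit/s (fits)
    print(required_gbps(98, 30))   # 10-bit RGB @ 98 Hz:    ~24.4 Gbit/s (fits)

On this simplified math, full 10-bit RGB fits only up to roughly 104Hz, which lines up with the monitor dropping to 4:2:2 chroma subsampling above 98Hz.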

More interestingly, NVIDIA also mentioned that the G-Sync HDR module uses eDP to interface with the LCD panel, as opposed to the first generation's LVDS, an aging standard at this point. In general, eDP provides higher bandwidth while requiring fewer PCB traces and signal wires, and so consumes less power. Except in this case, the overall power usage and/or heat generation still calls for a blower fan.
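To illustrate the trace-count difference, here is a sketch assuming typical per-link rates rather than figures from NVIDIA: roughly 945 Mbit/s per LVDS pair versus 8.1 Gbit/s per eDP HBR3 lane (~6.48 Gbit/s effective after 8b/10b):

    # Hypothetical pair/lane counts to carry 4K 144Hz 10-bit RGB (~35.8 Gbit/s)
    import math

    payload_gbps = 3840 * 2160 * 144 * 30 / 1e9           # ~35.8 Gbit/s
    lvds_pairs = math.ceil(payload_gbps / 0.945)          # ~38 LVDS pairs
    edp_lanes = math.ceil(payload_gbps / (8.1 * 8 / 10))  # ~6 eDP HBR3 lanes

    print(lvds_pairs, edp_lanes)

On those assumptions, LVDS would need several dozen differential pairs where eDP needs a handful of lanes, which is the usual argument for eDP's lower routing complexity and power.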

It’s been reported that the 27-inch panel will come in a non-HDR variant without the FALD backlight, but the price reduction is harder to guess, since the G-Sync HDR module and quantum dot film would likely still be used. The panel will presumably have an eDP interface, which wouldn’t be compatible with the LVDS-only capability of the first generation G-Sync modules. At the least, there likely wouldn’t be a need for active cooling anymore.

In contrast with the modern trend of smaller screen borders, the PG27UQ's bezels are noticeable at around 15mm on the sides and around 20mm on the top and bottom. The three-point stand is large, and the unit as a whole is on the heavier side at just over 20 pounds. That stand allows for underside LEDs, which can project a logo on the desk below, and the monitor comes with customizable blank plastic covers for this purpose. This falls under the "LIGHT IN MOTION" OSD setting, while a separate "Aura RGB" option governs the LEDs for the ROG logo on the back of the stand. Alternatively, Aura Sync can be enabled to control the "Aura RGB" lighting.

Similarly, the ROG logo can be projected rearwards with the "ROG Light Signal," the last bit of the monitor's bling kit. The power LED also turns red, but this indicates that the monitor is in G-Sync mode; it is white during standard operation and amber during standby.

At the top of the monitor is an ambient light sensor, which is used by the auto-adjusting SDR brightness ('Auto SDR Brightness') and black level ('Auto Black Level') settings in the OSD.
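Asus doesn't document how the sensor reading maps to brightness, but conceptually it is a simple transfer function. Purely as a hypothetical sketch (the lux range, curve shape, and nit limits below are illustrative values, not Asus' actual ones):

    import math

    def auto_sdr_brightness(ambient_lux, min_nits=50.0, max_nits=300.0):
        # Hypothetical mapping: log-scale 1..10,000 lux onto the SDR range,
        # so brightness rises quickly in dim rooms and levels off in daylight.
        t = math.log10(max(ambient_lux, 1.0)) / 4.0  # 0..1 over 1..10,000 lux
        t = min(max(t, 0.0), 1.0)
        return min_nits + t * (max_nits - min_nits)

    print(auto_sdr_brightness(250))  # typical office (~250 lux) -> ~200 nits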

Connectivity is as minimal as it gets without being a major issue: one DisplayPort 1.4 input, one HDMI 2.0 port, an audio jack, and a 2-port USB 3.0 hub. By the standards of a premium monitor, it's certainly not ideal; even if the panel specifications and features are the main attraction rather than connectivity, the $2000 price point hardly suggests minimal connections. The configuration is identical to Acer's X27, so I'm not sure there was much Asus could do, unless the reasoning was primarily about margins (if so, it might indicate that development/panel/module expenses are higher).

The stand and mount combine to offer a good range of adjustment options.

In terms of the on-screen display (OSD), the PG27UQ comes with several SDR picture mode presets, called 'GameVisual' after Asus' GameVisual Video Intelligence technology. The modes are as follows:

  • Racing (default): intended for input lag reduction
  • Scenery: intended for more contrast gradations. Also sets the monitor to 100% brightness and locks gamma and Dark Boost (auto gamma curve adjustment)
  • Cinema: intended for saturated and cool colors. Also sets the monitor to the 'Cool' color temperature and locks gamma and Dark Boost
  • RTS/RPG: intended to enhance contrast sharpness and color saturation. Also sets gamma to 2.4 and Dark Boost to Level 1
  • FPS: intended for higher contrast. Also sets Dark Boost to Level 3
  • sRGB: intended for viewing photos and graphics on PCs. Also locks color temperature, brightness, contrast, and gamma

Switching to HDR mode disables GameVisual and locks gamma, Dark Boost, variable backlight, and Auto SDR Brightness.

Meanwhile, a separate 'GamePlus' button brings up options for gaming-oriented OSD overlays: crosshair, timer, FPS counter, and screen alignment markers for multi-monitor setups.

Comments

  • lilkwarrior - Monday, October 8, 2018 - link

    OLED isn't covered by the VESA HDR standards; it has far superior picture quality & contrast.

    QLED cannot compete with OLED at all in such things. I would much rather get a Dolby Vision OLED monitor than an LED monitor with an HDR 1000 rating.
  • Lolimaster - Tuesday, October 2, 2018 - link

    You can't even call it HDR with a pathetic low-contrast IPS.
  • resiroth - Monday, October 8, 2018 - link

    Peak luminance levels are overblown because they're easily quantifiable. In reality, if you've ever seen a recent LG TV, which can hit about 900 nits peak, that is too much. https://www.rtings.com/tv/reviews/lg/c8

    It’s actually almost painful.

    That said I agree oled is the way to go. I wasn’t impressed by any LCD (FALD or not) personally. It doesn’t matter how bright the display gets if it can’t highlight stars on a night sky etc. without significant blooming.

    Even 1000 nits is too much for me. The idea of 4000 is absurd. Yes, sunlight is way brighter, but we don't frequently change scenes from night time to day like television shows do. It's extremely jarring. Unless you like the feeling of being woken up repeatedly in the middle of the night by a flood light. It's a hard pass.
  • Hxx - Saturday, October 6, 2018 - link

    The only competition is the Acer, which costs the same. If you want G-Sync you have to pony up; otherwise, yeah, there are much cheaper alternatives.
  • Hixbot - Tuesday, October 2, 2018 - link

    Careful with this one, the "whistles" in the article title is referring to the built-in fan whine. Seriously, look at the newegg reviews.
  • JoeyJoJo123 - Tuesday, October 2, 2018 - link

    "because I know"

    I wouldn't be so sure. Not for G-Sync, at least. AU Optronics is the only panel producer for monitor-sized displays that even gives a flip about pushing lots of high refresh rate options onto the market. A 2560x1440 144Hz monitor from 3 years ago still costs just as much today, if not more, due to upcoming China-to-US import tariffs (starting with 10% on October 1st, 2018, and another 15%, for a total of 25%, on January 1st, 2019).

    High refresh rate G-Sync isn't set to come down anytime soon, not as long as Nvidia has a stranglehold on the GPU market and not as long as AU Optronics is the only panel manufacturer that cares about high refresh rate PC monitor displays.
  • lilkwarrior - Monday, October 8, 2018 - link

    Japan Display plans to change that in 2019. IIRC Asus is planning to use their displays for a portable Professional OLED monitor.

    I would not be surprised if they or LG created OLED gaming monitors from Japan Display panels in 2020; that's a win-win for gamers, Japan Display, & monitor manufacturers.

    Alternatively, they may surprise us with MLED monitors, which Japan Display has also invested in, alongside Samsung & LG.

    That's way better to me than any Nano-IPS/QLED monitor. They simply cannot compete.
  • Impulses - Tuesday, October 2, 2018 - link

    I would GLADLY pay the premium over the $600-1,000 alternatives IF I thought I was really going to take advantage of what the display offers in the next 2 or even 4 years... But that's the issue. I'm trying to move away from SLI/CF (2x R9 290 atm, about to purchase some sort of 2080), not force myself back into it.

    You're gonna need SLI RTX 2080s (Ti or not) to really eke out frame rates fast enough for the refresh rate to matter at 4K, chances are it'll be the same with the next gen of cards unless AMD pulls a rabbit out of a hat and quickly gets a lot more competitive. That's 2-3 years easy where SLI would be a requirement.

    HDR support seems to be just as much of a mess... I'll probably just end up with a 32" 4K display (because I'm yearning for something larger than my single 16:10 24" and that approaches the 3x 24" setup I've used at times)... But if I wanted to try a fast refresh rate display I'd just plop down a 27" 1440p 165Hz next to it.

    Nate's conclusion is exactly the mental calculus I've been doing, those two displays are still less money than one of these and probably more useful in the long run as secondary displays or hand me down options... As awesome as these G-Sync HDR displays may be, the vendor lock in around G-Sync and active cooling makes em seem like poor investments.

    Good displays should last 5+ years easy IMO, I'm not sure these would still be the best solution in 3 years.
  • Icehawk - Wednesday, October 3, 2018 - link

    Grab yourself an inexpensive 32" 4k display, decent ones are ~$400 these days. I have an LG and it's great all around (I'm a gamer btw), it's not quite high end but it's not a low end display either - it compares pretty favorably to my Dell 27" 2k monitor. I just couldn't see bothering with HDR or any of that other $$$ BS at this point, plus I'm not particularly bothered by screen tearing and I don't demand 100+ FPS from games. Not sure why people are all in a tizzy about super high FPS, as long as the game runs smoothly I am happy.
  • WasHopingForAnHonestReview - Saturday, October 6, 2018 - link

    You don't belong here, plebeian.
