Physical Design and Features

Starting with appearances, the design of the PG27UQ is, to put it colloquially, quite thick. Ordinarily this would be an area where companies differentiate otherwise similar products, in this case Asus and Acer, but the dimensions are a consequence of the AUO LCD/backlight unit, the G-Sync HDR module, and the HSF assembly.

And yes, a heatsink and fan. Topping off the bulk is the aforementioned active cooling, with the fan located behind the stand/VESA mount point. The fan's behavior is not well documented, but it runs during standby and sometimes when the monitor is powered off; the latter had me slightly confused on first use, as the fan spun up as soon as I plugged in the monitor. Thankfully, the noise levels are low enough that it should only be a concern for fanless/silent configurations.


[Image: PCPerspective's photo of the FPGA module and fan]

A teardown by PCPer revealed an Altera Arria 10 GX 480 FPGA paired with 3GB of DDR4-2400 RAM, a substantial upgrade from the original G-Sync module's Altera Arria V GX FPGA and 768MB of DDR3L. NVIDIA stated that, like previous iterations, the module does not replace the TCON and does not support VESA DSC. DSC has been suggested as a solution to the bandwidth limitations of combining high resolution, high refresh rate, and HDR, and DisplayPort 1.4 includes the DSC standard. Implementing DSC for G-Sync may or may not add latency, but NVIDIA presumably explored that option before settling on the current implementation of chroma subsampling past 98Hz.
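As a rough sanity check on those bandwidth limitations: ignoring blanking overhead, and assuming DisplayPort 1.4's HBR3 payload of 4 lanes at 8.1 Gbps with 8b/10b encoding (roughly 25.92 Gbps effective), the arithmetic below shows why full 4:4:4 10-bit tops out near 98Hz on this panel.

```python
# Back-of-the-envelope check of the PG27UQ's bandwidth ceiling.
# Assumptions (mine, not the article's): blanking overhead is ignored,
# and DP 1.4 payload is 4 lanes x 8.1 Gbps (HBR3) x 0.8 (8b/10b coding).

DP14_PAYLOAD_GBPS = 4 * 8.1 * 0.8  # ~25.92 Gbps of effective payload

def required_gbps(width, height, refresh_hz, bpc, chroma="4:4:4"):
    # 4:2:2 halves both chroma channels, averaging 2 samples/pixel vs 3
    samples_per_pixel = 3.0 if chroma == "4:4:4" else 2.0
    return width * height * refresh_hz * bpc * samples_per_pixel / 1e9

for hz, chroma in [(98, "4:4:4"), (144, "4:4:4"), (144, "4:2:2")]:
    need = required_gbps(3840, 2160, hz, 10, chroma)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"4K {hz:3d} Hz 10-bit {chroma}: {need:5.2f} Gbps ({verdict})")

# ~24.4 Gbps at 98 Hz fits; ~35.8 Gbps at 144 Hz 4:4:4 does not,
# while 4:2:2 brings 144 Hz back down to ~23.9 Gbps.
```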

More interestingly, NVIDIA also mentioned that the G-Sync HDR module uses eDP to interface with the LCD panel, as opposed to the first generation's LVDS, an aging standard at this point. In general, eDP provides higher bandwidth per lane, requires fewer PCB traces and signal wires overall, and so consumes less power. Except in this case, the overall power usage and/or heat generation still calls for a blower fan.
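For a sense of why eDP wins on wire count, here is a rough sketch; the per-link rates (about 0.945 Gbps per classic LVDS pair, 8.1 Gbps raw per eDP HBR3 lane) are generic figures, not confirmed specifics of this panel's internal interface.

```python
import math

# Rough wire-count comparison for driving this panel internally.
# Per-link rates are generic assumptions, not this panel's specifics:
# classic LVDS carries ~0.945 Gbps per differential pair, while an
# eDP HBR3 lane carries 8.1 Gbps raw (~6.48 Gbps after 8b/10b coding).

panel_gbps = 3840 * 2160 * 144 * 30 / 1e9   # ~35.8 Gbps, blanking ignored

LVDS_PAIR_GBPS = 0.945
EDP_LANE_GBPS = 8.1 * 0.8

print("LVDS pairs needed:", math.ceil(panel_gbps / LVDS_PAIR_GBPS))  # 38
print("eDP lanes needed: ", math.ceil(panel_gbps / EDP_LANE_GBPS))   # 6
```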

It’s been reported that the 27-inch panel will come in a non-HDR variant without the FALD backlight, but the price reduction is hard to guess, since the G-Sync HDR module and quantum dot film would likely still be used. The panel would presumably keep the eDP interface, which wouldn’t be compatible with the LVDS-only capability of the first generation G-Sync modules. At the least, there likely wouldn’t be a need for active cooling anymore.

In contrast with the modern trend of slimmer screen borders, the PG27UQ's bezels are noticeable at around 15mm on the sides and around 20mm on the top and bottom. The three-point stand is large, and the unit as a whole is on the heavier side at just over 20 pounds. That stand allows for underside LEDs, which can project a logo on the desk below, and the monitor comes with customizable blank plastic covers for this purpose. This falls under the "LIGHT IN MOTION" OSD option, while a separate "Aura RGB" option governs the LEDs for the ROG logo on the back of the stand. Alternatively, Aura Sync can be enabled to control the "Aura RGB" lighting.

Similarly, the ROG logo can be projected rearwards by the "ROG Light Signal," the last bit of the monitor's bling kit. The power LED also turns red, but this indicates that the monitor is in G-Sync mode; it is white during standard operation and amber during standby.

At the top of the monitor is an ambient light sensor, which is used by the auto-adjusting SDR brightness ('Auto SDR Brightness') and black level ('Auto Black Level') settings in the OSD.

Connectivity is as minimal as it gets without being a real problem: one DisplayPort 1.4, one HDMI 2.0 port, an audio jack, and a two-port USB 3.0 hub. By the standards of a premium monitor, it's certainly not ideal; even if the panel specifications and features are the main attraction over connectivity, the $2000 price point hardly suggests minimal connections. The configuration is identical to Acer's X27, so I'm not sure there was much Asus could do, unless the reasoning was primarily about margins (if so, it might indicate that development/panel/module expenses are higher than usual).

The stand and mount combine to offer a good range of adjustment options.

In terms of the on-screen display (OSD), the PG27UQ comes with several SDR picture mode presets, called GameVisual after the GameVisual Video Intelligence technology behind them. The modes are as follows (summarized in code form after the list):

  • Racing (default): intended for input lag reduction
  • Scenery: intended for finer contrast gradations. Also sets the monitor to 100% brightness and locks gamma and Dark Boost (auto gamma curve adjustment)
  • Cinema: intended for saturated and cool colors. Also sets the monitor to the 'Cool' color temperature and locks gamma and Dark Boost
  • RTS/RPG: intended to enhance contrast, sharpness, and color saturation. Also sets gamma to 2.4 and Dark Boost to Level 1
  • FPS: intended for higher contrast. Also sets Dark Boost to Level 3
  • sRGB: intended for viewing photos and graphics on PCs. Also locks color temperature, brightness, contrast, and gamma
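Restating those presets as data makes the locked-versus-adjusted settings easier to scan. The structure below is purely illustrative (the monitor's firmware is not scriptable); the keys simply mirror the OSD labels.

```python
# The GameVisual presets restated as data, purely for readability.
# Illustrative only -- the monitor's firmware is not scriptable.

GAMEVISUAL_PRESETS = {
    "Racing":  {"goal": "input lag reduction", "locks": []},
    "Scenery": {"goal": "contrast gradations", "brightness": "100%",
                "locks": ["Gamma", "Dark Boost"]},
    "Cinema":  {"goal": "saturated, cool colors", "color_temp": "Cool",
                "locks": ["Gamma", "Dark Boost"]},
    "RTS/RPG": {"goal": "contrast, sharpness, saturation",
                "gamma": 2.4, "dark_boost": "Level 1", "locks": []},
    "FPS":     {"goal": "higher contrast", "dark_boost": "Level 3",
                "locks": []},
    "sRGB":    {"goal": "photos and graphics",
                "locks": ["Color Temp.", "Brightness", "Contrast", "Gamma"]},
}
```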

Switching to HDR mode disables GameVisual and locks gamma, Dark Boost, the variable backlight, and Auto SDR Brightness.

Meanwhile, a separate 'GamePlus' button brings up options for gaming-oriented OSD overlays: a crosshair, a timer, an FPS counter, and screen alignment markers for multi-monitor setups.

Comments

  • crimsonson - Tuesday, October 2, 2018 - link

    Someone can correct me, but AFAIK there is no native 10-bit RGB support in games. A 10-bit panel would at least improve its HDR capabilities.
  • FreckledTrout - Tuesday, October 2, 2018 - link

    The games that say they are HDR should be using 10-bit color as well.
  • a5cent - Wednesday, October 3, 2018 - link

    Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially.

    That's why an HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).

    OTOH, a 10-bit panel for gaming typically won't provide a perceptible improvement. In practice, 8-bit+FRC is just as good (a sketch of the technique follows the comments). IMHO it's only for editing HDR still imagery where real 10-bit panels provide benefits.
  • GreenReaper - Thursday, October 4, 2018 - link

    I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once?

    You run the risk of banding or flicker, but frankly that's similar for display-side FRC, and I imagine if the screen were aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate at the expense of some precision, which some gamers might well be willing to accept. Of course, that's all moot if the card can't even play the game at the target refresh rate.
  • GreenReaper - Thursday, October 4, 2018 - link

    By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut, and the result would be a frequency-interpolated output hopefully similar to what's possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
  • a5cent - Thursday, October 4, 2018 - link

    "I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"

    It's an interesting idea, but I don't think it can work.

    The core problem is that the monitor then has no way of knowing if in such an FRC'ed image, a bright pixel next to a darker pixel correctly describes the desired content, or if it's just an FRC artifact.

    Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.

    As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counterproductive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕

    What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.

    However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.

    Particularly 4k DP1.4 gaming monitors are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳

    I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
  • Zoolook - Friday, October 5, 2018 - link

    DSC is not lossless, it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream.
    I'll reserve my judgement until I see some implementations.
  • Impulses - Tuesday, October 2, 2018 - link

    That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times.

    There's a few 4K 32" displays with G-Sync or Freesync but they don't go past 60-95Hz AFAIK and then you don't get HDR, it's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
  • TristanSDX - Tuesday, October 2, 2018 - link

    Soon there will be a new PG27UC, with mini-LED backlight (10,000 diodes vs 384) and with DSC
  • DanNeely - Tuesday, October 2, 2018 - link

    Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public roadmap (which nominally runs about a year out for panel production; add a few months for monitor makers to package the panels and get them to retail once volume production starts).
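A minimal sketch of the temporal dithering (FRC) technique debated in the thread above, assuming a simple four-frame cycle; real implementations also dither spatially and vary the pattern per pixel.

```python
# A minimal sketch of temporal dithering (FRC): an 8-bit panel
# approximates a 10-bit level by alternating adjacent 8-bit levels so
# that the average over several frames matches the 10-bit target.

def frc_frames(level_10bit, num_frames=4):
    target = level_10bit / 4.0              # ideal (fractional) 8-bit level
    base = int(target)
    brighter = round((target - base) * num_frames)  # frames one step up
    return [min(base + 1, 255)] * brighter + [base] * (num_frames - brighter)

frames = frc_frames(514)          # 10-bit level 514 -> 128.5 in 8-bit terms
print(frames)                     # [129, 129, 128, 128]
print(sum(frames) / len(frames))  # 128.5, matching 514 / 4
```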
