Physical Design and Features

Starting with appearances, the design of the PG27UQ is, to put it colloquially, quite thick. This would typically be an area where otherwise similar products can be differentiated, in this case meaning Asus and Acer, but the dimensions are a consequence of the AUO LCD/backlight unit, the G-Sync HDR module, and the HSF assembly.

And yes, a heatsink and fan. Topping the bulkiness off is the aforementioned active cooling, with the fan located behind the stand/VESA mount point. The fan's behavior is not really documented, but it runs during standby and sometimes when the monitor is powered off; the latter behavior had me slightly confused on first use, as the fan spun up as soon as I plugged in the monitor. Thankfully, the noise levels are low enough that it should only be a concern for fanless/silent configurations.


[Photo: PCPerspective's teardown shot of the FPGA module and fan]

A teardown by PCPer revealed an Altera Arria 10 GX 480 FPGA paired with 3GB of DDR4-2400 RAM, a substantial upgrade from the Altera Arria V GX FPGA and 768MB of DDR3L in the original G-Sync module. NVIDIA stated that, like previous iterations, the module does not replace the TCON and does not support VESA DSC. DSC has been suggested as a solution to the bandwidth limitations of combining high resolution, high refresh rate, and HDR, and DisplayPort 1.4 does include the DSC standard. Implementing DSC for G-Sync may or may not add latency, but NVIDIA presumably explored that option before settling on the current approach of chroma subsampling past 98Hz.
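For a sense of the numbers involved, here is a rough back-of-the-envelope sketch of the video data rate at 4K with 10 bpc against the DisplayPort 1.4 HBR3 payload. It deliberately ignores blanking intervals and protocol overhead, so the figures are approximations rather than an exact link budget, but they illustrate why full 4:4:4 output tops out near 98Hz and why 144Hz requires 4:2:2 subsampling.

```python
# Rough estimate of why the PG27UQ falls back to chroma subsampling above 98Hz:
# compare an approximate video data rate (active pixels only, blanking and
# protocol overhead ignored) against the DisplayPort 1.4 HBR3 payload limit.
# The ~25.92 Gbps figure is 4 lanes x 8.1 Gbps after 8b/10b encoding.

HBR3_PAYLOAD_GBPS = 4 * 8.1 * (8 / 10)   # ~25.92 Gbps usable

def data_rate_gbps(width, height, refresh_hz, bits_per_component, chroma="4:4:4"):
    """Approximate video data rate in Gbps, ignoring blanking and overhead."""
    # Average samples per pixel for the common chroma formats
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    bpp = samples_per_pixel * bits_per_component
    return width * height * refresh_hz * bpp / 1e9

for hz, chroma in [(98, "4:4:4"), (144, "4:4:4"), (144, "4:2:2")]:
    rate = data_rate_gbps(3840, 2160, hz, 10, chroma)
    verdict = "fits" if rate <= HBR3_PAYLOAD_GBPS else "exceeds HBR3"
    print(f"3840x2160 @ {hz} Hz, 10 bpc {chroma}: {rate:.1f} Gbps ({verdict})")
```

Running this gives roughly 24.4 Gbps at 98Hz 4:4:4 (just under the limit), 35.8 Gbps at 144Hz 4:4:4 (well over), and 23.9 Gbps at 144Hz 4:2:2 (under again), which lines up with the monitor's behavior once blanking overhead is factored in.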

More interestingly, NVIDIA also mentioned that the G-Sync HDR module uses eDP to interface with the LCD panel, as opposed to the first generation's LVDS, which is an aging standard at this point. In general, eDP provides higher bandwidth while requiring fewer PCB traces and signal wires, and so consumes less power. Except in this case, the overall power usage and/or heat generation still requires a blower fan.

It’s been reported that the 27-inch panel will come in a non-HDR variant without the FALD backlight, but the price reduction is harder to guess, since the G-Sync HDR module and quantum dot film would likely still be used. The panel will presumably have an eDP interface, which wouldn’t be compatible with the LVDS-only capability of the first generation G-Sync modules. At the least, there likely wouldn’t be a need for active cooling anymore.

So in contrast with the modern trend of slimmer screen borders, the PG27UQ bezels are noticeable at around 15mm on the sides and around 20mm on the top and bottom. The three-point stand is large, and the unit as a whole is on the heavier side at just over 20 pounds. That stand allows for underside LEDs, which can project a logo on the desk below, and the monitor comes with customizable blank plastic covers for this purpose. This falls under the "LIGHT IN MOTION" OSD setting, while a separate "Aura RGB" option governs LEDs for the ROG logo at the back of the stand. Alternatively, Aura Sync can be enabled to control the "Aura RGB" lighting.

Similarly, the ROG logo can be projected rearwards by the "ROG Light Signal," the last bit in the monitor's bling kit. The power LED also turns red, but this is to indicate that the monitor is in G-Sync mode; it is white during standard operation and amber during standby.

Also at the top of the monitor is an ambient light sensor, which is used with the auto-adjusting SDR brightness ('Auto SDR Brightness') and black level ('Auto Black Level') settings in the OSD.

Connectivity is as minimal as it gets without being a big issue: 1 x DisplayPort 1.4, 1 x HDMI 2.0, an audio jack, and a 2-port USB 3.0 hub. By the standards of a premium monitor, it's certainly not ideal; even if the panel specifications and features are the main attraction rather than connectivity, the $2000 price point hardly suggests minimal connections. The configuration is identical to Acer's X27, so I'm not sure there was much Asus could do, unless the reasoning was primarily about margins (if so, it might indicate that development/panel/module expenses are higher).

The stand and mount combine to offer a good range of adjustment options.

In terms of the on-screen display (OSD), the PG27UQ comes with several SDR picture mode presets, called 'GameVisual' after the GameVisual Video Intelligence technology it uses. The modes are as follows:

  • Racing (default): intended for input lag reduction
  • Scenery: intended for more contrast gradations. Also sets the monitor to 100% brightness and locks gamma and Dark Boost (auto gamma curve adjustment)
  • Cinema: intended for saturated and cool colors. Also sets the monitor to the 'Cool' color temperature and locks gamma and Dark Boost
  • RTS/RPG: intended to enhance contrast sharpness and color saturation. Also sets gamma to 2.4 and Dark Boost to Level 1
  • FPS: intended for higher contrast. Also sets Dark Boost to Level 3
  • sRGB: intended for viewing photos and graphics on PCs. Also locks color temperature, brightness, contrast, and gamma

Switching to HDR mode disables GameVisual and locks gamma, Dark Boost, variable backlight, and Auto SDR Brightness.

Meanwhile, a separate 'GamePlus' button brings up options for gaming-oriented OSD overlays: crosshair, timer, FPS counter, and screen alignment markers for multi-monitor setups.

Comments

  • Ryan Smith - Wednesday, October 3, 2018 - link

    Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.
  • a5cent - Wednesday, October 3, 2018 - link

    Yeah, nVidia's DP1.4 VRR solution is bafflingly poor/non-competitive, not just due to the requirement for active cooling.

    nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (FPGA alone is estimated to be ~ $500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.

    If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.

    So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology that seems to have priced itself out of the market and is bound to become irrelevant.

    Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
  • Impulses - Wednesday, October 3, 2018 - link

    How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end...

    Just the sad reality. I just bought a G-Sync display but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
  • a5cent - Wednesday, October 3, 2018 - link

    "How would anyone outside of NV know where they're going with this tho?"

    Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.

    "I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."

    You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.

    First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.

    Almost nobody, for whom price is even a little bit of an issue, will pay that.

    While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't selling and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.

    To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor upgrades to a new nVidia card who would otherwise have bought an AMD card. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.

    Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
  • a5cent - Wednesday, October 3, 2018 - link

    To clarify the above...

    If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.

    For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
  • lilkwarrior - Monday, October 8, 2018 - link

    Nvidia's solution is objectively better besides not being open. Similarly, NVLINK is better than any other multi-GPU solution hardware-wise.

    With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.

    Once standards catch up, Nvidia hasn't been afraid to deprecate their own previous efforts, while continuing to support them for widespread compatibility/loyalty or taking a balanced approach (i.e. NVLINK for GeForce cards but delegating memory pooling to DX12 & Vulkan)
  • Impulses - Tuesday, October 2, 2018 - link

    If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.
  • DanNeely - Tuesday, October 2, 2018 - link

    Thank you for including the explanation on why DSC hasn't shown up in any products to date.
  • Heavenly71 - Tuesday, October 2, 2018 - link

    I'm pretty disappointed that a gaming monitor with this price still has only 8 bits of native color resolution (plus FRC, I know).

    Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and many more inputs (including USB-C DP). For about the same price.
  • milkod2001 - Tuesday, October 2, 2018 - link

    Wonder if they make native 10bit monitors. Would you be able to output 10bit colours from a gaming GPU, or only a professional GPU?
