Delayed past its original late 2017 timeframe, let alone the April and May estimates, NVIDIA’s G-Sync HDR technology finally arrived over the last couple of months courtesy of Asus’ ROG Swift PG27UQ and Acer’s Predator X27. First shown as prototypes at Computex 2017, the 27-inch displays bring together what are arguably the most desired and visible aspects of modern gaming monitors: ultra-high resolution (4K), high refresh rates (144Hz), and variable refresh rate technology (G-Sync), all in a reasonably sized quality panel (27-inch IPS-type). On top of that, of course, are the various HDR-related brightness and color gamut capabilities.

Individually, these features are just some of the many modern display technologies, but while resolution and refresh rate (along with input latency) are core to PC gaming, those elements typically trade off against one another, with 1440p/144Hz being a notable middle ground. So by the basic 4K/144Hz standard, we have not yet had a true ultra-premium gaming monitor. Today, we look at one such beast with the Asus ROG Swift PG27UQ.

ASUS ROG Swift PG27UQ G-SYNC HDR Monitor Specifications
  ROG Swift PG27UQ
Panel 27" IPS (AHVA)
Resolution 3840 × 2160
Refresh Rate OC Mode 144Hz (HDR, 4:2:2) 144Hz (SDR, 4:2:2)
Standard 120Hz (HDR, 4:2:2)
98Hz (HDR, 4:4:4)
120Hz (SDR, 4:4:4)
Over HDMI 60Hz
Variable Refresh Rate NVIDIA G-Sync HDR module
(actively cooled)
Response Time 4 ms (GTG)
Brightness Typical 300 - 600 cd/m²
Peak 1000 cd/m² (HDR)
Contrast Typical 1000:1
Peak 50000:1 (HDR)
Backlighting FALD, 384 zones
Quantum Dot Yes
HDR Standard HDR10 Support
Viewing Angles 178°/178° horizontal/vertical
Pixel Density 163 pixels per inch
0.155mm pixel pitch
Color Depth 1.07 billion
(8-bit with FRC)
Color Gamut sRGB: 100%
Adobe RGB: 99%
 DCI-P3: 97%
Inputs 1 × DisplayPort 1.4
1 × HDMI 2.0
Audio 3.5-mm audio jack
USB Hub 2-port USB 3.0
Stand Adjustments Tilt: +20°~-5°
Swivel: +160°~-160°
Pivot: +90°~-90°
Height Adjustment: 0~120 mm
Dimensions (with stand) 634 x 437-557 x 268 mm
VESA Mount 100 × 100
Power Consumption Idle: 0.5 W
Peak: 180 W (HDR)
Price $1999

As an ultra-premium gaming monitor of that caliber, the PG27UQ also has an ultra-premium price of $1999. For reasons we’ll soon discuss, the pricing very much reflects the cost of the panel’s HDR backlighting unit, quantum dot film, and G-Sync HDR module. The full-array local dimming (FALD) backlighting system delivers the brightness and contrast needed for HDR, while the quantum dot film extends the representable colors to a wider gamut, another HDR element. The new-generation G-Sync HDR module handles the variable refresh implementation, but with HDR, high refresh rate, and high resolution combined, bandwidth constraints force chroma subsampling beyond 98Hz.
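As a rough sanity check on that 98Hz figure, the arithmetic can be sketched in a few lines. This is a back-of-the-envelope Python sketch that counts active pixels only and ignores blanking intervals, so real cutoffs come in slightly lower than these numbers suggest:

```python
# DisplayPort 1.4 (HBR3): 4 lanes x 8.1 Gbps = 32.4 Gbps raw,
# ~25.92 Gbps of payload after 8b/10b line encoding.
DP14_PAYLOAD_GBPS = 25.92

def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video bandwidth, active pixels only (no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 10-bit RGB = 30 bpp; 8-bit RGB = 24 bpp; 10-bit YCbCr 4:2:2 averages 20 bpp
modes = [
    ("144Hz, 10-bit RGB 4:4:4",   144, 30),
    ("98Hz, 10-bit RGB 4:4:4",     98, 30),
    ("120Hz, 8-bit RGB 4:4:4",    120, 24),
    ("144Hz, 10-bit YCbCr 4:2:2", 144, 20),
]

for name, hz, bpp in modes:
    need = required_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"{name}: {need:.1f} Gbps -> {verdict}")
```

Only the 144Hz full-chroma 10-bit mode (roughly 35.8 Gbps) blows past the link budget, which lines up with the monitor's mode table above: 4:4:4 HDR tops out at 98Hz, while 144Hz requires 4:2:2.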

In terms of base specifications, the PG27UQ is identical to Acer’s Predator X27, as it uses the same AU Optronics panel, and both monitors are essentially flagships for the G-Sync HDR platform, which also includes curved ultrawide 35-inch models and 4K 65-inch Big Format Gaming Displays (BFGD). Otherwise, there isn’t anything here that we didn’t already know about in the long run-up.

NVIDIA G-SYNC HDR Monitor Lineup
  Acer Predator X27 / ASUS ROG Swift PG27UQ | Acer Predator X35 / ASUS ROG Swift PG35VQ | Acer Predator BFGD / ASUS ROG Swift PG65 / HP OMEN X 65 BFGD
Panel 27" IPS-type (AHVA) | 35" VA, 1800R curve | 65" VA?
Resolution 3840 × 2160 | 3440 × 1440 (21:9) | 3840 × 2160
Pixel Density 163 PPI | 103 PPI | 68 PPI
Max Refresh Rates 144Hz, 60Hz (HDMI) | 200Hz, 60Hz (HDMI) | 120Hz, 60Hz (HDMI)
Backlighting FALD (384 zones) | FALD (512 zones) | FALD
Quantum Dot Yes
HDR Standard HDR10 Support
Color Gamut sRGB, DCI-P3
Inputs 2 × DisplayPort 1.4, 1 × HDMI 2.0 | DisplayPort 1.4, HDMI 2.0 | DisplayPort 1.4, HDMI 2.0, Ethernet
Price $1999 | TBA | TBA
Availability Present | 2H 2018?

Furthermore, Asus has maintained a rather candid update thread for the ROG Swift PG27UQ on its ROG forums, which offers some insight into the panel-related firmware troubles the monitor has been having.

How We Got Here: Modern Gaming Monitors and G-Sync HDR

One of the more interesting aspects of the PG27UQ is its combination of headlining features. The 3840 x 2160 ‘4K’ resolution and 144Hz refresh rate are very much in the mix, and so is the monitor being not just G-Sync but G-Sync HDR. Then there is the HDR aspect itself, with an IPS-type panel that has localized backlighting and a quantum dot film. G-Sync HDR denotes both a premium tier of HDR monitor and the new generation of G-Sync that works with high dynamic range gaming.

Altogether, the explanation isn’t very succinct for gamers, especially compared to a non-HDR gaming monitor, and it has everything to do with the vast number of moving parts involved in consumer monitor features, something more thoroughly covered by Brett. For some context, recent display trends include:

  • Higher resolutions (e.g. 1440p, 4K, 8K)
  • Higher refresh rates (e.g. 120Hz, 165Hz, 240Hz)
  • Variable refresh rate (VRR) (e.g. G-Sync, FreeSync)
  • Panel size, pixel density, curved and/or ultrawide formats
  • Better panel technology (e.g. VA, IPS-type, OLED)
  • Color bit depth
  • Color compression (e.g. chroma subsampling)
  • Other high dynamic range (HDR) relevant functions for better brightness/contrast ratios and color space coverage, such as local dimming/backlighting and quantum dot films

These features obviously overlap, and many of the recent developments are not so much ‘new’ as they are now ‘reasonably affordable’ to the broader public. Professional visualization monitors have offered many of the same specifications for years, albeit at professional-class prices. And most elements are ultimately limited by PC game support, even uncapped refresh rates and 4K+ resolutions. This is, of course, not counting connection standards, design (i.e. bezels and thinness), or gaming monitor features (e.g. ULMB). All these bits, and more, are served up to consumers in a bevy of numbers and brands.
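Chroma subsampling, one of the trends listed above, is simple enough to show concretely. Below is a minimal illustrative sketch (the `subsample_422` helper is hypothetical, not any vendor's code) of how 4:2:2 stores a scanline: one luma sample per pixel, but one chroma pair per two pixels, which is where the one-third bandwidth saving comes from:

```python
# 4:2:2 chroma subsampling: luma (Y) is kept for every pixel, while the
# two chroma channels (Cb, Cr) are stored once per pair of horizontally
# adjacent pixels, cutting average bits per pixel from 3 samples to 2
# (e.g. 30 bpp down to 20 bpp at 10 bits per sample).

def subsample_422(row):
    """row: list of (y, cb, cr) tuples for one scanline."""
    ys = [p[0] for p in row]
    # keep chroma of every other pixel (real encoders usually average pairs)
    cbs = [row[i][1] for i in range(0, len(row), 2)]
    crs = [row[i][2] for i in range(0, len(row), 2)]
    return ys, cbs, crs

row = [(100, 50, 60), (101, 52, 58), (99, 49, 61), (98, 51, 59)]
ys, cbs, crs = subsample_422(row)
print(len(ys), len(cbs), len(crs))  # -> 4 2 2
```

A real encoder would typically average each pair's chroma rather than discard one sample, but the storage math is the same: 8 samples instead of 12 for a 4-pixel run, i.e. two-thirds of the full 4:4:4 data.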

Why does all of this matter? All of these features come into play with the Asus ROG Swift PG27UQ, and especially with the G-Sync HDR technology at its heart. Gaming monitors are moving beyond resolution and refresh rate in their feature sets, especially as games start to support HDR technologies (i.e. HDR10, Dolby Vision, FreeSync 2 tone-mapping). Implementing those overlapping features depends far more on the panel than on the VRR hardware/specification, which has become the de facto identifier of a modern gaming monitor. The goal is no longer summarized by ‘faster frames filled with more pixels’ and becomes more difficult to communicate, let alone market, to consumers. And this has much to do with where G-Sync (and VRR) started and what it is now aspiring to be.

90 Comments

  • crimsonson - Tuesday, October 02, 2018 - link

    Someone can correct me, but AFAIK there is no native 10-bit RGB support in games. A 10-bit panel would at least improve its HDR capabilities.
  • FreckledTrout - Tuesday, October 02, 2018 - link

    The games that say they are HDR should be using 10-bit color as well.
  • a5cent - Wednesday, October 03, 2018 - link

    Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially.

    That's why a HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).

    OTOH, a 10 bit panel for gaming typically won't provide a perceptible improvement. In practice, 8bit+FRC is just as good. IMHO it's only for editing HDR still imagery where real 10bit panels provide benefits.
  • GreenReaper - Thursday, October 04, 2018 - link

    I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once?

    You run the risk of banding or flicker, but frankly that's similar for display FRC, and I imagine if the screen was aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate at the expense of some precise accuracy. Which some gamers might well be willing to take. Of course that's all moot if the card can't even play the game at the target refresh rate.
  • GreenReaper - Thursday, October 04, 2018 - link

    By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut and the result would be a frequency-interpolated output hopefully similar to that possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
  • a5cent - Thursday, October 04, 2018 - link

    "I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"

    It's an interesting idea, but I don't think it can work.

    The core problem is that the monitor then has no way of knowing if in such an FRC'ed image, a bright pixel next to a darker pixel correctly describes the desired content, or if it's just an FRC artifact.

    Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.

    As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counterproductive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕

    What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.

    However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.

    Particularly 4k DP1.4 gaming monitors are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳

    I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
  • Zoolook - Friday, October 05, 2018 - link

    DSC is not lossless, it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream.
    I'll reserve my judgement until I see some implementations.
  • Impulses - Tuesday, October 02, 2018 - link

    That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times.

    There's a few 4K 32" displays with G-Sync or Freesync but they don't go past 60-95Hz AFAIK and then you don't get HDR, it's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
  • TristanSDX - Tuesday, October 02, 2018 - link

    Soon there will be a new PG27UC, with mini-LED backlight (10,000 diodes vs 384) and with DSC
  • DanNeely - Tuesday, October 02, 2018 - link

    Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public road map (which is nominally about a year out for their production; add a few months for monitor makers to package the panels and get them to retail once volume production starts).
