Closing Thoughts

Bringing this review to a close, the ROG Swift PG27UQ has some subtleties, as it is just as much a ‘G-Sync HDR’ monitor as it is an ROG Swift 4Kp144 HDR monitor. In terms of panel quality and color reproduction, the PG27UQ excelled in our tests. As a whole, the monitor comes with some slight compromises: a bulky design, active cooling, and limited connectivity. However, those aspects aren’t severe enough to be dealbreakers except in very specific scenarios, such as silent PC setups. Given the pricing and capabilities, the PG27UQ is destined to be paired with the highest-end graphics cards; for a 4K 144Hz target, multi-GPU with SLI is the only – and pricey – solution for more intensive games.

And on that note, therein lies the main nuance with the PG27UQ. The $2000 price point is firmly in the ultra-high-end based on the specific combination of functionalities that the display offers: 4K, 144Hz, G-Sync, and DisplayHDR 1000-level HDR.

For ‘price is no object’ types, this is hardly a concern if the ROG Swift PG27UQ can hit all those marks well – and it does. But if price is at least somewhat of a consideration – and for the vast majority, it still is – then not using all those features simultaneously means not extracting the full value of the monitor, and the $2000 price already includes a premium. The use-cases where all those features would be used simultaneously, that is, HDR games, are somewhat limited due to the nature of HDR support in PC games, as well as the horsepower of graphics cards currently on the market.

The graphics hardware landscape brings us to the other idea behind getting a monitor of this caliber: futureproofing. At this time, even the GeForce RTX 2080 Ti is not capable of reaching much beyond 80fps or so, and with NVIDIA stepping back from SLI, especially with 2+ way configurations, multi-GPU options are somewhat unpredictable and require more user configuration. This could be particularly problematic given the performance hit Pascal cards take when rendering HDR with 4:2:2 chroma subsampling. Though this could go both ways, as some gamers expect minimal user configuration for products at the upper end of ‘premium’.

On the face of it, this is the type of monitor that demands ‘next-generation graphics’, and fortunately we have the benefit of NVIDIA’s announcement – and now launch – of Turing and GeForce RTX GPUs. In looking to that next generation, G-Sync HDR monitors are put in an awkward position. We still don’t know the extent of Turing’s performance in hybrid rendering with real-time ray tracing effects, but that feature is clearly the primary focus, if the ‘GeForce RTX’ branding wasn’t already clear enough. For traditional rendering in games (i.e. ‘out-of-the-box’ performance in most games), at 4K we saw the RTX 2080 Ti as 32% faster than the GTX 1080 Ti, reference-to-reference, and the RTX 2080 as around 8% faster than the GTX 1080 Ti and 35% faster than the GTX 1080. In the mix is the premium pricing of the GeForce RTX 2080 Ti, 2080, and 2070, of which only the 2080 Ti and 2080 support SLI.

Although this is really a topic to revisit after RTX support rolls out in games, Turing and its successors matter if only because this is a forward-looking monitor, and with G-Sync (and thus VRR) that means using NVIDIA cards. And for modern ultra-high-end gaming monitors, VRR is simply mandatory. Given that Turing’s focus is on new feature sets rather than purely on raw traditional performance over Pascal, it somewhat plays against the idea of ‘144Hz 4K HDR with variable refresh’ as the near-future ‘ultimate gaming experience’, presumably in favor of real-time raytracing and the like. So enthusiasts might be faced with a quandary where enabling real-time raytracing effects means forgoing 4K resolution and/or ultra-high refresh rates, while even in traditional non-raytraced rendering, the framerate still falls short. Again, details won’t become clear until we see the intensity of hybrid rendered game workloads, but this is absolutely something to keep in mind, because not only are ultra-high-end gaming monitors and ultra-high-end graphics cards joined at the hip, but the former tends to have longer upgrade/replacement cycles than the latter.

With futureproofing, and to a lesser extent early adoption, the premise is that consumers pay a premium for features they will fully utilize at some point, and that the device in question will still be viable until then. But if there is a hard divergence from that vision of the future, then some of those features might not be fully utilized for quite some time. For the PG27UQ, it’s clear that the panel quality and HDR capability will keep it viable for quite some time, but right now the rest of the situation is unclear.

Returning to the here-and-now, there are a few general caveats for a prospective buyer. HDMI works with HDR input sources (limited to 60Hz), but G-Sync goes unused with current-generation HDR consoles, which support FreeSync instead. The monitor is not intended for double-duty as a professional visualization monitor; for business/productivity purposes HDR and VRR are not generally useful, and the 4:2:2 chroma subsampling modes may be an issue for clear text reproduction.
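The chroma subsampling, for its part, is not an arbitrary choice: DisplayPort 1.4 simply lacks the bandwidth for 4K at 144Hz with full 10-bit 4:4:4 color. A rough back-of-the-envelope sketch – counting active pixels only and ignoring blanking overhead, so the real requirement is somewhat higher – illustrates the squeeze:

```python
# Simplified bandwidth estimate (active pixels only, no blanking overhead),
# showing why DP 1.4 forces chroma subsampling at 4K 144Hz 10-bit HDR.

def gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s for the active pixels."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DP 1.4 payload after 8b/10b encoding: 4 lanes x 8.1 Gbit/s x 0.8
DP14_EFFECTIVE = 25.92  # Gbit/s

# 10-bit RGB/4:4:4 is 30 bits per pixel; 4:2:2 halves chroma to 20 bits
full_chroma = gbps(3840, 2160, 144, 30)  # ~35.8 Gbit/s -> exceeds the link
subsampled  = gbps(3840, 2160, 144, 20)  # ~23.9 Gbit/s -> fits

print(f"4K144 10-bit 4:4:4: {full_chroma:.1f} Gbit/s vs {DP14_EFFECTIVE} available")
print(f"4K144 10-bit 4:2:2: {subsampled:.1f} Gbit/s vs {DP14_EFFECTIVE} available")
```

Even this optimistic estimate puts full-chroma 4K 144Hz HDR well past what the cable can carry, which is why the monitor drops to 4:2:2 (or lower refresh rates) in its highest modes.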

On the brightness side, the HDR white and black levels and the contrast ratios are excellent, and with Windows 10’s HDR mode these capabilities can be utilized outside of HDR content. The ROG Swift PG27UQ is well-calibrated out-of-the-box, which can’t be overstated, as most people don’t calibrate monitors. The FALD backlight operates with good uniformity, and color reproduction matches well under both HDR and SDR gamuts.

As for the $2000 price point, and the monitor itself, it all comes down to gaming with all the bells and whistles of PC display technology: 4K, 144Hz, G-Sync, and HDR10 with FALD and peak 1000 nits brightness. Market-wise, there isn’t a true option a step below this: right now, the PG27UQ and its Acer variant are the only choices if gamers are looking for either high refresh rates on a 4K G-Sync monitor, or a G-Sync monitor that supports HDR.

Seeking either combination means stepping up to the G-Sync HDR products. That said, Acer did recently announce a DisplayHDR 400 variant without quantum dots or FALD, set at $1299 and due to launch in Q4. However, without QD, FALD, or DisplayHDR 1000/600 capabilities, its HDR functionality is minimal, and it’s telling that the monitor is specced as a G-Sync monitor rather than G-Sync HDR. As far as we know, there isn’t an upcoming intermediate panel in the vein of a 1440p 144Hz G-Sync HDR product, which would be less able to justify a premium margin.

But because the monitor is focused on HDR gaming, the situation with OS and video game support needs to be noted, though again we should reiterate that this is outside Asus’ control. There is a limited selection of games with HDR support, which doesn’t always equate to HDR10, and of those games not all are developed or designed to utilize HDR’s full breadth of brightness and color. Windows 10 support for HDR displays has improved considerably, but is still a work-in-progress. All of this is to say that HDR gaming is baked into the $2000 price, and purchasing the monitor primarily for high refresh rate 4K gaming effectively increases the premium that the consumer is paying.

So essentially, gamers will not get the best value out of the PG27UQ unless they:

  • Regularly play or will play games that support HDR, ideally ones that use HDR very well
  • Have or will have the graphics horsepower to go beyond 4K 60fps in those games, and are willing to deal with SLI if necessary to achieve that
  • Are willing to deal with maturing HDR support in video games, software, and Windows 10

Again, if price is no object, then these points don't matter from a value perspective. And if consumers fit the criteria, then the PG27UQ deserves serious consideration, because presently there is no other class of monitor that can provide the gaming experience that G-Sync HDR monitors like the ROG Swift PG27UQ can. Asus's monitor packs in every bell and whistle imaginable on a PC gaming monitor, and the end result is that, outside of Acer's twin monitor, the PG27UQ is unparalleled with respect to its feature set, and is among the best monitors out there in terms of image quality.

But if price is still a factor – as playing on the bleeding edge of technology so often is – consumers will have to keep in mind that they might be paying a premium for features they may not regularly use, will use much later than anticipated, or will cost more than expected to use (i.e. the cost of dual RTX 2080 Ti’s). In the case of GeForce RTX cards, you might end up waiting for titles to release with HDR and/or RTX support, whereupon the card would still not push the PG27UQ’s capabilities to the max.

On that note, the situation depends a lot on media consumption habits, not only in terms of HDR games or HDR video content, but also console usage, preference for indie versus AA/AAA games, and preference for older versus newer titles. If $2000 is an affordable amount, that budget could cover two quality displays that may better suit individual use-cases: for example, Asus’ $600 to $700 PG279Q (1440p 165Hz G-Sync IPS 27-in) paired with a $1300 4K HDR 27-in professional monitor with peak 1000 nit luminance. Or, instead of a professional HDR monitor, an entry or mid-level 4K HDR TV in the $550 to $1000 range.

Wrapping things up, if it sounds like this is equal parts a conclusion of G-Sync HDR as much as it is of the ROG Swift PG27UQ, it is because it is. G-Sync HDR currently exists as the Asus and Acer 27-in models, and those G-Sync HDR capabilities are what is driving the price; NVIDIA’s G-Sync HDR is not just context or a single feature, it is intrinsically intertwined with the PG27UQ.

Though this is not to say the ROG Swift PG27UQ is flawed. It’s not the perfect package, but the panel provides a combination of qualities that no other gaming monitor, excluding the Acer variant, can offer. As a consumer-focused site we can never ignore the importance of price in evaluating a product, but putting that aside for the briefest of moments, it really is an awesome monitor, well beyond what any other monitor can deliver today. It just costs more than what most gamers will ever consider paying for a monitor, and the nuances of the monitor, G-Sync HDR, and HDR gaming mean that $2000 might be more than expected for how you use it.

Ultimately, the PG27UQ is the first of many G-Sync HDR monitors, and as the technology matures, hopefully we’ll see these monitors further improve and prices drop. For the near future, however, the schedule slip of the 27-inch G-Sync HDR models doesn’t bode well for the indeterminate timeframe of the 35-in ultrawides and 65-in BFGDs. So if you want the best right now – and what’s very likely to be the best 27-inch monitor for at least the next year or two to come – this is it.

91 Comments


  • imaheadcase - Tuesday, October 2, 2018 - link

    3840x1600 is the dell i mean.
  • Impulses - Tuesday, October 2, 2018 - link

    The Acer Predator 32" has a similar panel as that BenQ and adds G-Sync tho still at a max 60Hz, not as well calibrated out of the box (and with a worse stand and controls) but it has dropped in price a couple times to the same as the BenQ... I've been cross shopping them for a while because 2 grand for a display whose features I may or may not be able to leverage in the next 3 years seems dubious.

    I wanted to go 32" too because the 27" 1440p doesn't seem like enough of a jump from my 24" 1920x1200 (being 16:10 it's nearly as tall as the 16:9 27"ers), and I had three of those which we occasionally used in Eyefinity mode (making a ~40" display). I've looked at 40-43" displays but they're all lacking compared to the smaller stuff (newer ones are all VA too, mostly Phillips and one Dell).

    I use my PC for photo editing as much as PC gaming but I'm not a pro so a decent IPS screen that I can calibrate reasonably well would satisfy my photo needs.
  • Fallen Kell - Tuesday, October 2, 2018 - link

    It is "almost" perfect. It is missing one of the most important things, HDMI 2.1, which has the bandwidth to actually feed the panel with what it is capable of doing (i.e. 4k HDR 4:4:4 120Hz). But we don't have that because this monitor was actually designed 3 years ago and only now finally coming to market, 6 months after HDMI 2.1 was released.
  • lilkwarrior - Monday, October 8, 2018 - link

    HDMI 2.1 certification is still not done; it would not have been able to call itself a HDMI 2.1 till probably late this year or next year.
  • imaheadcase - Tuesday, October 2, 2018 - link

    The 35 inch one has been canceled fyi. Asus rep told me when inquired about it just a week ago, unless in a week something has changed. Reason being panel is not perfect yet to mass produce.

    That said, its not a big loss, even if disappointing. Because HDR is silly tech so you can skip this generation
  • EAlbaek - Tuesday, October 2, 2018 - link

    I bought one of these, just as they came out. Amazing display performance, but the in-built fan to cool the G-Sync HDR-module killed it for me.

    It's one of those noisy 40mm fans, which were otherwise banned from PC setups over a decade ago. It made more noise than the entirety of the rest of my 1080 Ti-SLI system combined. Like a wasp was loose in my room all the time. Completely unbearable to listen to.

    I tried to return the monitor as RMA, as I thought that couldn't be right. But it could, said the retailer. At which point I chose to simply return the unit.

    In my case, these things will have to wait, till nVidia makes a new G-Sync HDR module, which doesn't require active cooling. Plain and simple. I'm sort of guessing that'll fall in line with the availability of micro-LED displays. Which will hopefully also be much cheaper, than the ridiculously expensive FALD-panels in these monitors.
  • imaheadcase - Tuesday, October 2, 2018 - link

    Can't you just replace the fan yourself? I read around the time of release someone simply removed fan and put own silent version on it.
  • EAlbaek - Tuesday, October 2, 2018 - link

    No idea - I shouldn't have to void the warranty on my $2000 monitor, to replace a 40mm fan.
  • madwolfa - Tuesday, October 2, 2018 - link

    Is that G-Sync HDR that requires active cooling or FALD array?
  • EAlbaek - Tuesday, October 2, 2018 - link

    It's the G-Sync HDR chip, apparently.
