Closing Thoughts

Bringing this review to a close, the ROG Swift PG27UQ has some subtleties, as it is just as much a ‘G-Sync HDR’ monitor as it is an ROG Swift 4Kp144 HDR monitor. In terms of panel quality and color reproduction, the PG27UQ is excellent by our tests. As a whole, the monitor comes with some slight design compromises: bulkiness, active cooling, and limited connectivity. However, those aspects aren’t severe enough to be dealbreakers except in very specific scenarios, such as silent PC setups. Given the pricing and capabilities, the PG27UQ is destined to be paired with the highest-end graphics cards; for a 4K 144Hz target, multi-GPU with SLI is the only – and pricey – solution for more intensive games.

And on that note, therein lies the main nuance with the PG27UQ. The $2000 price point is firmly in the ultra-high-end based on the specific combination of functionalities that the display offers: 4K, 144Hz, G-Sync, and DisplayHDR 1000-level HDR.

For ‘price is no object’ types, this is hardly a concern if the ROG Swift PG27UQ can hit all those marks well – and it does. But if price is at least somewhat of a consideration – and for the vast majority, it still is – then not using all those features simultaneously means not extracting the full value of the monitor, which at $2000 already includes a premium. The use-cases where all those features come together at once – that is, HDR games – are somewhat limited, due to the nature of HDR support in PC games as well as the horsepower of graphics cards currently on the market.

The graphics hardware landscape brings us to the other idea behind getting a monitor of this caliber: futureproofing. At this time, even the GeForce RTX 2080 Ti is not capable of reaching much beyond 80fps or so, and with NVIDIA stepping back from SLI, especially with 2+ way configurations, multi-GPU options are somewhat unpredictable and require more user configuration. This could be particularly problematic given the performance hit that Pascal cards take with HDR and 4:2:2 chroma subsampling. Though this could go both ways, as some gamers expect minimal user configuration for products at the upper end of ‘premium’.
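As a rough sketch of why that 4:2:2 mode exists at all: the uncompressed pixel data for 4K 144Hz at 10 bits per channel simply exceeds what DisplayPort 1.4 can carry. The figures below are our back-of-the-envelope estimates (the payload constant assumes HBR3 after 8b/10b encoding, and blanking overhead is ignored), not numbers from Asus or NVIDIA spec sheets:

```python
# Back-of-the-envelope check of why 4K 144Hz HDR forces chroma subsampling
# over DisplayPort 1.4. Illustrative estimates only.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_PAYLOAD_GBPS = 25.92  # DP 1.4 HBR3: 32.4 Gbps raw, ~25.92 Gbps after 8b/10b

rgb_10bit = data_rate_gbps(3840, 2160, 144, 30)   # RGB/4:4:4, 10 bits per channel
ycbcr_422 = data_rate_gbps(3840, 2160, 144, 20)   # 4:2:2 10-bit: 20 bits/pixel average

print(f"RGB 10-bit @ 4K144:   {rgb_10bit:.1f} Gbps vs. {DP14_PAYLOAD_GBPS} Gbps limit")
print(f"4:2:2 10-bit @ 4K144: {ycbcr_422:.1f} Gbps")  # fits under the limit
```

Full 4:4:4 works out to roughly 35.8 Gbps, well over the link's ~25.9 Gbps payload, while 4:2:2 drops the requirement to about 23.9 Gbps, which is why the monitor falls back to subsampling at 144Hz.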

On the face of it, this is the type of monitor that demands ‘next-generation graphics’, and fortunately we have the benefit of NVIDIA’s announcement – and now launch – of Turing and GeForce RTX GPUs. In looking to that next generation, G-Sync HDR monitors are put in an awkward position. We still don’t know the extent of performance on Turing hybrid rendering with real-time ray tracing effects, but that feature is clearly the primary focus, if the branding ‘GeForce RTX’ wasn’t already clear enough. For traditional rendering in games (i.e. ‘out-of-the-box’ performance in most games), for 4K performance we saw the RTX 2080 Ti as 32% faster than the GTX 1080 Ti, reference-to-reference, and the RTX 2080 as around 8% faster than the GTX 1080 Ti and 35% faster than the GTX 1080. In the mix is the premium pricing of the GeForce RTX 2080 Ti, 2080, and 2070, of which only the 2080 Ti and 2080 support SLI.
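To put those figures in futureproofing terms, a quick projection using the review's own numbers – roughly 80fps at 4K for the RTX 2080 Ti and a ~32% reference-to-reference generational gain – suggests how far off a locked single-GPU 144fps still is. This assumes, purely for illustration, that future generations repeat a similar uplift:

```python
import math

# Illustrative projection: how many ~32% generational uplifts would it take
# to go from ~80fps at 4K (RTX 2080 Ti) to a locked 144fps on a single GPU?
current_fps = 80.0    # approximate 4K performance of the RTX 2080 Ti
target_fps = 144.0    # the PG27UQ's maximum refresh rate
gen_uplift = 1.32     # 4K gain of the 2080 Ti over the 1080 Ti, reference-to-reference

required_uplift = target_fps / current_fps                     # 1.8x more performance
generations = math.log(required_uplift) / math.log(gen_uplift)

print(f"Needed uplift: {required_uplift:.2f}x, or ~{generations:.1f} generations at 32% each")
```

Under those assumptions it takes a bit over two more GPU generations of traditional-rendering gains to saturate the panel, which is roughly the upgrade horizon a $2000 monitor purchase implies.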

Although this is really a topic to revisit after RTX support rolls out in games, Turing and its successors matter if only because this is a forward-looking monitor, and with G-Sync (and thus VRR) comes the use of NVIDIA cards. And for modern ultra-high-end gaming monitors, VRR is simply mandatory. Given that Turing’s focus is on new feature sets rather than purely on raw traditional performance over Pascal, it somewhat plays against the idea of ‘144Hz 4K HDR with variable refresh’ as the near-future 'ultimate gaming experience', presumably in favor of real-time raytracing and the like. So enthusiasts might be faced with a quandary where enabling real-time raytracing effects means forgoing 4K resolution and/or ultra-high refresh rates, while even for traditional non-raytraced rendering, the framerate still comes up short. Again, details won’t become clear until we see the intensity of hybrid rendered game workloads, but this is absolutely something to keep in mind, because not only are ultra-high-end gaming monitors and ultra-high-end graphics cards joined at the hip, but the former tends to have longer upgrade/replacement cycles than the latter.

With futureproofing, and to a lesser extent early adoption, consumers are paying a premium on the expectation that they will fully utilize the features at some point, and that the device in question will still be viable until then. But if the future diverges hard from that vision, then some of those features might not be fully utilized for quite some time. For the PG27UQ, it’s clear that the panel quality and HDR capability will keep it viable for quite some time, but right now the rest of the situation is unclear.

Returning to the here-and-now, there are a few general caveats for a prospective buyer. HDMI works with HDR input sources (limited to 60Hz max), but G-Sync functionality goes unused with current-generation HDR consoles, which support FreeSync. The monitor is not intended for double-duty as a professional visualization monitor; for business/productivity purposes HDR and VRR are not generally useful, and the 4:2:2 chroma subsampling modes may be an issue for clear text reproduction.
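The text-clarity caveat follows directly from how 4:2:2 works: luma keeps full horizontal resolution, but chroma is stored once per pixel pair, so one-pixel-wide color transitions – colored text and fine UI elements – get averaged with their neighbors. A toy sketch with made-up chroma values illustrates the effect:

```python
# Toy model of 4:2:2 chroma subsampling: each horizontal *pair* of pixels
# shares one chroma sample, so a 1px-wide color detail gets smeared.
def subsample_422(chroma_row):
    """Average each horizontal pair of chroma values, then repeat the average,
    approximating what a 4:2:2 signal actually transmits."""
    out = []
    for i in range(0, len(chroma_row), 2):
        pair = chroma_row[i:i + 2]
        avg = sum(pair) / len(pair)
        out += [avg] * len(pair)
    return out

# One scanline of Cr values: a 1px-wide red stroke (255) on a neutral background.
original = [128, 128, 255, 128, 128, 128]
print(subsample_422(original))  # → [128.0, 128.0, 191.5, 191.5, 128.0, 128.0]
```

The crisp single-pixel spike comes out as two half-strength pixels, which is why colored text fringes under 4:2:2 while full-resolution luma keeps black-on-white text mostly intact.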

On the brightness side, the HDR white and black levels and the contrast ratios are excellent, and with Windows 10’s HDR mode these capabilities can be utilized outside of HDR content. The ROG Swift PG27UQ is well-calibrated out-of-the-box, which can’t be overstated, as most people don’t calibrate their monitors. The FALD backlight operates with good uniformity, and color reproduction matches well under both HDR and SDR gamuts.

As for the $2000 price point, and the monitor itself, it all comes down to gaming with all the bells and whistles of PC display technology: 4K, 144Hz, G-Sync, HDR10 with FALD and peak 1000 nits brightness. Market-wise, there isn’t a true option that is a step below this, as right now, the PG27UQ and Acer variant are the only choices if gamers are looking for either high refresh rates on a 4K G-Sync monitor, or a G-Sync monitor that supports HDR.

So seeking either combination leaves consumers having to step up to the G-Sync HDR products. That said, Acer did recently announce a DisplayHDR 400 variant without quantum dots or FALD, set at $1299 and due to launch in Q4. However, without QD, FALD, or DisplayHDR 1000/600 capabilities, HDR functionality is on the minimal side, and it’s telling that the monitor is specced as a G-Sync monitor rather than G-Sync HDR. As far as we know, there isn’t an upcoming intermediate panel in the vein of a 1440p 144Hz G-Sync HDR product, which would be less able to justify a premium margin.

But because the monitor is focused on HDR gaming, the situation with OS and video game support needs to be noted, though again we should reiterate that this is outside Asus’ control. There is a limited selection of games with HDR support, which doesn’t always equate to HDR10, and of those games not all are developed or designed to utilize HDR’s breadth of brightness and color. Windows 10 support for HDR displays has improved considerably, but is still a work-in-progress. All of this is to say that HDR gaming is baked into the $2000 price, and purchasing the monitor primarily for high refresh rate 4K gaming effectively increases the premium that the consumer is paying.

So essentially, gamers will not get the best value of the PG27UQ unless they:

  • Regularly play or will play games that support HDR, ideally ones that use HDR very well
  • Have or will have the graphics horsepower to go beyond 4K 60fps in those games, and are willing to deal with SLI if necessary to achieve that
  • Are willing to deal with maturing HDR support in video games, software, and Windows 10

Again, if price is no object, then these points don't matter from a value perspective. And if consumers fit the criteria, then the PG27UQ deserves serious consideration, because presently there is no other class of monitor that can provide the gaming experience that G-Sync HDR monitors like the ROG Swift PG27UQ can. Asus's monitor packs in every bell and whistle imaginable on a PC gaming monitor, and the end result is that, outside of Acer's twin monitor, the PG27UQ is unparalleled with respect to its feature set, and is among the best monitors out there in terms of image quality.

But if price is still a factor – as playing on the bleeding edge of technology so often demands – consumers will have to keep in mind that they might be paying a premium for features they may not regularly use, will use much later than anticipated, or will cost more than expected to use (i.e. the cost of dual RTX 2080 Ti's). In the case of GeForce RTX cards, buyers might end up waiting for titles to release with HDR and/or RTX support, whereupon the card would still not push the PG27UQ's capabilities to the max.

On that note, the situation depends a lot on media consumption habits, not only in terms of HDR games or HDR video content but also in terms of console usage, preference for indie over AA/AAA games, and preference for older versus newer titles. If $2000 is an affordable amount, that budget could instead cover two quality displays that together may better suit individual use-cases: for example, Asus' $600 to $700 PG279Q (1440p 165Hz G-Sync IPS 27-in) paired with a $1300 4K HDR 27-in professional monitor with peak 1000 nit luminance. Or, instead of a professional HDR monitor, an entry or mid-level 4K HDR TV in the $550 to $1000 range.

Wrapping things up, if it sounds like this is equal parts a conclusion of G-Sync HDR as much as it is of the ROG Swift PG27UQ, it is because it is. G-Sync HDR currently exists as the Asus and Acer 27-in models, and those G-Sync HDR capabilities are what is driving the price; NVIDIA’s G-Sync HDR is not just context or a single feature, it is intrinsically intertwined with the PG27UQ.

Though this is not to say the ROG Swift PG27UQ is flawed. It’s not the perfect package, but the panel provides a combination of qualities that no other gaming monitor, excluding the Acer variant, can offer. As a consumer-focused site we can never ignore the importance of price in evaluating a product, but putting that aside for the briefest of moments, it really is an awesome monitor that is well beyond what any other monitor can deliver today. It just costs more than what most gamers will ever consider paying for a monitor, and the nuances of the monitor, G-Sync HDR, and HDR gaming mean that $2000 might be more than expected for how you use it.

Ultimately the PG27UQ is the first of many G-Sync HDR monitors, and as the technology matures, hopefully we'll see these monitors improve further and prices drop. For the near future, however, the schedule slip of the 27-inch G-Sync HDR models doesn’t bode well for the indeterminate timeframe of the 35-in ultrawides and 65-in BFGDs. So if you want the best right now – and what's very likely to be the best 27-inch gaming monitor for at least the next year or two to come – this is it.

91 Comments
  • lilkwarrior - Monday, October 8, 2018 - link

    OLED isn't covered by VESA HDR standards; it has far superior picture quality & contrast.

    QLED cannot compete with OLED at all in such things. I would much rather get a Dolby Vision OLED monitor than an LED monitor with an HDR 1000 rating.
  • Lolimaster - Tuesday, October 2, 2018 - link

    You can't even call HDR with a pathetic low contrast IPS.
  • resiroth - Monday, October 8, 2018 - link

    Peak luminance levels are overblown because they’re easily quantifiable. In reality, if you’ve ever seen a recent LG TV which can hit about 900 nits peak that is too much. https://www.rtings.com/tv/reviews/lg/c8

    It’s actually almost painful.

    That said I agree oled is the way to go. I wasn’t impressed by any LCD (FALD or not) personally. It doesn’t matter how bright the display gets if it can’t highlight stars on a night sky etc. without significant blooming.

    Even 1000 nits is too much for me. The idea of 4000 is absurd. Yes, sunlight is way brighter, but we don’t frequently change scenes from night time to day like television shows do. It’s extremely jarring. Unless you like the feeling of being woken up repeatedly in the middle of the night by a flood light. It’s a hard pass.
  • Hxx - Saturday, October 6, 2018 - link

    the only competition is Acer which costs the same. If you want Gsync you have to pony up otherwise yeah there are much cheaper alternatives.
  • Hixbot - Tuesday, October 2, 2018 - link

    Careful with this one, the "whistles" in the article title is referring to the built-in fan whine. Seriously, look at the newegg reviews.
  • JoeyJoJo123 - Tuesday, October 2, 2018 - link

    "because I know"

    I wouldn't be so sure. Not for Gsync, at least. AU Optronics is the only panel producer for monitor-sized displays that even gives a flip about pushing lots of high refresh rate options on the market. A 2560x1440 144hz monitor from 3 years ago still costs just as much today (if not more, due to upcoming China-to-US import tariffs, starting with 10% on October 1st 2018, and another 15% (total 25%) on January 1st 2019).

    High refresh rate GSync isn't set to come down anytime soon, not as long as Nvidia has a stranglehold on GPU market and not as long as AU Optronics is the only panel manufacturer that cares about high refresh rate PC monitor displays.
  • lilkwarrior - Monday, October 8, 2018 - link

    Japan Display plans to change that in 2019. IIRC Asus is planning to use their displays for a portable Professional OLED monitor.

    I would not be surprised if they or LG created OLED gaming monitors using Japan Display panels; that's a win-win for gamers, Japan Display, & monitor manufacturers in 2020.

    Alternatively they could surprise us with MLED monitors, which Japan Display has also invested in, alongside Samsung & LG.

    That's way better to me than any Nano-IPS/QLED monitor. They simply cannot compete.
  • Impulses - Tuesday, October 2, 2018 - link

    I would GLADLY pay the premium over the $600-1,000 alternatives IF I thought I was really going to take advantage of what the display offers in the next 2 or even 4 years... But that's the issue. I'm trying to move away from SLI/CF (2x R9 290 atm, about to purchase some sort of 2080), not force myself back into it.

    You're gonna need SLI RTX 2080s (Ti or not) to really eke out frame rates fast enough for the refresh rate to matter at 4K, chances are it'll be the same with the next gen of cards unless AMD pulls a rabbit out of a hat and quickly gets a lot more competitive. That's 2-3 years easy where SLI would be a requirement.

    HDR support seems to be just as much of a mess... I'll probably just end up with a 32" 4K display (because I'm yearning for something larger than my single 16:10 24" and that approaches the 3x 24" setup I've used at times)... But if I wanted to try a fast refresh rate display I'd just plop down a 27" 1440p 165Hz next to it.

    Nate's conclusion is exactly the mental calculus I've been doing, those two displays are still less money than one of these and probably more useful in the long run as secondary displays or hand me down options... As awesome as these G-Sync HDR displays may be, the vendor lock in around G-Sync and active cooling makes em seem like poor investments.

    Good displays should last 5+ years easy IMO, I'm not sure these would still be the best solution in 3 years.
  • Icehawk - Wednesday, October 3, 2018 - link

    Grab yourself an inexpensive 32" 4k display, decent ones are ~$400 these days. I have an LG and it's great all around (I'm a gamer btw), it's not quite high end but it's not a low end display either - it compares pretty favorably to my Dell 27" 2k monitor. I just couldn't see bothering with HDR or any of that other $$$ BS at this point, plus I'm not particularly bothered by screen tearing and I don't demand 100+ FPS from games. Not sure why people are all in a tizzy about super high FPS, as long as the game runs smoothly I am happy.
  • WasHopingForAnHonestReview - Saturday, October 6, 2018 - link

    You dont belong here, plebian.
