Closing Thoughts

Bringing this review to a close, the ROG Swift PG27UQ has some subtleties, as it is just as much a ‘G-Sync HDR’ monitor as it is an ROG Swift 4Kp144 HDR monitor. In terms of panel quality and color reproduction, the PG27UQ is excellent in our tests. As a whole, the monitor comes with some slight design compromises: bulkiness, active cooling, and limited connectivity. However, those aspects aren’t severe enough to be dealbreakers except in very specific scenarios, such as silent PC setups. Given the pricing and capabilities, the PG27UQ is destined to be paired with the highest-end graphics cards; for a 4K 144Hz target, multi-GPU with SLI is the only – and pricey – solution for more intensive games.

And on that note, therein lies the main nuance with the PG27UQ. The $2000 price point is firmly in the ultra-high-end based on the specific combination of functionalities that the display offers: 4K, 144Hz, G-Sync, and DisplayHDR 1000-level HDR.

For ‘price is no object’ types, this is hardly a concern so long as the ROG Swift PG27UQ can hit all those marks well – and it does. But if price is at least somewhat of a consideration – and for the vast majority, it still is – then not using all those features simultaneously means not extracting the full value of the monitor, and at $2000 that value already includes a premium. The use cases where all those features come together at once – namely, HDR games – are somewhat limited, both because of the nature of HDR support in PC games and because of the horsepower of graphics cards currently on the market.

The graphics hardware landscape brings us to the other idea behind getting a monitor of this caliber: futureproofing. At this time, even the GeForce RTX 2080 Ti is not capable of reaching much beyond 80fps or so at 4K, and with NVIDIA stepping back from SLI, especially anything beyond 2-way configurations, multi-GPU options are somewhat unpredictable and require more user configuration. This could be particularly problematic given the performance hit Pascal cards take when rendering HDR with 4:2:2 chroma subsampling. Though this could go both ways, as some gamers expect minimal user configuration for products at the upper end of ‘premium’.
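As a rough back-of-the-envelope sketch of why that subsampling comes into play at all (assuming DisplayPort 1.4's HBR3 signaling of 4 lanes at 8.1 Gbps with 8b/10b encoding, and ignoring blanking overhead), the bandwidth math works out roughly as follows:

    # Rough DisplayPort 1.4 bandwidth check for 4K HDR (blanking overhead ignored)
    pixels_per_frame = 3840 * 2160
    dp14_effective_gbps = 4 * 8.1 * (8 / 10)          # ~25.9 Gbps after 8b/10b encoding

    def required_gbps(refresh_hz, bits_per_pixel):
        return pixels_per_frame * refresh_hz * bits_per_pixel / 1e9

    print(required_gbps(144, 30))   # 10-bit RGB/4:4:4 @ 144Hz: ~35.8 Gbps, doesn't fit
    print(required_gbps(144, 20))   # 10-bit 4:2:2 @ 144Hz:     ~23.9 Gbps, fits
    print(required_gbps(98, 30))    # 10-bit RGB/4:4:4 @ 98Hz:  ~24.4 Gbps, fits

In other words, full-resolution 10-bit color only fits within the link budget up to roughly 98Hz, which is why pushing to 144Hz in HDR means accepting 4:2:2.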

On the face of it, this is the type of monitor that demands ‘next-generation graphics’, and fortunately we have the benefit of NVIDIA’s announcement – and now launch – of Turing and the GeForce RTX GPUs. In looking to that next generation, however, G-Sync HDR monitors are put in an awkward position. We still don’t know the extent of Turing’s performance with hybrid rendering and real-time ray tracing effects, but that feature is clearly the primary focus, if the ‘GeForce RTX’ branding wasn’t already clear enough. For traditional rendering (i.e. ‘out-of-the-box’ performance in most games) at 4K, we saw the RTX 2080 Ti come in 32% faster than the GTX 1080 Ti, reference-to-reference, and the RTX 2080 around 8% faster than the GTX 1080 Ti and 35% faster than the GTX 1080. In the mix is the premium pricing of the GeForce RTX 2080 Ti, 2080, and 2070, of which only the 2080 Ti and 2080 support SLI.
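To put those reference-to-reference deltas in framerate terms – purely as a hypothetical illustration, assuming a demanding title where the GTX 1080 Ti averages around 60fps at 4K – the implied numbers look something like this:

    # Hypothetical 4K framerates implied by the deltas above, assuming an
    # illustrative ~60fps GTX 1080 Ti baseline in a demanding title
    gtx_1080_ti_fps = 60
    rtx_2080_ti_fps = gtx_1080_ti_fps * 1.32   # ~79 fps, in line with the ~80fps noted earlier
    rtx_2080_fps    = gtx_1080_ti_fps * 1.08   # ~65 fps
    print(rtx_2080_ti_fps, rtx_2080_fps)       # both well short of the panel's 144Hz ceiling

Even a generational jump, in other words, leaves a single card a long way from saturating a 144Hz 4K panel in the heaviest games.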

Although this is really a topic to revisit after RTX support rolls out in games, Turing and its successors matter if only because this is a forward-looking monitor, and G-Sync (and thus VRR) means using NVIDIA cards. And for modern ultra-high-end gaming monitors, VRR is simply mandatory. Given that Turing’s focus is on new feature sets rather than purely on raw traditional performance over Pascal, it somewhat plays against the idea of ‘144Hz 4K HDR with variable refresh’ as the near-future 'ultimate gaming experience', presumably in favor of real-time raytracing and the like. So enthusiasts might be faced with a quandary where enabling real-time raytracing effects means forgoing 4K resolution and/or ultra-high refresh rates, while even for traditional non-raytraced rendering the framerate still falls short. Again, details won’t become clear until we see the intensity of hybrid rendered game workloads, but this is absolutely something to keep in mind, because not only are ultra-high-end gaming monitors and ultra-high-end graphics cards tied at the hip, but the former also tends to have longer upgrade/replacement cycles than the latter.

With futureproofing, and to a lesser extent early adoption, consumers pay the premium for features that they expect to fully utilize at some point, with the expectation that the device in question will still be viable until then. But if there is hard divergence from that vision of the future, then some of those features might not be fully utilized for quite some time. For the PG27UQ, it’s clear that the panel quality and HDR capability will keep it viable for quite some time, but right now the rest of the situation is unclear.

Returning to the here-and-now, there are a few general caveats for a prospective buyer. The HDMI input works with HDR sources (limited to 60Hz), but G-Sync goes unused with current-generation HDR consoles, which support FreeSync. The monitor is not intended for double-duty as a professional visualization display; for business/productivity purposes the HDR/VRR features are not generally useful, and the 4:2:2 chroma subsampling modes may be an issue for clear text reproduction.

On the brightness side, the HDR white and black levels and the contrast ratios are excellent, and with Windows 10’s HDR mode these capabilities can be utilized outside of HDR content. The ROG Swift PG27UQ is also well-calibrated out of the box, which can’t be overstated given that most people don’t calibrate their monitors. The FALD backlight operates with good uniformity, and color reproduction tracks well under both HDR and SDR gamuts.

As for the $2000 price point, and the monitor itself, it all comes down to gaming with all the bells and whistles of PC display technology: 4K, 144Hz, G-Sync, and HDR10 with FALD and 1000 nits peak brightness. Market-wise, there isn’t a true option a step below this; right now, the PG27UQ and its Acer counterpart are the only choices for gamers looking for either high refresh rates on a 4K G-Sync monitor, or a G-Sync monitor that supports HDR.

So seeking either combination leaves consumers having to step up to the G-Sync HDR products. That said, Acer did recently announce a DisplayHDR 400 variant without quantum dots or FALD, set at $1299 and due to launch in Q4. However, without QD, FALD, or DisplayHDR 1000/600 capabilities, its HDR functionality is on the minimal side, and it’s telling that the monitor is specced as a G-Sync monitor rather than G-Sync HDR. As far as we know, there isn’t an upcoming intermediate panel in the vein of a 1440p 144Hz G-Sync HDR product, which would be less able to justify a premium margin.

But because the monitor is focused on HDR gaming, the state of OS and video game support needs to be noted, though again we should reiterate that this is outside Asus’ control. There is a limited selection of games with HDR support, which doesn’t always equate to HDR10, and of those games not all are developed or designed to utilize HDR’s breadth of brightness and color. Windows 10 support for HDR displays has improved considerably, but is still a work in progress. All of this is to say that HDR gaming is baked into the $2000 price, and purchasing the monitor primarily for high refresh rate 4K gaming effectively increases the premium that the consumer would be paying.

So essentially, gamers will not get the full value out of the PG27UQ unless they:

  • Regularly play or will play games that support HDR, ideally ones that use HDR very well
  • Have or will have the graphics horsepower to go beyond 4K 60fps in those games, and are willing to deal with SLI if necessary to achieve that
  • Are willing to deal with maturing HDR support in video games, software, and Windows 10

Again, if price is no object, then these points don't matter from a value perspective. And if consumers fit the criteria, then the PG27UQ deserves serious consideration, because presently there is no other class of monitor that can provide the gaming experience that G-Sync HDR monitors like the ROG Swift PG27UQ can. Asus's monitor packs in every bell and whistle imaginable on a PC gaming monitor, and the end result is that, outside of Acer's twin monitor, the PG27UQ is unparalleled with respect to its feature set, and is among the best monitors out there in terms of image quality.

But if price is still a factor – as it so often is when playing on the bleeding edge of technology – consumers will have to keep in mind that they might be paying a premium for features they may not regularly use, will use much later than anticipated, or will cost more than expected to use (i.e. the cost of dual RTX 2080 Ti's). In the case of GeForce RTX cards, buyers might end up waiting for titles to release with HDR and/or RTX support, whereupon the card would still not push the PG27UQ's capabilities to the max.

On that note, the situation depends a lot on media consumption habits, not only in terms of HDR games or HDR video content but also in terms of console usage, preference for indie over AA/AAA games, and preference for older versus newer titles. If $2000 is an affordable amount, that budget could instead cover two quality displays that may better suit individual use cases: for example, Asus' $600 to $700 PG279Q (1440p 165Hz G-Sync IPS 27-in) paired with a $1300 4K HDR 27-in professional monitor with 1000 nit peak luminance, or, in place of a professional HDR monitor, an entry or mid-level 4K HDR TV in the $550 to $1000 range.

Wrapping things up, if it sounds like this is equal parts a conclusion about G-Sync HDR as much as about the ROG Swift PG27UQ, that's because it is. G-Sync HDR currently exists only as the Asus and Acer 27-in models, and those G-Sync HDR capabilities are what drive the price; NVIDIA’s G-Sync HDR is not just context or a single feature, it is intrinsically intertwined with the PG27UQ.

Though this is not to say the ROG Swift PG27UQ is flawed. It’s not the perfect package, but the panel provides a combination of qualities that no other gaming monitor, excluding the Acer variant, can offer. As a consumer-focused site we can never ignore the importance of price in evaluating a product, but putting that aside for the briefest of moments, it really is an awesome monitor that is well beyond what any other monitor can deliver today. It just costs more than what most gamers will ever consider paying for a monitor, and the nuances of the monitor, G-Sync HDR, and HDR gaming mean that $2000 may be paying for more than you actually end up using.

Ultimately the PG27UQ is the first of many G-Sync HDR monitors, and as the technology matures, hopefully we'll see these monitors improve further and their prices drop. For the near future, however, the schedule slips of the 27-inch G-Sync HDR models don't bode well for the still-indeterminate timeframes of the 35-in ultrawides and 65-in BFGDs. So if you want the best right now – and what's very likely to be the best 27-inch monitor for at least the next year or two – this is it.

Comments

  • Ryan Smith - Wednesday, October 3, 2018 - link

    Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.
  • a5cent - Wednesday, October 3, 2018 - link

    Yeah, nVidia's DP1.4 VRR solution is bafflingly poor/non-competitive, not just due to the requirement for active cooling.

    nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (FPGA alone is estimated to be ~ $500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.

    If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.

    So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology it seems has priced itself out of the market and is bound to become irrelevant.

    Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
  • Impulses - Wednesday, October 3, 2018 - link

    How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end...

    Just the sad reality. I just bought a G-Sync display but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
  • a5cent - Wednesday, October 3, 2018 - link

    "How would anyone outside of NV know where they're going with this tho?"

    Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.

    "I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."

    You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.

    First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.

    Almost nobody, for whom price is even a little bit of an issue, will pay that.

    While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.

    To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor upgrades to a new nVidia card who would otherwise have bought an AMD card. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.

    Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
  • a5cent - Wednesday, October 3, 2018 - link

    To clarify the above...

    If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.

    For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
  • lilkwarrior - Monday, October 8, 2018 - link

    Nvidia's solution is objectively better besides not being open. Similarly NVLINK is better than any other multi-GPU hardware wise.

    With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.

    Once standards catch up, Nvidia hasn't been afraid to deprecate their own previous effort somewhat besides continuing to support it for wide-spread support / loyalty or a balanced approach (i.e. NVLINK for Geforce cards but delegate memory pooling to DX12 & Vulkan)
  • Impulses - Tuesday, October 2, 2018 - link

    If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.
  • DanNeely - Tuesday, October 2, 2018 - link

    Thank you for including the explanation on why DSC hasn't shown up in any products to date.
  • Heavenly71 - Tuesday, October 2, 2018 - link

    I'm pretty disappointed that a gaming monitor with this price still has only 8 bits of native color resolution (plus FRC, I know).

    Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and many more inputs (including USB-C DP). For about the same price.
  • milkod2001 - Tuesday, October 2, 2018 - link

    Wonder if they make native 10bit monitors. Would you be able to output 10bit colours from gaming GPU or only professional GPU?
