HDR Gaming Impressions

In the end, a monitor like the PG27UQ is really destined for one purpose: gaming. And not just any gaming, but the type of quality experience that does not compromise between resolution and refresh rate, let alone HDR and VRR.

That being said, not many games support HDR, which for G-Sync HDR means HDR10 support. Even for games that do support an HDR standard of some kind, the quality of the implementation naturally varies from developer to developer. And because HDR arrived on consoles first, some games only feature HDR in their console incarnations.

The other issue is that the HDR gaming experience is hard to communicate objectively. In-game screenshots won't replicate how HDR content is delivered on the monitor, with its brightness, backlighting, and wider color gamut, while photographs are naturally limited by the capturing device. And any HDR content will in turn be limited by the viewer's display. On our side, this makes it all too easy to simply gush about glorious HDR vibrance and brightness, especially as on-the-fly blind A/B testing is not so simple (duplicated SDR and HDR output is not currently possible).

As for today, we are looking at Far Cry 5 (HDR10), F1 2017 (scRGB HDR), Battlefield 1 (HDR10), and Middle-earth: Shadow of War (HDR10), which covers a good mix of genres and graphics intensity. Thanks to in-game benchmarks for three of them, they also provide a static point of reference; in the same vein, Battlefield 1's presence in the GPU game suite means I've seen and benchmarked the same sequence enough times to dream about it.

For such subjective-but-necessary impressions like these, we'll keep ourselves grounded by sticking to a few broad questions:

  • What differences are noticeable from 4K with non-HDR G-Sync?
  • What differences are noticeable from 4:4:4 to 4:2:2 chroma subsampling at 98Hz?
  • What about lowering resolution to 1440p HDR or lowering details with HDR on, for higher refresh rates? Do I prefer HDR over high refresh rates?
  • Are there any HDR artifacts, e.g. halo effects, washed-out or garish colors, or blooming due to local dimming?
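As a sanity check on those 98Hz and 4:2:2 cutoffs, the DisplayPort 1.4 link budget can be worked out in a few lines. This is a back-of-the-envelope sketch: the blanking totals (4000 × 2222) assume CVT-R2-style reduced blanking, so the exact figures will vary slightly with the real timings.

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead.
LINK_GBPS = 4 * 8.1
EFFECTIVE = LINK_GBPS * 8 / 10          # ~25.92 Gbit/s of pixel data
H_TOTAL, V_TOTAL = 4000, 2222           # 3840x2160 active + assumed reduced blanking

def max_refresh(bits_per_pixel):
    """Highest refresh rate the link can carry at this bit depth/format."""
    pixel_rate = EFFECTIVE * 1e9 / bits_per_pixel   # pixels per second
    return pixel_rate / (H_TOTAL * V_TOTAL)

print(round(max_refresh(30), 1))  # 10-bit RGB/4:4:4 (30 bpp) -> ~97 Hz
print(round(max_refresh(20), 1))  # 10-bit 4:2:2 (20 bpp)     -> ~146 Hz
```

The math lines up with the monitor's behavior: 10-bit 4:4:4 runs out of link bandwidth just under 98Hz, while dropping to 4:2:2 frees enough to reach the 144Hz OC mode.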

The 4K G-Sync HDR Experience

From the beginning, we expected that targeting 144fps at 4K was not really plausible for graphically intense games, and that still holds true. On a reference GeForce GTX 1080 Ti, none of the games averaged past 75fps, and even the brand-new RTX 2080 Ti won't come close to doubling that.

Ubiquitous black loading and intro screens make the local dimming bloom easily noticeable, though this is a well-known phenomenon and somewhat unavoidable; the majority of the time, it is fairly unintrusive. Because the local backlighting zones can only get so small on LCD displays – in the case of this monitor, each zone is roughly 5.2cm² in area – anything smaller than a zone will still light up the entire zone. For example, a logo or loading throbber on a black background will have a visible glow around it. The issue is not specific to the PG27UQ; the higher maximum brightness just makes it a little more obvious. One answer to this is OLED, where subpixels are self-emitting and thus lighting can be controlled on an individual subpixel basis, but burn-in concerns make it a poor fit for PC monitors.


Loading throbbers for Shadow of War (left) and Far Cry 5 (right) with the FALD haloing effect
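The roughly 5.2cm² zone figure falls straight out of the panel geometry; a quick sketch, assuming the 384 zones evenly tile the 27-inch 16:9 panel:

```python
import math

# Per-zone area of the PG27UQ's 384-zone FALD backlight,
# assuming zones evenly tile the 27-inch 16:9 panel.
DIAG_IN, ZONES = 27, 384
AR_W, AR_H = 16, 9
diag_units = math.hypot(AR_W, AR_H)
width_cm = DIAG_IN * AR_W / diag_units * 2.54
height_cm = DIAG_IN * AR_H / diag_units * 2.54
zone_area = width_cm * height_cm / ZONES
print(round(zone_area, 1))  # -> 5.2 cm^2 per zone
```

At ~5.2cm² a zone spans well over a hundred pixels on a side at this pixel density, which is why a small throbber inevitably drags its whole zone (and, with diffusion, its neighbors) up in brightness.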

The sheer brightness range is difficult to describe, but the closest analogy that comes to mind is dialing up smartphone brightness to maximum after a day of nursing a low battery at 10% brightness. It's still up to the game to take full advantage of it with HDR10 or scRGB. Some games will also offer to set gamma, maximum brightness, and/or reference white levels, thereby allowing you to adjust the HDR settings to the brightness capability of the HDR monitor.

The most immediate takeaway is the additional brightness and how fast it can ramp up. The former has a tendency to make things clearer and more colorful - the Hunt effect in play, essentially. The latter is very noticeable in transitions, such as sudden sunlight, looking up to the sky, and changes in lighting. Of course, the extra color vividness works hand-in-hand with the better contrast ratios, but again this can be game- and scene-dependent; Far Cry 5 seemed to fare the best in that respect, though Shadow of War, Battlefield 1, and F1 2017 still looked better than in SDR.

In-game, I couldn't perceive any quality difference going from 4:4:4 to 4:2:2 chroma subsampling, though the games couldn't push past 98Hz at 4K anyway. So at 50 to 70fps averages, the result felt more 'cinematic': HDR made the scenes look brighter and more realistic, while the gameplay remained the typical pseudo-60fps VRR experience. With that in mind, the monitor would probably be better suited to exploration-heavy games where you 'stop-and-look' a lot - and unfortunately, we don't have Final Fantasy XV on hand to try out. NVIDIA themselves say that increased luminance actually increases the perception of judder at low refresh rates, but luckily the presence of VRR would be mitigating judder in the first place.
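To illustrate why VRR mitigates that judder, here is a small sketch of frame presentation at a hypothetical steady 70fps on a fixed 98Hz scanout versus variable refresh; the frame rate is illustrative, and real frame times are far less regular.

```python
import math

# A ~70 fps stream on a fixed 98 Hz scanout can only be shown at the next
# refresh tick, so on-screen frame durations alternate unevenly (judder).
# With VRR, the display refreshes when each frame is ready.
FPS, HZ = 70, 98
frames = [i / FPS for i in range(8)]                     # frame-ready times (s)
ticks = [math.ceil(t * HZ - 1e-9) / HZ for t in frames]  # next fixed-refresh tick

shown_fixed = [round((b - a) * 1e3, 1) for a, b in zip(ticks, ticks[1:])]
shown_vrr = [round((b - a) * 1e3, 1) for a, b in zip(frames, frames[1:])]
print(shown_fixed)  # uneven mix of ~10.2 ms and ~20.4 ms holds
print(shown_vrr)    # steady ~14.3 ms per frame
```

The fixed-refresh case alternates between one and two refresh periods per frame; it is that irregular cadence, not the average frame rate, that reads as judder - and that higher luminance makes easier to see.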

4K HDR Gaming Performance - GTX 1080 Ti & ROG Swift PG27UQ

Interestingly, we observed a performance impact with HDR enabled (and with G-Sync off) on the GeForce GTX 1080 Ti, which seems to corroborate last month's findings by ComputerBase.de. For the GTX 1080 Ti, Far Cry 5 was generally unaffected, but Battlefield 1 and F1 2017 took clear performance hits, appearing to stem from 4:2:2 chroma subsampling on HDR (YCbCr422); Shadow of War also seemed to fare worse. Our early results also indicate that even HDR with 4:4:4 chroma subsampling (RGB444) may incur a slight performance hit in affected games.

It's not clear what the root cause is, and we'll be digging deeper as we revisit the GeForce RTX 20-series. Taking a glance at the RTX 2080 Ti Founders Edition, the performance hit of 4:2:2 subsampling is reduced to negligible margins in these four games.

4K HDR Gaming Performance - GeForce RTX 2080 Ti FE & PG27UQ

On Asus' side, the monitor does everything that is asked of it: it comfortably reaches 1000 nits, and as long as FALD backlighting is enabled, the IPS backlight bleed is pretty much non-existent. We observed no ghosting or similar artifacts.

The other aspect of the HDR gaming 'playflow' is that enabling HDR can differ slightly per game, and Alt-Tabbing is hit-or-miss - that is on Microsoft/Windows 10, not on Asus - but it's certainly much better than before. For example, Shadow of War had no in-game HDR toggle and relied on the Windows 10 toggle. And with G-Sync and HDR now in the mix, adjusting resolution and refresh rate (for 144Hz, the monitor needs to be put in OC mode) to get the exact desired configuration can be a bit of a headache. In particular, lowering the in-game resolution to 1440p while keeping G-Sync HDR and 144Hz OC mode proved very finicky.

At the end of the day, not all HDR is made equal, which goes for game-world and scene construction in addition to HDR support. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation. I would strongly recommend visiting a brick-and-mortar outlet that offers an HDR demo, or looking into the specific HDR content that you would want to play or watch.
