HDR Gaming Impressions

In the end, a monitor like the PG27UQ is really destined for one purpose: gaming. And not just any gaming, but the type of quality experience that does not compromise between resolution and refresh rate, let alone HDR and VRR.

That being said, not many games support HDR, which for G-Sync HDR means HDR10 support. Even for games that do support an HDR standard of some kind, the quality of the implementation naturally varies from developer to developer. And because of console HDR support, some games only feature HDR in their console incarnations.

The other issue is that the HDR gaming experience is hard to communicate objectively. In-game screenshots won't replicate how HDR content is delivered on the monitor with its brightness, backlighting, and wider color gamut, while photographs are naturally limited by the capturing device. And any HDR content will, of course, be limited by the viewer's display. On our side, this makes it easy to simply gush about glorious HDR vibrance and brightness, especially as on-the-fly blind A/B testing is not so simple (duplicated SDR and HDR output is not currently possible).

As for today, we are looking at Far Cry 5 (HDR10), F1 2017 (scRGB HDR), Battlefield 1 (HDR10), and Middle-earth: Shadow of War (HDR10), which covers a good mix of genres and graphics intensity. Thanks to in-game benchmarks for three of them, they also provide a static point of reference; in the same vein, Battlefield 1's presence in the GPU game suite means I've seen and benchmarked the same sequence enough times to dream about it.

For such subjective-but-necessary impressions like these, we'll keep ourselves grounded by sticking to a few broad questions:

  • What differences are noticeable from 4K with non-HDR G-Sync?
  • What differences are noticeable from 4:4:4 to 4:2:2 chroma subsampling at 98Hz?
  • What about lowering resolution to 1440p HDR or lowering details with HDR on, for higher refresh rates? Do I prefer HDR over high refresh rates?
  • Are there any HDR artifacts (e.g. halo effects, washed-out or garish colors, blooming due to local dimming)?

The 4K G-Sync HDR Experience

From the beginning, we expected that targeting 144fps at 4K was not really plausible for graphically intense games, and that still holds true. On a reference GeForce GTX 1080 Ti, none of the games averaged past 75fps, and even the brand-new RTX 2080 Ti won't come close to doubling that.

Ubiquitous black loading and intro screens make the local dimming bloom easily noticeable, though this is a well-known phenomenon and somewhat unavoidable. The majority of the time, it is fairly unobtrusive. Because the local backlighting zones can only get so small on LCD displays – in the case of this monitor, each zone is roughly 5.2 cm² in area – anything smaller than a zone will still light up the entire zone. For example, a logo or loading throbber on a black background will have a visible glow around it. The issue is not specific to the PG27UQ, only that the higher maximum brightness makes it a little more obvious. One answer to this is OLED, where subpixels are self-emitting and lighting can thus be controlled on an individual subpixel basis, but burn-in risk makes it ill-suited to static PC content.


Loading throbbers for Shadow of War (left) and Far Cry 5 (right) with the FALD haloing effect
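As a sanity check on that per-zone figure, the PG27UQ's full-array local dimming (FALD) backlight uses 384 independently controlled zones behind the 27-inch 16:9 panel, and the arithmetic works out to roughly the 5.2 cm² quoted above. A quick sketch:

```python
import math

# Panel geometry: 27-inch diagonal, 16:9 aspect ratio.
diagonal_in = 27.0
aspect_w, aspect_h = 16, 9

# Width and height follow from the diagonal via Pythagoras.
unit = diagonal_in / math.hypot(aspect_w, aspect_h)
width_cm = aspect_w * unit * 2.54
height_cm = aspect_h * unit * 2.54

# The PG27UQ's FALD backlight has 384 zones.
zones = 384
zone_area_cm2 = (width_cm * height_cm) / zones

print(f"panel: {width_cm:.1f} x {height_cm:.1f} cm")
print(f"per-zone area: {zone_area_cm2:.1f} cm^2")  # ~5.2 cm^2
```

At roughly 5.2 cm² per zone, any bright object smaller than a couple of centimeters across is guaranteed to spill light into its surroundings on a black background.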

Much could be said in describing the sheer brightness range, but the closest analogy that comes to mind is dialing smartphone brightness up to maximum after a day of nursing a low battery at 10% brightness. It's still up to the game to take full advantage of it with HDR10 or scRGB. Some games will also offer to set gamma, maximum brightness, and/or reference white levels, thereby allowing you to adjust the HDR settings to the brightness capability of the HDR monitor.
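For HDR10 specifically, luminance is encoded with the SMPTE ST 2084 'PQ' transfer function, which maps absolute luminance up to 10,000 nits into the signal range. A sketch of the encoding side (constants straight from the ST 2084 spec) shows why a 1000-nit panel like this one already sits fairly high on the curve:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> normalized signal.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map luminance in cd/m^2 (0..10000) to a normalized PQ signal (0..1)."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1 + C3 * ym)) ** M2

for nits in (100, 1000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```

A 1000-nit peak lands around three-quarters of the way up the PQ signal range; the top quarter is reserved for luminance beyond what even this monitor can reproduce.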

The most immediate takeaway is the additional brightness and how fast it can ramp up. The former has a tendency to make things clearer and more colorful - the Hunt effect in play, essentially. The latter is very noticeable in transitions, such as sudden sunlight, looking up to the sky, and changes in lighting. Of course, the extra color vividness works hand-in-hand with the better contrast ratios, but again this can be game- and scene-dependent; Far Cry 5 seemed to fare the best in that respect, though Shadow of War, Battlefield 1, and F1 2017 still looked better than in SDR.

In-game, I couldn't perceive any quality difference going from 4:4:4 to 4:2:2 chroma subsampling, though the games couldn't push past 98Hz at 4K anyway. So at 50 to 70fps averages, the experience felt more 'cinematic': HDR made the scenes look more realistic and brighter while the gameplay was the typical pseudo-60fps VRR experience. With that in mind, it would probably be better suited to exploration-heavy games where you 'stop-and-look' a lot - and unfortunately, we don't have Final Fantasy XV at the moment to try out. NVIDIA themselves say that increased luminance actually increases the perception of judder at low refresh rates, but luckily VRR mitigates judder in the first place.
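The 98Hz ceiling for full 4:4:4 at 4K comes down to DisplayPort 1.4 link bandwidth. A back-of-the-envelope calculation (ignoring blanking intervals, so real-world cutoffs land slightly lower) shows why 10-bit 4:4:4 fits at 98Hz but not at 144Hz, while 4:2:2 does:

```python
# DisplayPort 1.4 (HBR3) payload: 4 lanes x 8.1 Gbps, minus 8b/10b coding overhead.
DP14_PAYLOAD_GBPS = 4 * 8.1 * (8 / 10)  # = 25.92 Gbps

def video_rate_gbps(width, height, hz, bpc, chroma="4:4:4"):
    """Raw pixel data rate in Gbps, ignoring blanking (the real link needs a bit more)."""
    # 4:4:4 carries 3 samples per pixel; 4:2:2 averages 2 per pixel.
    bits_per_pixel = {"4:4:4": 3 * bpc, "4:2:2": 2 * bpc}[chroma]
    return width * height * hz * bits_per_pixel / 1e9

for hz, chroma in [(98, "4:4:4"), (144, "4:4:4"), (144, "4:2:2")]:
    rate = video_rate_gbps(3840, 2160, hz, bpc=10, chroma=chroma)
    verdict = "fits" if rate <= DP14_PAYLOAD_GBPS else "exceeds link"
    print(f"4K {hz}Hz 10-bit {chroma}: {rate:5.1f} Gbps ({verdict})")
```

Dropping from three samples per pixel to two is what buys back enough bandwidth for 144Hz, which is exactly the trade-off the monitor makes above 98Hz.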

4K HDR Gaming Performance - GTX 1080 Ti & ROG Swift PG27UQ

What was interesting to observe was a performance impact with HDR enabled (and with G-Sync off) on the GeForce GTX 1080 Ti, which seems to corroborate last month's findings by ComputerBase.de. For the GTX 1080 Ti, Far Cry 5 was generally unaffected, but Battlefield 1 and F1 2017 took clear performance hits, appearing to stem from 4:2:2 chroma subsampling on HDR (YCbCr422). Shadow of War also seemed to fare worse. Our early results also indicate that even HDR with 4:4:4 chroma subsampling (RGB444) may result in a slight performance hit in affected games.

It's not clear what the root cause is, and we'll be digging deeper as we revisit the GeForce RTX 20-series. Taking a glance at the RTX 2080 Ti Founders Edition, the performance hit of 4:2:2 subsampling is reduced to negligible margins in these four games.

4K HDR Gaming Performance - GeForce RTX 2080 Ti FE & PG27UQ

On Asus' side, the monitor does everything that is asked of it: it comfortably reaches 1000 nits, and as long as FALD backlighting is enabled, the IPS backlight bleed is pretty much non-existent. There was no observed ghosting or similar artifacts.

The other aspect of the HDR gaming 'playflow' is that enabling HDR can work slightly differently per game, and Alt-Tabbing is hit-or-miss - that is on Microsoft/Windows 10, not on Asus - but it's certainly much better than before. For example, Shadow of Mordor had no in-game HDR toggle and relied on the Windows 10 toggle. And with G-Sync and HDR now in the mix, adjusting resolution and refresh rate (and for 144Hz, putting the monitor in OC mode) to get the exact desired configuration can be a bit of a headache. Lowering the in-game resolution to 1440p while keeping G-Sync HDR and 144Hz OC mode, for instance, was very finicky.

At the end of the day, not all HDR is made equal, which goes for the game-world and scene construction in addition to HDR support. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation. I would strongly recommend visiting a brick-and-mortar outlet that offers an HDR demo, or looking into specific HDR content that you would want to play or watch.


  • lilkwarrior - Monday, October 8, 2018 - link

    SLI & Crossfire are succeeded by DX12's & Vulkan's explicit multi-GPU mode. Nvidia even deliberately ported NVLink (which succeeds classic SLI) to RTX cards from Quadro+ cards, but without memory pooling because DX12 & Vulkan already provide that for GPUs.

    Devs have to use DX12 or Vulkan and support such features, which is easier for them to consider now that Windows 8 mainstream support is over + ray-tracing is available only on DX12 & Vulkan.
  • nathanddrews - Wednesday, October 3, 2018 - link

    Still cheaper than my Sony FW-900 CRT was when it was brand new! LOL
  • Hixbot - Wednesday, October 3, 2018 - link

    Still not better than a fw-900 in many ways. This LCD doesn't have a strobing feature to reduce eye tracking motion blur.
  • nathanddrews - Wednesday, October 3, 2018 - link

    No argument there, but my FW900 died, so the options are few...
  • Crazyeyeskillah - Wednesday, October 3, 2018 - link

    my fw-900 is also dead in my closet, hoping to resurrect it one day. Right now I've got another excellent crt monitor I found to game on: Sony Multiscan E540. It's not as big, but god damn is it smooth and flawless.

    By the time this crt dies, hopefully LCD tech won't be such garbage trying to make workaround for its inferior tech for gaming.
  • nathanddrews - Thursday, October 4, 2018 - link

    Yeah, I've moved on to a Sony C520K for the last ~2 years. In use I think it's far better than my FW900 was in terms of contrast/color plus I'm able to push slightly higher refresh rates, but it's not widescreen. I have bought a couple expensive G-Sync displays, hoping for an adequate replacement, but ended up returning them. I'm really hoping that this CRT lasts until MicroLED hits the market and that mLED can truly combine the best attributes of LCD and OLED without any of the drawbacks.
  • Ironchef3500 - Wednesday, October 3, 2018 - link

    100%
  • Tunnah - Tuesday, October 2, 2018 - link

    I don't get the point. You don't need the features for desktop work, so this is purely a gaming feature. Why not get an equally capable OLED/QLED at a much bigger size for less money ?
  • Inteli - Tuesday, October 2, 2018 - link

    TVs don't support native high refresh rates from sources like monitors do (I think LG's does, but only from USB sources or something like that) or adaptive refresh rates. It's a gaming monitor, so it has gaming-specific features.
  • imaheadcase - Tuesday, October 2, 2018 - link

    You answered own question, because its a gaming monitor. You can't find one like this (yet) that offers all the things it does.

    You, like many people on this website, are confused about TVs vs monitors. A TV of equal size and the same resolution is not the same as a dedicated monitor. An LG OLED 55 inch TV looks pretty bland when you use a PC monitor for gaming.
