HDR Gaming Impressions

In the end, a monitor like the PG27UQ is really destined for one purpose: gaming. And not just any gaming, but the type of quality experience that does not compromise between resolution and refresh rate, let alone HDR and VRR.

That being said, not many games support HDR, which for G-Sync HDR means HDR10 support. Even for games that do support an HDR standard of some kind, the quality of the implementation naturally varies from developer to developer. And because of console HDR support, some games only feature HDR in their console incarnations.

The other issue is that the HDR gaming experience is hard to communicate objectively. In-game screenshots can't replicate how HDR content is delivered on the monitor, with its brightness, backlighting, and wider color gamut, while photographs are naturally limited by the capturing device; any HDR content will in turn be limited by the viewer's display. On our side, this makes it easy to gush generally about glorious HDR vibrance and brightness, especially as on-the-fly blind A/B testing is not so simple (duplicating SDR and HDR output simultaneously is not currently possible).

As for today, we are looking at Far Cry 5 (HDR10), F1 2017 (scRGB HDR), Battlefield 1 (HDR10), and Middle-earth: Shadow of War (HDR10), which covers a good mix of genres and graphics intensity. Thanks to in-game benchmarks for three of them, they also provide a static point of reference; in the same vein, Battlefield 1's presence in the GPU game suite means I've seen and benchmarked the same sequence enough times to dream about it.

For such subjective-but-necessary impressions like these, we'll keep ourselves grounded by sticking to a few broad questions:

  • What differences are noticeable from 4K with non-HDR G-Sync?
  • What differences are noticeable from 4:4:4 to 4:2:2 chroma subsampling at 98Hz?
  • What about lowering resolution to 1440p HDR or lowering details with HDR on, for higher refresh rates? Do I prefer HDR over high refresh rates?
  • Are there any HDR artifacts, e.g. halo effects, washed-out or garish colors, or blooming due to local dimming?

The 4K G-Sync HDR Experience

From the beginning, we expected that targeting 144fps at 4K was not really plausible for graphically intense games, and that still holds true. On a reference GeForce GTX 1080 Ti, none of the games averaged past 75fps, and even the brand-new RTX 2080 Ti won't come close to doubling that.

Ubiquitous black loading and intro screens make the local dimming bloom easily noticeable, though this is a well-known phenomenon and somewhat unavoidable. The majority of the time, it is fairly unintrusive. Because the local backlighting zones can only get so small on LCD displays – in the case of this monitor, each zone is roughly 5.2 cm² in area – anything smaller than a zone will still light up the entire zone. For example, a logo or loading throbber on a black background will have a visible glow around it. The issue is not specific to the PG27UQ; its higher maximum brightness just makes the effect a little more obvious. One answer to this is OLED, where subpixels are self-emitting and thus lighting can be controlled on an individual subpixel basis, but burn-in concerns make it a poor fit for PC use.
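As a sanity check on that per-zone figure, the area falls straight out of the panel geometry. A quick sketch, assuming the 384-zone FALD count commonly cited in the PG27UQ's specs:

```python
import math

# 27" 16:9 panel carved into 384 local dimming zones (zone count assumed
# from published PG27UQ specs); derive the per-zone area.
DIAGONAL_CM = 27 * 2.54
ZONES = 384

# Split the diagonal into width and height using the 16:9 aspect ratio
scale = DIAGONAL_CM / math.hypot(16, 9)
width_cm, height_cm = 16 * scale, 9 * scale

zone_area_cm2 = width_cm * height_cm / ZONES
print(f"{zone_area_cm2:.1f} cm^2 per zone")  # ~5.2 cm^2
```

Each zone works out to a square a bit over 2cm on a side – easily large enough to wrap a visible halo around a small bright logo on black.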


Loading throbbers for Shadow of War (left) and Far Cry 5 (right) with the FALD haloing effect

Much can be said in trying to describe the sheer brightness range, but the closest analogy that comes to mind is dialing up smartphone brightness to maximum after a day of nursing a low battery at 10% brightness. It's still up to the game to take full advantage of it with HDR10 or scRGB. Some games will also offer to set gamma, maximum brightness, and/or reference white levels, thereby allowing you to adjust the HDR settings to the brightness capability of the HDR monitor.
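For reference, the two encodings map signal values to absolute luminance very differently: HDR10 runs code values through the SMPTE ST 2084 "PQ" curve, while scRGB is simply linear with 1.0 pegged at 80 nits. A minimal sketch of both (PQ constants from the ST 2084 spec):

```python
def pq_to_nits(e: float) -> float:
    """ST 2084 (PQ) EOTF: normalized code value e in [0, 1] -> nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

def scrgb_to_nits(v: float) -> float:
    """scRGB is linear; 1.0 corresponds to 80 nits (sRGB reference white)."""
    return v * 80

print(pq_to_nits(1.0))      # 10000.0 -- the PQ signal peak
print(scrgb_to_nits(12.5))  # 1000.0 -- the PG27UQ's rated peak brightness
```

The steep top end of the PQ curve is why a monitor's tone mapping (and any in-game maximum brightness slider) matters: most of the code range covers everyday luminance, with the last stretch reserved for highlights far beyond what this panel can show.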

The most immediate takeaway is the additional brightness and how fast it can ramp up. The former has a tendency to make things clearer and more colorful - the Hunt effect in play, essentially. The latter is very noticeable in transitions, such as sudden sunlight, looking up to the sky, and changes in lighting. Of course, the extra color vividness works hand-in-hand with the better contrast ratios, but again this can be game- and scene-dependent; Far Cry 5 seemed to fare the best in that respect, though Shadow of War, Battlefield 1, and F1 2017 still looked better than in SDR.

In-game, I couldn't perceive any quality differences going from 4:4:4 to 4:2:2 chroma subsampling, though the games couldn't reach past 98Hz at 4K anyway. So at 50 to 70fps averages, the experience reminded me more of a 'cinematic' experience, because HDR made the scenes look more realistic and brighter while the gameplay was the typical pseudo 60fps VRR experience. With that in mind, it would probably be better for exploration-heavy games where you would 'stop-and-look' a lot - and unfortunately, we don't have Final Fantasy XV at the moment to try out. NVIDIA themselves say that increased luminance actually increases the perception of judder at low refresh rates, but luckily the presence of VRR would be mitigating judder in the first place.
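That 98Hz ceiling for full 4:4:4 is a DisplayPort 1.4 link-bandwidth limit rather than a panel limit. A back-of-the-envelope sketch (ignoring blanking intervals, so these figures slightly understate the real requirement):

```python
# DisplayPort 1.4 (HBR3) carries roughly 25.92 Gbps of effective video data
# after 8b/10b encoding overhead.
DP14_GBPS = 25.92

def signal_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Raw pixel-data rate in Gbps, not counting blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

# At 10-bit color: RGB/4:4:4 = 30 bpp, 4:2:2 subsampling = 20 bpp
print(signal_gbps(3840, 2160, 144, 30))  # ~35.8 -> exceeds DP 1.4
print(signal_gbps(3840, 2160, 98, 30))   # ~24.4 -> fits at 98Hz
print(signal_gbps(3840, 2160, 144, 20))  # ~23.9 -> fits with 4:2:2
```

Hence the monitor's either/or proposition: keep full chroma and cap the refresh rate at 98Hz, or drop to 4:2:2 to reach 144Hz.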

4K HDR Gaming Performance - GTX 1080 Ti & ROG Swift PG27UQ

What was interesting to observe was a performance impact with HDR enabled (and with G-Sync off) on the GeForce GTX 1080 Ti, which seems to corroborate last month's findings by ComputerBase.de. For the GTX 1080 Ti, Far Cry 5 was generally unaffected, but Battlefield 1 and F1 2017 took clear performance hits, appearing to stem from 4:2:2 chroma subsampling with HDR (YCbCr422); Shadow of War also seemed to fare worse. Our early results further indicate that even HDR with 4:4:4 chroma subsampling (RGB444) may incur a slight performance hit in affected games.

It's not clear what the root cause is, and we'll be digging deeper as we revisit the GeForce RTX 20-series. Glancing at the RTX 2080 Ti Founders Edition, the performance hit of 4:2:2 subsampling is reduced to negligible margins in these four games.

4K HDR Gaming Performance - GeForce RTX 2080 Ti FE & PG27UQ

On Asus' side, the monitor does everything that is asked of it: it comfortably reaches 1000 nits, and as long as FALD backlighting is enabled, IPS backlight bleed is pretty much non-existent. There was no observed ghosting or other artifacts of the like.

The other aspect of the HDR gaming 'playflow' is that enabling HDR can be slightly different per game, and Alt-Tabbing is hit-or-miss - that is on Microsoft/Windows 10, not on Asus - but it's certainly much better than before. For example, Shadow of War had no in-game HDR toggle and relied on the Windows 10 toggle. And with G-Sync and HDR now in the mix, adjusting resolution and refresh rate (for 144Hz, the monitor needs to be put in OC mode) to get the exact desired configuration can be a bit of a headache; lowering in-game resolution to 1440p while keeping G-Sync HDR and 144Hz OC mode was particularly finicky.

At the end of the day, not all HDR is made equal, and that goes for game-world and scene construction in addition to HDR support. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation. I would strongly recommend visiting a brick-and-mortar outlet that offers an HDR demo, or looking into the specific HDR content you would want to play or watch.

Comments
  • imaheadcase - Tuesday, October 2, 2018 - link

    3840x1600 is the Dell I mean.
  • Impulses - Tuesday, October 2, 2018 - link

    The Acer Predator 32" has a similar panel to that BenQ and adds G-Sync, though still at a max 60Hz; it's not as well calibrated out of the box (and has a worse stand and controls), but it has dropped in price a couple times to the same as the BenQ... I've been cross-shopping them for a while, because 2 grand for a display whose features I may or may not be able to leverage in the next 3 years seems dubious.

    I wanted to go 32" too because the 27" 1440p doesn't seem like enough of a jump from my 24" 1920x1200 (being 16:10, it's nearly as tall as the 16:9 27"ers), and I had three of those, which we occasionally used in Eyefinity mode (making a ~40" display). I've looked at 40-43" displays but they're all lacking compared to the smaller stuff (the newer ones are all VA too, mostly Philips and one Dell).

    I use my PC for photo editing as much as PC gaming but I'm not a pro so a decent IPS screen that I can calibrate reasonably well would satisfy my photo needs.
  • Fallen Kell - Tuesday, October 2, 2018 - link

    It is "almost" perfect. It is missing one of the most important things: HDMI 2.1, which has the bandwidth to actually feed the panel with what it is capable of (i.e. 4K HDR 4:4:4 120Hz). But we don't have that, because this monitor was actually designed 3 years ago and is only now finally coming to market, 6 months after HDMI 2.1 was released.
  • lilkwarrior - Monday, October 8, 2018 - link

    HDMI 2.1 certification is still not done; it would not have been able to call itself HDMI 2.1 till probably late this year or next year.
  • imaheadcase - Tuesday, October 2, 2018 - link

    The 35-inch one has been canceled, FYI. An Asus rep told me when I inquired about it just a week ago, unless something has changed in a week. Reason being the panel is not yet perfected for mass production.

    That said, it's not a big loss, even if disappointing, because HDR is silly tech, so you can skip this generation.
  • EAlbaek - Tuesday, October 2, 2018 - link

    I bought one of these, just as they came out. Amazing display performance, but the in-built fan to cool the G-Sync HDR-module killed it for me.

    It's one of those noisy 40mm fans, which were otherwise banned from PC setups over a decade ago. It made more noise than the entirety of the rest of my 1080 Ti-SLI system combined. Like a wasp was loose in my room all the time. Completely unbearable to listen to.

    I tried to return the monitor as RMA, as I thought that couldn't be right. But it could, said the retailer. At which point I chose to simply return the unit.

    In my case, these things will have to wait till nVidia makes a new G-Sync HDR module that doesn't require active cooling. Plain and simple. I'm sort of guessing that'll fall in line with the availability of micro-LED displays. Which will hopefully also be much cheaper than the ridiculously expensive FALD panels in these monitors.
  • imaheadcase - Tuesday, October 2, 2018 - link

    Can't you just replace the fan yourself? I read around the time of release that someone simply removed the fan and put their own silent version on it.
  • EAlbaek - Tuesday, October 2, 2018 - link

    No idea - I shouldn't have to void the warranty on my $2000 monitor, to replace a 40mm fan.
  • madwolfa - Tuesday, October 2, 2018 - link

    Is it the G-Sync HDR module that requires active cooling, or the FALD array?
  • EAlbaek - Tuesday, October 2, 2018 - link

    It's the G-Sync HDR chip, apparently.
