HDR Gaming Impressions

In the end, a monitor like the PG27UQ is really destined for one purpose: gaming. And not just any gaming, but the type of quality experience that does not compromise between resolution and refresh rate, let alone HDR and VRR.

That being said, not many games support HDR, which for G-Sync HDR means HDR10 support. Even for games that do support an HDR standard of some kind, the quality of the implementation naturally varies from developer to developer. And because HDR support has largely been driven by consoles, some games only feature HDR in their console incarnations.

The other issue is that the HDR gaming experience is hard to communicate objectively. In-game screenshots won't replicate how the HDR content is delivered on the monitor with its brightness, backlighting, and wider color gamut, while photographs are naturally limited by the capturing device. And any HDR content will in turn be limited by the viewer's display. On our side, this makes it easy to simply gush about glorious HDR vibrance and brightness, especially as on-the-fly blind A/B testing is not so simple (duplicated SDR and HDR output is not currently possible).

As for today, we are looking at Far Cry 5 (HDR10), F1 2017 (scRGB HDR), Battlefield 1 (HDR10), and Middle-earth: Shadow of War (HDR10), which covers a good mix of genres and graphics intensity. Thanks to in-game benchmarks for three of them, they also provide a static point of reference; in the same vein, Battlefield 1's presence in the GPU game suite means I've seen and benchmarked the same sequence enough times to dream about it.

For such subjective-but-necessary impressions like these, we'll keep ourselves grounded by sticking to a few broad questions:

  • What differences are noticeable from 4K with non-HDR G-Sync?
  • What differences are noticeable from 4:4:4 to 4:2:2 chroma subsampling at 98Hz?
  • What about lowering resolution to 1440p HDR or lowering details with HDR on, for higher refresh rates? Do I prefer HDR over high refresh rates?
  • Are there any HDR artifacts, e.g. halo effects, washed-out or garish colors, or blooming due to local dimming?

The 4K G-Sync HDR Experience

From the beginning, we expected that targeting 144fps at 4K was not really plausible for graphically intense games, and that still holds true. On a reference GeForce GTX 1080 Ti, none of the games averaged past 75fps, and even the brand-new RTX 2080 Ti won't come close to doubling that.

Ubiquitous black loading and intro screens make the local dimming bloom easily noticeable, though this is a well-known phenomenon and somewhat unavoidable. The majority of the time, it is fairly unobtrusive. Because the local backlighting zones can only get so small on LCD displays – in the case of this monitor, each zone is roughly 5.2 cm² in area – anything smaller than a zone will still light up the entire zone. For example, a logo or loading throbber on a black background will have a visible glow around it. The issue is not specific to the PG27UQ; it's just that the higher maximum brightness makes it a little more obvious. One answer to this is OLED, where self-emitting subpixels allow lighting to be controlled on an individual subpixel basis, but burn-in makes it unsuitable for PCs for now.


Loading throbbers for Shadow of War (left) and Far Cry 5 (right) with the FALD haloing effect
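
For reference, the ~5.2 cm² per-zone figure follows from simple geometry. A minimal sketch, assuming the PG27UQ's advertised 384-zone FALD array behind a 27-inch 16:9 panel:

```python
# Back-of-the-envelope check of the ~5.2 cm^2 per-zone figure,
# assuming a 384-zone FALD backlight behind a 27" 16:9 panel.
import math

diagonal_in = 27.0           # panel diagonal, inches
aspect_w, aspect_h = 16, 9   # aspect ratio
zones = 384                  # FALD zone count (per ASUS's spec)

diag_cm = diagonal_in * 2.54
width_cm = diag_cm * aspect_w / math.hypot(aspect_w, aspect_h)
height_cm = diag_cm * aspect_h / math.hypot(aspect_w, aspect_h)

area_cm2 = width_cm * height_cm
print(f"Panel area:    {area_cm2:.0f} cm^2")          # ~2010 cm^2
print(f"Per-zone area: {area_cm2 / zones:.1f} cm^2")  # ~5.2 cm^2
```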

Much could be said in trying to describe the sheer brightness range, but the closest analogy that comes to mind is dialing up smartphone brightness to maximum after a day of nursing a low battery at 10% brightness. It's still up to the game to take full advantage of it with HDR10 or scRGB. Some games will also offer settings for gamma, maximum brightness, and/or reference white levels, allowing you to match the HDR output to the brightness capability of the monitor.
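
For context on what those sliders are actually adjusting: HDR10 encodes the signal with the SMPTE ST 2084 'PQ' transfer function, which maps a normalized code value to an absolute luminance of up to 10,000 nits. Below is a minimal sketch of the EOTF; the constants come from the ST 2084 spec, while the 0.58 sample value is just an illustrative reference-white level:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized HDR10 signal (0..1) -> luminance in nits.
def pq_eotf(signal: float) -> float:
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y       # cd/m^2 (nits)

# A reference-white setting decides where ordinary scene content sits on this
# curve: a signal of ~0.58 works out to roughly 200 nits, while the top of the
# curve reaches the 10,000-nit ceiling.
print(round(pq_eotf(0.58)), round(pq_eotf(1.0)))    # ~202, 10000
```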

The most immediate takeaway is the additional brightness and how fast it can ramp up. The former has a tendency to make things look clearer and more colorful - essentially the Hunt effect in play. The latter is very noticeable in transitions, such as sudden sunlight, looking up at the sky, and changes in lighting. Of course, the extra color vividness works hand-in-hand with the better contrast ratios, but again this can be game- and scene-dependent; Far Cry 5 seemed to fare the best in that respect, though Shadow of War, Battlefield 1, and F1 2017 still looked better than in SDR.

In-game, I couldn't perceive any quality difference going from 4:4:4 to 4:2:2 chroma subsampling, though the games couldn't push past 98Hz at 4K anyway. So at 50 to 70fps averages, the result felt more 'cinematic', because HDR made scenes look brighter and more realistic while the gameplay remained the typical pseudo-60fps VRR experience. With that in mind, it would probably suit exploration-heavy games where you 'stop-and-look' a lot - and unfortunately, we don't have Final Fantasy XV at the moment to try out. NVIDIA themselves say that increased luminance actually increases the perception of judder at low refresh rates, but luckily the presence of VRR mitigates judder in the first place.
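
The 98Hz ceiling for full 4:4:4 output is a DisplayPort 1.4 bandwidth limit rather than anything game-related. A rough back-of-the-envelope sketch, assuming HBR3's ~25.92 Gbps effective data rate and ignoring blanking overhead (so real timings are a bit tighter than this):

```python
# Why 4K 10-bit HDR needs 4:2:2 above ~98Hz on DP1.4 (HBR3).
DP14_GBPS = 25.92   # HBR3 payload rate after 8b/10b encoding
W, H = 3840, 2160

def gbps(refresh_hz: float, bits_per_pixel: int) -> float:
    # Raw pixel data rate, ignoring blanking intervals.
    return W * H * refresh_hz * bits_per_pixel / 1e9

# At 10 bpc: RGB/4:4:4 = 30 bits per pixel, YCbCr 4:2:2 = 20 bits per pixel.
for label, hz, bpp in [("144Hz 4:4:4", 144, 30),
                       (" 98Hz 4:4:4",  98, 30),
                       ("144Hz 4:2:2", 144, 20)]:
    need = gbps(hz, bpp)
    verdict = "fits" if need <= DP14_GBPS else "exceeds DP1.4"
    print(f"{label}: {need:.1f} Gbps -> {verdict}")
```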

4K HDR Gaming Performance - GTX 1080 Ti & ROG Swift PG27UQ

What was interesting to observe was a performance impact with HDR enabled (and with G-Sync off) on the GeForce GTX 1080 Ti, which seems to corroborate last month's findings by ComputerBase.de. For the GTX 1080 Ti, Far Cry 5 was generally unaffected, but Battlefield 1 and F1 2017 took clear performance hits, appearing to stem from 4:2:2 chroma subsampling on HDR (YCbCr422). Shadow of War also seemed to fare worse. Our early results also indicate that even HDR with 4:4:4 chroma subsampling (RGB444) may result in a slight performance hit in affected games.

It's not clear what the root cause is, and we'll be digging deeper as we revisit the GeForce RTX 20-series. Taking a glance at the RTX 2080 Ti Founders Edition, the performance hit of 4:2:2 subsampling is reduced to negligible margins in these four games.

4K HDR Gaming Performance - GeForce RTX 2080 Ti FE & PG27UQ

On Asus' side, the monitor does everything that is asked of it: it comfortably reaches 1000 nits, and as long as FALD backlighting is enabled, the IPS backlight bleed is pretty much non-existent. I did not observe any ghosting or similar artifacts.

The other aspect of the HDR gaming 'playflow' is that enabling HDR can be slightly different per game and Alt-Tabbing is hit-or-miss - that is on Microsoft/Windows 10, not on Asus - but it's certainly much better than before. For example, Shadow of War had no in-game HDR toggle and relied on the Windows 10 toggle. And with G-Sync and HDR now in the mix, adjusting resolution and refresh rate (and for 144Hz, the monitor needs to be put in OC mode) to get the exact desired configuration can be a bit of a headache. Lowering the in-game resolution to 1440p while keeping G-Sync HDR and 144Hz OC mode, for instance, proved very finicky.

At the end of the day, not all HDR is made equal, which goes for game-world and scene construction as much as for HDR support itself. So although the PG27UQ is up to the task, you may not see the full range of color and brightness translated into a given game, depending on its HDR implementation. I would strongly recommend visiting a brick-and-mortar outlet that offers an HDR demo, or looking into the specific HDR content you would want to play or watch.

Comments

  • Ryan Smith - Wednesday, October 3, 2018 - link

    Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.
  • a5cent - Wednesday, October 3, 2018 - link

    Yeah, nVidia's DP1.4 VRR solution is bafflingly poor/non-competitive, not just due to the requirement for active cooling.

    nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (FPGA alone is estimated to be ~ $500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.

    If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.

    So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology that seems to have priced itself out of the market and is bound to become irrelevant.

    Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
  • Impulses - Wednesday, October 3, 2018 - link

    How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end...

    Just the sad reality. I just bought a G-Sync display but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
  • a5cent - Wednesday, October 3, 2018 - link

    "How would anyone outside of NV know where they're going with this tho?"

    Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.

    "I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."

    You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.

    First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.

    Almost nobody, for whom price is even a little bit of an issue, will pay that.

    While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't selling and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.

    To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor upgrades to a new nVidia card who would otherwise have bought an AMD card. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.

    Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
  • a5cent - Wednesday, October 3, 2018 - link

    To clarify the above...

    If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.

    For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
  • lilkwarrior - Monday, October 8, 2018 - link

    Nvidia's solution is objectively better besides not being open. Similarly, NVLINK is better than any other multi-GPU solution hardware-wise.

    With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.

    Once standards catch up, Nvidia hasn't been afraid to deprecate their own previous effort somewhat besides continuing to support it for wide-spread support / loyalty or a balanced approach (i.e. NVLINK for Geforce cards but delegate memory pooling to DX12 & Vulkan)
  • Impulses - Tuesday, October 2, 2018 - link

    If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.
  • DanNeely - Tuesday, October 2, 2018 - link

    Thank you for including the explanation on why DSC hasn't shown up in any products to date.
  • Heavenly71 - Tuesday, October 2, 2018 - link

    I'm pretty disappointed that a gaming monitor with this price still has only 8 bits of native color resolution (plus FRC, I know).

    Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and has many more inputs (including USB-C DP). For about the same price.
  • milkod2001 - Tuesday, October 2, 2018 - link

    Wonder if they make native 10bit monitors. Would you be able to output 10bit colours from gaming GPU or only professional GPU?
