Alongside their new Pavilion gaming desktops and laptops, today HP is also announcing an addition to their monitor stable with the release of the new Pavilion Gaming 32 HDR Display. An update of sorts to HP's current family of 32-inch monitors, including the existing Pavilion Gaming 32 and Omen 32, HP's latest monitor carves out an important niche for itself by adding basic HDR support.

At a high level, the Pavilion Gaming 32 HDR is built on a 32-inch VA panel featuring a 2560x1440 resolution, 300 nits typical brightness, a 3000:1 contrast ratio, 5ms response times, and, like many of HP's other 32-inch monitors, FreeSync support. The all-important backlighting system being used to enable HDR is an edge-lit LED system that, like similar systems, supports eight zones of local dimming for increased contrast.

The chassis of the monitor comes in one color, Shadow Black. Its most identifiable feature is the stand that supports the panel: instead of a circular base with a single post, HP is using a rectangular base with two posts, which should make for more stable footing. The back of the monitor has a large green HP symbol in the middle of its gently arched rear panel. Also on the rear, at the bottom, are the power adapter connector, two USB 3.0 ports, and the two HDMI and single DisplayPort inputs. On the far left are the power, menu, and adjustment buttons, which leaves the front with a clean aesthetic. The only design features on the front are a smaller green HP symbol in the center of the bottom bezel and the power LED in the bottom right corner.

Officially this is a DisplayHDR 600-certified monitor. This means that it supports mid-tier HDR features, including 600 nits peak luminance for brief periods of time. HP rates the typical brightness at just 300 nits, though as a DisplayHDR 600-compliant display it should be able to sustain 350 nits indefinitely. Equally important, on the color gamut side, the DisplayHDR requirements mean that this monitor needs to support DCI-P3, with HP going beyond the standard's minimum with 95% DCI-P3 coverage.

Meanwhile, to earn its gaming credentials, the monitor supports AMD's first-generation FreeSync variable refresh technology. The 48Hz to 75Hz refresh range isn't wide enough to support FreeSync low framerate compensation, but along with allowing at least some variability here, the higher-than-average 75Hz maximum refresh rate does give the monitor an edge in smoothness over standard 60Hz monitors. This also puts the Pavilion Gaming 32 HDR in limited company, as it offers both HDR and FreeSync support. Notably, however, this is not a FreeSync 2 display, so it doesn't get to take advantage of AMD's newer HDR-aware variable refresh technology.
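The low framerate compensation (LFC) point above comes down to simple arithmetic: LFC works by doubling frames when the framerate drops below the FreeSync floor, which requires the maximum refresh rate to be roughly at least twice the minimum. A minimal sketch (the 2x ratio is the commonly cited rule of thumb, not an official published constant):

```python
def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.0) -> bool:
    """Return True if the refresh range is wide enough for frame doubling."""
    return max_hz >= ratio * min_hz

# The Pavilion Gaming 32 HDR's 48-75 Hz range falls short: 75 / 48 is only ~1.56
print(supports_lfc(48, 75))   # False
print(supports_lfc(48, 144))  # True: a 48-144 Hz panel would qualify
```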

The HP Pavilion Gaming 32” Display will cost $449 and will be available through HP.com and other retailers such as Amazon on May 11th.

Specifications of the HP Pavilion Gaming 32 HDR (3BZ12AA#ABA)
Panel: 32" VA
Native Resolution: 2560x1440 (WQHD)
Refresh Range: 48-75 Hz
Response Time: 5 ms GtG
Brightness: 300 cd/m² (typical), 600 cd/m² (peak)
Contrast: 3000:1
Viewing Angles: 178°/178° horizontal/vertical
Pixel Pitch: 0.276 mm
Pixel Density: 91.8 ppi
Display Colors: 16.7 million
Color Gamut Support: DCI-P3 95%
Inputs: 1 × DisplayPort, 2 × HDMI
HDCP: 1.4/2.2
Stand: Tilt angle: 21° up, 5° down
Audio: N/A
VESA Mount: 100 × 100 mm
Additional Information: N/A
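The table's pixel density and pixel pitch figures follow directly from the resolution and diagonal, and can be verified with a quick calculation:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

density = ppi(2560, 1440, 32)   # ~91.8 pixels per inch, matching the table
pitch_mm = 25.4 / density       # ~0.277 mm between pixel centers

print(f"{density:.1f} ppi, {pitch_mm:.3f} mm pitch")
```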

Source: HP


  • faiakes - Wednesday, April 11, 2018 - link

    Shouldn't it be a 10-bit panel for HDR?
  • bubblyboo - Wednesday, April 11, 2018 - link

    None of VESA's HDR grades require more than 8-bit + FRC.
  • Valantar - Wednesday, April 11, 2018 - link

    Is there any noticeable/measurable difference between 8-bit + FRC and 10-bit?
  • Lolimaster - Wednesday, April 11, 2018 - link

    There will always be a difference unless your eyes are bad.
  • JoeyJoJo123 - Thursday, April 12, 2018 - link

    No. From the words of the very thorough monitor review and calibration site, TFTCentral:

    "In fact on many modern panels these FRC are very good and in practice you’d be hard pressed to spot any real difference between a 6-bit + FRC display and a true 8-bit display. Colour range is good, screens show no obvious gradation of colours, and they show no FRC artefacts or glitches in normal everyday use. Most average users would never notice the difference and so it is more important to think about the panel technology and your individual uses than get bogged down worrying about 6-bit vs. 8-bit arguments."

    http://www.tftcentral.co.uk/faq.htm#colour_depth

    Whether a panel uses FRC for higher bit depth should have no bearing on your decision to buy or not buy a monitor. Pedantic memesters will think it makes a difference, but for content consumers, the use of FRC won't make an impact. Use of true 10-bit or even 12-bit displays may be warranted and justified if your bread is earned by creating color-critical content, such as in the film or advertisement industry, but if you're posting in the comments section of AnandTech asking whether it's worth it, I'm pretty sure you're not one of those people.
  • mdriftmeyer - Saturday, April 14, 2018 - link

    "No" is not what they said. They said these are "good enough" for most discerning eyes. In short, most likely not, but the average consumer won't know what to look for, so they're good enough.
  • Alistair - Wednesday, April 11, 2018 - link

    Except it doesn't say it supports 8-bit + FRC; that would be a billion colors, not 16 million (6-bit + FRC, or 8-bit without).

    This better support 8-bit + FRC, or when I see HDR600 etc. I will just not care.
  • Ryan Smith - Wednesday, April 11, 2018 - link

    6-bit panels are not allowed for any DisplayHDR tiers; all panels must be at least 8-bit. For 600 and 1000, 8-bit + FRC is the minimum standard.
  • Alistair - Wednesday, April 11, 2018 - link

    Ok then, so the table should say 1 billion colors? :)
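    The color counts being debated above follow from per-channel bit depth: an N-bit-per-channel RGB panel addresses (2^N)^3 colors, which is why "16.7 million" in a spec sheet implies 8 bits per channel (native, or 6-bit + FRC), while "1.07 billion" implies 10 bits (native, or 8-bit + FRC). A quick check:

```python
def color_count(bits_per_channel: int) -> int:
    """Total addressable colors for an RGB panel at a given per-channel depth."""
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 16,777,216 -> "16.7 million"
print(f"{color_count(10):,}")  # 1,073,741,824 -> "1.07 billion"
```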
  • bug77 - Thursday, April 12, 2018 - link

    Consumer video cards do not output 10 bits per channel anyway, so it would be a waste.
