Hot on the heels of the Swift PG35VQ announcement, ASUS has unveiled three new Strix-branded gaming monitors that will feature AMD’s FreeSync technology. While the first Strix model was unveiled back at CES 2017 - the Strix XG27VQ - it was a one-off until the announcement of these newest models. Clearly, ASUS intends to have a gaming monitor lineup that consists of both the Strix and Swift series.

Starting from largest to smallest, we have the Strix XG32V. This model has a 31.5-inch IPS panel with a WQHD resolution of 2560 × 1440 and an 1800R curve that should help provide a wider field of view. While we don't have many other technical details, we do know that this model can handle refresh rates of up to 144Hz and that it supports FreeSync. It will be interesting to see what the actual FreeSync range is.

For connectivity, there are two DisplayPort 1.2 inputs, one HDMI 2.0 input, and an undisclosed amount of USB 3.0 ports. Like on many of their other new gaming-oriented monitors, ASUS has added Aura Sync lighting to the XG32V. This not only means there will be an ROG logo that shines down onto the desk, but a back panel that features RGB LEDs and that can be synchronized with other Aura Sync-enabled PC components and peripherals.

Moving on to the next monitor, the Strix XG27V is a more value-minded model that shares the 1800R curvature of the XG32V, but shrinks the panel down to 27 inches and the resolution to 1920 × 1080. Thankfully, this model can still handle refresh rates of up to 144Hz, and it also supports FreeSync.

While the XG27V has onboard Aura RGB lighting, it doesn't have Aura Sync, so it cannot synchronize with other Aura-compatible components and peripherals. Although specifics are lacking, connectivity comes in the form of a DisplayPort input, an HDMI input, and a DVI-D port.

While the Strix XG258 might be the smallest of the bunch, it has an ace up its sleeve. This 24.5-inch display features a 1920 × 1080 resolution, which is pretty conventional, but it supports a maximum refresh rate of up to 240Hz. According to ASUS, this means that the delay between new frames is just 4.2 ms, compared to 6.9 ms on a 144Hz gaming monitor.
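ASUS's quoted figures are easy to sanity-check: the interval between refreshes is just 1000 ms divided by the refresh rate. A minimal sketch (the helper name is our own, not anything from ASUS):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between successive refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

# 240 Hz refreshes roughly every 4.2 ms; 144 Hz roughly every 6.9 ms,
# matching the figures ASUS quotes.
print(round(frame_interval_ms(240), 1))  # 4.2
print(round(frame_interval_ms(144), 1))  # 6.9
```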

Much like the XG27V, this model has onboard Aura RGB lighting, but it doesn't have Aura Sync, so it cannot synchronize with other Aura-compatible components and peripherals. For connectivity, there are two DisplayPort 1.2 and two HDMI inputs, one of which is HDMI 2.0.

Although we have no pricing details, all of these new Strix gaming monitors will be available starting in Q3.

Source: ASUS


  • Vyvian07 - Tuesday, June 06, 2017 - link

    Wow, the Strix XG32V sounds amazing. Too bad it will most likely be WAY out of my price range.
  • tarqsharq - Wednesday, June 07, 2017 - link

    AOC is releasing the AGON AG322QCX or something along those lines that I suspect is based on the exact same panel for probably less money.
  • Jax Omen - Monday, June 12, 2017 - link

    I've been waiting for that monitor since February, can't find ANYTHING on it since it was previewed.

    Was supposed to come out in May.

    I just want a >27" 144hz 1440p non-TN monitor to replace my 30" 60hz IPS! V_V
  • waltsmith - Tuesday, June 06, 2017 - link

    Awfully nice to see more Freesync monitors hitting the market, now if they just had a decent refresh range... Amazing how many are out there with a pitifully small range that is practically unusable.
  • DanNeely - Tuesday, June 06, 2017 - link

    I think a lot of it is the other half of Freesync being based on an optional (if rarely used) part of the spec. While it could be turned on via a firmware update to existing controllers, if the controller wasn't designed around it, the ability to usefully support it was rather limited. Presumably by now any new monitors are designed as a full hardware package where everything was intended to support variable sync from the start, and all future Freesync monitors should have refresh ranges comparable to what G-Sync monitors can do.
  • edzieba - Tuesday, June 06, 2017 - link

    "Amazing how many are out there with a pitifully small range that is practically unusable. "

    It's an artifact of Freesync using a regular panel controller (hence the reduced cost) rather than a dedicated controller as with G-Sync. Freesync allows a panel controller manufacturer to take an existing 'normal' fixed refresh rate controller, which already has the capability to output a wide range of pixel clock timings (so you can sell one controller to drive a wide range of panels), and 'unlock' that existing variation to be changed per-frame. The downside is that the range of panels serviced by each controller isn't that large (e.g. one controller may happily do 1920x1080 @ 100Hz, or 2560x1440 @ 60Hz, but won't do 3840x2160 @ 120Hz), which limits the available VRR range to what the controller can do.

    It is of course possible to design a controller that will operate over a very large range of refresh rates, but it's expensive to do so. Monitor assemblers are taking the view of "if Nvidia have already made the investment to design such a panel controller, why would we not just buy the controller that exists rather than funding development of one that doesn't?".
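The pixel-clock limits described in the comment above can be illustrated with rough arithmetic. This is a hypothetical sketch: it assumes roughly 20% of each frame is blanking overhead, a ballpark figure (real CVT/CVT-RB timings vary):

```python
def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                           blanking_overhead: float = 0.20) -> float:
    """Rough pixel clock needed to drive a mode, assuming ~20% of each
    frame is blanking (an approximation, not a real timing standard)."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

# The two example modes one controller can handle land close together...
print(approx_pixel_clock_mhz(1920, 1080, 100))  # ~249 MHz
print(approx_pixel_clock_mhz(2560, 1440, 60))   # ~265 MHz
# ...while 3840x2160 @ 120Hz needs several times more bandwidth,
# well beyond what that same controller was designed for.
print(approx_pixel_clock_mhz(3840, 2160, 120))  # ~1194 MHz
```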
  • Alexvrb - Tuesday, June 06, 2017 - link

    Your statements are grounded in bias, rather than fact. Freesync allows a broad range of implementations and controllers with various price levels and capabilities. It's all about giving people choice, a wide range of products. You can have affordable low-refresh panels with a smaller range like 40-75, which is better than similarly priced non-adaptive sync displays with NO adaptive range. Or you can implement a premium gaming monitor with high refresh rates and very large FreeSync ranges. I've seen 144Hz displays with a 30-144 range. There's no reason a manufacturer couldn't implement even larger ranges should they so choose. FreeSync version 2 has been out for a while and adds features like LFC, too. G-Sync was conceived as a feature for the elite, high-end displays. The peasants among us could only afford regular old non-adaptive sync monitors... until FreeSync came along.

    Also the number of FreeSync monitors on the market is greater than the number of G-Sync models. That seems to run contrary to your assertion that manufacturers choose the Nvidia solution because they already designed such a controller. This is despite G-Sync's head start. Must be because you didn't factor in the high cost of Nvidia's proprietary solution.
  • edzieba - Wednesday, June 07, 2017 - link

    "Freesync allows a broad range of implementations and controllers with various price levels and capabilities."

    I never stated otherwise (in fact, I explicitly stated this). What is allowed for and what is available are two different things, though. Asus, Acer, etc. do not design panel controllers, nor does AMD. Both are dependent on what panel controller manufacturers can implement for the least cost.

    "Also the number of FreeSync monitors on the market is greater than the number of G-Sync models. That seems to run contrary to your assertion that manufacturers choose the Nvidia solution because they already designed such a controller. This is despite G-Sync's head start. Must be because you didn't factor in the high cost of Nvidia's proprietary solution."
    Cost is the reason for the lack of high-end Freesync monitors, and the proliferation of more basic ones. More capable controllers cost more, regardless of who is designing them. Somebody may pony up the funds to make a high-end controller compatible with DP Adaptive Sync, but thus far nobody has.

    "FreeSync version 2 has been out for a while and adds features like LFC, too."
    Note that the majority of features added to Freesync 2 are host-side ones done in software (e.g. HDR) rather than ones dependent on changes in monitor hardware.
  • Alexvrb - Thursday, June 08, 2017 - link

    You said "It's an artifact of Freesync using a regular panel controller". This is simply not true. They could have very easily mandated that manufacturers meet various specs, including sync range. The real cause is them allowing OEMs free rein to implement any number of solutions from cheap to premium. Even the lowest-end 40-60Hz display (or an entry-level IPS display) is still better than a conventional 60Hz display (all else remaining equal), and the adaptive sync range is still beneficial when your framerate fluctuates - dipping into the 50s or 40s wouldn't present a problem nearly as severe as with a conventional monitor. But there are plenty of examples of monitors with wider ranges, yes even with "regular" (non-proprietary) controllers.

    "Cost is the reason for the lack of high-end Freesync monitors, and the proliferation of more basic ones. More capable controllers cost more, regardless of who is designing them. Somebody may pony up the funds to make a high-end controller compatible with DP Adaptive Sync, but thus far nobody has." Like I said, cost. Trying to paint the spread of affordable adaptive sync displays as a bad thing smacks of elitism. Adaptive sync? Not for you plebs with your cheap displays. These days even gamers whose rig is $600 (arguably the ones who benefit the most from adaptive sync) can afford a FreeSync monitor now. Also there are high-end FreeSync solutions for those who want and can afford them. Or I guess monitors like the XG270HU are using a low-end controller, weird, wonder how they pull it off.

    "Note that the majority of featured added to Freesync 2 are host-side ones done in software (e.g. HDR) rather than ones dependant on changes in monitor hardware."
    I don't think it much matters as long as it works. For example, windowed full-screen FreeSync support. The graphics card (and its drivers) is the other part of the equation anyway. LFC does have hardware requirements, however.
  • JoeyJoJo123 - Tuesday, June 06, 2017 - link

    While the Strix XG258 looks interesting, I'm leaning more towards the XL2540 as a FreeSync 24.5" 1080p TN 240hz monitor myself.

    Cleaner aesthetics, likely more competitive price, and adjustable motion blur settings which can allow for a "60Hz ULMB" mode for plugged-in 1080p 60Hz consoles and devices.

    https://www.blurbusters.com/benq/strobe-utility/

    >Q: Can I use Blur Reduction for Game Consoles and for Television?

    >A: Yes! You just pre-configure Strobe Utility at 1920×1080 60Hz via computer first, then switch input to the HDMI input for gaming/televison. You can even unplug your monitor and move it to a different room, after pre-configuring it with Strobe Utility. The computer is no longer needed after configuring. Currently, Strobe Utility is the only way to get “LightBoost effect” at 60Hz. This is CRT-clarity 60fps at 60Hz with no motion blur!
