Monitors are getting exciting. Not only are higher resolution panels becoming more of the norm, but the combination of different panel dimensions and feature sets means that buying the monitor you need for the next 10 years is getting more difficult. Today Acer adds some spice to the mix by announcing pre-orders for the XB280HK – a 28-inch TN monitor with 3840x2160 resolution that also supports NVIDIA’s G-Sync to reduce tearing and stuttering.

Adaptive frame rate technologies are still in the early phases of adoption by the majority of users. AMD’s FreeSync is still a few quarters away from market, and NVIDIA’s G-Sync requires an add-in module, which started off as an interesting, albeit expensive, monitor upgrade. Fast forward a couple of months and, as you might expect, the natural home for G-Sync is in the more impressive monitor configurations. 4K is becoming the go-to resolution for anyone with deep enough wallets, although some might argue that 21:9 monitors offer better gaming immersion.
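The stutter half of that equation is easy to show with arithmetic. Below is a minimal Python sketch, using hypothetical frame times and an assumed 144 Hz minimum refresh interval (neither figure is tied to any particular panel), comparing when frames reach the screen on a fixed 60 Hz display with v-sync versus a variable-refresh display: the fixed display quantizes every frame to 16.7 ms ticks, producing the uneven on-screen cadence perceived as stutter.

```python
import math

# Minimal sketch: when does each rendered frame actually appear on screen?
# Fixed 60 Hz (v-sync): a finished frame waits for the next scheduled refresh.
# Variable refresh (G-Sync): the panel refreshes when the frame is ready,
# subject to a minimum interval between refreshes.

FIXED_INTERVAL_MS = 1000 / 60    # 60 Hz refresh period
MIN_INTERVAL_MS = 1000 / 144     # assumed fastest panel refresh

# Hypothetical render-complete timestamps (ms) for six frames of an uneven workload.
frame_done = [0.0, 22.0, 41.0, 66.0, 85.0, 112.0]

def fixed_refresh(times):
    """Each frame appears at the first 60 Hz tick at or after it finishes."""
    return [math.ceil(t / FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS for t in times]

def variable_refresh(times):
    """Each frame appears when ready, no sooner than MIN_INTERVAL_MS after the last."""
    shown, last = [], -MIN_INTERVAL_MS
    for t in times:
        last = max(t, last + MIN_INTERVAL_MS)
        shown.append(last)
    return shown

print("fixed   :", [round(t, 1) for t in fixed_refresh(frame_done)])
print("adaptive:", [round(t, 1) for t in variable_refresh(frame_done)])
# fixed   : [0.0, 33.3, 50.0, 66.7, 100.0, 116.7]  <- uneven on-screen cadence (stutter)
# adaptive: [0.0, 22.0, 41.0, 66.0, 85.0, 112.0]   <- matches the render cadence
```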

The XB280HK will support 3840x2160 at 60 Hz via DisplayPort 1.2, along with a 1 ms gray-to-gray response time and a fixed frequency up to 144 Hz. The stand adjusts up to 155mm in height with 40º of tilt, and offers 120º of swivel plus a full quarter turn of pivot for portrait use. Brightness is rated at 300 cd/m², and the 8-bit + Hi-FRC TN panel has a typical contrast ratio of 1000:1 and 72% NTSC gamut coverage. A 100x100mm VESA mount is supported, and the monitor includes a USB 3.0 hub, though no built-in speakers.
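For reference, a quick back-of-the-envelope check shows why DisplayPort 1.2 is sufficient for 4K at 60 Hz; the blanking figures below are approximate CVT reduced-blanking values (an assumption, not Acer's published timings). The ~12.5 Gbps requirement also explains why older HDMI 1.4 connections top out at 30 Hz at this resolution.

```python
# Back-of-the-envelope check: does 3840x2160 @ 60 Hz fit in a DisplayPort 1.2 link?
H_ACTIVE, V_ACTIVE, REFRESH = 3840, 2160, 60
BPP = 24                                      # 8 bits per channel, RGB

# Approximate reduced-blanking (CVT-R2) totals; exact timings differ slightly.
H_TOTAL, V_TOTAL = H_ACTIVE + 80, V_ACTIVE + 62

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH  # ~523 MHz
required_gbps = pixel_clock_hz * BPP / 1e9

# DP 1.2: 4 lanes x 5.4 Gbps (HBR2) = 21.6 Gbps raw; 8b/10b encoding leaves 80%.
dp12_effective_gbps = 4 * 5.4 * 0.8           # 17.28 Gbps

print(f"required: {required_gbps:.2f} Gbps")
print(f"DP 1.2  : {dp12_effective_gbps:.2f} Gbps -> fits: {required_gbps < dp12_effective_gbps}")
```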

The XB280HK is currently available for pre-order in the UK at £500, and will carry a US MSRP of $800. Also part of the Acer XBO range is the XB270H, a 27-inch 1920x1080 G-Sync panel with an MSRP of $600. According to the pre-order listings, the expected release date is October 3rd.

Source: Acer

Comments

  • Lord of the Bored - Saturday, September 20, 2014 - link

    FPGAs don't pop out of holes in the ground, no. They are designed and manufactured by companies like Atmel and Xilinx, then sold to other entities for prototyping and small-scale hardware production.
    FPGAs are not custom parts in any way, shape, or form. You can likely buy the EXACT part nVidia uses for GSync from any electronic component store in the world (I've been unable to find an identifier for WHICH FPGA they're using, just that they use one).
    The ONLY difference is you don't have access to the code nVidia is configuring theirs with.

    There is NO nVidia-developed silicon in a GSync module.

    Which is kinda my point. How many product launches have you seen shipping on a generic FPGA dev board instead of a custom chip? I'mma bet it's close to zero.
    Most of the cost of GSync IS that FPGA. An actual GSync chip would be far cheaper, because it would have fewer wasted transistors and no need to be reprogrammable.
    And why would they ship an expensive reprogrammable FPGA on a development board (and ask the end user to install it in their own monitor initially!) instead of shipping dedicated silicon to monitor manufacturers for integration? It makes no sense whatsoever unless there was some need to bring a product to market before there was an actual product to ship.
  • chizow - Saturday, September 20, 2014 - link

    The point is that Nvidia programmed the FPGA to do what was not previously possible with regard to variable refresh. That takes time and effort; calling it haphazard is careless, plain and simple, especially when the competition's "response" is to make a lot of claims about their own solution that have since been systematically proven to be untrue.

    Once you put in the work to program the FPGA, it is Nvidia-developed silicon. As you have both stated, you can buy an FPGA, but you can't buy one that mimics a G-Sync module without the code Nvidia configured it with.
  • Lord of the Bored - Sunday, September 21, 2014 - link

    Programming an FPGA does not convert it into nVidia-developed silicon. It is still an FPGA, just one that functions as a prototype of a device that might one day be dedicated silicon. There are differences, most notably that dedicated silicon is a lot cheaper and can't be reprogrammed.

    I find it hard to call a product launch that resembles a guy in his garage anything other than haphazard when it comes from a multi-billion-dollar company.

    I haven't seen any evaluation of VESA Adaptive Sync, systematic or otherwise, and I would really love to, if you'd care to share the link. Because I like the idea that nVidia, as a VESA member, saw that their plans to adapt Adaptive Sync to the desktop were flawed, and rushed prototypes out so they could get a BETTER solution to market before VESA's plan became entrenched. It's certainly a nicer explanation than any I've been able to come up with.
    So... link please?
  • chizow - Sunday, September 21, 2014 - link

    Of course it's Nvidia-developed silicon at that point, just as any other FPGA purchased and programmed for an intended purpose becomes that maker's silicon. Similarly, something as simple as an EEPROM, once programmed, is now the silicon of that particular BIOS/board maker. Same for an SSD controller, especially in the case of Intel, which purchases commodity controllers from SandForce and programs them with its own firmware to get the desired results. That's Intel silicon at that point.

    Again, how is the way they've designed and brought their G-Sync module to market haphazard? They invented a solution from scratch, using existing panel interfaces, and replaced the entire logic board of an existing panel on the market. Bear in mind, they don't make monitors or monitor logic boards; they make GPUs. They've since worked with at least half a dozen monitor makers on dozens more designs (4-5 of which have already hit the market) that will implement G-Sync, not even a year after it was announced.

    You seem fixated on their use of an FPGA instead of a dedicated ASIC in your haphazard characterization, but if the FPGA is the best tool for the job and gets the job done, why bother changing? Also, if you read some of the more technical background on G-Sync, you might know that it needs to be tuned to each specific panel, so an FPGA being reprogrammable may very well be the reason Nvidia has not moved to a more rigid ASIC design.

    You haven't seen an evaluation of VESA Adaptive Sync because it doesn't exist. AMD has hinted they will debut it sometime soon; there may be something at their GPU event next week (9/25). But what we have seen so far has been underwhelming and, I dare say, even haphazard in the way they have presented their "solution", as their latest efforts don't even accomplish what they claim (dynamic refresh rates synchronized to the GPU).

    http://www.anandtech.com/show/8129/computex-2014-a...
    Latest demo from Computex, showing a fixed, sub-60Hz refresh rate on some unbranded panel.
  • Samus - Monday, September 22, 2014 - link

    To call it not nVidia silicone is like calling a Geforce TSMC silicone. Technically it is manufactured by somebody else, but it's unique to nVidia, their IP, and their programming and quality standards.

    ASIC, FPGA, BGA, whatever you want to call it. If it is programmed for or by nVidia, it's technically their silicone, just not physically.
  • Kuad - Monday, September 22, 2014 - link

    Silicon, as in the element, not silicone, as in breast implants.
    http://www.livescience.com/37598-silicon-or-silico...
  • Lord of the Bored - Wednesday, September 24, 2014 - link

    A program running on someone else's silicon does not make it YOUR custom-developed silicon.
    It is your custom-developed PROGRAM, but just because I run Windows does not mean my CPU was designed by Microsoft.

    And your link doesn't say anything you claim it does. It says the anonymous monitor is an off-the-shelf product, not an unbranded panel; that it was made FreeSync-compatible with nothing more than a firmware change; and that its refresh varies between 40 and 60 Hz, while YouTube is fixed at 30 Hz (which, for obvious reasons, makes the video demonstration of dubious utility). It also says that is not THE range for FreeSync, but that the monitor is allowed to select a compatible range of refresh rates from the much broader range presented by the standard.
    That's what was said. Not that the demo was running at a fixed sub-60Hz refresh.

    I would still genuinely love to see a link to a source where AMD's claims about FreeSync are "systematically proven to be untrue," but... that wasn't it.
  • flatrock - Monday, September 22, 2014 - link

    ASICs are cheaper per unit, but there is a larger up-front cost, plus a large setup cost for each batch. Because of those up-front costs you need to make an awful lot of them before they become cost-effective, and you need to be sure the design won't need tweaking, because it can't be changed (see the rough break-even sketch below).
    With an FPGA the up-front costs are much smaller, and if you need to tweak the design, you can usually do it in circuit. In many cases you can even do it as a firmware update after the product is with the customer.
    As for how many products I have seen shipping with an FPGA? More than I can count. I worked in military/aerospace for quite a while. Smaller volumes than commodity consumer products, and lots of updates and even customization of existing products. We are still talking about more than a few thousand units, but not into the millions.

    G-Sync hasn't hit real mass-market volumes yet, so an FPGA makes sense, especially if they managed good volume discounts on the FPGA.
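
    To put rough numbers on that tradeoff, here is a quick break-even sketch in Python; every figure is a hypothetical placeholder (real FPGA prices, ASIC unit costs, and NRE vary enormously with process node, volume, and vendor):

    ```python
    # Rough break-even: at what volume does a dedicated ASIC beat an FPGA?
    # All numbers are hypothetical placeholders, not actual G-Sync figures.
    FPGA_UNIT_COST = 50.0      # assumed per-unit FPGA price (USD)
    ASIC_UNIT_COST = 5.0       # assumed per-unit ASIC price at volume (USD)
    ASIC_NRE = 2_000_000.0     # assumed one-time design + mask cost (USD)

    break_even = ASIC_NRE / (FPGA_UNIT_COST - ASIC_UNIT_COST)
    print(f"break-even volume: ~{break_even:,.0f} units")   # ~44,444 units

    for units in (10_000, 50_000, 100_000):
        fpga = FPGA_UNIT_COST * units
        asic = ASIC_NRE + ASIC_UNIT_COST * units
        print(f"{units:>7,} units: FPGA ${fpga:>12,.0f}  ASIC ${asic:>12,.0f}"
              f"  -> {'FPGA' if fpga < asic else 'ASIC'} wins")
    ```

    Below the break-even volume the FPGA's zero up-front cost wins outright, which matches the point above about early, lower-volume products.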
  • chizow - Friday, September 19, 2014 - link

    The DP 1.3 spec was just ratified a few days ago, with nothing about Adaptive Sync or variable refresh in it. The dream that all DP 1.3 monitors will be FreeSync-capable is dead (for the few that ever thought this would be the case).
  • Gigaplex - Saturday, September 20, 2014 - link

    It's in DP 1.3, it's just optional.
