We met with AMD, and among other things they wanted to show us the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were each running on a different GPU: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed anywhere from 40 to 55 Hz in 5Hz increments, or set it to vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display meanwhile was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
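To make the sweep mode concrete, here is a minimal sketch of what the demo is doing conceptually: pacing its render rate along a triangle wave between 40 and 55 Hz, with the FreeSync display then refreshing as each frame arrives. This is purely illustrative Python of our own, not AMD's demo code, and all names in it are made up for the example.

```python
import time

# Hypothetical sketch of the demo's sweep mode: the renderer paces itself
# to a target rate that oscillates between 40 and 55 Hz. On a FreeSync
# display the panel refreshes when each frame arrives; on a fixed-refresh
# panel these off-cadence frames are what produce tearing or judder.

SWEEP_MIN_HZ = 40.0
SWEEP_MAX_HZ = 55.0
SWEEP_PERIOD_S = 4.0  # one full 40 -> 55 -> 40 cycle every 4 seconds

def target_hz(elapsed_s: float) -> float:
    """Triangle-wave sweep between the minimum and maximum target rates."""
    phase = (elapsed_s % SWEEP_PERIOD_S) / SWEEP_PERIOD_S  # 0.0 .. 1.0
    tri = 2 * phase if phase < 0.5 else 2 * (1 - phase)    # 0 -> 1 -> 0
    return SWEEP_MIN_HZ + tri * (SWEEP_MAX_HZ - SWEEP_MIN_HZ)

start = time.perf_counter()
for frame in range(300):
    hz = target_hz(time.perf_counter() - start)
    frame_time = 1.0 / hz
    # render_frame() would go here; we simply sleep to simulate the pacing
    time.sleep(frame_time)
    print(f"frame {frame:3d}: target {hz:5.1f} Hz ({frame_time * 1000:5.1f} ms)")
```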

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD’s feature goals, and it should be available in the next few months. Meanwhile, AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum supported refresh rate will vary on a per-monitor basis, depending on how quickly that monitor’s pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms between refreshes) while others, with pixels quicker to decay, will have a 40Hz (25ms) minimum.
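The math behind those numbers is straightforward: a panel's minimum refresh rate caps how long it can hold a frame before its pixels decay too far. The short sketch below illustrates the conversion and checks whether a frame time fits inside a monitor's variable refresh window; the helper names are our own, not anything from AMD or a display spec.

```python
# Illustrative helpers for the refresh-window math quoted above.

def max_hold_ms(min_refresh_hz: float) -> float:
    """Longest frame time (ms) a panel can display before it must refresh."""
    return 1000.0 / min_refresh_hz

def in_vrr_window(frame_time_ms: float, min_hz: float, max_hz: float) -> bool:
    """True if a frame time falls inside the panel's variable refresh range."""
    return 1000.0 / max_hz <= frame_time_ms <= 1000.0 / min_hz

print(max_hold_ms(30))               # 33.3 ms: slower-decaying pixels
print(max_hold_ms(40))               # 25.0 ms: faster-decaying pixels
print(in_vrr_window(20.0, 40, 144))  # 50 fps on a 40-144 Hz panel -> True
```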

On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a FreeSync-capable display. Better panels and other components are needed, which will increase the BoM (Bill of Materials), and that cost will be passed on to consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.

Comments

  • TheJian - Friday, January 9, 2015 - link

    Let me know when AMD shows it running GAMES (a lot of them) to prove it works. I'll take the one that is BEST, no matter who owns it, if they also have a great balance sheet (higher R&D, profits, cash on hand, etc.) and can keep investing in the best drivers for the GPU I'll need to buy to use with the monitors. While I currently run a 5850, it will be my last AMD card/product for a while if G-Sync is BETTER. I already have zero interest in their CPUs due to Intel. I hate that, but again, why buy 2nd when they're so far behind? I just can't do that anymore these days. AMD has to beat NV in GPU perf with at least the same power AND at least MATCH G-Sync (no loss in quality/perf results) or they lose my money for ages this time.

    NV owns 67% of discrete, AMD owns 33% or so. NV owns 80% of workstations. Umm, NV is like Intel to AMD in GPUs. AMD owns next to nothing these days due to selling everything off to stay afloat. Profits are rare, balance sheet is in shambles, etc. The complete opposite of NV. NV wins in every financial category. These two companies are not even on the same playing field today (used to be, but not now). Please take a look at their financials for the last decade then explain to me how they are even. NV has put as much money (~7B) into Cuda over the last 8 years or so, as AMD has LOST in the last decade (about 7B of losses). If that isn't night and day, I don't know what is. You're not making sense sir. I don't care about everyone, I care about having the best on my desk that I can get ;) If that means proprietary, I'll go that way unless 2nd is so close you can't tell the difference. But we can't say that here yet, since AMD seems afraid to show it GAMING.

    You talk like you've seen freesync running 100 games. NOBODY has. I'm more worried about being stuck with a CRAP solution if NV caves while having the BEST solution already in hand and done (don't think they'd do that, but...). Maybe AMD would have a marketing dept if they'd start making some money so they could afford a REAL one. They need to start putting out GREAT products instead of "good enough for most", so they can charge a PREMIUM and make some profits for a few years in a row. The only reason AMD has the fastest single card is NV didn't want one ;) It's not like they don't have the cash to put out whatever they want to win. I'm fairly certain AMD isn't making wads of cash on that card ;) Their balance sheet doesn't show that anyway :( Whatever they spent to dev that card should have been spent on making BETTER cards that actually sell more volume (along with better drivers too!). NV could probably buy AMD if desired at this point (stupid, but they could). NV's market cap is now 5.5x AMD's and they spend more on R&D with less products (AMD makes a lot of cpus, etc but spends less on R&D). NV is beating AMD for the same reasons Intel is. Smarter management (should have paid 1/3 for ATI, should have passed on consoles like NV, etc etc), better products, more cash, less debt, more profits. That's a LOT for any small company to overcome and we're not even talking the coming ARM wave to steal even more low-end crap on the cpu front where AMD lives (getting squished by ARM+Intel sides). You should get the point. You can't compete without making money for long.
  • Antronman - Saturday, January 10, 2015 - link

    What about HBM? What if (it's very likely) that the R9 300 series card have HBM?

    What then?

    Because who really needs adaptive refresh rates when you can just buy a 144Hz monitor and have a minimum of 144fps?

    Adaptive refresh rates are a feature for budget systems and 4k resolutions on the current GPUs. There really is little other use.
  • chizow - Sunday, January 11, 2015 - link

    @Antronman, this is the 2nd statement you've made that is, more or less, nonsense.

    Are you some kind of Hope Merchant? Are you really trying to say HBM is automagically going to result in 144fps minimums? Is HBM even confirmed for AMD's next cards? Bit of a stretch since you're kinda going 2 degrees of FUD separation here, don't you think?

    Back in reality, many people will benefit from less-than-max refresh rates without having to use triple buffering, while solving the issue of tearing and minimizing input lag/stutter.
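To put some numbers on the benefit being described here (our own simplified model, assuming a 60Hz fixed-refresh panel with strict double-buffered V-SYNC and frame times inside the variable refresh window):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # 16.7 ms scanout interval on a 60 Hz panel

def vsync_display_ms(render_ms: float) -> float:
    """With double-buffered V-SYNC, a finished frame waits for the next
    refresh boundary, so display times quantize to multiples of 16.7 ms."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_ms(render_ms: float) -> float:
    """With variable refresh, the panel refreshes when the frame is ready."""
    return render_ms

for render_ms in (15.0, 22.0, 30.0):  # ~66, ~45, ~33 fps render rates
    print(f"render {render_ms:4.1f} ms -> "
          f"V-SYNC {vsync_display_ms(render_ms):4.1f} ms, "
          f"VRR {vrr_display_ms(render_ms):4.1f} ms")

# A 45 fps renderer (22 ms frames) is held to 33.3 ms under V-SYNC, an
# effective 30 Hz (stutter); variable refresh displays it at 22 ms.
```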
  • Antronman - Sunday, January 11, 2015 - link

    That was an exaggeration.

    But several multi-card setups (and even now some single cards) can attain 144fps.

    Adaptive refresh rates are only useful if you have a card that can't keep a minimum fps past the refresh rate of a fixed refresh rate panel, at least if maximum refresh rates don't go any higher than they are now (if we can even see the difference between higher refresh rates).
  • chizow - Monday, January 12, 2015 - link

    Ah, so you posted a bunch of rubbish and only clarified once called on it, gotcha. Why am I not surprised? Sounds like more "noise" within a distinctly pro-AMD diatribe.

    There are very few multi-card set-ups that can achieve and maintain 144fps minimums in modern games, and at what resolution are you talking about? 1080p? 720p? This new wave of FreeSync and G-Sync monitors is moving beyond 1080p and pushing 1440p, which is nearly 2x as many pixels as 1080p, meaning it is that much harder to maintain those FPS.

    Same as 4K, it's 4x the pixels/resolution of 1080p, so yeah, you're looking at video cards and combinations that don't exist yet that can maintain 60FPS minimums at 4K for modestly demanding, recent games.

    In every single one of these situations, variable refresh would work wonders for end-users. In fact, spending a little extra money on a variable refresh monitor may end up saving you from having to spend 2-3-4x as much on video cards trying to achieve the unrealistic goal of minimum FPS that meets or exceeds the monitor's maximum refresh rate, whether 144Hz or 60Hz at 4K.
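For reference, the resolution arithmetic in this exchange checks out; here is a quick illustrative script (ours, not the commenters') working out the pixel counts being argued over:

```python
# Pixel counts per resolution. GPU load scales very roughly with pixels,
# so holding a fixed minimum framerate gets much harder at 1440p and 4K
# (approximate: real games are not perfectly fill-rate bound).

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.2f}x 1080p")

# 1440p: 3,686,400 pixels, 1.78x 1080p  (the "nearly 2x" above)
# 4K:    8,294,400 pixels, 4.00x 1080p
```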
  • djc208 - Thursday, January 8, 2015 - link

    Because it's the Apple way of doing things. You have to buy into the NVidia ecosystem and stay there. Nothing prevents NVidia from supporting both their standard and FreeSync, or Intel from offering it in their APUs. Since there's nothing to license, it should eventually be easy for monitor manufacturers to build it into every monitor they offer, even if in limited form for cheaper panels that wouldn't support the same range of refresh rates.
    No one argues Apple doesn't make good stuff, and they have helped drive the mobile and desktop market in beneficial directions for a while, but it's still Apple, and most of us don't want to live under their roof. Same goes for Nvidia.
  • Yojimbo - Thursday, January 8, 2015 - link

    It's also the AMD way of doing things (TrueAudio), and just about every company's way of doing things. AMD simply felt they couldn't compete here, so they played defensively instead of going head-to-head on it with NVIDIA.
  • medi03 - Thursday, January 8, 2015 - link

    So what is "defensive" about FreeSync? The fact that it is available to all vendors (with no royalty fee) to implement?
  • FlushedBubblyJock - Tuesday, February 24, 2015 - link

    What's defensive is AMD is penniless, so they will do the second-hand generic and demand, like all their fans do, that it be the universal monopoly.
    Thus, when Bill Gates and Microsoft OWN the OS on every computer in the world, you AMD fans need to be reminded, that's the way you like it.
  • chizow - Thursday, January 8, 2015 - link

    And thus is the nature of competition, isn't it? You make your products better than the competition, to benefit your existing customers and attract new ones?

    I guess Nvidia, Apple, Intel and everyone else should just stop innovating and developing new tech, or just give it away for free so that their competitors can catch up? Or even more laughably, give up on their original, pre-existing, and superior tech just because their competitor is finally offering an inferior analogue over a year later?
