When AMD launched FreeSync back in March, one of the limitations of the initial release was that only single-GPU configurations were supported. Multi-GPU Crossfire systems could not be used with FreeSync, forcing users to choose between Crossfire and FreeSync. At the time, AMD claimed that Crossfire FreeSync support would be coming in April; however, as April comes to a close, it has become clear that such a release isn't going to happen.

To that end, AMD has posted a short update on the status of Crossfire FreeSync over on their forums. In the update, AMD states that after QA testing they believe Crossfire FreeSync is "not quite ready for release" and that they will be holding it back as a result. Unfortunately, AMD is not committing to a new release date for the feature, but given that it's more important to get this right than to get it out quickly, this is likely for the best.

Source: AMD

92 Comments

  • chizow - Thursday, April 30, 2015 - link

    *force V-Sync off
  • Gigaplex - Friday, May 1, 2015 - link

    When going outside the range (pumping more frames than the monitor can handle) G-SYNC goes into forced V-Sync, increasing input latency. Freesync gives the option of V-Sync on or V-Sync off. You don't get both tearing and stutter, you get either tearing or increased input latency (you get to choose what is more important to you), whereas with G-SYNC you don't get the choice. Many competitive gamers would prefer the tearing to keep input latency low.

    There's nothing in the GPU disabling anti-ghosting and overdrive mechanisms with Freesync. That's an implementation detail in current monitors (assuming it's true, I haven't heard anyone else claim this but I'll give you the benefit of the doubt).

    When the frame rate drops below the monitor's minimum refresh rate, G-SYNC does some frame duplication to trick the monitor into thinking it's operating at a higher rate. Freesync doesn't currently do this, but this is entirely up to the GPU driver. This can easily change in the future. If you're regularly encountering this situation though, then your GPU isn't powerful enough to run the game at the current settings. You're going to have a sub-par experience with either G-SYNC or Freesync.
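    For what it's worth, the frame-duplication idea described above can be sketched in a few lines. This is a hypothetical illustration of the general technique, not NVIDIA's or AMD's actual driver logic: when the game's frame rate falls below the panel's VRR minimum, scan each frame out multiple times so the effective refresh rate stays inside the supported range.

```python
def effective_refresh(frame_fps, vrr_min=40, vrr_max=144):
    """Pick a per-frame duplication multiplier so the panel's effective
    refresh rate stays inside its [vrr_min, vrr_max] VRR window.
    Returns (effective_hz, scans_per_frame)."""
    if frame_fps >= vrr_min:
        # In range: one scan-out per rendered frame, clamped to the max.
        return min(frame_fps, vrr_max), 1
    multiplier = 1
    # Repeat each frame until the effective rate re-enters the window.
    while (frame_fps * (multiplier + 1) <= vrr_max
           and frame_fps * multiplier < vrr_min):
        multiplier += 1
    return frame_fps * multiplier, multiplier
```

    So a 25 FPS game on a hypothetical 40-144Hz panel would have each frame scanned out twice for an effective 50Hz, which is exactly the kind of bookkeeping a driver could do without any monitor-side support.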
  • Socius - Friday, May 1, 2015 - link

    That's not entirely true. VSYNC at 144Hz and GSYNC at 144Hz are completely different. Playing FPS games (I average a 7.3 KDR), I haven't noticed any input lag from hitting the 144Hz cap. That's because VSYNC and GSYNC are entirely different beasts.

    It's not that going above 144fps all of a sudden introduces input lag. It should be no different than playing with VSYNC off and enforcing a 144fps cap through RivaTuner/MSI Afterburner.
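    For readers unfamiliar with external limiters: conceptually, a frame cap just sleeps away the unused portion of each frame's time budget, independent of any sync mechanism. A minimal sketch of the general idea (not RivaTuner's actual implementation):

```python
import time

def run_capped(render_frame, cap_fps=144, frames=3):
    """Run render_frame in a loop, sleeping out the remainder of each
    frame's time budget so the loop never exceeds cap_fps."""
    budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the game's actual work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)
```

    Because the cap is enforced by pacing frame submission rather than by waiting on the display's vertical blank, it limits the frame rate without adding V-Sync-style back-pressure latency.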
  • chizow - Friday, May 1, 2015 - link

    Exactly. Anyone who has actually used G-Sync understands this so thank you for chiming in.
  • chizow - Friday, May 1, 2015 - link

    Again, you're wrong on this. V-Sync is NEVER enabled with G-Sync; in fact, you MUST force it off in-game for G-Sync to work properly. At capped FPS, you do get the frame rate capped to the monitor's max refresh rate, but it is still not V-Sync, because it introduces nowhere near as much input lag. And while it does have more input lag than V-Sync off or G-Sync below the cap, it is still fluid and dynamic. This was by design, as Nvidia's goal was to never have tearing, ever. However, Tom Petersen has stated they liked the idea of disabling V-Sync at the top end of the refresh range and will look into making that happen in the future.

    FreeSync, on the other hand, completely forces V-Sync off when you fall out of VRR range, but in that transition there is a noticeable "jarring" jerk as you move from VRR on to VRR off. Also, if you look at usage scenarios, the flaws with FreeSync's VRR range are MUCH more likely to be exhibited, since most of these panels are high-res 2560x1440 and 4K panels. What makes it worse is you can't even rely on CF to help you stay above these relatively high VRR thresholds. 40 FPS on a 4K monitor, for example, is still damn hard to accomplish with any single GPU, even the Titan X.

    Also, I am right about FreeSync disabling overdrive (OD), which is why you end up getting ghosting. Some responsible review sites covered it more than others (PCPer, TFTCentral, Forbes), but TFT confirmed it happens with ALL FreeSync panels. Later, PCPer correctly surmised that Nvidia's G-Sync module takes over these OD timings and controls, as it essentially replaces the existing scaler module.

    "For some reason, the combination of FreeSync support and this display disables the AMA function.....Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on."

    AMD claims they can fix this via driver, but does that really seem likely? I mean, they use a simple protocol, V-Blank, to replace the functions a scaler performed at a fixed refresh rate, and they expect to just fix this with drivers? I guess it is possible, but it is looking less and less likely with every passing day, and it is certainly not as "simple" as some made it out to be (it has been over a month since those claims were made when FreeSync launched).
  • chizow - Friday, May 1, 2015 - link

    Oops, here's the link to the TFT review, just scroll down to the big yellow "Incomplete" portion of the review.

    http://www.tftcentral.co.uk/reviews/benq_xl2730z.h...
  • gnuliver - Saturday, May 2, 2015 - link

    I don't understand why you would want to have "vsync off" in any situation when using VRR. It defeats the point of VRR entirely when you have tearing and visual inconsistency. If you're a competitive gamer and that matters to you, turn off VRR.
  • chizow - Sunday, May 3, 2015 - link

    Well, I guess the belief is that at the high end of a 120-144Hz VRR range, V-Sync off negates any input lag (which I've found negligible anyway, and certainly less than V-Sync on), and when your refresh rate is that high, the associated tearing is much less noticeable, since the on-screen distance between split frames is so small. I guess you can say it is the same reason some might prefer ULMB at high FPS over G-Sync: they prefer the lack of blur over the minor tearing at high FPS.

    I think it would be a good option to allow the user to decide, and it sounds like Tom Petersen is going to make it happen if it is possible within the limitations of G-Sync.
  • Cellar Door - Thursday, April 30, 2015 - link

    "No surprise, it's obvious AMD is running into all kinds of problems with the limitations of their FreeSync spec."

    And you are an expert on FreeSync and know what the issues are - dude, you sound like an idiot.
  • chizow - Thursday, April 30, 2015 - link

    Isn't it obvious? We keep hearing how "simple" it is to fix these problems, that AMD is just going to release a driver fix to address these issues.

    Where are these fixes? Oh right, see announcement: DELAYED.

    Obviously, it is not as simple as AMD claimed. Maybe Nvidia was right when they said getting VRR right was hard, which is why they ultimately went with their own custom FPGA to get the job done rather than relying on a hodgepodge of existing standards that leaves you with a half-baked solution introducing as many problems as it fixes.
