Catalyst 13.8 Beta 1: The First Multi-GPU Frame Pacing Driver

The culmination of AMD’s first wave of efforts to manage frame pacing is the Catalyst 13.8 driver (driver branch 13.200). Released in beta form today, the driver’s marquee feature is the new frame pacing mechanism for Crossfire setups. As with any major new driver branch, this release also includes some other improvements, and while we don’t have the complete release notes, AMD has mentioned that these drivers will bring about full OpenGL 4.3 compliance (apparently they were missing a couple of items before).

AMD is calling this driver “phase 1” of their frame pacing solution, and for good reason. In implementing frame pacing AMD has tackled the issue in what’s very obviously a triage-like manner, focusing on the most important/significant problems and working out from there. So what’s addressed by this first driver resolves AMD’s biggest issues, but not all of them.

So what’s being addressed in phase 1? Phase 1 is being dedicated to Direct3D 10+ games running on a single display. What’s not being addressed in the first driver are the Direct3D 9 and OpenGL rendering paths, along with Eyefinity in any scenario.

It goes without saying that in an ideal world we would have liked to see AMD hit everything at once, but if they couldn’t do it all at once then choosing to tackle D3D10+ games first was the next best move they could make. This covers virtually all of the games, present and future, that are graphically challenging enough to weigh down a high-end Crossfire setup. D3D9 games by and large are not that demanding on this class of hardware – we’d have to resort to Skyrim mods to find a D3D9-exclusive title that isn’t CPU limited and/or gets less than 90fps off of a single GPU. OpenGL has even less traction, the last OpenGL game of note being 2011’s Rage, which is capped at 60fps and easily hits that at 1080p on even 7800 series hardware.

Catalyst 13.8 Frame Pacing
          Single Display    Eyefinity
D3D11     Y                 N
D3D10     Y                 N
D3D9      N                 N
OpenGL    N                 N

It’s Eyefinity users who will be the most unfortunate bunch at the moment. Eyefinity is one of the premier usage scenarios for Crossfire because of the amount of GPU horsepower it requires, but it’s also the most complex scenario to tackle – work must be split across multiple GPUs and then multiple display controllers – relative to its fairly low user uptake. More so than with D3D9 and OpenGL, AMD does need to get Eyefinity sorted, and quickly, but for the moment single display setups are it. On that note, 4K displays are technically also out, since the current 60Hz 4K displays actually present themselves as two displays, with video cards addressing them via Eyefinity and other multi-monitor surround modes.

On the plus side, since this is a purely driver based solution, AMD is rolling out frame pacing to all of their currently supported products, and not just the 7000/8000 series based GCN parts. This means 5000 and 6000 series Crossfire setups, including multi-GPU cards like the 5970 and 6990, are also having their pacing issues resolved in this driver. Given the limited scope of this driver we were afraid it would be GCN-only, so this ended up being a relief.

Moving on, let’s dive into the new driver. True to their word, AMD has made the new frame pacing mechanism a user-controllable option available in the Catalyst Control Center. Located in the CrossfireX section of the 3D Application Settings page and simply titled “Frame Pacing,” it defaults to on. Turn it off and AMD’s rendering behavior reverts to the low-lag behavior of previous drivers.

As far as technical details go, AMD has not offered up any significant information on how their new frame pacing mechanism works. Traditionally neither AMD nor NVIDIA have offered a ton of detail on how they implement AFR under the hood, so while unfortunate from an editorial standpoint it’s not unexpected. Hopefully once AMD finishes the other phases and enables the new frame pacing mechanism elsewhere, we’ll be able to get some solid details on what AMD is doing to implement frame pacing. For the moment we have only the barest of details: AMD is delaying frames so as to prevent any frame from being shown too early, presumably relying on backpressure in the rendering queue to stabilize and keep future frames coming at a reasonable pace.

With that said, based on just the frame time measurements from our benchmark suite we can deduce a bit more about what AMD is doing. Unlike NVIDIA’s “organic” approach, which results in frame times that follow a similar pattern to single-GPU setups but with far wider variation, the frame times we’re seeing on 13.8 follow a very distinct, very mechanically metered pattern.

Accounting for some slight variation due to how back buffer swapping works, what we see are some very distinct minimum frame time plateaus in our results. Our best guess is that AMD is running some kind of adaptive algorithm that looks at a window of rendering times and, based on that, enforces a minimum frame time, ultimately adjusting itself every few seconds as necessary. NVIDIA doesn’t implement anything quite like this, but beyond that we don’t know how the two compare algorithmically at this time. However, regardless of their differences, what we’re ultimately interested in is how well each mechanism works.
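To illustrate what such a windowed minimum-frame-time scheme might look like, here is a minimal sketch. To be clear, AMD has not disclosed its algorithm; the `FramePacer` class and every name in it are our own hypothetical illustration of the behavior we’re inferring from the frame time plateaus, not AMD’s implementation:

```python
from collections import deque

class FramePacer:
    """Hypothetical sketch of an adaptive frame pacer: track a sliding
    window of recent render times and enforce their average as a
    minimum time between presented frames."""

    def __init__(self, window_size=120):
        self.render_times = deque(maxlen=window_size)  # sliding window
        self.min_frame_time = 0.0   # enforced floor, in seconds
        self.last_present = None    # timestamp of the last presented frame

    def record_render_time(self, seconds):
        # Re-derive the floor from the window average; plateau-like frame
        # times suggest a floor that is only adjusted periodically.
        self.render_times.append(seconds)
        self.min_frame_time = sum(self.render_times) / len(self.render_times)

    def delay_before_present(self, now):
        # How long to hold a finished frame so it isn't shown too early.
        if self.last_present is None:
            self.last_present = now
            return 0.0
        target = self.last_present + self.min_frame_time
        delay = max(0.0, target - now)
        self.last_present = now + delay
        return delay
```

In use, the driver-side loop would call `record_render_time` after each frame finishes rendering and sleep for `delay_before_present(now)` before flipping the back buffer; the backpressure this creates in the rendering queue is what would keep subsequent frames arriving at an even pace.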


102 Comments

  • mwildtech - Friday, August 2, 2013 - link

    Are you still signed into AOL...? ;) I also haven't had many issues with either, at least from a single GPU perspective.
  • kyuu - Friday, August 2, 2013 - link

    What a surprise, the AMD-bashing trolls are out in force with long rants that nobody will read.

    Give it a rest guys.

    Anyways, great write-up Ryan. Good to see AMD is getting the issue taken care of.
  • chizow - Saturday, August 3, 2013 - link

    Except in this case, "AMD bashing trolls" helped fix your CF drivers. A simple "thank you" would have sufficed.
  • TheJian - Tuesday, August 6, 2013 - link

    ROFL...I sincerely thank you for the laugh ;)

    I liked many products over the years but have been saved by vocal complainers pointing out things to make me run, or at least wait until fixes come. I waited for RROD to get fixed with Jasper. Years of complainers finally got a fix (it took so long I started doubting I'd ever own one). My friend who jumped on x360 early shipped his back multiple times in the first year. I believe it spent more time at MS than in his house...LOL. He was a vocal complainer in their forums etc but I never called him a MS bashing troll for it. I laughed and thanked him for being one of the people who saved me years of that frustration :) He only thought that was funny after some beers...LOL

    Thankfully he has a great sense of humor. He's ready with forum accounts everywhere he thinks the complainers will be for xbox1 this time (complainers have value people). But he expects to be a reader this time rather than the complainer ;) I think he'll go PS4 in the end despite the MS love he has vs. Sony. His wallet has no trouble voting against his fanboy thoughts.

    I'm torn over the consoles though. I'd love to see AMD start making some cash, but at the same time I'm pretty unhappy they blew a wad of R&D money on something I want completely dead instead of cpus/gpus/arm socs. Had that R&D went to PC's I don't think I'd be making these statements dissing AMD. At the least they could have kept the layoffs from happening (losing 30% of your smartest people will shaft us on PC's for a few years at least and longer if consoles don't take off by the millions), and had good drivers all last year. That also might have given them a better reputation thus not needing to give out free games that are clearly wiping out profits (Q report shows this). AMD has a great gpu. It's a pity they didn't have enough funding for R&D to pair it with a great driver from day1 and funding to avoid the Fcat disaster. Even if it affects a small group it causes a lot of people to paint your other products with that image.
  • Steveymoo - Friday, August 2, 2013 - link

    Interestingly enough, I seem to remember my GTX 460s having microstutter and performance issues in SLI. To the point where your experience in twitch games would be better if you just disabled one of the GPUs. However, over the years, and many driver updates, I don't seem to notice it any more. Nvidia really must have quite a talented software team, who communicate well with the hardware division. I would say there might be some kind of company structure issue for a problem such as this to go unnoticed, and un-fixed for such a long time.
  • anubis44 - Friday, August 2, 2013 - link

    Ssshhhhh! TheJian will be all over you like a duck on a june bug! Remember, Nvidia's drivers are always perfect! They never make any mistakes...

    ...well, except for the chronic problem I had with the GTX670 card I bought for my 3 monitor setup - kept requiring about 20 steps to get all three screens to display due to bad default refresh rate/synch issue in the Nvidia driver. Got so frustrated having to go through 20 steps every time I updated to a newer driver that I sold the card for close to what I paid (~$400) and bought a Gigabyte 7950 for about ~$100 less and flashed the bios to 1050MHz. 3 monitors in eyefinity set up in about 5 minutes in the Catalyst control panel and not a problem since.
  • DanNeely - Friday, August 2, 2013 - link

    Are you using display port monitors or an active DP-DVI adapter for your third monitor? If the latter, has it finally gotten plug and play vs the problems when it first came out? I was never able to get an adapter to work with my 5870, and since my setup wasn't EF compatible anyway (2x 1200x1600, 1x 2560x1600) I ended up cutting my losses with a 5450 for the 3rd monitor and went nVidia for my next GPU in response.
  • krutou - Friday, August 2, 2013 - link

    Nvidia is known to suck at multi-monitor support because AMD was the first to develop the technology. One of AMD's few strengths is Eyefinity support.
  • TheJian - Tuesday, August 6, 2013 - link

    From the article (and this is repeated at every site reviewing the drivers):
    "So what’s being addressed in phase 1? Phase 1 is being dedicated to Direct3D 10+ games running on a single display. What’s not being addressed in the first driver are the Direct3D 9 and OpenGL rendering paths, along with Eyefinity in any scenario."

    So Eyefinity has issues and isn't even touched with phase1. At the very least AMD is the opposite of strength with eyefinity for now. Phase2 maybe? ;)
    https://en.wikipedia.org/wiki/File:Graham%27s_Hier...
    You've stated a point without backing it (4th, green).

    Refutation:
    I found your mistake and explained why it is one and backed it with a direct quote (from this article no less...ROFL) thus proving my point ;) That's the purple one :) But I'm pretty sure I made it into the grey anyway. Your central point is debunked. But I can live with purple if it makes you feel better.

    Being first has no bearing on who is better later. Horses got us from point A to B first, long before cars right? But that didn't stop a car from blowing them away later. I could say the same about the first car engine vs. say a Lamborghini engine today. First doesn't mean best.
  • TheJian - Tuesday, August 6, 2013 - link

    Why, he's pointing out reality and what most sites point out. All multi cards had issues for a while and still do. NV just spent a lot more to come up with the tools/software to fix it as best as possible (and I'd still go single potent vs. even NV multi given a reasonable choice). You're mistaking an accurate product complaint for fanboyism. That is not what my complaints are. There is no reason to attack his comment as I already know it's at least partially true for all CF/SLI and the fix is proven (so is AMD's lack of it up to now, and still having issues with 3 cards).

    Would you feel better if I ranted on Bumpgate for a few paragraphs? When a company sucks I point it out. I don't care who it is. Caminogate anyone? I ranted then too. Win8, don't get me started, Vista...(fista? Nuff said). I have equal hate for all crappy releases no matter how much love or hate I have for a company (I hate apple's tactics & pricing, but they do generally have a good polished product). If AMD releases a great 20nm product and NV sucks I will RAVE for AMD and shout at the top of my lungs how NV's product sucks. Based on R&D I doubt NV will suck but AMD can still get out a good product, I just need proof at this point due to lack of funds/engineers pointing to a possible problem launch again.

    Comically you miss the entire point of any of my posts (which are backed by data from other sites etc), then rant yourself on NV. Congrats though, at least you made it to the 4th rung here (well sort of):
    https://en.wikipedia.org/wiki/Ad_hominem
    But not without making the 2nd worst type of argument first...ROFL. You're not outing me here, you're outing yourself.
