Discussing Percentiles and Minimum Frame Rates

Up until this point we have only discussed average frame rates, which are easy numbers to generate from a benchmark run. Reporting minimum frame rates is a little trickier, because it could be argued that the single slowest frame in a run should define the minimum. All it then takes is one infrequent bad GPU request (such as a misaligned texture cache access) to skew the data. To this end, thanks to the logging functionality of the benchmark, we are able to report the frame rate profile of each run along with percentile numbers.
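
To illustrate why the single worst frame makes a poor "minimum" (a minimal sketch with hypothetical frame-time numbers, not data from our runs):

```python
# A single slow frame dominates the "absolute minimum", even when the
# run is otherwise smooth. Hypothetical log of per-frame times in ms.
frame_times_ms = [16.7] * 999 + [100.0]  # 999 fast frames, one bad GPU request

min_fps = 1000.0 / max(frame_times_ms)                     # worst single frame
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

print(f"average: {avg_fps:.1f} fps, absolute minimum: {min_fps:.1f} fps")
# average: 59.6 fps, absolute minimum: 10.0 fps
```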

For the GTX 980 Ti and AMD Fury X, we pulled the 90th, 95th and 99th percentile data from the outputs, as well as plotting the full graphs. For each of these data points, the 90th percentile represents the frame rate (we’ll stick to reporting frame rates to simplify the matter) that the game achieves during 90% of the frames. Similar logic applies to the 95th and 99th percentile data, which sit closer to the absolute minimum but should be more consistent between runs.
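
For reference, percentile figures of this kind can be derived from a frame-time log along these lines (a sketch of the general method, not the benchmark's own logging or our exact processing):

```python
import numpy as np

def percentile_fps(frame_times_ms, pct):
    """Frame rate achieved during `pct` percent of frames: the pct-th
    percentile frame time (slower frames sit higher) converted to fps."""
    return 1000.0 / np.percentile(frame_times_ms, pct)

# Hypothetical log of per-frame render times in milliseconds
rng = np.random.default_rng(0)
log = rng.normal(18.0, 3.0, 5000).clip(min=5.0)

for p in (90, 95, 99):
    print(f"{p}th percentile: {percentile_fps(log, p):.1f} fps")
```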

This page (and the next) is going to be data heavy, but our analysis will discuss the effect of CPU scaling on the percentile data for both GPUs at all three resolutions using all three CPUs. Starting with the GTX 980 Ti:

Fable Legends Beta: GTX 980 Ti Percentiles (3840x2160)

All three arrangements at 3840x2160 perform similarly, though there are slight regressions moving from the i3 to the i7 along most of the range, perhaps suggesting that an excess of threads introduces some overhead. The Core i7 arrangement also seems to have the upper hand at the low percentile (2%-4%) numbers.

Fable Legends Beta: GTX 980 Ti Percentiles (1080p)

At 1080p, the Core i7 gives better results when the frame rate is above the average, and we see some scaling effects when the scenes are simple (giving high frame rates). But for whatever reason, when the going gets tough, the i7 seems to bottom out as we move beyond the 80th percentile.

Fable Legends Beta: GTX 980 Ti Percentiles (720p)

If we ever wanted to see a good representation of CPU scaling, the 720p graph is practically it – all except for the 85th percentile and up, which makes the data points pulled from this region perhaps unrepresentative of the whole. The same issue may also be at play in the 1080p results.
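
For anyone wanting to reproduce this style of graph from their own logs, the full percentile profile can be plotted in a few lines (a sketch using hypothetical frame-time data, not our captured runs):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Hypothetical per-frame render time logs (ms), one per CPU configuration
logs = {
    "Core i3": rng.normal(21.0, 4.0, 5000).clip(min=5.0),
    "Core i5": rng.normal(19.0, 3.5, 5000).clip(min=5.0),
    "Core i7": rng.normal(18.0, 3.0, 5000).clip(min=5.0),
}

pcts = np.arange(1, 100)
for name, times in logs.items():
    fps = 1000.0 / np.percentile(times, pcts)  # frame rate met at each percentile
    plt.plot(pcts, fps, label=name)

plt.xlabel("Percentile of frames")
plt.ylabel("Frame rate (fps)")
plt.title("Frame rate percentile profile (hypothetical data)")
plt.legend()
plt.show()
```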

Comments

  • Nenad - Thursday, September 24, 2015 - link

    I suggest using 2560x1440 as one of the tested resolutions for articles where top cards are an important part, since that is currently the sweet spot for top-end cards like the GTX 980 Ti or AMD Fury.

    I know that, on the Steam survey, that resolution is not nearly as well represented as 1920x1080, but neither are cards like the 980 Ti and Fury X.
  • Jtaylor1986 - Thursday, September 24, 2015 - link

    The benchmark doesn't support this.
  • Peichen - Thursday, September 24, 2015 - link

    Buy a Fury X if you want to play Fable at 720p. Buy a 980 Ti if you want to play 4K. And remember, get a fast Intel CPU if you are playing at 720p, otherwise your Fury X won't be able to do 120+ fps.
  • DrKlahn - Thursday, September 24, 2015 - link

    The Fury is 2 fps behind at 4K (though a heavily overclocked 980 Ti may increase this lead), so I'd say both are pretty comparable at 4K. The Fury also scales better at lower resolutions with lower-end CPUs, so I'm not sure what point you're trying to make. Although, to be fair, none of the cards tested struggle to maintain playable frame rates at lower resolutions.
  • Asomething - Thursday, September 24, 2015 - link

    Man, what a strange world where AMD has less driver overhead than the competition.
  • extide - Thursday, September 24, 2015 - link

    Also, remember there are those newer AMD drivers that couldn't be used for this test. I could easily see a new driver gaining 2+ fps to match/beat the 980 Ti in this game.
  • Gigaplex - Thursday, September 24, 2015 - link

    If you're buying a card specifically for one game at 720p, you wouldn't be spending top dollar for a Fury X.
  • Jtaylor1986 - Thursday, September 24, 2015 - link

    Kind of surprising you chose to publish this at all. Given how limited the testing options are in the benchmark, this release seems, for my tastes, a little too much like a pure marketing stunt from Lionhead and Microsoft rather than a DX12 showcase. The fact that it doesn't include a DX11 option is a dead giveaway.
  • piiman - Saturday, September 26, 2015 - link

    "The fact that it doesn't include a DX11 option is a dead giveaway."

    My thought also. What's the point of this benchmark if there is no DX11 mode to compare it to?
  • Gotpaidmuch - Thursday, September 24, 2015 - link

    How come the GTX 980 was not included in the tests? Did it score that badly?
