Discussing Percentiles and Minimum Frame Rates

Up until this point we have only discussed average frame rates, which are easy numbers to generate from a benchmark run. Minimum frame rates are a little trickier, because it could be argued that the time taken to render the single worst frame should define the minimum. All it then takes is one infrequent bad GPU request (a misaligned texture cache, for example) to skew the data. To this end, thanks to the logging functionality of the benchmark, we are able to report the frame rate profile of each run along with percentile numbers.

For the GTX 980 Ti and AMD Fury X, we pulled out the 90th, 95th and 99th percentile data from the outputs, as well as plotting full graphs. For each of these data points, the 90th percentile represents the frame rate (we'll stick to reporting frame rates to simplify the matter) that the game achieves or exceeds for 90% of its frames. Similar logic applies to the 95th and 99th percentile data, which sit closer to the worst case but should be more consistent between runs than a raw minimum.
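As a concrete illustration of how these numbers are generated, below is a minimal sketch, assuming frame times logged in milliseconds, of converting a run's frame-time log into percentile frame rates. This is our own example in Python with NumPy, not the benchmark's actual tooling, and the frame-time values are made up.

import numpy as np

# Hypothetical frame times (ms) from a benchmark log; one slow outlier included.
frame_times_ms = np.array([16.6, 16.8, 17.1, 16.5, 33.4, 16.7, 17.0, 16.9, 16.4, 16.6])

# The raw minimum frame rate is dominated by the single 33.4 ms spike.
print("Minimum frame rate: %.1f fps" % (1000.0 / frame_times_ms.max()))

# The Nth percentile frame time is the value that N% of frames come in under;
# converting it to fps gives the rate the game sustains for N% of frames.
for p in (90, 95, 99):
    t = np.percentile(frame_times_ms, p)
    print("%dth percentile: %.1f ms (%.1f fps)" % (p, t, 1000.0 / t))

Working on frame times rather than a single slowest frame keeps one-off spikes from dominating the result, which is exactly the problem with quoting a raw minimum.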

This page (and the next) is going to be data-heavy, but our analysis will discuss the effect of CPU scaling on percentile data for both GPUs at all three resolutions using all three CPUs. Starting with the GTX 980 Ti:

Fable Legends Beta: GTX 980 Ti Percentiles at 3840x2160

All three CPU arrangements at 3840x2160 perform similarly, though there are slight regressions moving from the i3 to the i7 along most of the range, perhaps suggesting that an excess of threads feeding the GPU introduces some overhead. The Core i7 arrangement does seem to have the upper hand at the low percentile (2%-4%) numbers, however.

Fable Legends Beta: GTX 980 Ti Percentiles at 1920x1080

At 1080p, the Core i7 gives better results when the frame rate is above the average, and we see some scaling effects when the scenes are simple (giving high frame rates). But for whatever reason, when the going gets tough the i7 seems to bottom out as we move beyond the 80th percentile.

Fable Legends Beta: GTX 980 Ti Percentiles at 1280x720

If we ever wanted to see a good representation of CPU scaling, the 720p graph is practically there, except from the 85th percentile upwards, which makes the data points pulled from this region perhaps unrepresentative of the run as a whole. The same issue may be at play in the 1080p results as well.

Comments

  • Traciatim - Thursday, September 24, 2015 - link

    RAM generally has very little to no impact on gaming except for a few strange cases (like F1).

    Though, the machine still has its cache available, so while the i3 test isn't quite the same thing as a real i3, it should be close enough that you wouldn't notice the difference.
  • Mr Perfect - Thursday, September 24, 2015 - link

    In the future, could you please include/simulate a 4 core/8 thread CPU? That's probably what most of us have.
  • Oxford Guy - Thursday, September 24, 2015 - link

    How about Ashes running on a Fury and a 4.5 GHz FX CPU?
  • Oxford Guy - Thursday, September 24, 2015 - link

    and a 290X, of course, paired against a 980
  • vision33r - Thursday, September 24, 2015 - link

    Just because a game supports DX12 doesn't mean it uses all DX12 features. It looks like they have DX12 as a checkbox but aren't really utilizing the complete DX12 feature set. We have to see more DX12 implementations to know for sure how each card stacks up.
  • Wolfpup - Thursday, September 24, 2015 - link

    I'd be curious about a DirectX 12 vs 11 test at some point.

    Regarding Fable Legends, WOW am I disappointed by what it is. I shouldn't be in a sense, I mean I'm not complaining that Mario Baseball isn't a Mario game, but still, a "free" to play deathmatch type game isn't what I want and isn't what I think of with Fable (Even if, again, really this could be good for people who want it, and not a bad use of the license).

    Just please don't make a sequel to New Vegas or Mass Effect or Bioshock that's deathmatch LOL
  • toyotabedzrock - Thursday, September 24, 2015 - link

    You should have used the new driver given you were told it was related to this specific game preview.
  • Shellshocked - Thursday, September 24, 2015 - link

    Does this benchmark use Async compute?
  • Spencer Andersen - Thursday, September 24, 2015 - link

    Negative, Unreal Engine does NOT use async compute except on Xbox One. Considering that is one of the main features of the newer APIs, what does that tell you? Nvidia+Unreal Engine=BFF. But I don't see it as a big deal considering that Frostbite and likely other engines already have most if not all DX12 features built in, including async compute.

    Great article guys, looking forward to more DX12 benchmarks. It's an interesting time in gaming to say the least!
  • oyabun - Thursday, September 24, 2015 - link

    There is something wrong with the webpages of the article: an ad by Samsung seems to cover the entire page and messes up all the rendering. Furthermore, wherever I click, a new tab opens at www.space.com! I had to reload several times just to be able to post this!
