Discussing Percentiles and Minimum Frame Rates

Continuing from the previous page, we performed a similar analysis on AMD's Fury X graphics card. The same rules apply - all three resolution/setting combinations using all three system configurations. Results are given as frame rate profiles across percentiles, with the 90th, 95th and 99th percentile values picked out as an indication of minimum frame rates.
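The percentile method described above can be sketched in a few lines. This is an illustrative Python snippet with made-up frame times, not the benchmark's actual tooling: frame times are recorded in milliseconds, sorted, and the 90th/95th/99th percentile values are converted back to FPS to represent minimum frame rates.

```python
# Hypothetical frame times in milliseconds from a benchmark run.
frame_times_ms = [14.2, 15.1, 13.8, 16.5, 33.0, 14.9, 15.4, 18.7, 14.1, 15.0]

def percentile(sorted_vals, p):
    """Nearest-rank percentile: the value below which roughly p% of samples fall."""
    idx = min(len(sorted_vals) - 1, int(round(p / 100 * len(sorted_vals))))
    return sorted_vals[idx]

# Sort ascending so the slowest (worst-case) frames sit at the high percentiles.
times = sorted(frame_times_ms)
for p in (90, 95, 99):
    worst_ms = percentile(times, p)
    # Lower frame times are better; 1000 / ms converts back to frames per second.
    print(f"{p}th percentile frame time: {worst_ms:.1f} ms "
          f"({1000.0 / worst_ms:.1f} fps)")
```

In other words, the 99th percentile figure is the frame rate the slowest ~1% of frames fall below, which is why it serves as a proxy for minimum frame rate.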

Fable Legends Beta: AMD Fury X Percentiles

Moving on to the Fury X at 4K, we see all three processor lineups performing similarly, indicating that we are more GPU limited here. There is a slight anomaly with the Core i7, though: it gives slightly lower frame rates in easier scenes but better frame rates when the going gets tough beyond the 95th percentile.

Fable Legends Beta: AMD Fury X Percentiles

For 1080p, the results take a twist. It almost seems as if we have some form of reverse scaling, whereby more cores do more damage to the results. If we look at the breakdown provided by the in-game benchmark (given in milliseconds, so lower is better):

Fable Legends Beta: AMD Fury X at 1080p Render Sub-Results

Three areas stand out as benefitting from fewer cores: Transparency and Effects, GBuffer Rendering, and Dynamic Lighting. All three are related to illumination and how that illumination interacts with its surroundings. One reason springs to mind: with large core counts, too many threads issuing work to the graphics card could cause thread contention in the cache, or give the thread scheduler a hard time depending on what comes in as high priority.

Nevertheless, the situation changes when we move down again to 720p:

Fable Legends Beta: AMD Fury X Percentiles

Here the Core i3 takes a nosedive as we become CPU limited in pushing out frames.


141 Comments


  • Nenad - Thursday, September 24, 2015 - link

    I suggest using 2560x1440 as one of the tested resolutions for articles where top cards are an important part, since that is currently the sweet spot for top-end cards like the GTX 980Ti or AMD Fury.

    I know that, in the Steam survey, that resolution is not nearly as well represented as 1920x1080, but neither are cards like the 980Ti and FuryX.
  • Jtaylor1986 - Thursday, September 24, 2015 - link

    The benchmark doesn't support this.
  • Peichen - Thursday, September 24, 2015 - link

    Buy a Fury X if you want to play Fable at 720p. Buy a 980Ti if you want to play 4K. And remember, get a fast Intel CPU if you are playing at 720p otherwise your Fury X won't be able to do 120+fps.
  • DrKlahn - Thursday, September 24, 2015 - link

    The Fury is 2 fps behind at 4K (though a heavily overclocked 980Ti may increase this lead), so I'd say both are pretty comparable at 4K. The Fury also scales better at lower resolutions with lower-end CPUs, so I'm not sure what point you're trying to make. Although, to be fair, none of the cards tested struggle to maintain playable frame rates at lower resolutions.
  • Asomething - Thursday, September 24, 2015 - link

    Man, what a strange world where AMD has less driver overhead than the competition.
  • extide - Thursday, September 24, 2015 - link

    Also, remember there are newer AMD drivers that couldn't be used for this test. I could easily see a new driver gaining 2+ fps to match or beat the 980 Ti in this game.
  • Gigaplex - Thursday, September 24, 2015 - link

    If you're buying a card specifically for one game at 720p, you wouldn't be spending top dollar for a Fury X.
  • Jtaylor1986 - Thursday, September 24, 2015 - link

    Kind of surprising you chose to publish this at all. Given how limited the testing options are in the benchmark, this release seems a little too much like a pure marketing stunt from Lionhead and Microsoft for my tastes, rather than a DX12 showcase. The fact that it doesn't include a DX11 option is a dead giveaway.
  • piiman - Saturday, September 26, 2015 - link

    "The fact that it doesn't include a DX11 option is a dead giveaway."

    My thought also. What's the point of this benchmark if there are no DX11 results to compare against?
  • Gotpaidmuch - Thursday, September 24, 2015 - link

    How come the GTX 980 was not included in the tests? Did it score that badly?
