Discussing Percentiles and Minimum Frame Rates

Continuing from the previous page, we performed a similar analysis on AMD's Fury X graphics card. The same rules apply - all three resolution/setting combinations across all three system configurations. Results are given as frame rate profiles across the percentile range, with the 90th, 95th and 99th percentile values picked out as indicators of minimum frame rates.
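As a rough sketch of how such percentile figures can be derived from per-frame data (this is an illustrative reconstruction, not the tool used in the review; the function name and the nearest-rank method are assumptions for the example):

```python
# Hypothetical sketch: converting a benchmark's per-frame times (ms) into
# percentile "minimum frame rate" figures. Names and method are illustrative.

def fps_percentiles(frame_times_ms, percentiles=(90, 95, 99)):
    """Return {percentile: FPS}, where the Nth percentile frame time
    (i.e. the slower tail of frames) is converted to an equivalent
    frame rate via FPS = 1000 / frame_time_ms."""
    times = sorted(frame_times_ms)  # ascending: fast frames first
    results = {}
    for p in percentiles:
        # Nearest-rank style index: the frame time below which ~p% of
        # frames fall; higher percentiles pick out the slowest frames.
        idx = min(len(times) - 1, int(round(p / 100 * len(times))))
        results[p] = 1000.0 / times[idx]
    return results

# Example: mostly 16.7 ms frames (~60 FPS) with occasional 33.3 ms spikes.
frames = [16.7] * 95 + [33.3] * 5
print(fps_percentiles(frames))
```

With this made-up trace, the 90th percentile still sits near 60 FPS, while the 95th and 99th percentiles drop to about 30 FPS, which is exactly the kind of tail behaviour the percentile charts are meant to expose.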

Fable Legends Beta: AMD Fury X Percentiles

Moving on to the Fury X at 4K, we see all three processor lineups performing similarly, indicating that we are more GPU limited here. There is a slight wrinkle with the Core i7, though: it gives slightly lower frame rates in easier scenes but better frame rates when the going gets tough beyond the 95th percentile.

Fable Legends Beta: AMD Fury X Percentiles

At 1080p, the results take a twist. It almost seems as if we have some form of reverse scaling, whereby more cores do more damage to the results. Have a look at the breakdown provided by the in-game benchmark (given in milliseconds, so lower is better):

Fable Legends Beta: AMD Fury X at 1080p Render Sub-Results
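To illustrate how a breakdown like this is read, here is a hypothetical sketch: the stage names match the benchmark's categories, but the millisecond values are invented for the example and do not come from the review's data.

```python
# Illustrative only: per-stage render times in milliseconds (made-up values).
# Summing the stages approximates total frame time, and each stage's share
# shows where a configuration change (e.g. more cores) helps or hurts.

stages_ms = {
    "GBuffer Rendering": 4.1,
    "Dynamic Lighting": 3.6,
    "Transparency and Effects": 2.8,
    "Post Processing": 1.9,
}

total_ms = sum(stages_ms.values())
print(f"Approx. frame time: {total_ms:.1f} ms (~{1000 / total_ms:.0f} FPS)")
for name, ms in sorted(stages_ms.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {ms:.1f} ms ({100 * ms / total_ms:.0f}%)")
```

Because the figures are in milliseconds, a stage getting slower with more cores shows up directly as a larger number, which is how the three problem areas below stand out.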

Three areas stand out as benefiting from fewer cores: Transparency and Effects, GBuffer Rendering, and Dynamic Lighting. All three are related to illumination and how it interacts with its surroundings. One possible reason springs to mind: with large core counts, too many threads are issuing work to the graphics card, causing thread contention in the cache or giving the thread scheduler a hard time depending on what comes in as high priority.

Nevertheless, the situation changes when we move down again to 720p:

Fable Legends Beta: AMD Fury X Percentiles

Here the Core i3 takes a nosedive as we become CPU limited in pushing out frames.


141 Comments


  • medi03 - Thursday, September 24, 2015 - link

    "AMD had a driver with better results, but we didn't use it", "oh, Bryan tested it, but he's away" - that adds some sauce to it.
  • Oxford Guy - Thursday, September 24, 2015 - link

    "we are waiting for a better time to test the Ashes of the Singularity benchmark"

    L-O-L
  • Frenetic Pony - Thursday, September 24, 2015 - link

    This is, as usual, a trollish, click-bait response. The truth is far more complex than whether one side "wins". Here we can see AMD's older card once again benefiting greatly from DX12, to the point where it clearly pulls ahead of NVIDIA's similarly priced options. Yet on the high end it seems NVIDIA has scaled its GPU architecture better than AMD has, with the 980 Ti having the advantage. So, technology, like life, is complicated and not prone to simple quips that accurately reflect reality.
  • jospoortvliet - Friday, September 25, 2015 - link

    True. One thing has not changed and becomes more pronounced with DirectX12: AMD offers better performance at every price point.
  • Th-z - Thursday, September 24, 2015 - link

    Are you referring to the AotS results? That benchmark stresses different things than this classic flyby benchmark; both are useful in their own right. AotS is more akin to real gameplay and stresses the draw call capability of DX12, which is *the* highlight of DX12.

    My question for Ryan: AnandTech didn't test AotS because you said it's still in an early developmental stage, but this one isn't? I say test them all; they would make interesting before-and-after case studies regardless. Also, have you considered improving your comment section?
  • DoomGuy64 - Friday, September 25, 2015 - link

    Incorrect. The $650 980 Ti is the only NVIDIA card better in DX12, and it has 96 ROPs compared to the Fury's 64 - not that Fury is actually doing that badly. AMD, on the other hand, is cleaning up with the mid-range cards, which is what most people are buying.
  • masaville888 - Saturday, October 10, 2015 - link

    I left AMD for good after too many years of technical issues such as artifacting, poorly optimized drivers and so forth. I always had a good experience with NVIDIA, and after going back I have no regrets. AMD seems more interested in tech announcements than user experience. If they figure out the customer side of things they have the potential to be great, but not until then.
  • lprates - Thursday, October 15, 2015 - link

    I totally Agree
