Our First FCAT & The Test

First announced back at the end of March, FCAT has been something of a bewildering experience for us. NVIDIA has done a great job on the software, but between picky games, flaky DVI cables, and dead SSDs (we killed an Intel 910 enterprise-grade SSD with FCAT) things have not gone quite to plan, pushing back our intended use of FCAT more than once. In any case, with most of the kinks worked out we’re ready to start integrating it into our major GPU reviews.

For the time being we’re putting FCAT on beta status, as we intend to try out a few different methods of presenting data to find something that’s meaningful, useful, and legible. To that end we’d love to get your feedback in our comments section so that we can further iterate on our presentation and data collection.

We’ve decided to go with two metrics for our first run with FCAT. The first metric is rather simple: 95th percentile frametimes. For years we’ve done minimum framerates (when practical), which are similar in concept, so this allows us to collect similar stats at the end of the rendering pipeline while hopefully avoiding some of the quirkiness that comes from looking at minimum framerates within games themselves. The 95th percentile frametime is quite simply the frametime that 95% of frames come in under; the slowest 5% of frames all take at least this long to render. If a game or video card is introducing significant one-off stuttering by taking too long to render some frames, this will show us.
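As a minimal sketch of how this metric falls out of a list of per-frame render times, the following computes a 95th percentile frametime; the function name and the sample frame data are our own illustration, not FCAT output.

```python
import numpy as np

def percentile_frametime(frametimes_ms, pct=95):
    """Return the frametime (ms) that pct% of frames come in under.

    The slowest (100 - pct)% of frames all took at least this long,
    so a high value relative to the average flags one-off stutter.
    """
    return float(np.percentile(frametimes_ms, pct))

# Hypothetical per-frame times (ms) with one stutter spike at 42.3ms
frames = [16.4, 16.8, 16.5, 17.1, 16.6, 42.3, 16.7, 16.5, 16.9, 16.6]
print(percentile_frametime(frames))
```

A single long frame pulls the 95th percentile well above the ~16.7ms the rest of the run delivers, which is exactly the kind of one-off stutter this metric is meant to expose.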

This is primarily meant to capture single-GPU issues, but with AMD having fixed the bulk of their single-GPU issues months ago, in practice we don’t expect to find much. Nonetheless it’s a good way of showing that nothing interesting is happening in those situations.

Our second metric is primarily focused on multi-GPU setups, and is an attempt to quantify the wild frametime variations seen at times with multi-GPU setups, which show up as telltale zigzag lines in frametime graphs.

In this metric, which for the moment we’re calling Delta Percentages, we’re collecting the deltas (differences) between consecutive frametimes, averaging those out, and then dividing that average by the average frametime of the entire run. The end result of this process is that we can measure whether sequential frames are rendering in roughly the same amount of time, while controlling for performance differences by looking at the data relative to the average frametime (rather than as absolute time).

In general, a properly behaving single-GPU card should have a delta average of under 3%, with the specific value depending in part on how variable the workload is throughout any given game benchmark. 3% may sound small, but since we’re talking about an average it’s weighted across the entire run. The higher the percentage the more unevenly frames are arriving, and exceeding 3% is about where we expect players with good eyes to start noticing a difference. Alternatively, in a perfectly frame-metered situation, such as v-sync enabled on a setup that can always hit 60fps, this metric would be a flat 0%, representing the pinnacle of smoothness.
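The calculation described above can be sketched as follows. Note one assumption on our part: we take the deltas as absolute differences between consecutive frametimes, since signed differences in an alternating fast/slow zigzag would largely cancel out. The function name and sample data are illustrative only.

```python
def delta_percentage(frametimes_ms):
    """Average frame-to-frame variation as a percentage of the mean frametime.

    Sketch of the Delta Percentage metric, assuming absolute differences
    between consecutive frames (signed deltas in a zigzag pattern would
    cancel each other out).
    """
    if len(frametimes_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    avg_delta = sum(deltas) / len(deltas)
    avg_frametime = sum(frametimes_ms) / len(frametimes_ms)
    return 100.0 * avg_delta / avg_frametime

# A perfectly metered 60fps run (16.7ms per frame) scores a flat 0%...
print(delta_percentage([16.7] * 10))
# ...while hypothetical alternating multi-GPU frametimes score far above 3%.
print(delta_percentage([10.0, 23.0] * 5))
```

Because the average delta is divided by the average frametime, a fast card and a slow card with the same relative unevenness score the same, which is the point of expressing this as a percentage rather than in milliseconds.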

Moving on, we’ll be running FCAT against 6 of our 10 games for the time being: Sleeping Dogs, Hitman: Absolution, Total War: Shogun 2, Battlefield 3, BioShock, and Crysis 3. The rest of our games are either highly inconsistent or generally fussy, introducing too much variance into our FCAT results.

Finally, due to the amount of additional time it takes to put together FCAT results, we’re going to primarily publish FCAT results with major product launches and major driver updates. Due to how frame metering works, the only time frame consistency significantly changes is either with the introduction of new architectures/GPUs, or with the introduction of significant driver changes, so those are the scenarios we’ll be focusing on.

The Test

NVIDIA’s launch drivers for the GTX 780 are 320.18, drivers that are essentially identical to the public 320.14 drivers released last week.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards: AMD Radeon HD 7970 GHz Edition
AMD Radeon HD 7990
NVIDIA GeForce GTX 580
NVIDIA GeForce GTX 680
NVIDIA GeForce GTX 690
NVIDIA GeForce GTX 780
NVIDIA GeForce GTX Titan
Video Drivers: NVIDIA ForceWare 320.14
NVIDIA ForceWare 320.18
AMD Catalyst 13.5 Beta 2
OS: Windows 8 Pro


  • Rodrigo - Thursday, May 23, 2013 - link

    Excellent choice for less money than Titan! :-)
  • Ja5087 - Thursday, May 23, 2013 - link

    "NVIDIA will be pricing the GTX 680 at $650, $350 below the GTX Titan and GTX 690, and around $200-$250 more than the GTX 680."

    I think you mean the 780?
  • Ja5087 - Thursday, May 23, 2013 - link

    Accidently replied instead of commented
  • Ryan Smith - Thursday, May 23, 2013 - link

    Thanks. Fixed.
  • nunomoreira10 - Thursday, May 23, 2013 - link

Compared to Titan it sure is a better value, but compared to the high end 2 years ago it's twice as much (Titan vs 580; 780 vs 570; 680 vs 560).
    NVIDIA is slowly getting people accustomed to high prices again.
    I'm gonna wait for AMD to see what she can bring to the table.
  • Hrel - Friday, May 24, 2013 - link

    She? AMD is a she now?
  • SevenWhite7 - Monday, July 8, 2013 - link

    Yeah, 'cause AMD's more bang-for-the-buck.
    Basically, NVidia's 'he' 'cause it's always the most powerful, but also costs the most.
    AMD's 'she' 'cause it's always more efficient and reasonable.
    I'm a guy, and guys are usually more about power and girls are more about the overall package.
    Just my experience, anyway, and this is just me being dumb trying to explain it with analogies =P
  • sperkowsky - Wednesday, February 26, 2014 - link

Bang for your buck has changed a bit; just sold my 7950, added 80 bucks, and bought an EVGA ACX 780 B-stock.
  • cknobman - Thursday, May 23, 2013 - link

At $650 I am just not seeing it. In fact I don't even see this card putting any pressure on AMD to do something.

    I'd rather save $200+ and get a 7970GE. If Nvidia really wants to be aggressive they need to sell this for ~$550.
  • chizow - Thursday, May 23, 2013 - link

Nvidia has the GTX 770 next week to match up against the 7970GE in that price bracket; the 780 is clearly meant to continue the massive premiums for the GK110 flagship ASIC started by Titan. While it may not justify the difference in price relative to the 7970GHz, its performance, like Titan's, is clearly in a different class.
