Our First FCAT & The Test

First announced back at the end of March, FCAT has been something of a bewildering experience for us. NVIDIA has actually done a great job on the software, but between picky games, flaky DVI cables, and dead SSDs (we killed an enterprise-grade Intel SSD 910 with FCAT), things have not gone quite to plan, pushing back our intended use of FCAT more than once. In any case, with most of the kinks worked out, we’re ready to start integrating it into our major GPU reviews.

For the time being we’re putting FCAT on beta status, as we intend to try out a few different methods of presenting data to find something that’s meaningful, useful, and legible. To that end we’d love to get your feedback in our comments section so that we can further iterate on our presentation and data collection.

We’ve decided to go with two metrics for our first run with FCAT. The first metric is rather simple: 95th percentile frametimes. For years we’ve done minimum framerates (when practical), which are similar in concept, so this allows us to collect similar stats at the end of the rendering pipeline while hopefully avoiding some of the quirkiness that comes from looking at minimum framerates within games themselves. The 95th percentile frametime is quite simply the frametime that 95% of frames come in under; only the slowest 5% of frames take longer to render. If a game or video card is introducing significant one-off stuttering by taking too long to render some frames, this metric will show it.
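For the curious, here's a minimal sketch of how a 95th percentile frametime can be computed from a run's frametime log. This is purely illustrative Python, not the actual FCAT extraction/analysis scripts, and the sample frametimes are made up:

```python
import numpy as np

def p95_frametime(frametimes_ms):
    """95th percentile frametime: 95% of frames render at least this
    quickly, and only the slowest 5% of frames take longer."""
    return np.percentile(frametimes_ms, 95)

# Hypothetical frametimes (ms) with a couple of one-off stutters
run = [16.6, 16.8, 16.5, 16.7, 33.4, 16.9, 16.6, 41.2, 16.8, 16.7]
print(f"95th percentile frametime: {p95_frametime(run):.1f} ms")
```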

This is primarily meant to capture single-GPU issues, but in practice, with AMD having fixed the bulk of their single-GPU issues months ago, we don’t actually expect to find much. Nonetheless, it’s a good way of showing that nothing interesting is happening in those situations.

Our second metric is primarily focused on multi-GPU setups, and is an attempt to quantify the wild frametime variations they exhibit at times, which show up as telltale zigzag lines in frametime graphs.

In this metric, which for the moment we’re calling Delta Percentages, we’re collecting the deltas (differences) between sequential frametimes, averaging them, and then dividing that delta average by the average frametime of the entire run. The end result of this process is that we can measure whether sequential frames are rendering in roughly the same amount of time, while controlling for performance differences by looking at the data relative to the average frametime (rather than as absolute time).

In general, a properly behaving single-GPU card should have a delta average of under 3%, with the specific value depending in part on how variable the workload is throughout any given game benchmark. 3% may sound small, but since we’re talking about an average it means it’s weighted against the entire run. The higher the percentage, the more unevenly frames are arriving, and exceeding 3% is about where we expect players with good eyes to start noticing a difference. Alternatively, in a perfectly frame-metered situation, such as v-sync enabled on a setup that can always hit 60fps, this would be a flat 0%, representing the pinnacle of smoothness.
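To make the math concrete, here's a small Python sketch of the delta percentage calculation as described above. The function name and sample data are our own hypothetical illustrations, not FCAT's actual analysis code:

```python
import numpy as np

def delta_percentage(frametimes_ms):
    """Average absolute frame-to-frame delta, expressed as a
    percentage of the run's average frametime."""
    ft = np.asarray(frametimes_ms, dtype=float)
    deltas = np.abs(np.diff(ft))   # differences between sequential frames
    return deltas.mean() / ft.mean() * 100.0

# A perfectly frame-metered 60fps run: every frame takes 16.67ms -> 0%
smooth = [16.67] * 120
# A multi-GPU style zigzag: alternating short and long frames
zigzag = [10.0, 23.3] * 60
print(f"smooth: {delta_percentage(smooth):.1f}%")   # 0.0%
print(f"zigzag: {delta_percentage(zigzag):.1f}%")   # ~80%, far above the 3% mark
```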

Moving on, we’ll be running FCAT against 6 of our 10 games for the time being: Sleeping Dogs, Hitman: Absolution, Total War: Shogun 2, Battlefield 3, BioShock, and Crysis 3. The rest of our games are either highly inconsistent or generally fussy, introducing too much variance into our FCAT results.

Finally, due to the amount of additional time it takes to put together FCAT results, we’re going to publish them primarily with major product launches and major driver updates. Due to how frame metering works, the only time frame consistency significantly changes is either with the introduction of new architectures/GPUs, or with the introduction of significant driver changes, so those are the scenarios we’ll be focusing on.

The Test

NVIDIA’s launch driver for the GTX 780 is release 320.18, which is essentially identical to the public 320.14 driver released last week.

CPU: Intel Core i7-3960X @ 4.3GHz
Motherboard: EVGA X79 SLI
Power Supply: Antec True Power Quattro 1200
Hard Disk: Samsung 470 (256GB)
Memory: G.Skill Ripjaws DDR3-1867 4 x 4GB (8-10-9-26)
Case: Thermaltake Spedo Advance
Monitor: Samsung 305T
Video Cards:
    AMD Radeon HD 7970 GHz Edition
    AMD Radeon HD 7990
    NVIDIA GeForce GTX 580
    NVIDIA GeForce GTX 680
    NVIDIA GeForce GTX 690
    NVIDIA GeForce GTX 780
    NVIDIA GeForce GTX Titan
Video Drivers:
    NVIDIA ForceWare 320.14
    NVIDIA ForceWare 320.18
    AMD Catalyst 13.5 Beta 2
OS: Windows 8 Pro

 

Comments

  • Stuka87 - Thursday, May 23, 2013

    The video card does handle the decoding and rendering for the video. Anand has done several tests over the years comparing their video quality. There are definite differences between AMD/NVIDIA/Intel.
  • JDG1980 - Thursday, May 23, 2013

    Yes, the signal is digital, but the drivers often have a bunch of post-processing options which can be applied to the video: deinterlacing, noise reduction, edge enhancement, etc.
    Both AMD and NVIDIA have some advantages over the other in this area. Either is a decent choice for an HTPC. Of course, no one in their right mind would use a card as power-hungry and expensive as a GTX 780 for an HTPC.

    In the case of interlaced content, either the PC or the display device *has* to apply post-processing or else it will look like crap. The rest of the stuff is, IMO, best left turned off unless you are working with really subpar source material.
  • Dribble - Thursday, May 23, 2013

    To both of you above: on DVD yes, not on Blu-ray - there is no interlacing, noise, or edges to reduce - Blu-ray decodes to a perfect 1080p picture which you send straight to the TV.

    All the video card has to do is decode it, which is why a $20 Blu-ray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 Blu-ray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.
  • JDG1980 - Thursday, May 23, 2013

    You can do any kind of post-processing you want on a signal, whether it comes from DVD, Blu-Ray, or anything else. A Blu-Ray is less likely to get subjective quality improvements from noise reduction, edge enhancement, etc., but you can still apply these processes in the video driver if you want to.

    The video quality of Blu-Ray is very good, but not "perfect". Like all modern video formats, it uses lossy encoding. A maximum bit rate of 40 Mbps makes artifacts far less common than with DVDs, but they can still happen in a fast-motion scene - especially if the encoders were trying to fit a lot of content on a single layer disc.

    Most Blu-Ray content is progressive scan at film rates (1080p23.976) but interlaced 1080i is a legal Blu-Ray resolution. I believe some variants of the "Planet Earth" box set use it. So Blu-Ray playback devices still need to know how to deinterlace (assuming they're not going to delegate that task to the display).
  • Dribble - Thursday, May 23, 2013

    I admit it's possible to post-process, but you wouldn't - real-time post-processing is highly unlikely to add anything good to the picture - fancy Blu-ray players don't post-process, they just pass on the signal. As for 1080i, that's a very unusual case for Blu-ray, but as it's just the standard HD TV resolution, again, pass it to the TV - it'll de-interlace it just like it does all the 1080i coming from your cable/satellite box.
  • Galidou - Sunday, May 26, 2013

    ''All the video card has to do is decode it, which is why a $20 Blu-ray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 Blu-ray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.''

    I'm an audiophile and a professional when it comes to high-end home theater; I myself have built tons of HT systems around PCs and/or receivers, and I have to admit this is the funniest crap I've had to read. I'd just like to know how many Blu-ray players you've personally compared up to, let's say, the OPPO BDP-105 (I've dealt with pricier units than this mere $1200 but still awesome Blu-ray player).

    While I can certainly say that image quality is not affected by much, the audio on the other hand sees DRASTIC improvements. Hardware not having an effect on sound would be like saying there's no difference between a $200 and a $5000 integrated amplifier/receiver - pure nonsense.

    ''the same picture and sound quality''

    The part speaking about sound quality should really be removed from your comment, as it astounds me to think you can believe what you said is true.
  • eddman - Thursday, May 23, 2013

    http://i.imgur.com/d7oOj7d.jpg
  • EzioAs - Thursday, May 23, 2013

    If I were a Titan owner (and had actually purchased the card, rather than won it in some free giveaway or something), I would regret that purchase very, very badly. $650 is still a very high price for the normal GTX x80 cards, but it makes the Titan basically a product with incredibly bad pricing (not that we didn't know that already). Still, I'm no Titan owner, so what do I know...

    On the other hand, when I look at the graphs, I think the HD 7970 is an even better card than ever despite it being 1.5 years old. However, as Ryan pointed out, for previous GTX 500 users who plan on sticking with NVIDIA and are considering high-end cards like this, it may not be a bad card at all, since there are situations (most of the time) where the performance improvements are about twice the GTX 580.
  • JeffFlanagan - Thursday, May 23, 2013

    I think $350 is almost pocket change to someone who will drop $1000 on a video card. $1K is way out of line with what high-quality consumer video cards have gone for in recent years, so you have to be someone who spends to say they spent, or someone mining one of the Bitcoin alternatives, in which case getting the card months earlier is a big benefit.
  • mlambert890 - Thursday, May 23, 2013

    I have 3 Titans and don't regret them at all. While I wouldn't say $350 is "pocket change" (or in this case $1050, since it's x3), it is a price I'm willing to pay for twice the VRAM and more performance. With performance at this level, "close" honestly doesn't count if you're looking for the *highest* performance possible. Gaming in 3D Surround, even 3x Titan actually *still* isn't fast enough, so there's no way I'd have been happy with 3x 780s for $1000 less.
