Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock Infinite’s performance is like.

Bioshock Infinite - 3840x2160 - Ultra Quality + DDoF

Bioshock Infinite - 3840x2160 - High Quality

Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF

Bioshock Infinite - 1920x1080 - Ultra Quality + DDoF

Even with its advanced depth of field effects, our highest-end video cards are starting to run away with Bioshock Infinite. That is particularly true for the GTX 980, which extends its lead in a game NVIDIA frequently does well in. Only at 4K are the R9 290XU and GTX 980 anywhere near close; at 1440p the GTX 980 holds a 37% performance advantage. The GTX 780 Ti on the other hand stays much closer, still trailing the GTX 980 but only by around 5% at sub-4K resolutions. This does make for a good moment to showcase the GTX 980’s greater ROP throughput, however; as we crank up the resolution to 4K the 780 Ti falls further behind, especially at lower quality settings that leave us less shader-bound.

On an absolute basis, 120Hz/144Hz gamers should have a blast even with a single GTX 980 at 1080p, while purists will need more performance for 1440p than the 85fps the card can offer. And at 4K the GTX 980 is doing very well for itself, almost cracking 60fps at High quality and becoming the only card to crack 40fps at Ultra quality.

This will be one of the weaker showings for the GTX 980 over the GTX 680 though; at sub-4K resolutions it’s only a 60-65% performance improvement.

Bioshock Infinite - Delta Percentages

Bioshock Infinite - Surround/4K - Delta Percentages

Meanwhile Bioshock is the first of 5 games we can reliably measure with the FCAT tools to check for frame pacing consistency. Bioshock is a bit more erratic than most games in this respect; while our general rule of thumb for excellent single-card performance is a delta percentage of 3% or less, the GTX 980 measures a bit higher at 3.5%. On the other hand, at 4K it comes in at just 2.3%. So while frame pacing checks will be a bit of a rubber stamping process overall, we can confirm that the GTX 980 delivers a good frame pacing experience in Bioshock.
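To illustrate the idea behind these delta percentages, here is a minimal sketch of one common formulation (an assumed simplification for illustration, not necessarily FCAT's exact math): the metric compares frame-to-frame time swings against the average frame time, so a perfectly paced run scores 0%.

```python
def delta_percentage(frame_times_ms):
    """Mean absolute frame-to-frame delta as a percentage of the mean frame time.

    A simplified, FCAT-style frame pacing metric (illustrative only):
    0% means perfectly even pacing; larger values mean more erratic delivery.
    """
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time
```

By this measure, a card alternating between 16ms and 20ms frames scores far worse than one alternating between 16ms and 16.5ms, even though both deliver broadly similar average framerates.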


  • kron123456789 - Friday, September 19, 2014 - link

    Look at the "Load Power Consumption — Furmark" test. It's 80W lower with the 980 than with the 780 Ti.
  • Carrier - Friday, September 19, 2014 - link

    Yes, but the 980's clock is significantly lowered for the FurMark test, down to 923MHz. The TDP should be fairly measured at speeds at which games actually run, 1150-1225MHz, because that is the amount of heat that we need to account for when cooling the system.
  • Ryan Smith - Friday, September 19, 2014 - link

    It doesn't really matter what the clockspeed is. The card is gated by both power and temperature. It can never draw more than its TDP.

    FurMark is a pure TDP test. All NVIDIA cards will reach 100% TDP, making it a good way to compare their various TDPs.
  • Carrier - Friday, September 19, 2014 - link

    If that is the case, then the charts are misleading. GTX 680 has a 195W TDP vs. GTX 770's 230W (going by Wikipedia), but the 680 uses 10W more in the FurMark test.

    I eagerly await your GTX 970 report. Other sites say that it barely saves 5W compared to the GTX 980, even after they correct for factory overclock. Or maybe power measurements at the wall aren't meant to be scrutinized so closely :)
  • Carrier - Friday, September 19, 2014 - link

    To follow up: in your GTX 770 review from May 2013, you measured the 680 at 332W in FurMark, and the 770 at 383W in FurMark. Those numbers seem more plausible.
  • Ryan Smith - Saturday, September 20, 2014 - link

    680 is a bit different because it's a GPU Boost 1.0 card. 2.0 included the hard TDP and did away with separate power targets. Actually what you'll see is that GTX 680 wants to draw 115% TDP with NVIDIA's current driver set under FurMark.
  • Carrier - Saturday, September 20, 2014 - link

    Thank you for the clarification.
  • wanderer27 - Friday, September 19, 2014 - link

    Power at the wall (AC) is going to be different from power at the GPU - which is coming from the DC PSU.

    There are losses and efficiency differences in converting from AC to DC (PSU), plus a little wiggle from the motherboard and so forth.
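The AC-vs-DC point above can be sketched with a quick back-of-the-envelope conversion (the 87% efficiency figure below is an assumed illustrative value, not a measured one):

```python
def wall_power(dc_load_w, psu_efficiency):
    """Estimate AC draw at the wall from the DC load inside the case.

    psu_efficiency is the PSU's conversion efficiency at this load
    (e.g. ~0.87 for a mid-range unit; an assumed figure for illustration).
    """
    return dc_load_w / psu_efficiency

# A 300W DC load pulls roughly 345W at the wall at 87% efficiency, so
# at-the-wall measurements overstate component power by the conversion loss.
```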
  • solarscreen - Friday, September 19, 2014 - link

    Here you go:

    http://books.google.com/books?id=v3-1hVwHnHwC&...
  • PhilJ - Saturday, September 20, 2014 - link

    As stated in the article, the power figures are total system power draw. The GTX 980 is throwing out nearly double the FPS of the GTX 680, so this is causing the rest of the system (mostly the CPU) to work harder to feed the card. This in turn drives total system power consumption up, despite the fact that the GTX 980 itself is drawing less power than the GTX 680.
