Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF

AMD and NVIDIA exchange places once more, with the 7990 taking a small lead over the GTX 690.

Bioshock Infinite - Delta Percentages - 2560x1440 - Ultra Quality + DDoF

AMD’s initial situation with Bioshock was not as dire as it was in, say, Battlefield 3, but with deltas approaching 60% it wasn’t pretty either. Once more they’ve managed to get their delta percentages down to around 20%, a level that is acceptable for now while leaving clear room for improvement – especially as the 7990 once again delivers deltas more than twice those of the GTX 690.

Though on a side note, this game is a great reminder of just how much better single-GPU cards are at consistency: the best multi-GPU setup comes in at 8.2%, while the worst single-GPU setup is at just 2.6%.
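For those curious how a delta percentage of this sort can be computed from raw frame times, below is a minimal Python sketch. It assumes the metric is the average absolute frame-to-frame time difference expressed as a percentage of the average frame time; the exact formulation behind our charts may differ slightly.

    # Sketch of a frame time "delta percentage" metric. Assumption: the metric
    # is the mean absolute frame-to-frame difference, normalized by the mean
    # frame time. The exact formulation used for these charts may differ.

    def delta_percentage(frame_times_ms):
        """frame_times_ms: per-frame render times in milliseconds."""
        deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
        mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
        mean_delta = sum(deltas) / len(deltas)
        return 100.0 * mean_delta / mean_frame_time

    # Example: a perfectly paced run vs. an alternating (microstuttering) run.
    smooth = [16.7] * 100          # steady ~60fps, evenly paced
    stutter = [10.0, 23.4] * 50    # same average frame rate, uneven pacing
    print(delta_percentage(smooth))   # 0% - perfectly consistent
    print(delta_percentage(stutter))  # ~80% - heavy microstutter

Note how the two runs have the same average frame rate; only a frame-to-frame metric like this exposes the uneven pacing.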

Looking at the frame time plots themselves, things are roughly as expected. It’s interesting to note that NVIDIA has some significant frame time spikes that AMD doesn’t encounter, though a single-GPU setup would sidestep the issue entirely.

Bioshock Infinite - 95th Percentile FT - 2560x1440 - Ultra Quality + DDoF

AMD’s 95th percentile improvement isn’t nearly as pronounced in Bioshock. Meanwhile, the higher variability costs them just enough to have the 7990 fall behind the GTX 690 here.
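For those unfamiliar with the metric, the 95th percentile frame time is the frame time that 95% of frames come in under, which makes it sensitive to the slowest handful of frames in a run. Below is a minimal Python sketch assuming the simple nearest-rank definition of a percentile; actual benchmarking tools may interpolate between samples instead.

    # Sketch of a 95th percentile frame time, assuming the simple nearest-rank
    # definition of a percentile; benchmarking tools may interpolate instead.
    import math

    def percentile_frame_time(frame_times_ms, pct=95.0):
        ordered = sorted(frame_times_ms)
        rank = max(1, math.ceil(pct / 100.0 * len(ordered)))  # nearest-rank
        return ordered[rank - 1]

    # A handful of slow frames is enough to drag the 95th percentile up,
    # which is why frame time spikes hurt here even when averages look fine.
    frame_times = [16.7, 15.9, 17.2, 41.5, 16.4, 16.8, 18.0, 16.5, 35.2, 16.6]
    print(percentile_frame_time(frame_times))  # 41.5 for this sample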

Comments

  • waldoh - Thursday, August 1, 2013

    It's unfortunate it took a competing company to shine a light on an issue for another to address it.
  • tackle70 - Thursday, August 1, 2013

    I'd say it's more expected than unfortunate. This is why competition is a good thing and why you never want one company to blow away another - competition makes all companies serve their customers better.

    Big time kudos to AMD for their work on this; it's nice to see real competition available again in the $500+ market.
  • Rezurecta - Thursday, August 1, 2013

    Excellent and well said.
  • HisDivineOrder - Thursday, August 1, 2013

    I think he was referring to the fact that this issue was present for many years, and not only did reviewers not catch on despite common complaints (and HardOCP discussing the issue), but the company making the cards was apparently completely blindsided by it after years and years of Crossfire sales. That's why people who own only one company's cards should try the other side to see that sometimes when someone says something like, "The nVidia cards are smoother in SLI than CF," sometimes--just sometimes--that's not fanboyism. Sometimes, it really is just smoother.

    No, I think the "it took a competing company to shine a light on an issue" comment was more in reference to the fact that nVidia basically had to take AMD by the hand and slowly walk them through how to detect a problem highly prevalent on their products, after years and years of waiting for them to get it.

    They had to take their own measurement software, built custom in-house, and actually hand it over to the other team just to help them get it. This isn't typical competition teaching the other guy what to do.

    This is like Pepsi-Cola taking Coca-Cola by the hand and saying, "Okay, so soda is supposed to have sugar and caffeine. Here is where you get it. Here is our supplier. Try it."

    That's why he's saying it's sad. If AMD had figured it out on their own and fixed it, then yeah, that's competition because they FIGURED IT OUT. Instead, they didn't. It took TechReport slamming them on it with DATA after years of HardOCP just slamming them without data, thousands upon thousands of users saying, "Crossfire is not very good compared to SLI," and then nVidia hand-delivering them FCAT for them to get it.

    Before that, they were clueless. AMD is a company that produces discrete GPUs for the gaming market, and not only did they have no clue how to test for this problem, they didn't even know there WAS a problem, that's how clueless they were.

    And that truly is very sad.
  • Galidou - Thursday, August 1, 2013

    Not sure that it was as present in past products. I owned Crossfire 6850s for a while, then switched to a single 660 Ti and gained not much except lower temps and a little more FPS. The only game where I could tell there was a real, noticeable difference in smoothness was Skyrim, and that was mainly because of textures taking more than the mere 1GB my 6850s had.
  • chizow - Friday, August 2, 2013

    Can't really agree with this; microstutter was documented and covered significantly in the German press for years, but largely ignored by the NA press. The 4870X2's microstutter problems were the first time the issue was really brought to light, by PCGamesHardware, and there's tons of documentation about it if you search. Here's the original test by PCGH:

    http://www.pcgameshardware.com/aid,653711/PCGH-pro...
  • Death666Angel - Saturday, August 3, 2013

    Multi-GPU stuttering was well known pretty much within a few months of multi-GPU solutions existing. The issue with single GPUs also experiencing uneven frame pacing is much newer. And the belief at AMD was that it was an issue that affected AMD and nVidia equally, which is why they never thought about changing it in their drivers - until Scott made the revelations.
  • taltamir - Monday, August 5, 2013

    I personally documented single GPU microstuttering years ago (caused by lack of CPU power on a C2D 8400; the problem was resolved by going to a Q6600, using an nVidia GPU), with hard data (FRAPS individual frame render time records).

    I posted it on the anandtech forums and there was a brisk discussion of it. It wasn't well known, but it shouldn't have completely blindsided the so-called professionals. HisDivineOrder really said it best.
  • chizow - Wednesday, August 7, 2013

    Yes, I remember - there was a lot of user testing that stemmed from the initial reports on PCGH, and the FRAPS frametime methodology became standard, allowing virtually any user who could download FRAPS and work a spreadsheet to illustrate microstutter.

    I do agree though, the pros and the press kept ignoring it and sweeping it under the rug as if it didn't exist, despite countless requests from end-users asking for more detail on it.
