Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive on its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

As opposed to our previous game, with Bioshock the GTX 780 Ti comes out as a very strong contender, easily surpassing everything else from AMD and NVIDIA. Here it bests AMD’s best by 18%, and against GTX Titan and GTX 780 it’s 7% and 20% ahead respectively. Though admittedly, everything here is averaging better than 60fps at this point.

Meanwhile for the AFR matchup, with a pair of GTX 780 Tis we’re looking at either framerates that will make a 120Hz gamer happy, or enough horsepower to take on 4K at our highest settings and still come out well ahead. At 57.3fps the GTX 780 Ti is several frames per second ahead of the 290X CF, coming up just short of a 60fps average even at this very high resolution.


  • A5 - Thursday, November 7, 2013 - link

    BF4 has a built-in benchmark too, but I have no idea how good it is. I'd guess they're waiting on a patch?

    If nothing else, there will be BF4 results if/when that Mantle update comes out.
  • IanCutress - Thursday, November 7, 2013 - link

    BF4 has a built in benchmark tool? I can't find any reference to one.
  • Ryan Smith - Thursday, November 7, 2013 - link

    BF3 will ultimately get replaced with BF4 later this month. For the moment with all of the launches in the past few weeks, we haven't yet had the time to sit down and validate BF4, let alone collect all of the necessary data.
  • 1Angelreloaded - Thursday, November 7, 2013 - link

Hell man, people still run FEAR as a benchmark because of how brutal it is on GPU/CPU/HDD.
  • Bakes - Thursday, November 7, 2013 - link

I think it's better to wait until driver performance stabilizes for new applications before basing benchmarks on them. If you don't, then early benchmark numbers become useless for comparison's sake.
  • TheJian - Thursday, November 7, 2013 - link

    I would argue warhead needs to go. Servers for that game have been EMPTY for ages and ZERO people play it. You can ask to add BF4, but to remove BF3 given warhead is included (while claiming bf3 old) is ridiculous. How old is Warhead? 7-8 years? People still play BF3. A LOT of people. I would argue they need to start benchmarking based on game sales.
    Starcraft2, Diablo3, World of Warcraft Pandaria, COD Black ops 2, SplinterCell Blacklist, Assassins Creed 3 etc etc... IE, black ops 2 has over 5x the sales of Hitman Absolution. Which one should you be benchmarking?
    Warhead...OLD.
Grid 2: .03mil total sales for PC, says vgchartz.
StarCraft 2: 5.2mil units (just PC).
    Which do you think should be benchmarked?

    Even Crysis 3 only has .27mil units says vgchartz.
    Diablo 3? ROFL...3.18mil for PC. So again, 11.5x Crysis 3.

    Why are we not benchmarking games that are being sold in the MILLIONS of units?
    WOW still has 7 million people playing and it can slow down a lot with tons of people doing raids etc.
  • TheinsanegamerN - Friday, November 8, 2013 - link

Because any halfway decent machine can run WoW? They use the most demanding games to show how powerful the GPU really is. 5760x1080 with 4xMSAA gets 69 FPS with the 780 Ti.
Why benchmark Hitman over Black Ops? Simple, it is not what we call demanding.
They use demanding games, not the super popular games that'll run on hardware from 3 years ago.
  • powerarmour - Thursday, November 7, 2013 - link

    Well, that time on the throne for the 290X lasted about as long as Ned Stark...
  • Da W - Thursday, November 7, 2013 - link

I look at 4K gaming since I play in 3x1 Eyefinity (being +/- 3.5K gaming).
At these resolutions I see an average 1FPS lead for the 780 Ti over the 290X. For $200 more.
Power consumption is about the same.
And as far as temperature goes, it's temperature AT THE CHIP level. Both cards will heat your room equally if they consume the same power.

The debate is really about the cooler, and Nvidia has an outright lead as far as cooling goes.
  • JDG1980 - Thursday, November 7, 2013 - link

    It seems to me that both Nvidia and AMD are charging too much of a price premium for their top-end cards. The GTX 780 Ti isn't worth $200 more than the standard GTX 780, and the R9 290X isn't worth $150 more than the standard R9 290.

    For gamers who want a high-end product but don't want to unnecessarily waste money, it seems like the real competition is between the R9 290 ($399) and the GTX 780 ($499). At the moment the R9 290 has noise issues, but once non-reference cards become available (supposedly by the end of this month), AMD should hold a comfortable lead. That said, the Titan Cooler is indeed a really nice piece of industrial design, and I can see someone willing to pay a bit extra for it.
