Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive on its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

[Chart: Bioshock Infinite - 3840x2160 - Ultra Quality + DDoF]

[Chart: Bioshock Infinite - 3840x2160 - Medium Quality]

[Chart: Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF]

At Bioshock’s highest quality settings the game generally favors NVIDIA’s GPUs, particularly since NVIDIA’s most recent driver release. As a result the 295X2 comes up short of 60fps at Ultra quality at 2160p, and otherwise trails the GTX 780 Ti SLI at both 2160p and 1440p. However it’s interesting to note that at 2160p with Medium quality – a compromise setting mostly for testing single-GPU setups at this resolution – the 295X2 jumps ahead of NVIDIA’s best, illustrating that what ultimately drags down AMD’s performance in this game is a greater degree of bottlenecking from Bioshock’s Ultra quality effects.

[Chart: Bioshock Infinite - Delta Percentages]

[Chart: Bioshock Infinite - Surround/4K - Delta Percentages]

Meanwhile our first set of frame pacing benchmarks more or less sets the stage. Thanks to its XDMA engine the 295X2 is able to deliver acceptable frame pacing at both 1440p and 2160p, though at 1440p in particular NVIDIA technically fares better than AMD here. As for the Radeon HD 7990, it offers a solid example of how AMD’s older GCN 1.0 based dual-GPU card still has great difficulty with frame pacing at higher resolutions.
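For readers curious what a delta percentage chart actually measures, below is a minimal sketch of one plausible definition of such a frame pacing metric: the mean absolute frame-to-frame delta expressed as a percentage of the mean frame time, where lower means more consistent pacing. The exact formula behind the charts above isn’t restated on this page, so treat the function name and sample numbers as illustrative assumptions rather than our methodology.

```python
# Hypothetical sketch of a frame pacing "delta percentage" metric.
# Assumes frame_times_ms is a list of per-frame render times in milliseconds,
# as captured by a tool such as FRAPS or FCAT. This is an illustration of the
# general idea, not the exact formula used for the charts above.

def delta_percentage(frame_times_ms):
    """Mean absolute frame-to-frame delta as a % of the mean frame time."""
    if len(frame_times_ms) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# Perfectly even pacing scores 0%; alternating 10ms/30ms frames (a classic
# dual-GPU micro-stutter pattern) scores 100% despite a healthy average FPS.
print(delta_percentage([16.7] * 5))        # -> 0.0
print(delta_percentage([10, 30, 10, 30]))  # -> 100.0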

Comments

  • Dupl3xxx - Wednesday, April 9, 2014 - link

    $2k+ for a 4K screen? Where are you wasting your money? In Norway you can get a 4K screen for about 5,000 NOK, or roughly 850 USD, including tax! Also, why would you need a $1500 CPU when the 4930K is 200MHz slower, for half the price?

    Also, WHY would you want 32GB of 2400MHz RAM?! There is next to no improvement over 1600MHz!

    As far as SSDs go, a single Samsung 250/500GB should be plenty; you've got 32GB of RAM to use as a buffer!

    And if you want a "tight" system with insane performance, the 295X2 is the best choice ATM. Double the 290X performance, "half" the size.
  • lehtv - Wednesday, April 9, 2014 - link

    Another difference is the way this card handles heat compared to any 290X CF setup apart from custom water cooling. The CLLC combines the benefits of reference GPUs - the ability to exhaust hot air externally rather than into the case - with the benefits of third party cooling - the ability to keep temperatures and noise levels lower than those of reference blower cards. A 290X crossfire setup using reference cooling is not even worth considering for anyone who cares about noise output, while third party 290X crossfire is restricted to cases with enough cooling capacity to handle the heat.
  • Supersonic494 - Friday, April 11, 2014 - link

    You are right, but keep in mind one big limitation of normal CrossFire/SLI is the space taken up by two big dual-slot GPUs, whereas this occupies only one slot; other than that you might as well get two 290Xs.
  • bj_murphy - Friday, April 11, 2014 - link

    A dual-GPU card doesn't require two PCI-E slots; you can't do SLI/CrossFire in a Mini-ITX system, for example.
  • HalloweenJack - Tuesday, April 8, 2014 - link

    Muppet - 20W more in FurMark, and 160W in games - not hundreds more. Keep drinking the AnandTech koolaid.
  • WaltC - Tuesday, April 8, 2014 - link

    Interesting. [H] seems to have done some pretty thorough testing, and the AMD card blows by 780Ti SLI in every single case. Of course, [H] is testing @ 4k resolutions/3-way Eyefinity exclusively--but that's where anyone who shells out this kind of money is going to be. 1080P? Don't make me laugh...;)
  • WaltC - Tuesday, April 8, 2014 - link

    Can't edit, so I'll just say I don't know where "1080P" came from...;)
  • lwooood - Tuesday, April 8, 2014 - link

    Apologies for going slightly OT. Is there any indication of when AMD will fill in the middle of their product stack with GCN 1.1 parts?
  • sascha - Tuesday, April 8, 2014 - link

    I'd like to know that, too!
  • MrSpadge - Tuesday, April 8, 2014 - link

    I would say the indication is 20nm chips, at the end of the year at the earliest.
