Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

With Bioshock we once again see the 290 trailing the 290X by a small margin, this time 5%. It’s the difference between sustaining a 60fps average at 2560 and just barely falling short. Meanwhile, compared to the GTX 780 the 290 is handed its first loss, though by an even narrower margin of just 3%. More to the point, on a pure price/performance basis the 290 would need to lose by quite a bit more than that to offset the $100 price difference.
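To put that price/performance point in rough numbers, here’s a quick back-of-the-envelope sketch. It assumes $399 and $499 price tags for the 290 and GTX 780 respectively (consistent with the $100 gap cited above) and the 3% performance deficit measured here; performance is normalized rather than taken from raw fps, so treat the output as illustrative only.

```python
# Rough price/performance comparison using the figures from this review.
# Performance is normalized to GTX 780 = 1.00; the 290 trails it by 3%.
cards = {
    "GTX 780": {"price": 499, "perf": 1.00},
    "R9 290":  {"price": 399, "perf": 0.97},  # 3% slower in Bioshock Infinite
}

for name, card in cards.items():
    # Relative performance per dollar, scaled up for readability
    perf_per_dollar = card["perf"] / card["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf units per $1000")
    # → ~2.00 for the GTX 780, ~2.43 for the R9 290
```

Even while losing the benchmark outright, the 290 comes out roughly 20% ahead on performance per dollar, which is the crux of the argument above.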

Meanwhile, it’s interesting to note not only how much faster the 290 is than the 280X or the GTX 770, but also than the 7950B. The 290 series is not necessarily intended as an upgrade for existing 7900 series owners, but because the 7950’s performance was set so much lower than the 7970/280X’s, and because the 290 performs so closely to the top-end 290X, a sizable gap opens up between the 7950 and its official replacement. With a performance difference just shy of 50%, the 290 is reaching the point where it’s a practical upgrade for 7950 owners, particularly those who purchased the card in early 2012 and paid the full $450 launch price. It’s nowhere near a full generational jump, but it’s certainly a lot more than we’d expect from a GPU manufactured on the same process as the 7950’s GPU, Tahiti.

295 Comments

  • just4U - Wednesday, November 6, 2013 - link

    You have to ask yourself whether Ryan is biased toward Nvidia or AMD... or maybe it's simply his tolerance for noise that is the issue.

    Anyway.. people buying these cards will have some options. For me the 95C is a no go, as is the noise; something I'd only tolerate until a good aftermarket solution could be implemented. AMD and Nvidia (until the Titan reference cooler) have always been a little meh.. with reference coolers. We all know this..

    My last two cards have been AMD ones, and if I was in the market for a card today I'd go straight for the Nvidia 780. Not because of its speeds, certainly not because of its drivers, and not because I am a fan. I simply like their kickass reference cooler and games bundle.

    I'm not in the market though, lol. Quite happy with my Radeon 7870.. and not looking to upgrade yet.
  • jbs181818 - Thursday, November 7, 2013 - link

    With all that power consumption, what size PSU is required? Assuming one GPU, a Haswell CPU, and one SSD.
  • dwade123 - Thursday, November 7, 2013 - link

    The 290X doesn't make sense when the cheaper 290 performs almost identically. And neither can max out Crysis 3. Gamers are better off waiting for real next-gen cards like Maxwell, and with next-gen console ports coming in 2014, it's common sense to do so.
  • polaco - Tuesday, November 12, 2013 - link

    "neither can max out Crysis 3" – what the hell are you talking about?
    52 fps at 2560x1440 HQ + FXAA
    77 fps at 1920x1080 HQ + FXAA
    By that line of thinking, neither the 780 nor the Titan is worthy either, since the fps difference is minimal.

    "gamers are better off waiting for real next-gen cards like Maxwell"
    Well, the 290 and 290X are AMD's true next-gen cards. Maybe you feel fooled for having bought a 780 for almost 700 bucks and think Maxwell will relieve that pain, or maybe you work for NVidia's marketing department... By the time NVidia comes out with it, AMD will be pushing their next gen too; will you recommend waiting then as well? So we wait forever, huh?
    "and with next-gen console ports coming in 2014 suggests it is common sense to do so"
    You mean wait for an NVidia card to run games that will be optimized for the AMD hardware inside every next-gen console?
    Please go see a doctor....
  • TempAccount007 - Saturday, November 9, 2013 - link

    Who the hell uses a reference cooler on any AMD card? The only people who buy reference cards are those who are going to water cool them.
  • NA1NSXR - Monday, November 11, 2013 - link

    If I were in the market for a card I'd wait until the aftermarket cooler designs come out. That should make the noise and temp situation a little more bearable. Still, proprietary nVidia value-adds like HBAO+, adaptive vsync, TXAA, etc. are hard for me to give up. It's a hard call. If the 780 were only $50 more than the 290 I'd take the 780, but since the difference is $100... I don't know. Really tough call.
  • beck2448 - Tuesday, November 12, 2013 - link

    Too noisy and hot.
  • devilskreed - Tuesday, November 12, 2013 - link

    Hail High AMD.. the gamers' saviour!!!
    Hail High AMD.. the price/performance king
    Hail High AMD.. the people's choice..

    Healthy competition from AMD's side. I stopped buying Nvidia after the 8800GT :p purely due to the price/performance benefits that AMD offers..
  • bloodbones - Thursday, November 14, 2013 - link

    The battle between AMD (ex-ATI) and Nvidia has been around since I was 18 years old, and I am 30 now. Over the years I have tried a huge number of video cards from both companies, and the only conclusion is that things have always been the same; nothing changes over the years: more or less the same performance, and:
    Nvidia = more expensive but higher-quality cards, lower noise levels, lower temps
    Ati/AMD = cheaper cards with higher noise levels, higher temps
    Period.
  • horse07 - Thursday, November 14, 2013 - link

    Guys, when will you update the 2013 GPU benchmarks with the recent R7/R9 and 700 series?
