Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

With Bioshock we once again see the 290 trailing the 290X by a small margin, this time by 5%. It’s the difference between technically sustaining a 60fps average at 2560 and just barely falling short. Meanwhile, compared to the GTX 780 the 290 is handed its first loss, though by an even narrower margin of only 3%. More to the point, on a pure price/performance basis the 290 would need to lose by quite a bit more to offset the $100 price difference.
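To make the price/performance point concrete, here’s a minimal sketch of the arithmetic in Python, assuming the $400/$500 prices discussed in this review; the frame rates are illustrative placeholders reflecting the ~3% gap, not our measured benchmark figures:

    # Price/performance sketch. Prices are the cards' launch prices;
    # the frame rates are illustrative placeholders, not measured results.
    cards = {
        "R9 290":  {"price": 400, "fps": 58.0},
        "GTX 780": {"price": 500, "fps": 60.0},  # ~3% faster in this title
    }

    for name, card in cards.items():
        print(f"{name}: {card['fps'] / card['price'] * 100:.1f} fps per $100")

    # At $100 more, the GTX 780 would need to be ~25% faster, not ~3%,
    # to match the 290 on a pure fps-per-dollar basis.

In other words, a 3% performance deficit against a 25% price deficit is not a close call on value.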

Meanwhile, it’s interesting to note not only how much faster the 290 is than the 280X or the GTX 770, but also the 7950B. The 290 series is not necessarily intended to be an upgrade for existing 7900 series owners, but because the 7950’s performance was set so much lower than the 7970/280X’s, and because the 290 performs so closely to the top-end 290X, a sizable gap opens up between the 7950 and its official replacement. With a performance difference just shy of 50%, the 290 is reaching the point where it’s a practical upgrade for 7950 owners, particularly those who purchased one in early 2012 at its full $450 launch price. It’s nowhere near a full generational jump, but it’s certainly a lot more than we’d expect from a GPU manufactured on the same process as the 7950’s GPU, Tahiti.
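The upgrade math works out similarly; a minimal sketch, where the ~1.48x ratio reflects the “just shy of 50%” gap cited above and the 1.5x “practical upgrade” threshold is our own rule of thumb, not anything official:

    # Upgrade-uplift sketch. The uplift ratio reflects the "just shy of
    # 50%" gap cited above; the 1.5x threshold is a rule of thumb, not
    # anything from AMD or this review.
    uplift_290_over_7950b = 1.48
    practical_threshold = 1.50

    gain = (uplift_290_over_7950b - 1) * 100
    print(f"R9 290 over 7950B: +{gain:.0f}%")
    if uplift_290_over_7950b >= practical_threshold:
        print("Clears the practical-upgrade threshold")
    else:
        print("Just shy of the practical-upgrade threshold")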

Comments

  • TempAccount007 - Saturday, November 9, 2013 - link

    What part of REFERENCE COOLER do you not understand?
  • johnny_boy - Wednesday, November 13, 2013 - link

    The IF isn't so big, I think. A lot of gamers already have blocks for their graphics cards, or don't care much about the additional noise, or want a block anyway at some point, and the 290 presents an opportunity to get one now (and then cooling is quieter/better than with the competing nVidia cards at the same price once you figure in the watercooling costs for the AMD card). I'd rather get the 290 (over the 780) and use my current watercooling solution. If I didn't have watercooling I'd still rather buy the 290 and upgrade to watercooling.
  • tgirgis - Thursday, February 20, 2014 - link

    That's really one-sided. First of all, AMD already has a response to G-Sync (their version has been dubbed "Free-Sync" for now, though no idea if that nomenclature is final), they have TressFX (which, at the moment, does look better than Nvidia's "Hairworks", though Nvidia will probably catch up soon), and they've got Mantle, which is definitely a massive advantage.

    Not to mention the R9 290 comes with 4GB of VRAM, as opposed to the GTX 780's 3GB, though it's really not a huge issue except in 4K gaming. Finally, Shield compatibility isn't really a benefit; it's a $250 handheld game system, so it's only beneficial if you're interested in purchasing one of those, as opposed to being an included feature.

    Nvidia is not without its advantages, however: they still have lower power consumption and thermals, which is great for mini-ITX systems (although manufacturer custom-cooled cards can help bridge the gap on thermals), and they do still have PhysX.

    If Mantle keeps going the way it is now, Nvidia might be forced to pay royalties to AMD, similar to how they did with Intel a few years back. If anything, AMD should throw "allow us to use PhysX" into the negotiations :)
  • slickr - Tuesday, November 5, 2013 - link

    Oh yeah, Nvidia at this point has no choice but to lower its prices again. I mean, for $400 this card is amazing. It performs on the same level as the $1000 Titan and the $550 290X, so that's giant performance at a very cheap price.

    Even with the high noise (just wait 2 weeks for custom coolers), this card blows the GTX 780 out of the water; the performance is so much better.

    I think if Nvidia wants to stay competitive they'll need to cut the GTX 780 to at least $400 as well and try to win sales on better acoustics and lower power consumption, but if it were just performance in question they'd need to lower the 780 to $350, or 300 euros.

    Of course that would mean that the 770 should get a price reduction as well and be around $270.
  • holdingitdown - Tuesday, November 5, 2013 - link

    Yes this card is incredibly disruptive. The performance makes the 780 look like a mess. Expect to see at least another $100 slashed off the 780 and the 770 needs a little more taken off.

    The R9 290 is a monster!
  • crispyitchy - Tuesday, November 5, 2013 - link

    Best card to release yet as far as I am concerned.

    The noise profile is not perfect, but every card gets noisy to one degree or another once you're gaming.

    What is perfect is the giant performance at this price.

    Newegg here I COME
  • Wreckage - Tuesday, November 5, 2013 - link

    I doubt NVIDIA will cut their price. This card is so loud that most people will stay away and get a 780 or 770. AMD is so desperate to increase performance that they sacrifice everything else. It's like the last sad days of 3DFX.
  • Da W - Tuesday, November 5, 2013 - link

    Remember what happened after 3Dfx died? Higher prices and mediocre performance.
    I'd buy AMD if only to keep them alive and force Nvidia to drop their prices.
  • HisDivineOrder - Tuesday, November 5, 2013 - link

    Actually, traditionally, 3dfx was overpriced until the very end. ATI was always there competing with nVidia and 3dfx, anyway.

    So competition has existed for as long as we've had discrete GPUs in any meaningful way. It's AMD that wants to end competition by standardizing high-performance PC gaming around a GCN-based API that only they can use meaningfully.
