Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

The first of the games AMD allowed us to publish results for, Bioshock is a straight-up brawl between the 290X and the GTX 780 at 2560. The 290X’s performance advantage here is just 2%, much smaller than its earlier leads and essentially leaving the two cards tied; this is also one of the few games where the 290X can’t match GTX Titan. At 2560 everything of 290X/GTX 780 class or better can stay above 60fps despite the heavy computational load of the depth of field effect, making the 290X the first single-GPU card from AMD that can pull this off.

Meanwhile at 4K things end up being rather split depending on the quality setting. At Ultra quality the 290X and GTX 780 are again tied, but neither is above 30fps. Drop down to Medium quality, however, and we get framerates above 60fps again, while at the same time the 290X finally pulls away from the GTX 780, beating it by 14% and even edging out GTX Titan. As with so many of the games we’re looking at today, in our opinion the loss in quality cannot justify the higher resolution, but it presents another scenario where the 290X demonstrates superior 4K performance.

For no-compromises 4K gaming we once again turn our gaze towards 290X CF and GTX 780 SLI, where AMD does very well for themselves. While AMD and NVIDIA are nearly tied at the single-GPU level – keep in mind we’re in uber mode for CF, so the uber 290X has a slight performance edge in single-GPU mode – with multiple GPUs in play AMD sees better scaling from AFR and consequently better overall performance. At 95% the 290X CF achieves a nearly perfect scaling factor here, while GTX 780 SLI achieves only 65%. Curiously this is better for AMD and worse for NVIDIA than the scaling factors we see at 2560, which are 86% and 72% respectively.
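The AFR scaling factor quoted above is just the fractional frame rate gained by adding a second GPU. A minimal sketch, with hypothetical frame rates chosen only to reproduce the 95% figure (they are not the article's data):

```python
def afr_scaling(single_gpu_fps: float, multi_gpu_fps: float) -> float:
    """Fractional performance gained from the second GPU.

    1.0 (100%) would be perfect 2-way AFR scaling; 0.0 means the
    second card contributed nothing.
    """
    return multi_gpu_fps / single_gpu_fps - 1.0

# Hypothetical numbers for illustration only:
single, crossfire = 30.0, 58.5
print(f"Scaling: {afr_scaling(single, crossfire):.0%}")  # Scaling: 95%
```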

Moving on to our FCAT measurements, it’s interesting to see just how greatly improved the frame pacing is for the 290X versus the 280X, even with the frame pacing fixes in place for the 280X. Whereas the 280X has deltas in excess of 21%, the 290X brings those deltas down to 10%, better than halving the variance in this game. Consequently the frame time consistency we’re seeing goes from acceptable but measurably worse than NVIDIA’s to essentially equal. In fact 10% is outright stunning for a multi-GPU setup, as we rarely achieve frame times this consistent on those setups.

Finally, for 4K gaming our variance increases a bit, but not immensely so. Despite the heavier rendering workload and the greater demands of moving these large frames around, the delta percentages hold at 13%.
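The delta percentages above can be sketched as frame-to-frame variation relative to the average frame time. This is one plausible formulation of a FCAT-style consistency metric, not necessarily the exact definition used for these results, and the frame times below are invented for illustration:

```python
def delta_percentage(frame_times_ms):
    """Average frame-to-frame time delta as a percentage of the mean
    frame time. Lower means more consistent frame pacing."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * (sum(deltas) / len(deltas)) / mean_frame_time

# Hypothetical frame times (ms): mostly steady with a couple of hitches
times = [16.7, 18.0, 16.2, 17.5, 16.9, 19.0, 16.4]
print(f"{delta_percentage(times):.1f}% variance")
```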

396 Comments
  • Sandcat - Friday, October 25, 2013 - link

    That depends on what you define as 'acceptable frame rates'. Yeah, you do need a $500 card if you have a high-refresh-rate monitor and use it for 3D games, or just for improved smoothness in non-3D games. A single 780 with my brother's 144Hz Asus monitor is required to get ~90 fps (i7-930 @ 4.0) in BF3 on Ultra with MSAA.

    The 290X almost requires liquid cooling...the noise is offensive. Kudos to those with the equipment, but really, AMD cheaped out on the cooler in order to hit the price point. Good move, imho, but too loud for me.
  • hoboville - Thursday, October 24, 2013 - link

    Yup, and it's hot. It will be worth buying once the manufacturers can add their own coolers and heat pipes.

    AMD has always been slower at lower res, but better in the 3x1080p to 6x1080p arena. They have always aimed for high-bandwidth memory, which always performs better at high res. This is good for you as a buyer because it means you'll get better scaling at high res. It's essentially forward-looking tech, which is good for those who will be upgrading monitors in the next few years when 1440p IPS starts to become more affordable. At low res the bottleneck isn't RAM but compute power. Regardless, buying a Titan / 780 / 290X for anything less than 1440p is silly; you'll be way past the 60-70 fps human eye limit anyway.
  • eddieveenstra - Sunday, October 27, 2013 - link

    Maybe 60-70fps is the limit, but at 120Hz, 60fps will give noticeable lag. 75 is about the minimum. That or I have eagle eyes. The GTX 780 still dips into low framerates at 120Hz (1920x1080). So the whole debate about the Titan or 780 being overkill @1080p is just nonsense. (GTX 780 120Hz gamer here)
  • hoboville - Sunday, October 27, 2013 - link

    That really depends a lot on your monitor. When they talked about G-Sync, frame lag, and smoothness, they mentioned that when FPS doesn't exactly match the refresh rate you get latency and bad frame timing. That you have this problem with a 120 Hz monitor is no surprise, as at anything less than 120 FPS you'll see some form of stuttering. When FPS > refresh rate, you won't notice this. At home I use a 2048x1152 @ 60 Hz, and beyond 60 FPS all the extra frames are dropped, whereas in your case some frames will "hang" when you are getting less than 120 FPS, because each frame has to "sit" on the screen for an interval until the next one is displayed. This appears as stuttering, and you need to get a higher FPS from the game for frame delivery to appear smoother. This is because apparent delay decreases as a ratio of [delivered frames (FPS) / monitor refresh speed]. Once the ratio is small enough, you can no longer detect apparent delay. In essence 120 Hz was a bad idea, unless you get G-Sync (which means a new monitor).
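    The "hang" the comment describes can be sketched with a simple v-sync model (this model is an assumption for illustration, not something the commenter specified): a rendered frame is flipped onto the display at the first refresh tick after it finishes, so at 90 FPS on a 120 Hz panel some frames occupy one refresh interval and some occupy two.

```python
import math

def hold_times(fps: float, refresh_hz: float, n_frames: int = 8):
    """How many refresh intervals each rendered frame stays on screen,
    assuming simple v-sync: a frame is flipped onto the display at the
    first refresh tick at or after it finishes rendering."""
    refresh = 1000.0 / refresh_hz            # ms per refresh tick
    frame = 1000.0 / fps                     # ms to render one frame
    holds, last_flip = [], 0.0
    for i in range(1, n_frames + 1):
        ready = i * frame
        # small epsilon guards against float rounding at exact ticks
        flip = math.ceil(ready / refresh - 1e-9) * refresh
        holds.append(round((flip - last_flip) / refresh))
        last_flip = flip
    return holds

print(hold_times(90, 120))   # [2, 1, 1, 2, 1, 1, 2, 1] -- uneven holds read as stutter
print(hold_times(120, 120))  # [1, 1, 1, 1, 1, 1, 1, 1] -- perfectly even delivery
```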

    Get a good 1440p IPS at 60 Hz and you won't have that problem, and the image fidelity will make you wonder why you ever bought a monitor with 56% of 1440p pixels in the first place...
  • eddieveenstra - Sunday, October 27, 2013 - link

    To be honest, I would never think about going back to 60Hz. I love 120Hz but don't know a thing about IPS monitors. Thanks for the response....

    Just checked it and that sounds good. When they become more affordable I will start thinking about that. Seems like IPS monitors are better with colors and have less blur at 60Hz than TN. link: http://en.wikipedia.org/wiki/IPS_panel
  • Spunjji - Friday, October 25, 2013 - link

    Step 1) Take data irrespective of different collection methods.

    Step 2) Perform average of data.

    Step 3) Completely useless results!

    Congratulations, sir; you have broken Science.
  • nutingut - Saturday, October 26, 2013 - link

    But who cares if you can play at 90 vs 100 fps?
  • MousE007 - Thursday, October 24, 2013 - link

    Very true, but remember, the only reason nvidia prices their cards where they are is because they could. (Eg Intel CPUs v AMD) Having said that, I truly welcome the competition as it makes it better for all of us, regardless of which side of the fence you sit.
  • valkyrie743 - Thursday, October 24, 2013 - link

    The card runs at 95C and sucks power like no tomorrow. It only beats the 780 by a very little, and it does not overclock well.

    http://www.youtube.com/watch?v=-lZ3Z6Niir4
    and
    http://www.youtube.com/watch?v=3OHKWMgBhvA

    http://www.overclock3d.net/reviews/gpu_displays/am...

    I like his review. It's honest and shows the facts. I'm not an Nvidia fanboy, nor am I an AMD fanboy, but I'll take Nvidia right now over AMD.

    I do like how this card is priced and the performance for the price. It makes the Titan not worth 1000 bucks (or the 850 bucks it goes for used on forums). But as for the 780: if you get a non-reference 780, it will be faster than the 290X and put out LESS heat and LESS noise, as well as use less power.

    Plus the GTX 780 Ti is coming out in mid November, which will probably cut the cost of the current 780 to 550, and that card would probably be around 600 and beat this card even more.
  • jljaynes - Friday, October 25, 2013 - link

    You say the review sticks with the facts – he starts off talking about how ugly the card is, so it needs to beat a Titan. And then in the next sentence he says the R9 290X will cost $699.

    He sure seems to stick with the facts.
