Civilization: Beyond Earth

Shifting gears from action to strategy, we have Civilization: Beyond Earth, the latest in the Civilization series of strategy games. Civilization is not quite as GPU-demanding as some of our action games, but at Ultra quality it can still pose a challenge for even high-end video cards. Meanwhile, as the first Mantle-enabled strategy title, Civilization gives us an interesting look at low-level API performance in larger-scale games, along with a look at developer Firaxis’s use of split frame rendering with Mantle to reduce latency rather than improve framerates.
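For context on that last point, the latency benefit of split frame rendering (SFR) over the more common alternate frame rendering (AFR) comes down to how frames are pipelined across GPUs. The sketch below is a simplified, back-of-the-envelope model with hypothetical numbers; it is not Firaxis’s actual Mantle implementation, only an illustration of why having both GPUs work on halves of the same frame shortens the input-to-display delay even when throughput is comparable.

    # Illustrative multi-GPU latency model (hypothetical numbers, not measured data).

    def afr_latency_ms(output_frame_interval_ms: float, gpus: int = 2) -> float:
        # Alternate frame rendering: each GPU renders every `gpus`-th frame, so
        # rendering a single frame takes roughly gpus * the presented frame
        # interval, and input sampled at the start of that render shows up on
        # screen that much later.
        return gpus * output_frame_interval_ms

    def sfr_latency_ms(output_frame_interval_ms: float, split_overhead: float = 1.2) -> float:
        # Split frame rendering: all GPUs work on portions of the same frame, so
        # one frame's render time is roughly the presented interval itself, plus
        # some overhead for splitting the frame and recombining the pieces.
        return output_frame_interval_ms * split_overhead

    if __name__ == "__main__":
        interval = 16.7  # ms between displayed frames at ~60 fps (hypothetical)
        print(f"AFR input-to-display latency: ~{afr_latency_ms(interval):.1f} ms")
        print(f"SFR input-to-display latency: ~{sfr_latency_ms(interval):.1f} ms")

At the same output framerate, the AFR pipeline keeps roughly one extra frame in flight per additional GPU, which is the latency SFR avoids at the cost of more complicated workload splitting.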

Civilization: Beyond Earth - 3840x2160 - Ultra Quality

Civilization: Beyond Earth - 2560x1440 - Ultra Quality

As one of the few games that can hit 60fps on the R9 Fury at 4K with everything turned up, it’s interesting to see how resolution impacts all of our cards in Civilization. At 4K the R9 Fury is well ahead of the GTX 980, surpassing it by 17%. Yet at 1440p that lead all but evaporates; a reference-clocked card would come up just short, and it’s the Sapphire Tri-X R9 Fury’s mild factory overclock that gives it just enough of a boost to stay ahead of the GTX 980.

Meanwhile the Fury/Fury X gap widens ever so slightly here. The R9 Fury is now a full 10% behind the full-fledged R9 Fury X.

Civilization: Beyond Earth - Min. Frame Rate - 3840x2160 - Ultra Quality

Civilization: Beyond Earth - Min. Frame Rate - 2560x1440 - Ultra Quality

The minimum framerate situation for Civilization is very nearly a mirror of the averages. The R9 Fury does relatively well at 4K, but at 1440p it’s now neck-and-neck with the GTX 980 once again.

Comments

  • nightbringer57 - Friday, July 10, 2015 - link

    Intel kept it in stock for a while but it didn't sell. So the management decided to get rid of it, gave it away to a few colleagues (Dell, HP, and many other OEMs used BTX for quite a while, both because it was a good user lock-down solution and because the drawbacks of BTX didn't matter in OEM computers, while the advantages were still there) and no one ever heard of it on the retail market again?
  • nightbringer57 - Friday, July 10, 2015 - link

    Damn those not-editable comments...
    I forgot to add: with the switch from the NetBurst/Prescott architecture to Conroe (and its successors), CPU cooling became much less of a hassle for mainstream models, so Intel did not have anything left to gain from the effort put into BTX.
  • xenol - Friday, July 10, 2015 - link

    It survived in OEMs. I remember cracking open Dell computers in the latter half of the 2000s and finding out they were BTX.
  • yuhong - Friday, July 10, 2015 - link

    I wonder if a BTX2 standard that fixes the problems of original BTX is a good idea.
  • onewingedangel - Friday, July 10, 2015 - link

    With the introduction of HBM, perhaps it's time to move to socketed GPUs.

    It seems ridiculous for the industry-standard spec to devote so much space to the comparatively low-power CPU whilst the high-power GPU has to fit within the confines of (multiple) PCIe expansion slots.

    Is it not time to move beyond the confines of ATX?
  • DanNeely - Friday, July 10, 2015 - link

    Even with the smaller PCB footprint allowed by HBM, filling up the area currently taken by expansion cards would only give you room for a single GPU + support components on an mATX-sized board (most of the space between the PCIe slots and the edge of the mobo is used for other stuff that would need to be kept, not replaced with GPU bits), and the tower cooler on top of it would be a major obstruction for any non-GPU PCIe cards you might want to put into the system.
  • soccerballtux - Friday, July 10, 2015 - link

    man, the convenience of the socketed GPU is great, but just think of how much power we could have if it had its own dedicated card!
  • meacupla - Friday, July 10, 2015 - link

    The clever design trend, or at least what I think is clever, is where the GPU and CPU heatsinks are connected together, so that instead of many smaller heatsinks trying to cool one chip each, you have one giant heatsink doing all the work, which can result in less space (as opposed to volume) being occupied by the heatsink.

    You can see this sort of design on high end gaming laptops, Mac Pro, and custom water cooling builds. The only catch is, they're all expensive. Laptops and Mac Pro are, pretty much, completely proprietary, while custom water cooling requires time and effort.

    If all ATX mobos and GPUs had their core and heatsink mounting holes in the exact same spot, it would be much easier to design a 'universal multi-core heatsink' that you could just attach to everything that needs it.
  • Peichen - Saturday, July 11, 2015 - link

    That's quite a good idea. With heat-pipes, distance doesn't really matter, so if there were a CPU heatsink that could extend 4x 8mm/10mm heatpipes over the video card to cool the GPU, it would be far quieter than the 3x 90mm fan coolers on video cards now.
  • FlushedBubblyJock - Wednesday, July 15, 2015 - link

    330 watts transferred to the low-lying motherboard, with PINS attached to amd's core failure next...
    Slap that monster heat onto the motherboard, then you can have a giant green plastic enclosure like Dell towers to try to move that heat outside the case... oh, plus a whole 'nother giant VRM setup on the motherboard... yeah they sure will be doing that soon... just lay down that extra 50 bucks on every motherboard with some 6X VRMs just in case the amd fanboy decides he wants to buy the megawatter amd rebranded chip...

    Yep, NOT HAPPENING !
