Civilization V

A game that has plagued my testing over the past twelve months is Civilization V. Being on the older 12.3 Catalyst drivers was somewhat of a nightmare, giving no scaling, and as a result I dropped it from my test suite after only a couple of reviews. With the later drivers used for this review the situation has improved, but only slightly, as you will see below. Civilization V seems to run into a scaling bottleneck very early on, and throwing additional GPUs at it only makes performance worse.

Our Civilization V testing uses Ryan's GPU benchmark test, all wrapped up in a neat batch file. We test at 1440p and report the average frame rate of a five-minute test.
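
For anyone wanting to replicate the general approach rather than Ryan's exact batch file, the idea is simply to loop the game's built-in benchmark and average the reported frame rate. Below is a minimal Python sketch of that pattern; the executable name, command-line flags and log format are illustrative placeholders, not the actual Civilization V benchmark interface:

    import re
    import statistics
    import subprocess

    # Hypothetical stand-in for the batch file described above: run the game's
    # built-in benchmark several times and average the reported frame rate.
    # The executable name, flags and log format are placeholders for illustration.
    BENCH_CMD = ["CivilizationV.exe", "-benchmark", "-resolution", "2560x1440"]
    LOG_FILE = "benchmark_results.log"
    RUNS = 3

    def read_average_fps(log_path):
        """Pull an 'Average FPS: xx.x' line out of a (placeholder) results log."""
        with open(log_path) as log:
            match = re.search(r"Average FPS:\s*([\d.]+)", log.read())
        if match is None:
            raise RuntimeError("No average FPS found in " + log_path)
        return float(match.group(1))

    results = []
    for run in range(RUNS):
        subprocess.run(BENCH_CMD, check=True)   # one five-minute benchmark pass
        results.append(read_average_fps(LOG_FILE))

    print("Average FPS over {} runs: {:.1f}".format(RUNS, statistics.mean(results)))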

One 7970

Civilization V - One 7970, 1440p, Max Settings

Civilization V is the first game where we see a gap when comparing processor families. A big part of what makes Civ5 perform at its best seems to be PCIe 3.0, followed by CPU performance – our PCIe 2.0 Intel processors sit a little behind the PCIe 3.0 models. Because we do not have a PCIe 3.0 AMD motherboard in for testing, the bad rap falls on AMD until PCIe 3.0 becomes part of its mainstream platforms.

Two 7970s

Civilization V - Two 7970s, 1440p, Max Settings

The power of PCIe 3.0 is more apparent with two 7970 GPUs; however, it is worth noting that only processors such as the i5-2500K and above actually improve their performance with the second GPU. Everything else stays relatively flat.

Three 7970s

Civilization V - Three 7970s, 1440p, Max Settings

More cores and PCIe 3.0 are the winners here, but no configuration scales beyond two GPUs.

Four 7970s

Civilization V - Four 7970s, 1440p, Max Settings

Again, no scaling.

One 580

Civilization V - One 580, 1440p, Max Settings

While the top-end Intel processors again take the lead, it is interesting that, now that every result is on PCIe 2.0, the non-hyperthreaded i5-2500K takes the top spot, 10% ahead of the FX-8350.

Two 580s

Civilization V - Two 580s, 1440p, Max Settings

We have another Intel/AMD split, as none of the AMD processors scale beyond the first GPU. On the Intel side, you need at least an i5-2500K to see scaling, similar to what we saw with the 7970s.

Civilization V conclusion

Intel processors are the clear winners here, though no single model stands out over the others. Having PCIe 3.0 seems to be the deciding factor for Civilization V, but in most cases multi-GPU scaling is still off the table unless you have a monster machine.

116 Comments

  • Dentons - Tuesday, June 4, 2013

    His complaint is on the mark. Haswell is about mobile, not desktop, not gaming.

    Ivy Bridge was about cost reduction; Haswell is about reducing TDP. It is shocking that a mid-range, 2+ year old Sandy Bridge desktop part is still so very competitive, even though it's been superseded by two whole generations.

    Intel deserves all this criticism and more. They've clearly put the interests of desktop users and gamers far onto the back burner. They're now focused almost entirely on mobile and are treading water with everything else.
  • takeship - Tuesday, June 4, 2013

    Eh, how can you blame them? The pure-play desktop market has been shrinking for a while now, with high-performance desktop (basically gamers) even more of a niche. Maybe if they had some real competition from AMD in single-threaded perf... A lot of this is just Amdahl's law at its natural conclusion. The easy performance gains are mostly gone, so if you're Intel, do you dump endless money into another 25-30% per generation, or go after the areas that haven't been well optimized yet instead? Not a hard choice to make, especially considering the market's move towards mobile & cool computing over the last decade.
  • Silma - Wednesday, June 5, 2013

    Intel doesn't deserve criticism. Haswell is a small improvement over Ivy Bridge because it has become extremely difficult to optimize an already excellent processor. Do you see anything better from AMD, ARM, Oracle or others?

    Is there a need to upgrade from Ivy to Haswell? No. Was it necessary to upgrade from Nehalem to Sandy Bridge? No. The fact is that for most applications processors have been good enough for years, and the money is better spent on SSDs, GPUs and whatnot.

    The real conclusion of this article should be that processors absolutely do not matter for gaming and that the money is better spent on a speedier GPU. Processors may become relevant for the very, very few people that have 2x/3x extreme cards. Even a setup with two mid-range cards such as the GTX 560 is not CPU dependent. I would welcome actual statistics on the number of players with 2x/3x high-end GPUs. I'm quite sure the count is ultra tiny, and for those people willing and able to spend thousands of dollars, do you think $100 less on a processor is relevant?
  • chizow - Wednesday, June 5, 2013

    I don't have a problem with the conclusion he comes to; complaining about the dissemination of information used to reach that conclusion is what makes no sense. Put all the information out there, 1, 2, 3 articles a day, no problem, then make your own informed decision on the platform. Bemoaning the fact that there is actual coverage a day or two after launch, and one or two reviews? Makes no sense.
  • Memristor - Tuesday, June 4, 2013

    Too bad that Richland, which is available as of today, didn't make it into this review. Other than that, great read.
  • eddieobscurant - Tuesday, June 4, 2013

    Many of us have a Q6600 @ 3600MHz, and personally I'm very happy with this and my 7870. I would still like to see a comparison of my CPU @ 3600MHz with the modern CPUs, because I don't think there is a huge difference in games.
  • chizow - Tuesday, June 4, 2013

    It depends what you play; any game that is CPU limited is going to show a HUGE difference with that CPU. I had the same chip at 3.6GHz, which was great btw, and even when I upgraded to a 920 @ 4GHz there was a huge improvement in some games, most notably GTA4 at the time. Some other games that scale extremely well with the CPU are WoW, Diablo 3, etc., just to name a few.
  • medi02 - Wednesday, June 5, 2013

    Nah. Most of the tests show that to get CPU limited you need a multi-GPU setup.
    An i7 and an Intel mobo will cost you about $500 for marginal improvements.
  • chizow - Wednesday, June 5, 2013

    Sorry, just not true. Even with just one 680, WoW and other similarly CPU-dependent games scale tremendously well with faster CPUs:

    http://www.tomshardware.com/reviews/fx-8350-visher...

    Q6600 @ 3.6 is probably just a tad faster than the Phenom IIs in that test.
  • TheJian - Thursday, June 6, 2013

    See my comments here... Chizow is correct, and even understating it some. There are a LOT of CPU-limited games, as I showed in my links. There are huge differences in CPU perf from the A10-5800K up to the 4770K, never mind the A8-5600 Ian recommends here for a single GPU. It just isn't correct to recommend that CPU, or even the A10-5800K, which I showed getting smacked around in many games at 1080p. Articles like this make people think games are not CPU bound (it's far more games than Civ5): Neverwinter, Metro Last Light, Tomb Raider, Far Cry 3, Crysis 3, etc. Once 20nm comes we may find even 1440p showing just as many games limited by the CPU; if the rumors are true, Volcanic doubles the stream processors, and I'm sure NV will match that. You end up GPU bound when you up the res to 1440p on single cards now, but that won't be forever, and 98.75% of us according to Steam don't play at 1440p (0.87%) or above (1.25% total for all resolutions above 1920x1200).

    Check the 1080p data in my links (TechReport was a good one, as they show 1080p in most of the listed games). Tom's shows Neverwinter, as I noted, needing a very fast CPU also. Hit 'all comments' on this article and Ctrl-F my name. Ignore my post comments and just click the links in them to prove Chizow's point (and my own). The CPU is important at 1080p and 1920x1200 NOW and will be important at higher res with the next-gen cards at 20nm. You will never get out of your AMD mistake if you take this article's suggestions. Well, at least not without changing to an Intel board/chip... LOL. Who wants to do that? Just buy Intel unless you're broke. Don't trust me though; read the links provided and judge for yourself how accurate AnandTech is here.

    I showed some games that are nearly DOUBLE on Intel vs. the A10-5800K! You don't have to like the way I made my point or believe me; just check the links :) They all say the same thing: the CPU is an issue, just as Chizow shows in his link. You can find this in many CPU articles where they use a top GPU (usually a 7970/680) and test new CPUs with the oldies in there too, which shows large separations. Check i7-3770K or FX-8350 articles (just google those two CPU models and "review" for ample sites showing the spread)... 1080p separates the men from the boys in CPUs.

    After you check the links (and Chizow's), come back and agree AnandTech needs to change their ways, or tear my comments apart if I'm lying :) Future GPUs will only make our point stick out even more. The CPU matters. Also note that a lot of the games that are GPU limited on single cards are NOT playable anyway (check Sleeping Dogs right here in this article at 1440p... a 7970 at 28fps average is NOT playable; minimums will dip to the 20s or below). So you're forced back to 1080p, where you're CPU limited in a lot of cases. Where 98.75% of us play, you see CPU limits a lot.

    Go back one page in Chizow's link to Skyrim's benchmark in the same article for the same data. At 1080p the 3770 scores 88.2 to the 8350's 67.4 (that's a lot, and a huge hint as to how your future on AMD will look)
    http://www.tomshardware.com/reviews/fx-8350-visher...
    That's a 30% difference, and an FX-8350 is far faster than the A8-5600 Ian recommends here. Chizow is even more right if you toss in Ian's recommendation of an even slower CPU than the 8350 vs. Intel's stuff. Even in Skyrim at 1680x1050 they separate, 90fps vs. 68fps for the FX-8350. So until you completely tap out your GPU (1440p and up, which basically requires 2+ cards) you will notice whether your CPU is junk or not. Since this article is apparently only written for 1.25% of the readership (or the world for that matter, according to the Steam survey), you will notice the CPU! Unless you're raising your hand as the 1.25% :) I don't call 30-100% faster a marginal improvement, do you? Add Civ5 also, which this site even proves in this article ;) At least they got something right.
