Metro 2033

Our first analysis is with the perennial reviewers’ favorite, Metro 2033. It appears in a lot of reviews for a couple of reasons – it has an easy-to-use benchmark GUI that anyone can run, and it is often very GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 title that can challenge most systems that try to run it at any high-end settings. Developed by 4A Games and released in March 2010, we use the built-in DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
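The batch-averaging methodology described above can be sketched as follows. This is a minimal illustration, not the actual test harness; the function name and the sample FPS numbers are invented, while the run counts and the ~5% first-batch inflation come from the text.

```python
# Sketch of the methodology above: Metro 2033's first batch of runs
# can report inflated frame rates (up to ~5%), so the first batch is
# discarded and only the second batch of four runs is averaged.

def average_second_batch(batches):
    """batches: list of batches, each a list of per-run average FPS.
    Returns the mean FPS of the second batch only."""
    if len(batches) < 2:
        raise ValueError("need at least two batches of runs")
    second = batches[1]
    return sum(second) / len(second)

# Hypothetical example: batch 1 inflated by ~5%, batch 2 reported.
runs = [
    [52.5, 52.8, 53.1, 52.6],  # batch 1 (inflated, ignored)
    [50.1, 49.8, 50.3, 50.0],  # batch 2 (used for the result)
]
print(round(average_second_batch(runs), 2))  # -> 50.05
```

Discarding the warm-up batch this way avoids crediting a CPU or GPU for one-off caching effects in the first set of runs.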

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor gets a full x16 allocation, and there seems to be no split between any of the processors with four threads or more. Processors with two threads fall behind, but not by much – the X2-555 BE still gets 30 FPS. There also seems to be no split between PCIe 3.0 and PCIe 2.0, or with respect to memory.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we start using two GPUs in the setup, the Intel processors have an advantage, with even those running PCIe 2.0 a few FPS ahead of the FX-8350. Both core count and single-thread speed seem to have some effect (the i3-3225 is quite low, and the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0, with the i7-3770K in an x8/x4/x4 configuration surpassing the FX-8350 in an x16/x16/x8 by almost 10 frames per second. So far there seems to be no advantage to a Sandy Bridge-E setup over an Ivy Bridge one.

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While we have limited results at four GPUs, PCIe 3.0 wins against PCIe 2.0 by around 5%.
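To put the PCIe 2.0 vs 3.0 gap in context, the theoretical per-lane bandwidth differs by roughly 2x: PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with 128b/130b encoding. These are published spec figures, not something measured in this article; the arithmetic can be sketched as:

```python
# Theoretical PCIe bandwidth per lane, derived from the signaling
# rate and the line-code overhead of each generation.

def lane_bandwidth_mbps(gt_per_s, payload_bits, total_bits):
    """Effective MB/s per lane: GT/s x encoding efficiency / 8 bits."""
    return gt_per_s * 1e9 * (payload_bits / total_bits) / 8 / 1e6

pcie2 = lane_bandwidth_mbps(5.0, 8, 10)     # 8b/10b: 20% overhead
pcie3 = lane_bandwidth_mbps(8.0, 128, 130)  # 128b/130b: ~1.5% overhead

print(f"PCIe 2.0: {pcie2:.0f} MB/s per lane")  # -> 500
print(f"PCIe 3.0: {pcie3:.0f} MB/s per lane")  # -> 985
print(f"x16 slot: {pcie2 * 16 / 1000:.1f} vs {pcie3 * 16 / 1000:.1f} GB/s")
```

So an x16 slot nearly doubles its bandwidth moving from PCIe 2.0 to 3.0, which is why the gap only shows up once three or four cards are sharing lanes and feeding each other frame data.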

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU throughput limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to a single GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests – the more GPU power in the system, the more important the CPU and platform choice becomes, but that choice does not really matter until you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.


  • yougotkicked - Tuesday, June 04, 2013 - link

    As always, thanks for the great article and hard work Ian.

    I'd really like to see how a few of the tests scale with overclocked CPUs, notably those in which the Sandy Bridge processors were competitive with Ivy Bridge and Haswell parts. Obviously overclocking introduces a lot of variables into your testing, but it would be very interesting to see a few of the popular choices tested (Sandy Bridge parts @ 4.5 are quite common, and many users on such systems were waiting for Haswell before they upgrade).
  • eBombzor - Tuesday, June 04, 2013 - link

    Crysis 3 benchmarks PLEASE!!
  • frozentundra123456 - Tuesday, June 04, 2013 - link

    Interesting results, but very limited as well. Why test at a resolution used by only 4% of the players?

    I would have rather seen the results at 1080p, over a wider variety of games. Especially RTS games and newer games like crysis 3, FC3, and Tomb Raider. I tested Heart of the Swarm on my computer with a HD7770 and i5 2320 and was able to max out the cpu in a 10 player skirmish match at ultra, 1080p. So I am sure an A8-5600 would be limiting in that case.

    Even considering the results only of the games tested, the A8-5600k seems a strange choice. The i3 seems just as valid, considering it is equal or faster in every game but one, while using less power.
  • makerofthegames - Tuesday, June 04, 2013 - link

    Question - are those blank entries for the Xeons because they could not run, or just that data was not collected for them?
  • Awful - Tuesday, June 04, 2013 - link

    Glad to see there's no reason to upgrade the i5-2500k in my desktop yet - still happily chugging away at 4.9GHz after 2 years!
  • holistic - Tuesday, June 04, 2013 - link

    Ian,

    Thank you, for your time, effort, and energy in compiling an encyclopedic database on the effects of cpu on single and multi gpu configurations, in alternate gaming/engine scenarios. Your work is insightful, informative, and wholly devoted to the science of benchmarking. This approach has helped me, as a relatively new computer enthusiast, to more deeply understand testing methodology in the computing field.

    I am interested in the pure CPU benchmarks of Starcraft 2 with the 4770k and 4670k. I understand this game is not optimized, is directx9, and is extremely cpu limited with only 2 maximum cores active, and thus not in top priority for providing benchmarks. Will haswell be added to the benchmarking database for sc2?

    Cheers,

    Craig
  • khanov - Tuesday, June 04, 2013 - link

    Ian, I have to say (again) that i7-3820 should be in this review.
    You say that the i7-4770K is a better value proposition than Sandy Bridge-E (X79), I assume because you are only thinking of the expensive 6-core X79 CPUs. That changes if you do consider the i7-3820.

    X79 brings far better support for multi-gpu setups with enough PCIe lanes to feed multiple cards quite happily. No PLX needed. Pair that with an i7 3820 (cheaper than i7-3770K/i7-4770K) and you may find the performance surprisingly good for the price.
  • chizow - Friday, June 07, 2013 - link

    I considered the 3820 numerous times (it's cheap at MC, same price as a high-end 3770K/4770K) but I shy away because it inexplicably performs *WORSE* than the 2700K/3770K/4770K. I don't know why; it has more L3 cache, and is clocked higher before/after boost. Just an oddball chip.

    Besides, X79 as a platform was dated almost as soon as it released. PCIe 3.0 support is spotty with Nvidia (the reg hack isn't guaranteed), there's no native USB 3.0 and no full SATA 6G support. I went for Z87 + 4770K instead because X79 + 3820 didn't offer any noticeable advantages while carrying a significantly higher price tag (board price).
  • TheJian - Wednesday, June 05, 2013 - link

    So if you take out the 1920x1200 from the steam survey (4.16 - 2.91% right?), you've written an article for ~1.25% of the world. Thanks...I always like to read about the 1% which means absolutely nothing to me and well, 98.75% of the world.

    WHO CARES? As hardocp showed even a Titan still can't turn on EVERY detail at even 1920x1080. I would think your main audience is the 99% with under $1000 for a video card (or worse for multigpu) and another $600-900 for a decent 1440p monitor you don't have to EBAY from some dude in Korea.

    Whatever...The midpoint to you is a decimal point of users (your res is .87%, meaning NOT ONE PERCENT & far less have above that so how is that midpoint? I thought you passed MATH)?...Quit wasting time on this crap and give us FCAT data like pcper etc (who seems to be able to get fcat results into EVERY video card release article they write).

    "What we see is 30.73% of gamers running at 1080p, but 4.16% of gamers are above 1080p. If that applies to all of the 4.6 million gamers currently on steam, we are talking about ~200,000 individuals with setups bigger than 1080p playing games on Steam right now, who may or may not have to run at a lower resolution to get frame rates."

    That really should read ~55,000 if you take away the 2.91% that run 1920x1200. And your gaming rig is 1080p because unless you have a titan (which still has problems turning it all on MAX according to hardocp etc to remain playable) you need TWO vid cards to pull off higher than 1920x1200 without turning off details constantly. If you wanted to game on your "Korean ebay special" you would (as if I'd ever give my CC# to some DUDE in a foreign country as Ryan suggested in the 660TI comment section to me, ugh). It's simply a plug change to game then a plug change back right? Too difficult for a Doctor I guess? ;)

    This article needs to be written in 3 years maybe with 14nm gpus where we might be able to run a single gpu that can turn it all on max and play above 30fps while doing it and that will still be top rung, as I really doubt maxwell will do this, I'm sure they will still be turning stuff off or down to stay above 30fps min, just as Titan has to do it for 1080p now. Raise your hand if you think a $500 maxwell card will be 2x faster than titan.

    1440p yields an overall pixel count of 3,686,400 pixels, substantially higher than the 2,073,600 pixels found on a 1080p monitor/tv etc. So since Titan is SHORT of playing ALL games maxed on 1080p we would need ~2x the power at say $500 for it to be even called anywhere NEAR mainstream at 1440p right? I don't see NV's $500 range doing 2x Titan with maxwell and that is 6-9 months away (6 for AMD volcanic, ~7-9 for NV?). Raise your hand if you call $500 mainstream...I see no hands. They may do this at 14nm for $300 but this is a long ways off right and most call $200 mainstream right? Hence I say write this in another 3yrs when the 1080p number of users in the steam survey (~31%) is actually the 1440p#. Quit writing for .87% please and quit covering for AMD with FCAT excuses. We get new ones from this site with every gpu article. The drivers changed, some snafu that invalidated all our data, not useful for this article blah blah, while everyone else seems to be able to avoid all anandtech's issues with FCAT and produce FCAT after FCAT results. Odd you are the ONLY site AMD talked to directly (which even Hilbert at Guru3d mentions...rofl). Ok, correction. IT'S NOT ODD. AMD personal attention to website=no fcat results until prototype/driver issues are fixed....simple math.

    http://www.alexa.com/siteinfo/anandtech.com#
    Judging your 6 month traffic stats I'd say you'd better start writing REAL articles without slants before your traffic slides to nothing. How much more of a drop in traffic can you guys afford before you switch off the AMD love? Click the traffic stats tab. You have to be seeing this right Anand? Your traffic has nearly halved since ~9 months ago and the 660TI stuff. :) I hope this site fixes its direction before Volcanic & Maxwell articles. I might have to start a blog just to pick the results of those two apart along with a very detailed history of the previous articles and comments sections on them. All in one spot for someone to take in at once, I'm sure many would be able to do the math themselves and draw some startling conclusions about the last year on this site and how it's changed. I can't wait for Ryan's take on the 20nm chips :)
  • Laststop311 - Wednesday, June 05, 2013 - link

    Who actually buys a computer and does nothing but game on it every second they are on it? That's why the A8-5600K should not be the recommended CPU; it's just going to drag you down in everything else you do with the computer. The i5-2500K should be here too. You can get them for a STEAL used on eBay - I've seen them go for around 140-150. Sure, you can pay 100-110 on eBay for the A8-5600K, but is a 40 dollar savings worth that much performance loss?
