Metro 2033

Our first analysis is with the perennial reviewers' favorite, Metro 2033. It shows up in a lot of reviews for a couple of reasons: it has an easy-to-use benchmark GUI, and it is often heavily GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 benchmark that can challenge most systems running it at high-end settings. Developed by 4A Games and released in March 2010, it includes the built-in DirectX 11 Frontline benchmark, which we use to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores for the first batch by up to 5%.
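
To make that scoring method concrete, here is a minimal sketch of the logic in Python. It assumes each run's average FPS has already been collected into per-batch lists; the example numbers are purely illustrative, not measured results.

    from statistics import mean

    def metro_score(batches: list[list[float]]) -> float:
        """Report the mean FPS of the second batch of runs.

        Metro 2033 tends to inflate first-batch results by up to ~5%,
        so the first batch is treated as warm-up and discarded.
        """
        if len(batches) < 2:
            raise ValueError("need a warm-up batch plus a scored batch")
        return mean(batches[1])

    # Purely illustrative numbers: the warm-up batch runs slightly hot.
    warm_up = [52.1, 51.8, 52.4, 52.0]   # discarded
    scored  = [49.7, 50.1, 49.9, 50.3]   # reported score: mean = 50.0
    print(metro_score([warm_up, scored]))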

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor runs the card at a full x16 allocation, and there seems to be no split between any of the processors with four threads or more. Processors with two threads fall behind, but not by much: the X2-555 BE still manages 30 FPS. There seems to be no split between PCIe 3.0 and PCIe 2.0, or with respect to memory speed.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we move to two GPUs, the Intel processors have an advantage: even those running PCIe 2.0 are a few FPS ahead of the FX-8350. Both core count and single-threaded speed seem to have some effect (the i3-3225 is quite low, and the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in x16/x16/x8 by almost 10 frames per second. There seems to be no advantage to having a Sandy Bridge-E setup over an Ivy Bridge one so far.
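
The lane arithmetic helps explain why the narrower gen-3 layout is not the handicap it looks like on paper. A quick sanity check, using the standard per-direction spec rates (5 GT/s with 8b/10b encoding for PCIe 2.0, 8 GT/s with 128b/130b for PCIe 3.0) rather than anything measured in this test:

    # Approximate per-direction PCIe bandwidth per lane, in GB/s.
    # PCIe 2.0: 5 GT/s with 8b/10b encoding   -> 0.500 GB/s per lane
    # PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
    GEN2_PER_LANE = 5.0 * (8 / 10) / 8     # 0.500 GB/s
    GEN3_PER_LANE = 8.0 * (128 / 130) / 8  # ~0.985 GB/s

    for lanes in (16, 8, 4):
        print(f"x{lanes:<2}  PCIe 2.0: {lanes * GEN2_PER_LANE:5.2f} GB/s"
              f"  |  PCIe 3.0: {lanes * GEN3_PER_LANE:5.2f} GB/s")

A gen-3 x4 link carries roughly the same bandwidth as a gen-2 x8, so the 3770K's x8/x4/x4 layout delivers about 7.9/3.9/3.9 GB/s per card, in the same ballpark as the gen-2 x16/x16/x8 layout's 8/8/4 GB/s.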

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While our results at this level are limited, PCIe 3.0 wins against PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual-core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU throughput limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to a single GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests: the more GPU power in the system, the more the CPU and platform choice matters, but that choice only starts to make a real difference once you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.

Comments

  • DigitalFreak - Thursday, May 9, 2013 - link

    Do people not read the article? He works with what he has on hand and what he can get access to.
  • tackle70 - Thursday, May 9, 2013 - link

    Great article! The only gripe I would have (and yes I know the reasoning behind it is explained) is the decision not to include Crysis 3 in the testing.

    The reason I make that gripe is that, even though it has no time demo functionality and adds more work, it is the closest thing to a next-gen game we have right now, and it is also the *only* game I've seen that reliably eats up as much CPU power and as many cores as you give it. It would have been interesting to see it here.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Core i7-860 overclocked to 3.6 GHz, GTX 580 SLI, PCIe 2.0 x8/x8 = min 44, average 49. Final scene: destroying the Ceph Alpha. No overclock on the GPUs but plenty of headroom. Not scientific, but it would be useful to see the same scene if someone has a more up-to-date processor.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Res: 1920 x 1080
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Great review; the CPU has definitely become less important. I used to change my CPU around every 18 months or my system would show signs of struggling. I bought my i7-860 in 2009 and it is sitting alongside two GTX 580s (SLI x8/x8). Nearly four years with it seems like an eternity, and having had my first GTX 580 since early 2011 makes this the longest I have kept the same GPU. Shows you that games developers don't challenge the hardware like they used to.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    People who make comments like this do not understand that it is about building a properly balanced system so that you get maximum bang for your buck. This takes skill and a proper understanding of hardware capabilities and technology. On a gaming system you can trade down on the processor and buy a better GPU (or an SSD, or both). When you get it right you get more FPS for the same or less money, faster loading times, and overclocking headroom to use at a later date.
  • oSHINSAo - Thursday, May 9, 2013 - link

    Well, I thought my 2600K was old... but from what I'm seeing it is very close to the 3rd-gen i7-3770K... I will stick with it and focus on getting a CrossFireX config...
  • T1K0L P0G1 - Friday, May 10, 2013 - link

    EXCELLENT WORK!!!
  • gnasen - Friday, May 10, 2013 - link

    Nice article. Still missing a few of the legends: Q9550, i7-920, i5-750.
