Metro 2033

Our first analysis is with the perennial reviewers’ favorite, Metro 2033. It appears in a lot of reviews for a couple of reasons: it has a very easy-to-use benchmark GUI that anyone can use, and it is often very GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 benchmark that can challenge most systems that try to run it at high-end settings. Developed by 4A Games and released in March 2010, we use the inbuilt DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
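
As a concrete illustration of that averaging method, here is a minimal sketch (the two-batch structure and the 5% figure come from the description above; the function name and the sample numbers are invented for illustration):

    # Metro 2033 is run in two batches of four; only the second batch is
    # averaged, since the first batch can inflate scores by up to 5%.
    def metro_average_fps(first_batch, second_batch):
        # Discard the first batch entirely; its scores run hot.
        return sum(second_batch) / len(second_batch)

    # Hypothetical scores: the first batch reads high, the second settles down.
    print(metro_average_fps([33.1, 32.8, 32.9, 33.0], [31.5, 31.4, 31.6, 31.5]))
    # -> 31.5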

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor has a full x16 allocation, and there seems to be no split between any of the processors with four threads or more. Processors with two threads fall behind, but not by much: the X2-555 BE still manages 30 FPS. There is also no split between PCIe 3.0 and PCIe 2.0, or with respect to memory.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we start using two GPUs in the setup, the Intel processors have an advantage, with even those running PCIe 2.0 a few FPS ahead of the FX-8350. Both core count and single-threaded speed seem to have some effect (the i3-3225 is quite low, and the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in an x16/x16/x8 configuration by almost 10 frames per second. So far there seems to be no advantage to a Sandy Bridge-E setup over an Ivy Bridge one.
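
To see why a PCIe 3.0 x8/x4/x4 split can keep pace with a PCIe 2.0 x16/x16/x8 one, a back-of-the-envelope bandwidth calculation helps (a sketch using the published per-lane signalling rates; the slot layouts are the ones named above):

    # One-direction bandwidth per slot. PCIe 2.0 runs at 5 GT/s per lane with
    # 8b/10b encoding (500 MB/s per lane); PCIe 3.0 runs at 8 GT/s per lane
    # with 128b/130b encoding (~985 MB/s per lane).
    MB_PER_LANE = {"2.0": 5000 * 8 / 10 / 8, "3.0": 8000 * 128 / 130 / 8}

    def slot_bandwidth_mb(gen, lanes):
        return lanes * MB_PER_LANE[gen]

    for gen, layout in [("3.0", (8, 4, 4)), ("2.0", (16, 16, 8))]:
        print(gen, [round(slot_bandwidth_mb(gen, lanes)) for lanes in layout])
    # 3.0 [7877, 3938, 3938]  <- a 3.0 x8 slot is nearly a 2.0 x16 (8000 MB/s)
    # 2.0 [8000, 8000, 4000]

In other words, the 3770K's PCIe 3.0 x8 slot gives up almost nothing to a full PCIe 2.0 x16, and its x4 slots are comparable to a 2.0 x8.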

Four 7970s

Metro 2033 - Four 7970, 1440p, Max Settings

While we have only limited results at four GPUs, PCIe 3.0 wins against PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual-core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU-throughput-limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to one GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests: the more powerful the GPU setup, the more important the CPU choice becomes, and that choice does not matter much until you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.

Comments

  • iamezza - Thursday, May 9, 2013 - link

    I have 3 x 1080p displays and a 7970; on modern games it isn't possible to get 60 FPS without turning settings way down. You really need 2 x 7970 to maintain 60+ FPS.
  • TheInternal - Saturday, May 11, 2013 - link

    I'm guessing it does 30+ FPS comfortably though?
  • marc1000 - Wednesday, May 8, 2013 - link

    good work Ian!

    that's a LOT of data, but the best part is the explanation of WHY. hope it makes matters clear.

    side note: it was nice to see the link to www.adrenaline.com.br ! those guys are insane indeed! =D
  • Doomtomb - Wednesday, May 8, 2013 - link

    I have an i7-875K. I would like you to include an i7 from the Westmere/Nehalem generation. Thanks!
  • mapesdhs - Monday, May 20, 2013 - link
    I'm doing lots of tests that should help in your case. If you want me to test anything specific,
    feel free to PM. I have the same 875K, but also 2500K, 2700K, 3930K, Ph2 965, QX9650
    and many others.

    Ian.
  • Pheesh - Wednesday, May 8, 2013 - link

    I'm really surprised that minimum FPS wasn't also tested. Testing only for average FPS is not that informative about the actual experience you will have. Given the choice between two CPUs, I'd take one averaging 70 FPS with a minimum of 50 over one that averages 80 FPS but has a minimum of 30.
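
    A minimal sketch of the distinction this comment draws, assuming per-frame times logged in milliseconds (the numbers are invented for illustration):

    # Average FPS hides stutter; the minimum (or a low percentile) exposes it.
    frame_times_ms = [14, 15, 13, 14, 33, 14, 15, 14, 40, 14]

    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    min_fps = min(1000.0 / t for t in frame_times_ms)

    print(f"average: {avg_fps:.1f} FPS, minimum: {min_fps:.1f} FPS")
    # -> average: 53.8 FPS, minimum: 25.0 FPS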
  • mip1983 - Wednesday, May 8, 2013 - link

    Perhaps some games are more CPU limited; I'm thinking of MMOs like Planetside 2, where there are a lot of players at once. Not sure how you'd benchmark that game though.
  • bebimbap - Wednesday, May 8, 2013 - link

    Ian, I know you are a BL2 fan. The game is written on an old UT engine, I'm told, so its performance scaling isn't the same as some of these other titles. The method of testing you used is similar to how I buy my own equipment and recommend to others.

    With my same 3770K clocked at the stock 3.9 GHz I can only get about 57 FPS with my GTX 670; when it is overclocked to 4.7 GHz that same scene becomes GPU limited at 127 FPS on my 144 Hz LCD. I'm glad you posted this. When people ask for my advice on what hardware to buy, I always tell them to aim for a resolution first (1080p, for example), then decide what game they want to play and at what performance preset (mid settings at 120 Hz, say), and then buy a GPU/CPU combo that complements those settings. If the budget allows, up the hardware a tier or two. Too many times I see people just buy a top-tier GPU and wonder why their FPS is lower than expected. My way, expectations are met and then, if the budget allows, exceeded. I hope you start a trend with this report, so that others can go this route when performing upgrades.
  • Michaelangel007 - Wednesday, May 8, 2013 - link

    The article is a good start! Pity it didn't include the Tomb Raider benchmark that anyone can run, nor a discussion of the badly implemented Windows timer frequency that Lucas Hale documented with his "TimerResolution" program. HyperMatrix found that lowering the default timer resolution from 10 ms down to 1 ms allowed for roughly 30% higher framerate and performance in Crysis 3.
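
    For reference, a minimal sketch of the mechanism this comment refers to: on Windows, the system timer resolution can be lowered per process via the documented winmm calls timeBeginPeriod/timeEndPeriod (TimerResolution is a separate third-party tool; whether this helps a particular game, as claimed for Crysis 3, is the commenter's report, not something verified here):

    import ctypes

    winmm = ctypes.windll.winmm   # Windows-only
    winmm.timeBeginPeriod(1)      # request 1 ms timer resolution
    try:
        pass                      # ... run the timing-sensitive workload ...
    finally:
        winmm.timeEndPeriod(1)    # every timeBeginPeriod needs a matching call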
