Metro 2033

Our first analysis is with the perennial reviewers' favorite, Metro 2033. It appears in a lot of reviews for a couple of reasons: it has an easy-to-use benchmark GUI, and it is often heavily GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 title that can challenge most systems that try to run it at high-end settings. Developed by 4A Games and released in March 2010, we use the inbuilt DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
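
As a quick illustration of that methodology (a minimal sketch with made-up frame rates, not our actual test harness), the reported figure is simply the mean of the second batch:

    # Illustrative sketch: derive the reported Metro 2033 score from two
    # batches of four runs. The first batch is discarded because Metro
    # tends to inflate those scores by up to 5%.
    def metro_score(first_batch, second_batch):
        assert len(first_batch) == 4 and len(second_batch) == 4
        return sum(second_batch) / len(second_batch)

    # Hypothetical FPS values for one CPU/GPU combination:
    print(metro_score([33.1, 32.8, 33.0, 32.9],   # first batch, ignored
                      [31.4, 31.6, 31.5, 31.5]))  # prints 31.5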

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor gets a full x16 allocation and there seems to be no split between any of the processors with four threads or more. Processors with two threads fall behind, but not by much; the X2-555 BE still gets 30 FPS. Nor is there any split between PCIe 3.0 and PCIe 2.0, or with respect to memory.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we start using two GPUs in the setup, the Intel processors have an advantage, with even those running PCIe 2.0 a few FPS ahead of the FX-8350. Both core count and single-threaded speed seem to have some effect (the i3-3225 scores quite low, while the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in an x16/x16/x8 configuration by almost 10 frames per second. There seems to be no advantage to having a Sandy Bridge-E setup over an Ivy Bridge one so far.
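
As a rough sanity check (an illustrative back-of-the-envelope calculation, not data from our testing), per-lane throughput is roughly 500 MB/s for PCIe 2.0 and 985 MB/s for PCIe 3.0, so the two lane allocations actually give each card similar raw bandwidth, suggesting the CPU itself contributes to the gap as well:

    # Approximate one-directional bandwidth per PCIe lane, in GB/s:
    # PCIe 2.0 = 5 GT/s with 8b/10b encoding   -> ~0.5 GB/s
    # PCIe 3.0 = 8 GT/s with 128b/130b encoding -> ~0.985 GB/s
    PCIE2, PCIE3 = 0.5, 0.985

    i7_3770k = [8 * PCIE3, 4 * PCIE3, 4 * PCIE3]    # x8/x4/x4 on PCIe 3.0
    fx_8350  = [16 * PCIE2, 16 * PCIE2, 8 * PCIE2]  # x16/x16/x8 on PCIe 2.0

    print("i7-3770K slots (GB/s):", i7_3770k)  # ~[7.88, 3.94, 3.94]
    print("FX-8350 slots (GB/s):", fx_8350)    # [8.0, 8.0, 4.0]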

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While we have limited results, PCIe 3.0 wins against PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual-core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU throughput-limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

As with a single GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests: the more powerful the GPU setup, the more important the CPU choice becomes, although that choice does not really matter until you get to at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.

111 Comments

  • FBB - Tuesday, June 04, 2013 - link

    They've had over 5 million concurrent online users. The total number will be much higher.
  • DanNeely - Tuesday, June 04, 2013 - link

    What exactly does Steam count as online? Does just having the client sit in my tray count, or do I need to be playing a Steam game at the time to be counted?
  • wicko - Tuesday, June 04, 2013 - link

    Definitely just signed in: 823,220 Players In-Game | 4,309,324 Players Online
    Source: http://steamcommunity.com/
  • chizow - Tuesday, June 04, 2013 - link

    Thanks for the tests, there are a lot of data points in there so that's always appreciated.

    I would've liked to have seen some higher perf Nvidia solutions in there though, at the very least some Kepler parts. It looks like a lot of the higher end Intel parts hit a GPU bottleneck at the top, which is not unexpected at 1440p with last-gen Fermi parts.

    What it does show for sure is that you may want to pause before going beyond 2-way CF/SLI if you have to drop below x8 on that 3rd slot. Which means you will probably have to shell out for one of the pricier boards. Hard not to recommend X79 at this point for 3-way or higher, although the lack of official PCIe 3.0 support was a red flag for me.

    I went with the Gigabyte Z87x UD4 because I don't ever intend to go beyond 2-way SLI, and the 3rd slot being x4 (2.0) was better than the x8/x4/x4 (3.0) config on most boards, which gives me the option to run a PhysX card and retain x8/x8 (3.0) for my two main cards.
  • Gunbuster - Tuesday, June 04, 2013 - link

    So I'll stick with my 2600K @ 4.5GHz and continue to ponder which new Korean 27" LCD to get. Tech is pretty boring at the moment.
  • wicko - Tuesday, June 04, 2013 - link

    I haven't bothered overclocking my 2600K and I still feel it's plenty powerful. I think I may get a second GTX 670 though, as Metro Last Light doesn't run all that great at 2560x1440.
  • kallogan - Tuesday, June 04, 2013 - link

    Haswell, haswell, haswell. Making one paper per day about it will not make it better. Boring CPU gen. Wake me up when something interesting shows up.
  • chizow - Tuesday, June 04, 2013 - link

    So I guess the solution is to just ignore the launch to placate all those who have no interest in the launch, rather than post reviews and info about it for the ones that actually do? Doesn't make a lot of sense.

    If it doesn't interest you, move along.
  • Dentons - Tuesday, June 04, 2013 - link

    His complaint is on the mark. Haswell is about mobile, not desktop, not gaming.

    Ivy Bridge was about cost reduction, Haswell is about reducing TDP. It is shocking that a mid-range 2+ year old Sandy Bridge desktop part is still so very competitive, even though it's been superseded by two whole generations.

    Intel deserves all this criticism and more. They've clearly put the interests of desktop users and gamers far onto the back burner. They're now focused almost entirely on mobile and are treading water with everything else.
  • takeship - Tuesday, June 04, 2013 - link

    Eh, how can you blame them? The pure-play desktop market has been shrinking for a while now, with the high-performance desktop (basically gamers) even more of a niche. Maybe if they had some real competition from AMD in single-threaded perf... A lot of this is just Amdahl's law at its natural conclusion. The easy performance gains are mostly gone, so if you're Intel, do you dump endless money into another 25-30% per generation, or go after the areas that haven't been well optimized yet instead? Not a hard choice to make, especially considering the market's move towards mobile & cool computing in the last decade.
