Metro 2033

Our first analysis is with the perennial reviewers' favorite, Metro 2033. It appears in many reviews for a couple of reasons: it has a very easy-to-use benchmark GUI that anyone can run, and it is often heavily GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 title that can challenge most systems attempting to run it at high-end settings. The game was developed by 4A Games and released in March 2010; we use the built-in DirectX 11 Frontline benchmark to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
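A minimal sketch of that scoring method in Python; the FPS figures below are hypothetical placeholders, not measured results from these tests.

```python
# Sketch of the scoring method described above: the first batch of runs
# (which Metro can inflate by up to ~5%) is discarded, and the score is
# the average FPS of a second batch of four runs.
# All FPS figures below are hypothetical placeholders, not measured data.

def metro_score(first_batch, second_batch):
    """Return the reported score: the mean FPS of the second batch."""
    assert len(second_batch) == 4, "methodology uses a second batch of 4 runs"
    return sum(second_batch) / len(second_batch)

# Example: the first batch reads slightly high and is ignored.
first = [52.1, 51.8, 51.9, 52.0]   # discarded
second = [49.7, 49.9, 49.5, 49.9]  # used for the reported average
score = metro_score(first, second)
print(round(score, 2))
```
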

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor gets a full x16 allocation, and there is no meaningful split between any of the processors with four threads or more. Processors with two threads fall behind, but not by much; the X2-555 BE still manages 30 FPS. There is also no split between PCIe 3.0 and PCIe 2.0, or with respect to memory speed.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we move to two GPUs, the Intel processors take the advantage, with even those running PCIe 2.0 a few FPS ahead of the FX-8350. Both core count and single-threaded speed seem to have some effect (the i3-3225 is quite low, and the FX-8350 beats the X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in an x16/x16/x8 configuration by almost 10 frames per second. So far there seems to be no advantage to having a Sandy Bridge-E setup over an Ivy Bridge one.

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While we have limited results for four GPUs, PCIe 3.0 beats PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual-core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU-throughput-limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to the single GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests: the more powerful the GPU setup, the more important the CPU choice becomes, yet that choice does not really start to matter until you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.


  • random2 - Wednesday, June 05, 2013 - link

    "What we see is 30.73% of gamers running at 1080p, but 4.16% of gamers are above 1080p."

    So an article and benchmarks are provided for the benefit of the 4.16% of gamers who might be running more pixels, versus the 65% (almost 3 million) lion's share of gamers who must be running fewer pixels than 1080p. Very strange.
    Reply
  • Dribble - Thursday, June 06, 2013 - link

    Just to point out the blindingly obvious, but who would spend big $$$ on a 1440p monitor and a top-end GPU and then buy a low-end budget CPU (A8-5600)?

    The realistic minimum recommendation is going to be an i5-3570K.
    Reply
  • xineis - Thursday, June 06, 2013 - link

    So, how would a 955BE perform compared to the CPUs in the test? From what I understand, I should just keep this CPU, as a new one is not going to make much of a difference?
    Reply
  • Zoatebix - Friday, June 07, 2013 - link

    Thank you for doing all this work. A great follow-up to the original!

    Could you please correct some charts on the CPU Benchmarks page, though? The "Video Conversion - x264 HD Benchmark" section is displaying the charts for the "Grid Solvers - Explicit Finite Difference" section.
    Reply
  • Klimax - Saturday, June 08, 2013 - link

    Frankly, not the best article. The resolution is too high for the GPU, and a CPU is then recommended on that basis: a CPU which will not provide the performance needed for games. (Techreport showed that an APU is not a good idea when paired with a real GPU; FPS might be in range, but latency is terrible.)
    Reply
  • JNo - Sunday, June 09, 2013 - link

    Ian, I'm afraid I have to agree with some of the naysayers here. You've tried so hard to have clean *scientific* analysis that you've failed to see the wood for the trees. In actual fact I fear you've reached the opposite of a scientific conclusion *because* you only focussed on easily obtainable/reproducible results.

    Just because results for modern games are hard to obtain, doesn't mean you can ignore them despite it being a hard path to walk. I have 1440p but agree that it's not relevant to the vast majority and anyone affording a 1440p monitor won't care to save $40 on AMD A8 vs core i5. So you have to be *realistic* (as well as scientific).

    I know from a few years of international finance analysis that when doing an independent study, there is a chance you can come to a conclusion that flies in the face of the market or common opinion. You have to be *SO* careful when this happens and quadruple-check what you have ended up with, because 99% of the time the market or 'hive mind' is correct and there is an error or misunderstanding in your own work. After all, the conglomerate conclusion of hundreds of often intelligent people is hardly likely to be wrong, even if you are a smart guy. The chance that you have found the truth and that everyone else is wrong really is about 1% (yes, it does happen, but it is a once-in-a-blue-moon type of event).

    It might seem a huge hit to admit that much of your hard work was misdirected but it could save more pain in the long run to go back to the drawing board and consider what you are trying to achieve and how best to go about it. A very small sample of older titles at unpopular resolutions really could skew results to be misleading.
    Reply
  • CiccioB - Wednesday, June 12, 2013 - link

    I agree. However, we still have to understand what thesis Ian wanted to demonstrate.
    If it was "AMD CPUs don't have to appear so bad vs Intel", the strategy used for the demonstration is quite good.
    On the other hand, if it was "Let's see which is the best CPU for playing games", the strategy is a complete fail. And it is still partially the same if it were "Let's see which is the cheapest CPU to cope with a bottlenecked GPU", as those old games, all but Civ5, do not have any complex AI or scripts, which are the CPU-intensive tasks.
    If I were to judge this work as homework, I would grade it an F, because it is aimed at a small part of the market, uses old benchmarks that are not valid today, is incomplete (lack of FCAT), and has a wrong setup (bottlenecking GPUs to evaluate CPU performance?).
    Wrong on all aspects; but if the intent was to show that AMD CPUs are just trailing Intel's most expensive ones instead of being a complete generation behind, the evaluation can be a B, though it becomes quite limited if we look at the represented market (is the 3% of the market that is capable of spending well more than the average gamer a good target for demonstrating that they can save a few bucks on an otherwise castrated CPU?).

    For all these reasons I may say that this is one of the worst articles I have ever read on this site. It shows some incompetence or, worse, a bias.
    Reply
  • Filiprino - Thursday, June 20, 2013 - link

    It's cool that you test old CPUs, so we can see the improvement in CPU processing power over the years.
    Reply
  • UltraTech79 - Saturday, June 22, 2013 - link

    This article is irrelevant to 95+% of people. What was the point of it? I don't give a rat's ass what will happen in 3-5 years; I want to know performance numbers for a realistic setup of TODAY.

    Useless.
    Reply
  • core4kansan - Monday, July 15, 2013 - link

    While I appreciate the time and effort you put into this, I have to agree with those who call out 1440p's irrelevance for your readers. I think if we tested at sane resolutions, we'd find that a low-end cpu, like a G2120, coupled with a mid-to-high range GPU, would yield VERY playable framerates at 1080p. I'd love to see some of the older Core 2 Duos up against the likes of a G2120, i3-3220/5, on up to an i5-3570 and higher with a high-end GPU at 1080p. That would be very useful info for your readers and could save many of them lots of money. In fact, wouldn't you rather put your hard-earned money into a better GPU if you knew that you could save $200 on the cpu? I'm hinting that I believe (without seeing actual numbers) that a G2120 plus a high-end GPU would perform virtually identically in gaming to a $300+ cpu with the same graphics accelerator, at 1080p. Sure, you'd see greater variation between the cpus at 1080p, but when we're testing cpus, don't we WANT that?
    Reply
