Metro 2033

Our first analysis is with the perennial reviewers' favorite, Metro 2033. It appears in a lot of reviews for a couple of reasons – it has an easy-to-use benchmark GUI that anyone can run, and it is usually very GPU limited, at least in single-GPU mode. Metro 2033 is a strenuous DX11 title that can challenge most systems at high-end settings. Developed by 4A Games and released in March 2010, it includes a built-in DirectX 11 Frontline benchmark, which we use to test the hardware at 1440p with full graphical settings. Results are given as the average frame rate from a second batch of four runs, as Metro has a tendency to inflate the scores of the first batch by up to 5%.
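The run-averaging methodology described above can be sketched in a few lines of Python. This is only an illustration of the procedure, not our actual harness; the frame-rate values below are made-up examples, not measured results:

```python
def average_fps(results, batch_size=4):
    """Average the second batch of runs, discarding the first batch,
    which Metro 2033 tends to inflate by up to 5%."""
    if len(results) < 2 * batch_size:
        raise ValueError("need two full batches of runs")
    second_batch = results[batch_size:2 * batch_size]
    return sum(second_batch) / batch_size

# Hypothetical numbers: first batch slightly inflated, second batch reported.
runs = [52.1, 51.8, 52.4, 52.0,   # first batch (discarded)
        49.9, 50.1, 50.0, 50.0]   # second batch (averaged)
print(average_fps(runs))  # prints the second-batch average, about 50 FPS
```

Discarding the warm-up batch rather than averaging everything keeps the reported score from being skewed upward by Metro's first-run behavior.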

One 7970

Metro 2033 - One 7970, 1440p, Max Settings

With one 7970 at 1440p, every processor is in full x16 allocation and there seems to be no split between any processor with 4 threads or above. Processors with two threads fall behind, but not by much as the X2-555 BE still gets 30 FPS. There seems to be no split between PCIe 3.0 or PCIe 2.0, or with respect to memory.

Two 7970s

Metro 2033 - Two 7970s, 1440p, Max Settings

When we start using two GPUs in the setup, the Intel processors have an advantage, with those running PCIe 2.0 a few FPS ahead of the FX-8350. Both cores and single thread speed seem to have some effect (i3-3225 is quite low, FX-8350 > X6-1100T).

Three 7970s

Metro 2033 - Three 7970s, 1440p, Max Settings

More results in favour of Intel processors and PCIe 3.0: the i7-3770K in an x8/x4/x4 configuration surpasses the FX-8350 in an x16/x16/x8 by almost 10 frames per second. There seems to be no advantage to having a Sandy Bridge-E setup over an Ivy Bridge one so far.

Four 7970s

Metro 2033 - Four 7970s, 1440p, Max Settings

While we have limited results, PCIe 3.0 wins against PCIe 2.0 by 5%.

One 580

Metro 2033 - One 580, 1440p, Max Settings

From dual core AMD all the way up to the latest Ivy Bridge, results for a single GTX 580 are all roughly the same, indicating a GPU throughput limited scenario.

Two 580s

Metro 2033 - Two 580s, 1440p, Max Settings

Similar to a single GTX 580, we are still GPU limited here.

Metro 2033 conclusion

A few points are readily apparent from the Metro 2033 tests – the more powerful the GPU setup, the more important the CPU choice becomes, and that choice does not matter much until you reach at least three 7970s. At that point, you want a PCIe 3.0 setup more than anything else.

111 Comments


  • Silma - Wednesday, June 05, 2013 - link

    Intel doesn't deserve criticism. Haswell is a small improvement over Ivy Bridge because it has become extremely difficult to optimize an already excellent processor. Do you see anything better from AMD, ARM, Oracle or others?

    Is there a need to upgrade from Ivy to Haswell? No. Was it necessary to upgrade from Nehalem to Sandy Bridge? No. The fact is that for most applications processors have been good enough for years and money is better spent on ssds, gpus and whatnot.

    The real conclusion of this article should be that processors absolutely do not matter for gaming and that the money is better spent on a speedier gpu. Processors may become relevant for the very, very few people that have extreme 2x/3x card setups. Even a setup with 2 middle cards such as the gtx 560 is not cpu dependent. I would welcome actual statistics on the number of players with 2x/3x high-end gpus. I'm quite sure the count is ultra tiny, and for those people willing and able to spend thousands of dollars, do you think $100 less on a processor is relevant?
    Reply
  • chizow - Wednesday, June 05, 2013 - link

    I don't have a problem with the conclusion he comes to, complaining about dissemination of information to come to that conclusion is what makes no sense. Put all the information out there, 1, 2, 3 articles a day np, then make your own informed decision on the platform. Bemoan the fact there is actual coverage a day or two after launch and one or two reviews? Makes no sense. Reply
  • Memristor - Tuesday, June 04, 2013 - link

    Too bad that Richland, which is available as of today, didn't make it into this review. Other than that great read. Reply
  • eddieobscurant - Tuesday, June 04, 2013 - link

    many of us have a q6600 @ 3600mhz, and personally i'm very happy with this and my 7870. I would still like to see a comparison of my cpu @ 3600mhz with the modern cpus, because i don't think there is a huge difference in games. Reply
  • chizow - Tuesday, June 04, 2013 - link

    It depends what you play, any game that is CPU limited is going to show a HUGE difference with that CPU. I had the same chip at 3.6GHz, which was great btw, and even when I upgraded to a 920 @ 4GHz there was a huge improvement in some games, most notably GTA4 at the time. Some other games that scale extremely well with CPU are WoW, Diablo 3, etc. just to name a few. Reply
  • medi02 - Wednesday, June 05, 2013 - link

    Nah. Most of the tests show that to get CPU limited you need a multi-GPU setup.
    An i7 and intel mobo will cost you about $500 with marginal improvements.
    Reply
  • chizow - Wednesday, June 05, 2013 - link

    Sorry, just not true. Even with just 1x680 WoW and other similarly CPU dependent games scale tremendously well with faster CPUs:

    http://www.tomshardware.com/reviews/fx-8350-visher...

    Q6600 @ 3.6 is probably just a tad faster than the Phenom IIs in that test.
    Reply
  • TheJian - Thursday, June 06, 2013 - link

    See my comments here...Chizow is correct, and even understating it some. There are a LOT of games cpu limited as I showed in my links. Huge differences in cpu perf from A10-5800 up to 4770k, never mind the junk Ian recommends here A8-5600 for single gpu. It just isn't correct to recommend that cpu or even A10-5800K which I showed getting smacked around in many games at 1080p. Articles like this make people think games are not cpu bound (it's far more games than Civ5). Neverwinter, Metro Last light, tomb raider, Farcry3, Crysis 3 etc etc...Once 20nm comes we may find even 1440p showing just as many limited by cpu. If rumors are true Volcanic doubles stream processors. I'm sure NV will match that. You end up gpu bound when you up the res to 1440 on single cards now, but that won't be forever and 98.75% of us according to steam don't play at 1440p (.87%) or above (1.25% total of all res above 1920x1200).

    Check the 1080p data on my links (techreport was a good one as they show 1080p in most of the listed games). Toms shows neverwinter as I noted needing a very high cpu also. Hit all comments on this article, and Ctrl-F my name. Ignore my post comments and just click the links in them to prove Chizow's point (and my own). CPU is important at 1080p and 1920x1200 NOW and will be important at higher res with the next gen cards at 20nm. You will never get out of your AMD mistake if you take this article's suggestions. Well at least not without changing to an Intel board/chip...LOL. Who wants to do that? Just buy an Intel unless you're broke. Don't trust me though, read the links provided and judge for yourself how accurate anandtech is here.

    I showed some games that are nearly DOUBLE on Intel vs. A10-5800K! You don't have to like the way I made my point or believe me, just check the links :) They all say the same thing. CPU is an issue just as Chizow shows in his link. You can find this in many cpu articles where they use a top gpu (usually 7970/680) and test new cpus with the oldies in there too, which show large separations. Check i7-3770k or fx 8350 articles (just google those two cpu models and "review" for ample sites showing the spread)...1080p separates the men from the boys in cpu's.

    After you check the links (and chizow's), come back and agree Anandtech needs to change their ways, or tear my comments apart if I'm lying :) Future gpu's will only make our point stick out even more. CPU matters. Also note a lot of the games that are gpu limited on single cards are NOT playable anyway (check sleeping dogs right here in this article 1440p...7970 at 28fps avg is NOT playable, mins will dip to 20's or below). So you're forced back into cpu limited in a lot of cases at 1080p. Where 98.75% of us play you see cpu limits a lot.

    Go back one page on Chizow's link to Skyrim's benchmark in the same article for the same data. 1080p 3770 scores 88.2 to 8350's 67.4 (that's a lot and a huge hint to how your future on AMD will look)
    http://www.tomshardware.com/reviews/fx-8350-visher...
    That's a 30% difference and an 8350FX is far faster than an A8-5600 Ian recommends here. Chizow is even more right if you toss in Ian's recommendation of an even slower cpu than 8350 vs. Intel's stuff. Even in skyrim at 1680x1050 they separate from 90fps to 68fps for 8350fx. So until you completely tap out your gpu (1440p and up which basically requires 2+ cards) you will notice if your cpu is junk or not. Since this article is only written for apparently 1.25% of the readership (or world for that matter according to steam survey), you will notice the cpu! Unless you're raising your hand as the 1.25% :) I don't call 30-100% faster marginal improvements do you? Add CIV 5 also which this site even proves in this article ;) At least they got something right.
    Reply
  • TheJian - Thursday, June 06, 2013 - link

    http://www.tomshardware.com/reviews/a10-6700-a10-6...
    Check the toms A10-6800 review. With only a 6670D card the i3-3220 STOMPS the A10-6800k with the same 6670 radeon card in 1080p in F1 2012. 68fps to 40fps is a lot right? Both chips are roughly $145. Skyrim shows the 6800k well, but you need 2133 memory to do it. But faster Intel cpu's will leave this in the dust with a better gpu anyway.

    http://www.guru3d.com/articles_pages/amd_a10_6800k...
    You can say 100fps is a lot in far cry2 (it is) but you can see how a faster cpu is NOT limiting the 580 GTX here as all resolutions run faster. The i7-4770 allows the GTX 580 to really stretch its legs to 183fps, and drops to 132fps at 1920x1200. The FX 8350 however is pegged at 104 for all 4 resolutions. Even a GTX 580 is held back, never mind what you'd be doing to a 7970ghz etc. All AMD cpu's here are limiting the 580GTX while the Intel's run up the fps. Sure there are gpu limited games, but I'd rather be using the chip that runs away from slower models when this isn't the case. From what all the data shows amongst various sites, you'll be caught with your pants down a lot more than anandtech is suggesting here. Hopefully that's enough games for everyone to see it's far more than Civ5, even with different cards affecting things. If both gpu sides double their gpu cores, we could have a real cpu shootout in many things at 1440p (and of course below this they will all spread widely even more than I've shown with many links/games).
    Reply
  • roedtogsvart - Tuesday, June 04, 2013 - link

    Hey Ian, how come no Nehalem or Lynnfield data points? There are a lot of us on these platforms who are looking at this data to weigh vs. the cost of a Haswell upgrade. With the ol' 775 geezers represented it was disappointing not to see 1366 or 1156. Superb work overall however! Reply
