Metro2033

Metro2033 is a DX11 title that challenges any system trying to run it at high-end settings.  Developed by 4A Games and released in March 2010, the game includes a built-in DirectX 11 Frontline benchmark, which we use to test the hardware at 1920x1080 with full graphical settings.  Results are given as the average frame rate from four runs.
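
As a rough illustration of how that reported number is produced (the per-run figures below are hypothetical placeholders, not our measured data), the benchmark is simply run four times and the results averaged:

    # Average FPS across four benchmark runs (per-run values are hypothetical)
    runs_fps = [38.2, 37.9, 38.5, 38.0]
    average_fps = sum(runs_fps) / len(runs_fps)
    print(f"Average over {len(runs_fps)} runs: {average_fps:.2f} FPS")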

Metro2033 IGP, 1920x1080, All except PhysX

Against DDR3-1333, the GeIL kit we are testing today offers a 7.67% increase in frame rates in our Metro2033 test on the IGP.  This falls between the 6.86% increase from the 1866 C9 kit and the 8.87% increase from the 2133 C9 kit.  Arguably we could conclude that Metro likes lower CL numbers, as the MemTweakIt score suggests that our kit should perform similarly to the 2133 C9 kit.
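
For clarity, the percentage gains quoted throughout are simply the kit's average frame rate measured against the DDR3-1333 C9 baseline.  A minimal sketch of that calculation follows; the absolute FPS values are hypothetical, chosen only so the arithmetic reproduces the 7.67% figure:

    # Percentage gain over the DDR3-1333 baseline (absolute FPS values are placeholders)
    baseline_fps = 30.00   # hypothetical DDR3-1333 C9 result
    kit_fps = 32.30        # hypothetical GeIL DDR3-2400 result
    gain_pct = (kit_fps - baseline_fps) / baseline_fps * 100
    print(f"Gain over DDR3-1333: {gain_pct:.2f}%")  # prints 7.67%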

Civilization V

Civilization V is a strategy video game that takes advantage of a significant number of recent GPU features and software advances.  Using the in-game benchmark, we run Civilization V at 1920x1080 with full graphical settings, mirroring the settings Ryan uses in his GPU testing.  The benchmark reports the total number of frames rendered in sixty seconds, which we normalize to frames per second.
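
Because the benchmark reports a raw frame count over a fixed sixty-second run, the normalization is a single division; the frame count below is a hypothetical example rather than a measured result:

    # Convert a 60-second total frame count to frames per second (count is hypothetical)
    total_frames = 2100
    run_length_s = 60
    fps = total_frames / run_length_s
    print(f"Normalized result: {fps:.1f} FPS")  # 35.0 FPS in this example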

Civilization V IGP, 1920x1080 High Settings

Civilization V shows similar behavior to Metro2033, but this time we see only a 5.29% increase over DDR3-1333.  The kit again performs more like an 1866 C9 kit than a 2133 C9 kit.

Dirt 3

Dirt 3 is a rally racing game, the third entry in the Dirt branch of the Colin McRae Rally series, developed and published by Codemasters.  Using the in-game benchmark, Dirt 3 is run at 1920x1080 with Ultra Low graphical settings.  Results are reported as the average frame rate across four runs.

Dirt 3 IGP, 1920x1080, Ultra Low Settings

Even though in the past we have shown Dirt 3 to love CPU MHz and GPU power, when it comes to memory the results seem to top out around the 1866 C9 mark.  All the kits between 1866 C9 and 2400 C10 give an 8-10% performance boost over 1333 C9 on the IGP.

Comments

  • Beenthere - Wednesday, October 24, 2012 - link

    Can't change the typo from 8166 MHz to the proper 1866 MHz, but most folks should be able to figure it out...
  • silverblue - Thursday, October 25, 2012 - link

    Of course, if you have an APU-based system, the faster memory does indeed make a difference... though I agree, it's the exception rather than the norm.
  • JlHADJOE - Thursday, October 25, 2012 - link

    But then it's totally contrary to one of the main reasons behind having an APU -- penny pinching.

    These kits cost twice the DDR3-1333 going rate, so that's $75 you could have put into a GPU. Can't speak for everyone, but I'd probably choose an i3 with DDR3-1333 + a 7750 over an A10-5800k with DDR3-2400.
  • JohnMD1022 - Wednesday, October 24, 2012 - link

    My thoughts exactly.

    1600 seems to be the sweet spot on price and performance.
  • PseudoKnight - Wednesday, October 24, 2012 - link

    Anandtech did a series of memory frequency tests like a year ago (I forget exactly). While they found that 1333 to 1600 didn't offer much in terms of average FPS gains in gaming, it had a clearer impact on minimum frame rates. I'm not saying it's worth it either way here, but I'd like people to give some attention to minimum frame rates when talking about the benefits of bumps in memory frequency.

    That said, 2400 is obviously overkill here, but that should be obvious to anyone who wants to spend their money efficiently.
  • Impulses - Thursday, October 25, 2012 - link

    The article the did a year ago (with Sandy Bridge in mind) says absolutely nothing about minimum frame rates vs average... I don't even see how faster memory could have such an effect with a dedicated GPU.
  • Impulses - Thursday, October 25, 2012 - link

    *they
  • JlHADJOE - Thursday, October 25, 2012 - link

    It might have been techreport. They're the guys who usually do those frame-time measurements.
  • poohbear - Thursday, October 25, 2012 - link

    pseudoking, what are you talking about? There is virtually NO effect on minimum frames on a dedicated GPU system. Ever since the memory controller moved to the CPU, RAM timings have become a LOT less important in the system. The only way it shows a difference is when you go to all kinds of outlandish scenarios that isolate the GPU and CPU in situations contrived to show some difference between RAM kits, but in a real-world setting those situations are so rare that it becomes pointless to even entertain them.
  • Ratman6161 - Thursday, October 25, 2012 - link

    But add running virtual machines to your list of reasons why a lot of memory might be good. When working from home I've typically got the host machine, where I'm doing most of my actual work, plus at least two virtual machines running, each VPN'ed into a different remote network. So it isn't too uncommon for me to see about 90% of my 16 GB in use at any one time. And I do occasionally hit times when I have to shut down one VM in order to start another. So I wouldn't actually mind having 32 GB.

    On the other hand, while I need a large quantity of RAM, my 1600 MHz G-Skill works just fine performance wise so I don't need speed - I need quantity.
