Metro2033

Metro2033 is a DX11 benchmark that challenges any system running it at high-end settings.  Developed by 4A Games and released in March 2010, it includes a built-in DirectX 11 Frontline benchmark, which we use to test the hardware at 1920x1080 with full graphical settings.  Results are given as the average frame rate across four runs.
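The averaging step is plain arithmetic; as a minimal sketch (the frame-rate values below are placeholders for illustration, not measured results):

```python
def average_fps(runs):
    """Average frame rate across repeated benchmark runs."""
    return sum(runs) / len(runs)

# Four hypothetical Frontline runs for one memory kit (not measured data)
runs = [5.1, 5.3, 5.2, 5.2]
print(round(average_fps(runs), 2))  # → 5.2
```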

Metro2033 IGP, 1920x1080, All except PhysX

While comparing results in the 5 FPS range may not seem meaningful, it taxes the system to its fullest, exposing whether memory actually makes a difference at this high end or whether we are bound by computation.  What we do see is a gradual increase in frame rate with each kit, up to a 10% difference between the top-end and bottom kits.  The pivotal jump is from 1333 to 1866; beyond 1866 the increases are smaller despite the increased cost of those kits.

Civilization V

Civilization V is a strategy video game that utilizes a significant number of the latest GPU features and software advances.  Using the in-game benchmark, we run Civilization V at 1920x1080 with full graphical settings, mirroring Ryan's GPU testing methodology.  The benchmark reports the total number of frames rendered in sixty seconds, which we normalize to frames per second.
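The normalization is a direct division by the sixty-second window; a small sketch (the frame count below is illustrative only):

```python
def frames_to_fps(total_frames, window_seconds=60):
    """Convert a total frame count over a fixed window to frames per second."""
    return total_frames / window_seconds

# e.g. a hypothetical 1,320 frames rendered over the sixty-second benchmark
print(frames_to_fps(1320))  # → 22.0
```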

Civilization V IGP, 1920x1080 High Settings

In comparison to Metro2033, Civilization V does not show a large percentage increase from faster memory, moving from 3% to 6.7% up through the memory kits.  Again we run this test with all the eye candy enabled, to stress the CPU and IGP as much as we can and find out where faster memory helps.

Dirt 3

Dirt 3 is a rally racing game developed and published by Codemasters, and the third entry in the Dirt series, itself a continuation of the Colin McRae Rally franchise.  Using the in-game benchmark, Dirt 3 is run at 1920x1080 with Ultra Low graphical settings.  Results are reported as the average frame rate across four runs.

Dirt 3 IGP, 1920x1080, Ultra Low Settings

In contrast to our previous tests, we run this one at 1080p with ultra-low graphical settings.  This allows for more applicable frame rates, where the focus is on processing pixels rather than post-processing effects.  In previous motherboard testing, we have seen that Dirt 3 seems to love every form of speed increase possible: CPU speed, GPU speed, and, as we can see here, memory speed.  Almost every upgrade to the system will give a better frame rate.  Moving from 1333 to 1600 gives almost a 10% FPS increase, whereas 1333 to 1866 gives just under 15%.  We peak at 15% with the 2133 kit, which reinforces the idea that choosing a 1600 C9 kit over a 1333 C9 kit is a no-brainer for the price difference.  The 1866 C9 kit also looks like a good buy, but the 2133 C9 kit is running into diminishing returns.
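The scaling figures above are simple percentage gains over the 1333 baseline; a minimal sketch, with placeholder frame rates chosen only to match the rough 10%/15% relationships described, not measured results:

```python
def percent_gain(baseline_fps, kit_fps):
    """Percentage FPS gain of a memory kit over the baseline kit."""
    return (kit_fps / baseline_fps - 1) * 100

# Hypothetical average FPS per kit (illustrative, not measured data)
baseline = 40.0  # DDR3-1333
for speed, fps in [("1600", 44.0), ("1866", 45.8), ("2133", 46.0)]:
    print(f"{speed}: +{percent_gain(baseline, fps):.1f}%")
# prints 1600: +10.0%, 1866: +14.5%, 2133: +15.0%
```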


114 Comments


  • Mitch101 - Thursday, October 18, 2012 - link

    Love this article first time I ever commented on one. I believe you see little improvement past 1600/1866 because the Intel chips on die cache do a good job of keeping the CPU fed. Meaning the bottleneck on an Intel chip is the CPU itself not the memory or cache.

    Can you do this with an AMD chip also? I believe we would see a bigger improvement with their chips because the on-die cache can't keep up with the chip, and faster external memory would give bigger performance jumps for AMD chips. Well, maybe 2 generations ago AMD, but let's see - your pockets are deeper than mine.

    Hope I said that right I'm a little droopy eyed from lack of caffeine.
  • Jjoshua2 - Thursday, October 18, 2012 - link

    Just bought RipjawsZ from Newegg for $90 after coupon! I feel vindicated in my choice now :)
  • ludikraut - Thursday, October 18, 2012 - link

    I thought the performance difference would be less than it was. Has me rethinking whether I need to update my old OCZ DDR3-1333 chips. I haven't yet, as I'm probably giving away 5-10% performance in my OC alone. I targeted efficiency, not absolute speed - at 4GHz my i7-920 D0 consumes 80W less @ idle than the default settings of my mobo - go figure.

    l8r)
  • Beenthere - Thursday, October 18, 2012 - link

    For typical desktop use with RAM frequencies of 1333 MHz. and higher there is no tangible gains in SYSTEM performance to justify paying a premium for higher RAM frequency, increased capacity above 4 GB. or lower latencies - with APUs being the minor exception.

    In real apps, not synthetic benches, there is simply nothing of significance to be gained in system performance above 1333 MHz. as DDR3 running at 1333 MHz. is not a system bottleneck. Synthetic benches exaggerate any real gains so they are quite misleading and should be ignored.
  • tynopik - Thursday, October 18, 2012 - link

    WinRAR is a 'real' app
  • silverblue - Thursday, October 18, 2012 - link

    It's okay, he said the same thing on Xbit Labs.
  • VoraciousGorak - Thursday, October 18, 2012 - link

    "For typical desktop use with RAM frequencies of 1333 MHz. and higher there is no tangible gains in SYSTEM performance to justify paying a premium for higher RAM frequency, increased capacity above 4 GB. or lower latencies - with APUs being the minor exception."

    No tangible gains above four gi-... what industries have you worked in? Because my old AdWords PPC company's software benefited from over 4GB, and that's the lightest workload I've had on a computer in a while. For home use, I just bumped my system to 16GB because I kept capping my 8GB, and I do zero video/photo work. If you just do word processing, I'll trade you a nice netbook with a VGA out for whatever you're using now.

    DDR3-1333 to 1600 is almost the same price on Newegg, and 1866 isn't much more. Think about it in percentage cost of your computer. Using current Newegg prices for 2x4GB CL9 DDR3, a $1000 computer with 8GB DDR3-1333 will cost $1002 with DDR3-1600, $1011 with DDR3-1866, and $1025 with DDR3-2133. Not exactly a crushing difference.
  • Olaf van der Spek - Thursday, October 18, 2012 - link

    Why isn't XMP enabled by default? The BIOS should know what the CPU supports, shouldn't it?
  • Gigaplex - Thursday, October 18, 2012 - link

    What this article glosses over is that G.Skill memory often recommends manually increasing the voltages when enabling XMP profiles. I have the F3-1866C10D-16GAB kit and G.Skill recommends pushing the memory controller voltage out of spec for Ivy Bridge in order to enable XMP. As a result I just run them at 1333 (they don't have 1600 timings in the SPD table and I can't be bothered experimenting to find a stable setting).
  • IanCutress - Friday, October 19, 2012 - link

    I did not have to adjust the voltage once on any of these kits. If anything, what you are experiencing is more related to the motherboard manufacturer. Some manufacturers have preferred memory vendors, of which G.Skill may not be one. In that case you either have to use work arounds to make kits work, or wait for a motherboard BIOS update. If you have read any of my X79 or Z77 reviews, you will see that some boards do not like my 2400 C9 kit that I use for testing at XMP without a little voltage boost. But on the ASUS P8Z77-V Premium, all these kits worked fine at XMP, without issue.

    Ian
