Portal 2

A stalwart of the Source engine, Portal 2 was the big hit of 2011, following on from the original award-winning Portal. In our testing suite, Portal 2 performance should be indicative of CS:GO performance to a certain extent. Here we test Portal 2 at 1920x1080 with High/Very High graphical settings.

Portal 2 IGP, 1920x1080, Very High, 8xMSAA

Portal 2, like Dirt 3, quickly becomes limited by something other than memory. On our 2400 C11 kit we are within 0.05 FPS of the peak frame rate on the IGP.

Batman Arkham Asylum

Released in 2009, Batman: AA uses Unreal Engine 3 to create what was called “the Most Critically Acclaimed Superhero Game Ever”, an accolade recorded in the Guinness World Records book with an average reviewer score of 91.67. The game boasts several awards, including a BAFTA. Here we use the in-game benchmark at the lowest specification settings, without PhysX, at 1920x1080. Results are reported to the nearest FPS, so we take four runs and average the final three, as the first result is sometimes up to 33% higher than normal.
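The run-averaging procedure described above can be sketched in a few lines of Python (a minimal illustration; the function and variable names are our own, not from any benchmark tool):

```python
def benchmark_average(fps_results):
    """Average per-run FPS results, discarding the anomalous first run.

    The first pass is often inflated (sometimes +33% in this test) due to
    warm-up and caching effects, so we take four runs and average the rest.
    """
    if len(fps_results) < 2:
        raise ValueError("need at least two runs to discard the warm-up pass")
    steady_runs = fps_results[1:]  # drop the inflated first run
    return sum(steady_runs) / len(steady_runs)

# Example: the first run comes in ~33% higher than steady state
runs = [80, 60, 61, 59]
print(round(benchmark_average(runs)))  # reported to the nearest FPS
```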

Batman: AA IGP, 1920x1080, Ultra Low

As we saw in our memory overview, Batman: AA seems to love memory boosts when playing on the IGP, and using our 2400 C11 kit puts frame rates right up there with our top-performing kits.

Overall IGP

Taking all our IGP results gives us the following graph:

We see a distinct dichotomy in our testing. In some tests (Metro, Civilization, Dirt 3), despite the high-MHz modules, the looser CAS latency makes the 2400 C11 kit act more like the 1866 C9 kit. In contrast, Portal 2 and Batman: AA seem to prefer high MHz over a lower CL, and the kit performs more like the 2133 C9 and 2400 C10 kits.

Comments

  • Beenthere - Wednesday, October 24, 2012 - link

Can't change the typo from 8166 MHz to the proper 1866 MHz, but most folks should be able to figure it out...
  • silverblue - Thursday, October 25, 2012 - link

    Of course, if you have an APU-based system, the faster memory does indeed make a difference... though I agree, it's the exception rather than the norm.
  • JlHADJOE - Thursday, October 25, 2012 - link

But then it's totally contrary to one of the main reasons behind having an APU -- penny pinching.

    These kits cost twice the DDR3-1333 going rate, so that's $75 you could have put into a GPU. Can't speak for everyone, but I'd probably choose an i3 with DDR3-1333 + a 7750 over an A10-5800k with DDR3-2400.
  • JohnMD1022 - Wednesday, October 24, 2012 - link

    My thoughts exactly.

    1600 seems to be the sweet spot on price and performance.
  • PseudoKnight - Wednesday, October 24, 2012 - link

    Anandtech did a series of memory frequency tests like a year ago (I forget exactly). While they found that 1333 to 1600 didn't offer much in terms of average FPS gains in gaming, it had a clearer impact on minimum frame rates. I'm not saying it's worth it either way here, but I'd like people to give some attention to minimum frame rates when talking about the benefits of bumps in memory frequency.

    That said, 2400 is obviously overkill here, but that should be obvious to anyone who wants to spend their money efficiently.
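For readers weighing average against minimum frame rates as discussed in this thread, both metrics can be derived from a log of per-frame render times. The sketch below is purely illustrative (our own function names; not the methodology of any article mentioned above):

```python
def fps_metrics(frame_times_ms):
    """Compute average and minimum FPS from per-frame render times in ms."""
    # Average FPS is total frames over total elapsed time, not the mean of
    # instantaneous FPS values (which would overweight the fast frames).
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    # Minimum FPS corresponds to the slowest (longest) frame in the log.
    min_fps = 1000.0 / max(frame_times_ms)
    return avg_fps, min_fps

# A stutter (33.3 ms frame) barely moves the average but halves the minimum
avg, minimum = fps_metrics([16.7, 16.7, 33.3, 16.7])
```

This is why a memory change can show up in minimum frame rates even when averages look flat: a handful of slow frames dominate the minimum but get diluted in the average.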
  • Impulses - Thursday, October 25, 2012 - link

The article they did a year ago (with Sandy Bridge in mind) says absolutely nothing about minimum frame rates vs average... I don't even see how faster memory could have such an effect with a dedicated GPU.
  • JlHADJOE - Thursday, October 25, 2012 - link

    It might have been techreport. They're the guys who usually do those frame-time measurements.
  • poohbear - Thursday, October 25, 2012 - link

PseudoKnight, what are you talking about? There is virtually NO effect on minimum frames on a dedicated GPU system. Ever since the memory controller moved to the CPU, RAM timings have become a lot less important to overall performance. The only way to show a difference is to contrive all kinds of outlandish scenarios that isolate the GPU and CPU into situations where RAM kits diverge, but in a real-world setting those situations are so rare that it becomes pointless to even entertain them.
  • Ratman6161 - Thursday, October 25, 2012 - link

But add running virtual machines to your list of reasons why a lot of memory might be good. When working from home I've typically got the host machine, where I'm doing most of my actual work, plus at least two virtual machines running, each VPN'ed into a different remote network. So it isn't too uncommon for me to see about 90% of my 16 GB in use at any one time. And I do occasionally hit times when I have to shut down one VM in order to start another. So I wouldn't actually mind having 32 GB.

On the other hand, while I need a large quantity of RAM, my 1600 MHz G.Skill works just fine performance-wise, so I don't need speed - I need quantity.
