Portal 2

A stalwart of the Source engine, Portal 2 was the big hit of 2011, following on from the original award-winning Portal.  In our testing suite, Portal 2 performance should be indicative of CS:GO performance to a certain extent.  Here we test Portal 2 at 1920x1080 with Very High graphical settings and 8xMSAA.

Portal 2 IGP, 1920x1080, Very High, 8xMSAA

Portal 2 mirrors our previous testing, although the frame-rate increases as a percentage are modest – moving from 1333 to 1600 gives a 4.3% increase, while 1333 to 2400 yields only 8.8%.
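The percentage figures quoted here are simple relative gains; a minimal sketch of the arithmetic (the FPS values in the example are hypothetical, not the article's raw data):

```python
def pct_gain(baseline_fps, test_fps):
    """Percentage frame-rate gain of test_fps over baseline_fps."""
    return (test_fps / baseline_fps - 1.0) * 100.0

# Hypothetical numbers for illustration only:
print(f"{pct_gain(40.0, 50.0):.1f}%")  # prints "25.0%"
```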

Batman Arkham Asylum

Released in 2009, Batman: AA uses Unreal Engine 3 to create what was called “the Most Critically Acclaimed Superhero Game Ever”, an accolade recognized in the Guinness World Records with an average review score of 91.67.  The game boasts several awards, including a BAFTA.  Here we use the in-game benchmark at the lowest specification settings without PhysX at 1920x1080.  Results are reported to the nearest FPS, so we perform four runs and average the final three, as the first run is sometimes up to 33% higher than normal.
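The discard-the-first-run averaging described above can be sketched as follows (the function name is our own, not part of AnandTech's tooling):

```python
def benchmark_average(fps_runs):
    """Average all runs except the first, which can read
    up to ~33% high (cold caches, shader compilation, etc.)."""
    if len(fps_runs) < 2:
        raise ValueError("need at least two runs")
    kept = fps_runs[1:]
    return sum(kept) / len(kept)

# Four runs: the inflated first result is discarded.
print(benchmark_average([80, 60, 61, 59]))  # prints 60.0
```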

Batman: AA IGP, 1920x1080, Ultra Low

Batman: AA represents some of the best increases of any application in our testing.  Jumps from 1333 C9 to 1600 C9 and then to 1866 C9 give an 8% and then a further 7% boost, culminating in a 21% increase in frame rates when moving from 1333 C9 to 2400 C10.

Overall IGP Results

Taking all our IGP results gives us the following graph:

The only game that beats the MemTweakIt predictions is Batman: AA, but most games follow a similar shape of increases, just scaled differently.  Bearing in mind the price differences between the kits, if IGP gaming is your goal then either the 1600 C9 or 1866 C9 kit seems best in terms of bang-for-buck, but 2133 C9 will provide extra performance if the budget stretches that far.
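One way to frame the bang-for-buck argument is performance per dollar; the prices and scores below are purely illustrative placeholders, not figures from this review:

```python
# Illustrative kit prices ($) and normalized IGP scores (1333 C9 = 100):
kits = {
    "1333 C9":  (45.0, 100.0),
    "1600 C9":  (50.0, 104.0),
    "1866 C9":  (60.0, 108.0),
    "2133 C9":  (80.0, 111.0),
    "2400 C10": (110.0, 113.0),
}

# Higher points-per-dollar means better value for IGP gaming.
for name, (price, score) in kits.items():
    print(f"{name}: {score / price:.2f} points per dollar")
```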


108 Comments


  • rscoot - Thursday, October 18, 2012 - link

    I remember paying upwards of $400 for a pair of matched 2x512MB Kingston HyperX modules with BH-5 chips. Those were the days! 300MHz at 2-2-2-5 1T in dual channel if you could put enough volts through them. Nowadays I don't think memory matters nearly as much as it did back then. Reply
  • superflex - Thursday, October 18, 2012 - link

    Your first kit was an E6400?
    Let me know when you get hair down there.
    My first computer was an Apple IIe in 1984, and my first build was an Opteron 170 with 400 MHz 2,2,2,5 DDR.
    Reply
  • Magnus101 - Thursday, October 18, 2012 - link

    Once again this only confirms that memory speed makes no real world difference.
    I mean, who in their right mind uses the integrated GPU on an expensive i7 system to play Metro 2033 at single-digit frame rates?
    The only thing standing out is the WinRAR compression, but how many people use WinRAR for compression?
    Yes, decompressing files is very common, but I only remember using it 2-3 times in my whole life to compress my own files.
    So that isn't important to most users, except for the ones who actually use WinRAR to compress files.
    And I don't get why the x264 encoding seemed like a big deal. The differences were very small.

    It's been the same story all the way back to the late 90s, where tests between SDR memory at 100 and 133 MHz, or at different timings, showed no differences in real-life applications, in contrast to synthetics.

    But sure, if you are building a new system and choosing between, let's say, 1333 or 1600, then a $5 difference is a no-brainer.
    Then again, it would make no noticeable difference anyway.
    Reply
  • silverblue - Thursday, October 18, 2012 - link

    Here's one - will it affect QuickSync in any way? Reply
  • twoodpecker - Monday, October 22, 2012 - link

    I'd be interested in QuickSync results too. In my experience (though not proven), it makes a big difference. I adjusted my memory speeds from 1600 to 2000 and noticed at some point that encoding ran at 25x instead of 15x. This might be due to other factors though, like software optimizations, because I didn't benchmark right after adjusting the memory speeds. Reply
  • Geofram - Thursday, October 18, 2012 - link

    I don't believe he's implying that single-digit frame rates in a game are going to be real-life usable for anyone. I believe the point of the test was simply: "Let's take a system that is generally fast and put it in a situation where the IGP is being stressed. This will be the best-case scenario for faster RAM helping it. Let's see if it does".

    To me, the idea was not to show everyone everyday situations where faster RAM will help them; instead, it was to see where those situations might lie, by setting up a stressful scenario and seeing the results. Most of the results were extremely small differences.

    I agree it's not a noticeable difference in most cases. It doesn't make me feel like I should get rid of PC1333 RAM. I don't fault the logic behind the tests, however. It was nice to see someone actually comparing the slight differences caused by RAM speed.
    Reply
  • vegemeister - Friday, October 19, 2012 - link

    Most of the (still tiny) difference that appeared in the x264 benchmark was in the first pass. Two pass encodes really only make sense when you're trying to fit a single video onto a single storage device. That's an extremely uncommon use case these days, for everyone but the people mastering blu-rays. Reply
  • jonyah - Thursday, October 18, 2012 - link

    "I remember buying my first memory kit ever. It was a 4GB kit of OCZ DDR2 for my brand new E6400 system, and at the time I paid ~$240, sometime back in 2005."

    I remember buying my first kit too. It was an upgrade from the 2MB I had to 6MB (yes MB, not GB), and that 6MB cost me $200 as well, this was back in 1995. Ten years and we had a 1000x improvement in size and who knows how much in speed.
    Reply
  • rchris - Thursday, October 18, 2012 - link

    Well, dang it! All these "I remember..." comments have really made me feel old. In my case it was paying $300 for a used 1MB board for a Zenith Z100. Can't even remember the year--somewhere in the mid- to late-1980s. Reply
  • IanCutress - Thursday, October 18, 2012 - link

    I should point out that the kit I got was the first I purchased on my own... There were many computers before then, but they were either built by my family or came pre-built.

    On the topic of A10 comparisons, I had thought of doing some in the future if there is enough interest. As the majority of CPU sales are in Intel's favor, we went with Intel first. (Also, most of the testing for this review occurred before I had an A10 sample at hand.)

    Ian
    Reply
