Gaming Performance

We're testing with a handful of new titles in today's review, many of which are far from CPU bound even at relatively low (by today's standards) resolutions. For our S.T.A.L.K.E.R. and Supreme Commander tests we had to reduce the in-game resolution to 1024 x 768, while Rainbow Six: Vegas and Lost Planet both required 800 x 600 in order to produce measurable differences between these CPUs.

On the one hand, this is good news for those looking to build gaming PCs around lower end processors. On the other hand, it means that we have to test at less real-world settings in our CPU reviews in order to accurately compare the overall gaming performance of modern day CPUs. The CPU/GPU boundary pendulum will continuously swing from one end to the other; we're simply at a point today where even the almighty GeForce 8800 GTX can't run everything perfectly smoothly at 1600 x 1200.

As CPUs and GPUs converge, games will undoubtedly become even more compute bound, but it's difficult to predict what effect this will have (if any) on the balance between sequential and highly parallel general purpose processing.

Our first 3D game test is our walkthrough of Bruma in the popular RPG Oblivion. This test was run at 1600 x 1200 with Very High quality defaults selected from Oblivion's launcher. FRAPS was used in this benchmark:

Gaming Performance - Oblivion

Oblivion was one of the two benchmarks that showed a significant performance improvement due to the faster 1333MHz FSB. Looking at the E6420 vs. 5600+ comparison, AMD actually pulls ahead here thanks to its aggressive pricing.
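For readers curious how the numbers behind these charts come together: FRAPS works by timing each rendered frame, from which an average (and minimum) framerate for the run can be derived. A minimal sketch of that arithmetic, using made-up frame times rather than our actual logs:

```python
# Hypothetical sketch: deriving average and minimum FPS from a list of
# per-frame render times in milliseconds (the kind of data FRAPS can log).
def fps_stats(frame_times_ms):
    """Return (average_fps, minimum_fps) for a benchmark run."""
    total_s = sum(frame_times_ms) / 1000.0       # total run time in seconds
    average_fps = len(frame_times_ms) / total_s  # frames rendered / elapsed time
    minimum_fps = 1000.0 / max(frame_times_ms)   # slowest single frame
    return average_fps, minimum_fps

# Example: four quick 10ms frames plus one 50ms hitch
avg, low = fps_stats([10, 10, 10, 10, 50])
```

Note that the average is weighted by time, not by frame, which is why a single long hitch drags the minimum down far more than the average.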

We ran Half Life 2: Episode One at 1600 x 1200, with all settings at their maximum values with the exception of AA/anisotropic filtering, which we left disabled.

Gaming Performance - Half Life 2: Episode One

A small improvement for the 1333MHz FSB, and AMD continues to win the performance battle at ~$180.

We ran Prey at 1600 x 1200 with High Quality textures, all detail settings at their highest options, no AA, and 8X anisotropic filtering:

Gaming Performance - Prey

S.T.A.L.K.E.R. was tested at 1024 x 768 with full dynamic lighting enabled and high quality detail settings:

Gaming Performance - S.T.A.L.K.E.R.

We ran Supreme Commander at 1024 x 768 with medium quality presets. We ran a subset of the built-in performance test; specifically, we used only the third performance test in the script, as it was the most CPU bound.

Gaming Performance - Supreme Commander

Rainbow Six: Vegas proved to be particularly GPU bound, even at 800 x 600. We left most detail options enabled/high, with the exception of the eye effect setting.

Gaming Performance - Rainbow Six: Vegas

Capcom's Lost Planet demo is available in both DX9 and DX10 flavors, but for this review we used the DX9 version given that we've not been able to find any real benefit to running the DX10 version. Just like RS:V, we had to run Lost Planet at 800 x 600 with a mixture of high/medium quality settings:

Gaming Performance - Lost Planet Snow Benchmark DX9

Gaming Performance - Lost Planet Cave Benchmark DX9

Comments

  • zsdersw - Monday, June 25, 2007 - link

    That's the P35 chipset. The article coldpower was pointing to is P965, and he qualified his statement with "here", meaning "in the article mentioned".
  • TA152H - Monday, June 25, 2007 - link

    So you think that's relevant? People are going to buy 1333 FSB for the P965???? Again, are you crazy? P965 doesn't even support 1333 officially. P35 is what's important.
  • zsdersw - Tuesday, June 26, 2007 - link

    And besides.. the marginal improvement in overall system performance that P35 brings to the table doesn't prove or reliably suggest that Core 2 is particularly dependent on memory bandwidth or speed.
  • TA152H - Tuesday, June 26, 2007 - link

    You're seriously confused.

    Most of the information out now shows that you get pretty good performance with higher performance memory running at high clock speeds, especially for DDR3. It's now becoming common knowledge. But, they test DDR2-800 for some reason. To really see the performance of 1333 FSB, they should be using it with the proper memory instead of obsolete memory running at inadequate clock speeds. Luckily, there is another site that promised to do that in the very near future. Why they couldn't figure that out here is a mystery to me though, it kind of hits one in the face.
  • zsdersw - Tuesday, June 26, 2007 - link

    That's the expectation: higher performance with memory running at higher speeds. None of that suggests that Core 2's performance hinges upon extracting more and more out of the memory/chipset, though.
  • zsdersw - Tuesday, June 26, 2007 - link

    .. or, rather, that Core 2's performance depends on extracting more and more out of the memory/chipset.
  • zsdersw - Tuesday, June 26, 2007 - link

    All I'm saying is that you're barking up the wrong tree. coldpower's reference was to the P965, and then you started talking about P35 as if it had something to do with the results of the P965. They're separate.
  • TA152H - Tuesday, June 26, 2007 - link

    Are you unable to understand things in context?? Or are you arguing just to argue?

    The P965 is irrelevant, therefore his post is irrelevant, and therefore he has no point. The P965 doesn't matter for FSB 1333 processors, the P35 does.

    My point was that they should be running memory at 1333 speeds, which means the P35. He brought up some nonsense that was irrelevant, and now you think that it was, and the P35 isn't. It's like the Twilight Zone.
  • coldpower27 - Wednesday, June 27, 2007 - link

    No, my post is completely relevant. If you're going to argue about official support for 1.33GHz FSB processors on the P965, then DDR3-1333 is ruled out due to it not being officially supported by the P35 Express chipset; the only chipset with official support for that is X38.

    If you need to prod others, then I believe you're the one who can't stand losing an argument.
  • zsdersw - Tuesday, June 26, 2007 - link

    An established chipset on which the Core 2 processors run is not irrelevant to the issue he was addressing: Core 2 performance vis-a-vis memory bandwidth/speed.
