Gaming Performance

There's simply no better gaming CPU on the market today than Sandy Bridge. The Core i5 2500K and 2600K top the charts regardless of game. If you're building a new gaming box, you'll want a SNB in it.

Our Fallout 3 test is a quick FRAPS runthrough near the beginning of the game. We're running with a GeForce GTX 280 at 1680 x 1050 and medium quality defaults. There's no AA/AF enabled.

Fallout 3
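
A FRAPS run-through like this is reduced to a single number by averaging the logged frame times. A minimal sketch of that reduction, with hypothetical frame-time values (FRAPS records milliseconds per frame):

```python
# Hypothetical per-frame times in milliseconds, as FRAPS can log them.
frame_times_ms = [16.7, 18.2, 15.9, 21.4, 17.1]

def average_fps(frame_times_ms):
    """Average frame rate over a run: total frames / total seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

print(round(average_fps(frame_times_ms), 1))  # -> 56.0
```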

In testing Left 4 Dead we use a custom recorded timedemo. We run on a GeForce GTX 280 at 1680 x 1050 with all quality options set to high. No AA/AF enabled.

Left 4 Dead

Far Cry 2 ships with several built-in benchmarks. For this test we use the Playback (Action) demo at 1680 x 1050 in DX9 mode on a GTX 280. The game is set to medium defaults with performance options set to high.

Far Cry 2

Crysis Warhead also ships with a number of built-in benchmarks. Running on a GTX 280 at 1680 x 1050, we use the ambush timedemo with mainstream quality settings. Physics is set to enthusiast, however, to further stress the CPU.

Crysis Warhead

Our Dragon Age: Origins benchmark marks a shift to the Radeon HD 5870. From this point on, these games are run on our Bench refresh testbed under Windows 7 x64. Our benchmark here is the same one we ran in our integrated graphics tests - a quick FRAPS walkthrough inside a castle. The game is run at 1680 x 1050 with high quality and texture options.

Dragon Age: Origins

We're running Dawn of War II's internal benchmark at high quality defaults. Our GPU of choice is a Radeon HD 5870 running at 1680 x 1050.

Dawn of War II

Our World of Warcraft benchmark is a manual FRAPS runthrough on a lightly populated server with no other player-controlled characters around. The frame rates here are higher than you'd see in a real world scenario, but the relative comparison between CPUs is accurate.

We run on a Radeon HD 5870 at 1680 x 1050. We're using WoW's high quality defaults but with weather intensity turned down all the way.

World of Warcraft
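
Since the absolute WoW frame rates here are inflated, what matters is each chip's score relative to a baseline. A sketch of that normalization (CPU names and numbers are made up for illustration, not taken from our results):

```python
# Hypothetical average frame rates from the run-through (illustrative only).
results = {"CPU A": 118.0, "CPU B": 102.5, "CPU C": 89.0}

def relative_scores(results, baseline):
    """Express each CPU's frame rate as a percentage of the baseline's."""
    base = results[baseline]
    return {cpu: round(100.0 * fps / base, 1) for cpu, fps in results.items()}

print(relative_scores(results, "CPU C"))  # baseline scores 100.0
```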

For Starcraft II we're using our heavy CPU test. This is a playback of a 3v3 match where all players gather in the middle of the map for one large, unit-heavy battle. While GPU plays a role here, we're mostly CPU bound. The Radeon HD 5870 is running at 1024 x 768 at medium quality settings to make this an even more pure CPU benchmark.

Starcraft II

This is Civ V's built-in Late GameView benchmark, the newest addition to our gaming test suite. The benchmark outputs three scores: a full render score, a no-shadow render score and a no-render score. We present the first and the last, which act as a GPU and a CPU benchmark respectively.

We're running at 1680 x 1050 with all quality settings set to high. For this test we're using a brand new testbed with 8GB of memory and a GeForce GTX 580.

Civilization V: Late GameView Benchmark (Full Render)

Civilization V: Late GameView Benchmark (No Render)

283 Comments

  • CreativeStandard - Monday, January 3, 2011 - link

    PC Mag reports these new i7s only support up to DDR3-1333, but you are running faster. Is PC Mag wrong? What are the maximum supported memory speeds?
  • Akv - Monday, January 3, 2011 - link

    Is it true that it has embedded DRM ?
  • DanNeely - Monday, January 3, 2011 - link

    Only to the extent that, like all Intel Core 2 and later systems, it supports a TPM module to allow locking down servers in the enterprise market, and that the system *could* be used to implement consumer DRM at some hypothetical point in the future. But since consumer systems aren't sold with TPM modules, it has no impact on systems bought without one.
  • shabby - Monday, January 3, 2011 - link

    DRM is only on the H67 chipset, and it's basically just for watching movies on demand and nothing more.
  • Akv - Monday, January 3, 2011 - link

    Mmmhh... ok...

    Nevertheless, the Intel HD + H67 combination was already modest; if it has DRM on top of that, it becomes not particularly appealing.
  • marraco - Monday, January 3, 2011 - link

    Thanks for adding a Visual Studio compilation benchmark. (Although you omitted the 920.)
    It seems that neither SSDs nor better processors can do much about that annoying time waster. It doesn't matter how much money you throw at it.

    I'd also like to see SLI/3-way SLI/CrossFire performance, since the top cards are frequently CPU bottlenecked. How much better does it do relative to the i7 920? And with a good cooler at 5GHz?

    Note: you mention three video cards in the test setup, but which one is used in the benchmarks?
  • Anand Lal Shimpi - Monday, January 3, 2011 - link

    You're welcome on the VS compile benchmark. I'm going to keep playing with the test to see if I can use it in our SSD reviews going forward :)

    I want to do more GPU investigations but they'll have to wait until after CES.

    I've also updated the gaming performance page indicating what GPU was used in each game, as well as the settings for each game. Sorry, I just ran out of time last night and had to catch a flight early this morning for CES.

    Take care,
    Anand
  • c0d1f1ed - Monday, January 3, 2011 - link

    I wonder how this CPU scores with SwiftShader. The CPU part actually has more computing power than the GPU part. All that's lacking to really make it efficient at graphics is support for gather/scatter instructions. We could then have CPUs with more generic cores instead.
  • aapocketz - Monday, January 3, 2011 - link

    I have read that CPU overclocking is only available on P67 motherboards and that H67 motherboards cannot overclock the CPU, so you can either use the onboard graphics OR overclock? Is this true?

    "K-series SKUs get Intel’s HD Graphics 3000, while the non-K series SKUs are left with the lower HD Graphics 2000 GPU."

    What's the point of improving the graphics on the K series if pretty much everyone who buys one will have a P67 motherboard, which can't even access the GPU?

    Let me know if I am totally not reading this right...
  • MrCromulent - Monday, January 3, 2011 - link

    Great review as always, but on the HTPC page I would have liked a comparison of the deinterlacing quality of SD (480i/576i) and HD (1080i) material. ATI's onboard chips don't offer vector adaptive deinterlacing for 1080i material - can Intel do better?

    My HD5770 does a pretty fine job, but I want to lose the dedicated video card in my next HTPC.
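
On c0d1f1ed's gather/scatter point above: a gather loads values from arbitrary, non-contiguous addresses in one operation, which is exactly the access pattern software texture sampling produces (each pixel samples a different texel). A plain-Python illustration of the idea; a real hardware gather (e.g. AVX2's VGATHERDPS) does this for a whole vector of indices at once:

```python
# A "texture" flattened into a 1-D list, and per-pixel texel indices.
texture = [10, 20, 30, 40, 50, 60, 70, 80]
indices = [5, 0, 3, 3]   # non-contiguous, possibly repeated addresses

def gather(data, indices):
    """Read data[i] for every i in indices -- one logical operation."""
    return [data[i] for i in indices]

print(gather(texture, indices))  # -> [60, 10, 40, 40]
```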
