Gaming Performance

There's simply no better gaming CPU on the market today than Sandy Bridge. The Core i5 2500K and 2600K top the charts regardless of game. If you're building a new gaming box, you'll want a SNB in it.

Our Fallout 3 test is a quick FRAPS runthrough near the beginning of the game. We're running with a GeForce GTX 280 at 1680 x 1050 and medium quality defaults. There's no AA/AF enabled.

Fallout 3
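
For readers curious how a FRAPS runthrough becomes a single number: FRAPS logs a timestamp for every rendered frame, and the charted result is simply frame count divided by elapsed time. Below is a minimal sketch of that reduction (our own illustration, not part of the review's toolchain), assuming a two-column CSV of frame index and cumulative milliseconds, which is how FRAPS' per-frame benchmark log is laid out to the best of our knowledge.

```python
import csv
import sys

def summarize_frametimes(path):
    """Reduce a per-frame log (frame index, cumulative ms) to average and worst-case FPS."""
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            if len(row) >= 2:
                times_ms.append(float(row[1]))

    # Per-frame durations are the deltas between consecutive timestamps.
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg_fps = 1000.0 * len(deltas) / (times_ms[-1] - times_ms[0])
    min_fps = 1000.0 / max(deltas)  # the single slowest frame
    return avg_fps, min_fps

if __name__ == "__main__":
    avg_fps, min_fps = summarize_frametimes(sys.argv[1])
    print(f"average: {avg_fps:.1f} fps, worst frame: {min_fps:.1f} fps")
```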

In testing Left 4 Dead we use a custom recorded timedemo. We run on a GeForce GTX 280 at 1680 x 1050 with all quality options set to high. No AA/AF enabled.

Left 4 Dead

Far Cry 2 ships with several built-in benchmarks. For this test we use the Playback (Action) demo at 1680 x 1050 in DX9 mode on a GTX 280. The game is set to medium defaults with performance options set to high.

Far Cry 2

Crysis Warhead also ships with a number of built-in benchmarks. On a GTX 280 at 1680 x 1050 we run the ambush timedemo with mainstream quality settings. Physics, however, is set to enthusiast to further stress the CPU.

Crysis Warhead

Our Dragon Age: Origins benchmark marks a shift to the Radeon HD 5870. From this point on, the games are run on our Bench refresh testbed under Windows 7 x64. The benchmark itself is the same one we ran in our integrated graphics tests: a quick FRAPS walkthrough inside a castle. The game is run at 1680 x 1050 with high quality and texture options.

Dragon Age: Origins

We're running Dawn of War II's internal benchmark at high quality defaults. Our GPU of choice is a Radeon HD 5870 running at 1680 x 1050.

Dawn of War II

Our World of Warcraft benchmark is a manual FRAPS runthrough of a lightly populated server with no other player controlled characters around. The frame rates here are higher than you'd see in a real world scenario, but the relative comparison between CPUs is accurate.

We run on a Radeon HD 5870 at 1680 x 1050. We're using WoW's high quality defaults but with weather intensity turned down all the way.

World of Warcraft
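
Since the absolute numbers here are inflated, the useful reading is each chip's frame rate relative to the others. A minimal sketch of that normalization, using placeholder figures rather than our measured results:

```python
# Normalize results against a chosen baseline so chips can be compared as
# percentages instead of raw (inflated) frame rates.
# The figures below are placeholders, not the review's measured data.
results_fps = {
    "Core i7-2600K": 100.0,
    "Core i5-2500K": 95.0,
    "Core i7-880":   80.0,
}

baseline = "Core i7-880"  # arbitrary reference point

for cpu, fps in sorted(results_fps.items(), key=lambda kv: -kv[1]):
    relative = 100.0 * fps / results_fps[baseline]
    print(f"{cpu:<14} {fps:6.1f} fps  ({relative:.0f}% of {baseline})")
```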

For Starcraft II we're using our heavy CPU test. This is a playback of a 3v3 match where all players gather in the middle of the map for one large, unit-heavy battle. While GPU plays a role here, we're mostly CPU bound. The Radeon HD 5870 is running at 1024 x 768 at medium quality settings to make this an even more pure CPU benchmark.

Starcraft II
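
Dropping the resolution to 1024 x 768 is the standard way to take the GPU out of the equation: if the frame rate barely changes as resolution falls, the graphics card was never the bottleneck. A rough sketch of that sanity check (the 10% tolerance is an arbitrary illustration, not a figure from our testing):

```python
def likely_cpu_bound(fps_low_res, fps_high_res, tolerance=0.10):
    """If lowering the resolution barely raises the frame rate, the GPU wasn't
    the limiter to begin with, so the test is effectively CPU bound.
    `tolerance` is the fractional FPS gain we're willing to write off as noise."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return gain <= tolerance

# Hypothetical numbers, not measurements from this review:
print(likely_cpu_bound(fps_low_res=62.0, fps_high_res=60.0))   # True  -> CPU bound
print(likely_cpu_bound(fps_low_res=110.0, fps_high_res=60.0))  # False -> the GPU was limiting
```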

This is Civ V's built-in Late GameView benchmark, the newest addition to our gaming test suite. The benchmark outputs three scores: a full render score, a no-shadow render score and a no-render score. We present the first and the last, which act as a GPU and a CPU benchmark, respectively.

We're running at 1680 x 1050 with all quality settings set to high. For this test we're using a brand new testbed with 8GB of memory and a GeForce GTX 580.

Civilization V: Late GameView Benchmark (Full Render)

Civilization V: Late GameView Benchmark (No Render)

Comments

  • aviat72 - Tuesday, January 4, 2011 - link

    Though SB will be great for some applications, there are still rough edges in terms of the overall platform. I think it will be best to wait for SNB-E or at least the Z68. SNB-E seems to be the best future-proofing bet.

    I also wonder how a part rated for 95W TDP was drawing 111W in the 4.4GHz OC (the Power Consumption Page). SB's power budget controller must be really smart to allow the higher performance without throttling down, assuming your cooling system can manage the thermals.
  • marraco - Tuesday, January 4, 2011 - link

    I wish to know more about this Sandy Bridge "feature":

    http://www.theinquirer.net/inquirer/news/1934536/i...
  • PeterO - Tuesday, January 4, 2011 - link

    Anand, Thanks for the great schooling and deep test results -- something surely representing an enormous amount of time to write, produce, and massage within Intel's bumped-forward official announcement date.

    Here's a crazy work-around question:

    Can I have my Quick Sync cake and eat my Single-monitor-with-Discrete-Graphics-card too if I, say:

    1). set my discrete card output to mirror Sandy Bridge's IGP display output;

    2). and, (should something exist), add some kind of signal loopback adapter to the IGP port to spoof the presence of a monitor? A null modem, of sorts?

    -- I have absolutely no mobo/video signaling background, so my idea may be laugh in my face funny to anybody who does but I figure it's worth a post, if only for your entertainment. :)
  • Hrel - Wednesday, January 5, 2011 - link

    It makes me SO angry when Intel does stupid shit like disabling HT on most of their CPUs even though the damn CPU already has it on there, already paid for. It literally wouldn't cost them ANYTHING to turn HT on for those CPUs, yet the greedy bastards don't do it.
  • Moizy - Wednesday, January 5, 2011 - link

    The HD Graphics 3000 performance is pretty impressive, but won't be utilized by most. Most who utilize Intel desktop graphics will be using the HD Graphics 2000, which is okay, but I ran back to the AMD Brazos performance review to get some comparisons.

    In Modern Warfare 2, at 1024 x 768, the new Intel HD Graphics 2000 in the Core i3 2100 barely bests the E-350. Hmm--that's when it's coupled with a full-powered, hyper-threaded desktop compute core that would run circles around the compute side of the Brazos E-350, an 18w, ultra-thin chip.

    This either makes Intel's graphics less impressive, or AMD's more impressive. For me, I'm more impressed with the graphics power in the 18w Brazos chip, and I'm very excited by what mainstream Llano desktop chips (65w - 95w) will bring, graphics-wise. Should be the perfect HTPC solution, all on the CPU (ahem, APU, I mean).

    I'm very impressed with Intel's video transcoding, however. Makes CUDA seem...less impressive, like a bunch of whoop-la. Scary what Intel can do when it decides that it cares about doing it.
  • andywuwei - Wednesday, January 5, 2011 - link

    not sure if anybody else noticed. CPU temp of the i5@3.2GHz is ~140 degrees. any idea why it is so high?
  • SantaAna12 - Wednesday, January 5, 2011 - link

    Did I miss the part where you tell us about the DRM built into this chip?
  • Cb422 - Wednesday, January 5, 2011 - link

    When will Sandy Bridge be available on Newegg or Amazon for me to purchase?
  • DesktopMan - Thursday, January 6, 2011 - link

    Very disappointed in the lack of VT-d and TXT on the K-variants. They are, after all, the high-end products. I also find it very peculiar that only the K-variants have the faster GPU, as those are the CPUs most likely to be paired with a discrete GPU.
  • RagingDragon - Thursday, January 6, 2011 - link

    Agreed. I find the exclusion of VT-d particularly irritating: many of the overclockers and enthusiasts to whom the K chips are marketed also use virtualization. Though I don't expect many enthusiasts, if any, to miss TXT (it's more for locked down corporate systems, media appliances, game consoles, etc.).

    With the Z68 chipset coming in the indeterminate near future, the faster GPU on K chips would have made sense if the K chips came with every other feature enabled (i.e. if they were the "do everything" chips).

    Also, I'd like to have the Sandy Bridge video encode/decode features separate from the GPU functionality - i.e. I'd like to choose between Intel and Nvidia/AMD video decode/encode when using a discrete GPU.
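
On aviat72's TDP question above: TDP is a thermal design guideline for stock operation, not a hard electrical ceiling, and dynamic power grows roughly linearly with clock speed and with the square of core voltage. A back-of-the-envelope sketch of that scaling (the voltages are assumed purely for illustration, not measured values):

```python
# Rough dynamic power scaling: P ~ C * V^2 * f.
# Treating the 95W rating as the stock dynamic power is itself a simplification.
stock_power_w  = 95.0   # rated TDP at stock operation
stock_freq_ghz = 3.4    # Core i7-2600K stock clock
stock_vcore    = 1.20   # assumed stock voltage (illustrative)

oc_freq_ghz = 4.4
oc_vcore    = 1.30      # assumed overclocked voltage (illustrative)

scaled = stock_power_w * (oc_freq_ghz / stock_freq_ghz) * (oc_vcore / stock_vcore) ** 2
print(f"estimated power at {oc_freq_ghz} GHz: ~{scaled:.0f} W")
# ~144 W with these assumptions -- well above the 95 W rating, which is why an
# overclocked chip can exceed its TDP without throttling as long as cooling keeps up.
```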
