Gaming Performance

There's simply no better gaming CPU on the market today than Sandy Bridge. The Core i5 2500K and Core i7 2600K top the charts regardless of the game. If you're building a new gaming box, you'll want an SNB in it.

Our Fallout 3 test is a quick FRAPS runthrough near the beginning of the game. We're running a GeForce GTX 280 at 1680 x 1050 with medium quality defaults. No AA/AF is enabled.
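
As a rough illustration of what these FRAPS runs boil down to, here's a minimal sketch of turning a per-frame frame-time log into an average frame rate. The file name and the one-value-per-line format are assumptions for the example, not FRAPS's actual CSV layout:

```python
# Minimal sketch: average FPS from a log of per-frame render times,
# one value per line, in milliseconds. The file name and format are
# assumed for illustration; FRAPS's real frametimes CSV differs.
def average_fps(path: str) -> float:
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    # N frames rendered in sum(frame_ms) milliseconds
    return 1000.0 * len(frame_ms) / sum(frame_ms)

print(f"{average_fps('frametimes.log'):.1f} fps")
```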

Fallout 3

In testing Left 4 Dead we use a custom-recorded timedemo. We run on a GeForce GTX 280 at 1680 x 1050 with all quality options set to high. No AA/AF is enabled.

Left 4 Dead

Far Cry 2 ships with several built-in benchmarks. For this test we use the Playback (Action) demo at 1680 x 1050 in DX9 mode on a GTX 280. The game is set to medium defaults with performance options set to high.

Far Cry 2

Crysis Warhead also ships with a number of built-in benchmarks. Running on a GTX 280 at 1680 x 1050, we use the ambush timedemo at mainstream quality settings. Physics is set to enthusiast, however, to further stress the CPU.

Crysis Warhead

With Dragon Age: Origins we shift to a Radeon HD 5870. From this point on, the games are run on our Bench refresh testbed under Windows 7 x64. Our benchmark here is the same one we ran in our integrated graphics tests: a quick FRAPS walkthrough inside a castle. The game is run at 1680 x 1050 with high quality and texture options.

Dragon Age: Origins

We're running Dawn of War II's internal benchmark at high quality defaults. Our GPU of choice is a Radeon HD 5870 running at 1680 x 1050.

Dawn of War II

Our World of Warcraft benchmark is a manual FRAPS runthrough on a lightly populated server with no other player-controlled characters around. The frame rates here are higher than you'd see in a real-world scenario, but the relative comparison between CPUs is accurate.

We run on a Radeon HD 5870 at 1680 x 1050. We're using WoW's high quality defaults but with weather intensity turned down all the way.
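
Since it's the relative standings that matter in the WoW numbers, reading a chart like this amounts to normalizing each result against a baseline chip. A minimal sketch, using hypothetical frame rates rather than measured results:

```python
# Minimal sketch: expressing each CPU's average FPS relative to a
# baseline part. All names and frame rates below are hypothetical
# placeholders, not measured results.
results = {
    "CPU A": 105.0,
    "CPU B": 98.0,
    "CPU C": 79.0,
}
baseline = "CPU C"
for cpu, fps in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cpu}: {fps / results[baseline]:.2f}x the baseline")
```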

World of Warcraft

For StarCraft II we're using our heavy CPU test: a playback of a 3v3 match where all players gather in the middle of the map for one large, unit-heavy battle. While the GPU plays a role here, we're mostly CPU bound. The Radeon HD 5870 runs at 1024 x 768 with medium quality settings to make this an even purer CPU benchmark; at such a light GPU load, the frame rate is limited by how quickly the CPU can step the simulation rather than by rendering.
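
A quick way to sanity-check the claim that a test is CPU bound is to see whether the frame rate moves when the GPU load is slashed. A minimal sketch of that logic, with hypothetical numbers:

```python
# Minimal sketch: if average FPS barely improves when the GPU load is
# slashed (lower resolution/quality), rendering wasn't the bottleneck;
# the CPU was. The frame rates below are hypothetical placeholders.
fps_heavy_gpu = 58.0   # e.g. measured at 1680 x 1050
fps_light_gpu = 61.0   # e.g. measured at 1024 x 768
gain = (fps_light_gpu - fps_heavy_gpu) / fps_heavy_gpu
if gain < 0.10:  # under ~10% gain despite a much lighter GPU load
    print("CPU bound: the simulation, not rendering, limits frame rate")
else:
    print("GPU bound at the higher settings")
```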

Starcraft II

This is Civ V's built-in Late GameView benchmark, the newest addition to our gaming test suite. The benchmark outputs three scores: a full render score, a no-shadow render score, and a no-render score. We present the first and the last, which act as a GPU benchmark and a CPU benchmark respectively.

We're running at 1680 x 1050 with all quality settings set to high. For this test we're using a brand-new testbed with 8GB of memory and a GeForce GTX 580.

Civilization V: Late GameView Benchmark (Full Render)

Civilization V: Late GameView Benchmark (No Render)

Comments
  • Loki726 - Monday, January 3, 2011

    Thanks a ton, Anand, for adding a compiler benchmark. I spend the vast majority of my time on builds, and this will help me spec out a few new machines. It's interesting to see results indicating that I should not go anywhere near a low-end Sandy Bridge system, and that a lot of cheap AMD cores might not be a bad idea.
  • estee - Monday, January 3, 2011

    Can't believe the 23.976Hz output bug is still in SB after all this time. Several years ago the G35 had this issue and Intel proclaimed they'd have a fix for it. Subsequently the G45 still had the problem, and even the iCores, but SB? C'mon... it's a big issue for HTPC buffs, because there's too much judder from 1) LCD displays and 2) the 3:2 cadencing of film-to-video conversion, so 1:1 (or rather 5:5 for most 120Hz sets) was a must for large-screen HTPC setups. Yes, the bitstreaming is good and all, but most folks are content with just 7.1 DD/DTS output. I guess we'll have to wait (again) for IB and cling on to my ol' nVidia 9300 for now. :(
  • mastrdrver - Monday, January 3, 2011

    I was just looking at the downloadable pictures, comparing them, and noticed a couple of differences. Maybe they're just a driver tweak, but I thought I remembered ATI and/or nVidia getting slammed in the past for pulling similar tactics.

    The first thing I noticed was when comparing the AA shots in COD. It appears that the Sandy Bridge graphics isn't applying AA to the twigs on the ground. Or is this just an appearance thing, where Intel might have a different algorithm that's causing this?

    The second is a little more obvious to me. In the Dirt 2 pictures I noticed that Sandy Bridge is blurring and not clearly rendering the distant objects. The sign on the right side is what caught my eye.

    One last thing is the DAO pictures. I've seen someone (in the past) post pictures of the same exact place in the game. The quality looked a lot better than what Anand has shown, and I was wondering if that is correct. I don't have the game, so I have no way to confirm.

    As always, Anand, I appreciate the time you and your staff take with all of your articles and the quality that results. It's just one of the reasons why I've found myself coming back here ever since the early years of your website.
  • RagingDragon - Monday, January 3, 2011

    Why don't K series parts get the full suite of virtualization features?
  • xxtypersxx - Monday, January 3, 2011

    Anand,
    Great review as always, I love the in-depth feature analysis that AnandTech provides.

    BIOS updates have been released for Gigabyte, Asus, and Intel P67 boards that correct an internal PLL overvolt issue that was artificially limiting overclocks. Users in the thread over at HWbot are reporting that processors that were stuck at 4.8GHz before are now hitting 5.4GHz.
    http://hwbot.org/forum/showthread.php?t=15952

    Would you be able to do a quick update on the overclocking results for your chips with the new BIOS updates?
  • Gothmoth - Monday, January 3, 2011

    ".....Sandy Bridge will be worth the upgrade for Quick Sync alone."

    You say that, and a few pages before you say it will not work on PCs with a discrete graphics card.

    I don't know about you, but video encoding here is done on performance systems, systems that have discrete GFX cards like a GTX 460 or better.

    And I think most enthusiasts will buy a P67 motherboard, and that would mean NO QUICK SYNC for them.

    So please do an update on your review and clarify what exactly happens when you use a P67 motherboard with a discrete GFX card.

    Will Quick Sync really not work...??
  • Gothmoth - Monday, January 3, 2011

    Please make clear how you tested Quick Sync in your review.

    I saw a few comments from people who are confused by your review.
    I guess you tested Quick Sync on an H67 motherboard, but I didn't notice that you mentioned that in the text.

    To me it looks like Intel is screwing the users who buy these first-generation Sandy Bridge chipsets.

    I will wait for Z68, that's for sure......
  • Manabu - Monday, January 3, 2011

    In the Quick Sync test I missed a comparison with x264, currently the fastest and highest-quality H.264 encoder, on a fast CPU. For example, using the superfast and veryslow presets (one for speed with reasonable quality, the other for quality with reasonable speed). Also, at too high a bitrate even the crappiest encoder will look good...

    I also wanted to see how low you can undervolt an i5-2400 once it has hit its overclocking cap, and what the power consumption is then. The same for the other locked CPUs would be cool too. Also, what is the power consumption of the Sandy Bridge CPUs running the Quick Sync hardware encoder?
  • NJoy - Monday, January 3, 2011

    Wow, what a SLAP in AMD's face! The idea they nursed for a gazillion years and were set to finally release sometime this week is brought to you, dear customer, first to market, with a sudden change in the NDA deadline to please you sooner with a hyperperformer from Intel. Who cares that NDAs play an important part in all planning activities, PR, logistics and whatever follows - what matters is that they are first to put the GPU on-die, and this is what the average Joe will now know, with a bit of PR, perhaps. Snatch another design win. Hey, AMD, remember that pocket money the court ordered us to pay you? SLAP! And the licence? SLAP! Nicely planned and executed while everyone was so distracted with the DAAMIT-versus-nVidia battles and, ironically, a lack of leaks from the red camp.
    I just hope Bulldozer will kick some asses, even though I doubt it's really going to happen...
  • DanNeely - Monday, January 3, 2011

    If AMD hadn't put a steel-toed boot into their own nuts by blowing the original Q3'09 release date for Fusion, I'd have more sympathy for them. Intel won because they made their launch date while the competition blew theirs by at least half a year.
