Gaming Performance

In testing Left 4 Dead we use a custom-recorded timedemo, run on a GeForce GTX 280 at 1680 x 1050 with all quality options set to high and no AA/AF enabled.
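
Timedemos like this are typically run more than once and averaged to smooth out run-to-run variance. As a rough illustration (not the harness used for this article), here's a minimal Python sketch that summarizes a set of per-run FPS averages; the numbers are placeholders.

```python
# Minimal sketch: average several timedemo runs and report run-to-run variance.
# The FPS values below are placeholders, not results from this article.
from statistics import mean, stdev

def summarize_runs(fps_results):
    """Return (mean FPS, standard deviation) for a list of timedemo runs."""
    if len(fps_results) < 2:
        raise ValueError("need at least two runs to judge consistency")
    return mean(fps_results), stdev(fps_results)

if __name__ == "__main__":
    runs = [102.3, 101.8, 103.1]  # hypothetical per-run averages
    avg, sd = summarize_runs(runs)
    print(f"average: {avg:.1f} fps, std dev: {sd:.2f} fps")
```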

Left 4 Dead

Far Cry 2 ships with several built-in benchmarks. For this test we use the Playback (Action) demo at 1680 x 1050 in DX9 mode on a GTX 280. The game is set to medium defaults with performance options set to high.


Far Cry 2

Crysis Warhead also ships with a number of built-in benchmarks. Running on a GTX 280 at 1680 x 1050, we use the ambush timedemo with mainstream quality settings. Physics, however, is set to enthusiast to further stress the CPU.

Crysis Warhead

Our Dragon Age: Origins benchmark marks a shift to the Radeon HD 5870. From this point on, these games are run on our Bench refresh testbed under Windows 7 x64. The benchmark itself is the same one we ran in our integrated graphics tests - a quick FRAPS walkthrough inside a castle. The game is run at 1680 x 1050 with high quality and texture settings.
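
For the FRAPS-based tests, average frame rate can be recovered from a per-frame log after the walkthrough. Below is a minimal Python sketch that assumes a FRAPS-style frametimes CSV with a header row and two columns (frame number, elapsed milliseconds); the filename and column layout are assumptions, so adjust them to whatever your capture tool actually writes.

```python
# Minimal sketch: compute average FPS from a FRAPS-style frametimes CSV.
# Assumed layout: header row, then "frame number, elapsed time in ms" per row.
import csv

def average_fps(csv_path):
    times_ms = []
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))
    elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
    return (len(times_ms) - 1) / elapsed_s

if __name__ == "__main__":
    print(f"{average_fps('frametimes.csv'):.1f} fps")  # hypothetical log file name
```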


Dragon Age: Origins

We're running Dawn of War II's internal benchmark at high quality defaults. Our GPU of choice is a Radeon HD 5870 running at 1680 x 1050.

Dawn of War II

Our World of Warcraft benchmark is a manual FRAPS runthrough on a lightly populated server with no other player-controlled characters around. The frame rates here are higher than you'd see in a real-world scenario, but the relative comparison between CPUs is accurate.

We run on a Radeon HD 5870 at 1680 x 1050. We're using WoW's high quality defaults but with weather intensity turned down all the way.
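
Because the absolute numbers are inflated relative to real-world play, what matters is how each chip stacks up against the others. A minimal sketch of that normalization is below; the FPS values are placeholders rather than anything measured here.

```python
# Minimal sketch: express each CPU's result relative to a baseline chip.
# The numbers below are placeholders, not measurements from this article.
results = {  # hypothetical average FPS per CPU
    "CPU A": 118.0,
    "CPU B": 96.5,
    "CPU C": 142.3,
}

baseline = "CPU B"  # whichever chip you want to treat as 100%
base_fps = results[baseline]

for cpu, fps in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cpu}: {fps:.1f} fps ({fps / base_fps * 100:.0f}% of {baseline})")
```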

World of Warcraft

For Starcraft II we're using our heavy CPU test. This is a playback of a 3v3 match where all players gather in the middle of the map for one large, unit-heavy battle. While GPU plays a role here, we're mostly CPU bound. The Radeon HD 5870 is running at 1024 x 768 at medium quality settings to make this an even more pure CPU benchmark.


Starcraft II

This is Civ V's built-in Late GameView benchmark, the newest addition to our gaming test suite. The benchmark outputs three scores: a full render score, a no-shadow render score and a no-render score. We present the first and the last, which act as a GPU and a CPU benchmark respectively.

We're running at 1680 x 1050 with all quality settings set to high. For this test we're using a brand new testbed with 8GB of memory and a GeForce GTX 580.
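
If you're collecting the raw scores yourself, the bookkeeping is straightforward: keep the full render score as the GPU-bound result and the no-render score as the CPU-bound result. Here's a minimal sketch with placeholder numbers, not results from this review.

```python
# Minimal sketch: keep the two Civ V scores we report (full render = GPU-bound,
# no render = CPU-bound) and sort chips by the CPU-bound score.
# All numbers are placeholders, not results from this article.
scores = {  # cpu -> (full render, no-shadow render, no render)
    "CPU A": (48.2, 55.1, 71.4),
    "CPU B": (47.9, 53.8, 64.0),
    "CPU C": (49.0, 56.3, 83.7),
}

for cpu, (full, _no_shadow, no_render) in sorted(
        scores.items(), key=lambda kv: kv[1][2], reverse=True):
    print(f"{cpu}: full render {full:.1f} (GPU test), no render {no_render:.1f} (CPU test)")
```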

Civilization V: Late GameView Benchmark (full render)

Civilization V: Late GameView Benchmark (no render)

Comments

  • Action_Parsnip - Tuesday, May 3, 2011 - link

    "I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy-Bridge"

    Unless you've seen the prototypes you do not know for a fact. PERIOD.

    "Intel has silicon running stock air-cooled 4GHz and beyond in house, guaranteed."

    Unless you've been 'in-house' you cannot guarantee that. PERIOD.
  • Orwell - Tuesday, May 3, 2011 - link

    What about overclocking the CPU-NB of the chip?

    It proved useful in one of the Phenom II X6 reviews (can't find it now though), where performance in HAWX shot up by about 20%, I believe, when going from 2GHz to 3GHz.

    It's a shame most if not all reviewers don't overclock their uncores. Or, well, at least, they're not telling you, and they don't put a CPU-Z memory tab screenshot in the review showing the uncore frequency.

    I know pretty much all hope is lost for this aging design (Deneb), but as an owner of this furnace CPU called the Phenom II X4 C2 (yes, 140W at 3.4GHz), I'd like to know how much faster the Intels are compared to my 3.7GHz/2.4GHz oc.
  • Stas - Tuesday, May 3, 2011 - link

    I haven't had an Intel CPU since the P4 Willamette. I've been happy with AMD's bang-for-buck, as performance seemed sufficient and an overclock always covered any shortcomings. Nowadays, I see mid-level Intel CPUs beat AMD's top-end offerings every release. And honestly, I'm really bottlenecked in the CPU department, but I don't see AMD offering a solution (running a Phenom X3 @ x4 3.5GHz). I've been waiting 2 years to upgrade the processor, and I'm getting tired of this. Don't make me cross the Sandy Bridge, AMD. Make BD happen. And it better be good.
  • jabber - Tuesday, May 3, 2011 - link

    I often see reviews with Intel's chips running ahead.

    Then I think "hang on though, the AMD chip gives me 60fps+ and costs half the price including the motherboard tax!"

    Then it's not so bad.
  • starfalcon - Monday, May 9, 2011 - link

    But then Ivy Bridge comes just a few months after BD, so what happens then?
  • raevoxx - Tuesday, May 3, 2011 - link

    Where I work, AMD still outsells Intel by a factor of... well... over 20-to-1, if not more. What we always tell customers is that if price is no object, the Intel platforms are higher performance. But the best bang-for-the-buck is AMD, and it's not like we're comparing an i7 to a 486-SX. We try to explain it in the best terms we can, but there are Intel processors in our display case that are actually gathering dust. Which frankly makes an AMD fan like me happy :) But I digress.

    We carry a full line of Intels, from the Celeron cheapies, all the way up to i7. And we finally closed out all of our 1156s and only sell 1155s.

    Like it or not, our customers are amazed that they can pair up a decent mobo, an Athlon II 250, and 4GB of 1333, for less than $160. Most spring for the 1075T for the price, too. Whether or not it's faster than a similarly priced i5, people like the ability to say they have six cores. When they can get 75% the performance of a comparable chip, for less than 50% of the cost... people bite.

    We're quite excited to start carrying Zambezi chips, when we're able to, since they'll be more competitive. But it's always darkest before the dawn, and it's nice that AMD is throwing in a speed bump or two (1100T, etc.) before the architecture change, instead of letting their chips languish until BD.
  • jabber - Tuesday, May 3, 2011 - link

    Must admit I haven't ever made an Intel box for a customer; it's always AMD.

    I check out the Intel CPU range every now and then to see what the cheapest non-Celeron Intel chip is going for, then the cheapest decent-brand Intel motherboard, and after seeing any profit just vaporise, I roll my eyes and go back to the AMD section.

    Intel isn't worth the extra cost for most ordinary folks; it's total overkill. The good old 3GHz dual-core Athlon with a mATX motherboard and 4GB of DDR3-1333 works a charm every time.

    If a customer came to me and said he wanted to do loads of transcoding and video editing and had £1200 to spend, then let's go Intel. But as most come to me with a budget of £400-500 and I need to take my cut, it's not going to happen.
  • MilwaukeeMike - Tuesday, May 3, 2011 - link

    This is exactly how I feel. I've always owned AMD because it's been fast enough and cheaper. If I had to build a PC today I'd choose a Sandy Bridge processor, but I'm not building one because my AMD 955 BE still does everything I need it to. I have tons of windows open on my 22" monitor, play my games on my 23", and have no issues.

    A lot of that Intel performance gain falls into the 'can't even tell' category for many users.
  • Peroxyde - Tuesday, May 3, 2011 - link

    Would the saving made by buying AMD be paid back in electricity, say, after 2 years? Just want to see if the higher power consumption translates into cost somewhere. If any of you have done a comparison in this area, I would very much appreciate some highlights.
  • jabber - Wednesday, May 4, 2011 - link

    It would take longer than the life of the Intel i3 box to make back in power savings the £65+ extra that the Intel box cost me.

    If the difference were £10 then yes, but £65 ($104) is just too much to make back.

    AMD still wins on cost for a standard system. As these are PCs for Joe User and not overclockers, you can switch on the power-saving settings anyway.

    Plus they rarely run at 100% for very long.

    Intel still isn't competitive for the ever-growing low-end customer group.

    Most people don't need 4GHz+ leviathan CPUs anymore. If anything, Intel's top-end customer base will keep getting smaller and smaller.

    How many of us here still demand the top-end CPU (or as close to it as) we can buy? I bet many of us are now happy to make do with a mid-range or lower CPU and spend the savings elsewhere.
