Gaming Performance

For Left 4 Dead we use a custom-recorded timedemo, run on a GeForce GTX 280 at 1680 x 1050 with all quality options set to high and no AA/AF enabled.
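
For anyone looking to reproduce this kind of run, here is a minimal sketch of how a Source engine timedemo can be launched from a script. The executable path, demo name and launch options below are assumptions for illustration only; the timedemo console command itself reports an average frame rate once playback finishes.

    import subprocess

    # Minimal sketch: play back a previously recorded Left 4 Dead demo with the
    # Source engine's timedemo command. The path and demo name are placeholders;
    # adjust them for your own install and recording.
    GAME_EXE = r"C:\Program Files\Steam\steamapps\common\left 4 dead\left4dead.exe"
    DEMO_NAME = "custom_benchmark"  # recorded earlier with the in-game 'record' command

    subprocess.run([
        GAME_EXE,
        "-novid",                # skip the intro video
        "-console",              # enable the developer console
        "+timedemo", DEMO_NAME,  # play the demo back and report an average frame rate
    ])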

Left 4 Dead

Far Cry 2 ships with several built-in benchmarks. For this test we use the Playback (Action) demo at 1680 x 1050 in DX9 mode on a GTX 280. The game is set to medium defaults with performance options set to high.


Far Cry 2

Crysis Warhead also ships with a number of built-in benchmarks. On a GTX 280 at 1680 x 1050 we run the ambush timedemo with mainstream quality settings; physics, however, is set to enthusiast to further stress the CPU.

Crysis Warhead

With Dragon Age: Origins we shift to a Radeon HD 5870; from this point on the games are run on our Bench refresh testbed under Windows 7 x64. The benchmark itself is the same one we ran in our integrated graphics tests - a quick FRAPS walkthrough inside a castle. The game is run at 1680 x 1050 with high quality and texture options.


Dragon Age: Origins

We're running Dawn of War II's internal benchmark at high quality defaults. Our GPU of choice is a Radeon HD 5870 running at 1680 x 1050.

Dawn of War II

Our World of Warcraft benchmark is a manual FRAPS runthrough of a lightly populated server with no other player controlled characters around. The frame rates here are higher than you'd see in a real world scenario, but the relative comparison between CPUs is accurate.

We run on a Radeon HD 5870 at 1680 x 1050. We're using WoW's high quality defaults but with weather intensity turned down all the way.
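
Since this is a manual FRAPS run rather than a scripted timedemo, the average frame rate comes from FRAPS' frame time log. Below is a minimal sketch of that calculation, assuming the per-frame frametimes CSV that FRAPS can write (frame number plus a timestamp in milliseconds); the file name is hypothetical.

    import csv

    def average_fps(frametimes_csv):
        # Assumes a two-column CSV of frame number and timestamp in milliseconds,
        # as produced when frame time logging is enabled in FRAPS.
        times_ms = []
        with open(frametimes_csv, newline="") as f:
            reader = csv.reader(f)
            next(reader)                      # skip the header row
            for row in reader:
                times_ms.append(float(row[1]))

        frames = len(times_ms) - 1            # frame intervals covered by the log
        elapsed_s = (times_ms[-1] - times_ms[0]) / 1000.0
        return frames / elapsed_s

    # Hypothetical usage:
    # print(average_fps("wow_walkthrough frametimes.csv"))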

World of Warcraft

For StarCraft II we're using our heavy CPU test: a playback of a 3v3 match where all players gather in the middle of the map for one large, unit-heavy battle. While the GPU plays a role here, we're mostly CPU bound. The Radeon HD 5870 runs at 1024 x 768 at medium quality settings to make this an even purer CPU benchmark.


StarCraft II

This is Civ V's built-in Late GameView benchmark, the newest addition to our gaming test suite. The benchmark outputs three scores: a full render score, a no-shadow render score and a no-render score. We present the first and the last, which act as a GPU and a CPU benchmark respectively.

We're running at 1680 x 1050 with all quality settings set to high. For this test we're using a brand new testbed with 8GB of memory and a GeForce GTX 580.
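
To show how the two scores separate GPU and CPU, here is a small sketch with invented numbers: normalizing each CPU's full render and no-render scores against a baseline part highlights the CPU-bound differences, while the GPU-bound numbers typically move far less.

    # Hypothetical Late GameView scores for two CPUs on the same GTX 580 testbed.
    # The numbers are made up purely to illustrate how the results are read:
    # the full render score is largely GPU limited, the no-render score CPU limited.
    scores = {
        "CPU A": {"full_render": 52.0, "no_render": 118.0},
        "CPU B": {"full_render": 50.5, "no_render": 96.0},
    }

    baseline = scores["CPU B"]
    for cpu, s in scores.items():
        gpu_rel = s["full_render"] / baseline["full_render"]
        cpu_rel = s["no_render"] / baseline["no_render"]
        print(f"{cpu}: {gpu_rel:.2f}x full render, {cpu_rel:.2f}x no render vs CPU B")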

Civilization V: Late GameView Benchmark

Civilization V: Late GameView Benchmark

Comments

  • silverblue - Wednesday, May 4, 2011 - link

    Phenom II came to market after Nehalem. Yes, it wasn't exactly meant to compete with it, and was more about sorting out what Phenom did wrong, but the unfortunate truth is that the i7 beat it to market.

    529th - I think the issue would be that you would be disabling a feature and thus not showing accurate performance; however, that in itself would show the difference made by having Turbo in the first place. Doing the same with Thuban would be interesting.

    Zosma would've made more sense than continuing on with Deneb; however, AMD quickly shelved that idea.
  • GullLars - Wednesday, May 4, 2011 - link

    Nah, it's just that Intel has a huge advantage in fabs. They also have a clock-for-clock advantage on most workloads, but AMD's engineers are by no means incompetent. Brazos is the only recent great product though.
    If both Bulldozer and Llano fail spectacularly, I'll start leaning towards your side though. In the last few years Intel has alienated me with horrible business ethics and artificial restrictions on processors for market segmentation, with very high prices on the chips that were not gimped (disabled functional units).
    Hopefully power consumption will be better with GloFo.
  • silverblue - Wednesday, May 4, 2011 - link

    Weren't you banned? No? Oh.

    Intel actually DOES need to make its compilers more vendor-agnostic since the settlement. If a piece of software works rather poorly on an AMD CPU and much better on an Intel offering, this isn't always down to the Intel CPU being stronger; it might actually be the result of the AMD CPU working through an inefficient codepath. Of course, it may actually be the limit of the processor. If something is compiled using a neutral compiler, I'd be far more inclined to trust the results.

    I doubt Intel is forcing people to use their compiler, nor are they making them develop for Intel-only platforms; however, in the interests of fair competition it's only right that the check on the CPU ID tag be removed, if that is what they're doing.
  • silverblue - Wednesday, May 4, 2011 - link

    Since you're such an expert, could you perhaps tell us, oh mighty one, what errata requires fixing? What is so buggy about an AMD processor?

    If you're going to make such bold statements, it's about time you actually backed them up with FACTS.
  • silverblue - Thursday, May 5, 2011 - link

    Still waiting.
  • silverblue - Wednesday, May 4, 2011 - link

    The most popular CPUs are towards the lower end. Intel doesn't have a quad-core CPU below $100, whereas AMD has several. There's no reason why an Athlon II X4 setup shouldn't easily undercut an i3 setup, and occasionally four true cores will be a benefit.
  • wh3resmycar - Wednesday, May 4, 2011 - link

    why are you guys still sticking to 1680x1050? don't give me the crap about correctly segmenting processor performance by using a CPU-bound resolution for game X.

    people who actually are gamers, who are looking for quad-core and up CPU solutions, are probably gaming at HD resolutions and upwards.

    at least add graphs that are done at higher resolutions. your gaming performance benchmarks don't work in the real world.
  • stimudent - Sunday, May 8, 2011 - link

    This is great hardware, but it seems that almost everyone I know who is an average user has left their tower system and switched to smartphones, iPads, and gaming consoles for the most part. These items are now being discussed at family get-togethers instead of PCs, as was the case just a few years ago. Some advanced users are using the latest graphics cards for computational projects instead of gaming - the CPU is no longer seen as the best way to fold. It's the graphics card doing the grunt work while the quad-core processor sits mostly idle.
