DiRT 3

DiRT 3 is a rally racing game, the third entry in the DiRT branch of the Colin McRae Rally series, developed and published by Codemasters. DiRT 3 also falls into the category of 'games with a handy benchmark mode'. In previous testing, DiRT 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth – everything. The small caveat with DiRT 3 is that, depending on the benchmark mode tested, the benchmark launcher is not entirely indicative of actual gameplay, reporting numbers higher than those observed in-game. The benchmark mode also includes an element of uncertainty by actually driving a race, rather than replaying a predetermined sequence of events as Metro 2033 does. This in essence makes the benchmark more variable, but we take repeated runs in order to smooth this out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings, and results are reported as the average frame rate across four runs.
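
To illustrate how the reported figure is derived, the short sketch below averages the frame rate across a set of repeated benchmark runs. It is a minimal illustration only; the run values are hypothetical placeholders rather than measured results.

```python
# Minimal sketch: average the frame rate across repeated benchmark runs to
# smooth out the run-to-run variability introduced by actually driving a race.
# The run values below are illustrative placeholders, not measured results.
from statistics import mean

def average_fps(per_run_fps):
    """Return the mean frame rate across a set of benchmark runs."""
    return mean(per_run_fps)

# Four hypothetical runs of the in-game benchmark at 1440p, Ultra settings.
runs = [81.2, 82.0, 80.8, 82.4]
print(f"Average FPS over {len(runs)} runs: {average_fps(runs):.1f}")  # 81.6
```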

One 7970

DiRT 3 - One 7970, 1440p, Max Settings

While the testing shows a distinct split between Intel and AMD at around the 82 FPS mark, all processors fall within roughly 1-2 FPS of that figure, meaning that even an A8-5600K will feel the same as an i7-3770K.

Two 7970s

DiRT 3 - Two 7970s, 1440p, Max Settings

When moving to two GPUs, the Intel/AMD split gets larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more than the FX-8350, and 40 FPS more than either the X6-1100T or FX-8150.

Three 7970s

DiRT 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying bandwidth and cores as much as possible. Despite this, the gap to the best AMD processor keeps growing – almost 70 FPS between the FX-8350 and the i7-3770K.

Four 7970s

DiRT 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX switch on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (the RIVE's six cores versus four could also play a part).

One 580

DiRT 3 - One 580, 1440p, Max Settings

Similar to the single 7970 setup, a single GTX 580 shows a noticeable split between AMD and Intel. Despite that split, all the CPUs perform within 1.3 FPS of each other, so there is no meaningful difference.

Two 580s

DiRT 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and best Intel processor is only 2 FPS though, nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case the barrier sits at two-way 7970s: from there, a quad-core Intel processor pulls ahead of the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).


Comments

  • colonelclaw - Thursday, May 9, 2013 - link

    Jarred, I would fully agree with banning any person who continually makes no contribution to the discussion. These comment sections often supply me with useful information, and can be read as a continuation of the article itself. Having to hunt for the valuable opinions amongst piles of cretins and idiots makes me want to go elsewhere.
  • extide - Thursday, May 9, 2013 - link

    Please ban him, and I would consider myself a pretty solidly Intel guy myself, but you have to be realistic. Sheesh!
  • Blibbax - Thursday, May 9, 2013 - link

    Ban him and anyone else remotely similar.
  • duploxxx - Friday, May 10, 2013 - link

    if all would start voting to ban this person there would be a huge amount of thread replies :)
  • Jon Tseng - Wednesday, May 8, 2013 - link

    Hmmm. So bottom line is my 2007-vintage QX6850 is perfectly good at 1080p so long as I get a decent GPU.

    Bizarro state of affairs when a 6 year old CPU is perfectly happy running cutting edge games. Not sure if I should blame the rise of the GPU or the PS3/XBox360 for holding back gaming engines for so long!
  • TheInternal - Wednesday, May 8, 2013 - link

    In games that are CPU limited (like Skyrim or Arkham Asylum), no. I continue to get the impression from both personal experience and articles/reviews like this that once you have "enough" CPU power, the biggest limiting factor is the GPU. "Enough" often seems to be a dual core operating at 3.0GHz, but newer titles and CPU bound titles continue to raise the bar.
  • Azusis - Wednesday, May 8, 2013 - link

    Agreed. Especially in multiplayer situations. Try running PlanetSide 2 or Natural Selection 2 with a core2quad like I do. It isn't pretty. But just about any other singleplayer game... sure, no problem.
  • TheInternal - Wednesday, May 8, 2013 - link

    So... these were all tested on a single monitor? Though the article has lots of interesting information, I'd argue that doing these tests on a three monitor 1440p setup would show much more useful information that consumers looking at these setups would be able to apply to their purchasing decisions. It's great to see more reviews on different CPU + multiple GPU configurations, as well as the limitations of such settings, but by limiting such tests to an increasingly unlikely usage scenario of a single monitor, the data becomes somewhat esoteric.
  • Kristian Vättö - Wednesday, May 8, 2013 - link

    Did you mean three 1080p monitors (i.e. 5760x1080) by any chance? 7680x1440 is a very, very rare setup, especially for a gamer. For work purposes (e.g. graphics designer, video editor etc) it can be justified as the extra screen real estate can increase productivity, but I've never seen a gamer with such a setup (heck, the monitors alone will cost you close to $2000!). I'm not saying there aren't any but it's an extreme minority and I'm not sure if it's worth it to spend hours, even days, testing something that's completely irrelevant to most of our readers.

    Furthermore, while I agree that 5760x1080 tests would be useful, keep in mind that Ian already spent months doing this article. The testing time would pretty much double if you added a second monitor configuration, as you'd have to run all tests on both configs. Maybe this is something Ian can add later? There is always the trouble of timing: if you start including every possible thing, your article will be waaay outdated when it's ready to be published.
  • TheInternal - Thursday, May 9, 2013 - link

    I didn't mean three 1080p monitors, which does seem to be the "common" three monitor configuration I've seen most gamers going for (since it's cheap to do with 24" panels being under $200 a pop). My 27" S-IPS 2560x1440 panel cost about $300, so I'm not sure where you're getting the $2000 figure from... and if you spend $1500-$2000 on the graphics subsystem, why wouldn't you be spending at least half as much on the monitors?

    Most modern high-end graphics cards should be able to easily handle three 1080p monitors in a three card config... possibly a two card config... a round up like this would be much more useful to consumers if it did include such information... as well as show just how well the different CPU and GPU combos worked with multiple monitors.
