DiRT 3

DiRT 3 is a rally racing video game, the third in the Dirt offshoot of the Colin McRae Rally series, developed and published by Codemasters. DiRT 3 also falls under the list of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love everything: cores, memory, GPUs, PCIe lane bandwidth. The small issue with DiRT 3 is that, depending on the mode tested, the benchmark launcher is not indicative of gameplay per se, citing numbers higher than those actually observed. On the other hand, the benchmark mode includes an element of uncertainty, since it actually drives a race rather than replaying a predetermined sequence of events as Metro 2033 does. This in essence should make the benchmark more variable, so we take repeated runs in order to smooth this out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings. Results are reported as the average frame rate across four runs.
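As a quick illustration of why the repeated runs matter, here is a minimal sketch of the averaging step described above; the FPS figures are hypothetical placeholders, not actual DiRT 3 results:

```python
# Minimal sketch: averaging repeated benchmark runs to smooth out
# run-to-run variance. The FPS figures below are hypothetical
# placeholders, not actual DiRT 3 numbers.
runs = [81.4, 82.9, 82.1, 81.8]  # average FPS from four benchmark runs

mean_fps = sum(runs) / len(runs)
spread = max(runs) - min(runs)   # quick check on run-to-run variance

print(f"Reported average: {mean_fps:.1f} FPS (spread {spread:.1f} FPS)")
```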

One 7970

DiRT 3 - One 7970, 1440p, Max Settings

While the testing shows a distinct split between Intel and AMD at around the 82 FPS mark, all processors land within one or two FPS of it, meaning that even an A8-5600K will feel like an i7-3770K.

Two 7970s

DiRT 3 - Two 7970s, 1440p, Max Settings

When moving to two GPUs, the Intel/AMD split gets larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more than the FX-8350, and 40 FPS more than either the X6-1100T or FX-8150.

Three 7970s

DiRT 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying as much bandwidth and as many cores as it can get. Even so, the gap to the best AMD processor keeps growing: almost 70 FPS between the FX-8350 and the i7-3770K.

Four 7970s

DiRT 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence from six cores versus four).

One 580

DiRT 3 - One 580, 1440p, Max Settings

Similar to the single 7970 setup, one GTX 580 shows a noticeable split between AMD and Intel, but all the CPUs perform within 1.3 FPS of each other, so there is no big difference in practice.

Two 580s

DiRT 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and best Intel processor is only 2 FPS, though; nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, choosing a quad-core Intel processor does the business over the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).

Comments

  • DigitalFreak - Thursday, May 9, 2013 - link

    Do people not read the article? He works with what he has on hand and what he can get access to.
  • tackle70 - Thursday, May 9, 2013 - link

    Great article! The only gripe I would have (and yes I know the reasoning behind it is explained) is the decision not to include Crysis 3 in the testing.

    The reason I make that gripe, even though it has no timedemo functionality and adds more work, is that it is the closest thing to a next-gen game we have right now, and it is also the *only* game I've seen that reliably eats up as much CPU power and as many cores as you give it. It would have been interesting to see it here.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Core i7-860 overclocked to 3.6 GHz, GTX 580 SLI at PCIe 2.0 x8/x8 = min 44, average 49. Final scene, destroying the Ceph Alpha. No overclock on the GPUs, but plenty of headroom. Not scientific, but it would be useful to see the same scene if someone has a more up-to-date processor.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Res: 1920 x 1080
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    Great review; the CPU has definitely become less important. I used to change my CPU around every 18 months or my system would show signs of struggling. I bought my i7-860 in 2009 and it is sitting alongside two GTX 580s (SLI x8/x8). Nearly four years seems like an eternity, and getting my first GTX 580 in early 2011 is the longest I have stuck with the same GPU. Shows you that game developers don't challenge the hardware like they used to.
  • SurrenderMonkey - Thursday, May 9, 2013 - link

    People who make comments like this do not understand that it is about making a properly balanced system so that you get maximum bang for your buck. This takes skill and a proper understanding of hardware capabilities and technology. On a gaming system you can trade down on a processor and buy a better GPU (or an SSD, or both). When you get it right you get more FPS for the same or less money, faster loading times, and overclocking headroom to use at a later date.
  • oSHINSAo - Thursday, May 9, 2013 - link

    Well, I thought my 2600K was old... but looking at it, it is too close to the 3rd-gen i7-3770K... I will stick with it and focus on getting a CrossFireX config...
  • T1K0L P0G1 - Friday, May 10, 2013 - link

    EXCELLENT WORK!!!
  • gnasen - Friday, May 10, 2013 - link

    Nice article. Still missing a few of the legends: Q9550, i7-920, i5-750.
