DiRT 3

DiRT 3 is a rallying video game and the third title in the Dirt line of the Colin McRae Rally series, developed and published by Codemasters. DiRT 3 also falls under the list of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love cores, memory, GPUs and PCIe lane bandwidth alike. The small issue with DiRT 3 is that, depending on the benchmark mode tested, the launcher is not indicative of actual gameplay, quoting numbers higher than those observed in a race. On the other hand, the benchmark mode also includes an element of uncertainty by actually driving a race, rather than replaying a predetermined sequence of events as Metro 2033 does. This should in essence make the benchmark more variable, so we take repeated runs to smooth the results out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings. Results are reported as the average frame rate across four runs.

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

While the testing shows a distinct split between Intel and AMD at around the 82 FPS mark, all processors land within roughly 1-2 FPS of that mark, meaning that even an A8-5600K will feel like the i7-3770K. The 4770K has a small but ultimately unnoticeable advantage in gameplay.

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

When reaching two GPUs, the Intel/AMD split grows larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more, and 40 FPS more than either the X6-1100T or FX-8150.

Three 7970s

Dirt 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, taking advantage of bandwidth and cores as much as possible. Despite this, the gap to the best AMD processor keeps growing: almost 70 FPS between the FX-8350 and the i7-3770K. The 4770K is slightly ahead of the 3770K at x8/x4/x4, suggesting a small IPC advantage.

Four 7970s

Dirt 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence from six cores versus four).

One 580

Dirt 3 - One 580, 1440p, Max Settings

Similar to the one 7970 setup, a single GTX 580 shows a visible split between AMD and Intel on the graph. Despite the split, all the CPUs perform within 1.3 FPS of each other, so there is no meaningful difference in practice.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger, and processors like the i3-3225 start to lag behind. The difference between the best AMD and best Intel processor is only 2 FPS though, nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, a quad-core Intel processor does the business over the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).


111 Comments


  • MarcVenice - Tuesday, June 04, 2013 - link

    Please, for the love of god, add a game like Crysis 3 or Far Cry 3. Your current games are all very old, and you will see a bigger difference in newer games. Reply
  • garrun - Tuesday, June 04, 2013 - link

    Agree with request for Crysis 3. It has enough options to deliver a great visual experience and GPU beating, and it also scales well to multi-monitor resolutions for testing at extremes. Reply
  • BrightCandle - Tuesday, June 04, 2013 - link

gamegpu.ru have done a lot of testing on all games with a variety of CPUs. Anandtech's choice of games is actually edge cases. Once you start looking at a wider list of games (just do a few CPUs but lots of games) you'll see a much bigger trend of performance difference, especially in a lot of the non-AAA titles. Around 50% of games show a preference for 3930Ks at this point over a 2600K, so more multithreading is starting to appear, but you need to test a lot more games or you won't catch that trend and will instead come to a misleading conclusion. Reply
  • ninjaquick - Tuesday, June 04, 2013 - link

    I am not sure that the CPU is stressed as much in more recent games. This is a CPU test, and testing older games that are known to be CPU dependent is a must.

    Moving forward, with the next gen consoles that is, testing the absolute newest multiplatform games will be a bit more relevant. However, even Farcry 3 and Crysis 3 are mostly GPU bound, so there will be little to no difference in performance by changing the CPUs out.
    Reply
  • superjim - Tuesday, June 04, 2013 - link

    Was thinking the same. Tomb Raider, BF3, Crysis 3, hell even Warhead would be good. Reply
  • garrun - Tuesday, June 04, 2013 - link

    I think Supreme Commander or Supreme Commander 2 would make an excellent CPU demo. Those games have been, and remain CPU limited in a way no other games are, and for good reasons (complexity, AI, unit count), rather than poor coding. A good way to do this is to record a complex 8 player game against AI and then play it back at max speed, timing the playback. That benchmark responds pretty much 1:1 with clock speed increases and also has a direct improvement effect on gameplay when dealing with large, complex battles with thousands of units on map. The upcoming Planetary Annihilation should also be a contender for this, but isn't currently in a useful state for benchmarking. Reply
  • Traciatim - Tuesday, June 04, 2013 - link

    I kind of hope Planetary Annihilation will have both server and client benchmarks available, since this seems like it would be a pretty amazing platform for benchmarking. Reply
  • IanCutress - Tuesday, June 04, 2013 - link

    Interesting suggestion - is SupCom2 still being updated for performance in drivers? Does playback report the time automatically, or is it something I'll have to try and code with a batch file? Please email me with details if you would like; I've never touched SupCom2 before.

    Ian
    Reply
  • yougotkicked - Tuesday, June 04, 2013 - link

    This sounds quite interesting, though I wonder if the AI is runtime bound rather than solution bound, as this could make the testing somewhat nondeterministic.

    To clarify what I mean: a common method in AI programming is to let algorithms continue searching for better and better solutions, interrupting the algorithm when a time limit has passed and taking the best solution found so far. Such approaches can result in inconsistent gameplay when pitting multiple AI units against each other, which may change the game state too much between trials to serve as a good testing platform.

    Even if the AI does use this approach it may not bias the results enough to matter, so I guess the only way to be sure is to run the tests a few times and see how consistent the results are on a single test system.
    Reply
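The time-limited ("anytime") search pattern described in the comment above can be sketched as follows; the function name and the toy scoring problem are hypothetical, purely for illustration of why wall-clock cut-offs make results machine-dependent:

```python
import time
import random

def anytime_search(candidates, score, time_limit_s):
    """Keep improving on the best solution found so far until the time
    budget expires, then return the best seen.  Because the cut-off
    depends on wall-clock time, two runs on differently loaded machines
    can examine different numbers of candidates and return different
    answers -- the nondeterminism the comment describes."""
    deadline = time.monotonic() + time_limit_s
    best, best_score = None, float("-inf")
    for cand in candidates:
        if time.monotonic() >= deadline:
            break  # out of time: settle for the best found so far
        s = score(cand)
        if s > best_score:
            best, best_score = cand, s
    return best

# Toy usage: evaluate up to a million random "moves", but only for 10 ms.
moves = (random.random() for _ in range(1_000_000))
best = anytime_search(moves, score=lambda m: m, time_limit_s=0.01)
```

A replay-based benchmark only stays deterministic if the AI is "solution bound" (it always searches to the same depth); with a time-bound search like the sketch above, the state can diverge between runs.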
  • Zoeff - Tuesday, June 04, 2013 - link

    Forget about SupCom2 - that game has been scaled down quite a bit compared to SupCom1 and isn't as demanding on CPUs. There's also an active SupCom1 community that has pushed out, and still is pushing out, community-made patches. :-)

    SupCom actually has a built-in benchmark that plays a scripted map with some fancy camera work. Anyone can launch this by adding "/map perftest" to your shortcut. That said, it doesn't seem to be working properly anymore after several patches, nor does it give any useful data, as the sim score is capped at 10k for today's CPUs. And yet it's extremely easy to cripple any CPU you throw at it when simply playing the game. Just open up an 81x81km map with 7 AI enemies and watch your computer slow to a crawl as the map starts filling up.

    And yes, the AI is "solution bound". Replays of recorded games with AI in them wouldn't work otherwise.

    I wonder if somebody could create a custom SupCom1 benchmark... *Hint Hint*
    Reply
