DiRT 3

DiRT 3 is a rallying video game, the third entry in the DiRT series that grew out of the Colin McRae Rally franchise, developed and published by Codemasters. DiRT 3 also falls under the list of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth, everything. The small issue with DiRT 3 is that, depending on the benchmark mode tested, the benchmark launcher is not indicative of gameplay per se, reporting numbers higher than those actually observed. The benchmark mode also includes an element of uncertainty, because it actually drives a race rather than playing back a predetermined sequence of events as Metro 2033 does. This in essence should make the benchmark more variable, so we take repeated runs in order to smooth this out, as in the sketch below. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings, and results are reported as the average frame rate across four runs.
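
As a rough illustration, here is a minimal sketch of how repeated benchmark runs can be averaged to smooth out race-to-race variation; it is not Codemasters' tool or our actual logging script, and the FPS figures in it are placeholders rather than measured results.

    # Minimal sketch: average FPS across repeated benchmark runs to smooth out
    # run-to-run variation. The values below are hypothetical placeholders.
    from statistics import mean, pstdev

    def average_fps(runs):
        """Return the mean FPS and the spread across repeated benchmark runs."""
        return mean(runs), pstdev(runs)

    if __name__ == "__main__":
        runs = [81.4, 82.9, 82.1, 83.0]  # hypothetical per-run averages from four races
        avg, spread = average_fps(runs)
        print(f"Average FPS over {len(runs)} runs: {avg:.1f} (+/- {spread:.1f})")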

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

While the testing shows a fairly distinct split between Intel and AMD at around the 82 FPS mark, all processors land within 1-2 FPS of that mark, meaning that even an A8-5600K will feel like the i7-3770K. The 4770K has a small but ultimately unnoticeable advantage in gameplay.

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

Moving to two GPUs, the Intel/AMD split gets larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more than the FX-8350 and 40 FPS more than either the X6-1100T or FX-8150.

Three 7970s

Dirt 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying as much bandwidth and as many cores as possible. Despite this, the gap to the best AMD processor keeps growing: almost 70 FPS between the FX-8350 and the i7-3770K. The 4770K is slightly ahead of the 3770K at x8/x4/x4, suggesting a small IPC difference.

Four 7970s

Dirt 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence from having six cores rather than four).

One 580

Dirt 3 - One 580, 1440p, Max Settings

Similar to the single 7970 setup, one GTX 580 shows a noticeable ordering split between AMD and Intel, but all the CPUs perform within 1.3 FPS of each other, so there is no meaningful difference.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and the best Intel processor is only 2 FPS though, nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, a quad-core Intel processor does the business over the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).

Comments

  • majorleague - Wednesday, June 5, 2013 - link

    Here is a youtube link showing 3dmark11 and windows index rating for the 4770k 3.5ghz Haswell. Not overclocked.
    This is apparently around 10-20fps slower than the 6800k in most games. And almost twice the price!!
    Youtube link:
    http://www.youtube.com/watch?v=k7Yo2A__1Xw
  • kilkennycat - Wednesday, June 5, 2013 - link

    Quote:" The only way to go onto 3-way or 4-way SLI is via a PLX 8747 enabled motherboard, which greatly enhances the cost of a motherboard build. This should be kept in mind when dealing with the final results."

    The only way? X79 supports up to four x8 channels of PCIe 2/3.
    The 4-core 3820 overclocks readily, and on an X79 board it is a very small cost increase
    over a high-end non-PLX8747 1155-socket setup. Plus there is the upgrade benefit of stepping up to the 6-core 3930K if one wants to combine professional multicore applications with gaming.
  • random2 - Wednesday, June 5, 2013 - link

    "What we see is 30.73% of gamers running at 1080p, but 4.16% of gamers are above 1080p."

    So an article and benches are provided for the benefit of the 4.16% of gamers who might be running more pixels, vs the 65% (almost 3 million) lion's share of gamers who must be running at fewer pixels than 1080p. Very strange.
  • Dribble - Thursday, June 6, 2013 - link

    Just to point out the blindingly obvious but who would spend big $$$ on a 1440p monitor and a top end gpu and then buy a low end budget cpu (A8-5600)...

    The realistic minimum recommendation is going to be an i5-3570K.
  • xineis - Thursday, June 6, 2013 - link

    So, how would a 955BE perform compared to the CPUs on the test? From what I understand, I should just keep this CPU, as a new one is not going to make much of a difference?
  • Zoatebix - Friday, June 7, 2013 - link

    Thank you for doing all this work. A great follow-up to the original!

    Could you please correct some charts on the CPU Benchmarks page, though? The "Video Conversion - x264 HD Benchmark" section is displaying the charts for the "Grid Solvers - Explicit Finite Difference" section.
  • Klimax - Saturday, June 8, 2013 - link

    Frankly, not the best article. The resolution is too high for the GPU, and then a CPU is recommended on that basis: a CPU which will not provide the performance needed for games. (Techreport showed that an APU is not a good idea when paired with a real GPU; FPS might be in range, but latency is in hell.)
  • JNo - Sunday, June 9, 2013 - link

    Ian, I'm afraid I have to agree with some of the naysayers here. You've tried so hard to have clean *scientific* analysis that you've failed to see the wood for the trees. In actual fact I fear you've reached the opposite of a scientific conclusion *because* you only focussed on easily obtainable/reproducible results.

    Just because results for modern games are hard to obtain, doesn't mean you can ignore them despite it being a hard path to walk. I have 1440p but agree that it's not relevant to the vast majority and anyone affording a 1440p monitor won't care to save $40 on AMD A8 vs core i5. So you have to be *realistic* (as well as scientific).

    I know from a few years of international finance analysis that when doing an independent study, there is a chance you can come to a conclusion that flies in the face of the market or common opinion. You have to be *SO* careful when this happens and quadruple check what you have ended up with because 99% of the time, the market or 'hive mind' is correct and there is an error or misunderstanding in your own work. After all, the conglomerate conclusion of hundreds of often intelligent people is hardly likely to be wrong, even if you are a smart guy. The chance that you have found the truth and that everyone else is wrong really is about 1% (yes it does happen but it is a once in a blue moon type of event).

    It might seem a huge hit to admit that much of your hard work was misdirected but it could save more pain in the long run to go back to the drawing board and consider what you are trying to achieve and how best to go about it. A very small sample of older titles at unpopular resolutions really could skew results to be misleading.
  • CiccioB - Wednesday, June 12, 2013 - link

    I agree. However, we still have to understand what thesis Ian wanted to demonstrate.
    If it was "AMD CPUs don't have to appear so bad vs Intel", the strategy used for the demonstration is quite good.
    On the other hand, if it was "Let's see which is the best CPU for playing games", the strategy is a complete fail. And it is still partially a fail if it was "Let's see which is the cheapest CPU to cope with a bottlenecked GPU", as those old games, all but Civ5, do not have any complex AI or scripts, which are the CPU-intensive tasks.
    If I were to judge this work as homework I would grade it an F, because it is aimed at a small part of the market, uses old benchmarks that are not valid today, is incomplete (lack of FCAT), and has a wrong setup (bottlenecking GPUs to evaluate CPU performance?).
    Wrong on all aspects, unless the intent was to show that AMD CPUs are just trailing Intel's most expensive ones instead of being a complete generation behind. In that case the evaluation can be a B, but it becomes quite limited if we look at the market represented (is the 3% of the market that is capable of spending well more than the average gamer a good target for demonstrating that they can spare a few bucks by using an otherwise castrated CPU?)

    For all these reasons I would say that this is one of the worst articles I have ever read on this site. It shows some incompetence or, worse, a bias.
  • Filiprino - Thursday, June 20, 2013 - link

    It's cool that you test old CPUs, so we can see the improvement of CPU processing power over the years.
