DiRT 3

DiRT 3 is a rally racing game developed and published by Codemasters, and the third entry in the Dirt offshoot of the Colin McRae Rally series. DiRT 3 also falls into the category of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love everything: cores, memory, GPUs, and PCIe lane bandwidth. The one caveat is that, depending on the mode tested, the benchmark launcher is not entirely indicative of gameplay per se, reporting numbers higher than those actually observed. The benchmark mode also includes an element of uncertainty, since it actually drives a race rather than replaying a predetermined sequence of events as Metro 2033 does. This in essence makes the benchmark more variable, so we take repeated runs in order to smooth this out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings, and results are reported as the average frame rate across four runs.
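Averaging across repeated runs is a simple way to tame the run-to-run variance of a driven race. A minimal sketch of that methodology (the FPS figures below are illustrative placeholders, not numbers from this review):

```python
# Average the mean frame rate across repeated benchmark runs to
# smooth out run-to-run variance. Figures are illustrative only.
runs_fps = [81.4, 83.1, 82.2, 81.9]  # mean FPS from four benchmark runs

average_fps = sum(runs_fps) / len(runs_fps)
spread_fps = max(runs_fps) - min(runs_fps)

print(f"average: {average_fps:.2f} FPS, run-to-run spread: {spread_fps:.1f} FPS")
```

Reporting the average (and keeping an eye on the spread) is what lets a variable benchmark like this still produce comparable numbers between CPUs.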

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

While the testing shows a split between Intel and AMD around the 82 FPS mark, all processors fall within roughly +/- 1 or 2 FPS of it, meaning that even an A8-5600K will feel like the i7-3770K. The 4770K has a small but ultimately unnoticeable advantage in gameplay.
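To put that window in perspective, a +/- 2 FPS swing around 82 FPS is only a few percent, which is why the CPUs feel interchangeable here. A quick check using the numbers from the text above:

```python
# Relative size of a +/-2 FPS swing around the ~82 FPS cluster.
center_fps = 82.0
delta_fps = 2.0

percent_spread = delta_fps / center_fps * 100
print(f"+/-{delta_fps:.0f} FPS is +/-{percent_spread:.1f}% of {center_fps:.0f} FPS")
```

A roughly 2.4% spread is well below what most players would notice at these frame rates.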

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

With two GPUs, the Intel/AMD split widens. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more than the FX-8350, and 40 FPS more than either the X6-1100T or FX-8150.

Three 7970s

Dirt 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying bandwidth and cores as much as possible. The gap to the best AMD processor keeps growing – almost 70 FPS between the FX-8350 and the i7-3770K. The 4770K is slightly ahead of the 3770K at x8/x4/x4, suggesting a small IPC difference.

Four 7970s

Dirt 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence from having six cores rather than four).

One 580

Dirt 3 - One 580, 1440p, Max Settings

Similar to the one-7970 setup, a single GTX 580 shows a noticeable ordering split between AMD and Intel, but all the CPUs still perform within 1.3 FPS of each other, so there is no meaningful difference.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and best Intel processor is only 2 FPS, though – nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, a quad-core Intel processor pulls ahead of the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).


111 Comments


  • UltraTech79 - Saturday, June 22, 2013 - link

    That's a pretty shitty point.
  • Jon Irenicus - Sunday, June 16, 2013 - link

    who cares what most of the market has, 1440p monitors are in the 300 dollar range from the korean ebay sellers, just because a bunch of no nothings did not get the memo and get one of those better monitors and spent all their cash upgrading their cpu/gpus with their crap 1080p monitors does not mean reviews should not focus on where people SHOULD go.

    1080p is a garbage resolution for large displays when you have easy and CHEAP access to 1440p. I got one of those monitors, it's beautiful. The problem is not the 4% that are higher than 1080/1200p, is the rest of you who are too cpu focused to get a better monitor.

    I mean jesus people, you sit and stare at that thing ALL DAMN DAY, and people actually spend HUNDREDS of dollars on multi gpu setups and high end cpus to game at 1080p... it's submental. YOU and others need to stop complaining about a lack of focus on 1080p, and get on board the 1440p train. You don't have that? well get it, stop lagging, you are choosing an inferior setup and complaining to anandtech because they chose not to focus on your crap resolution monitor?

    It's almost as if you specifically cripple your gaming resolution just so you can feel more satisfied at how much faster the intel cpus beat out the amds. Well, you're right, they do, and you still chose an inferior gaming resolution, stop living in the ghetto of the pc gaming world and move higher.
  • UltraTech79 - Saturday, June 22, 2013 - link

    I stopped reading at "no nothings". Lol what a ranting lunatic.
  • metasyStratS - Thursday, June 06, 2013 - link

    "This is another attempt at covering for AMD and trying to help them sell products... Anandtech is MISLEADING you at best by showing a resolution higher than 98.75% of us are using and tapping out the single gpu..."

    You could also easily argue that the article is helping to sell Intel's 4770K, providing data that misleadingly (though not falsely) indicates the superiority of the 4770K over the 2500K/3770K group.

    For the majority gamers, it is indeed misleading to focus on 1440p only. For a good number, it is also misleading to focus only on stock clocks.

    As you point out, at 1080p, overclocking does help (though the benefit has to be weighed against the cost of upgraded cooling). And as others in forums have pointed out, 2700K vs. 3770K is roughly equal: with any given aftermarket cooler, a 3770K at 'Maximum Stable Overclock' will have roughly the same performance as a 2700K at 'Maximum Stable Overclock', will run hotter than the 2700K, but will consume less energy, and so on...

    On the other hand, preliminary indications are that for the majority of overclockers (those who do not want to spend as much for a custom water-cooling loop as for the CPU itself), a 4770K is a worse bet, as it apparently runs hotter than even the 3770K, and the gains in 'Instructions per Clock' likely do not make up for what would thus be a reduced 'Maximum Stable Overclock.' See here: http://forums.pureoverclock.com/cpu-overclocking/2...

    In short: CPU overclocking yields a tangible benefit for 1080p gamers, and for the majority of CPU Overclockers (those who do not want to spend as much for a custom water-cooling loop as for the CPU itself), the 4770K appears to be something LESS than a 3770K or 2700K.
    Reply
  • TheJian - Thursday, June 06, 2013 - link

    I didn't say anything about overclocking. Maybe one of the quotes did? My statements are pure chip to chip, no overclocking discussed. Maybe you were replying to someone else?

    The article isn't helping to sell 4770Ks when he says single-GPU owners (98% according to the Steam survey) can play fine on an A8-5600. Single-GPU owners, again according to the survey, are NOT running above 1920x1200. So AMD gets killed unless you pull a stunt like Anandtech did here, as the benchmarks in the links I pointed to show.

    I did not point out overclocking at 1080p helps. I made no statement regarding overclocking, but Intel wins that anyway.
    Reply
  • Obsoleet - Thursday, June 06, 2013 - link

    DOWN WITH THE 1.25%!!
  • Calinou__ - Friday, June 07, 2013 - link

    Fun fact: the A10-5800K's upside is its IGP, not the processor part.

    If you want to game on an AMD CPU, you'd better pick an FX-6xxx or an FX-8xxx.
    Reply
  • dishayu - Tuesday, June 04, 2013 - link

    I'm sorry if I missed this info while reading, but does Haswell come with dual-link DVI support? You know, so that I can drive my 1440p displays for everyday usage, since I don't game all that much.
  • Mobilus - Tuesday, June 04, 2013 - link

    The problem isn't Haswell, the problem is the mainboard. You would need a mainboard that supports dual-link, and at least with the older generations that feature wasn't implemented. Unless the usual suspects changed that with their new offerings, you will have to use a DisplayPort-to-DVI adapter to get that resolution without a dedicated card (HDMI on mainboards is usually restricted to 1080p as well, unless... see above).
  • K_Space - Tuesday, June 04, 2013 - link

    I know Anandtech hasn't gotten to review the Richland desktop variants yet, but if the current recommendation is a Trinity APU, surely a >10% performance increase and a lower TDP would clinch it for Richland?
    The newly launched top-end A8-6600K is £20 more than the A8-5600K.... but that's launch price.
