DiRT 3

DiRT 3 is a rally racing video game, the third in the DiRT series of the Colin McRae Rally franchise, developed and published by Codemasters. DiRT 3 also falls under the list of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth, everything. The small issue with DiRT 3 is that, depending on the benchmark mode tested, the benchmark launcher is not entirely indicative of gameplay, citing numbers higher than those actually observed. On the other hand, the benchmark mode also includes an element of uncertainty by actually driving a race, rather than replaying a predetermined sequence of events as Metro 2033 does. This should in essence make the benchmark more variable, but we take repeated runs in order to smooth this out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings. Results are reported as the average frame rate across four runs.
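To illustrate how the repeated runs smooth out the race-to-race variability, here is a minimal sketch of the averaging (the FPS figures and structure are illustrative assumptions for this write-up, not the actual test harness):

```python
# Minimal sketch: averaging repeated benchmark runs to smooth out the
# variability of a driven race. The FPS figures below are made up for
# illustration; they are not actual test data.
from statistics import mean, stdev

# Four passes of the same CPU/GPU configuration at 1440p, Ultra settings.
runs_fps = [81.2, 83.0, 82.4, 81.9]

reported = mean(runs_fps)      # the figure reported in the charts
variability = stdev(runs_fps)  # run-to-run spread that averaging suppresses

print(f"Reported average: {reported:.1f} FPS (stdev across runs: {variability:.2f})")
```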

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

While the testing shows a distinct split between Intel and AMD around the 82 FPS mark, all processors land within roughly 1-2 FPS of it, meaning that even an A8-5600K will feel like the i7-3770K. The 4770K has a small but ultimately unnoticeable advantage in gameplay.

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

Reaching two GPUs, the Intel/AMD split gets larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more, and 40 FPS more than either the X6-1100T or the FX-8150.

Three 7970s

Dirt 3 - Three 7970s, 1440p, Max Settings

Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying as much bandwidth and as many cores as possible. Despite this, the gap to the best AMD processor keeps growing – almost 70 FPS between the FX-8350 and the i7-3770K. The 4770K is slightly ahead of the 3770K at x8/x4/x4, suggesting a small IPC advantage.

Four 7970s

Dirt 3 - Four 7970s, 1440p, Max Settings

At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence from having six cores rather than four).

One 580

Dirt 3 - One 580, 1440p, Max Settings

Similar to the single 7970 setup, one GTX 580 shows a noticeable split between AMD and Intel. Despite the split, all the CPUs perform within 1.3 FPS of each other, meaning no big difference.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and the best Intel processor is only 2 FPS though, nothing to write home about.

DiRT 3 conclusion

Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, choosing a quad-core Intel processor does the business over the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).

  • Dentons - Tuesday, June 4, 2013

    His complaint is on the mark. Haswell is about mobile, not desktop, not gaming.

    Ivy Bridge was about cost reduction, Haswell is about reducing TDP. It is shocking that a mid-range 2+ year old Sandy Bridge desktop part is still so very competitive, even though it's been superseded by two whole generations.

    Intel deserves all this criticism and more. They've clearly put the interests of desktop users and gamers far onto the back burner. They're now focused almost entirely on mobile and are treading water with everything else.
  • takeship - Tuesday, June 4, 2013

    Eh, how can you blame them? The pure-play desktop market has been shrinking for a while now, with the high-performance desktop (basically gamers) even more of a niche. Maybe if they had some real competition from AMD in single-threaded perf... A lot of this is just Amdahl's law at its natural conclusion. The easy performance gains are mostly gone, so if you're Intel, do you dump endless money into another 25-30% per generation, or go after the areas that haven't been well optimized yet instead? Not a hard choice to make, especially considering the market's move towards mobile & cool computing over the last decade.
  • Silma - Wednesday, June 5, 2013

    Intel doesn't deserve criticism. Haswell is a small improvement over Ivy Bridge because it has become extremely difficult to optimize an already excellent processor. Do you see anything better from AMD, ARM, Oracle or others?

    Is there a need to upgrade from Ivy to Haswell? No. Was it necessary to upgrade from Nehalem to Sandy Bridge? No. The fact is that for most applications processors have been good enough for years, and money is better spent on SSDs, GPUs and whatnot.

    The real conclusion of this article should be that processors absolutely do not matter for gaming and that the money is better spent on a speedier GPU. Processors may become relevant for the very few people that run extreme 2x/3x card setups. Even a setup with two mid-range cards such as the GTX 560 is not CPU dependent. I would welcome actual statistics on the number of players with 2x/3x high-end GPUs. I'm quite sure the count is ultra tiny, and for those people willing and able to spend thousands of dollars, do you think $100 less on a processor is relevant?
  • chizow - Wednesday, June 5, 2013

    I don't have a problem with the conclusion he comes to; complaining about the dissemination of information used to reach that conclusion is what makes no sense. Put all the information out there, 1, 2, 3 articles a day, np, then make your own informed decision on the platform. Bemoaning the fact that there is actual coverage a day or two after launch and one or two reviews? Makes no sense.
  • Memristor - Tuesday, June 4, 2013

    Too bad that Richland, which is available as of today, didn't make it into this review. Other than that, great read.
  • eddieobscurant - Tuesday, June 4, 2013

    Many of us have a Q6600 @ 3600 MHz, and personally I'm very happy with this and my 7870. I would still like to see a comparison of my CPU @ 3600 MHz with modern CPUs, because I don't think there is a huge difference in games.
  • chizow - Tuesday, June 4, 2013

    It depends what you play; any game that is CPU limited is going to show a HUGE difference with that CPU. I had the same chip at 3.6 GHz, which was great btw, and even when I upgraded to a 920 @ 4 GHz there was a huge improvement in some games, most notably GTA4 at the time. Some other games that scale extremely well with CPU are WoW, Diablo 3, etc., just to name a few.
  • medi02 - Wednesday, June 5, 2013

    Nah. Most of the tests show that to get CPU limited you need a multi-GPU setup.
    An i7 and an Intel mobo will cost you about $500 for marginal improvements.
  • chizow - Wednesday, June 5, 2013

    Sorry, just not true. Even with just one GTX 680, WoW and other similarly CPU-dependent games scale tremendously well with faster CPUs:

    http://www.tomshardware.com/reviews/fx-8350-visher...

    Q6600 @ 3.6 is probably just a tad faster than the Phenom IIs in that test.
  • TheJian - Thursday, June 6, 2013

    See my comments here... Chizow is correct, and even understating it some. There are a LOT of games that are CPU limited, as I showed in my links. There are huge differences in CPU perf from the A10-5800K up to the 4770K, never mind the junk Ian recommends here (an A8-5600K for single GPU). It just isn't correct to recommend that CPU, or even the A10-5800K, which I showed getting smacked around in many games at 1080p. Articles like this make people think games are not CPU bound, and it's far more games than Civ 5: Neverwinter, Metro: Last Light, Tomb Raider, Far Cry 3, Crysis 3, etc. Once 20nm comes we may find even 1440p showing just as many games limited by CPU. If rumors are true, Volcanic Islands doubles the stream processors, and I'm sure NV will match that. You end up GPU bound when you up the res to 1440p on single cards now, but that won't be forever, and according to Steam 98.75% of us don't play at 1440p (0.87%) or above (1.25% total for all resolutions above 1920x1200).

    Check the 1080p data in my links (TechReport was a good one, as they show 1080p in most of the listed games). Tom's shows Neverwinter, as I noted, needing a very fast CPU as well. Hit 'all comments' on this article and Ctrl-F my name. Ignore my post comments and just click the links in them to prove Chizow's point (and my own). CPU is important at 1080p and 1920x1200 NOW, and it will be important at higher res with the next-gen cards at 20nm. You will never get out of your AMD mistake if you take this article's suggestions. Well, at least not without changing to an Intel board/chip... LOL. Who wants to do that? Just buy Intel unless you're broke. Don't trust me though; read the links provided and judge for yourself how accurate AnandTech is here.

    I showed some games that are nearly DOUBLE on Intel vs. the A10-5800K! You don't have to like the way I made my point or believe me, just check the links :) They all say the same thing. CPU is an issue, just as Chizow shows in his link. You can find this in many CPU articles where they use a top GPU (usually a 7970/680) and test new CPUs with the oldies in there too, which shows large separations. Check i7-3770K or FX-8350 articles (just google those two CPU models and "review" for ample sites showing the spread)... 1080p separates the men from the boys in CPUs.

    After you check the links (and Chizow's), come back and agree AnandTech needs to change their ways, or tear my comments apart if I'm lying :) Future GPUs will only make our point stick out even more. CPU matters. Also note that a lot of the games that are GPU limited on single cards are NOT playable anyway (check Sleeping Dogs right here in this article at 1440p... a 7970 at 28 FPS avg is NOT playable; mins will dip to the 20s or below). So you're forced back into being CPU limited in a lot of cases at 1080p. Where 98.75% of us play, you see CPU limits a lot.

    Go back one page from Chizow's link to Skyrim's benchmark in the same article for the same data. At 1080p the 3770 scores 88.2 to the 8350's 67.4 (that's a lot, and a huge hint as to how your future on AMD will look):
    http://www.tomshardware.com/reviews/fx-8350-visher...
    That's a 30% difference, and an FX-8350 is far faster than the A8-5600K Ian recommends here. Chizow is even more right if you toss in Ian's recommendation of an even slower CPU than the 8350 vs. Intel's stuff. Even in Skyrim at 1680x1050 they separate, 90 FPS to 68 FPS for the FX-8350. So until you completely tap out your GPU (1440p and up, which basically requires 2+ cards) you will notice whether your CPU is junk or not. Since this article is apparently only written for 1.25% of the readership (or the world, for that matter, according to the Steam survey), you will notice the CPU! Unless you're raising your hand as part of the 1.25% :) I don't call 30-100% faster a marginal improvement, do you? Add Civ 5 also, which this site even proves in this article ;) At least they got something right.
