Dirt 3

Dirt 3 is a rally racing game, the third entry in the Dirt offshoot of the Colin McRae Rally series, developed and published by Codemasters.  Dirt 3 also falls under the list of ‘games with a handy benchmark mode’.  In previous testing, Dirt 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth, everything.  The small issue with Dirt 3 is that, depending on the mode tested, the benchmark launcher is not indicative of gameplay per se, citing numbers higher than those actually observed.  The benchmark mode also includes an element of uncertainty, as it actually drives a race rather than replaying a predetermined sequence of events as in Metro 2033.  This in essence makes the benchmark more variable, so we take repeated runs in order to smooth this out.  Using the benchmark mode, Dirt 3 is run at 1440p with Ultra graphical settings.  Results are reported as the average frame rate across four runs.
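
As an illustration of the ‘average of four runs’ approach, below is a minimal sketch of how repeated benchmark passes might be collected and averaged.  This is not Codemasters' tooling or our actual test harness; the executable path, the -benchmark flag, and the log format are hypothetical placeholders.

    # Minimal sketch: launch a benchmark pass several times and average the
    # reported frame rate.  GAME_EXE, LOG_FILE and the "-benchmark" flag are
    # hypothetical placeholders, not real Dirt 3 launch options.
    import re
    import statistics
    import subprocess

    RUNS = 4
    GAME_EXE = r"C:\Games\Dirt3\dirt3.exe"       # hypothetical install path
    LOG_FILE = r"C:\Games\Dirt3\benchmark.log"   # hypothetical results log

    def run_once() -> float:
        """Launch one benchmark pass and return the average FPS it reports."""
        subprocess.run([GAME_EXE, "-benchmark"], check=True)
        with open(LOG_FILE) as f:
            match = re.search(r"Average FPS:\s*([\d.]+)", f.read())
        if match is None:
            raise RuntimeError("could not find an FPS figure in the log")
        return float(match.group(1))

    if __name__ == "__main__":
        results = [run_once() for _ in range(RUNS)]
        print(f"Individual runs: {results}")
        print(f"Reported result: {statistics.mean(results):.1f} FPS")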

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

Similar to Metro, pure dual core CPUs seem best avoided when pushing a high resolution with a single GPU.  The Haswell CPUs seem to be near the top due to their IPC advantage.

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

When running dual AMD GPUs, only the top AMD chips manage to hang on to the tail of the Intel pack, with the hex-core CPUs taking the top spots.  Again there is no real change moving from the 4670K to the 4770K, and even the Nehalem CPUs keep up within 4% of the top spots.

Three 7970s

Dirt 3 - Three 7970, 1440p, Max Settings

At three GPUs the 4670K seems to provide grunt equivalent to the 4770K, though more cores and more lanes seem to be the order of the day.  Moving from a hybrid CPU/PCH x8/x8 + x4 lane allocation to a pure CPU allocation (x8/x4/x4) is worth a 30 FPS rise in itself.  The Nehalem CPUs, without NF200 support, seem to be on the back foot, performing worse than Piledriver.

One 580

Dirt 3 - One 580, 1440p, Max Settings

On the NVIDIA side, one GPU performs similarly across the board in our test.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

When it comes to dual NVIDIA GPUs, the latest AMD architecture or anything above a dual-core Intel Sandy Bridge processor is enough to hit 100 FPS.

Dirt 3 Conclusion

Our big variations occurred on the AMD GPU side, where it was clear that above two GPUs moving on from Nehalem might bring a boost to frame rates.  The 4670K is still on par with the 4770K in our testing, and the i5-4430 followed a similar line most of the way but was down a peg on tri-GPU.

Comments

  • warezme - Thursday, October 3, 2013 - link

    I too invested in the venerable (speak of it only in awed, hushed whispers) i7 920, which I promptly overclocked to 3.6GHz. This little jewel has been going strong for, goodness, almost half a decade? and stable as a rock, and I notice it holding its own very well even up against the latest and greatest. This is a testament to competition and engineering, from when competition in the CPU arena existed. I have long since switched from dual GPUs to single but dual-GPU cards on a single fat x16 PCIe bus, even though my EVGA X58 SLI board supports higher. I'll ride the wave one more year and see what new gear crashes in next year. Hopefully a new Nvidia architecture that will inspire me to upgrade everything.
  • Hrel - Thursday, October 3, 2013 - link

    "our next update will focus solely on the AMD midrange."

    Please don't do that. PLEASE include at least 3 Intel CPUs for comparison. It doesn't matter if the FX-8320 does well in benchmarks if for another $40 I can get an i5-4670 that runs 50% faster. These are hypothetical numbers, obviously, but the point stands that Intel will be faster. By how much matters, especially once you factor in price and energy draw.
  • A5 - Thursday, October 3, 2013 - link

    The old numbers will still be there for comparison. The next update is just *adding* more AMD data.
  • just4U - Thursday, October 3, 2013 - link

    It's hard making sense of AMD data in comparison to Intel. As near as I can tell they're sitting at just beyond i7-920 performance these days, but with all the new features. It gets confusing when you look at the older X4/X6 stuff though, since some of that is actually faster... yet somehow only compares favorably to Intel's 9X Core2 stuff.
  • just4U - Thursday, October 3, 2013 - link

    Why 3? The entry-level i5-4430 beats out every AMD chip on the market in most instances. Adding in more simply confuses people and adds more fodder for fanboys to fight over... and I think it taxes the patience of most of us who already know what's what in the CPU arena.

    Simple rule of thumb: if you're on a budget you may want to go AMD to get all the "other" bells and whistles you're looking to buy, or if you have more to spend your starting point will be the i5-4430.
  • just4U - Thursday, October 3, 2013 - link

    Excellent article Ian, I really like the inclusion of older CPUs. It's a good basis on which to decide if it's "time" to upgrade on that front. Most of the people I know are not on the bleeding edge of technology. Many are still sitting back in 2009 with minor updates to video cards and hard drives. Anyway... well done, lots to sift through.
  • Jackie60 - Thursday, October 3, 2013 - link

    At last Anandtech is doing some meaningful second decade of 21st century testing. Well done and keep it up ffs!
  • SolMiester - Thursday, October 3, 2013 - link

    Can someone please tell me why we are using 2+ yr old GPUs?
  • A5 - Thursday, October 3, 2013 - link

    You could read the article.
  • OrphanageExplosion - Thursday, October 3, 2013 - link

    Amazing data. I do wonder whether the testing at max settings is a good idea though. The variation in performance can be extreme. Just watch the Metro 2033 benchmark play out. Does that look like the kind of experience you'd want to play?

    Perhaps more importantly though, the arrival of the next-gen consoles changes everything.

    Did you see the news that Watch Dogs is x64 only? That's just the tip of the iceberg. Developers need to go wide to make the most out of six available Jaguar cores. Job-based scheduling over up to eight cores will become the norm rather than the exception. The gap between i5 and i7 will widen. AMD FX will suddenly become a lot more interesting.

    In short order, I'd expect to see dual core CPUs and less capable quads start to look much less capable very quickly. i5 vs. i7 will see a much larger gulf in performance.

    Check out the CPU data here for the Battlefield 4 beta:

    http://gamegpu.ru/action-/-fps-/-tps/battlefield-4...

    The dual cores are being maxed out, FX-8350 is up there with the 3930K (!)
