Dirt 3

Dirt 3 is a rally racing game, the third entry in the Dirt offshoot of the Colin McRae Rally series, developed and published by Codemasters.  Dirt 3 also falls under the list of ‘games with a handy benchmark mode’.  In previous testing, Dirt 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth, everything.  The small issue with Dirt 3 is that, depending on the mode tested, the benchmark launcher is not indicative of gameplay per se, reporting numbers higher than those actually observed in-game.  On top of this, the benchmark mode includes an element of uncertainty by actually running a race rather than a predetermined sequence of events as in Metro 2033.  This in essence makes the benchmark more variable, so we take repeated runs in order to smooth this out.  Using the benchmark mode, Dirt 3 is run at 1440p with Ultra graphical settings, and results are reported as the average frame rate across four runs.
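As a minimal sketch of the reporting step, and assuming nothing about the benchmark's own output format, the smoothing is simply the mean of the recorded frame rates; the numbers below are placeholders for illustration only:

import statistics

# Placeholder results from four hypothetical passes of the benchmark at
# 1440p / Ultra; the non-deterministic race means each pass differs slightly.
fps_runs = [82.1, 79.8, 81.4, 80.7]

# The reported figure is the simple mean of the runs, which smooths out
# the run-to-run variance described above.
average_fps = statistics.mean(fps_runs)
print(f"Dirt 3 average over {len(fps_runs)} runs: {average_fps:.1f} FPS")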

One 7970

Dirt 3 - One 7970, 1440p, Max Settings

Similar to Metro, pure dual core CPUs seem best avoided when pushing a high resolution with a single GPU.  The Haswell CPUs seem to be near the top due to their IPC advantage.

Two 7970s

Dirt 3 - Two 7970s, 1440p, Max Settings

When running dual AMD GPUs, only the top AMD chips manage to hang on to the tail of the Intel pack, with the hex-core Intel CPUs taking the top spots.  Again there is no real change moving from the 4670K to the 4770K, and even the Nehalem CPUs keep within 4% of the top spots.

Three 7970s

Dirt 3 - Three 7970s, 1440p, Max Settings

At three GPUs the 4670K seems to provide grunt equivalent to the 4770K, though more cores and more lanes seem to be the order of the day.  Moving from a hybrid CPU/PCH x8/x8 + x4 lane allocation to a pure CPU allocation (x8/x4/x4) is worth a 30 FPS rise on its own.  The Nehalem CPUs, without NF200 support, seem to be on the back foot, performing worse than Piledriver.

One 580

Dirt 3 - One 580, 1440p, Max Settings

On the NVIDIA side, one GPU performs similarly across the board in our test.

Two 580s

Dirt 3 - Two 580s, 1440p, Max Settings

When it comes to dual NVIDIA GPUs, you ideally want the latest AMD architecture on the AMD side, while anything above a dual core Intel Sandy Bridge processor is enough to hit 100 FPS.

Dirt 3 Conclusion

Our big variations occurred on the AMD GPU side, where it was clear that above two GPUs, moving on from Nehalem might bring a boost to frame rates.  The 4670K is still on par with the 4770K in our testing, and the i5-4430 followed a similar line most of the way but was down a peg in the tri-GPU setup.

137 Comments

  • tackle70 - Thursday, October 3, 2013 - link

    The 8350 is with the 2600k, not the 3930k...

    So yeah, it's a very good showing for AMD, but not as good as what you indicate. Also, according to sweclockers, an overclocked i5 is still superior to an overclocked 83xx CPU, so make of that what you wish.

    I'm just glad we're seeing games starting to use more than 2-4 threads effectively.
  • Traciatim - Thursday, October 3, 2013 - link

    Much more likely is that games will just become less and less reliant on CPU power because of the terrible netbook processors in the consoles and will instead rely more and more on the GPU. The PC versions of games will just be the same game with a high res texture pack and some extra graphics bling to use up GPU cycles while your processor sits around shuffling a little data.
  • Flunk - Friday, October 4, 2013 - link

    I'm not sure AMD will benefit that much. As soon as consumer CPUs have a reason to have more cores, they'll just release a new chip with more cores. There is absolutely no reason they can't release an 8 or even 12 core desktop processor; they're already selling them for servers.
  • Flunk - Friday, October 4, 2013 - link

    Forgot to mention, Watch Dogs is probably x64 only because they want to use more than 2GB of RAM (which is the limit for the user-mode memory partition in Win32).
  • Nirvanaosc - Thursday, October 3, 2013 - link

    Looking just at the gaming results, does this mean that almost any CPU is capable of feeding the GPU at 1440p and it is always GPU limited?
  • Nirvanaosc - Thursday, October 3, 2013 - link

    I mean in single GPU config.
  • Traciatim - Thursday, October 3, 2013 - link

    That's pretty much just the games they picked. If you could reliably benchmark large-scale PC games like Planetside 2, or other popular large-scale MMOs, you'd pretty much see the exact opposite. The trouble is, it seems like no MMO makers give you reliable benchmarking tools, so you can't use them for tests like these.
  • ryccoh - Thursday, October 3, 2013 - link

    I would really like to see a CPU comparison for strategy games.
    For example, one could take a save from a far-advanced game in Civilization 5 or Total War with many AI players on the largest map and then see how the waiting time between turns varies across the different CPUs. This should be feasible, shouldn't it?
    I'm running an i5 2500K @ 4.6GHz and it just isn't cutting it for Civilization 5 on a large map once you're far into the game; it would be nice to see whether getting Hyper-Threading and more cores would be worth it.
  • glugglug - Thursday, October 3, 2013 - link

    Having waited through the ridiculous amounts of time between turns on Civ V, and having dual monitors, I put Task Manager up on the second monitor while it was running, only to see that Civ V *IS NOT MULTITHREADED. AT ALL*. Setting the CPU affinity to make it use only 1 logical core makes absolutely no performance difference at all! The only thing I can think of for why a better result would be seen on quad-core systems is that it likes having a larger L3 cache.
  • glugglug - Thursday, October 3, 2013 - link

    P.S. If my "Civ V just likes cache" theory is right, an Iris Pro laptop should be the ultimate Civ V machine.
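For anyone wanting to repeat glugglug's affinity test above without Task Manager, the same single-core pinning can be scripted. The sketch below uses psutil and a placeholder process name; it is only an illustration of the technique, not anything from the article or the game.

import psutil

# Sketch: pin a running game to a single logical core to check whether
# performance changes, i.e. whether the game really uses extra cores.
# "CivilizationV.exe" is a placeholder process name, not a verified one.
target = "CivilizationV.exe"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == target:
        proc.cpu_affinity([0])  # restrict the process to logical core 0
        print(f"Pinned PID {proc.pid} ({target}) to core 0")
        break
else:
    print(f"{target} is not running")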
