Choosing a Gaming CPU at 1440p: Adding in Haswell
by Ian Cutress on June 4, 2013 10:00 AM EST

DiRT 3
DiRT 3 is a rally racing game developed and published by Codemasters, and the third title in the DiRT offshoot of the Colin McRae Rally series. DiRT 3 also falls under the list of ‘games with a handy benchmark mode’. In previous testing, DiRT 3 has always seemed to love cores, memory, GPUs, PCIe lane bandwidth, everything. The small caveat with DiRT 3 is that, depending on the mode tested, the benchmark launcher is not entirely indicative of gameplay, reporting numbers higher than those actually observed. The benchmark mode does, however, include an element of uncertainty: it actually drives a race rather than replaying a predetermined sequence of events as Metro 2033 does. In principle this makes the benchmark more variable, so we take repeated runs in order to smooth this out. Using the benchmark mode, DiRT 3 is run at 1440p with Ultra graphical settings. Results are reported as the average frame rate across four runs.
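The repeated-run smoothing described above can be sketched in a few lines. This is a hypothetical illustration of the methodology, not our actual harness, and the FPS values below are made-up placeholders rather than measured data:

```python
# Sketch of the repeated-run methodology: average the frame rate
# across several benchmark runs to smooth out run-to-run variance.

def average_fps(runs):
    """Return the mean frame rate across repeated benchmark runs."""
    if not runs:
        raise ValueError("need at least one run")
    return sum(runs) / len(runs)

# Four runs, per the methodology; placeholder numbers only.
runs = [81.2, 83.0, 82.4, 81.8]
print(round(average_fps(runs), 1))  # reported result: 82.1
```

Averaging like this does not remove the variability of a live race, but it narrows the spread enough that CPU-to-CPU comparisons remain meaningful.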
One 7970
While the testing shows a fairly distinct split between Intel and AMD around the 82 FPS mark, all processors land within +/- 1 or 2 FPS of it, meaning that even an A8-5600K will feel like the i7-3770K. The 4770K has a small but ultimately unnoticeable advantage in gameplay.
Two 7970s
When reaching two GPUs, the Intel/AMD split is getting larger. The FX-8350 puts up a good fight against the i5-2500K and i7-2600K, but the top i7-3770K offers almost 20 FPS more and 40 more than either the X6-1100T or FX-8150.
Three 7970s
Moving up to three GPUs, DiRT 3 jumps on the PCIe bandwagon, enjoying bandwidth and cores as much as possible. Despite this, the gap to the best AMD processor is growing – almost 70 FPS between the FX-8350 and the i7-3770K. The 4770K is slightly ahead of the 3770K at x8/x4/x4, suggesting a small IPC advantage.
Four 7970s
At four GPUs, bandwidth wins out, and the PLX effect on the UP7 seems to cause a small dip compared to the native lane allocation on the RIVE (there could also be some influence due to 6 cores over 4).
One 580
Similar to the single 7970 setup, one GTX 580 shows a noticeable split between AMD and Intel. Despite the split, all the CPUs perform within 1.3 FPS of each other, so there is no big difference in practice.
Two 580s
Moving to dual GTX 580s, the split gets bigger and processors like the i3-3225 start to lag behind. The difference between the best AMD and best Intel processors is only 2 FPS, though – nothing to write home about.
DiRT 3 conclusion
Much like Metro 2033, DiRT 3 has a GPU barrier, and until you hit that mark the choice of CPU makes no real difference at all. In this case, at two-way 7970s, choosing a quad-core Intel processor does the business over the FX-8350 by a noticeable gap that continues to grow as more GPUs are added (assuming you want more than 120 FPS).
116 Comments
TheJian - Thursday, June 6, 2013 - link
http://www.tomshardware.com/reviews/a10-6700-a10-6...
Check the Toms A10-6800 review. With only a 6670D, the i3-3220 STOMPS the A10-6800K with the same 6670 Radeon card at 1080p in F1 2012. 68 FPS to 40 FPS is a lot, right? Both chips are roughly $145. Skyrim shows the 6800K well, but you need 2133 memory to do it. But faster Intel CPUs will leave this in the dust with a better GPU anyway.
http://www.guru3d.com/articles_pages/amd_a10_6800k...
You can say 100 FPS is a lot in Far Cry 2 (it is), but you can see how a faster CPU is NOT limiting the GTX 580 here, as all resolutions run faster. The i7-4770 allows the GTX 580 to really stretch its legs to 183 FPS, dropping to 132 FPS at 1920x1200. The FX-8350, however, is pegged at 104 for all four resolutions. Even a GTX 580 is held back, never mind what you'd be doing to a 7970 GHz etc. All the AMD CPUs here are limiting the GTX 580 while the Intel chips run up the FPS. Sure, there are GPU-limited games, but I'd rather be using the chip that runs away from slower models when this isn't the case. From what all the data shows amongst various sites, you'll be caught with your pants down a lot more than AnandTech is suggesting here. Hopefully that's enough games for everyone to see it's far more than Civ5, even with different cards affecting things. If both GPU sides double their GPU cores, we could have a real CPU shootout in many things at 1440p (and of course below this they will all spread even more widely than I've shown with many links/games).
roedtogsvart - Tuesday, June 4, 2013 - link
Hey Ian, how come no Nehalem or Lynnfield data points? There are a lot of us on these platforms who are looking at this data to weigh vs. the cost of a Haswell upgrade. With the ol' 775 geezers represented, it was disappointing not to see 1366 or 1156. Superb work overall however!
roedtogsvart - Tuesday, June 4, 2013 - link
Other than the 6-core Xeon, I mean...
A5 - Tuesday, June 4, 2013 - link
Hasn't had time to test it yet, and hardware availability. He covers that point pretty well in this article and the first one.
chizow - Tuesday, June 4, 2013 - link
Yeah I understand and agree, would definitely like to see some X58 and Kepler results.
ThomasS31 - Tuesday, June 4, 2013 - link
Seems 1440p is too demanding on the GPU side to show the real gaming difference between these CPUs. Is 1440p that common in gaming these days?
I have the impression (from my CPU change experiences) that we would see different differences at 1080p for example.
A5 - Tuesday, June 4, 2013 - link
Read the first page?
ThomasS31 - Tuesday, June 4, 2013 - link
Sure. Though still, for single GPU it would be a wiser choice to be "realistic" and do 1080p, which is more common (the single-monitor, average-Joe-gamer type of scenario). And go 1440p (or higher) for multi-GPU and enthusiast setups.
The purpose of the article is choosing a CPU, and that needs to show some sort of scaling in near-real-life scenarios; but if the GPU bottleneck kicks in from the start, it will not be possible to evaluate the CPU part of the performance equation in games.
Or maybe it would be good to show some sort of combined score from all the tests, so that Civ V and the other games show some differentiation in the recommendation as well.
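A combined score of the sort suggested here is often computed as a geometric mean of per-game frame rates, which keeps one very high-FPS title from dominating the total. A minimal sketch, with made-up placeholder numbers rather than any results from the article:

```python
import math

def combined_score(fps_results):
    """Geometric mean of per-game FPS results.

    Unlike an arithmetic mean, the geometric mean is not dominated
    by a single title that happens to run at very high frame rates.
    """
    if not fps_results:
        raise ValueError("need at least one result")
    return math.exp(sum(math.log(f) for f in fps_results) / len(fps_results))

# Placeholder per-game results (FPS) for one hypothetical CPU.
scores = [82.0, 45.5, 120.3, 61.7]
print(round(combined_score(scores), 1))
```

Comparing this single number across CPUs would surface games like Civ V, where CPU choice matters, without letting GPU-bound titles flatten the ranking entirely.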
core4kansan - Tuesday, June 4, 2013 - link
The G2020 and G860 might well be the best bang-for-buck CPUs, especially if you tested at 1080p, where most budget-conscious gamers would be anyway.
Termie - Tuesday, June 4, 2013 - link
Ian,
A couple of thoughts for you on methodology:
(1) While I understand the issue of MCT is a tricky one, I think you'd be better off just shutting it off, or if you test with it, noting the actual core speeds your CPUs are operating at, which should be 200MHz above nominal Turbo.
(2) I don't understand the reference to an i3-3225+, as MCT should not have any effect on a dual-core chip, since it has no Turbo mode.
(3) I understand the benefit of using time demos for large-scale testing like what you're doing, but I do think you should use at least one modern game. I'd suggest replacing Metro2033, which has incredibly low fps results due to a lack of engine optimization, with Tomb Raider, which has a very simple, quick, and consistent built-in benchmark.
Thanks for all your hard work to add to the body of knowledge on CPUs and gaming.
Termie