Windows 7 Gaming Performance

Our Bench suite is getting a little long in the tooth, so I added a few more gaming tests under Windows 7 with a new group of processors. We'll be adding some of these tests to Bench in the future, but the number of data points will obviously be small as we build up the results.

Batman is an Unreal Engine 3 game and a fairly well-received one at that. Performance is measured using the built-in benchmark at the highest image quality settings without AA enabled.

Gaming performance is competitive, but we don't see any huge improvements under Batman.

Dragon Age: Origins is another very well-received game. The third-person RPG gives our CPUs a different sort of workload to enjoy:

Dragon Age, on the other hand, shows an 11.6% gain vs. the Core i5 760 and equal performance to the Core i7 880. Given that the i5 2400 is slated to be cheaper than the i5 760, I can't complain.

World of Warcraft needs no introduction. An absurd number of people play it, so we're here to benchmark it. Our test favors repeatability over real-world frame rates, so our results here will be higher than what you'd see in the real world under heavy server load. But what our results will tell you is which CPU is the best to get for playing WoW:

Performance in our WoW test is top-notch. The i5 2400 is now the fastest CPU we've ever run through our WoW benchmark, the Core i7 980X included.

We've been working on putting together StarCraft II performance numbers, so here's a quick teaser:

A 12% advantage over the Core i7 880 and an 18% improvement over the Core i5 760.
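
For reference, the relative figures above (and throughout this section) are simple ratios of average frame rates. Here's a minimal sketch in C of that arithmetic, using placeholder FPS values purely for illustration rather than our measured results:

    /* Sketch of how a relative-performance figure is derived.
     * The FPS values below are placeholders, not measured results. */
    #include <stdio.h>

    int main(void) {
        double fps_new = 100.0; /* hypothetical average FPS, faster CPU */
        double fps_old = 89.3;  /* hypothetical average FPS, slower CPU */

        /* Advantage of the faster CPU relative to the slower one. */
        double advantage_pct = (fps_new / fps_old - 1.0) * 100.0;
        printf("Advantage: %.1f%%\n", advantage_pct); /* ~12.0 here */
        return 0;
    }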

Comments

  • DanNeely - Friday, August 27, 2010

    Maybe, but IIRC Apple's biggest issue with the Clarkdale platform on smaller laptops was wanting to maintain CUDA support across their entire platform without adding a third chip to the board, not general GPU performance. Unless the Intel/nVidia lawsuit concludes with nVidia getting a DMI license or Intel getting a CUDA license, this isn't going to change.
  • Pinski - Saturday, August 28, 2010

    I don't think it has anything to do with CUDA. I mean, they sell Mac Pros with AMD/ATI cards in them, and those don't support CUDA. It's more about OpenCL and high enough performance. However, just looking at these new performance numbers, I'm willing to say that it'll be the next chip for the MBP 13" easily.
  • Pinski - Saturday, August 28, 2010

    Well, wait, never mind. Apparently it doesn't support OpenCL, which basically puts it out of the picture for Apple to use.
  • starfalcon - Saturday, August 28, 2010

    Hmm, they really want all of their systems to have OpenCL?
    I don't have OpenCL and I don't care at all; I have CUDA but have only used it once.
    The 320M doesn't even have OpenCL, does it?
    Seems like it would be OK for the less expensive ones to have Intel graphics and the higher-end ones to have CUDA, OpenCL, and better gaming performance, if someone cares about those.
    They'll keep on upgrading the performance and features of Intel graphics though, who knows.
  • Veerappan - Thursday, September 2, 2010

    No, just ... no.

    Nvidia implements an OpenCL run-time by translating OpenCL API calls to CUDA calls. If your card supports CUDA, it supports OpenCL.

    The 320M supports OpenCL, and every Apple laptop/desktop that has shipped in the last few years has as well.

    A large portion of the motivation for OS X 10.6 (Snow Leopard) was introducing OpenCL support, along with increasing general performance.

    There is a large amount of speculation that OS X 10.7 will take advantage of the OpenCL groundwork that OS X 10.6 has put in place.

    Also, if you have a GPU that doesn't support OpenCL (older Intel Macs with Intel IGP graphics), Apple has written a CPU-based OpenCL run-time. It'll be slower than the GPU, but the programs will still run. That being said, I highly doubt that Apple will be willing to accept such a performance deficit in a brand-new machine compared to prior hardware.
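
    For anyone who wants to see what their own machine exposes, here's a minimal sketch in C of that kind of check. It just enumerates the OpenCL devices a system reports and whether each is a GPU or only a CPU fallback; it assumes the standard OpenCL 1.x headers (on OS X, link with -framework OpenCL) and keeps error handling to the bare minimum.

        /* Minimal sketch: list the OpenCL devices a system exposes, to see
         * whether a GPU (or only a CPU runtime) is available.
         * Illustrative only; error handling is kept to the bare minimum. */
        #include <stdio.h>
        #ifdef __APPLE__
        #include <OpenCL/opencl.h>
        #else
        #include <CL/cl.h>
        #endif

        int main(void) {
            cl_platform_id platforms[4];
            cl_uint num_platforms = 0;
            if (clGetPlatformIDs(4, platforms, &num_platforms) != CL_SUCCESS ||
                num_platforms == 0) {
                printf("No OpenCL runtime found\n");
                return 1;
            }
            if (num_platforms > 4) num_platforms = 4;
            for (cl_uint p = 0; p < num_platforms; ++p) {
                cl_device_id devices[8];
                cl_uint num_devices = 0;
                /* Ask for every device type: GPUs, CPUs, accelerators. */
                if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8,
                                   devices, &num_devices) != CL_SUCCESS)
                    continue;
                if (num_devices > 8) num_devices = 8;
                for (cl_uint d = 0; d < num_devices; ++d) {
                    char name[256] = "";
                    cl_device_type type = 0;
                    clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                    sizeof(name), name, NULL);
                    clGetDeviceInfo(devices[d], CL_DEVICE_TYPE,
                                    sizeof(type), &type, NULL);
                    printf("%s (%s)\n", name,
                           (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other");
                }
            }
            return 0;
        }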
  • Penti - Saturday, August 28, 2010

    It has more to do with nVidia's VP3 PureVideo engine, which they rely on for video acceleration. It's as simple as that.

    Which is why they only find their place in the notebooks. It's also a low-end GPU with enough performance to, say, run a Source game at low res. And they have more complete drivers for OS X.

    CUDA is a third-party add-on. OpenCL isn't.
  • burek - Friday, August 27, 2010

    Will there be a "cheap" (~$300) six-core LGA-2011 replacement for the i7 920/930, or will Intel limit the 6/8-core parts to the high-end/extreme price segment ($500+)?
  • DJMiggy - Friday, August 27, 2010

    Yeah, I doubt that will happen. It would be like trying to SLI/Crossfire an Nvidia card with an ATI discrete card. You would need a special chip like the hyrda one.
  • DJMiggy - Friday, August 27, 2010

    Hydra even. Hydra Lucid chip.
  • Touche - Friday, August 27, 2010

    Questionable overclocking is bad enough, but together with...

    "There’s no nice way to put this: Sandy Bridge marks the third new socket Intel will have introduced since 2008."

    "The CPU and socket are not compatible with existing motherboards or CPUs. That’s right, if you want to buy Sandy Bridge you’ll need a new motherboard."

    "In the second half of 2011 Intel will replace LGA-1366 with LGA-2011."

    ...it is just terrible!

    I'll definitely buy AMD Bulldozer, even if it ends up a bit slower. At least they have some respect for their customers and the ability to think ahead when designing sockets (actually, Intel probably has it too, but just likes to milk us on chipset purchases as well). And I am no fanboy; four of my seven PCs are Intel-based (two of those four were my latest computer purchases).
