Discrete GPU Gaming Performance

Gaming performance with a discrete GPU improves in line with the rest of what we've seen thus far from Ivy Bridge. It's definitely a step ahead of Sandy Bridge, but not enough to warrant an upgrade in most cases. If you haven't already made the jump to Sandy Bridge, however, the upgrade will serve you well.

Dragon Age Origins

DAO has been a staple of our CPU gaming benchmarks for some time now. The third/first-person RPG is well threaded and is influenced by both CPU and GPU performance. Our benchmark is a FRAPS runthrough of our character walking through a castle.

Dragon Age Origins—1680 x 1050—Max Settings (no AA/Vsync)

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran it at Ultra quality settings at 1680 x 1050:

Dawn of War II—1680 x 1050—Ultra Settings

World of Warcraft

Our WoW test is run at High quality settings on a lightly populated server, in an area where no other players are present, to produce repeatable results. We ran at 1680 x 1050.

World of Warcraft

Starcraft 2

We have two Starcraft II benchmarks: a GPU test and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU-bound activity in the game. Our CPU test involves a massive battle between six armies in the center of the map, stressing the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by the CPU and GPU. We'll get to the GPU test shortly, but our CPU test results are below. The benchmark runs at 1024 x 768 at Medium quality settings with all CPU-influenced features set to Ultra.

Starcraft 2

Metro 2033

We're using the Metro 2033 benchmark that ships with the game. We run the benchmark at 1024 x 768 for a more CPU-bound test, as well as at 1920 x 1200 to show what happens in a more GPU-bound scenario.

Metro 2033 Frontline Benchmark—1024 x 768—DX11 High Quality

Metro 2033 Frontline Benchmark—1920 x 1200—DX11 High Quality

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of both CPU-bound and GPU-bound performance. First, the CPU-bound settings:

DiRT 3—Aspen Benchmark—1024 x 768 Low Quality

DiRT 3—Aspen Benchmark—1920 x 1200 High Quality

Crysis: Warhead

Crysis Warhead Assault Benchmark—1680 x 1050 Mainstream DX10 64-bit

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: the average frame rate for the entire test, and a no-render score that looks only at CPU performance. We're using the no-render score here to isolate CPU performance:

Civilization V—1680 x 1050—DX11 High Quality

Comments

  • wingless - Monday, April 23, 2012 - link

    I'll keep my 2600K

    .....just kidding
  • formulav8 - Monday, April 23, 2012 - link

    I hope you give AMD even more praise when Trinity is released, Anand. IMO you way overblew how great Intel's IGP stuff is. It's their 4th gen that can't even beat AMD's first gen.

    Just my opinion :p
  • Zstream - Monday, April 23, 2012 - link

    I agree..
  • dananski - Monday, April 23, 2012 - link

    As much as I like the idea of decent Skyrim framerates on every laptop, and even though I find the HD4000 graphics an interesting read, I couldn't care less about it in my desktop. Gamers will not put up with integrated graphics - even this good - unless they're on a tight budget, in which case they'll just get Llano anyway, or wait for Trinity. As for IVB, why can't we have a Pentium III sized option without IGP, or get 6 cores and no IGP?
  • Kjella - Tuesday, April 24, 2012 - link

    Strategy, they're using their lead in CPUs to bundle it with a GPU whether you want it or not. When you take your gamer card out of your gamer machine it'll still have an Intel IGP for all your other uses (or for your family or the second-hand market or whatever), that's one sale they "stole" from AMD/nVidia's low end. Having a separate graphics card is becoming a niche market for gamers. That's better for Intel than lowering the expectation that a "premium" CPU costs $300, if you bring the price down it's always much harder to raise it again...
  • Samus - Tuesday, April 24, 2012 - link

    As amazing as this CPU is, and as much as I'd love it (considering I play BF3 and need a GTX 560+ anyway), I have to agree the GPU improvement is pretty disappointing...

    After all that work, Intel still can't even come close to AMD's integrated graphics. It's 75% of AMD's performance at best.
  • Cogman - Thursday, May 3, 2012 - link

    There is actually a good reason for both AMD and Intel to keep a GPU on their CPUs no matter what. That reason is OpenCV. This move makes the assumption that OpenCV or programming languages like it will eventually become mainstream. With a GPU coupled to every CPU, it saves developers from writing two sets of code to deal with different platforms.
  • froggr - Saturday, May 12, 2012 - link

    OpenCV is Open Computer Vision and runs either way. I think you're talking about OpenCL (Open Compute Language), and even that runs fine without a GPU. OpenCL can use all cores, CPU + GPU, and does not require separate code bases (see the sketch after this thread).

    OpenCL runs faster with a GPU because it's better parallelized.
  • frozentundra123456 - Monday, April 23, 2012 - link

    Maybe we could actually see some hard numbers before heaping so much praise on Trinity??

    I will be convinced about the claims of 50% IGP improvements when I see them, and also they need to make a lot of improvements to Bulldozer, especially in power consumption, before it is a competitive CPU. I hope it turns out to be all the AMD fans are claiming, but we will see.
  • SpyCrab - Tuesday, April 24, 2012 - link

    Sure, Llano gives good gaming performance. But it's pretty much at Athlon II X4 CPU performance.
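
A quick illustration of froggr's point about a single OpenCL code base: the sketch below is a minimal, hypothetical host-side example (error handling mostly omitted) that uses the standard OpenCL API to ask for a GPU device and falls back to the CPU if none is available; the same kernels would then run on whichever device was found.

    /* Minimal OpenCL device-selection sketch: one code base, GPU or CPU. */
    #include <CL/cl.h>
    #include <stdio.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id device;
        cl_int err;

        /* Grab the first available OpenCL platform. */
        clGetPlatformIDs(1, &platform, NULL);

        /* Prefer a GPU device... */
        err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
        if (err != CL_SUCCESS) {
            /* ...but the same code (and kernels) can target the CPU instead. */
            err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);
        }

        if (err == CL_SUCCESS) {
            char name[256];
            clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("Running on: %s\n", name);
        }
        return 0;
    }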
