Discrete GPU Gaming Performance

Gaming performance with a discrete GPU improves in line with the rest of what we've seen thus far from Ivy Bridge. It's definitely a step ahead of Sandy Bridge, but not enough to warrant an upgrade in most cases. If you haven't already made the jump to Sandy Bridge, however, Ivy Bridge will serve you well.

Dragon Age Origins

DAO has been a staple of our CPU gaming benchmarks for some time now. The third-person RPG is well threaded and is influenced by both CPU and GPU performance. Our benchmark is a FRAPS runthrough of our character through a castle.

Dragon Age Origins - 1680 x 1050 - Max Settings (no AA/Vsync)
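As an aside, turning a FRAPS runthrough like this into a single average frame rate is just a matter of post-processing the frametimes log. The short Python sketch below shows one way to do it; the file name and the assumption that the second CSV column holds cumulative milliseconds are ours, not part of our actual test harness.

```python
# Minimal sketch: reduce a FRAPS "frametimes" CSV (Frame, Time (ms)) to an
# average FPS figure. Assumes the second column is cumulative milliseconds
# from the start of the capture; the file name below is hypothetical.
import csv

def average_fps(frametimes_csv_path):
    """Return average FPS from a FRAPS frametimes log."""
    timestamps_ms = []
    with open(frametimes_csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        for row in reader:
            if len(row) >= 2:
                timestamps_ms.append(float(row[1]))
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    frames = len(timestamps_ms) - 1  # count intervals, not timestamps
    return frames / elapsed_s

# Example: average_fps("fraps_dao_castle_frametimes.csv")
```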

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran it at Ultra quality settings at 1680 x 1050:

Dawn of War II - 1680 x 1050 - Ultra Settings

World of Warcraft

Our WoW test is run at High quality settings on a lightly populated server, in an area with no other players present, to produce repeatable results. We ran at 1680 x 1050.

World of Warcraft

Starcraft 2

We have two Starcraft II benchmarks: a GPU test and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU bound activity in the game. Our CPU test involves a massive battle of six armies in the center of the map, stressing the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by the CPU and GPU. We'll get to the GPU test shortly, but our CPU test results are below. The benchmark runs at 1024 x 768 at Medium quality settings with all CPU influenced features set to Ultra.

Starcraft 2

Metro 2033

We're using the Metro 2033 benchmark that ships with the game. We run the benchmark at 1024 x 768 for a more CPU bound test as well as 1920 x 1200 to show what happens in a more GPU bound scenario.

Metro 2033 Frontline Benchmark - 1024 x 768 - DX11 High Quality

Metro 2033 Frontline Benchmark - 1920 x 1200 - DX11 High Quality
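The point of running the same benchmark at two resolutions is that the comparison itself tells you where the bottleneck is: if the frame rate barely moves going from 1024 x 768 to 1920 x 1200, the CPU is the limiter; if it falls sharply, the GPU is. The sketch below is only our own rough heuristic for making that call from a pair of results; the tolerance and the example numbers are arbitrary placeholders.

```python
# Rough heuristic (our own illustration, not the article's methodology) for
# classifying a two-resolution result as CPU bound or GPU bound.
def boundedness(fps_low_res, fps_high_res, tolerance=0.10):
    """Classify a result from FPS at a low and a high resolution."""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    # If raising the resolution barely changes the frame rate, the GPU had
    # headroom to spare and the CPU was the limiting factor.
    return "CPU bound" if drop <= tolerance else "GPU bound"

print(boundedness(fps_low_res=120.0, fps_high_res=115.0))  # -> CPU bound
print(boundedness(fps_low_res=120.0, fps_high_res=60.0))   # -> GPU bound
```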

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of CPU bound and GPU bound performance. First, the CPU bound settings:

DiRT 3 - Aspen Benchmark - 1024 x 768 Low Quality

DiRT 3 - Aspen Benchmark - 1920 x 1200 High Quality

Crysis: Warhead

Crysis Warhead Assault Benchmark - 1680 x 1050 Mainstream DX10 64-bit

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: average frame rate for the entire test as well as a no-render score that only looks at CPU performance. We're looking at the no-render score here to isolate CPU performance alone:

Civilization V - 1680 x 1050 - DX11 High Quality
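A quick toy model helps explain why the no-render score isolates the CPU: with rendering enabled, each frame costs roughly the longer of its CPU and GPU portions, so a fast GPU can mask CPU differences, while the no-render pass leaves only the CPU-side work. The numbers and the max() approximation below are purely our own illustration, not anything taken from the game's benchmark code.

```python
# Toy model of rendered vs. no-render frame rates, assuming a frame costs
# roughly max(cpu_ms, gpu_ms) when rendering and only cpu_ms without it.
def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

cpu_ms, gpu_ms = 12.0, 20.0                 # hypothetical per-frame costs
rendered_fps  = fps(max(cpu_ms, gpu_ms))    # ~50 FPS, dominated by the GPU
no_render_fps = fps(cpu_ms)                 # ~83 FPS, reflects the CPU alone

print(f"rendered: {rendered_fps:.0f} FPS, no render: {no_render_fps:.0f} FPS")
```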

Comments

  • fic2 - Wednesday, March 7, 2012 - link

    I totally agree. Intel is again going to hobble the lower end with the HD2500 graphics so that people who don't need the i7 CPU have to buy a discrete video card. I really wish review sites would hammer Intel for this and pressure them to include the better integrated graphics. It's not like the HD4000 is so good that people will buy an i7 just for the graphics.
  • Jamahl - Thursday, March 8, 2012 - link

    HD4000 takes up more die space, which means it costs them more. That's all Intel cares about; they don't give a shit about what people need at the lower end.

    They were forced to start using HD3000 graphics in all their lower end chips because of Llano. The 2105 basically replaced the 2100 at the same money so they would be less embarrassed by Llano. That's what competition does.
  • Death666Angel - Wednesday, March 7, 2012 - link

    I like this tick. The CPU performance goes up by as much as I expected and the iGPU side goes up significantly.

    If I had the spare change to throw around, I'd upgrade from my 3.8GHz i7 860. But as it is now, an upgraded CPU wouldn't do much for me in terms of gaming performance and I rarely do CPU intensive tasks these days. The chipset and native USB 3.0 are nice, but I'll wait for Haswell next year and get a good GPU or two instead.
  • tiro_uspsss - Wednesday, March 7, 2012 - link

    I'm a little confused :/

    the 3770K consistently beat the 3820 (by a very small margin)

    *wait*

    oh.. I found out why.. the specs of the 3820 as listed in the 'line up' are incorrect - the 3820 'only' turbos to 3.8 not 3.9.. is this why the 3770K did a little better?

    aside from the small extra turbo that the 3770K has, the 3820 has more L3, more memory channels & a higher core clock (that's if the core clock listed for the 3770K is correct)

    soooo.. the extra turbo.. is that why the 3770K is slightly better all-round?
  • Death666Angel - Wednesday, March 7, 2012 - link

    You know that they are different CPU generations, right? One is SNB-E on a 32nm process node and the other is IVB on a 22nm node. The review said that IVB has a 5-15% higher IPC.
  • tiro_uspsss - Wednesday, March 7, 2012 - link

    *slaps own forehead* DUH! thats right! I forgot! :D I knew I was forgetting something! :P :D thanks! makes sense now! :)
  • BSMonitor - Wednesday, March 7, 2012 - link

    The number scheme is misleading.

    3820 and up are SNB-E.

    3770K is Ivy Bridge.

    An IVB core will perform better than a SNB core clocked at the same speed.

    New architecture wins over cache, memory channels, clock speed.
  • Shadowmaster625 - Wednesday, March 7, 2012 - link

    "Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective. The 20 - 40% increase on the graphics side is what blurs the line between a conventional tick and what we have with Ivy Bridge."

    "Being able to play brand new titles at reasonable frame rates as realistic resolutions is a bar that Intel has safely met."
  • hansmuff - Wednesday, March 7, 2012 - link

    The review is good, I really like that you added the compilation benchmark for chromium -- good job!

    I'm a little disappointed in the lack of overclocking information. What is the point of reviewing the K edition of this chip without even doing a simple overclock with a comparison to 2600K in terms of power draw and heat?
  • Silenus - Wednesday, March 7, 2012 - link

    That is because this is NOT a review...it's just a preview. I'm sure they will do some overclocking testing in the full review later. Those results would be more meaningful then anyway as this is still early hardware/drivers.
