Discrete GPU Gaming Performance

Gaming performance with a discrete GPU improves in line with the rest of what we've seen thus far from Ivy Bridge. It's definitely a step ahead of Sandy Bridge, but not enough to warrant an upgrade in most cases. If you haven't already made the jump to Sandy Bridge, however, Ivy Bridge will serve you well.

Dragon Age Origins

DAO has been a staple of our CPU gaming benchmarks for some time now. The third-person RPG is well threaded and is influenced by both CPU and GPU performance. Our benchmark is a FRAPS runthrough of our character through a castle.

Dragon Age Origins - 1680 x 1050 - Max Settings (no AA/Vsync)
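Since the Dragon Age numbers come from a FRAPS frametime capture, here's a minimal sketch of how an average frame rate can be derived from such a log. It assumes a CSV with one cumulative timestamp (in milliseconds) per frame; the file name and column layout are illustrative rather than the exact format of our capture.

    import csv

    def average_fps(frametimes_csv):
        # Read cumulative per-frame timestamps (ms since capture start).
        timestamps_ms = []
        with open(frametimes_csv, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            for row in reader:
                timestamps_ms.append(float(row[1]))
        # Average FPS = number of frame intervals / elapsed seconds.
        elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
        return (len(timestamps_ms) - 1) / elapsed_s

    # Hypothetical log name; not our actual capture file.
    print("Average FPS: %.1f" % average_fps("dao_castle_run.csv"))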

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran it at Ultra quality settings at 1680 x 1050:

Dawn of War II - 1680 x 1050 - Ultra Settings

World of Warcraft

Our WoW test is run at High quality settings on a lightly populated server in an area where no other players are present to produce repeatable results. We ran at 1680 x 1050.

World of Warcraft

Starcraft 2

We have two Starcraft II benchmarks: a GPU test and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU-bound activity in the game. Our CPU test involves a massive battle of six armies in the center of the map, stressing the CPU more than the GPU. At these low quality settings, however, both benchmarks are influenced by CPU and GPU. We'll get to the GPU test shortly, but our CPU test results are below. The benchmark runs at 1024 x 768 at Medium quality settings with all CPU-influenced features set to Ultra.

Starcraft 2

Metro 2033

We're using the Metro 2033 benchmark that ships with the game. We run the benchmark at 1024 x 768 for a more CPU-bound test as well as at 1920 x 1200 to show what happens in a more GPU-bound scenario.

Metro 2033 Frontline Benchmark - 1024 x 768 - DX11 High Quality

Metro 2033 Frontline Benchmark - 1920 x 1200 - DX11 High Quality
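The point of running both resolutions is that a CPU-limited result barely moves as the resolution climbs, while a GPU-limited one drops sharply. A rough sketch of that comparison, with made-up numbers purely for illustration:

    def bottleneck(fps_low_res, fps_high_res, threshold=0.85):
        # If most of the low-resolution frame rate survives at the higher,
        # more GPU-demanding resolution, the CPU is the limiting factor.
        retained = fps_high_res / fps_low_res
        return "CPU-bound" if retained >= threshold else "GPU-bound"

    # Hypothetical result pairs, not our measured data.
    print(bottleneck(120.0, 112.0))  # barely changes with resolution -> CPU-bound
    print(bottleneck(130.0, 55.0))   # collapses at 1920 x 1200 -> GPU-bound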

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of CPU-bound and GPU-bound performance. First, the CPU-bound settings:

DiRT 3 - Aspen Benchmark - 1024 x 768 Low Quality

DiRT 3 - Aspen Benchmark - 1920 x 1200 High Quality

Crysis: Warhead

Crysis Warhead Assault Benchmark - 1680 x 1050 Mainstream DX10 64-bit

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: an average frame rate for the entire test as well as a no-render score that looks at CPU performance only. We're using the no-render score here to isolate CPU performance alone:

Civilization V - 1680 x 1050 - DX11 High Quality

Comments


  • rpsgc - Wednesday, March 7, 2012

    All that revenue, all that profit and yet, they STILL can't beat AMD in integrated graphics.

    I think that qualifies as a fail.

    Thanks for (kind of) proving his point?
  • dagamer34 - Thursday, March 8, 2012

    They don't really care to. The point of a business is to make money, not have the best products. The latter only gets solved when AMD gets serious in competing with Intel on power/performance again.
  • Operandi - Tuesday, March 6, 2012

    The internet called, "stop wasting my bits".
  • StevoLincolnite - Tuesday, March 6, 2012

    You know what? All you do is bash AMD.
    If you think AMD sucks THAT much and its engineers and everything else are incredibly bad...
    Then I have a challenge.

    Go build your own Processor or GTFO with the bashing.
  • bennyg - Wednesday, March 7, 2012

    Do not feed the troll.
  • StevoLincolnite - Tuesday, March 6, 2012

    Except... Intel's IGP drivers on Windows are bad already. They are a lot worse on the Mac.
    Historically Intel has never supported its IGPs to *any* great length and even had to throw up a compatibility list for its IGPs so you know what games they could potentially run.

    Here is a good example:
    http://www.intel.com/support/graphics/intelhdgraph...

    Heck I recall it taking Intel a good 12 months just to enable TnL and Shader Model 3 on the x3100 chips.

    Historically the support has just not been there.
  • earthrace57 - Tuesday, March 6, 2012

    AMD's CPUs are going to die...sucks to be an AMD fanboy. However, whatever they are doing with their dedicated GPUs, they are doing something right...if they can manage to pull their act together on the driver side, I think AMD would live as a GPU company...
  • earthrace57 - Tuesday, March 6, 2012

    I'm sorry, but Llano APUs will stay on top for quite a while; Intel is still at heart a CPU, Llano is part GPU...if AMD can get drivers the quality of nVidia's, they will most likely do extremely well on that front.
  • zshift - Tuesday, March 6, 2012

    I really enjoyed the added compilation benchmark. This site has the most comprehensive collection of benchmarks that I've seen, it's a one-stop shop for most of my reviews. Keep up the great work!
  • Jamahl - Tuesday, March 6, 2012

    Would be great to see power benchmarks of the IGP, especially vs Llano and the HD 3000. Let's see if the graphics improvements have come at the price of yet more power consumption or if Intel has managed to keep that down.
