Discrete GPU Gaming Performance

Gaming performance with a discrete GPU does improve in line with the rest of what we've seen thus far from Ivy Bridge. It's definitely a step ahead of Sandy Bridge, but not by enough to warrant an upgrade in most cases. If you haven't already made the jump to Sandy Bridge, however, the upgrade will serve you well.

Dragon Age Origins

DAO has been a staple of our CPU gaming benchmarks for some time now. The third/first person RPG is well threaded and is influenced by both CPU and GPU performance. Our benchmark is a FRAPS run of our character walking through a castle.

Dragon Age Origins - 1680 x 1050 - Max Settings (no AA/Vsync)
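
For readers curious how a FRAPS run turns into a single number: the frametimes log is just a list of per-frame timestamps, and the average frame rate falls out of the elapsed time. Below is a minimal sketch, assuming a FRAPS-style "Frame, Time (ms)" CSV with cumulative timestamps in milliseconds; the filename is hypothetical and this is not necessarily how our charts are generated.

```python
# Minimal sketch: average FPS from a FRAPS-style frametimes log.
# Assumes a CSV with "Frame, Time (ms)" columns, where Time is the
# cumulative timestamp of each frame in milliseconds.
import csv

def average_fps(path):
    timestamps_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps_ms.append(float(row[1]))
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    frames = len(timestamps_ms) - 1  # intervals between timestamps
    return frames / elapsed_s

if __name__ == "__main__":
    # hypothetical filename for illustration only
    print(f"{average_fps('dao frametimes.csv'):.1f} fps")
```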

Dawn of War II

Dawn of War II is an RTS title that ships with a built-in performance test. I ran it at Ultra quality settings at 1680 x 1050:

Dawn of War II - 1680 x 1050 - Ultra Settings

World of Warcraft

Our WoW test is run at High quality settings on a lightly populated server in an area where no other players are present to produce repeatable results. We ran at 1680 x 1050.

World of Warcraft

Starcraft 2

We have two Starcraft II benchmarks: a GPU test and a CPU test. The GPU test is mostly a navigate-around-the-map test, as scrolling and panning around tends to be the most GPU bound activity in the game. Our CPU test involves a massive battle between six armies in the center of the map, stressing the CPU more than the GPU. At these reduced quality settings, however, both benchmarks are influenced by CPU and GPU. We'll get to the GPU test shortly, but our CPU test results are below. The benchmark runs at 1024 x 768 at Medium Quality settings with all CPU-influenced features set to Ultra.

Starcraft 2

Metro 2033

We're using the Metro 2033 benchmark that ships with the game. We run the benchmark at 1024 x 768 for a more CPU bound test as well as 1920 x 1200 to show what happens in a more GPU bound scenario.

Metro 2033 Frontline Benchmark - 1024 x 768 - DX11 High Quality

Metro 2033 Frontline Benchmark - 1920 x 1200 - DX11 High Quality

DiRT 3

We ran two DiRT 3 benchmarks to get an idea of both CPU bound and GPU bound performance. First, the CPU bound settings:

DiRT 3 - Aspen Benchmark - 1024 x 768 Low Quality

DiRT 3 - Aspen Benchmark - 1920 x 1200 High Quality

Crysis: Warhead

Crysis Warhead Assault Benchmark - 1680 x 1050 Mainstream DX10 64-bit

Civilization V

Civ V's lateGameView benchmark presents us with two separate scores: an average frame rate for the entire test and a no-render score that looks only at CPU performance. We're using the no-render score here to isolate CPU performance:

Civilization V - 1680 x 1050 - DX11 High Quality
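
Conceptually, a no-render score just times the same workload with the draw step skipped, so what remains is the simulation cost on the CPU. Below is a toy sketch of that idea; run_turn() and draw() are hypothetical placeholders and not part of Civilization V's actual benchmark.

```python
# Illustrative sketch of the "no render" idea: time the same workload with the
# draw step stubbed out so only the simulation (CPU) cost is measured.
import time

def run_benchmark(frames, render=True):
    def run_turn():      # placeholder CPU-side game logic
        sum(i * i for i in range(50_000))
    def draw():          # placeholder render/GPU submission work
        time.sleep(0.002)

    start = time.perf_counter()
    for _ in range(frames):
        run_turn()
        if render:
            draw()
    elapsed = time.perf_counter() - start
    return frames / elapsed  # frames per second

full = run_benchmark(200, render=True)
no_render = run_benchmark(200, render=False)
print(f"full: {full:.1f} fps, no-render (CPU only): {no_render:.1f} fps")
```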

Comments

  • tipoo - Wednesday, March 7, 2012 - link

    Thankfully the comments of a certain troll were removed so mine no longer makes sense, for any future readers.
  • Articuno - Tuesday, March 6, 2012 - link

    Just like how overclocking a Pentium 4 resulted in it beating an Athlon 64 and had lower power consumption to boot-- oh wait.
  • SteelCity1981 - Tuesday, March 6, 2012 - link

    That's a stupid comment only a stupid fanboy would make. AMD is way ahead of Intel in the graphics department and is very competitive with Intel in the mobile segment now.
  • tipoo - Tuesday, March 6, 2012 - link

    Your comments would do nothing to inform regular readers of sites like this; we already know more. So please, can it.
  • tipoo - Tuesday, March 6, 2012 - link

    Not what I asked, little troll. Give a source that says Apple will get a special HD4000 like no other.
  • Operandi - Tuesday, March 6, 2012 - link

    What are you talking about? As long as AMD has a better iGPU there is plenty of reason for them to be a viable choice today. And if iGPU gaming performance holds up against Intel there is more than just hope of them getting back in the game in terms of high performance compute tomorrow.
  • tipoo - Tuesday, March 6, 2012 - link

    I'm pretty sure even 16x AF has a sub-2% performance hit on even the lowest end of today's GPUs; is it different with the HD Graphics? If not, why not just enable it like most people would? Even on something like a 4670 I max out AF without thinking twice about it, though AA still hurts performance.
  • IntelUser2000 - Tuesday, March 6, 2012 - link

    AF has a greater performance impact on low end GPUs, typically about 10-15%. It's less on the HD Graphics 3000 only because its 16x AF really only works at much lower levels. It's akin to having the option for a 1280x1024 resolution but performing like 1024x768, because it looks like the latter.

    If Ivy Bridge improved AF quality to be on par with AMD/Nvidia, performance loss should be similar as well.
  • tipoo - Wednesday, March 7, 2012 - link

    Hmm, I did not know that. What component of the GPU is involved in that performance hit (shaders, ROPs, etc.)? My card is fairly low end and 16x AF performs almost no differently than 0x.
  • Exophase - Wednesday, March 7, 2012 - link

    AF requires more samples in cases of high anisotropy, so I guess the TMU load increases, which may also increase bandwidth requirements since it could force a higher LOD in these cases. You'll only see a performance difference if the AF causes the scene to be TMU/bandwidth limited instead of, say, ALU limited. I'd expect this to happen more as you move up in performance, not down, since the ALU:TEX ratio tends to go up at the higher end, but APUs can be more bandwidth sensitive and I think Intel's IGPs never had a lot of TMUs.

    Of course it's also very scene dependent. And maybe an inferior AF implementation could end up sampling more than a better one.
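
To put rough numbers on the sampling cost discussed in the comments above: a trilinear tap reads 8 texels, and Nx anisotropic filtering can take up to N such taps along the line of anisotropy, so 16x AF can cost up to 16 times the texel fetches of plain trilinear in the worst case. The sketch below is a back-of-envelope illustration under those assumptions, not a description of any particular GPU; real hardware only takes the extra taps where a surface is actually anisotropic, which is part of why the hit is so scene dependent.

```python
# Back-of-envelope sketch of worst-case anisotropic filtering sample counts.
# Assumptions: a bilinear tap reads 4 texels, a trilinear tap reads 8 (two mip
# levels), and Nx AF takes up to N trilinear taps along the axis of anisotropy.
TEXELS_PER_TRILINEAR_TAP = 8  # 2 mip levels x 4 texels per bilinear tap

for af_level in (1, 2, 4, 8, 16):
    worst_case = af_level * TEXELS_PER_TRILINEAR_TAP
    print(f"{af_level:>2}x AF: up to {worst_case} texel fetches per texture lookup")
```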
