Crysis: Warhead

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and it set a very high bar for modern games that still hasn't been completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008. For an iGPU to even run Crysis at a playable framerate is a significant accomplishment, and even more so if it can do so at better than the Performance (low) quality preset.

[Charts: Crysis: Warhead - Frost Bench]

While Crysis on the HD 4000 was downright impressive, the HD 2500 is significantly slower.

Metro 2033

Our next graphics test is Metro 2033, another graphically challenging game. As IVB is the first Intel GPU to feature DX11 capabilities, this is also the first time an Intel GPU has been able to run Metro in DX11 mode. Like Crysis, this is a game that has traditionally been unplayable on Intel iGPUs, even in DX9 mode.

[Charts: Metro 2033]

DiRT 3

DiRT 3 is our next DX11 game. Developer Codemasters Southam added DX11 functionality to their EGO 2.0 engine back in 2009 with DiRT 2, and while the engine doesn't make extensive use of DX11, it does use it to good effect, applying tessellation to certain environmental models and utilizing a better ambient occlusion lighting model. As a result, DX11 functionality is very cheap from a performance standpoint, meaning the game doesn't require a GPU that excels at DX11 feature performance.
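
For those curious what "cheap" tessellation looks like from the application side, here is a minimal sketch, assuming a working D3D11 setup, of the calls a game makes to tessellate a model. It is illustrative only, not Codemasters' actual code, and the hullShader and domainShader objects are assumed to have been compiled and created elsewhere.

```cpp
#include <d3d11.h>

// Hypothetical helper: draw one model through the tessellation pipeline.
void DrawTessellated(ID3D11DeviceContext* context,
                     ID3D11HullShader* hullShader,
                     ID3D11DomainShader* domainShader,
                     UINT vertexCount)
{
    // Submit geometry as control-point patches rather than plain triangles;
    // the fixed-function tessellator subdivides each patch between the
    // hull shader (HS) and domain shader (DS) stages.
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    context->HSSetShader(hullShader, nullptr, 0);
    context->DSSetShader(domainShader, nullptr, 0);
    context->Draw(vertexCount, 0);

    // Unbind the tessellation stages afterwards so the rest of the frame
    // renders normally; selective use like this is why DiRT 3's DX11
    // features cost so little.
    context->HSSetShader(nullptr, nullptr, 0);
    context->DSSetShader(nullptr, nullptr, 0);
}
```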

[Charts: DiRT 3]

Portal 2

Portal 2 remains the latest and greatest Source engine game to come out of Valve's offices. Source is still a DX9 engine, designed to keep games playable on a wide range of hardware, but Valve has continued to upgrade it over the years to improve its quality, and combined with their choice of art style you'd have a hard time telling the engine is over 7 years old at this point. From a rendering standpoint Portal 2 isn't particularly geometry heavy, but it does make plenty of use of shaders.

It's worth noting, however, that this is the one game where we encountered what may be a rendering error with Ivy Bridge. Based on our image quality screenshots, Ivy Bridge renders a distinctly "busier" image than Llano or NVIDIA's GPUs. It's not clear whether this is increasing the workload on Ivy Bridge, but it's worth considering.

[Charts: Portal 2]

Ivy Bridge's processor graphics struggles with Portal 2. A move to fewer EUs doesn't help things at all.

Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it was the first AAA DX10+ exclusive game, requiring DX10 or better with no DX9 fallback. Consequently it makes no attempt to shy away from pushing the graphics envelope, and it pushes GPUs to their limits in the process. Even at low settings Battlefield 3 is a handful, and being able to run it on an iGPU would no doubt make quite a few traveling gamers happy.

[Chart: Battlefield 3]

The HD 4000 delivered a nearly acceptable experience in single-player Battlefield 3, but the HD 2500 falls well below that. At just under 20 fps, this isn't very good performance. It's clear the HD 2500 is not made for modern-day gaming, never mind multiplayer Battlefield 3.

Starcraft 2

Our next game is Starcraft II, Blizzard’s 2010 RTS megahit. Starcraft II is a DX9 game that is designed to run on a wide range of hardware, and given the growth in GPU performance over the years it's often CPU limited before it's GPU limited on higher-end cards.

[Charts: Starcraft 2 - GPU Bench]

Starcraft II performance is borderline at best on the HD 2500. At low enough settings it can deliver a passable experience, but beyond that the GPU is simply not fast enough.

Skyrim

Bethesda's epic sword & magic game The Elder Scrolls V: Skyrim is our RPG of choice for benchmarking. It's altogether a good CPU benchmark thanks to its complex scripting and AI, but it also can end up pushing a large number of fairly complex models and effects at once. This is a DX9 game so it isn't utilizing any of IVB's new DX11 functionality, but it can still be a demanding game.

[Charts: The Elder Scrolls V: Skyrim]

At lower quality settings, Intel's HD 4000 passed the threshold for playable in Skyrim on average. The HD 2500, however, is not in the same league. At 21.5 fps, performance is marginal at best, and when you crank the resolution up to 1680 x 1050 the HD 2500 simply falls apart.

Minecraft

Switching gears for the moment, we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go, Minecraft is one of but a few recently released major titles using OpenGL. Minecraft is incredibly simple—not even utilizing pixel shaders, let alone more advanced hardware—but this doesn't mean it's easy to render. Its use of massive numbers of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates with a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 5.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.
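
To illustrate the style of rendering described above, here is a minimal sketch of fixed-function, immediate-mode OpenGL: no pixel shaders, just stacks of overlapping quads of the kind that generate overdraw. It is purely illustrative (it assumes GLFW for window and context creation) and is not Minecraft's actual renderer.

```cpp
#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit()) return 1;
    // Default GLFW hints request a compatibility context, so the legacy
    // fixed-function pipeline (glBegin/glEnd, no shaders) is available.
    GLFWwindow* window = glfwCreateWindow(640, 480, "Fixed-function overdraw", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT);

        // Draw 64 overlapping quads: every covered pixel is shaded and
        // written repeatedly, which is the overdraw that punishes slow
        // GPUs and inefficient OpenGL drivers.
        for (int i = 0; i < 64; ++i)
        {
            glBegin(GL_QUADS);
            glColor3f(0.2f, 0.4f + i * 0.008f, 0.2f);
            glVertex2f(-0.8f, -0.8f);
            glVertex2f( 0.8f, -0.8f);
            glVertex2f( 0.8f,  0.8f);
            glVertex2f(-0.8f,  0.8f);
            glEnd();
        }

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```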

[Chart: Minecraft]

Our test here is pretty simple: we're looking at a lush forest after the world finishes loading. Ivy Bridge's processor graphics maintains a significant performance advantage over the Sandy Bridge generation here, making this one of the only situations where the HD 2500 is able to significantly outperform Intel's HD 3000. Minecraft is the exception rather than the rule, however, as the advantage we see here is purely architectural.

Civilization V

Our final game, Civilization V, gives us an interesting look at things other RTSes cannot match, with a much weaker focus on shading the game world and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that most stresses DX11 performance in particular.
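
Driver command lists are the least visible of those technologies, so a brief sketch may help. What follows is a minimal, hypothetical example of the D3D11 pattern involved, recording work on a deferred context and replaying it on the immediate context. It assumes device and immediateContext were created elsewhere, and it is not Firaxis' actual code.

```cpp
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordAndReplay(ID3D11Device* device, ID3D11DeviceContext* immediateContext)
{
    // caps.DriverCommandLists reports whether the driver supports command
    // lists natively; if it doesn't, the D3D11 runtime emulates them and
    // most of the CPU-overhead savings disappear.
    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &caps, sizeof(caps));

    // A deferred context records draw calls instead of executing them,
    // letting a worker thread build rendering work in parallel.
    ComPtr<ID3D11DeviceContext> deferred;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... bind state and issue draws on 'deferred' here ...

    // Close the recording into a command list, then replay it on the
    // immediate context from the render thread.
    ComPtr<ID3D11CommandList> commandList;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commandList)))
        immediateContext->ExecuteCommandList(commandList.Get(), FALSE);
}
```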

[Charts: Civilization V]

Civilization V was already an extremely weak showing for the HD 4000 when we looked at it last month, and it's even worse on the HD 2500. Civ players need not bother with Intel's processor graphics; go AMD or go discrete.

Comments

  • BSMonitor - Thursday, May 31, 2012 - link

    If they truly were interested in building the best APU. And by that, a knockout iGPU experience.

    Where are the dual-core Core i7's with 30-40 EU's??
    Or the AMD <whatevers> (not sure anymore what they call their APU) Phenom X2 CPU with 1200 Shaders??

    When we are talking about a truly GPU intense application, a LOT of times single/dual core CPU is enough. Heck, if you were to take a dual-core Core 2, and stick it with a GeForce 670 or Radeon 7950.. You would see very similar numbers in terms of gaming performance to what's in the BENCH charts. ESP at the 1920x1080 and below.

    Surely Intel can afford another die that aims a ton of transistors at just the GPU side of things. AMD, maybe. Why do we get, from BOTH, their top-end iGPU stuck with the most transistors dedicated to the CPU??

    I find it hard to believe anyone shopping for an APU is hoping for amazing CPU performance to go with their average iGPU performance. That market would be the opposite. Sacrifice a few threads on the CPU side for amazing iGPU.

    Am I missing something technically limiting?? Is that many GPU units overkill in terms of power/heat dissipation of the CPU socket??
  • tipoo - Thursday, May 31, 2012 - link

    Well, their chips have to work within a certain set of thermal limits. Maybe at this point 1200 shader cores would not be possible on the same die as a quad-core CPU for power consumption and heat reasons. I think Haswell will have 64 EUs though, if the rumours are true.
  • Roland00Address - Thursday, May 31, 2012 - link

    There is no point to a 1200-shader APU due to memory bandwidth. You couldn't feed a beast of an APU with only dual-channel 1600 MHz memory; that same memory already limits the performance of Llano and Trinity compared to their GPU cousins, which have the same compute units and core clocks yet perform significantly better.
  • silverblue - Thursday, May 31, 2012 - link

    Possibly, but at the moment, bandwidth is a surefire performance killer.
  • BSMonitor - Thursday, May 31, 2012 - link

    Good points. But currently Intel has quad-channel DDR3-1600 on Socket 2011. I am sure AMD could get more bandwidth there too, if they step up the memory controller.

    My overall point is that neither is even trying for a low-to-medium transistor CPU and a high transistor GPU.

    It's either Low-Medium CPU with Low-Medium GPU (disabled cores and what have you), or High End CPU with "High End" GPU.

    There is no attempt at giving up CPU die space for more GPU transistors from either. None. If someone spends $$ on the high end of the CPU (quad-core i7), the iGPU implementation is not even close to worth using with that much CPU.
  • Roland00Address - Thursday, May 31, 2012 - link

    Quad channel is not a "free upgrade"; it requires many more traces on the motherboard as well as more pins on the CPU socket. This dramatically increases costs for both the motherboard and the CPU. Both of those go against what AMD is trying to do with their APUs, which will be both laptop and desktop chips. They are trying to increase their margins on their chips, not decrease them.

    You have a large number of OEMs putting only a single 4GB DDR3 stick in laptops and desktops (thus not achieving dual channel) with the current APUs. Do you really think those same vendors are suddenly going to put 16GB of memory on an APU (and it would be 16GB, since 2GB DDR3 sticks are being phased out by the memory manufacturers)?
  • tipoo - Thursday, May 31, 2012 - link

    I'm curious why the HD4000 outperforms something like the 5450 by nearly double in Skyrim, yet falls behind in something like Portal or Civ, or even Minecraft? Is it immature drivers or something in the architecture itself?
  • ShieTar - Thursday, May 31, 2012 - link

    For Minecraft, read the article and what it has to say about OpenGL.

    For Portal or Civ, it might very well be related to memory bandwidth. The HD 2500 can have 25.6 GB/s (dual-channel DDR3-1600: 2 channels × 8 bytes × 1600 MT/s), or even more. The 5450 generally comes with half as much (12.8 GB/s), or even a quarter of that, since there are also 5450s with DDR2.

    As a matter of fact, I remember reading several reports on how much Llano's graphics would improve with faster memory, even beyond DDR3-1600. I haven't seen any tests on the impact of memory speed on Ivy Bridge or Trinity yet, but that would be interesting given their increased computing power.
  • silverblue - Thursday, May 31, 2012 - link

    I'm sure it'll matter for both, more so for Trinity. I'm not sure we'll see much in the way of a comparison until the desktop Trinity appears, but for IB, I'm certainly waiting.
  • tipoo - Thursday, May 31, 2012 - link

    Having half the memory bandwidth would lead to the opposite expectation: the 5450 comes close to or even surpasses the HD 4000, which has twice its bandwidth, in those games, yet the HD 4000 beats it by almost double in games like Skyrim. Even the HD 2500 beats it there.
