Crysis: Warhead

Our first graphics test is Crysis: Warhead, which in spite of its relatively high system requirements is the oldest game in our test suite. Crysis was the first game to really make use of DX10, and it set a very high bar that even modern games still haven't completely cleared. And while its age means it's not heavily played these days, it's a great reference for how far GPU performance has come since 2008. For an iGPU to even run Crysis at a playable framerate is a significant accomplishment, and even more so if it can do so at better than performance (low) quality settings.

Crysis: Warhead - Frost Bench

While Crysis on the HD 4000 was downright impressive, the HD 2500 is significantly slower.

Metro 2033

Our next graphics test is Metro 2033, another graphically challenging game. Since IVB is the first Intel GPU to feature DX11 capabilities, this is the first time an Intel GPU has been able to run Metro in DX11 mode. Like Crysis this is a game that is traditionally unplayable on Intel iGPUs, even in DX9 mode.

Metro 2033

DiRT 3

DiRT 3 is our next DX11 game. Developer Codemasters Southam added DX11 functionality to their EGO 2.0 engine back in 2009 with DiRT 2, and while it doesn't make extensive use of DX11 it does use it to good effect in order to apply tessellation to certain environmental models along with utilizing a better ambient occlusion lighting model. As a result DX11 functionality is very cheap from a performance standpoint, meaning it doesn't require a GPU that excels at DX11 feature performance.

DiRT 3

Portal 2

Portal 2 continues to be the latest and greatest Source engine game to come out of Valve's offices. While Source continues to be a DX9 engine, and hence is designed to allow games to be playable on a wide range of hardware, Valve has continued to upgrade it over the years to improve its quality, and combined with their choice of style you’d have a hard time telling it’s over 7 years old at this point. From a rendering standpoint Portal 2 isn't particularly geometry heavy, but it does make plenty of use of shaders.

It's worth noting however that this is the one game where we encountered something that may be a rendering error with Ivy Bridge. Based on our image quality screenshots Ivy Bridge renders a distinctly "busier" image than Llano or NVIDIA's GPUs. It's not clear whether this is causing an increased workload on Ivy Bridge, but it's worth considering.

Portal 2

Ivy Bridge's processor graphics struggles with Portal 2. A move to fewer EUs doesn't help things at all.

Battlefield 3

Its popularity aside, Battlefield 3 may be the most interesting game in our benchmark suite for a single reason: it was the first AAA DX10+ exclusive game. Consequently it makes no attempt to shy away from pushing the graphics envelope, pushing GPUs to their limits at the same time. Even at low settings Battlefield 3 is a handful, and being able to run it on an iGPU would no doubt make quite a few traveling gamers happy.

Battlefield 3

The HD 4000 delivered a nearly acceptable experience in single player Battlefield 3, but the HD 2500 falls well below that. At just under 20 fps, this isn't very good performance. It's clear the HD 2500 is not made for modern day gaming, never mind multiplayer Battlefield 3.

Starcraft 2

Our next game is Starcraft II, Blizzard’s 2010 RTS megahit. Starcraft II is a DX9 game that is designed to run on a wide range of hardware, and given the growth in GPU performance over the years it's often CPU limited before it's GPU limited on higher-end cards.

Starcraft 2 - GPU Bench

Starcraft 2 performance is borderline at best on the HD 2500. At low enough settings the HD 2500 can deliver a passable experience, but beyond that it's simply not fast enough.

Skyrim

Bethesda's epic sword & magic game The Elder Scrolls V: Skyrim is our RPG of choice for benchmarking. It's altogether a good CPU benchmark thanks to its complex scripting and AI, but it also can end up pushing a large number of fairly complex models and effects at once. This is a DX9 game so it isn't utilizing any of IVB's new DX11 functionality, but it can still be a demanding game.

The Elder Scrolls V: Skyrim

At lower quality settings, Intel's HD 4000 definitely passed the threshold for playable in Skyrim on average. The HD 2500 is definitely not in the same league, however. At 21.5 fps performance is marginal at best, and when you crank the resolution up to 1680 x 1050 the HD 2500 simply falls apart.

Minecraft

Switching gears for the moment we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go Minecraft is one of but a few recently released major titles using OpenGL. Minecraft is incredibly simple—not even utilizing pixel shaders, let alone more advanced hardware—but this doesn't mean it's easy to render. Its use of massive amounts of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates with a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 5.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.

Minecraft

Our test here is pretty simple: we're looking at a lush forest after the world finishes loading. Ivy Bridge's processor graphics maintains a significant performance advantage over the Sandy Bridge generation, making this one of the only situations where the HD 2500 is able to significantly outperform Intel's HD 3000. Minecraft is definitely the exception rather than the rule, however, as the advantage we see here is purely architectural.
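For readers curious how averages like these are distilled from a run: average fps is simply frames rendered divided by elapsed time, while the slowest single frame sets the worst-case floor. A minimal sketch, using a hypothetical list of per-frame render times (not actual measurements from our runs):

```python
# Minimal sketch: summarizing a benchmark run from per-frame render times.
# The frame times below are hypothetical, not actual measurements.

def summarize(frame_times_ms):
    """Return average fps and worst-case fps (slowest single frame) for a run."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    worst_fps = 1000.0 / max(frame_times_ms)  # slowest frame sets the floor
    return avg_fps, worst_fps

frame_times = [33.3, 40.0, 50.0, 33.3, 45.5]  # ms per frame, hypothetical
avg, worst = summarize(frame_times)
print(f"avg: {avg:.1f} fps, worst frame: {worst:.1f} fps")
```

The two figures can diverge sharply: a run can post a respectable average yet still stutter whenever individual frames spike, which is one reason marginal averages like the HD 2500's often feel worse in practice than the number suggests.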

Civilization V

Our final game, Civilization V, gives us an interesting look at things that other RTSes cannot match, with a much weaker focus on shading in the game world and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that stresses DX11 performance in particular the most.

Civilization V

Civilization V was already an extremely weak showing for the HD 4000 when we looked at it last month, and it's even worse on the HD 2500. Civ players need not bother with Intel's processor graphics; go AMD or discrete.

67 Comments

  • shin0bi272 - Friday, June 1, 2012 - link

    any gamer with a good quad core doesnt need to upgrade their cpu. Who's going to spend hundreds of dollars to upgrade from another quad core (like lets say my i7 920) to this one for a whopping 7 fps in one game and 1 fps in another? That sounds like something an apple fanboy would do... oh look the new isuch-and-such is out and its marginally better than the one I spent $x00 on last month I have to buy it now! no thanks.
  • Sogekihei - Monday, June 4, 2012 - link

    This really depends a lot on what you have (or want) to do with your computer. Architectural differences are obviously a big deal or else instead of an i7-920 you'd probably be rocking a Phenom (1) x4 or Core2 Quad by your logic that having a passable quad core means you don't need to upgrade your processor until the majority of gaming technology catches up.

    Let's take the bsnes emulator as an example here, it focuses on low-level emulation of the SNES hardware to reproduce games as accurately as possible. With most new version releases, the hardware requirements gradually increase as more intricate backend code needs to execute within the same space of time to avoid dropping framerates; being that these games determined their running speed by their framerate and being sub-60 or sub-50 (region-dependent) means running at less than full speed, this could eventually be a problem for somebody wanting to use such an accurate emulator. From what I've heard, most Phenom and Phenom II systems are very bogged down and can barely get any games running at full speed on it these days and from my own experience, Nehalem-based Intel chips either require ludicrous clock speeds or simply aren't capable of running certain games at full speed (such as Super Mario RPG.) Obviously in cases such as this, the performance increases from a new architecture could benefit a user greatly.

    Another example I'll give is based on the probability through my own experiences dealing with other people that the vast majority of gamers DO use their rigs for other tasks too. Any intensive work with maths, spreadsheets, video or image editing and rendering, software development, blueprinting, or anything else you could name that people do on a computer nowadays instead of by hand in order to speed the process will see massive gains when moving to a faster processor architecture. For anybody that has such work to do, be it for a very invested-in hobby, as part of a post-secondary education course, or as part of their career, the few hundred dollars/euros/currency of choice it costs to update their system is easily worth the potentially hundreds or thousands of hours per upgrade cycle they may save through the more powerful hardware.

    I will concede that in today's market, the majority of gaming-exclusive cases don't yield much better results from increasing a processor's power (usually being GPU-limited instead) however that's a very broad statement and doesn't account for things that are heavily multithreaded (like newer Assassin's Creed games) or that are very processor-intensive (which I believe Civilization V can qualify as in mid- to late-game scenarios.)

    There will always be case-specific conditions which will make buying something make sense or not, but do try to keep in mind that a lot of people do have disposable income and will very likely end up putting it into their hobbies before anything else. If their hobbies deal with computers they're likely going to want to always have, to the best extent they can afford, the latest and greatest technology available. Does it mean your system is trash? Of course not. Does it mean they're stupid? No moreso than the man that puts $10 a week into his local lottery and never wins anything. It just comes down to you having different priorities from them.

    The only other thing I want to address is your stance on Apple products. Yes the hipsters are annoying, but you would likely lose the war if you wanted to argue on the upgrade cycle users take with Mac OSX-based computers. New product generations only come about once a year or so and most users wait 2-3 generations before upgrading and quite a few wait much longer than the average Linux/Windows PC user will before upgrading. The ones that don't wait are usually professionals in some sort of graphic arts industry (such as photography) where they need the most processing power, memory, graphics capabilities, and battery life possible and it's a justified business expense.
  • CeriseCogburn - Monday, June 11, 2012 - link

    People usually skip a generation - so from i7 920 we can call it one gen with SB being nearly the same as IB, so you're correct.

    But anyone on core 2 or phenom 2 or athlon 2x or 4x, yeah they could do it and be happy - and don't forget the sata 6 and usb 3 they get with the new board - so it's not just the cpu with IB and SB - you get hard drive and usb speed too.

    So with those extras it could drive a few people in your position - sata 6 card and usb 3 card is half the upgrade cost anyway, so add in pci-e 3 as well. I see some people moving from where you are.
  • ClagMaster - Saturday, June 2, 2012 - link

    The onboard graphics of the Ivy Bridge processors was never seriously intended for playing games. It is intended to replace chipset graphics to support office applications with large LCD monitors. And it adds transcoding capabilities.

    @Anand : If you want to do a more meaningful comparison of graphics performance for those that might be doing gaming, why not test and compare some DX9 games (still being written) of titles available 5 years ago. Real people play these games because they are cheap or free and provide as much entertainment as DX10 or DX11 games. Frame rates will be 60fps or slightly better. Or will your sponsors at nVidia, AMD or Intel not permit this sort of comparison.

    It's ridiculous to compare onboard graphics to discrete graphics performance. A dedicated GPU, optimized for graphics, will always beat an onboard graphics GPU for a given gate size.

    The Ivy Bridge graphics (performance/power consumption) , if I interpret these comparisons that have been presented correctly, is also inefficient compared to the processing capabilities of a discrete graphics card.
  • vegemeister - Wednesday, June 6, 2012 - link

    As you mentioned, I'd like to see some mention of the 2D performance. I use Awesome WM on a 3520x1200 X screen, and smooth scrolling can sometimes get choppy with my Graphics My Ass GPU.

    I'd like to upgrade my Core2 duo, but I'm not sure whether the HD2500 graphics in this chip will suffice, or if I need to be looking at higher end CPUs. I don't really care about the difference between shitty 3D and ho-hum 3D.
  • P39Airacobra - Tuesday, July 1, 2014 - link

    That's a shame that they still sell the GT 520 and GT 610 and the ATi 5450. When an integrated GPU like the HD 2500 outperforms a dedicated GPU it's time to retire them from the market. I bought a 3470 and I am running a R9 270 with 8GB of 1600 Ripjaws. I tried out the HD 2500 on the chip just to see how it would do. It honestly sucked, but for videos and gaming on very low settings it works; it actually surprised me. But I don't think I could ever stand to have an integrated GPU. What's the point in buying an i5 if you are only going to use the integrated GPU? It does not make sense; you may as well keep your old P4 if you are not going to add a real GPU to it. This is why I don't understand the point of an integrated GPU inside a high end processor.
