Intel's HD 2500 & Quick Sync Performance

What makes the 3470 particularly interesting to look at is that it features Intel's HD 2500 processor graphics. The main difference between the 2500 and the 4000 is the number of execution units (EUs) on-die:

Intel Processor Graphics Comparison
             Intel HD 2500   Intel HD 4000
EUs          6               16
Base Clock   650MHz          650MHz
Max Turbo    1150MHz         1150MHz

At 6 EUs, Intel's HD 2500 has the same number of compute resources as the previous-generation HD 2000. In fact, Intel claims performance should be around 10 - 20% faster than the HD 2000 in 3D games. Given that Intel's HD 4000 is only getting close to the minimum level of 3D performance we'd like to see from Intel, chances are the 2500 will not impress. We'll get to quantifying that shortly, but the good news is that Quick Sync performance is retained:
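A quick back-of-envelope calculation puts the EU gap in perspective. This sketch assumes each Ivy Bridge EU can retire 16 single-precision FLOPs per clock (two 4-wide SIMD ALUs issuing multiply-adds) — an assumption about the microarchitecture, not a figure from this review:

```python
# Rough peak-throughput comparison between HD 2500 and HD 4000.
# Assumes 16 single-precision FLOPs per EU per clock (two 4-wide
# SIMD ALUs doing fused multiply-adds) -- an assumption, not a
# number taken from the article.

def peak_gflops(eus, clock_mhz, flops_per_eu_per_clock=16):
    """Theoretical peak single-precision throughput in GFLOPS."""
    return eus * clock_mhz * flops_per_eu_per_clock / 1000.0

hd2500 = peak_gflops(6, 1150)   # HD 2500 at max turbo
hd4000 = peak_gflops(16, 1150)  # HD 4000 at max turbo

print(f"HD 2500: {hd2500:.1f} GFLOPS")   # 110.4
print(f"HD 4000: {hd4000:.1f} GFLOPS")   # 294.4
print(f"Ratio:   {hd4000 / hd2500:.2f}x")
```

With identical clocks, the two parts differ only in EU count, so on paper the HD 4000 has roughly 2.7x the shader throughput of the HD 2500.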

CyberLink Media Espresso 6.5 - Harry Potter 8 Transcode

The HD 2500 does a little better than our HD 4000 here, but that's just normal run-to-run variance. Quick Sync does rely heavily on the EU array for transcode work, but it looks like the workload itself isn't heavy enough to distinguish between the 6-EU HD 2500 and the 16-EU HD 4000. If your only need for Intel's processor graphics is transcode work, the HD 2500 appears indistinguishable from the HD 4000.
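Outside of Media Espresso, one common way to exercise Quick Sync is through ffmpeg's QSV wrappers. The command below is a sketch only: it assumes an ffmpeg build compiled with Quick Sync support and a working Intel media driver, and the file names are placeholders:

```shell
# Hardware-accelerated H.264 transcode via Intel Quick Sync (QSV).
# Assumes ffmpeg was built with QSV support and the Intel media
# driver is installed; input.mkv/output.mp4 are placeholder names.
ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mkv \
       -c:v h264_qsv -global_quality 23 -c:a copy output.mp4
```

Both decode and encode run on the fixed-function hardware here, which is why the EU count matters so little for this kind of workload.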

The bad news is I can't say the same about its 3D graphics performance.


66 Comments


  • shin0bi272 - Friday, June 01, 2012 - link

    Any gamer with a good quad core doesn't need to upgrade their CPU. Who's going to spend hundreds of dollars to upgrade from another quad core (like, let's say, my i7 920) to this one for a whopping 7 fps in one game and 1 fps in another? That sounds like something an Apple fanboy would do... oh look, the new iSuch-and-Such is out and it's marginally better than the one I spent $x00 on last month, I have to buy it now! No thanks.
  • Sogekihei - Monday, June 04, 2012 - link

    This really depends a lot on what you have (or want) to do with your computer. Architectural differences are obviously a big deal or else instead of an i7-920 you'd probably be rocking a Phenom (1) x4 or Core2 Quad by your logic that having a passable quad core means you don't need to upgrade your processor until the majority of gaming technology catches up.

    Let's take the bsnes emulator as an example. It focuses on low-level emulation of the SNES hardware to reproduce games as accurately as possible. With most new version releases, the hardware requirements gradually increase as more intricate backend code needs to execute within the same slice of time to avoid dropping frames. Since these games tied their running speed to their framerate, and being sub-60 or sub-50 (region-dependent) means running at less than full speed, this could eventually be a problem for somebody wanting to use such an accurate emulator. From what I've heard, most Phenom and Phenom II systems are very bogged down and can barely get any games running at full speed these days, and from my own experience, Nehalem-based Intel chips either require ludicrous clock speeds or simply aren't capable of running certain games at full speed (such as Super Mario RPG). Obviously, in cases such as this, the performance increase from a new architecture could benefit a user greatly.

    Another example I'll give is based on the probability through my own experiences dealing with other people that the vast majority of gamers DO use their rigs for other tasks too. Any intensive work with maths, spreadsheets, video or image editing and rendering, software development, blueprinting, or anything else you could name that people do on a computer nowadays instead of by hand in order to speed the process will see massive gains when moving to a faster processor architecture. For anybody that has such work to do, be it for a very invested-in hobby, as part of a post-secondary education course, or as part of their career, the few hundred dollars/euros/currency of choice it costs to update their system is easily worth the potentially hundreds or thousands of hours per upgrade cycle they may save through the more powerful hardware.

    I will concede that in today's market, the majority of gaming-exclusive cases don't yield much better results from increasing a processor's power (usually being GPU-limited instead). However, that's a very broad statement, and it doesn't account for games that are heavily multithreaded (like the newer Assassin's Creed games) or very processor-intensive (which I believe Civilization V qualifies as in mid- to late-game scenarios).

    There will always be case-specific conditions which will make buying something make sense or not, but do try to keep in mind that a lot of people do have disposable income and will very likely end up putting it into their hobbies before anything else. If their hobbies deal with computers, they're likely going to want to always have, to the best extent they can afford, the latest and greatest technology available. Does it mean your system is trash? Of course not. Does it mean they're stupid? No more so than the man who puts $10 a week into his local lottery and never wins anything. It just comes down to you having different priorities from them.

    The only other thing I want to address is your stance on Apple products. Yes, the hipsters are annoying, but you would likely lose the war if you wanted to argue about the upgrade cycle users take with Mac OS X-based computers. New product generations only come about once a year or so, and most users wait 2-3 generations before upgrading; quite a few wait much longer than the average Linux/Windows PC user will before upgrading. The ones that don't wait are usually professionals in some sort of graphic arts industry (such as photography) where they need the most processing power, memory, graphics capabilities, and battery life possible, and it's a justified business expense.
  • CeriseCogburn - Monday, June 11, 2012 - link

    People usually skip a generation - so from an i7 920 we can call it one gen, with SB being nearly the same as IB, so you're correct.

    But anyone on a Core 2, Phenom II, or Athlon X2 or X4, yeah, they could do it and be happy - and don't forget the SATA 6Gbps and USB 3.0 they get with the new board - so it's not just the CPU with IB and SB, you get hard drive and USB speed too.

    So with those extras it could drive a few people in your position - a SATA 6Gbps card and a USB 3.0 card is half the upgrade cost anyway, so add in PCIe 3.0 as well. I see some people moving from where you are.
  • ClagMaster - Saturday, June 02, 2012 - link

    The onboard graphics of the Ivy Bridge processors was never seriously intended for playing games. It is intended to replace chipset graphics to support office applications with large LCD monitors, and it adds transcoding capabilities.

    @Anand: If you want to do a more meaningful comparison of graphics performance for those who might be gaming, why not test and compare some DX9 games (still being written) or titles available five years ago? Real people play these games because they are cheap or free and provide as much entertainment as DX10 or DX11 games. Frame rates would be 60fps or slightly better. Or will your sponsors at NVIDIA, AMD, or Intel not permit this sort of comparison?

    It's ridiculous to compare onboard graphics to discrete graphics performance. A dedicated GPU, optimized for graphics, will always beat an onboard graphics GPU for a given gate count.

    The Ivy Bridge graphics is also inefficient in terms of performance per watt compared to a discrete graphics card, if I interpret the comparisons that have been presented correctly.
  • vegemeister - Wednesday, June 06, 2012 - link

    As you mentioned, I'd like to see some mention of the 2D performance. I use Awesome WM on a 3520x1200 X screen, and smooth scrolling can sometimes get choppy with my Graphics My Ass GPU.

    I'd like to upgrade my Core 2 Duo, but I'm not sure whether the HD 2500 graphics in this chip will suffice, or if I need to be looking at higher-end CPUs. I don't really care about the difference between shitty 3D and ho-hum 3D.
  • P39Airacobra - Tuesday, July 01, 2014 - link

    It's a shame that they still sell the GT 520, GT 610, and ATI 5450. When an integrated GPU like the HD 2500 outperforms a dedicated GPU, it's time to retire them from the market. I bought a 3470 and I'm running an R9 270 with 8GB of 1600 Ripjaws. I tried out the HD 2500 on the chip just to see how it would do. It honestly sucked, but for videos and gaming on very low settings it works; it actually surprised me. But I don't think I could ever stand to have an integrated GPU. What's the point in buying an i5 if you are only going to use the integrated GPU? It doesn't make sense; you may as well keep your old P4 if you aren't going to add a real GPU to it. This is why I don't understand the point of an integrated GPU inside a high-end processor.
