Final Words

Intel's Core i5-3470 is a good base for a system equipped with a discrete GPU. You don't get the heavily threaded performance of the quad-core, eight-thread Core i7, but you're also saving nearly $100. For a gaming machine or anything else that isn't going to be doing a lot of thread-heavy work (e.g., non-QuickSync video transcoding, offline 3D rendering, etc.), the 3470 is definitely good enough. Your overclocking options are significantly limited as the 3470 is only a partially unlocked CPU, but you can pretty much count on getting an extra 400MHz across the board, regardless of the number of active cores.
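As a rough illustration of what that partial unlock buys you, the sketch below adds the four extra 100MHz bins Intel permits on non-K parts to the 3470's stock turbo table. The per-core turbo multipliers used here (36/36/35/34 for 1/2/3/4 active cores) are assumed for illustration rather than taken from Intel's documentation.

```python
# Sketch of the i5-3470's limited multiplier headroom.
# Partially unlocked (non-K) Ivy Bridge parts allow up to 4 extra 100MHz
# bins on top of each stock turbo multiplier. The stock per-core turbo
# multipliers below are assumptions for illustration.

BCLK_MHZ = 100      # Ivy Bridge base clock
EXTRA_BINS = 4      # bin limit for partially unlocked parts

stock_turbo_mult = {1: 36, 2: 36, 3: 35, 4: 34}  # assumed multiplier per active-core count

for cores, mult in stock_turbo_mult.items():
    stock = mult * BCLK_MHZ
    overclocked = (mult + EXTRA_BINS) * BCLK_MHZ
    print(f"{cores} core(s) active: {stock} MHz stock -> {overclocked} MHz with +4 bins")
```

However you slice it, those four bins are where the "extra 400MHz across the board" comes from.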

Intel's HD 2500, however, is less exciting. This is clearly the processor graphics option for users who don't care about processor graphics performance. The 2500's performance is tangibly worse than last year's HD 3000 (which makes sense given its six-EU configuration), and it isn't fast enough to deliver playable frame rates in any of the games we tested. The good news is that Quick Sync performance remains unaffected, making the HD 2500 just as good as the HD 4000 for video transcoding. In short, if you're going to rely on processor graphics for gaming, you need the HD 4000 at a minimum. Otherwise, the HD 2500 is just fine.

67 Comments

  • JarredWalton - Thursday, May 31, 2012 - link

    Intel actually has a beta driver (tested on the Ultrabook) that improves Portal 2 performance. I expect it will make its way to the public driver release in the next month. There are definitely still driver performance issues to address, but even so I don't think HD 4000 has the raw performance potential to match Trinity unless a game happens to be CPU intensive.
  • n9ntje - Thursday, May 31, 2012 - link

    Don't forget memory bandwidth. Both the CPU and GPU use the same memory on the motherboard (a rough bandwidth calculation appears after the comments).
  • tacosRcool - Thursday, May 31, 2012 - link

    Kind of a waste in terms of graphics.
  • paraffin - Thursday, May 31, 2012 - link

    With 1920x1080 being the standard these days, I find it annoying that all AT tests continue to ignore it. Are you trying to goad monitor makers back into 16:10 or something?
  • Sogekihei - Monday, June 4, 2012 - link

    The 1080p resolution may have become standard for televisions, but it certainly isn't so for computer monitors. These days the "standard" computer monitor (meaning what an OEM rig will ship with in most cases, whether it's a desktop or notebook) is some variant of 1366x768, so that is what gets tested for low-end graphics options likely to be seen in cheap OEM desktops and most OEM laptops (such as the integrated graphics seen here).

    The 1680x1050 resolution was the highest resolution available cheaply to end users for a while and is something of a standard among tech enthusiasts. Sure, there were other offerings like some (expensive) 1920x1200 CRTs, but most people's budgets left them sticking with 1280x1024 CRTs or cheap LCDs, and if they wanted a slightly higher quality LCD, practically the only resolution available at the time was 1680x1050. A lot of people don't care enough about their display to upgrade it as often as performance-oriented parts, so many of us still have at least one 1680x1050 panel lying around, probably in use as a secondary (or for some even a primary) display, despite new 1080p monitors costing the same or less.
  • Beenthere - Thursday, May 31, 2012 - link

    I imagine that, with the heat/overclocking issues on the tri-gate chips, Intel is working to resolve fab as well as operational issues with IB and thus isn't ramping as fast as normal.
  • Fritsert - Thursday, May 31, 2012 - link

    Would the HQV score of the HD2500 be the same as the HD4000's in the AnandTech review? Basically, would video playback performance be the same (HQV, 24fps, image enhancement features, etc.)?

    A lot of processors in the low-power Ivy Bridge lineup have the HD2500. If playback quality is the same, that would make them very good candidates for my next HTPC, the Core i5-3470T specifically.
  • cjs150 - Friday, June 8, 2012 - link

    Also, does the HD2500 lock to the correct frame rate, which is not exactly 24fps (see the worked example after the comments)? AMD has had this for ages, but Intel only caught up with the HD4000. For me it is the difference between an i7-3770T and an i5-3470T.
  • Affectionate-Bed-980 - Thursday, May 31, 2012 - link

    This is a replacement for the i5-2400. Actually the 3450 was, but this one is 100MHz faster. You should be comparing the HD 2000 vs. the HD 2500 as well, since these aren't top-tier models with the HD 3000/4000.
  • bkiserx7 - Thursday, May 31, 2012 - link

    In the GPU Power Consumption comparison section, did you disable HT and lock the 3770k to the same frequency as the 3470 to get a more accurate comparison between just the HD 4000 and HD 2500?
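On n9ntje's point about shared memory bandwidth, a quick back-of-the-envelope number shows how little the processor graphics has to work with. The dual-channel DDR3-1600 configuration assumed below is typical for an Ivy Bridge testbed; treat the exact memory speed as an assumption.

```python
# Back-of-the-envelope peak bandwidth for dual-channel DDR3-1600,
# all of which the CPU cores and the HD 2500 have to share.

channels = 2
bytes_per_transfer = 64 // 8   # each 64-bit channel moves 8 bytes per transfer
transfers_per_sec = 1600e6     # DDR3-1600 = 1600 MT/s

bandwidth_gb_s = channels * bytes_per_transfer * transfers_per_sec / 1e9
print(f"Peak shared bandwidth: {bandwidth_gb_s:.1f} GB/s")   # ~25.6 GB/s
```

Even a cheap discrete card with its own GDDR5 gets roughly that much bandwidth, or more, all to itself, which is a big part of why processor graphics struggles as resolution climbs.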
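On cjs150's 23.976fps question, the sketch below shows why an output clock of exactly 24.000Hz is a problem for film content: the roughly 0.024fps mismatch forces the player to repeat a frame about every 42 seconds, which shows up as a periodic stutter. The only inputs are the standard film rate (24000/1001) and an exact 24Hz output.

```python
# Why "not exactly 24fps" matters: film content runs at 24000/1001 fps,
# and a display refreshing at exactly 24.000 Hz slowly pulls ahead of it,
# forcing the player to repeat a frame to stay in sync.

source_fps = 24000 / 1001   # ~23.976 fps
display_hz = 24.000         # output clock that isn't frame-rate matched

drift_per_second = display_hz - source_fps   # extra refreshes per second
seconds_per_repeated_frame = 1 / drift_per_second

print(f"Source: {source_fps:.3f} fps, display: {display_hz:.3f} Hz")
print(f"One repeated frame roughly every {seconds_per_repeated_frame:.1f} seconds")
```

An output that can lock to 23.976Hz avoids the repeated frame entirely, which is the capability the HD 4000 added and the HD 2500 question hinges on.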
