Intel Core i5 3470 Review: HD 2500 Graphics Tested
by Anand Lal Shimpi on May 31, 2012 12:00 AM EST - Posted in
- CPUs
- Intel
- Ivy Bridge
- GPUs
General Performance
The general performance of the Core i5-3470 holds no real surprises. We know from our original Ivy Bridge review that the advantage over Sandy Bridge is typically in the single digits. In other words, whatever upgrade decision Sandy Bridge warranted for your current system, Ivy Bridge won't change it. Idle power doesn't really improve over Sandy Bridge, but load power is a bit better.
Compared to the 3770K, you will lose out on heavily threaded performance due to the lack of Hyper-Threading. But for many client workloads, including gaming, you can expect the 3470 to perform quite similarly to the 3770K.
Power Consumption Comparison

| Intel DZ77GA-70K | Idle | Load (x264 2nd pass) |
|---|---|---|
| Intel Core i7-3770K | 60.9W | 121.2W |
| Intel Core i5-3470 | 54.4W | 96.6W |
| Intel Core i5-3470 @ 4GHz | 54.4W | 110.1W |
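The load deltas in the table are worth calling out explicitly; a quick sanity-check of the arithmetic (figures taken directly from the table above, nothing else assumed):

```python
# Wall-socket load power (x264 2nd pass) from the table above.
load_watts = {
    "Core i7-3770K": 121.2,
    "Core i5-3470": 96.6,
    "Core i5-3470 @ 4GHz": 110.1,
}

# How much less the 3470 draws than the 3770K at stock...
stock_delta = load_watts["Core i7-3770K"] - load_watts["Core i5-3470"]
# ...and how much the 4GHz overclock costs over stock.
oc_delta = load_watts["Core i5-3470 @ 4GHz"] - load_watts["Core i5-3470"]

print(f"i5-3470 draws {stock_delta:.1f}W less than the i7-3770K under load")
print(f"the 4GHz overclock adds {oc_delta:.1f}W over stock")
```

Even overclocked to 4GHz, the 3470 still draws about 11W less at the wall than the stock 3770K in this test.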
67 Comments
JarredWalton - Thursday, May 31, 2012 - link
Intel actually has a beta driver (tested on the Ultrabook) that improves Portal 2 performance. I expect it will make its way to the public driver release in the next month. There are definitely still driver performance issues to address, but even so I don't think HD 4000 has the raw performance potential to match Trinity unless a game happens to be CPU intensive.
n9ntje - Thursday, May 31, 2012 - link
Don't forget memory bandwidth. Both the CPU and GPU use the same memory on the motherboard.
tacosRcool - Thursday, May 31, 2012 - link
Kinda a waste in terms of graphics
paraffin - Thursday, May 31, 2012 - link
With 1920x1080 being the standard these days, I find it annoying that all AT tests continue to ignore it. Are you trying to goad monitor makers back into 16:10 or something?
Sogekihei - Monday, June 4, 2012 - link
The 1080p resolution may have become standard for televisions, but it certainly isn't so for computer monitors. These days the "standard" computer monitor (meaning, what an OEM rig will ship with in most cases, whether it's a desktop or a notebook) is some variant of 1366x768, so that gets tested for low-end graphics options that are likely to be seen in cheap OEM desktops and most OEM laptops (such as the integrated graphics seen here).

The 1680x1050 resolution was the highest end-user resolution available cheaply for a while and is something of a standard among tech enthusiasts. Sure, you had other offerings available, like some (expensive) 1920x1200 CRTs, but most people's budgets left them sticking to 1280x1024 CRTs or cheap LCDs, and if they wanted a slightly higher-quality LCD, practically the only resolution available at the time was 1680x1050. A lot of people don't care enough about the quality of their display to upgrade it as frequently as performance-oriented parts, so many of us still have at least one 1680x1050 lying around, probably in use as a secondary (or for some, even a primary) display despite 1080p monitors being the same cost or lower when purchased new.
Beenthere - Thursday, May 31, 2012 - link
I imagine with the heat/OC'ing issues with the tri-gate chips, Intel is working to resolve fab as well as operational issues with IB and thus isn't ramping as fast as normal.
Fritsert - Thursday, May 31, 2012 - link
Would the HQV score of the HD 2500 be the same as the HD 4000 in the AnandTech review? Basically, would video playback performance be the same (HQV, 24fps image enhancement features, etc.)?

A lot of processors in the low-power Ivy Bridge lineup have the HD 2500. If playback quality is the same, this would make those very good candidates for my next HTPC. The Core i5-3470T specifically.
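For context on the 24fps playback point: film content actually runs at 24000/1001 fps (roughly 23.976), so a display pipeline locked to exactly 24.000 Hz slowly drifts against the source. A minimal sketch of the arithmetic (illustrative only, not from the review):

```python
from fractions import Fraction

content_fps = Fraction(24000, 1001)  # true "24p" film rate, ~23.976 fps
output_hz = Fraction(24, 1)          # an output locked to exactly 24.000 Hz

# Excess output frames accumulate at this rate...
drift = output_hz - content_fps      # 24/1001 frames per second
# ...so one frame must be repeated or dropped roughly every:
seconds_per_glitch = 1 / drift       # 1001/24 seconds, about 41.7 s

print(f"content rate: {float(content_fps):.6f} fps")
print(f"visible judder event every {float(seconds_per_glitch):.1f} s")
```

That once-every-~42-seconds repeated frame is the judder that an exact 23.976 Hz output mode avoids.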
cjs150 - Friday, June 8, 2012 - link
Also, does the HD 2500 lock to the correct frame rate, which is not exactly 24 fps? AMD has had this for ages, but Intel only caught up with the HD 4000. For me it is the difference between an i7-3770T and an i5-3470T.
Affectionate-Bed-980 - Thursday, May 31, 2012 - link
This is a replacement for the i5-2400. Actually the 3450 was, but this is 100MHz faster. You should be comparing HD 2000 vs HD 2500 as well, since these aren't top-tier models with the HD 3000/4000.
bkiserx7 - Thursday, May 31, 2012 - link
In the GPU Power Consumption comparison section, did you disable HT and lock the 3770K to the same frequency as the 3470, to get a more accurate comparison between just the HD 4000 and HD 2500?