Intel HD Graphics 4000 Performance

With respectable but still very tick-like performance gains on the CPU, our focus now turns to Ivy Bridge's GPU. Drivers play a significant role in performance here, and we're still several weeks away from launch, so these numbers may improve. We used the latest available drivers as of today for all other GPUs.

A huge thanks goes out to EVGA for providing us with a GeForce GT 440 and GeForce GT 520 for use in this preview.

Crysis: Warhead

We'll start with Crysis, a title that no one would have considered running on integrated graphics a few years ago. Sandy Bridge brought playable performance at low quality settings (Performance defaults) last year, but how much better does Ivy do this year?

Crysis: Warhead - Frost Bench

At our highest quality (Mainstream) settings, Intel's HD Graphics 4000 is 55% faster than Sandy Bridge's HD Graphics 3000. While still tangibly slower than AMD's Llano (Radeon HD 6550D), Ivy Bridge is a significant step forward. Drop the quality down a bit and playability improves significantly:

Crysis: Warhead - Frost Bench

Crysis: Warhead - Frost Bench

Over 50 fps at 1680 x 1050 from Intel integrated graphics is pretty impressive. Here we're showing a 41% increase in performance compared to Sandy Bridge, with Llano maintaining a 33% advantage over Ivy. I would've liked to see an outright doubling of performance, but this is a big enough step forward to be noticeable on systems with no discrete GPU.
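The scaling figures quoted throughout are straightforward frame-rate ratios; a quick sketch of the math, using hypothetical fps values rather than the actual benchmark data:

```python
def pct_faster(a_fps: float, b_fps: float) -> float:
    """How much faster configuration A is than B, as a percentage."""
    return (a_fps / b_fps - 1.0) * 100.0

# Hypothetical frame rates chosen only to illustrate the calculation:
# a ~41% gain means the newer part renders ~1.41x the frames per second.
print(f"{pct_faster(50.7, 36.0):.0f}% faster")  # prints "41% faster"
```

Note that "X% faster" and "X% advantage" are always relative to the slower part's frame rate, which is why the gap between Llano and Ivy Bridge can shrink in absolute fps terms while the percentage stays similar.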

195 Comments

  • taltamir - Monday, March 12, 2012

    Rarson is correct.
    He isn't suggesting no IGP at all. He is saying put a good IGP on the lower end.

    While there ARE people who need a powerful CPU and will not get a video card because they don't play games, those people do not in any way benefit from having a higher end IGP.

    High end gamers = discrete GPU + Powerful CPU
    Budget gamers = IGP + mid-low range CPU
    Non gamers with money = High end CPU + IGP (underused)
    Non gamers on a budget = Mid-low range CPU + IGP (underused)

    The only people who need a more powerful GPU are the budget gamers and thus it makes sense on the lower end CPUs to have a more powerful IGP.
  • Urillusion17 - Monday, March 12, 2012

    Great article but.... where are the temps??? The few benches I have seen don't mention overclocking, and if they do, they do not mention temps. I am hearing this chip can boil water! I would think that would be as important as anything else...
  • DrWattsOn - Tuesday, March 13, 2012

    +1 (very much in agreement)
  • boogerlad - Wednesday, March 14, 2012

    Is it possible to fully load the IGP with an OpenCL application and not affect CPU performance at all? From what I've read, it appears the IGP shares the cache with the CPU, so will that affect performance?
  • rocker123 - Monday, March 19, 2012

    Generational performance improvements on the CPU side generally fall in the 20 - 40% range. As you've just seen, Ivy Bridge offers a 7 - 15% increase in CPU performance over Sandy Bridge - making it a bonafide tick from a CPU perspective

    Should be: Generational performance improvements on the GPU side generally fall in the 20 - 40% range
  • tipoo - Monday, March 19, 2012

    They give the drivers their own tweaks and bug fixes, but I doubt they could do something like add T&L without the manufacturer's support. In fact, they didn't, unless they have bigger driver teams now.
  • ClagMaster - Wednesday, March 21, 2012

    "Personally, I want more and I suspect that Haswell will deliver much of that. It is worth pointing out that Intel is progressing at a faster rate than the discrete GPU industry at this point. Admittedly the gap is downright huge, but from what I've heard even the significant gains we're seeing here with Ivy will pale in comparison to what Haswell provides."

    Personally, I believe on-board graphics will never be on par with a dedicated graphics part. And it is obsessive-compulsive ridiculous to compare the performance of the HD4000 with discrete graphics and complain it's not as good.

    The HD4000 is meant for providing graphics for business and multi-media computers. And for that purpose it is outstanding.

    If you want gaming or engineering workstation performance, get a discrete graphics card. And stop angsting about how bad onboard graphics is compared to discrete graphics.
  • pottermd - Thursday, March 22, 2012

    Today's desktop processors are more than fast enough to do professional level 3D rendering at home.

    The article contained this statement. It's not really true. I've had a long nap and the render I'm doing is still running. :)
  • Dracusis - Friday, April 6, 2012

    "The people who need integrated graphics"

    No one *needs* integrated graphics. But not everyone needs discrete graphics. The higher performance an IGP has, the fewer people overall will *need* DGPs.

    Not all games need dedicated graphics cards, just the multi-million-dollar re-hashed CODs that choke retail stores. There are literally thousands of other games around that only require a small amount of graphics processing power. Flash now has 3D accelerated content and almost every developer using it will target IGP performance levels. Almost all casual game developers target IGPs as well; they're not selling to COD players. Sure, most of those games won't need a high-end CPU either, but people don't buy computers to play casual games, they buy them for a massive range of tasks, the vast majority of which will be CPU bound, so faster would be better.

    Also, as an indie game developer I hit performance walls with CPUs more often than I do with GPUs. You can always scale back geometry/triangle counts, or trim or cut certain visual effects, but cutting back on CPU-related overheads generally means you're cutting out gameplay.
