Resolution Scaling with Intel HD Graphics 3000

All of our tests on the previous page were done at 1024x768, but how much of a hit do you really get when you push higher resolutions? Does the gap widen between a discrete GPU and Intel's HD Graphics as you increase resolution?

On the contrary: low-end GPUs run into memory bandwidth limitations just as quickly as (if not quicker than) Intel's integrated graphics. Spend about $70 and you'll see a wider gap, but if you pit Intel's HD Graphics 3000 against a Radeon HD 5450, the two actually get closer in performance as resolution increases, at least in memory bandwidth bound scenarios.

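To get an intuitive feel for why bandwidth is the limiter here, a back-of-the-envelope estimate of framebuffer traffic helps. The sketch below uses hypothetical assumptions (32-bit color and depth, 2x overdraw, no texture traffic); none of these numbers come from our testing:

```python
# Rough framebuffer-traffic estimate (hypothetical assumptions: 32-bit
# color read+write, 32-bit depth read+write, ~2x overdraw; real workloads
# add texture fetches on top of this).
BYTES_PER_PIXEL_TOUCHED = 4 * 4   # color read/write + depth read/write
OVERDRAW = 2.0

def gb_per_second(width: int, height: int, fps: float) -> float:
    """Raw color/depth traffic in GB/s at a given resolution and frame rate."""
    pixels_touched = width * height * OVERDRAW
    return pixels_touched * BYTES_PER_PIXEL_TOUCHED * fps / 1e9

for w, h in [(1024, 768), (1366, 768), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: ~{gb_per_second(w, h, 60):.1f} GB/s at 60 fps")
```

Even this simplistic model eats a noticeable slice of the roughly 13GB/s a 64-bit DDR3 Radeon HD 5450 has to itself, or of the dual-channel DDR3 the IGP shares with the CPU; once texturing is layered on top, both run out of headroom at similar resolutions.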

Call of Duty: Modern Warfare 2 stresses compute a bit more at higher resolutions, and thus the performance gap widens rather than closes.

For the most part, at low quality settings, Intel's HD Graphics 3000 scales with resolution similarly to a low-end discrete GPU.
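A quick way to sanity-check that claim is a naive fill-rate model that scales frame rate inversely with pixel count. The 60 fps baseline at 1024x768 below is made up purely for illustration; real games deviate from it once shader compute or memory bandwidth becomes the bottleneck:

```python
# Naive resolution-scaling model: frame rate assumed inversely
# proportional to pixel count. The 60 fps baseline is hypothetical.
BASE_W, BASE_H = 1024, 768
BASE_FPS = 60.0

def projected_fps(width: int, height: int) -> float:
    """Project frame rate at a new resolution from the 1024x768 baseline."""
    return BASE_FPS * (BASE_W * BASE_H) / (width * height)

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}: ~{projected_fps(w, h):.0f} fps projected")
```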

Graphics Quality Scaling

The biggest issue with integrated graphics, and with low-end graphics in general, is that you have to run games at absurdly low quality settings to avoid dropping below smooth frame rates. The impact of moving to higher quality settings is much greater on Intel's HD Graphics 3000 than on a discrete card, as you can see in the chart below.

The performance gap between the two is actually at its widest at WoW's "Good" quality settings. Moving beyond that, however, shrinks the gap a bit as the Radeon HD 5450 runs into memory bandwidth/compute bottlenecks of its own.


283 Comments


  • hmcindie - Monday, January 3, 2011

    Why is it that Quick Sync has better scaling? It's very evident in the Dark Knight police car image, as all the other versions have definite scaling artifacts on the car.

    Scaling is something that should be very easy. Why is there such a big difference? Are these programs just made to market new stuff, and no-one really uses them because they suck? Such big scaling differences between codepaths make no sense.
  • JarredWalton - Monday, January 3, 2011

    It looks to me like some of the encodes have a sharpening effect applied, which is either good (makes text legible) or bad (aliasing effects) depending on your perspective. I'm quite happy overall with the slightly blurrier QS encodes, especially considering the speed.
  • xxxxxl - Monday, January 3, 2011

    I've been so looking forward to SB...only to hear that H67 can't overclock the CPU?!
    Disappointed.
  • digarda - Monday, January 3, 2011

    Who needs the IGP for a tuned-up desktop PC anyway? Some for sure, but I see the main advantages of the SB GPU for business laptop users. As the charts show, for desktop PC enthusiasts, the GPU is still woefully slow, being blown away even by the (low-end) Radeon 5570. For this reason, I can't help feeling that the vast majority of overclockers will still want to have discrete graphics.

    I would have preferred the dual core (4-thread) models to have (say) 32 shaders, instead of the 6 or 12 being currently offered. At 32nm, there's probably enough silicon real estate to do it. I guess Intel simply didn't want the quad core processors to have lower graphics performance than the dual core ones (sigh).

    Pity that the socket 2011 processors (without a GPU) are apparently not going to arrive for nearly a year (Q4 2011). I had previously thought the schedule was Q3 2011. Hopefully, AMD's Bulldozer-based CPUs will be around (or at least imminent) by then, forcing Intel to lower the prices for its high-end parts. Anyway, time to go - looks like I'm starting to dream again...
  • Exodite - Monday, January 3, 2011

    Using myself as an example, the drawback of limiting overclocking on H67 is the lack of a good selection of overclocking-friendly micro-ATX boards, due to most, if not all, of those being H67.

    Granted, that's not Intel's fault.

    It's just that I have no need for more than one PCIe x16 slot and 3 SATA ports (DVD, HDD, SSD). I don't need PCI, PS/2, serial, parallel or floppy connectors at all.

    Which ideally means I'd prefer a rather basic P67 design in micro-ATX format but those are, currently, in short supply.

    The perfect motherboard, for me, would probably be a P67 micro-ATX design with the mandatory x8/x8 Crossfire support, one x1 and one x4 slot, front panel connector for USB 3, dual gigabit LAN and the base audio and SATA port options.

    Gigabyte?

    Anyone? :)
  • geofelt - Monday, January 3, 2011

    The only P67 based micro-ATX motherboard I have found to date is the Asus P8P67-M Pro (or Evo?).

    Any others?
  • Rick83 - Monday, January 3, 2011

    There's also a non-pro P8P67-M.

    Keep in mind though, that the over-clocking issue may not be as bad as pointed out. There are H67 boards being marketed for over-clocking ability and manuals showing how to adjust the multiplier for CPUs... I'm not yet convinced over-clocking will be disabled on H67.
  • smilingcrow - Monday, January 3, 2011

    Major bummer as I was going to order a Gigabyte H67 board and an i5-2500K but am put off now. They seem to over-clock so well and with low power consumption that it seemed the perfect platform for me…
    I don’t mind paying the small premium for the K editions but being forced to use a P67 and lose the graphics and have difficulty finding a mATX P67 board seems crazy!

    I wonder if this limit is set in the chipset, or whether it can be changed with a BIOS update?
  • DanNeely - Monday, January 3, 2011

    Quick Sync only works if the IGP is in use (may be fixable via drivers later); for anyone who cares about video encoding performance that makes the IGP a major feature.
  • mariush - Monday, January 3, 2011

    On the Dark Knight test...

    Looking at the Intel software encoding and the AMD encoding, it looks like the AMD one is more washed out overall, which makes me think there's actually something related to color spaces or color space conversion involved....

    Are you guys sure there's no PC/TV mixup there with the luminance or ATI using the color matrix for SD content on HD content or something like that?
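On mariush's point above, the two classic causes of a washed-out encode are easy to show in code. The sketch below is purely illustrative and makes no claim about what Intel's or AMD's encoders actually do: BT.601 vs. BT.709 matrix mixups shift colors subtly, while treating limited-range (16-235) video as full-range (0-255) lifts blacks and dulls whites.

```python
# Two classic "washed out" culprits in video pipelines, sketched in code.
# Illustrative only; no claim about what either vendor's encoder does.

def luma_from_rgb(r: float, g: float, b: float, bt709: bool = True) -> float:
    """Luma from RGB (0-1) using the BT.709 or BT.601 coefficients."""
    kr, kb = (0.2126, 0.0722) if bt709 else (0.299, 0.114)
    return kr * r + (1 - kr - kb) * g + kb * b

def full_to_limited(y: float) -> float:
    """Squeeze full-range luma (0-255) into TV/video range (16-235)."""
    return 16 + y * (235 - 16) / 255

# Pure green has noticeably different luma under the two matrices:
print(luma_from_rgb(0, 1, 0, bt709=True))    # ~0.715
print(luma_from_rgb(0, 1, 0, bt709=False))   # ~0.587

# Decoding limited-range video as if it were full range lifts blacks:
print(full_to_limited(0))    # black encodes as 16, displayed as dark gray
```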
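Similarly, the blur-versus-sharpen trade-off JarredWalton describes further up is easy to reproduce with ordinary resampling filters. This sketch uses Pillow and a hypothetical source frame; it says nothing about which filters the Quick Sync, x86, or AMD code paths actually use internally:

```python
# Blur vs. sharpen trade-off in image scaling, shown with Pillow.
# "frame.png" is a hypothetical source frame; the filter choice here
# is illustrative and unrelated to any encoder's internal scaler.
from PIL import Image

src = Image.open("frame.png")
target = (1280, 720)

soft = src.resize(target, Image.BILINEAR)   # softer, hides aliasing
sharp = src.resize(target, Image.LANCZOS)   # crisper text, can ring/alias

soft.save("frame_bilinear.png")
sharp.save("frame_lanczos.png")
```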
