Integrated Graphics - Slower than AMD, Still Perfect for an HTPC

Intel was very careful to seed reviewers with the Core i5 661: its integrated graphics performance is equal to, if not better than, the best integrated graphics from AMD and NVIDIA.

The same, unfortunately, can’t be said about the Core i3 530. Its graphics core runs at 733MHz versus 900MHz on the 661, roughly 81% of the GPU clock, so the i3’s graphics are predictably slower. It’s not a huge drop, but it’s enough to be noticeable and enough to fall behind AMD:

1024 x 768                          | Batman: AA | Call of Duty: Modern Warfare 2 | Dawn of War II | Dragon Age: Origins | HAWX   | World of Warcraft
Intel Core i5 661 (HD Graphics)     | 35 fps     | 21.6 fps                       | 15.0 fps       | 41.5 fps            | 53 fps | 14.8 fps
Intel Core i3 530 (HD Graphics)     | 28 fps     | 17.5 fps                       | 9.5 fps        | 34.4 fps            | 45 fps | 12.5 fps
AMD Phenom II X4 965 (790GX)        | 35 fps     | 29.3 fps                       | 12.1 fps       | 35.6 fps            | 58 fps | 21.1 fps
Intel Core 2 Duo E8600 (GMA X4500)  | 15 fps     | failed                         | 1.4 fps        | 16.8 fps            | 26 fps | 11.7 fps
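
To put the drop in perspective, here's a quick sketch (built on the numbers in the table above; it isn't part of our benchmark suite) comparing each game's i3/i5 frame-rate ratio against the 733MHz/900MHz clock ratio:

    # Quick sanity check: how closely does the i3 530's frame-rate deficit
    # track its GPU clock deficit (733MHz vs. 900MHz)?
    i5_661 = {"Batman: AA": 35, "Call of Duty: Modern Warfare 2": 21.6,
              "Dawn of War II": 15.0, "Dragon Age: Origins": 41.5,
              "HAWX": 53, "World of Warcraft": 14.8}
    i3_530 = {"Batman: AA": 28, "Call of Duty: Modern Warfare 2": 17.5,
              "Dawn of War II": 9.5, "Dragon Age: Origins": 34.4,
              "HAWX": 45, "World of Warcraft": 12.5}

    clock_ratio = 733 / 900
    print(f"GPU clock ratio (i3/i5): {clock_ratio:.0%}")  # ~81%
    for game, i5_fps in i5_661.items():
        fps_ratio = i3_530[game] / i5_fps
        print(f"{game:32s} {fps_ratio:.0%} of the i5 661's frame rate")

Most titles land within a few points of that ~81% clock ratio (80 - 85%); Dawn of War II is the outlier at roughly 63%, suggesting something other than raw GPU clock is holding it back there.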

The i3 does retain all of the sweet TrueHD/DTS-HD MA bitstreaming support that makes Clarkdale the perfect HTPC platform. If you don’t need the extra CPU power, the Core i3 530 could make for a great HTPC.

Comments

  • Anand Lal Shimpi - Friday, January 22, 2010

    No you're pretty much right on the money. If you have a good system already, you're always better off waiting until the next big thing. In my eyes Penryn wasn't a very big deal if you already had Conroe. Clarkdale's real advantages are in power consumption and threaded performance. If you already have a quad-core CPU, chances are that you'll want to be looking at Lynnfield/Phenom II X4 or wait for Sandy Bridge if you can.

    Take care,
    Anand
  • tno - Friday, January 22, 2010

    Cool! A reply from the man himself! Thanks, Anand! My leap was from a 2.4GHz Celeron to a PD805 to Penryn, so Penryn seemed like a revelation, highly efficient, easy to cool, fast and quadcore. Now, if you happen to have any loose systems that you're not using and want to send my way so I can experience the Lynnfield difference myself, I won't object.

    tno
  • kwrzesien - Friday, January 22, 2010

    I had an AMD 1.2 GHz single-core with a GeForce2 MX. It was a HUGE upgrade!
  • lopri - Friday, January 22, 2010

    [QUOTE]We are still running into an issue with MPC-HC and video corruption with DXVA enabled on the 790GX, but haven't been able to fix it yet. Have any of you had issues with video corruption with AMD graphics and the latest stable build of MPC-HC for 64-bit Windows? Or should we chalk it up to being just another day in the AnandTech labs.[/QUOTE]

    Instead of such fleeting one-liners, how about telling us the title, format, and codec in question so that we can verify it? This is the finest example of yellow journalism.

    I'm still waiting for an answer whether 2560x1600 and dual-displays work with these CPUs. Considering the silence, however, I think I know the answer.
  • Anand Lal Shimpi - Friday, January 22, 2010

    It's a Dark Knight rip we use. Take the original Blu-ray, use AnyDVD HD to strip out the DRM, re-encode to reduce file size and toss into an mkv container. The problem appears on all H.264 content played through MPC-HC, though.

    As far as resolution support goes, Intel lists 2560 x 1600 as the maximum resolution available over Display Port. For DVI/HDMI you're limited to 1920 x 1200. VGA will get you up to 2048 x 1536.

    There are four independent display ports, so in theory you should be able to run one 2560 x 1600 panel and one 1920 x 1200 (or two 25x16 panels if you had a board with dual DisplayPort outputs).

    Take care,
    Anand
  • lopri - Friday, January 22, 2010

    Thank you for the explanation, but unfortunately I couldn't replicate the 'problem' (what exactly?) you've experienced. I don't have The Dark Knight, so I tried Children of Men on a neighbor's 785G system I built for him. That title was chosen because its original content on the disc was encoded in VC-1, just like The Dark Knight. MediaInfo gave the following information:

    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : High@L4.1
    Format settings, CABAC : Yes
    Format settings, ReFrames : 4 frames
    Muxing mode : Container profile=Unknown@4.1
    Codec ID : V_MPEG4/ISO/AVC
    Duration : 1h 49mn
    Bit rate : 14.2 Mbps
    Nominal bit rate : 14.5 Mbps
    Width : 1 920 pixels
    Height : 1 040 pixels
    Display aspect ratio : 16:9
    Frame rate : 23.976 fps
    Resolution : 24 bits
    Colorimetry : 4:2:0
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.296
    Stream size : 10.8 GiB (88%)
    Title : Video @ 14489 kbps
    Writing library : x264 core 67 r1165M 6841c5e

    Flawless playback in both windowed and full-screen mode, on a 30" LCD. Just to be sure, I tested The Dark Knight trailer, which is a VC-1 clip, and various H.264 content in .mkv, .mp4, and .m2ts. Using MPC-HC svn 1.3.3347 32-bit AND 64-bit binaries. The system had a WHQL driver dated 8/17/2009, installed via Windows Update. The only codecs installed are Matroska Splitter and AC3Filter.

    So there. Now, what exactly is the problem that I don't see but you do?

    WRT resolutions - Intel listed 2560x1600 on G45 as well. I even got an ADD2 (interesting choice of name, btw) card off eBay hoping it'd work, but that was simply a waste of money. I am as skeptical as can be on GMA after my bitter experiences with G35/G45, and it is puzzling why you can't verify that in your lab instead of being a messenger ("Intel says so").

    Would you feel bad at all if I say I purchased G35/G45 based on your reviews, only to be greatly disappointed? I couldn't even give away a G35 based system to a junior-high kid, because the kid is someone I see on and off and I feared a potential embarrassment and unexpected calls for support.

    Your reviews are full of contradictions one after another, and I am concerned whether you've lost the sense and connection to the real world.
  • Shadowmaster625 - Friday, January 22, 2010

    Given the level of integration, what is making these motherboards so expensive? When are we going to see $35 motherboards? What would keep the prices from coming down that low?
  • strikeback03 - Friday, January 22, 2010

    IIRC the chipset itself currently costs $40.
  • Anand Lal Shimpi - Friday, January 22, 2010

    Correct. Despite moving much of the "chipset" on-package, the actual H5x chipsets are no cheaper than their predecessors. Remember that as AMD and Intel integrate more onto the CPU they still want to preserve or increase profit margins. It's just fortunate for all of us that in the process of integration we actually get a performance benefit.

    Take care,
    Anand
  • Taft12 - Friday, January 22, 2010

    Sounds like we are very much in need of competition in the 3rd party chipset market like the good old days!

    Things are going in the wrong direction with NVIDIA exiting the market, Via and SiS long gone...
