Final Words

From the balanced notebook perspective, Arrandale is awesome. Battery life doesn't improve, but performance goes up tremendously; the net result is substantially better performance at, we hope, the same power consumption. If you're stuck with an aging laptop, Arrandale is worth the wait. If you can wait even longer, we expect to see a second revision of Arrandale silicon with better power characteristics toward the middle of the year. Let's look at some other mobile markets, though.

If what you're after is raw, unadulterated performance, there are still faster options. We compared Arrandale with a Core 2 Duo P8700, and performance went up; if you already have something with a Core i7-720QM (or another mobile i7) or a Core 2 Quad, the figures aren't nearly so rosy. The catch is that battery life on quad-core CPUs, frankly, stinks: most of the time you're lucky to get more than 90 minutes even under light loads. For maximum mobile performance, Clarksfield is still the winner (or grab a notebook with a desktop Core i7).

We're also still missing a replacement for the ultra-long battery life of the Core 2 Ultra Low Voltage (CULV) parts. True, Intel has some low voltage 18W TDP parts running at 1.06GHz to 1.20GHz stock (with Turbo up to 1.86GHz to 2.26GHz depending on the model), but our current results suggest that CULV + GS45 will remain far more compelling for those who want long battery life while maintaining some level of performance. Alternatively, Pine Trail/Pineview (Atom N450) offers extreme battery life at the cost of performance. It looks like Arrandale needs some further tweaking before we see an heir to the CULV throne.

Ultimately, we like Arrandale a lot as a balanced mobile offering. It's not going to be as fast as Clarksfield, but that was never the point. Performance is 20% better in typical applications compared to mobile dual-core Penryn parts like the P8000 and P9000 series, and battery life at least didn't go down (in most cases). It's also nice to see integrated Intel graphics that don't suck… or at least, they only suck as badly as the current AMD and NVIDIA IGPs. We'll look at doing more testing with Arrandale's IGP in a future article when we have final shipping hardware, as the ability to limit CPU performance in order to boost GPU speeds is intriguing.

If you're after a "typical" laptop, Arrandale solutions should be high on your list. We expect to see a ton of announced models at CES this week, and we'll do our best to cover them (along with Pine Trail netbooks). We still can't recommend any particular laptop as a solution for every problem, as different users have different needs, but Arrandale brings more choice to the table and choice is a good thing.


38 Comments


  • bsoft16384 - Monday, January 4, 2010 - link

    The biggest problem with Intel graphics isn't performance - it's image quality. Intel's GPUs don't have AA and their AF implementation is basically useless.

    Add in the fact that Intel recently added a 'texture blurring' feature to their drivers to improve performance (which is, I believe, on by default), and you end up with quite a different experience compared with a Radeon 4200 or GeForce 9400M based solution, even if the performance is nominally similar.

    Also, I've noticed that Intel graphics do considerably better in benchmarks than they do in the real world. The Intel GMA X4500MHD in my CULV-based Acer 1410 scores ~650 in 3DMark06, which is about 50% "faster" than my friend's 3-year-old GeForce 6150-based AMD Turion notebook. But get in-game, with some particle effects going, and the Intel falls flat on its face (~3-4fps) while the GeForce 6150 still manages to chug along at 15fps or so.
  • bobsmith1492 - Monday, January 4, 2010 - link

    That is, Intel's integrated graphics are so slow that even if they offered AA/AF, you couldn't actually use them. The same goes for low-end NVIDIA integrated graphics as well.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Not true for NV/AMD. WoW, for example, runs fine with AA/AF on a GeForce 9400. It runs decently with AF on the Radeon 3200 too.

    Remember that 20fps is actually pretty playable in WoW with a hardware cursor (the cursor stays responsive even when the frame rate drops).
  • bobsmith1492 - Monday, January 4, 2010 - link

    Do you really think you can actually use AA/AF on an integrated Intel video processor? I don't believe your point is relevant.
  • MonkeyPaw - Monday, January 4, 2010 - link

    Yes, since AA and AF can really help the appearance of older titles. Some of us don't expect an IGP to run Crysis.
  • JarredWalton - Monday, January 4, 2010 - link

    The problem is that AA is really memory intensive, even in older titles. It can roughly double the bandwidth requirements, and since the IGP is already sharing memory bandwidth with the CPU, it becomes a severe bottleneck. I've never seen an IGP run 2xAA at a reasonable frame rate.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Newer AMD/NV GPUs have a lot of bandwidth saving features, so AA is pretty reasonable in many less demanding titles (e.g. CS:S or WoW) on the HD4200 or GeForce 9400.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    And, FYI, yes, I've tried both. I had a MacBook Pro (13") briefly, and while I ultimately decided that the graphics performance wasn't quite good enough (compared with, say, my old T61 with a Quadro NVS140m), it was still night and day compared with the GMA X4500.

    The bottom line in my experience is that the GMA has worse quality output (particularly texture filtering) and that it absolutely dies with particle effects or lots of geometry.

    WoW is not at all a shader-heavy game, but it can be surprisingly geometry and texture heavy for low-end cards in dense scenes. The Radeon 4200 is "only" about 2x as fast as the GMA X4500 in most benchmarks, but if you go load up demanding environments in WoW you'll notice that the GMA is 4 or 5 times slower. Worse, the GMA X4500 doesn't really get any faster when you lower the resolution or quality settings.

    Maybe the new generation GMA solves these issues, but my general suspicion is that it's still not up to par with the GeForce 9400 or Radeon 4200 in worst-case performance or image quality, which is what I really care about.
  • JarredWalton - Tuesday, January 5, 2010 - link

    Well, that's the rub, isn't it: GMA 4500MHD is not the same as the X4500 in the new Arrandale CPUs. We don't know everything that has changed, but performance alone shows a huge difference. We went from 10 shader units to 12 and performance at times more than doubled. Is it driver optimizations, or is it better hardware? I'm inclined to think it's probably some of both, and when I get some Arrandale laptops to test I'll be sure to run more games on them. :-)
  • dagamer34 - Monday, January 4, 2010 - link

    Sounds like performance increased while battery life was just "meh". However, does the performance increase factor in Arrandale's Turbo Boost, or was the clock speed locked at the same rate as the Core 2 Duo's?

    And what about how battery life is affected by boosting performance with Turbo Boost? I guess we'll have to wait for production models for more definitive answers (I'm basically waiting for the next-gen 13.3" MacBook Pro to replace my late-2006 MacBook Pro).
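
Jarred's point above about AA roughly doubling bandwidth can be sketched with some back-of-envelope arithmetic. The function below is a hypothetical, illustrative estimate (the resolution, overdraw factor, and per-pixel traffic model are assumptions, not measurements, and it assumes no color/Z compression, as on older IGPs):

```python
def fb_bandwidth_gbps(width, height, fps, bytes_per_pixel=4,
                      overdraw=3.0, aa_samples=1):
    """Very rough framebuffer traffic estimate: one color write plus a
    depth read and write per covered pixel, scaled by AA sample count."""
    covered_pixels = width * height * overdraw
    bytes_per_frame = covered_pixels * bytes_per_pixel * 3 * aa_samples
    return bytes_per_frame * fps / 1e9  # GB/s

no_aa = fb_bandwidth_gbps(1280, 800, 30)                # roughly 1.1 GB/s
aa_2x = fb_bandwidth_gbps(1280, 800, 30, aa_samples=2)  # roughly 2.2 GB/s
```

Without compression, 2x MSAA simply doubles the framebuffer traffic, which is a meaningful chunk of the memory bandwidth an IGP shares with the CPU.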
