The Platform

ASUS and Intel partnered up to send us an Arrandale system to test. It's a pre-production K42 notebook.

I won't comment on the build quality because, honestly, it's not very good. From what ASUS has told me, it has already improved considerably; we simply received very rough pre-production samples.

As far as I'm concerned, it served its purpose as it gave me a great platform for measuring Arrandale performance.

I compared its performance to an HP Montevina system. Both systems used 4GB of DDR3-1066 and had CPUs running at 2.53GHz. The Core 2 Duo P8700 was our sample from the previous generation and we compared it to the Core i5-540M.

Comments

  • bsoft16384 - Monday, January 4, 2010 - link

    The biggest problem with Intel graphics isn't performance - it's image quality. Intel's GPUs don't have AA and their AF implementation is basically useless.

    Add in the fact that Intel recently added a 'texture blurring' feature to their drivers to improve performance (which is, I believe, on by default) and you end up with quite a different experience compared with a Radeon 4200 or GeForce 9400M based solution, even if the performance is nominally similar.

    Also, I've noticed that Intel graphics do considerably better in benchmarks than they do in the real world. The Intel GMA X4500MHD in my CULV-based Acer 1410 scores around 650 in 3DMark06, which is about 50% "faster" than my friend's 3-year-old GeForce 6150-based AMD Turion notebook. But get in-game, with some particle effects going, and the Intel falls flat on its face (~3-4fps) while the GeForce 6150 still manages to chug along at 15fps or so.
  • bobsmith1492 - Monday, January 4, 2010 - link

    That is, Intel's integrated graphics are so slow that even if they offered AA/AF, you couldn't actually use them. The same goes for low-end Nvidia integrated graphics.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Not true for NV/AMD. WoW, for example, runs fine with AA/AF on the GeForce 9400. It runs decently with AF on the Radeon 3200 too.

    Remember that 20fps is actually pretty playable in WoW with a hardware cursor (the cursor stays smooth even when the game drops to 20fps).
  • bobsmith1492 - Monday, January 4, 2010 - link

    Do you really think you can actually use AA/AF on an integrated Intel video processor? I don't believe your point is relevant.
  • MonkeyPaw - Monday, January 4, 2010 - link

    Yes, since AA and AF can really help the appearance of older titles. Some of us don't expect an IGP to run Crysis.
  • JarredWalton - Monday, January 4, 2010 - link

    The problem is that AA is really memory intensive, even on older titles. Basically, it can double the bandwidth requirements, and since you're already sharing bandwidth with the CPU, it's a severe bottleneck. I've never seen an IGP run 2xAA at a reasonable frame rate.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    Newer AMD/NV GPUs have a lot of bandwidth saving features, so AA is pretty reasonable in many less demanding titles (e.g. CS:S or WoW) on the HD4200 or GeForce 9400.
  • bsoft16384 - Tuesday, January 5, 2010 - link

    And, FYI, yes, I've tried both. I had a MacBook Pro (13") briefly, and while I ultimately decided that the graphics performance wasn't quite good enough (compared with, say, my old T61 with a Quadro NVS140m), it was still night and day compared with the GMA X4500.

    The bottom line in my experience is that the GMA has worse quality output (particularly texture filtering) and that it absolutely dies with particle effects or lots of geometry.

    WoW is not at all a shader-heavy game, but it can be surprisingly geometry and texture heavy for low-end cards in dense scenes. The Radeon 4200 is "only" about 2x as fast as the GMA X4500 in most benchmarks, but if you go load up demanding environments in WoW you'll notice that the GMA is 4 or 5 times slower. Worse, the GMA X4500 doesn't really get any faster when you lower the resolution or quality settings.

    Maybe the new generation GMA solves these issues, but my general suspicion is that it's still not up-to-par with the GeForce 9400 or Radeon 4200 in worst-case performance or image quality, which is what I really care about.
  • JarredWalton - Tuesday, January 5, 2010 - link

    Well, that's the rub, isn't it: GMA 4500MHD is not the same as the X4500 in the new Arrandale CPUs. We don't know everything that has changed, but performance alone shows a huge difference. We went from 10 shader units to 12 and performance at times more than doubled. Is it driver optimizations, or is it better hardware? I'm inclined to think it's probably some of both, and when I get some Arrandale laptops to test I'll be sure to run more games on them. :-)
  • dagamer34 - Monday, January 4, 2010 - link

    Sounds like while performance increased, battery life was just "meh". However, does the increased performance factor in the Turbo Boost that Arrandale supports, or was the clock speed locked at the same rate as the Core 2 Duo's?

    And what about how battery life is affected by boosting performance with Turbo Boost? I guess we'll have to wait for production models for more definitive answers (I'm basically waiting for the next-gen 13.3" MacBook Pro to replace my late-2006 MacBook Pro).
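The bandwidth argument raised in the comments can be sketched with back-of-the-envelope arithmetic. Everything in this sketch is an illustrative assumption (resolution, average overdraw, bytes per pixel), not a measurement of any specific IGP:

```python
# Rough sketch of why MSAA strains an IGP's shared memory bus.
# All figures below are illustrative assumptions, not measurements.

def fb_traffic_gb_s(width, height, fps, samples=1,
                    bytes_per_pixel=8,   # assume 4B color + 4B depth
                    overdraw=3.0):       # assume each pixel touched ~3x/frame
    """Approximate color+depth framebuffer traffic per second, in GB/s."""
    return width * height * bytes_per_pixel * samples * overdraw * fps / 1e9

no_aa = fb_traffic_gb_s(1280, 800, 30)             # ~0.74 GB/s
msaa2 = fb_traffic_gb_s(1280, 800, 30, samples=2)  # ~1.47 GB/s, exactly 2x

print(f"no AA:   {no_aa:.2f} GB/s")
print(f"2x MSAA: {msaa2:.2f} GB/s")
```

With two color/depth samples per pixel, framebuffer traffic doubles. Against a single channel of DDR3-1066 (roughly 8.5 GB/s peak, shared with the CPU), that extra traffic is a meaningful slice of a bus that was already contended, which is consistent with why AA tends to collapse frame rates on integrated parts.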
