No Surprises: Quad-Core Sandy Bridge Is Still Fast

We’ve already looked at the i7-2630QM when we reviewed the ASUS G73SW and previewed the MSI GT680R. The difference here is that we’re now looking at a 15.6” chassis that’s lighter and thinner than the GT680R, and we’re pairing the CPU with a GT 540M instead of a GTX 460M. For applications, the GPU generally won’t matter, but the presence of Optimus Technology will definitely help in several areas: better battery life, and access to Intel’s Quick Sync for video transcoding. We’ll do a quick check of Quick Sync performance in a moment.

For the charts, we’re sticking mostly with mainstream laptops. We’ve had plenty of systems come through our tests in the past year that meet that classification, and you can make your own comparisons using Mobile Bench. For our applications charts, we’ve got quite a few Arrandale systems and a couple of Sandy Bridge units, with a smattering of GPUs ranging from IGP (HD 3000) up through GTX 460M. Given the pricing, the XPS 15 will mostly be playing in the $1000+ market, and our test system is very close to the price of laptops like the MSI GT680R and ASUS G73SW, so we’ll include the latter to show where the next performance tier lies. At the other end of the performance and pricing scale, we’re also including an AMD Brazos E-350 laptop, the HP dm1z. It’s not at all in competition with the other laptops—from either a price or performance standpoint—but it does offer plenty of battery life in an affordable package and we liked it enough to give it our Silver Editors’ Choice award.

The laptops we’ve chosen to highlight in this review are the Dell XPS L501x and L502x, in black and green, showing how the SNB update compares to the original. The ASUS G73SW is in red, showing where a faster GPU will get you, and the Sandy Bridge i7-2820QM with an SSD is in gold, providing the reverse picture: more CPU performance and a fast storage subsystem, but with a much slower IGP.

Futuremark PCMark Vantage

Futuremark PCMark05

3D Rendering - CINEBENCH R10 (Single-Threaded)

3D Rendering - CINEBENCH R10 (Multi-Threaded)

Video Encoding - x264 (First Pass)

Video Encoding - x264 (Second Pass)

PCMark always likes a fast storage subsystem, and it does give you some idea of how much more responsive a laptop can feel with a good SSD. The Sandy Bridge i7-2820QM ends up nearly twice as fast as the XPS 15 in PCMark Vantage, and 55% faster in the old PCMark05. PCMark also stresses the GPU a bit, which is why the ASUS G73SW ends up around 20% faster than the L502x. Overall, the i7-2820QM with SSD is 37% faster than the L502x, while the ASUS G73SW is just 7% faster. The L502x ends up as the third fastest laptop in our application tests, and the only other laptop to squeak out a lead in any individual result does so in the single-threaded Cinebench test, where the higher single-core Turbo of the K53E/i5-2520M wins out over core count.

Comparing with the original XPS 15 L501x is a bit easier, since the L502x uses similarly specced components that are all at least somewhat faster. The result is performance that’s 20% to 100% higher than the L501x, with the 100% increases coming in the highly threaded Cinebench and second pass x264 encoding tests. On average, the L502x is 57% faster than the L501x, though for more mundane office/Internet workloads it’s probably more like 30% faster.

As noted above, we also ran some tests of GPU accelerated video transcoding. We used CyberLink’s MediaEspresso and transcoded a 5323 frame 1080p24 video into 720p YouTube format using just the CPU, then with the GT 540M active, and finally using Quick Sync (i.e. HD 3000 active). With Quick Sync, MediaEspresso now has two encoding profiles available, fast and quality, so we tried both. In terms of performance, the CPU alone took 92 seconds, for a final speed of 58FPS. With the GT 540M, performance improved to 69 seconds/77FPS. Finally, Quick Sync with the “Quality” profile took 34 seconds (157FPS), while the “Fast” profile turned in the quickest transcoding time, requiring just 25 seconds—a very impressive 213FPS.
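For reference, those FPS figures are simply the clip’s 5323 frames divided by the elapsed time. A minimal sketch of the arithmetic, using only the timings quoted above:

```python
# Transcode throughput: FPS = total frames / elapsed seconds
FRAMES = 5323  # length of the 1080p24 source clip

times = {
    "CPU only":             92,  # seconds
    "GT 540M":              69,
    "Quick Sync (Quality)": 34,
    "Quick Sync (Fast)":    25,
}

for path, seconds in times.items():
    # Prints 58, 77, 157, and 213 FPS respectively
    print(f"{path:<22} {seconds:3d}s -> {FRAMES / seconds:.0f} FPS")
```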

As far as transcoding quality goes, subjectively none of the encodes were all that great, showing a clear loss of fidelity from the original 1080p24 source—though that’s expected, considering the final file size was about 10% of the original. I also didn’t notice the issues we previously saw with CUDA encoding, but the home video I used may not be the best for picking out such details. (We noticed the problems originally in Arcsoft’s Total Media Converter, so it was likely just their implementation of CUDA transcoding rather than a general problem with CUDA, and the latest version might have fixed things.) A final interesting point: right now, NVIDIA’s Optimus Technology detects MediaEspresso and by default uses the discrete GPU, even though Intel’s Quick Sync is more than twice as fast. Thankfully, you can modify the profile to prefer Intel’s IGP—and on Arrandale’s IGP the dGPU would be preferable—but at present it doesn’t look like NVIDIA’s profiles are smart enough to detect your IGP and determine which path is optimal. That could be a problem down the road as Ivy Bridge and future IGPs continue to improve performance, but hopefully software updates will address the concern.

Okay, that’s enough talk about general application performance. Let’s see how the L502x fares in synthetic graphics performance before we get to the games.

Futuremark 3DMark 11

Futuremark 3DMark Vantage (Overall)

Futuremark 3DMark Vantage (GPU)

Futuremark 3DMark06

We’re updating our 3DMark charts to focus more on modern workloads, so we’ve added 3DMark 11’s Performance default to our benchmark list, and we’ve included results for 3DMark Vantage’s Performance setting as well—though we don’t have results for all of the other laptops on those charts. We’re also skipping the charts for 3DMark03/05, though you can still see the results in Bench.

Interesting to note is that the 3DMark 11 Performance test appears to be almost entirely GPU limited, as even with a P520 processor the HD 5650 comes out 12% ahead of the GT 425M (and the 540M in turn is 7% faster than the 5650). Depending on how well that comparison holds up, the GTX 460M looks to be around 80% faster than the GT 540M. On paper, the GTX 460M has 101% more computational power and 108% more bandwidth than the 540M, so realizing an 80% performance increase would be about right.
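As a sanity check, those paper numbers fall straight out of the published specs. A quick sketch, assuming NVIDIA’s listed core counts, shader clocks, and memory configurations (96 cores at 1344MHz with 128-bit DDR3-1800 for the GT 540M; 192 cores at 1350MHz with 192-bit GDDR5 at 2.5GT/s for the GTX 460M):

```python
# Paper-spec ratios: GTX 460M vs. GT 540M
def compute_rate(cores, shader_mhz):
    """Relative shader throughput: CUDA cores x shader clock."""
    return cores * shader_mhz

def bandwidth_gbps(bus_bits, data_rate_mtps):
    """Memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return bus_bits / 8 * data_rate_mtps / 1000

gt540m  = (compute_rate(96, 1344),  bandwidth_gbps(128, 1800))  # 28.8 GB/s
gtx460m = (compute_rate(192, 1350), bandwidth_gbps(192, 2500))  # 60.0 GB/s

print(f"Compute:   +{(gtx460m[0] / gt540m[0] - 1) * 100:.0f}%")  # +101%
print(f"Bandwidth: +{(gtx460m[1] / gt540m[1] - 1) * 100:.0f}%")  # +108%
```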

Elsewhere, what we see is a familiar pattern: all of the GT 400M/500M and HD 5650 parts cluster near each other, with the XPS L502x generally coming out on top. The problem is the huge gulf between the GT 540M and the GTX 460M we just mentioned, never mind the top performing mobile GPUs like the HD 6970M and GTX 485M. So once again, this is a decidedly midrange mobile GPU that will struggle with modern games at higher quality settings and higher resolutions—which is what we’ll see next. As far as upgrades from the Arrandale platform, thanks to the increased GPU bandwidth and faster core clocks, plus the quad-core SNB CPU, the new L502x is around 25-35% faster than the old L501x in the 3DMark results. Now let’s find out if that same margin of victory holds in actual games.


76 Comments


  • JarredWalton - Wednesday, April 20, 2011 - link

    On the Dell site, it's the "XPS 15", but L502x is the actual full model name, to differentiate it from the original L501x. It's like the MacBook Pro 13/15/17 -- they don't specify which particular iteration you're discussing. I just happen to use the real model instead of the generic name so as to avoid confusion.
  • flyingpants1 - Wednesday, April 20, 2011 - link

    If I follow the link in the article and configure the $800 XPS 15 with all the options listed, the total comes out to $1505. Coupon code 932N$0ZCCHWZB9 for an additional $70 off brings it to $1435.

    However, if I follow the link on this page (http://goo.gl/jgfvV) and configure THAT XPS 15 with the same stuff, the total is now $1764. $425 in coupon codes brings it down to $1339. So it actually works out cheaper, and it includes a 2 year service plan instead of the 1 year.
  • Wave_Fusion - Wednesday, April 20, 2011 - link

    I'm not following which parts are plastic and which are metal.
    My old XPS M1210 was metal except in one key area: the palm rest.
    So after about a year I wore an E.T. shaped hand print in the cheap silver paint below the keyboard.

    It'd be a major disappointment to buy this one and find it still has crummy plastic where it shouldn't be.

    If I've said it once I've said it a thousand times: My DV7 is better.
    I don't get why no one seems to know about my computer, but eventually someone will review it besides myself.

    It's faster, cheaper, more stylish, and also has amazing sound, plus a 2 year warranty. I thought you guys announced the future launch of my computer, the DV6/DV7 spring refresh, but since then it's been dark.

    They destroy everything that's been reviewed since, and it's sad no one seems to know about them yet.
  • will2 - Wednesday, April 20, 2011 - link

    Re. your concluding paragraph, interested in your views of screen size/resolution combinations.

    As I move around frequently, I want a thin and light desktop replacement notebook for photo editing, multimedia playback, and general business use. Since my present 14" LCD with a 1440x900 gloss screen is tiring on the eyes during long hours of reading office documents, I concluded a 15" 1600x900 screen might be less tiring, as text renders slightly larger. Interested in your thoughts on that.

    With that in mind, I hope Anandtech can review the SNB Latitude E6520 with 1600x900 screen. Is that likely soon ?

    You surmise the M11x R3 is worth a look. Too small a screen for my needs, but I see Notebookcheck just reviewed it and said it's very powerful but let down by a poor screen: a meagre 150:1 contrast ratio, with blacks that are grey.
  • JarredWalton - Thursday, April 21, 2011 - link

    So it really depends on the individual, but I use a 30" desktop for most of my photo work, and when I have to go mobile I already feel cramped on 1080p (or even WUXGA). I've used 1440x900 and 1680x1050, and they're okay, but nothing beats resolution for photo editing in my book. The problem is, outside of photos, those fine dot pitches can really be a strain on the old eyes! Heck, my 30" is a strain these days.

    So then you use the DPI setting of Windows, but it mucks up certain programs and can be irritating, or you run at a lower resolution or use the "zoom" feature in your web browser and office applications as needed. I, incidentally, have used all of the above and continue to use them depending on my mood. My 30" LCD is set to 120dpi and it still feels a bit small on a lot of text.

    For laptops, it's a compromise either way. Personally, I've ended up with the following sizes as my preferred options. Others will obviously disagree, but as a 37-year-old I don't have the luxury of running ultra-tiny fonts anymore.
    -------------------
    <12.1" - I'm not really a fan of this size laptop. It's too small for me to type on comfortably, so I prefer 13 or 14". However, 1280x800 (or 1366x768) works okay. I've used a 10.1" laptop with a 1366x768 LCD, and it was often too small to read comfortably.

    13.3" - 1600x900 is a bit of a stretch for me at this size, but it's better than the 1366x768 alternative. 1280x800 is actually still better in my book, but too many laptops are moving to 16:9 aspect ratios.

    14" - 1440x900 or 1600x900 works best; I'm not willing to go lower res if I can avoid it, and higher res is too small for my tastes.

    15.6" - 1600x900 or 1920x1080 (or 1680x1050/1920x1200) are all fine here, though 1080p can be a bit small at times.

    >17" - must have 1080p or 1920x1200 resolution. (I haven't seen anything in recent history with a higher resolution than that.)
  • NICOXIS - Thursday, April 21, 2011 - link

    Is it possible to play games at native resolution with medium settings?

    Why should one get 540M when you can just do 525M with higher clocks? (besides additional memory)

    What is the difference between Intel Advanced-N 6230 and Intel Wireless-N 1030?

    If we replace the quad-core (Intel Core i7-2630QM) with a dual-core with higher clocks (Intel Core i5-2520M), should it get better performance in games and longer battery life?

    Appreciate any answers! :)
  • Wave_Fusion - Thursday, April 21, 2011 - link

    It might be possible, it might not. My HP DV7 can run most games at maximum at its native resolution of 1600 x 900, but my AMD 6770 is significantly faster than the 540M in this Dell.

    Medium settings at 1920 x 1080 could easily overwhelm a card like that. I'd recommend a unit with either Nvidia 555M, 560M; AMD Radeon HD 6770, 6870; or above.
    Or the 540M would be fine if you played at a lower resolution; but if it were me I'd choose either a smaller laptop at 1366 x 768 or a larger one at 1600 x 900.
    15 inches, to me, is a size at which neither of Dell's offered resolutions looks good.

    I believe the Advanced-N 6230 supports 5GHz band while the 1030 does not.

    Finally, my DV7 has a quad core i7-2720QM and switchable graphics with the AMD 6770. The new quad cores idle pretty low, but because the dual core processors have a lower maximum thermal output, they'll draw less power under heavy load. On the other hand a quad core might handle that same load without stressing it too hard.

    Short answer: Both are built on 32nm now, so probably little significant difference unless you're gaming or something.
  • JarredWalton - Thursday, April 21, 2011 - link

    All of the gaming charts include the results at both our standard resolution as well as the native 1080p of the LCD. You can see that medium detail is >30FPS in five of the eight tested games, but of course that will vary depending on the game (and what settings you define as "medium"). For 1080p, realistically I still feel like you would want at least a GT 555M, and the GTX 460M is where medium to high 1080p is viable.

    Overclocking of mobile GPUs is possible, but in practice they bin pretty heavily and I'm uncomfortable trying to run a laptop with an overclocked GPU. I suppose it's only a 12% overclock to push the 525M to 540M levels, but YMMV. Since replacing your GPU on a laptop is difficult at best (i.e. finding a compatible part that doesn't cost an arm and a leg), I'd exercise caution. Besides, a 10% core overclock (with the same RAM speed) means the real-world performance difference is quite small.

    WiFi 6230 gives 2.4 + 5.0 GHz connections, so it's a 2x2 MIMO setup. The 1030 is 1x2 and omits 5.0GHz support. Both support up to 300Mbps connections, but in the real world you'll probably max out around 144Mbps with actual throughput of around 7-9MBps (56-72Mbps).

    As far as CPU changes, we do have performance results for the i5-2520M in the charts (ASUS K53E). In games, the GPU is the major bottleneck. Remember that the 2630QM can still Turbo as high as 2.9GHz (2.8GHz dual-core, 2.7GHz tri-core, and 2.6GHz quad-core). The 2520M can go about 10% higher at 3.2GHz (3.1GHz dual-core), but unless you're playing at low resolutions and low details, you'll need more GPU than CPU.
  • NICOXIS - Thursday, April 21, 2011 - link

    thanks Wave and Jarred, appreciate you took the time to respond ;)

    Regarding GPU at medium settings, there's no laptop I've seen (with the L502x's footprint and weight) that offers a GT 555M or GTX 460M (or similar AMD solution) at that price point and price/features ratio.

    On the overclocking side, isn't the 525M the exact same chip as the 540M with clocks up 12%? Or are you saying 525M chips are ones that didn't qualify at higher speeds?

    And if that 10% overclock difference is minimal, why should anyone upgrade anyway?

    Reg 6230 vs 1030, what's the difference in practice?

    Sorry for the blast, but it's interesting to know :D

    Thanks
  • JarredWalton - Thursday, April 21, 2011 - link

    So all of the GT 525M, 540M, and 550M chips are the "same", but they're the same in the way that the i7-2630QM, i7-2720QM, i7-2820QM, and i7-2920XM are the "same" (barring differences in L3 cache sizes). Without actually looking at a large sample, I don't know if NVIDIA is tweaking voltage levels, but they're very likely doing some form of binning.

    Intel for example will do some tests to verify a chip can handle the desired speed at a reasonable voltage; if it can't do 2.2GHz at 1.25V (or whatever), then they'll try for a lower clock with that same voltage, or maybe raise the voltage slightly but drop the clock, etc. I don't know all of what Intel does or doesn't do with binning, but in general their higher-end (e.g. more expensive) CPUs run at lower voltages at stock and overclock better with more voltage. So, considering the number of chips NVIDIA is offering that are essentially the same, I suspect they have a way of determining which are the better chips.

    So if 10% is a minor difference in performance, why would anyone upgrade? Because they're generally uninformed. The 1GB to 2GB jump won't help at the settings where these chips run well -- which is usually lower resolutions at medium detail. 1GB to 2GB becomes useful when you're running at least 1080p with very large textures, and for that you really need at least a GTX 460M (or HD 5870M).

    This is actually one of the frustrating aspects of buying a laptop. So many companies will sell you a GT 420M/425M/435M/525M/540M/550M graphics chip with Optimus in a 14" or 15.6" laptop. (Those are all 96 CUDA cores with 128-bit DDR3 memory, so basically the only change is the clock speed of the cores, which ranges from 500MHz on the 420M up to 740MHz on the 550M.) Finding anything with the GT 445M/555M on the other hand.... Well, the Alienware M14x and Dell XPS 17 L702x are currently the only laptops with the GT 555M, and to my knowledge the old XPS 17 L701x was the only laptop with the GT 445M. I'm not a big fan of 17.3" notebooks, so that's why I think the M14x may be the best balance of price, size, and performance for a gaming laptop.

    Back to the wireless, in practice the difference is that if you have a wireless router that supports 2.4 and 5 GHz, you can get better throughput a lot of the time. Many less expensive routers don't support 5GHz, so in that case you'd see no difference at all. If you do have the appropriate router (I don't!), I understand that some people get much better performance on 5GHz because there's so much other traffic on 2.4GHz (e.g. 802.11b/g/n all use 2.4GHz, plus cordless phones and other devices).
