General Performance – Dual-Core Sandy Bridge vs. the World

So now we get to the numbers, and this is where some of the competing solutions will really take a beating. Let’s just cut straight to the chase and look at the graphs. We’ve highlighted the K53E in our standard bright green, with the ASUS U41JF in black, Apple's dual-core i5-2415M MBP13 in gold, and the quad-core i7-2820QM in yellow.

One interesting piece of information prior to the benchmark discussion is that despite having a theoretical maximum Turbo speed of 3.2GHz, we rarely see the i5-2520M hit that mark in testing. Using CPUID’s TMonitor utility, in the single-core Cinebench result we see both cores fluctuate between 800MHz and 3100MHz. It appears the Windows task scheduler isn’t quite sure how to best distribute the load, which is a common problem. However, in the multi-threaded Cinebench test the two CPU cores run at a constant 2.9GHz, as expected.
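
For readers who want to try a similar observation at home, here’s a minimal sketch of the idea, not the review’s actual methodology (we used CPUID’s TMonitor on Windows): sample the OS-reported clock while a single-threaded load runs. It assumes Python with psutil installed; frequency reporting granularity varies by platform.

```python
# Rough analogue of the TMonitor observation: sample the reported CPU clock
# while a single-threaded busy loop (standing in for 1-CPU Cinebench) runs.
import threading
import time

import psutil  # psutil.cpu_freq() may return None on some platforms

def spin(stop: threading.Event) -> None:
    # Single-threaded load to give Turbo something to chew on.
    x = 0
    while not stop.is_set():
        x += 1

stop = threading.Event()
threading.Thread(target=spin, args=(stop,), daemon=True).start()

samples = []
for _ in range(50):                                # ~5 seconds of sampling
    samples.append(psutil.cpu_freq().current)      # MHz, as reported by the OS
    time.sleep(0.1)
stop.set()

print(f"min {min(samples):.0f} MHz, max {max(samples):.0f} MHz, "
      f"mean {sum(samples) / len(samples):.0f} MHz")
```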

Futuremark PCMark Vantage

Futuremark PCMark05

3D Rendering - CINEBENCH R10 (Single-Threaded)

3D Rendering - CINEBENCH R10 (Multi-Threaded)

Video Encoding - x264 (Pass 1)

Video Encoding - x264 (Pass 2)

Starting with the new MBP13 comparison, the K53E with i5-2520M comes out an average of 20% faster. Some of that can be attributed to the hard drive differences, as PCMark Vantage shows Apple’s HDD choice is particularly poor, but the CPU intensive tasks are also 15 to 25% faster. It’s interesting that ASUS’ U41JF delivers an overall showing in these applications that matches the MBP13, but that’s in large part thanks to the 15% overclock. Looking at stock Arrandale CPUs, the i5-2520M turns in slightly higher performance results than the i7-640M, the highest-clocked Arrandale CPU we’ve tested. Even the old i7-720QM in the Dell Studio 17 fails to match the performance of the i5-2520M, which leads by an average of 18% in the above benchmarks (with the only loss coming in Pass 2 of x264 encoding).
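
For anyone curious how figures like “an average of 18% faster” are distilled from a set of charts, the math is simply the mean of the per-benchmark performance ratios. Here’s a quick sketch; the scores below are hypothetical placeholders to illustrate the method, not the review’s actual data.

```python
# Mean of per-benchmark ratios: the arithmetic behind "an average of X% faster".
# Scores are hypothetical placeholders (faster system, slower system);
# higher is better for every entry here.
scores = {
    "PCMark Vantage":  (6900, 5850),
    "Cinebench 1-CPU": (4700, 3350),
    "Cinebench x-CPU": (10500, 9200),
    "x264 Pass 1":     (58.0, 49.0),
    "x264 Pass 2":     (12.0, 12.4),   # a ratio below 1.0 is a loss
}

ratios = [fast / slow for fast, slow in scores.values()]
avg_lead = (sum(ratios) / len(ratios) - 1) * 100
print(f"average lead: {avg_lead:.0f}%")
```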

Move up to quad-core SNB and an SSD, and of course the dual-core parts look a lot weaker. The i7-2820QM’s average lead in the above charts is 74%, but part of that is thanks to the 104% lead in PCMark Vantage. Remove the PCMark results, though, and the 2820QM is still 65% faster than the 2520M. On the other side of the charts—literally—is AMD’s E-350. We know it’s not meant to compete with Sandy Bridge (or even Arrandale or Core 2 Duo), but keep in mind that the cheapest price for such a laptop is going to be around $450. On average, the i5-2520M lays the smack down hard and ends up roughly four times faster than an E-350. Ah, but the E-350 has a much better IGP, right? Well, maybe it’s better, but it’s certainly not faster than Intel’s HD 3000 when it’s bottlenecked by the CPU; here are some 3DMark results before we get to the games to give you an idea of how graphics performance compares.

Futuremark 3DMark Vantage

Futuremark 3DMark06

Futuremark 3DMark05

Futuremark 3DMark03

While we would never take 3DMark as the end-all, be-all of graphics performance comparisons, it does give a general idea of what we can expect. The K53E with i5-2520M turns in performance that’s 9% faster than the MBP13 on average across the four versions of 3DMark. That’s actually pretty accurate, as we’ll see in the gaming tests. Likewise, the i7-2820QM results end up being 12% faster than the 2520M, possibly from more aggressive IGP Turbo modes. Again, that matches what we’ll see in the games. On the other hand, even a middling dGPU like the GT 420M/425M still comes out 40-50% ahead of the HD 3000, and AMD’s HD 5650 is 60% faster on average.

What about AMD’s Fusion E-350 platform? If the 3DMark results hold in our actual gaming tests, Intel’s “horrible” HD 3000 IGP offers over twice the performance of the HD 6310M. In fact, even an Arrandale IGP would come within 10% of the E-350 results in 3DMark. It’s not that we love Intel or want them to pummel AMD, and we understand that the E-350 competes in a lower price bracket. Still, many people like to get carried away in discussions of how much better AMD’s graphics are compared to Intel’s IGP. That’s certainly true when you’re looking at discrete GPUs, and compatibility is still better with AMD and NVIDIA drivers, but the latest SNB IGP changed the status quo. HD 3000 works in about 90% of games (roughly estimating), performs well enough to be playable in about 80% of titles, posts scores that are competitive with HD 5470 and GT 320M (and often twice what the current Brazos can achieve), and you get it for free with any 2nd Gen Core i-series CPU. As a friend of mine is fond of saying, it’s hard to compete with “free”.

Comments

  • JarredWalton - Friday, April 8, 2011 - link

    Our battery life testing has always targeted maximum battery life while still being able to complete all tasks. So with Core 2 Duo, Athlon II, Core i-series, etc. we've always set minimum CPU and maximum CPU to 0%, and enabled any special power saving features as applicable. (I should note that there are exceptions to the above: Atom, Brazos, and CULV/ULV have always been tested at 0/100% CPU settings, mostly because they are already slow--particularly Atom. The laptop needs to be able to play our H.264 video without stuttering or dropping frames.) On the ASUS U41JF, if you use the "Battery Saving" Power4Gear profile, it automatically underclocks and locks the CPU to run at no more than 900MHz. Running stock instead of underclocked reduces battery life by 5-10% as noted in the U41JF review. Finally, as I point out, it's interesting that for SNB, 0/0% actually reduces battery life compared to 0/100% CPU in two of the three battery tests--this is not the case with Arrandale.
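
    (For anyone who wants to script those power plan settings on Windows, here’s a minimal sketch using powercfg’s documented aliases; treat the exact percentages as our testing convention, and check `powercfg /aliases` on your own machine first.)

    ```python
    # Minimal sketch: apply min/max processor state via powercfg.
    # SUB_PROCESSOR, PROCTHROTTLEMIN, and PROCTHROTTLEMAX are documented aliases;
    # /setdcvalueindex changes the on-battery value, /setacvalueindex plugged-in.
    import subprocess

    def set_cpu_limits(min_pct: int, max_pct: int, on_battery: bool = True) -> None:
        verb = "/setdcvalueindex" if on_battery else "/setacvalueindex"
        for alias, value in (("PROCTHROTTLEMIN", min_pct),
                             ("PROCTHROTTLEMAX", max_pct)):
            subprocess.run(["powercfg", verb, "SCHEME_CURRENT", "SUB_PROCESSOR",
                            alias, str(value)], check=True)
        subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

    set_cpu_limits(0, 0)      # the 0/0% setting used for most battery tests
    # set_cpu_limits(0, 100)  # the 0/100% setting used for Atom/Brazos/CULV
    ```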

    Regarding power efficiency: 10 to 15% better efficiency is "similar" in my book. The 30% difference in H.264 is a lot more pronounced, true. As for SSD vs. HDD, SSDs really don't use much less power than HDDs at idle, and often not under load either. Look at the ASUS U30Jc with an SSD comparison: http://www.anandtech.com/bench/Product/266?vs=267 SSD wins by 7% at idle, HDD wins by 3% in the Internet test, and the H.264 result is a tie (0.3% difference). The 17.3" LCD vs. 15.6" LCD is going to be more than a 5% difference I'd bet, and the K53E actually has an LCD that appears to use very little power. The same applies to the comparison with Dell's E6410: 15-20% isn't massive in my book, but 55% certainly qualifies. It's better, yes, but not a huge change.
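
    (If it helps to see the arithmetic behind figures like the "extra 0.87W" LCD number: average system draw is just pack capacity divided by runtime, and the brightness penalty is the difference between two such draws. The numbers below are assumptions for illustration, not measurements.)

    ```python
    # Average draw = battery capacity (Wh) / runtime (h); the LCD penalty is
    # the delta between the two brightness settings. Illustrative inputs only.
    battery_wh = 56.0           # assumed pack capacity
    runtime_100nits_min = 600   # assumed idle runtime at ~100 nits
    runtime_205nits_min = 540   # ~10% shorter at maximum brightness

    draw_low = battery_wh / (runtime_100nits_min / 60)    # watts
    draw_high = battery_wh / (runtime_205nits_min / 60)
    print(f"{draw_low:.2f} W vs {draw_high:.2f} W -> "
          f"LCD delta {draw_high - draw_low:.2f} W")
    ```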

    Your E-350 comment is already addressed in the text if you don't take just one piece of the paragraph: "Ah, but the E-350 has a much better IGP, right? Well, maybe it’s better, but it’s certainly not faster than Intel’s HD 3000 when it’s bottlenecked by the CPU...." I suppose I can add "and memory bandwidth" for you though.

    Anyway, what individuals think of DC vs. QC Sandy Bridge is a matter of opinion. I was more impressed by QC, and if I could get QC over DC in the form factor I want that's what I'd do. Dell's XPS L502x for instance gives you both options, and with a moderately large 15.6" chassis the quad-core is an easy sell for me. Others might be more impressed with the dual-core stuff, but we've had dual-core Arrandale for a year and increasing battery life by 20% with 15-20% more performance is still "incremental" in my book.
  • Shadowmaster625 - Friday, April 8, 2011 - link

    "What about AMD’s Fusion E-350 platform? If the 3DMark results hold in our actual gaming tests, Intel’s “horrible” HD 3000 IGP offers over twice the performance of the HD 6310M. In fact, even an Arrandale IGP would come within 10% of the E-350 results in 3DMark. It’s not that we love Intel or want them to pummel AMD, and we understand that the E-350 competes in a lower price bracket. Still, many people like to get carried away in discussions of how much better AMD’s graphics are compared to Intel’s IGP. That’s certainly true when you’re looking at discrete GPUs, and compatibility is still better with AMD and NVIDIA drivers, but the latest SNB IGP changed the status quo."

    What is this nonsense? You claim to understand that the E-350 competes in a lower price bracket. But it is obvious you simply cannot comprehend that there is a difference between a $50 part and a $225 part. Sandy Bridge is too expensive to ever change the status quo. That product line is so expensive that it changes nothing. Except you are paying the price of a discrete GPU, plus a hefty markup, to have an integrated GPU. Intel will not lower those prices even when Llano blows it out of the sky for half the price.
  • JarredWalton - Friday, April 8, 2011 - link

    Did you even read the whole conclusion? Where I repeatedly cede the sub-$600 territory to AMD? And I only mention Llano nine times throughout the review. Obviously nonsense.... Except, $600 SNB is now a viable alternative to what used to be $900 laptops. The U30Jc is slower in every regard than the current i5-2xxx CPUs -- the 310M can't keep up with HD 3000, and Arrandale can't keep up with SNB. So yes, that's "changing the status quo". Integrated graphics no longer suck quite as bad, to the point where HD 5470 is dead and so is G 310M.
  • JPForums - Friday, April 8, 2011 - link

    "H.264 content is a place where Sandy Bridge excels, however, and with only a 10 minute difference between the 11.6”-screen HP dm1z and the 15.6”-screen ASUS K53E it’s pretty clear that’s one metric where SNB is more efficient."

    SNB is probably more efficient at H.264 decode, but one fact makes it a little less than clear. Ironically, you point it out here:

    "Setting the LCD to 100% brightness (instead of 50%, which corresponds with 100 nits), idle battery life drops 10%. Put another way, the LCD uses an extra 0.87W at 205 nits. That’s a very low figure for a 15.6” LCD, ..."

    That does seem like a rather low power draw for a 15.6" panel, and it makes me wonder how much power HP's 11.6" panel draws. The question is purely academic, though, as I would be willing to sacrifice some battery life for a better looking screen. Further, 13.3" is about as small as I'll go.

    That aside, this article makes me wonder how well similarly equipped notebooks with Optimus technology will do. It would be nice to see some designs that get most of the battery life under normal usage while giving you the ability to game when you want. Hopefully, nVidia realizes that the GPU is no longer needed for H.264 decode with SNB and will leave the CPU to take care of it.
  • JarredWalton - Friday, April 8, 2011 - link

    All of the Optimus Arrandale stuff I tested used the IGP, because it was sufficient for H.264 decode. However, oddly enough the Optimus laptops never quite seem to get to the same battery life levels as the IGP-only systems. Okay, I only tested one of the latter (Dell Latitude E6410), but for a 14" chassis the relative battery life was much better than any other Arrandale laptop we tested.

    Perhaps the real culprit is the batteries: they all come with Wh ratings, but I can tell you from personal experience that some 2500mAh NiMH Energizer AA rechargeables have got nothing on 2000mAh NiMH Eneloop AA rechargeables. But unfortunately, I don't know of a good way to independently rate a laptop battery to say exactly how good it is--not without some sophisticated equipment (and tearing the batteries apart, which would likely be frowned on). So I "hope" that these Lithium Ion batteries are more consistent than NiMH, even if they're not.
  • strikeback03 - Monday, April 11, 2011 - link

    I have some 2500 mAh Energizer AA batteries that I bought 4-5 years ago that are absolute trash. Even when new they self-discharged so fast that their apparent capacity was much lower than the rating. Fully charge them using an intelligent charger, then leave them in a drawer for a few weeks, and they would be dead. Never mind low self-discharge batteries like the Eneloop; even compared to other standard NiMH batteries they are awful.

    As far as Li-Ion batteries go it is relatively easy to test in a way that makes them look really good (third-party camera/phone batteries are infamous for this). I would hope that laptop batteries from well-known brands wouldn't follow the same pattern, but I suppose you never know.
  • JarredWalton - Monday, April 11, 2011 - link

    I've noticed over the years that some laptop batteries will self-discharge at a faster rate than others. I've never really tried to determine how fast that rate is, but it would be interesting to fully charge every laptop battery I have, wait three weeks (with them unplugged), then check how much charge is remaining. Most laptop batteries seem to lose most of their charge within about three months, so they definitely wouldn't keep up with the Eneloop stuff.
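
    A quick way to turn that three-week experiment into a rate, assuming simple exponential decay; the retained percentage below is a made-up example, not a measurement:

    ```python
    # Estimate a monthly self-discharge rate from a start/end charge reading.
    # Hypothetical inputs: 100% at the start, 70% remaining after 21 days.
    start_pct, end_pct, days = 100.0, 70.0, 21
    daily_retention = (end_pct / start_pct) ** (1 / days)  # fraction kept per day
    monthly_loss = (1 - daily_retention ** 30) * 100
    print(f"~{monthly_loss:.0f}% lost per month")          # ~40% for these inputs
    ```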
  • krumme - Friday, April 8, 2011 - link

    An E-350 is 74mm² - the same as Atom - these are exact numbers.
    The E-350 is approx. $40 ($63 list price - exact number), exactly the same as the Atom D550 sans Ion.
    The E-350 has fewer pins -> cheaper for the OEM to integrate than Atom.

    How difficult is it to compare in the same segments?

    As an owner of an Intel SNB i5-2520M machine (Dell E6420) and an Atom nettop, I can safely say this article is the most lousy and biased article in years.

    This leads to the most stupid buying decisions all over. Do you seriously think the E-350 is an SB competitor?

    What a shame.
  • JarredWalton - Friday, April 8, 2011 - link

    You're simply not getting it: less than twice the cost, more than double (actually, quadruple CPU and double GPU) the performance. That's my point. If you want an inexpensive netbook, Brazos wins. If you want an all-around laptop, E-350 is better than Atom, but Atom was horrible. What exactly is stupid about telling people that Brazos isn't the greatest thing since sliced bread? It fills a niche, but for me Brazos is at best a secondary laptop.

    Or to rephrase: do you seriously think everyone should buy Brazos and forget about better options that cost more? What a shame.
  • krumme - Friday, April 8, 2011 - link

    Who would have imagined that Brazos came to 15.6" right off?

    It will probably be all over the place, even in desktops, within the next year, when TSMC 40nm capacity expands big time.

    The OEMs know their market, so consumers clearly have a different view of performance than you and I do.

    Even with an SSD I can't tell the difference when selecting max 50% performance, doing all the normal office stuff and HD video. The Brazos is paired with a normal HDD, and it's apparently just fast enough not to be noticed by the target group. And battery life is just okay. The target group would probably prefer more battery life if they could choose themselves - if a lower price was not possible.

    I think a review and its conclusions - with their interpretation of facts - should reflect the needs and perspective of the target group.

    Sandy Bridge is an excellent CPU with even better power management, but comparing it to Brazos is like comparing a GTX 570 to a 330M. It doesn't make sense. It's different segments, different purposes.

    If some less informed buyers - let's say the ones that buy cheap stuff - read this article, they might get the impression that the HP Brazos is very slow and will select an Intel brand instead. And it will be an Atom something, not an i3/i5 or Llano - whatever - for that matter. And Brazos is the best thing since sliced bread compared to an Atom.

    Normal users don't spend time installing programs or games, encoding, whatever. And when they do, they grab a cup of coffee.
