Performance and Power Investigated

Given all the performance packed into the i7-2820QM, worst-case heat and noise levels should still be similar to what we encountered with Clarksfield. Idle power is good, but what happens if you want to do some heavy processing or gaming? We connected the Compal system to a Kill-A-Watt device to check power draw under various loads, and we also tested battery life while looping a graphics-intensive application. We’ve seen NVIDIA and AMD GPUs severely curtail performance on DC power, but has Intel done the same?

We’ve created a table of power draw at the outlet for several usage scenarios, along with calculated DC power requirements based on the 71Wh battery (battery capacity divided by measured runtime). We’ve also included performance figures for the tasks where applicable, to see whether throttling is in effect on battery power. We used the “Balanced” power profile for the AC tests, and the Power Saver profile (while still allowing the CPU to reach 100%) for the battery tests. For the graphics tests on battery we ran both the “Maximum Battery Life” and “Balanced” GPU settings; the graphics tests on AC used the “Maximum Performance” setting.
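
For reference, here’s a minimal sketch (in Python) of how the calculated DC figures work out. We’re assuming the straightforward method of dividing the battery’s rated capacity by the measured runtime; the helper name is ours, not part of any benchmark suite.

    # Approximate average DC power draw from battery capacity and runtime.
    # Assumes the 71Wh rating reflects usable capacity (a simplification).
    BATTERY_WH = 71.0

    def calculated_dc_power(runtime_minutes):
        """Average power draw in watts over a battery rundown test."""
        return BATTERY_WH / (runtime_minutes / 60.0)

    # Example: a rundown lasting ~471 minutes works out to ~9.04W,
    # matching the idle (DC) figure in the table below.
    print(round(calculated_dc_power(471), 2))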

Power/Performance Under AC/DC

Scenario                     Power at Outlet /        Performance
                             Calculated DC Power
Idle                         12-13W                   N/A
Idle (DC)                    9.04W                    N/A
Internet                     14-31W                   N/A
Internet (DC)                10.24W                   N/A
3DMark06                     48-70W                   5285
3DMark06 (DC) MaxBat         23.61W                   2800
3DMark06 (DC) Balanced       41.18W                   5184
H.264 Playback               20-21W                   N/A
H.264 Playback (DC)          16.38W                   N/A
Cinebench 11.5 SMP           70-89W                   5.72
Cinebench 11.5 SMP (DC)      59.17W                   5.09

Watching power draw and CPU clocks (via CPU-Z) during the tests was rather interesting. There’s not much going on in the idle test; looking at the numbers, AC power use is about 36% higher than the calculated DC power use. Most likely extra power-saving features are in effect under DC power, and the AC figure also includes conversion losses in the power adapter, since we measure at the outlet.

In the Internet test (under AC), the system used anywhere from 18-31W while the web pages were loading. Once all four pages finished loading, however, power settled down to 14W, just slightly higher than the idle draw. That’s quite impressive given the Flash content on the active page, and it’s reflected in the only slightly higher calculated DC power for the Internet battery test versus idle. Also of note is that the CPU clock speed never even hit 2.3GHz, let alone the maximum 3.4GHz, during the Internet test, at least not that we could detect. We saw it reach 1.6GHz for a few seconds, and then it would settle back to 800MHz.

The H.264 playback test is another example of low CPU clocks and utilization throughout the test. The initial loading of the x264 movie would bump clock speeds up, but then the CPU would drop back to the minimum 800MHz and stay there. Power draw is definitely higher than in the idle/Internet tests, but 20-21W isn’t too shabby for a 17.3” notebook. And then we get to the power-hungry tests, simulating gaming and heavy CPU use.

3DMark06 power requirements are generally similar to gaming results, with the wide spread being typical. Tests 1, 3, and 4 averaged power draw closer to 53W, while test 2 (the Firefly Forest) was nearly 10W higher on average. Turbo Boost, on both the CPU and GPU, is very likely in play, but we didn’t have a good way of measuring real-time clock speeds during the tests. We tested battery graphics performance using two settings: first is “Maximum Battery Life,” which results in roughly half the performance of running on AC; the second is “Balanced,” which improves the score quite a bit at the cost of higher power consumption.

Based on the 3DMark06 results, plugging in improves graphics performance by anywhere from 2% to 89%, depending on which graphics power-saving setting you select. You’ll definitely want to run the higher performance GPU mode if you actually want to play games, as otherwise frame rates will drop into the low 20s or upper teens on most titles. With the “Balanced” or “High Performance” GPU setting, gaming performance is reasonable even on battery power, but it puts enough of a load on the battery that you won’t last more than around 90-100 minutes. If you happen to have a game where the power-saving mode is fast enough, though, you should be able to stretch gaming battery life to three or perhaps even four hours (depending on the game).
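
Those battery life estimates follow from the calculated DC power figures in the table above; here’s a quick sketch of the arithmetic (looping 3DMark06 is only a proxy, and real games may draw somewhat more):

    # Rough gaming battery life estimates from the 3DMark06 (DC) power draw.
    BATTERY_WH = 71.0

    for setting, watts in (("Balanced", 41.18), ("Maximum Battery Life", 23.61)):
        minutes = BATTERY_WH / watts * 60
        print(f"{setting}: ~{minutes:.0f} minutes")

    # Balanced: ~103 minutes -- in line with the 90-100 minute estimate.
    # Maximum Battery Life: ~180 minutes -- roughly three hours.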

Finally, we’ll wrap up this discussion by looking at maximum CPU loads. In the Cinebench test, quad-core Turbo is interesting to watch; running the CB11.5 SMP benchmark, at first all of the cores run at the maximum 3.1GHz, blisteringly fast for a notebook! About 11 seconds into the test, the core speed drops to 3.0GHz, where it remains until the 39 second mark; then it drops to 2.9GHz, and at around 54 seconds the speed dips briefly (1-2 seconds) to 2.8GHz before settling in at 2.7GHz for the remainder of the test. If you run heavily-threaded benchmarks continuously, the first run will usually show about 10% higher performance thanks to the initial thermal headroom, but even the lowest Cinebench SMP and x264 encoding scores we measured are within 10% of the maximum, which is very impressive for notebook hardware.
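
To put that step-down into perspective, here’s a back-of-the-envelope calculation of our own, based on the timeline above:

    # Time-weighted average CPU clock from the observed Turbo step-down.
    # Segments: (duration in seconds, clock in GHz), per the timeline above.
    segments = [(11, 3.1), (28, 3.0), (15, 2.9), (2, 2.8), (4, 2.7)]

    total_time = sum(t for t, _ in segments)
    avg_ghz = sum(t * ghz for t, ghz in segments) / total_time
    print(f"First {total_time}s: {avg_ghz:.2f}GHz average")  # ~2.97GHz

    # A long sustained run converges on the 2.7GHz floor, about 13% below
    # the 3.1GHz quad-core maximum -- consistent with later benchmark runs
    # scoring roughly 10% below the first.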

At the highest point in the test, power draw for the notebook peaked at 89W; once the speed settled at 2.7GHz (which it appears the notebook could sustain indefinitely in our 70F testing environment), power draw was steady at 70W. Switch to battery power and the Power Saver profile and performance did drop, but not by as much as you’d expect: we measured 5.09 PTS running off the battery, so plugging in nets you up to 12% better performance. Like gaming, battery life under a heavy CPU load is going to be much lower than in our other tests, and we measured just 72 minutes. Then again, compare that with some of the other high-end notebooks we’ve looked at in the past, which managed a similar 72 minutes with no load whatsoever.
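
The numbers cross-check nicely under the same capacity-over-runtime assumption as before (again, a sketch rather than part of our test harness):

    # Cross-check the heavy CPU load figures.
    BATTERY_WH = 71.0

    runtime_min = BATTERY_WH / 59.17 * 60          # calculated DC power -> runtime
    print(f"Runtime: ~{runtime_min:.0f} minutes")  # ~72 minutes, as measured

    ac_advantage = (5.72 / 5.09 - 1) * 100         # Cinebench AC vs. DC scores
    print(f"AC advantage: ~{ac_advantage:.0f}%")   # ~12% better on AC power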

One thing to keep in mind is that the effectiveness of Intel’s Turbo Boost technology depends on the cooling. While the Compal sample runs reasonably cool (we’ll check temperatures on the next page), we have definitely seen larger, more robust cooling solutions. The Compal chassis has a generally flat profile, which limits the size of the fan(s) and the amount of airflow. Something like the ASUS G73 chassis has proven quite effective at cooling high-end mobile components in the past, and we suspect that better cooling will let the CPU run closer to its maximum Turbo limits more of the time. We’ll have to wait for additional sample notebooks to confirm our suspicions, but we saw the same pattern with Clarksfield and Arrandale, so there’s no reason Sandy Bridge should behave differently.

66 Comments

  • skywalker9952 - Monday, January 3, 2011 - link

    For your CPU-specific benchmarks you annotate the CPU and GPU. I believe the HDD or SSD plays a much larger role in those benchmarks than the GPU. Would it not be more appropriate to annotate the storage device used? Were all of the CPUs in the comparison paired with SSDs? If they weren't, how much would that affect the benchmarks?
  • JarredWalton - Monday, January 3, 2011 - link

    The SSD is a huge benefit to PCMark, and since this is laptop testing I can't just use the same image on each system. Anand covers the desktop side of things, but I include PCMark mostly for the curious. I could try and put which SSD/HDD each notebook used, but then the text gets to be too long and the graph looks silly. Heh.

    For the record, the SNB notebook has a 160GB Intel G2 SSD. The desktop uses a 120GB Vertex 2 (SF-1200). W870CU is an 80GB Intel G1 SSD. The remaining laptops all use HDDs, mostly Seagate Momentus 7200.4 I think.
  • Macpod - Tuesday, January 4, 2011 - link

    the synthetic benchmarks are all run at turbo frequencies. the scores from the 2.3GHz 2820QM are almost the same as the 3.4GHz i7 2600K. this is because the 2820QM is running at 3.1GHz under Cinebench.

    no one knows how long this turbo frequency lasts. maybe just enough to finish cinebench!

    this review should be redone
  • Althernai - Tuesday, January 4, 2011 - link

    It probably lasts forever given decent cooling, so the review is accurate, but there is something funny going on here: the score for the 2820QM is 20393 while the score in the 2600K review is 22875. This would be consistent with a difference between CPUs running at 3.4GHz and 3.1GHz, but why doesn't the 2600K Turbo up to 3.8GHz? The claim is that it can be effortlessly overclocked to 4.4GHz, so we know the thermal headroom is there.
  • JarredWalton - Tuesday, January 4, 2011 - link

    If you do continual heavy-duty CPU stuff on the 2820QM, the overall score drops about 10% on later runs in Cinebench and x264 encoding. I mentioned this in the text: the CPU starts at 3.1GHz for about 10 seconds, then drops to 3.0GHz for another 20s or so, then 2.9 for a bit and eventually settles in at 2.7GHz after 55 seconds (give or take). If you're in a hotter testing environment, things would get worse; conversely, if you have a notebook with better cooling, it should run closer to the maximum Turbo speeds more often.

    Macpod, disabling Turbo is the last thing I would do for this sort of chip. What would be the point, other than to show that if you limit clock speeds, performance will go down (along with power use)? But you're right, the whole review should be redone because I didn't mention enough that heavy loads will eventually drop performance about 10%. (Or did you miss page 10: "Performance and Power Investigated"?)
  • lucinski - Tuesday, January 4, 2011 - link

    Just like with any other low-end GPU (integrated or otherwise), I believe most users would rely on the HD3000 just for undemanding games, in which category I would mention Civilization IV and V or FIFA / PES 11. This is to say that I would very much like to see how the new Intel graphics fares in these games, should they be available in the test lab of course.

    I am not necessarily worried about the raw performance; clearly the HD3000 has the capacity to deliver. Instead, driver maturity may prove an obstacle. Firstly, Intel has traditionally had problems with GPU driver design (relative to their competitors). Secondly, should Intel at some point manage to fix (some of) the rendering issues mentioned in this article or elsewhere, notebook producers still take their sweet time before supplying users with new driver versions.

    In this context I am genuinely concerned about the HD3000 goodness. The old GMA HD + Radeon 5470 combination still seems tempting. Strictly referring to the gaming aspect, I honestly prefer reliability with a few FPS missing over the aforementioned risks.
  • NestoJR - Tuesday, January 4, 2011 - link

    So, when Apple starts putting these in Macbooks, I'd assume the battery life will easily eclipse 10 hours under light usage, maybe 6 hours under medium usage ??? I'm no fanboy but I'll be in line for that ! My Dell XPS M1530's 9-cell battery just died, I can wait a few months =]
  • JarredWalton - Tuesday, January 4, 2011 - link

    I'm definitely interested in seeing what Apple can do with Sandy Bridge! Of course, they might not use the quad-core chips in anything smaller than the MBP 17, if history holds true. And maybe the MBP 13 will finally make the jump to Arrandale? ;-)
  • heffeque - Wednesday, January 5, 2011 - link

    Yeah... Saying that the nVidia 320M is consistently slower than the HD3000 when comparing a CPU from 2008 and a CPU from 2011...

    Great job comparing GPUs! (sic)

    A more intelligent thing to say would have been: a 2008 CPU (P8600) with an nVidia 320M is consistently slightly slower than a 2011 CPU (i7-2820QM) with HD3000, don't you think?

    That would make more sense.
  • Wolfpup - Wednesday, January 5, 2011 - link

    That's the only thing I care about with these, and as far as I'm aware, the jump isn't anything special. It's FAR from the "tock" it supposedly is, going by earlier AnandTech data. (In fact the "tick/tock" thing seems to have broken down after just one set of products...)

    This sounds like it is a big advantage for me...but only because Intel refused to produce quad core CPUs at 32nm, so these by default run quite a bit faster than the last gen chips.

    Otherwise it sounds like they're wasting 114 million transistors that I want spent on the CPU, whether it's more cache, more functional units, another core (if that's possible in 114 million transistors), etc.

    I absolutely do NOT want Intel's garbage, incompatible graphics. I do NOT want the additional complexity, the performance hit, and the software overhead of Optimus or the like. I want a real GPU, functioning as a real GPU, with Intel's garbage completely shut off at all times.

    I hope we'll see that in mid range and high end notebooks, or I'm going to be very disappointed.
