Meet the Compal Sandy Bridge Notebook

Our review system comes from Compal via Intel, and as this is pre-release hardware there were a few minor bugs that have yet to be ironed out. For one, there was no way to disable the Bluetooth radio; a bit more alarming, on at least one occasion the system fan stopped spinning after resuming from hibernation. The latter problem made for some interesting hair-pulling, as benchmark performance suddenly started to plummet, particularly when running back-to-back CPU-intensive tests! Early hardware anomalies aside, you can probably recognize the design elements from another major OEM, and it's possible Acer/Gateway will ship something very similar to this system in the future; then again, it's equally plausible that this was just a one-off design using existing parts so Intel could demonstrate their latest and greatest mobile platform.

Unlike the previous generation Clarksfield launch, Intel didn't seed us with their absolute fastest mobile CPU this time around, probably because they don't have to! We're looking at the middle tier of quad-core performance, and while the i7-2920XM is technically faster, it's hard to figure out who would be willing to part with an extra $500 just to get 100-200MHz more performance (and a 10W higher TDP). Perhaps the higher TDP will allow the Extreme edition to hit its maximum Turbo speeds more often, but it would likely hurt battery life in the process, so the 2820QM looks to be a good compromise. In fact, if you're willing to give up another 100MHz and 2MB of L3 cache, the 2720QM should offer 95% of the 2820QM's performance for 2/3 the price.
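To put some numbers on that value argument, here's a quick back-of-the-envelope calculation in Python. The list prices used below ($1096 for the 2920XM, $568 for the 2820QM, $378 for the 2720QM) are Intel's 1K-unit launch prices, and the relative performance figures are rough clock-based estimates on our part rather than measured results:

    # Rough performance per dollar for Intel's mobile quad-core lineup.
    # Prices are Intel's 1K-unit launch list prices; the relative performance
    # values are clock-based estimates (an assumption), not benchmark data.
    cpus = [
        ("i7-2920XM", 1096, 1.05),  # ~100-200MHz faster, 55W TDP
        ("i7-2820QM",  568, 1.00),  # our review chip (baseline)
        ("i7-2720QM",  378, 0.95),  # -100MHz, 6MB L3 instead of 8MB
    ]
    for name, price, perf in cpus:
        print(f"{name}: {perf / (price / 1000):.2f} relative performance per $1000")
    # i7-2920XM: 0.96 / i7-2820QM: 1.76 / i7-2720QM: 2.51

By that yardstick the 2720QM delivers roughly two and a half times the performance per dollar of the Extreme part. With that bit of math out of the way, here are the specs of our test system.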

Compal Sandy Bridge Notebook Specifications
Processor: Intel Core i7-2820QM (4x2.30GHz, 32nm, 8MB L3, Turbo to 3.40GHz, 45W)
Chipset: Intel HM65
Memory: 2x2GB DDR3-1600 (Max 8GB)
Graphics: Intel HD Graphics 3000 (12 EUs, 650-1300MHz core/shader clocks)
Display: 17.3" LED Glossy 16:9 HD+ (1600x900) (Seiko Epson 173KT)
Hard Drive(s): 160GB SSD (Intel X25-M G2 SA2M160G2GC)
Optical Drive: BD-ROM/DVDRW Combo (HL-DT-ST CT21N)
Networking: Gigabit Ethernet (Atheros AR8151 PCIe), 802.11n (Intel Centrino Wireless-N 1030), Bluetooth 2.1+EDR
Audio: 2.0 speakers; microphone and two headphone jacks; capable of 5.1 digital output (HDMI/SPDIF)
Battery: 8-Cell, 14.8V, 4.8Ah, 71Wh
Front Side: None
Left Side: Memory card reader, 1 x USB 2.0, headphone jack, microphone jack, 1 x eSATA/USB 2.0 combo, HDMI 1.4, VGA, Gigabit Ethernet, AC power connection, Kensington lock
Right Side: 2 x USB 2.0, optical drive, power switch
Back Side: Exhaust vent
Operating System: Windows 7 Ultimate 64-bit
Dimensions: 16.3" x 10.8" x 1.1-1.35" (WxDxH)
Weight: 7.3 lbs (with 8-cell battery)
Extras: Webcam, 99-key keyboard with 10-key, flash reader (SD, MS, MMC, xD)

The basic features are par for the course; about the only "modern" feature we'd like to see is USB 3.0 support, but unfortunately that's not part of Intel's new 6-series chipsets and this particular test system does without. Many laptop manufacturers will address that shortcoming with third-party chips, so we won't worry too much about it for now. Intel did choose to equip their sample with some nice extras, though, like a 160GB Intel G2 SSD and a Blu-ray combo drive.

For a high-performance notebook, the build quality is definitely lacking, but then only the CPU and storage options are truly high-end. There's no discrete GPU, no keyboard backlighting, a run-of-the-mill (i.e. poor) HD+ LCD, mediocre speakers, a touchpad that didn't have functional multi-touch (or even scroll/gesture) support at this time [cue Don't Know What You Got Till It's Gone], and a horrible dark glossy plastic chassis. We don't actually have a price for the system as configured, since it's not for sale, but adding up a few of the components suggests it would come in north of $1400 ($1000 will cover the CPU, SSD, and Blu-ray drive; $400-$500 should take care of the remaining items).

Again, this seems like more of a proof of concept than something most users would be interested in buying. Sure, when we get to the benchmarks you'll see that the integrated graphics are certainly sufficient for "mainstream" use, but it's hard to call a $500+ quad-core CPU and $400 SSD anything other than enthusiast/high-performance parts. Pair this with a decent discrete GPU (e.g. something from NVIDIA with their Optimus Technology) and it would be a lot more compelling. That's what we hope to see when retail Sandy Bridge notebooks start arriving for testing, so we'll leave off critiquing the Compal design for now.

With the complaints out of the way, let's touch on the good elements before we get to the benchmarks. First, we like the 71Wh battery; it's not an ultra-high-capacity option like some of the 95Wh models, but it's a good step up from 48Wh batteries. HDMI 1.4 also shows up, so 3D movie viewing is possible (with the appropriate display). The other item worth pointing out is the memory: DDR3-1600 in a notebook. In general applications that probably doesn't matter much, but when you're sharing memory bandwidth with an IGP, the added bandwidth DDR3-1600 brings will definitely prove useful. Just think: system memory bandwidth now checks in at 25.6GB/s, equal to what you get from midrange discrete mobile GPUs (e.g. the GT 420M, 425M, and 435M). More importantly, most of the Arrandale laptops we've tested have used DDR3-1333 memory running at DDR3-1066 speeds, so we're talking about a healthy 50% improvement in bandwidth (at least for the faster quad-core Sandy Bridge designs).
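If you want to sanity check those bandwidth figures, the math is simple: peak theoretical bandwidth is the transfer rate multiplied by the bus width and the number of channels. Here's a minimal sketch, purely illustrative:

    # Peak theoretical DRAM bandwidth = transfers/sec x bytes per transfer x channels.
    def peak_bandwidth_gb_s(transfers_mt_s, bus_bits=64, channels=2):
        return transfers_mt_s * 1e6 * (bus_bits / 8) * channels / 1e9

    print(peak_bandwidth_gb_s(1600))  # DDR3-1600, dual channel: 25.6 GB/s
    print(peak_bandwidth_gb_s(1066))  # DDR3-1066, dual channel: ~17.1 GB/s
    print(peak_bandwidth_gb_s(1600) / peak_bandwidth_gb_s(1066))  # ~1.50, i.e. 50% more

Real-world throughput always comes in below the theoretical peak, but the 50% generational delta holds either way.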

Now, if you’re looking just at the specs, the above may not seem like it’s going to set the world on fire. The TDP on the CPU is still 45W, which means it could burn through the 71Wh battery in under two hours quite easily. However, this is where Intel’s architectural changes start to come into play. Particularly at anything less than a heavy load, battery life is substantially better than you’d expect. In fact, this is the first notebook we’ve tested where you can get close to four hours of battery life watching a Blu-ray movie—no, not watching an H.264 file off the hard drive, but actually spinning your Blu-ray drive and reading a disc! Yes, a larger 95Wh battery paired with current-generation hardware would probably break three hours, but four hours from a quad-core system is amazing.
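For the curious, the arithmetic behind those battery claims is straightforward. The four-hour Blu-ray runtime is the approximate figure from our testing; everything else follows from it:

    battery_wh = 71.0

    # Worst case: the CPU pinned at its full 45W TDP (ignoring the rest of the system).
    print(battery_wh / 45)  # ~1.6 hours

    # Observed: ~4 hours of Blu-ray playback implies a modest average system draw.
    print(battery_wh / 4)   # ~17.8W total for CPU, IGP, LCD, and a spinning BD drive

In other words, the platform averages well under half the CPU's rated TDP even while decoding and reading a disc.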

Battery life isn't the only impressive part; CPU performance on laptops just took a huge leap forward. Provided the system is running at moderate temperatures, the CPU will hit very high clock speeds in both single-threaded and multi-threaded tasks. Here's another area where this notebook might not be the best indication of what's to come, as sustained loads would heat the CPU to the point where it had to back down from the 3GHz range, yet we still measured performance higher than a desktop i7-930 in quite a few benchmarks. As for graphics, Arrandale finally got Intel's IGP to the point where it was competitive with AMD's HD 4250; Intel's HD Graphics 3000 generally more than doubles what Arrandale could manage, which easily pushes Intel's IGP into the entry-level gaming category, and perhaps even further.

Improved battery life, substantially higher processor performance, and integrated graphics that can now hang with entry-level discrete GPUs make for a holy trinity that will be difficult to match, let alone surpass. AMD will of course have their own Fusion products launching later this year, and we expect them to deliver better graphics performance than Intel's IGP, but when old Core 2 processors already match or exceed AMD's mobile parts, and Clarksfield and Arrandale were significantly ahead, Sandy Bridge ups the ante yet again.

Intel has shown data for several years indicating that notebooks easily outsell desktops globally, but never have we seen such a big jump in notebook performance between generations. An old quad-core Kentsfield desktop could still outperform the fastest Clarksfield notebooks in CPU-intensive tasks, but now you'll need at least a decent quad-core Bloomfield/Lynnfield to keep up with the i7-2820QM. Enough talk; turn the page and see just how fast notebooks have become.

Comments

  • skywalker9952 - Monday, January 3, 2011 - link

    For your CPU-specific benchmarks you annotate the CPU and GPU. I believe the HDD or SSD plays a much larger role in those benchmarks than the GPU. Wouldn't it be more appropriate to annotate the storage device used? Were all of the CPUs in the comparison paired with SSDs? If they weren't, how much would that affect the benchmarks?
  • JarredWalton - Monday, January 3, 2011 - link

    The SSD is a huge benefit to PCMark, and since this is laptop testing I can't just use the same image on each system. Anand covers the desktop side of things, but I include PCMark mostly for the curious. I could try and put which SSD/HDD each notebook used, but then the text gets to be too long and the graph looks silly. Heh.

    For the record, the SNB notebook has a 160GB Intel G2 SSD. The desktop uses a 120GB Vertex 2 (SF-1200). W870CU is an 80GB Intel G1 SSD. The remaining laptops all use HDDs, mostly Seagate Momentus 7200.4 I think.
  • Macpod - Tuesday, January 4, 2011 - link

    the synthetic benchmarks are all run at turbo frequencies. the scores from the 2.3ghz 2820qm are almost the same as the 3.4ghz i7 2600k. this is because the 2820qm is running at 3.1ghz under cinebench.

    no one knows how long this turbo frequency lasts. maybe just enough to finish cinebench!

    this review should be redone
  • Althernai - Tuesday, January 4, 2011 - link

    It probably lasts forever given decent cooling so the review is accurate, but there is something funny going on here: the score for the 2820QM is 20393 while the score in the 2600K review is 22875. This would be consistent with a difference between CPUs running at 3.4GHz and 3.1GHz, but why doesn't the 2600K Turbo up to 3.8GHz? The claim is that it can be effortlessly overclocked to 4.4GHz, so we know the thermal headroom is there.
  • JarredWalton - Tuesday, January 4, 2011 - link

    If you do continual heavy-duty CPU stuff on the 2820QM, the overall score drops about 10% on later runs in Cinebench and x264 encoding. I mentioned this in the text: the CPU starts at 3.1GHz for about 10 seconds, then drops to 3.0GHz for another 20s or so, then 2.9 for a bit and eventually settles in at 2.7GHz after 55 seconds (give or take). If you're in a hotter testing environment, things would get worse; conversely, if you have a notebook with better cooling, it should run closer to the maximum Turbo speeds more often.

    Macpod, disabling Turbo is the last thing I would do for this sort of chip. What would be the point, other than to show that if you limit clock speeds, performance will go down (along with power use)? But you're right, the whole review should be redone because I didn't mention enough that heavy loads will eventually drop performance about 10%. (Or did you miss page 10: "Performance and Power Investigated"?)
  • lucinski - Tuesday, January 4, 2011 - link

    Just like any other low-end GPU (integrated or otherwise), I believe most users would rely on the HD3000 just for undemanding games, a category where I would mention Civilization IV and V or FIFA / PES 11. This goes to say that I would very much like to see how the new Intel graphics fares in these games, should they be available in the test lab of course.

    I am not necessarily worried about the raw performance; clearly the HD3000 has the capacity to deliver. Instead, driver maturity may prove an obstacle. Firstly, one has to consider that Intel traditionally has problems with GPU driver design (relative to their competitors). Secondly, even if Intel eventually repairs (some of) the rendering issues mentioned in this article or elsewhere, notebook producers still take their sweet time before supplying users with new driver versions.

    In this context I am genuinely concerned about the HD3000 goodness. The old GMA HD + Radeon 5470 combination still seems tempting. Strictly referring to the gaming aspect, I honestly prefer reliability and a few missing FPS over the aforementioned risks.
  • NestoJR - Tuesday, January 4, 2011 - link

    So, when Apple starts putting these in Macbooks, I'd assume the battery life will easily eclipse 10 hours under light usage, maybe 6 hours under medium usage ??? I'm no fanboy but I'll be in line for that ! My Dell XPS M1530's 9-cell battery just died, I can wait a few months =]
  • JarredWalton - Tuesday, January 4, 2011 - link

    I'm definitely interested in seeing what Apple can do with Sandy Bridge! Of course, they might not use the quad-core chips in anything smaller than the MBP 17, if history holds true. And maybe the MBP 13 will finally make the jump to Arrandale? ;-)
  • heffeque - Wednesday, January 5, 2011 - link

    Yeah... Saying that the nVidia 320M is consistently slower than the HD3000 when comparing a CPU from 2008 and a CPU from 2011...

    Great job comparing GPUs! (sic)

    A more intelligent thing to say would have been: a 2008 CPU (P8600) with an nVidia 320M is consistently slightly slower than a 2011 CPU (i7-2820QM) with HD3000, don't you think?

    That would make more sense.
  • Wolfpup - Wednesday, January 5, 2011 - link

    That's the only thing I care about with these, and as far as I'm aware the jump isn't anything special. It's FAR from the "tock" it supposedly is, going by earlier AnandTech data. (In fact the "tick/tock" thing seems to have broken down after just one set of products...)

    This sounds like it is a big advantage for me...but only because Intel refused to produce quad core CPUs at 32nm, so these by default run quite a bit faster than the last gen chips.

    Otherwise it sounds like they're wasting 114 million transistors that I want spent on the CPU, whether it's more cache, more functional units, another core (if that's possible in 114 million transistors), etc.

    I absolutely do NOT want Intel's garbage, incompatible graphics. I do NOT want the added complexity, performance hit, and software overhead of Optimus or the like. I want a real GPU, functioning as a real GPU, with Intel's garbage completely shut off at all times.

    I hope we'll see that in mid range and high end notebooks, or I'm going to be very disappointed.
