Temperatures and Noise

Wrapping up our look at the Compal notebook, we measured noise levels and temperatures at idle and under load. Since we don't know whether this particular configuration will even reach retail, we won't dwell on it too much, but here are the results. Idle surface temperatures measured between 24C and 29C on the keyboard and palm rest, while the bottom of the notebook was slightly warmer at 26-32C. Most of the notebook stays close to room temperature, but the area near the CPU and the exhaust at the back/middle runs a bit warmer. Under heavy load (sustained for over an hour), temperatures increase, but the fan and dynamic CPU/GPU clocks keep things reasonable: the top temperatures increased to 24-33C, the bottom measured 26-38C, and the exhaust under load reached around 44C. Here's a shot of internal system temperatures, courtesy of HWMonitor.

Using the same idle and load tests, we also checked noise levels. The BIOS on this particular setup lets you configure two temperature/fan-speed pairs, with defaults of 75% fan speed at 55C and 100% fan speed when the CPU hits 70C. Other settings are apparently in effect as well, since we noticed four distinct fan speeds. Below about 45C, the fan shuts off entirely and you have a silent notebook. Given the low power requirements and CPU temperatures at idle, the system fan is usually off under light loads, leaving system noise right at the floor of our testing environment/equipment: 30dB. Occasionally the fan spins up and creates about 32.5dB of noise, but this usually lasted only a few seconds. Heavy loads will usually push the fan to maximum speed after 20 seconds or so, at which point we measured 41dB; that's still tolerable considering how infrequent such loads usually are, though if you do heavy number crunching or video editing you might end up with a moderately noisy notebook.
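The fan behavior above amounts to a simple threshold map. Here's a minimal sketch in Python of how such a policy might work; only the ~45C cutoff and the 75%/55C and 100%/70C defaults come from our testing, while the 50% intermediate step is purely our guess at one of the four observed speeds (the actual firmware logic isn't public):

```python
def fan_duty(cpu_temp_c: float) -> int:
    """Return an approximate fan duty cycle (percent) for a CPU temperature.

    Thresholds below 45C (off) and at 55C (75%) and 70C (100%) match the
    BIOS defaults described in the review; the 50% step is an assumption.
    """
    if cpu_temp_c < 45:
        return 0      # fan off: silent notebook
    elif cpu_temp_c < 55:
        return 50     # assumed intermediate speed (one of the four observed)
    elif cpu_temp_c < 70:
        return 75     # BIOS default: 75% fan speed at 55C
    else:
        return 100    # BIOS default: 100% fan speed at 70C

# Example: an idle CPU in the low 40s keeps the fan off entirely.
idle_duty = fan_duty(42)
load_duty = fan_duty(72)
```

This kind of discrete step table is common in notebook embedded controllers; real firmware usually also adds hysteresis so the fan doesn't oscillate right at a threshold.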

Average Resolution, Average Performance

What about the LCD? We've only looked at a few 17.3" notebooks, with their associated 900p resolution. So far we've had the Clevo W870CU (Chi Mei N173O6-L02), the ASUS X72D/K72DR with the same panel, and the Dell Studio 17 (with an unknown panel). The Sandy Bridge test system apparently comes with a Seiko Epson 173KT panel, but its characteristics are no better, and sometimes worse, than the other 17.3" 900p displays we've looked at.

Laptop LCD Quality - Contrast

Laptop LCD Quality - White

Laptop LCD Quality - Black

Laptop LCD Quality - Color Accuracy

Laptop LCD Quality - Color Gamut

Color gamut is pretty good, and accuracy is perhaps a bit better than average, but the contrast is a disappointing 217:1 and the maximum brightness is a none-too-impressive 226 nits. While there are certainly worse LCDs out there, this particular panel is yet another that fails to rise above mediocrity. We have yet to test a 900p display that has impressed us, so consider this a warning.

Comments

  • JarredWalton - Tuesday, January 4, 2011 - link

    Definitely a driver bug, and I've passed it along to Intel. The HD 4250 manages 7.7FPS, so SNB ought to be able to get at least 15FPS or so. The game is still a beast, though... some would say poorly written, probably, but I just call it "demanding". LOL
  • semo - Monday, January 3, 2011 - link

    Thanks for mentioning USB 3.0, Jarred. It's an essential feature that's far too often overlooked these days. I simply will not pay money for a new laptop in 2011 that doesn't have at least one USB 3.0 port.
  • dmbfeg2 - Monday, January 3, 2011 - link

    Which tool do you use to check the turbo frequencies under load?
  • JarredWalton - Monday, January 3, 2011 - link

    I had both CPU-Z and the Intel Turbo Monitoring tool up, but neither one supports logging so I have to just eyeball it. The clocks in CPU-Z were generally steady, though it's possible that they would bump up for a few milliseconds and then back down and it simply didn't show up.
  • Shadowmaster625 - Monday, January 3, 2011 - link

    On the other Sandy Bridge article by Anand, right on the front page, it is mentioned that the 6EU GT1 (HD 2000) die has 504M transistors, while the 12EU GT2 (HD 3000) die has 624M transistors. Yet here you are saying HD Graphics 3000 has 114M. If the 12EU version has 120M more transistors than the 6EU version, then does that not imply a total GPU transistor count well north of 200M?
  • JarredWalton - Monday, January 3, 2011 - link

    AFAIK, the 114M figure is for the 12EU core. All of the currently shipping SNB chips are quad-core with the full 12EU on the die, but on certain desktop models Intel disables half the EUs. However, if memory serves there are actually three SNB dies coming out. At the top is the full quad-core chip; whether you have 6EU or 12EU, the die is the same. For the dual-core parts, however, there are two chips. One is a dual-core with 4MB L3 cache and 12EUs, which will also ship in chips where the L3 only shows 3MB. This is the GT2 variant. The other dual-core version is for the ultra-low-cost Pentium brand, which will ship with 6EUs (there will only be 6EU on the die) and no L3 cache, as well as some other missing features (Quick Sync for sure). That's the GT1, and so the missing 120M includes a lot of items.

    Note: I might not be 100% correct on this, so I'm going to email Anand and our Intel contact for verification.
  • mino - Monday, January 3, 2011 - link

    Nice summary (why was this not in the article ?).

    Anyway those 114M do not include memory controller, encoding, display output etc. so the comparison with Redwood/Cedar is not really meaningful.

    If you actually insist on comparing transistor counts, something like (Cedar-Redwood)/3 should give you a reasonable value of AMD's SPU efficiency from a transistors/performance POV.
  • mino - Monday, January 3, 2011 - link

    "After all, being able to run a game at all is the first consideration; making it look good is merely the icing on the cake."

    If making it look good is merely icing on the cake, why bother with GPUs? Let's just play 2D Mines!
    (While for the poor souls stuck with Intel IGPs it certainly is just the icing, for Christ's sake, that is a major _problem_, not a feature!!!)

    After a few pages I have decided to forgo the "best-thing-since-sliced-bread" attitude, but, what is too much is too much...
  • mino - Monday, January 3, 2011 - link

    Regardless of the attitude, HUGE thanks for listening to comments and including the older games roundup.

    While I'd love to see more games like Far Cry or HL2 that actually provide playable frame rates (read: even older ones) on SNB-class IGPs, even this mini-roundup is a really big plus.

    As for a suggestion on future game-playability roundup on IGP's, it is really simple:
    1) Take a look at your 2006-2007 GPU benchmarking suites
    2) Add in a few current MMORPGs
  • JarredWalton - Monday, January 3, 2011 - link

    Anand covered several other titles, and most of the pre-2007 stuff should run fine (outside of blacklisting problems or bugs). Time constraints limit how much we can test, obviously, but your "reviewer on crack" comment is appreciated. 2D and 3D are completely different, and while you might feel graphical quality is of paramount importance, the fact of the matter is that SNB graphics are basically at the same level as PS3/Xbox 360 -- something millions of users are "okay" with.

    NVIDIA and AMD like to show performance at settings where they're barely playable and SNB fails, but that's no better. If "High + 1680x1050" runs at 20FPS with Sandy Bridge vs. 40FPS on discrete mobile GPUs, wouldn't you consider turning down the detail to get performance up? I know I would, and it's the same reason I almost never enable anti-aliasing on laptops: they can't handle it. But if that's what you require, by all means go out and buy more expensive laptops; we certainly don't recommend SNB graphics as the solution for everyone.

    Honestly, until AMD gets the Radeon equivalent of Optimus for their GPUs (meaning, AMD GPU + Intel CPU with IGP and automatic switching, plus the ability to update your Radeon and Intel drivers independently), Sandy Bridge + GeForce 400M/500M Optimus is going to be the way to go.
