ASUS UX31A: Stress Testing

For everyday use, most laptops will be fine, and the UX31A is no exception. However, it’s also important to see how a laptop behaves under more strenuous loads—keeping in mind that a brand new laptop has no dust to contend with and is basically performing optimally. Over the life of a laptop, cooling performance will generally deteriorate slightly, and if a laptop already struggles with heat under load (e.g. Dell’s XPS 15), that’s only going to get worse. To see how the UX31A fares, we did some extensive testing for throttling under several sustained loads. First, we stress just the CPU cores by looping the second pass of an x264 encode, all while recording CPU clock speeds and temperatures with HWiNFO64. Then we run a gaming test—in this case Batman: Arkham City—while doing the same. Finally we combine the two, setting x264 to use one of the two physical CPU cores (virtual cores 2 and 3) with Batman running on the other (virtual cores 0 and 1). Here are the clock speed and temperature results for the UX31A.
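HWiNFO64 handled the logging on Windows for these tests. As an illustrative sketch of the same idea, here’s a minimal Python function that pulls per-core clock speeds out of `/proc/cpuinfo`-style text; the sample data below is made up for the example, not the UX31A’s actual readings:

```python
import re

def core_clocks(cpuinfo_text):
    """Extract per-logical-core clock speeds (MHz) from /proc/cpuinfo-style text."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

# Hypothetical snapshot: one physical core pegged near max Turbo,
# the other idling (two logical cores each, thanks to Hyper-Threading).
sample = """\
processor : 0
cpu MHz   : 2801.000
processor : 1
cpu MHz   : 2801.000
processor : 2
cpu MHz   : 1197.000
processor : 3
cpu MHz   : 1197.000
"""

clocks = core_clocks(sample)
print(clocks)  # [2801.0, 2801.0, 1197.0, 1197.0]
```

Sampling a function like this once per second over an hour-long run is all it takes to build the clock-speed charts referenced above.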

Loading up just the CPU cores, the UX31A performs admirably. CPU clocks touch 3.0GHz a few times early on, but that’s before we start the x264 encoding loop. For the most part they’re pegged at 2.8GHz and stay there for the duration of our test. There’s a cyclical nature to the temperatures, as every couple of minutes the encoding pass restarts and the brief delay between one loop and the next apparently allows the CPU to recover slightly. While it's not shown in the charts, at the end of the test run temperatures quickly drop back into the “reasonable” range—in under 10 seconds we go from CPU temperatures of over 80C to less than 60C, with another 50 seconds or so (at lower fan speeds) bringing the temperatures down to around 50C, which is where the laptop tends to idle.

Run any 3D game where the GT cores have to work and the story changes dramatically. The CPU cores at 100% load and 2.8GHz consume around 15W of the allowed 17W TDP, but the GT cores under load appear to be capable of drawing 10-11W. Try to use them both at the same time and what follows is a balancing act (i.e. throttling) in order to stay within the allowed power and thermal envelope. The CPU package does manage to exceed the 17W TDP for a time, but after seven or eight minutes it drops to 17W before eventually stabilizing around 15W (±5%). The GPU clocks are also all over the map initially, as in this case Batman is busy loading and we’re watching the intro videos and navigating the menus. After about five minutes we’re in the actual game and we can see the CPU and GPU clocks (mostly) stabilize. Even after more than an hour, however, we still see GPU clocks as low as 500 MHz and as high as 900 MHz, with CPU clocks ranging from 1.0GHz to 2.5GHz—all while we’re sitting still and watching over Arkham City from a high perch. Not surprisingly, the result in terms of actual frame rates is that they can vary upwards of 50%, which makes for a generally less than desirable experience even if average frame rates are 30+ FPS in some titles.
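The balancing act follows directly from the numbers above: roughly 15W of CPU demand plus 10-11W of GPU demand has to be squeezed into a 17W package budget. As a crude first-order sketch—assuming power scales linearly with clock speed, which it doesn’t quite (voltage drops with clocks too, so real sustained clocks land somewhat higher than this predicts)—the throttling looks like this:

```python
def throttled_clocks(cpu_demand_w, gpu_demand_w, budget_w, cpu_max_mhz, gpu_max_mhz):
    """Scale CPU and GPU clocks by a common factor so combined draw fits the budget.

    First-order model only: treats power as proportional to clock speed.
    """
    total = cpu_demand_w + gpu_demand_w
    if total <= budget_w:
        return cpu_max_mhz, gpu_max_mhz  # no throttling needed
    scale = budget_w / total
    return round(cpu_max_mhz * scale), round(gpu_max_mhz * scale)

# The review's numbers: ~15W for the CPU at 2.8GHz, ~11W for the GT cores
# at their 1050MHz max, against a 17W package TDP.
cpu_mhz, gpu_mhz = throttled_clocks(15, 11, 17, 2800, 1050)
print(cpu_mhz, gpu_mhz)  # 1831 687
```

Even this naive model lands inside the observed ranges (CPU 1.0-2.5GHz, GPU 500-900MHz), which is why combined CPU+GPU loads can never run both sides at full Turbo on a 17W part.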

Our final stress test performs x264 encoding while running Batman, with each task set to use one of the two available cores along with the appropriate Hyper-Threaded core. The result isn’t actually all that different from just running Batman alone, with the GPU and CPU cores dropping to lower clocks in order to maintain a package TDP of <17W. Somewhat interesting to note however is that this time the package TDP stays much closer to 17W (instead of 15W), with CPU and GPU clocks tending to be a bit more stable and higher as well. We see the GPU dip as low as 450MHz on occasion, but we also see clocks of 1050MHz on a regular basis; likewise, the CPU drops as low as 1GHz on one core and 1.2GHz on the other core, but much of the time the cores are in the 1.5GHz to 2.1GHz range.
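The core split described above can be expressed as an affinity bitmask—the same value you would hand to Windows’ `start /affinity` or Linux’s `taskset`. A quick sketch of how the masks for this test work out:

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask from a set of logical (virtual) core indices."""
    mask = 0
    for c in cores:
        mask |= 1 << c
    return mask

# Physical core 0 = logical cores 0,1; physical core 1 = logical cores 2,3.
batman_mask = affinity_mask({0, 1})  # game pinned to the first physical core
x264_mask = affinity_mask({2, 3})    # encoder pinned to the second

print(hex(batman_mask), hex(x264_mask))  # 0x3 0xc
```

On Windows that corresponds to launching the encoder with something like `start /affinity C x264.exe ...`, keeping each workload on its own physical core plus its Hyper-Threaded sibling.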

The net takeaway here is that the ULV Ivy Bridge processors can’t actually hit max clocks on both the GPU and CPU cores without exceeding their 17W TDP. There’s potential for configurable TDP to allow plugged-in Ultrabooks to run ULV chips at a higher power envelope to improve performance. In fact, you can set the UX31A to 25W TDP, but it appears the cooling solution isn’t actually able to deal with the higher TDP for longer periods of time and thus the CPU ends up dropping back to 17W after a few minutes of heavy lifting. That’s hardly surprising, considering how thin the UX31A is—there’s just not much space for air to flow through.

More to the point, other Ultrabooks often omit the ability to change the TDP levels, so even with better cooling it wouldn’t be possible to run the CPU and GPU at full tilt; for that, you’d need a 25W TDP in practice—around 10W for the HD 4000 and another 15W for the CPU cores. Dustin tested the HP Envy 14 Spectre, which tended to run quite a bit cooler than the UX31A (and it’s also quite a bit larger). While we didn’t perform a full throttling analysis of the Spectre, we can already see from the above results what would happen. If you’re hoping to run an Ultrabook (i.e. a ULV CPU) at max Turbo Boost speeds all the time while loading up both the CPU and GPU, that just doesn’t look possible. Unless Intel can do something unexpected, I don’t think Haswell will even fix the problem. The simple fact is that loading up all areas of an approximately 1 billion transistor processor die at high clock speeds uses too much power to fit within the ULV TDP, and reducing clock speeds is the only way to stay within that envelope.

What about Noise?

With all the stress testing so far, we've focused on CPU/GPU clock speeds and temperatures. System noise is another important factor that we need to look at. There's not as much to discuss, as the fan speed and system noise are very nearly maxed out in all of the above tests (though running just the CPU at 100% may not get the UX31A quite as loud). At idle, the UX31A sits roughly at the limits of our equipment: 30dB. Once we start to put a load on the CPU, fan speed escalates with temperatures until it maxes out at around 80C. At that point, the system fan generates 39.5dB of noise from a distance of one foot. Given that we're dealing with a single relatively small fan, it's not too surprising that the character of the noise is slightly less comfortable than that of other laptops, with a relatively high pitch. I actually found the fan noise to be more annoying when it was in the 35-37dB range, with the pitch seeming to decrease slightly at the maximum 39dB. That said, I doubt most people will be pegging the CPU or GPU that hard with an Ultrabook, which makes the noise less of a concern; for typical Internet and office tasks, the UX31A is usually under 34dB.
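The jump from 30dB at idle to 39.5dB under load is bigger than the raw numbers suggest, since every 10dB represents a tenfold increase in acoustic power. A quick worked calculation:

```python
def sound_power_ratio(db_a, db_b):
    """How many times more acoustic power a db_b source emits than a db_a source."""
    return 10 ** ((db_b - db_a) / 10)

# UX31A: ~30dB at idle vs. 39.5dB at full fan speed.
ratio = sound_power_ratio(30.0, 39.5)
print(round(ratio, 1))  # 8.9
```

So at full tilt the fan is pushing out nearly nine times the acoustic power it does at idle—though perceived loudness roughly doubles per 10dB, so subjectively it sounds closer to "twice as loud" rather than nine times.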

Comments

  • cknobman - Tuesday, August 28, 2012 - link

    You know best buy sells this model: UX31A-R5102F

    Which has:
    128GB SSD
    core i5 3317
    1080p screen

    Best part is it only costs $999

    So you can get a full 1080p ultrabook for under a grand.
  • Connoisseur - Tuesday, August 28, 2012 - link

    Careful with the best buy versions. From what I read on the AT forums, the quality control on the store model screens can be lacking as compared to the direct purchase versions. There's a lot of anecdotal evidence that the screens in the stores come with a higher prevalence of stuck pixels, bad backlight bleeding etc. They also mention that Asus doesn't cover the store model versions in their stuck pixel guarantee.
  • Captmorgan09 - Tuesday, August 28, 2012 - link

    Yep, I purchased this model about a month ago for traveling and working on photos in Lightroom. When I first saw it on Best Buy's site at $999, I was skeptical about it having the 1080p IPS LCD but decided to take the plunge anyways. I absolutely love the screen; I finally have a travel laptop that I can be fairly confident in for post-processing my photos and posting them online.

    As for the 4GB of RAM being on the slim side for photo work, Lightroom is actually not too bad in terms of RAM consumption. Yes, I would like to have 6 or 8, but 4GB does work when editing Canon 7D sized RAW photos in Lightroom 4.1.

    *Lightroom 4 is a SLOW POS no matter how much RAM/CPU you throw at it. If it wasn't for a few very nice new features, I would go back to 3.x.
  • quiksilvr - Tuesday, August 28, 2012 - link

    Why did they go through the trouble of putting mini-displayport at all? I thought Ivy Bridge was Thunderbolt ready.
  • janderk - Tuesday, August 28, 2012 - link

    It's not a mini-display port (that is a small, probably soon fixed, error in the nice review). It is a mini VGA connection. My UX31A came with a VGA dongle which can be connected to this port.
  • JarredWalton - Tuesday, August 28, 2012 - link

    Oh, you're right! I assumed it was mini-DP with an adapter to VGA. What a shame!
  • Roland00Address - Tuesday, August 28, 2012 - link

    And there was no real reason to go with mini VGA. Mini VGA is a non-standard connection that will need an adapter to go to VGA.

    They could have gone with mini DisplayPort, which is about the same size and has five advantages:
    0) It can be adapted to VGA with a cheap adapter
    1) It can be adapted to DVI with a cheap adapter
    2) It can be adapted to HDMI with a cheap adapter (and this adapter will carry sound)
    3) It can run a 2560x1600 display with a mini DisplayPort to DisplayPort cable
    4) Mini DisplayPort adapters are far more common, so if you lose your adapter all you have to do is go to an Apple store or a Best Buy to get one, since Macs have standardized on mini DisplayPort/Thunderbolt. If you are a professional who needs to do a presentation and you're in a hurry, you can walk into a brick-and-mortar store and get the adapter right now.
  • peterfares - Sunday, September 9, 2012 - link

    Yeah, that mini VGA port is really stupid. Dell's ultrabook has a mini DP connector. It just makes so much sense.
  • DanNeely - Tuesday, August 28, 2012 - link

    Nope. You still need to add a TB chip for an extra $20-30 to the total cost and whatever tradeoffs the extra space needed on the PCB requires.
  • boogerlad - Tuesday, August 28, 2012 - link

    Is it possible to remove the cooler right after removing the back cover, or is the io cable going to get in the way first? I'd like to replace the thermal paste. Any warranty void stickers?
