Conclusion: An Ultraportable Demon... The Screen Still Sucks, Though

When Jarred reviewed the Alienware M11x R2, he was so pleased with it that he did the most sensible thing he could: compile a wish list for the next-generation model. Improved connectivity, a better screen, and DirectX 11-class graphics. There was no excuse for the omission of gigabit ethernet in the R2, but there wasn't a good, power-optimized DX11 graphics solution on hand at the time either. In upgrading the M11x, Alienware has fixed everything that matters and bolstered everything else. Gigabit ethernet, Bluetooth 3.0 as an optional upgrade, USB 3.0, and the improved NVIDIA GeForce GT 540M all work alongside the shiny new low-voltage Sandy Bridge processor to provide the most gaming performance per square inch one could conceivably pack into a modern laptop.

Well, almost everything's been fixed. The screen continues to be a major sore spot for the M11x R3, and if anything it's only gotten worse. While Alienware seems to have gunned for netbook-of-the-year with the M11x's design, the 11.6" screen seems like more of a formality than a legitimately practical decision. The bezel is huge and could easily accommodate a 12.1" screen or larger. A bigger panel would also bring IPS technology to the table; if Lenovo can pack an IPS screen into the ThinkPad X220, we don't see why Dell can't source similar panels for a premium piece of kit like the M11x. Of course, then it's not the M11x, it's the M12x, but we'd be willing to increment the model number by one if it means a vast improvement on the one part of the M11x that most desperately needs attention.

Since most of our requests have been addressed by the M11x R3, it seems only fitting to continue looking Alienware's gift horse in the mouth and asking for more. My wish list includes three things. The first is the obvious one: improving that screen. My second is one that I think has a better shot of happening, and that's the inclusion of an mSATA SSD as a system drive alongside the 2.5" HDD for storage. Like a lot of you, I'm a big proponent of mSATA becoming fairly universal in modern notebooks: even if a notebook doesn't ship with an mSATA SSD, the option would be greatly appreciated.

My third request extends to the M11x R3's big brother, the M14x. While having the intakes on the bottom of the notebook is fine for land monsters like the M17x and M18x, notebooks as small as these two should be usable on the user's lap, period. I don't like having that intake someplace where it can be easily blocked off, and the "wind tunnel" style of cooling that Intel pioneered and Toshiba employs with the Tecra R840 and Portege R830 looks like the kind of redesign the M11x and M14x desperately need. Granted, the inside of the M11x is pretty cramped to begin with, but finding some way to improve the cooling system to further reduce noise and allow the notebook to actually be used as a laptop would still be appreciated.

As for the M11x R3 itself? Well, the M11x R2 was an Editor's Choice Silver winner, and certainly easy enough to recommend. Everything is up (except the price tag for a decent configuration), and you're still not going to find a more portable gaming solution. It should be a shoo-in for Editor's Choice again, but in the process of updating everything, Alienware still left one of the most grievous problems with the M11x untouched... again. In fact, it's worse than untouched; the problem was actually exacerbated. The panel in our review unit has defied the odds and is somehow worse than its predecessors in every metric but brightness. Jarred's gone back and forth over things like this before, and unfortunately I have to agree with him: the first time is forgivable, but we're on the R3 and the screen is still dire.

The R3 is easy to recommend over the R2. It's absolutely worth the money, definitely the best M11x Alienware has released thus far, and an easy sell for the portable gamer. The $999 stock configuration can easily be left unchanged: the i5-2537M isn't too much slower than the i7-2617M, 4GB of DDR3 is enough to game on, the 320GB 7200-RPM hard drive is on the smallish side but still decent, and adding an extra 1GB of video memory to the GT 540M is a waste. So while the base price has gone up over time, the actual cost of getting a good configuration seems to have dropped. If you were interested in the M11x before, the R3 is awesome.

But we can't reward complacency. Our biggest gripe with the previous two has only gotten worse with time. Fix the screen, Alienware, and you've probably got a Silver award in your future. Tweak the cooling and you'll go Gold.


  • JarredWalton - Friday, July 22, 2011 - link

    There are two major problems with your A8-3530MX plan:

    1) Judging by the results of the A8-3500M, it is unlikely that the 3530MX would actually be faster than the i7-2617M in most applications. Certainly the SNB CPUs will be faster in single-threaded performance.

    2) i7-2617M (and the i5-2537M) are 17W TDP parts, which means they should use at most around 17W. The A8-3500M by contrast is a 35W TDP part, while the 3530MX is a 45W part; good luck getting that into a sub-13" chassis. I know you hate Intel, but that doesn't make AMD's parts universally better.

    Assuming Alienware did go with Llano, pricing should at least drop $50 to $100 or so, which really isn't the point with a premium brand like Alienware. Perhaps Dell or HP will make a 13.3" laptop with an A6 (A8 TDP is still too high for most companies to try fitting it into anything smaller than 14"), and sell it for $600 without a discrete GPU. Performance will be lower, making high detail gaming a non-starter, but you can get two such laptops for the cost of the M11x R3 with i7 CPU.
  • Beenthere - Friday, July 22, 2011 - link

    Seems like there's a few Intel fanbois at Anandtech. The M11x is still ugly and they won't get my money until they sell AMD powered laptops. :)
  • JarredWalton - Friday, July 22, 2011 - link

    And that, my friend, is the definition of a fanboi: "I won't buy it unless it has brand X." It's not about being better; rather, it's about using a specific brand for no reason other than the brand.

    Fact: Intel currently has a faster CPU at every power level than AMD.
    Fact: AMD has a better integrated GPU.
    Fact: With a discrete GPU, Intel will be faster (see point one above).
    Fact: AMD costs less for their APU vs. comparable Intel CPU + dGPU.

    So, if you want to take that and say that AMD is better on price/performance, that's fine. They are. But if you need a specific level of performance, the pricing difference starts to erode. Consider:

    The Fusion 6620G graphics is about as fast as a GT 525M, slower than the GT 540M, and also slower than the HD 6630M. It's good for up to ~medium settings at 1366x768, but you wouldn't want higher resolution gaming on it. Add in a faster dGPU and you've added $100 to the price, and now the Llano CPU becomes more of a bottleneck. Heck, the Llano CPU is even a bottleneck for the fGPU in the 3500M, though I suspect that will largely go away with the 3530MX.

    So if you're looking at a 45W TDP Llano and adding in a dGPU, how would that be better than just going with a faster Intel CPU with the same dGPU? If the price difference is only $50 (which precludes Alienware type of hardware, obviously), and you're already paying $800+, the 6% increase in total cost will be outweighed by a greater than 6% increase in overall performance.

    Llano A8 laptops priced under $700 should sell quite well. That's not even remotely in Alienware/Dell's plans for the M11x, which is why they're not worried about Llano. Deliver better performance, charge more, and make more money -- that's what the M11x R3 is supposed to do. If you don't want to buy it because it costs too much, that's sensible, but to refuse to buy something "because it doesn't have an AMD APU"? That's brand loyalty, which is just a less offensive way of saying fanboi.
  • SquattingDog - Saturday, July 23, 2011 - link

    Well said Jarred :)
  • redchar - Friday, July 22, 2011 - link

    The M11x and Llano seem to have a lot in common, I was thinking. Both put more focus on the GPU than on the CPU - but not in such a way as to leave the CPU crippled like you would with Atom or Bobcat. I could see a Llano laptop the size of the M11x; sure, Llano parts are 35-45W, but the GT 540M in the R3 is around 30-40W too, and Llano is a combined CPU+GPU chip. So I believe you could make a laptop similar to the M11x with Llano, and it would cater to the same market, more or less. The only thing is, as this review shows, it would not perform as well. Close, but not as good. It's actually impressive how well Sandy Bridge plus the NVIDIA discrete GPU work together with Optimus to deliver better performance and battery life than a Fusion part gets, seeing as Fusion's integration has its advantages. I'm not the kind of fanboy who would bother getting an inferior machine just to avoid buying a competing product, so I would stick by the M11x, but Llano would be a pretty nice second choice, and at a lower price too, I would assume.

    Certainly, as he mentioned though, Alienware's styling is not the best, and not everyone can put up with it. I believe in function over fashion, but that's just me.
  • Wolfpup - Friday, July 22, 2011 - link

    Disappointed it still uses Floptimus. But the GPU itself... is that really much of an upgrade? Yes, it's 96 cores instead of 72 (and DirectX 11), but the new architecture ends up performing somewhere between 2/3 and 100% of the speed of the old architecture for the same number of cores, right?

    Is it really that big an upgrade?

    Total side note: I've got a 96-core part just like that in a desktop that runs Folding at Home 24/7. Wish I could put something better in there, but it's got a tiny power supply, and that part works (it does a heck of a lot of work Folding, too!).
  • JarredWalton - Friday, July 22, 2011 - link

    Using clever terms like "Floptimus" doesn't actually make the technology bad. What's wrong with Optimus that would make you actually prefer not having it?

    Let me see... you can get Intel and NVIDIA reference drivers, switching happens very quickly, and you pay perhaps a 3-5% performance hit in some cases (due to the copying of the frame buffer). You also lose out on 3D Vision (who cares?), and for some titles you have to create a custom profile.

    The alternative is the switchable graphics used in the original M11x. Switch and reboot, or switch and watch the screen flicker for five seconds; neither is a great experience (and SLI notebooks flicker just as badly). Now go ahead and ask how many driver updates such laptops have received since launch. I believe the correct answer is two updates since the March 2010 review, and that's two more than most other non-Optimus switchable graphics laptops have received. The current driver is actually relatively recent: 263.08, released in March of this year. Any games released since then have the potential for compatibility issues, though the GT 335M isn't likely to have issues since it's an older DX10.1 part.

    In short, Optimus isn't perfect, but it's a lot closer than the switchable graphics alternatives I've encountered.

    Concerning the GT 540M vs. GT 335M, it's really not even close as far as performance goes. Sure, each core on the 400M/500M may not always be faster than a 300M core, but the 540M comes clocked at 1344MHz on the shaders vs. 1080MHz with the 335M. We tested the ASUS N82Jv, which used a full i5-450M instead of a ULV CPU, and even with a ULV SNB part the M11x R3 is quite a bit faster in gaming (the closest the GT 335M gets is in STALKER at our Medium preset; everything else is about 20-40% faster for the 540M). Here's the comparison: http://www.anandtech.com/bench/Product/246?vs=396
  • redchar - Friday, July 22, 2011 - link

    I'm still on the R1, so with the launch of the R2 I thought I didn't care for Optimus - the R2 had a loss in battery life, and I was wondering if it had to do with the NVIDIA GPU turning on when it didn't need to, i.e. imperfect sensing of which GPU is needed for which task. So I thought I liked the switchable option best, since it puts the user in complete control and lets him decide when he wants to use the discrete GPU.
    But considering the R3 is better all around, I can't really hate Optimus now. It's certainly easier, especially for the casual user who wouldn't know how to work switchable graphics, or that it even existed. If the R3 can get the same battery life with pain-free Optimus that the R1 can get with manual switching, then it's all for the best.
  • Uritziel - Friday, July 22, 2011 - link

    I completely agree with Jarred.
  • ouchtastic - Friday, July 22, 2011 - link

    I think 1280x720 is the sweet-spot resolution for gaming on this laptop; it's a standard HD resolution, and your framerates can only be higher than what's already been benched.
