Alienware's M18x, Part 1: NVIDIA's GeForce GTX 580M in SLI
by Dustin Sklavos on October 3, 2011 11:50 AM EST
Conclusion: More Notebook than You Need?
For end users who want a powerful gaming notebook, or heck, a powerful notebook in general, my go-to has been the Alienware M17x R3 since I reviewed it (provided they have the budget for it). Now that I've had a chance to sit down and review the M18x, my go-to is...still the M17x R3. The M18x is faster, yes, but with the added performance come some additional compromises.
In and of itself, the M18x is another feather in Alienware's cap. The screen quality is good, the overall design feels sturdy and attractive (although I think I'd pass on the red finish and stick with the black), I still love that stupid glowing keyboard, and the performance is there (and how!). There's plenty of connectivity, upgrade options, and so on. There's nothing inherently wrong with the M18x. Except that it's freaking huge.
Where the M17x R3 feels like a fairly balanced mobile workstation and gaming system, heavy but not so heavy that you just don't want to cart it around, the M18x is beastly. I review the lion's share of desktops here (read: desktops that don't have big glowing apples on them), and many builds often feel excessive to me. They're past the point of diminishing returns, where you just don't get performance and efficiency commensurate with their size, noise, cost, and power consumption. That's how I feel about the M18x. If the M17x R3 is a sound investment for someone who wants a good, stylish mobile gaming system with the performance they require, the M18x feels like an offering for the more-money-than-sense crowd.
We're at a point now where top-end mobile GPUs really are good enough, particularly when no one is shipping notebook displays above 1080p. NVIDIA's GeForce GTX 580M is incrementally faster than the GTX 485M, and it runs roughshod over AMD's still-capable Radeon HD 6970M; with the M17x R3, Alienware adds Optimus support as an enticing bonus. Adding a second 580M just doesn't seem worth the headache often associated with multi-GPU configurations, much less the expense. If you're only going to go with one GPU, there's absolutely no reason not to just get the smaller and more affordable M17x R3. Alienware offers four GPU configurations for the M18x, three multi-GPU rigs and a single GeForce GTX 560M, which really should tell you all you need to know.
If you want as much power as you can conceivably cram into a notebook, I can certainly recommend the M18x over any competition from Clevo or really any other notebook. You want power? You got it. But if you want a more balanced design, I'd strongly encourage you to stick with the M17x R3.
Of course, we're only halfway done with the M18x. Check back soon when we'll have the second half, focusing on the AMD Radeon HD 6990M in both single and CrossFire configurations to see how it performs on its own as well as how it stacks up against the competition. The 580Ms in SLI aren't just going to have to be faster than the 6990Ms in CrossFire, they're going to have to be $700 faster. Stay tuned.
Update: Alienware's muxed graphics solution still uses the drivers from NVIDIA's Verde program, so updating drivers is a non-issue. However, the end conclusion remains the same: the M18x still feels like too much, while the M17x R3 is probably going to be the gaming notebook of choice for the overwhelming majority of users.
Comments
Meaker10 - Monday, October 3, 2011
Not when you unlock the higher core voltage, bump up the core speed to GTX 560 Ti speeds =D
bennyg - Wednesday, October 5, 2011
At least the XM offers overclocking. The 2820/2860 compared with the 2720/2760 just offers a couple hundred MHz for hundreds extra... and I challenge ANYONE to feel the difference a few percent in CPU speed makes.
However, very few laptops use the HM67 chipset, which is required to OC with multipliers; most (like my P150HM) have the HM65, which can't. So unless you're willing to fork out for a top-end model it's meaningless to you, but not to those who are the target market.
Amrosorma - Monday, October 3, 2011
Would it be possible to get a quick and dirty look at how the notebook performs with the Battlefield 3 open beta?
I imagine that's the game everyone is wondering about with regards to performance.
AmdInside - Monday, October 3, 2011
Agreed. As soon as I saw this, I jumped straight to the Gaming Performance section in hopes of finding BF3 benchmarks.
JarredWalton - Monday, October 3, 2011
The BF3 beta is pretty bad in terms of working properly. When there are all sorts of bugs (falling through the world floor, for instance), I'd wager performance won't be optimal either. We're planning to add the full BF3 release to our mobile benchmarks when it becomes available, along with Rage, Skyrim, and some other updates, but it will probably be another two months before we make the switch.
Stuka87 - Monday, October 3, 2011
Yes, the beta is buggy. But I would have loved to see an unofficial benchmark. It doesn't have to be in the graphs, just a line or two that says "I tried the BF3 beta, and FRAPS reported these results with these settings," so we would have an idea as to whether this machine would run BF3.
jrs77 - Monday, October 3, 2011
C'mon, for that amount of money a matte screen should be standard. :facepalm:
BTW... why are glossy screens even manufactured anymore these days? They should get dumped altogether.
Darkstone - Monday, October 3, 2011
Because some people, including me, like glossy screens for the deep white they provide? I don't like matte because of the noise.
JojoKracko - Wednesday, October 5, 2011
Damn right. Matte should be standard on laptops. Ban the glossy crap. I would never pay this much for a laptop with a glossy screen. (Sadly) I'll be sticking with single-GPU MSI laptops.
yelped - Monday, October 3, 2011
Although you mentioned that Metro 2033 is an unoptimized title, you failed to mention that Crysis 2 with DX11 is even worse. Check out this article for more details: http://techreport.com/articles.x/21404&source=...