
  • jeremyshaw - Friday, February 17, 2012 - link

o.0 Well... that's a new one I didn't expect to ever see, lol.
  • ViperV990 - Friday, February 17, 2012 - link

    I'm a little surprised at the low power consumption at 172W under load.

    Have you tried, or will you consider trying to run a stronger card in the system? I'm very curious whether the system can handle a 6870, 6950, 560Ti, 560Ti448, or 570 with relatively little down side to the power supply's longevity.
  • KineticHummus - Friday, February 17, 2012 - link

It is limited to a GPU pulling 150 watts or less. To my knowledge, none of those cards would work, except MAYBE the 6870. Not sure, though.
  • JarredWalton - Friday, February 17, 2012 - link

    I'd think that unless you want to try the Molex-to-PEG adapter route, your biggest limit will be GPUs that can work off a single 6-pin PEG connection. Of the available GPUs right now, that eliminates anything above the HD 6850 (so no 6870 or 6950) on the AMD side. The HD 7770 would work, but that's actually slower than the 6850 in most games so let's not worry about that one. As for the NVIDIA side, the GTX 555 is actually pretty well specced, all told. The GTX 550 Ti is a step down from the GTX 555, while the GTX 560 uses two 6-pin PEG connectors.

Of course, the 330W PSU is probably potent enough to actually power one of the moderate GPUs that use two 6-pin connectors. Looking at our power numbers here (which include a much beefier CPU that's running overclocked), you could "safely" go with something like the HD 6870 or the GTX 560. Those two GPUs trade blows in our gaming tests, and I'd probably give a slight edge to NVIDIA on performance, but the HD 6870 wins out with better overall power characteristics. If you're really daring, you could even try for an HD 6950 or HD 7950, but then you're really pushing the envelope.

    My best recommendation however would be to wait and see what AMD can do with their Pitcairn GPU. I'd expect power to be in the <150W range on the HD 7850 and probably not much more than that on the 7870 (yeah, I'm taking a guess at the card names). Judging by the HD 7770 and HD 7950 results, performance should also be very good, assuming it lands about midway between those two. NVIDIA might also have some reasonable options with Kepler when that comes out (and you might be able to keep Optimus support). But I'm not going to recommend buying an X51 right now just to upgrade the GPU, no.

    If you don't like the GTX 555 setup for $950, I think your best bet is to wait for the Ivy Bridge refresh in April -- probably May for Alienware/Dell to update the X51? Then you could probably get an equivalent IVB CPU that's somewhat faster than the i5-2320 and has better power characteristics, and you get HD 4000 graphics onboard as well (which will surely be better than HD 2000, even if they might not be awesome). What's more, there's a reasonable chance the next refresh will also have support for a mainstream Kepler card that should outperform the GTX 555.

    All you have to do is wait a few months and something better will come along. Except, then there's another "something better" a few months after that. :-)
  • Calin - Friday, February 17, 2012 - link

    There's always something better down the road.
    Great article, and that's a wonderful little system.
  • ViperV990 - Friday, February 17, 2012 - link

    (replying to both Jarred and Roland)

    Given the X51's video card limitations, I wonder how it'd compare to a system built on the Shuttle H67 barebone, with or without the 500W first-party P/S upgrade.

    Also, regarding desktop Optimus support, I seem to remember the Z68 supporting something similar, but with a performance hit. How exactly is desktop Optimus different?
  • extide - Friday, February 17, 2012 - link

Several of the Z68 boards support the Lucid Logix Virtu stuff, which is similar to Optimus.
  • JarredWalton - Friday, February 17, 2012 - link

It's similar but only in the loosest sense. I don't know precisely how Lucid's tech works at a low level, but AFAIK it requires profiles for the games/apps to use the dedicated GPU, just like Optimus. Last I looked, the list of titles supported by Lucid was a lot shorter than the Optimus list.
  • theclocker - Sunday, June 03, 2012 - link

Probably the best post I have seen about the X51 in terms of what it has and what you can do with it as currently configured. I would add...

The 7850 is a PCIe 3.0 card. While the 3.0 spec is backwards compatible, the specs given for the 7850 assume PCIe 3.0. I don't own one and haven't tried this setup, but unless the motherboard in the X51 supports PCIe 3.0, the 7850 probably won't run at full throttle as advertised. Let's assume this isn't the case for the next few paragraphs.

At first I wasn't sure about the advice you gave on certain cards, but rather than being a troll (I hate it when people post without doing at least some research first), I did some looking into what you are saying. It appears that with current die technology the AMD 7850 is a step up, and those who claim to have upgraded to the GTX 560 have seen some improvements.

According to what Wikipedia shows for the Radeon 7850: a texture rate of 55 GT/s and memory bandwidth of 153 GB/s. NVIDIA says the GTX 560 (non-OC, keeping power usage down) comes in at 45 GT/s and 128 GB/s. Compare both to the best card offered standard, the GTX 555, at 37 GT/s and 92 GB/s.

So the 7850 comes in at 1.48x the texture rate and 1.66x the bandwidth, and the GTX 560 at 1.2x and 1.4x respectively. I couldn't agree more with your advice: you're going to spend $1150 for an X51, then shell out ~$150 for a GTX 560 or ~$240 for a 7850? Although you could recoup a little if you felt safe selling the GTX 555. No thank you, as you said.
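For what it's worth, those multipliers check out if you just divide the spec-sheet numbers quoted above (treat the inputs as approximate; this is a quick sanity check, not measured performance):

```python
# Rough sanity check of the quoted multipliers, using the spec-sheet
# numbers from this post (GT/s texture fill rate, GB/s memory bandwidth).
specs = {
    "GTX 555 (OEM)": {"gt_s": 37, "gb_s": 92},   # the X51's top stock card
    "GTX 560":       {"gt_s": 45, "gb_s": 128},
    "HD 7850":       {"gt_s": 55, "gb_s": 153},
}

baseline = specs["GTX 555 (OEM)"]
for name, s in specs.items():
    gt_ratio = s["gt_s"] / baseline["gt_s"]
    gb_ratio = s["gb_s"] / baseline["gb_s"]
    print(f"{name}: {gt_ratio:.2f}x texture rate, {gb_ratio:.2f}x bandwidth")
```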

The X51 is not all bad for what it is; most everyone is incorrectly comparing it to a normal tower PC. Anyone who has built one of those small home theater PCs knows what they lack compared to a tower; there are limitations, as most of those mini boxes only have a few hundred watts to play with.

The X51 will probably migrate to a better offering further down the road, as there are quite a few reviews mostly liking it, followed by a lot of haters in the comments. Nevertheless, you could read all day long and never hit the same website twice with all the buzz Alienware has created with the X51.

As a professional electronics technician of 28 years who deals with the design and repair of switching power supplies, I do like the external supply idea. It removes one of the more offending heat sources, though this has been somewhat alleviated by newer case designs that separate the power supply area from the motherboard area. However, I would like to see some active cooling on these bricks if they go much above 330W; a sealed brick at this power level (and higher, if the idea flies) tends to get quite warm.
  • Sufo - Friday, February 17, 2012 - link

Instead of upgrading, how about a hefty overclock? I know the mobile 555 can achieve close to a 100% OC (not sure if the two cards have anything in common bar the name, though). Either way, there seems to be about 100W and 5-10C of safe wiggle room for power and temperature; I bet you could squeeze another 40-50% out of that.
  • swimtech - Saturday, February 18, 2012 - link

That sounds like a great idea for a follow-up article if they can hang on to the review unit for a while longer. I seriously doubt the 100% OC, though; 20% might be enough to get there for a better Battlefield 3 experience.
  • DanNeely - Friday, February 17, 2012 - link

Alienware had to spec the PSU for a worst case load that's rather heavier than the AT load test. Probably intel burn test + furmark with all the monitor ports connected, the optical drive burning a disk, all the fans spinning at max speed, and tablets charging from every USB port.
  • Roland00Address - Friday, February 17, 2012 - link

1) Video card power is limited to 150W, according to Dell.
2) Total system power is limited by an external power brick (the same one used in the M18x). Dell makes a 240W version and a 330W version; you may only get the 240W version if you skimp on the video card and processor. You can always buy the 330W power supply separately for $145.
3) The maximum length of the video card is 9 inches. This rules out a reference 6870, since that card is 9.84 inches; you may be able to find a non-reference 6870 with a shorter board.

A 7770 will fit, but it should provide performance similar enough to the GeForce 555 (OEM only) to be barely worth the trouble. The 7800 series, on the other hand, may actually work.
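Pulling those limits together, a quick hypothetical filter shows why so few cards qualify. The TDP and length figures below are approximate reference-design specs (the 7850 numbers are guesses, since the card isn't out yet); non-reference boards can differ:

```python
# Hypothetical filter of upgrade candidates against the X51 limits
# listed above: <=150W board power and <=9.0" card length.
MAX_POWER_W = 150
MAX_LENGTH_IN = 9.0

candidates = [
    # (name, approx. TDP in watts, approx. reference length in inches)
    ("HD 6870 (reference)", 151, 9.84),
    ("HD 7770", 80, 8.3),
    ("HD 7850 (rumored)", 130, 8.7),
]

def fits_x51(tdp_w, length_in):
    """True if a card stays inside both the power and length limits."""
    return tdp_w <= MAX_POWER_W and length_in <= MAX_LENGTH_IN

for name, tdp_w, length_in in candidates:
    verdict = "fits" if fits_x51(tdp_w, length_in) else "does not fit"
    print(f"{name}: {verdict}")
```

A card has to clear both constraints; the reference 6870 misses on length alone, regardless of power.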
  • Roland00Address - Friday, February 17, 2012 - link

*wishes for a preview button*, but hey at least most of the spammers have gone away.
  • NicodemusMM - Friday, February 17, 2012 - link

I've been seeing a new drive carrier in some of Dell's Optiplex models. It's essentially a plastic insert that fits into their 3.5" drive slots but holds two 2.5" drives. Such an option may be viable for this model: simply use one SSD and one 7200 RPM 2.5" drive.

    I like the looks of this, but I hope their fan profile allows it to spin up a bit once users get it into their house and fill it with dog hair and tobacco tar. I do not look forward to calls on this model.
  • Robalov - Friday, February 17, 2012 - link

    That case is exactly what I want and have been looking for.

    Hopefully down the line, some of the cases will make it to ebay.

The computer itself fills a hole for those wanting to game without lugging around a full tower, but it's a small market. I imagine this would get more sales as an HTPC if they marketed it as such.
  • Meaker10 - Friday, February 17, 2012 - link

The main problem I see here is that you can get a notebook with a built-in 1080p screen, a GTX 570M, and two hard drive slots (supporting RAID 0) for the same price!
  • Dustin Sklavos - Friday, February 17, 2012 - link

...where?
  • rpsgc - Friday, February 17, 2012 - link

The only notebooks around 1000 USD I can find all have a GTX 560M at best.
  • kevith - Friday, February 17, 2012 - link

That's the only thing I was looking for throughout this review: can you find these features at the same price in a laptop? You can't.
  • Meaker10 - Friday, February 17, 2012 - link

Xotic pc 16f2 barebone.
  • Dustin Sklavos - Friday, February 17, 2012 - link

Except a barebone doesn't come with a CPU, or hard drive bays, or memory.
  • Dustin Sklavos - Friday, February 17, 2012 - link

Er...or hard drives. Derp.
  • Meaker10 - Friday, February 17, 2012 - link

This is a customised barebone, so the price includes a dual-core CPU, 8GB of RAM, an HDD, wireless, and the OS.

    It would also ship with brackets for a second HDD.

Plus it supports RAID.
  • kasakka - Sunday, February 19, 2012 - link

The mobile graphics chips are nowhere close to the speed of their desktop counterparts. A GTX 460M, or whatever they've rebranded it as in the 5xx series, is about the same as a desktop GTX 260, a several-years-old GPU.

    Personally I'd love to see more gaming PCs like this, but simply ditch the optical drive for more space so a bigger PSU, graphics card and cooler can fit in.
  • Death666Angel - Sunday, February 19, 2012 - link

This thing has no PSU inside the case, so ditching the ODD would not give you more space for a PSU, and without a bigger PSU you can't have a beefier GPU.
  • Mark_Hughes - Friday, February 17, 2012 - link

I like this system. If I were looking right now, this would certainly be high on my list. I normally use laptops, but one of my laptops hardly ever moves from the desk; this would make a great replacement when the time comes.
  • Swirlser - Friday, February 17, 2012 - link

    Congrats to Dell on once again messing up and being blinded by its continuing race to the bottom.

As if it wasn't bad enough a few years back when they bought Alienware and began pegging their XPS line against it, now they've taken what little was left of the BRAND and killed it with this bargain-basement offering, quintessentially the opposite of what Alienware *was* about.

I'm embarrassed that once upon a time I bought an Alienware. It cost 5,100 euro (monitor not included) and was without a doubt the worst purchase I've ever made; I've had rather expensive cars experience less depreciation than it did! In the 6 weeks it took to deliver, a dual-core version of what I bought had come out (that'll give you an idea of how long ago it was). While money isn't particularly an issue, and whether you like it or not there's always something better around the corner, it still stung that before I even opened the box it was outdated. Fine, that was my own fault for not doing more research; in fact I did none. So my bad on that! BUT, even ignoring that, it was still a stupidly overpriced box that within a year had me itching for an upgrade.

Thankfully, I've since seen the light. I have gone the Lego route on my last four-ish rigs and it's been a joy. Forget the cheaper price tag; just being able to hand-pick each part, including the case, the motherboard, and the RAM (none of which you get control over with an Alienware), is worth it. Heck, last time I checked they did away with picking a colour scheme (can't be bothered to check if it's back or not).

Dell has a bargain-basement brand already; it's called Dell. The Alienware brand may have been a shrinking niche market, but it *was* that: a brand. Not anymore.
  • KitsuneKnight - Saturday, February 18, 2012 - link

    Are you seriously lamenting that Dell is making Alienware sell more reasonably priced machines in the same post as you tell us the wonderful story of how the "worst purchase [you've] ever made" was a 5,100 euro Alienware machine that was an overpriced piece of junk, and outdated by the time you opened the box?

    Um... what?
  • seapeople - Saturday, February 18, 2012 - link

I was talking to a guy at work once who had just built his first new computer, and I asked him why he didn't just get one from Dell, since you don't save all that much money. He used the argument about how great it is that you can pick each and every single component: whatever motherboard you want, the CPU, RAM, etc.

    I then asked him "Oh, ok.. what CPU did you get?"

    His response was "Um, I don't remember. I think it was Intel something or other. It was a quad core I'm pretty sure." (He had just finished building it about a week before)

    So I said, "You mean, like a Q6600? Did you overclock it?"

    "I... I just don't remember. Overclock? What's that?"

    Then he went on to tell me how his computer kept crashing every few minutes, but he didn't know what to do so he just used it like that. It's people like you that lead poor, unknowledgeable saps like this astray.
  • Death666Angel - Saturday, February 18, 2012 - link

Cool story, bro.
  • frozentundra123456 - Friday, February 17, 2012 - link

Kind of a nice system, but I am not sure I really see the point of cramming all this into a small form factor on the desktop. Personally, I would prefer a larger system that could be more easily upgraded. And for HTPC use, it seems overkill.

I still have to admire the system and the price it is selling at, but I would either go for a larger desktop gaming PC or a gaming laptop for portability.
  • Shadowmaster625 - Friday, February 17, 2012 - link

Whoa whoa whoa, wait a minute. Are you saying you plug your monitor into the HDMI port on the motherboard, and it can route both GPU outputs to that HDMI port?
  • Phynaz - Friday, February 17, 2012 - link

Yes, that's how it works.
  • Shadowmaster625 - Friday, February 17, 2012 - link

How does the NVIDIA PCI Express GPU route its video signal back to the motherboard, through the CPU, and out to the HDMI port on the motherboard?
  • JarredWalton - Friday, February 17, 2012 - link

It's called Optimus Technology, and it's been around for a few years now. The NVIDIA drivers detect supported applications (you can add your own), and the dGPU does the work and then copies the resulting frame directly into the IGP's frame buffer.
  • tipoo - Friday, February 17, 2012 - link

I think what he's asking is this: the reason NVIDIA gave for it not working on desktops was that the card had different physical links than the onboard graphics, unlike laptops, so switching using just one port was impossible. How did they do it here, then?
  • JarredWalton - Saturday, February 18, 2012 - link

Optimus has always been possible on the desktop; NVIDIA (no matter what they might suggest) just hasn't allowed for it. Alienware/Dell apparently wanted it enough that NVIDIA allowed it on the X51. I wouldn't be surprised if we see more desktops with Optimus going forward, so maybe the X51 is more of a forerunner than an exclusive thing. Since the dGPU just copies the content into the IGP frame buffer, any ports supported by the IGP are supported by the dGPU. I believe there may be some limitations on what is supported (e.g. HDMI 1.4a and 3D Vision may not be available via this sort of configuration).
  • Dustin Sklavos - Saturday, February 18, 2012 - link

3D Vision isn't available on the X51 unless you connect directly to the GeForce.
  • Cali3350 - Friday, February 17, 2012 - link

Could you tell me if the internal sound includes Dolby Digital Live! encoding?
  • Anonymous Blowhard - Friday, February 17, 2012 - link

    "With such a compact design one would expect the X51 to be both loud and hot, but surprisingly this isn't the case. Quite the opposite actually; the X51 is cooler and quieter at both idle and load than the first-generation Xbox 360 was."

    I'm pretty sure I've heard quieter power tools than a first-gen 360. That's not exactly shooting for the moon there.

    How far away is that 40dB measurement being taken from? This makes the difference between "gaming capable HTPC" and "banned from the living room."
  • haukionkannel - Friday, February 17, 2012 - link

This is something like a paragon of "the best you can get" when thinking about next-generation consoles.
The consoles will most probably be even more crippled by power consumption, and this would be too expensive, so they would also require cheaper parts...
It will be nice to see, when the Xbox 720 comes out, how it compares to this...
  • A5 - Saturday, February 18, 2012 - link

    Take this and replace the GPU with something with DX11.1 support and similar thermals (a 6850 with DX11.1 features added seems reasonable instead of a 7770), and you're probably in the ballpark.

    Good-looking console games come from the incredible amount of optimization possible due to a single hardware configuration, not from the power of the hardware.
  • A5 - Saturday, February 18, 2012 - link

You'd also replace the CPU with some kind of PPC variant if the rumors are to be believed.
  • tipoo - Saturday, February 18, 2012 - link

The first-revision 360 had a 200W maximum power draw; this has a 172W draw. I think they could do it, but I think Microsoft at least, and probably Sony too, will rethink the selling-at-a-loss strategy this round, as it took them a looong time to recoup losses. There's a rumor the Nextbox will use a 6670-like card, but I think (and hope) that is false, as the original 360 dev kits used an old X800 graphics card before they finally came with the X1900-like chip in the 360.
  • Traciatim - Friday, February 17, 2012 - link

It's really unfortunate that you couldn't have done the gaming benchmarks with the i3, i5, and i7 models to see how much of a difference each step makes in a variety of games.
  • Wolfpup - Friday, February 17, 2012 - link

The answer is power gating, not switchable graphics. Until power gating gets better, we need the GPU acting as a GPU.

    These articles keep acting like it's fine, and in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics.

    Articles like this that keep promoting it have casual users trying to buy stuff confused, when you've got 10 people on a forum trying to talk them out of it.

    I'm used to Anandtech being dead on with everything, so this Optimus push of the last few years is BIZARRE.
  • TrackSmart - Friday, February 17, 2012 - link

    Switchable graphics makes a lot of sense for a mobile system, where an extra couple of watts of power draw can mean an extra hour or two of battery life. I'm already amazed at how little energy *very powerful* modern graphics cards use when idling. How much lower do you think they can realistically go? Until they can get within range of their mobile parts at idle, switchable graphics will continue to be a compelling feature for keeping laptops running longer.

If you are talking specifically about desktop computers, then I agree that the benefits are minimal, aside from access to Quick Sync for those few people who would use it.
  • JarredWalton - Friday, February 17, 2012 - link

"...in practice, it's one person after another getting blue screens, driver weirdness, difficulty installing Nvidia or AMD's drivers, etc., that you just don't see on most systems without switchable graphics..."

    I disagree. I've had very few BSODs, taking all of the Optimus laptops I've tested/used together over the past few years. I'm sure there are probably exceptions, but certainly within the last 18 months I've had no complaints that I can think of with Optimus on my personal laptops.

    I don't think Optimus fills a major need for a desktop, but posts like yours claiming that Optimus is essentially driver hell and problems are, in my experience, the rantings of someone who either had one bad experience or simply hasn't used it.

But let's put it another way: what specific laptops have you used/tested with Optimus where there were clear problems with Optimus working properly, where drivers couldn't be updated, etc.?
  • TrackSmart - Friday, February 17, 2012 - link

    Gamers are the target audience, yet a marginally bigger case would have allowed for a more powerful GPU. Or a similarly powerful GPU for a lot less money. This is not a mobile system where every square cm of space counts, so why force the consumer to make such large compromises in price:performance?

    Obviously I'm not the target audience. Just like I will never own an "all in one" desktop computer that has the performance of a laptop. It just doesn't make sense unless you have absurd space limitations.
  • ranilus - Monday, February 20, 2012 - link

    The advantage is the flexibility of where you can put the system. On the desk, right beside/behind the monitor, on the floor, in an entertainment center, etc.

I had an Aurora. I was happy with the stock CPU and GPU, and didn't feel the need to upgrade or overclock, or add SLI/CF or RAID 0. But that case, oh man, it wasn't just HUGE. It was HEAVY. I think it was 60 lbs. I couldn't put it on the desktop, and I couldn't fit it in the tower compartment of my computer desk; I could only put it beside the desk, and it was sort of in the way the whole time.

    There's always the want for simplicity, a neat desk-area, a clutter-free Feng-shui, and/or an aesthetically pleasing gaming room. The X51 achieves that, while also being relatively powerful.

It is indeed just as you've said: you are obviously not the target audience. Really, the system, and All-In-Ones, are for those who appreciate a holistic Chi.
  • Coup27 - Friday, February 17, 2012 - link

Nice review and a neat little system. What sets this apart is the custom design and build. On that front it is a shame there is only one photo of the inside of the unit, and even in that one you cannot see past the side.
  • Death666Angel - Friday, February 17, 2012 - link

I like the clean build, the form factor, and the relative power it packs. However, I'm a PC nerd and would never buy a complete PC unless I could save 100-200 bucks compared to the components used (which is impossible).
  • Leyawiin - Saturday, February 18, 2012 - link

I'm kind of intrigued by this tiny form factor and the relative power it has. I'm sure it would perform better than my mid-range PC (GTX 460 OC'd and X4 955 OC'd). Interesting...
  • JarredWalton - Saturday, February 18, 2012 - link

The CPU would be faster in some cases, yes, but the GPU? An overclocked GTX 460 is almost certainly going to outperform the GTX 555 (OEM). The 460 has 336 cores at 1350MHz (stock), which works out to 907.2 GFLOPS (theoretical), and the 256-bit GDDR5 memory interface at 3600MHz (effective) gives you 115.2 GB/s of bandwidth -- that's assuming you have the 1GB version of the GTX 460; if not, you'd be sitting at 86.4 GB/s.

In comparison, the GTX 555 has 288 cores at 1553MHz, which yields a theoretical 894.5 GFLOPS. It has a 192-bit memory interface running at 3828MHz, for 91.9GB/s of bandwidth. So, at stock, the GTX 460 1GB card would have 1-2% more computational power and 25% more bandwidth, but you say your card is overclocked, which means however far you've overclocked translates directly into more computational power.
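For anyone wanting to check these numbers (or run them for a different card), the standard theoretical-throughput formulas are: 2 FLOPs per CUDA core per shader clock, and bus width in bytes times the effective memory clock. A quick sketch:

```python
# Theoretical GPU throughput math behind the figures above.
def gflops(cores, shader_mhz, flops_per_clock=2):
    """Theoretical single-precision GFLOPS: cores x FLOPs/clock x shader clock."""
    return cores * flops_per_clock * shader_mhz / 1000.0

def bandwidth_gbs(bus_bits, effective_mhz):
    """Theoretical bandwidth: (bus width in bytes) x effective memory clock."""
    return bus_bits / 8 * effective_mhz / 1000.0

print(f"GTX 460 1GB: {gflops(336, 1350):.1f} GFLOPS, "
      f"{bandwidth_gbs(256, 3600):.1f} GB/s")
print(f"GTX 555 OEM: {gflops(288, 1553):.1f} GFLOPS, "
      f"{bandwidth_gbs(192, 3828):.1f} GB/s")
```

At stock clocks this reproduces the 907.2/115.2 and 894.5/91.9 figures above; plug in overclocked values to see how the gap widens.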
  • Leyawiin - Sunday, February 19, 2012 - link

Ah, I'm not that familiar with the GTX 555 since it's OEM-only and there are no reviews anywhere. I am playing Skyrim almost exclusively right now, and it's so CPU-bound I was thinking the stronger CPU would have more of an impact.
  • JarredWalton - Sunday, February 19, 2012 - link

The 1.4 update should have alleviated a lot of the CPU-bound issues. At least, it seems to have done so on my PC. Plus you can also use the high resolution texture packs -- though with a 1GB card that might be asking too much.
  • TareX - Sunday, February 19, 2012 - link

Extremely irrelevant, but I'm wondering when Anandtech will be reviewing the world's latest, fastest, most impressive handheld gaming machine coming out this week...
  • AndySocial - Tuesday, February 21, 2012 - link

I find it interesting that no reviewers ever seem to review the base model; that would be enlightening. Many people are probably intrigued by the idea of a small system with ostensibly enough power to play current games on their HDTV (gotta love HDMI standardization across PC and TV usage). But this review, like every other I've seen so far, won't tell them if the most affordable system is worth buying. This is especially true when the X51 uses OEM-only video cards, so a typical user is not going to be able to find a lot of comparisons of other systems with the same specs.
