Overview

I started building computers a decade ago to help make ends meet while I was an undergraduate. In 2012, for the first time, I built more workstations than gaming boxes, HTPCs, and even budget systems. Whether this reflects my social network getting older, fleets of aging Core 2 Duo and Core 2 Quad machines finally getting too long in the tooth to remain serviceable, the rise of mobile gaming, or something else, I'm not entirely sure. But as readers of AnandTech know, the desktop computing segment is shrinking, and one area where the desktop is still undisputed king is productivity.

Though there are many powerful laptops that can crunch through large datasets and have screens and keyboards you can actually use for an entire day of work, these systems remain very expensive compared to similarly performing desktops, and no truly mobile device can match the raw computational power, capabilities, and flexibility of a desktop. (Yes, I know: Clevo sells some notebooks with desktop CPUs, but we've looked at a couple of those over the years and have always come away wanting something more.)

Fortunately for workstation users, 2012 has produced many tangible benefits. While Intel's Sandy Bridge-E CPUs remain the most powerful mainstream workstation CPUs, Ivy Bridge brought what Intel called a "Tick+" over Sandy Bridge. Last year, AMD introduced its disappointing Bulldozer-based processors; this year, it has narrowed the gap with its Piledriver CPUs. Though Piledriver doesn't match Intel's highest-end performance processors, at certain price points it is worth consideration because it can outperform equivalently priced Intel products (with a few qualifications).

As for storage, the stabilization of hard drive production in the wake of the Southeast Asia floods has brought massive 3TB and 4TB hard drives back down to mainstream prices. Developments in the SSD market have brought reliable, high-performance solid state storage down to prices at which it's a pragmatic choice for more than just operating system and application drives. DDR3 RAM prices have plummeted, to the point where you can sometimes pick up a whopping 32GB of desktop memory for about $100. Finally, competition has widened the premium workstation case market to more than a handful of players, including newer designs that are not only functional but also quiet and genuinely good looking.

One important consideration in deciding whether to build a workstation is exactly that: whether to build a workstation at all. Arguably, you can build a workstation that is more reliable than anything you can buy from a large-scale integrator like Dell, HP, or Lenovo, and one better suited to your needs than any pre-built system. The question is whether you can provide the same level of support as a large company.

Many of AnandTech's readers have the DIY know-how to quickly diagnose computer-related issues (whether software or hardware induced), and many of us keep spare parts on hand, so we can fix a computer even before the next business day. However, do you have time to spend a few hours troubleshooting a broken-down computer in the middle of a workday? Do you want to deal with that aggravation? Do you have spare systems already online that you can use while your primary productivity system is offline for a day or two?

These are important questions, and only you can answer them. If you'd rather not be your own technical support, it's best to stick with a pre-built system that comes with a support plan. If your computer is more than important (i.e., mission critical), DIY is rarely a good idea. That is, the more important your productivity work is, the more likely it is you'll be better off going the pre-built route and avoiding the hassle of providing your own parts and support.

If you're sold on the idea of building your own workstation, the next general issue to consider is whether your workloads benefit from GPU computing. Succinctly, graphics cards are much more proficient than central processors at certain types of tasks, namely those that are heavily parallelized. These tasks include scientific computing (such as Monte Carlo simulations, many bioinformatics analyses, and climate data work), audio signal processing (including speech processing), cryptography and cryptanalysis, and many functions used in video and image processing. One of the more popular software packages that makes extensive use of GPGPU computing is Adobe's Creative Suite 6; Adobe has an informative FAQ on GPGPU computing in CS6. Again, only you know whether powerful GPGPU capabilities make sense in your system, so for each of the builds we detail, we recommend a graphics card in line with the overall system budget (though you might want to spend more or less depending on your needs).
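To make "heavily parallelized" concrete, here is a minimal sketch of the textbook GPU example, a Monte Carlo estimate of pi, written as a CUDA kernel. This is our own illustration, not code from any package mentioned in this guide; it assumes an NVIDIA card and the CUDA toolkit (nvcc), and the file, kernel, and variable names are all hypothetical. Each GPU thread draws its own random samples independently, which is exactly the shape of workload where a graphics card can outrun a CPU by a wide margin.

    // monte_carlo_pi.cu -- illustrative sketch only; build with: nvcc monte_carlo_pi.cu
    // Estimates pi by sampling random points in the unit square and counting
    // how many land inside the quarter circle.
    #include <cstdio>
    #include <curand_kernel.h>

    __global__ void count_hits(unsigned long long seed, int samples_per_thread,
                               unsigned long long *hits)
    {
        int idx = blockIdx.x * blockDim.x + threadIdx.x;
        curandState state;
        curand_init(seed, idx, 0, &state);  // independent RNG stream per thread

        unsigned long long local = 0;
        for (int i = 0; i < samples_per_thread; ++i) {
            float x = curand_uniform(&state);
            float y = curand_uniform(&state);
            if (x * x + y * y <= 1.0f)
                ++local;
        }
        atomicAdd(hits, local);  // fold the per-thread counts together
    }

    int main()
    {
        const int blocks = 256, threads = 256, per_thread = 4096;
        unsigned long long h_hits = 0, *d_hits;
        cudaMalloc((void**)&d_hits, sizeof(*d_hits));
        cudaMemcpy(d_hits, &h_hits, sizeof(h_hits), cudaMemcpyHostToDevice);

        count_hits<<<blocks, threads>>>(1234ULL, per_thread, d_hits);
        cudaMemcpy(&h_hits, d_hits, sizeof(h_hits), cudaMemcpyDeviceToHost);

        double total = (double)blocks * threads * per_thread;
        printf("pi is approximately %f\n", 4.0 * h_hits / total);
        cudaFree(d_hits);
        return 0;
    }

The structure is embarrassingly parallel: tens of thousands of threads each do simple arithmetic with essentially no communication until the final tally. That is why the simulation and signal processing workloads listed above benefit so much from GPGPU, while branchy, serial code does not.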

In this guide we outline four workstations, priced from $850 up to over $2,000. We start with the least expensive builds on the next page.

Comments

  • Next9 - Tuesday, December 11, 2012

    There is another important argument: NBD (next business day) on-site warranty.

    If there is any problem with your real workstation, you call the vendor, and the next day you have a functional machine.

    If there is a problem with your do-it-yourself, consumer-grade, so-called workstation, you are left on your own.
  • PCMerlin - Monday, December 10, 2012

    I have to agree with you, Next9. The stability of ECC and the raw power of Xeon, along with Quadro or FirePro series video cards, should be the only combination for a serious CAD or other graphics workstation.

    I would NOT want to be the tech that has to answer the call when a designer wants to know why the drawing he/she just spent the better part of the day working on got "zapped" when his/her system blue-screened.

    In the regular workplace, the helpdesk guy can be the "hero" by restoring a crashed system back to life. CAD designers and engineers, on the other hand, would be perfectly happy if they never saw anyone from the IT world during their day-to-day work.
  • zebrax2 - Monday, December 10, 2012

    What happened to the workstation GPU review?
  • A5 - Monday, December 10, 2012

    Is that what the kids are calling it these days? ;)
  • GrizzledYoungMan - Monday, December 10, 2012

    While I agree with you on some points (see below), I'm still deeply skeptical of the usefulness of Quick Sync for professional video encoding. The image quality of those Intel commodity hardware encoders is really poor relative to any halfway decent software encoder. And pros tend to value quality over a few minutes (or even hours) of encoding time, as it so heavily affects the perceived overall quality of their product.

    But maybe that's changed? Perhaps a comparison article is in order?
  • JarredWalton - Monday, December 10, 2012

    I believe the point is that if you're uploading something to YouTube (which will futz with the quality, so there's no point in ultra-high rendering quality in the first place), Quick Sync is awesome. E.g. "Here's a preview clip -- it encoded in one minute on my little Ultrabook with Quick Sync, and since it's on YouTube the loss in quality doesn't matter."
  • GrizzledYoungMan - Monday, December 10, 2012

    I don't want to nitpick, but the fact that YouTube generally recompresses any video delivered to the site isn't a justification for skimping on the quality of video delivered to YouTube; it's a rationale for being even MORE careful about what you deliver.

    Speaking from experience, it's definitely possible to get video up on YouTube that looks great; you just have to deliver at the highest quality possible. If memory serves, YouTube accepts files of up to 20GB in size with no practical limit on bitrate, so I usually max out bitrate (via high quality settings, larger frame size, frame rate, etc.) as much as possible relative to the length of the clip and the file size limit.

    In general, the rule when encoding is that good video put through bad compression gives you mediocre video. Mediocre video put through bad compression gives you bad video.

    To put it another way, the more information (by way of better quality compression) delivered to the YouTube encoding pipeline, the better the overall result.
  • Next9 - Monday, December 10, 2012

    What is the point of using garbage consumer-grade boards like ASUS or ASRock?

    ASUS boards usually lack proper support for VT-d, ECC, AMT, and other professional features. The BIOS/UEFI interface is a complete piece of shit, with a GUI targeted at 10-year-old kids and full of stupid, tawdry "features" that add no real value to usability.
  • Rick83 - Monday, December 10, 2012

    I was about to say the same. This review lacks consideration of S1155 Xeons, C216 chipsets, ECC... basically everything that makes the distinction between a desktop and a workstation.
    And even the C216 ASUS board does not support AMT.

    With the current prices of these components, you would only add around 200 dollars to the mid-range machine to bring it up to workstation spec.
    ECC UDIMMs are only mildly more expensive than non-ECC UDIMMs; S1155 Xeons are only marginally more expensive than the i7 and come with all features unlocked; and the Supermicro X9SAE(-V) boards (the only boards for the S1155 workstation market that can be found in retail) go below 200 dollars if you shop around. That's twice the price of the bargain bin B75 board, but you get so much more for your money.

    There's little use in going higher end, as anything that requires more performance should probably not be at your workplace, but rather in a server room.
    The AMD route is an interesting way of getting ECC at a slightly lower price, but only if you can stomach losing remote management.
  • Ktracho - Monday, December 10, 2012

    What motherboard(s) would you recommend for ECC and full VT-d support? I built a system with three Tesla cards with the idea that one or two of them could be dedicated to a virtual machine, but I didn't realize the motherboard also needed to support VT-d. I have no idea how to find out which motherboards have this feature.
