Intel’s Prebuilt Test System: A $7000 Build

How we receive test units for review has varied greatly over the years. The company providing the review sample has a range of options, which differ in how hands-on the vendor needs to be.

For a regular, run-of-the-mill launch, such as Kaby Lake, Coffee Lake, or Coffee Lake gen 2 – second-generation launches on the same mature platform as the previous generation – we get just the CPU and a set of ‘expected test result notes’ to help guide our testing. The reviewers are expected to know how to use everything, and the vendor has confidence in the reviewer’s analysis. This method allows for the widest range of sampling and the least work at the vendor level, although it relies on the journalist having the relevant contacts with motherboard and memory companies, as well as the ability to apply firmware updates as needed.

For important new launches, such as Ryzen and AM4, Threadripper and TR4, or Skylake-X and X299, the vendor supplies the CPU(s), a motherboard, a memory kit, and a suitable CPU cooler. Sometimes there’s a bit of paper from the FAE tester confirming that the set worked together over some basic stress tests. This puts less work in the hands of the reviewer, who knows that none of the kit should be dead on arrival and that it should at least get to the OS without issue.

For unique launches, where only a few samples are being distributed, or where there is limited mix-and-match support ready for day one, the option is the full system sample. This means the case, motherboard, CPU, CPU cooler, memory, power supply, graphics card, and storage are all shipped as one, sometimes directly from a system integrator partner, with the idea that the system has been pre-built, pre-tested, and is ready to go. This should give the reviewer the least amount of work to do (in practice it’s usually the opposite), but it puts a lot of emphasis on the vendor to plan ahead, and limits the scope of sampling. It is also the most expensive option for the vendor to implement, but usually the tradeoff is perceived as worth it.

We usually deal with options one or two for every modern platform to date. Option three is only ever taken if the CPU vendor aims to sell the processor to OEMs and system integrators (SIs) only. This is what Intel has done with the Xeon W-3175X; however, the company built the systems internally rather than outsourcing them. After dispatch from the US to the UK, via the Netherlands, an 80 lb (36 kg) box arrived on my doorstep.

This box was huge. I mean, I know the motherboard is huge – I’ve seen it in the flesh several times – but Intel went and super-sized the system too. The box was 33 inches (84 cm) tall, and inside it was a set of polystyrene spacers around the actual box for the case, which itself had more polystyrene spacers. Double spacey.

Apologies for taking these photos in my kitchen – it is literally the only room in my flat in which I had enough space to unbox this thing. Summer wanted to help, and got quite vocal.

The case being used is the Anidees AI Crystal XL AR, listed on the company’s website as offering ‘all the space you need for your large and heavy loaded components’. It supports HPTX, XL-ATX, E-ATX, and EEB sized motherboards, along with a 480mm radiator on top and a 360mm radiator at the front, and comes with five 120mm RGB fans as standard. It’s a beast, surrounded with 5mm tempered glass on every side that needs it.

The case IO has a fan control switch (which didn’t work), two audio jacks, an LED power button, a smaller LED reset button, two USB 3.0 Type-A ports, and two USB 2.0 Type-A ports. These sit flush against the panel, making for a very straight-edged design.

This picture might show you how tall it is. Someone at Intel didn’t install the rear IO plate, leaving an air gap; in fairness, the system airflow was designed with the rear of the chassis as the intake and the front of the chassis as the exhaust. There are ten PCIe slot gaps here, along with two vertical ones for users who want to mount a card that way. There is sufficient ‘case bezel’ on all sides, unlike some smaller cases that minimize this.

Users may note that the power supply has an odd connector. This is a C19 connector, usually used for high-wattage power supplies, and Intel had strapped a suitable power cable to the box.

This bad boy is thick. It is a US cable, with an earth pin so large that it would only fit in one of my adaptors – and even nudging the cable caused the machine to restart, so I had to buy a UK cable, which worked great. The unit is clearly designed for the lower-voltage US market: on a 120V line it has to be able to deliver up to 13A of current, or potentially more, and it is built as such. It is obviously recommended that no socket extenders are used, and that this goes directly into the wall.
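As a back-of-the-envelope illustration of why the US cable needs to be so beefy (my own arithmetic, not a figure from Intel), the current drawn for a given load scales inversely with the line voltage:

    # Rough wall-current estimate for a 1600W power supply.
    # I = P / V, ignoring power factor and PSU efficiency for simplicity.

    def line_current(load_watts: float, line_volts: float) -> float:
        """Current in amps drawn from the wall at a given load."""
        return load_watts / line_volts

    for volts in (120.0, 230.0):
        print(f"{volts:.0f} V line: {line_current(1600.0, volts):.1f} A at full load")

    # 120 V line: 13.3 A at full load  (close to a typical 15 A US circuit)
    # 230 V line: 7.0 A at full load

At full load on a 120V line, the draw sits uncomfortably close to the limit of a standard US circuit, while a 230V line has plenty of headroom.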


About to take the side panels off. This little one wants to play.

Both of the tempered glass side panels are held on by nine thumb screws each, which sit on rubber stands on the inside of the case. Unscrewing these was easy enough, though it’s one of the slowest ways to open a case I’ve ever come across.

Now to the inside of the system. The LGA3647 socket holds the Xeon W-3175X processor, which is capped with an Asetek 690LX-PN liquid cooler specifically designed for the workstation market. This connects to a 360mm liquid cooling radiator, paired with three high-power fans (I’m pretty sure they’re Delta) that sound like a jet engine above 55ºC.

Intel half-populated the memory slots with six 8GB Samsung DDR4-2666 RDIMMs, one per channel, for a total of 48 GB of memory – likely the lowest configuration one of these CPUs will ever be paired with. The graphics card is a GIGABYTE GTX 1080, specifically the GV-N1080TTOC-8GD, which requires one 8-pin power connector.
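Even at only 48 GB, populating one DIMM per channel keeps all six memory channels of the W-3175X active. A quick sketch of the peak theoretical bandwidth this configuration offers (my own numbers from the DDR4-2666 spec, not a measured result):

    # Peak theoretical memory bandwidth, six channels of DDR4-2666.
    transfers_per_sec = 2666e6    # DDR4-2666: 2666 MT/s per channel
    bytes_per_transfer = 8        # 64-bit channel = 8 bytes per transfer
    channels = 6                  # the W-3175X has six memory channels

    per_channel = transfers_per_sec * bytes_per_transfer   # ~21.3 GB/s
    total = per_channel * channels                         # ~128 GB/s
    print(f"{per_channel/1e9:.1f} GB/s per channel, {total/1e9:.0f} GB/s total")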

As for the motherboard, the ASUS Dominus Extreme, we’ve detailed it in previous coverage; however, it’s worth noting that the big block at the top of the board is actually the heatsink for the 32-phase VRM. It’s a beast. Here is an ASUS build using this motherboard with a liquid cooler on the CPU and VRM:


The build at ASUS’ suite at CES 2019

There’s a little OLED display to the left, which is a full-color display useful for showing BIOS codes and CPU temperatures when in Windows. When the system is off, it goes through a short 15-second cycle with the logo:

I’m pretty sure users can put their own GIFs (perhaps within some limits) on the display during normal run time using ASUS software.

The rear of the case is quite neat, showing part of the back of the motherboard and the fan controller. At the bottom we have an EVGA 1600W T2 80PLUS Titanium power supply, which is appropriate for this build. Unfortunately, Intel only supplied the cables it actually used with the system, making it difficult to expand to multiple GPUs, which is what a system like this would ultimately end up with.

For storage, Intel provided an Optane 905P 480GB U.2 drive, which unfortunately had so many issues with the default OS installation (and then failed my own OS installation) that I had to remove it and debug it another day. Instead, I put in my own Crucial MX200 1TB SATA SSD, which we normally use for CPU testing, and installed the OS directly on that. ASUS has a feature in the BIOS that automatically pushes a software install to initiate driver updates without the need for a driver DVD – this ended up being very helpful.

Overall, the system cost is probably on the order of $7000:

Intel Reference System

  Component     Item                              List Price
  CPU           Intel Xeon W-3175X                $2999
  CPU Cooler    Asetek 690LX-PN                   $260
  Motherboard   ASUS Dominus Extreme              $1500 (est.)
  Memory        6 x 8GB Samsung DDR4-2666 RDIMM   $420
  Storage       Intel Optane 905P 480 GB U.2      $552
  Video Card    GIGABYTE GTX 1080 OC 8GB          $550
  Chassis       Anidees AI Crystal XL AR          $300
  Power Supply  EVGA 1600W T2 Titanium            $357
  Total                                           $6938

However, this is with a minimal amount of memory, only one GTX 1080, and a mid-sized U.2 drive. If we add in liquid cooling, a pair of RTX 2080 Ti graphics cards, 12 x 16GB of DDR4, and some proper storage, the price could easily creep over $10k-$12k, before adding on the system builder extras. The version of this system we saw at the Digital Storm booth at CES, the Corsa, was around $20k.
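For reference, the component prices in the table above do sum to the quoted total; a quick sanity check (prices as listed, with the motherboard figure being an estimate):

    # Sum of the list prices from the table above (USD).
    prices = {
        "Intel Xeon W-3175X": 2999,
        "Asetek 690LX-PN": 260,
        "ASUS Dominus Extreme": 1500,    # estimated list price
        "6 x 8GB Samsung DDR4-2666": 420,
        "Intel Optane 905P 480GB U.2": 552,
        "GIGABYTE GTX 1080 OC 8GB": 550,
        "Anidees AI Crystal XL AR": 300,
        "EVGA 1600W T2 Titanium": 357,
    }
    print(f"Total: ${sum(prices.values())}")   # Total: $6938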

Comments

  • FMinus - Friday, February 1, 2019

    Not really. 3D rendering is done on specialized render farms; the modeling work, key framing, etc. can be done on any decent modern mainstream CPU, and especially well on any modern HEDT chip. Prototype and preview locally, and once satisfied, send it out to render properly.
  • eastcoast_pete - Wednesday, January 30, 2019

    The only scenario where this or similar Xeons outperform the AMD lineup is if (!) the key application(s) in question make good use of AVX-512. In those situations, Intel is still way ahead. In all others, a similar or lower-priced Threadripper will give more bang for the buck.
  • Tango - Wednesday, January 30, 2019

    There are scenarios in which this is perfect, and in fact my research department is looking into acquiring two of them. Our algorithms include both highly parallelized sections and completely non-parallelizable ones where clock speed dominates. We estimate models that take a whole weekend to spit out a result, and the alternative is paying top money for supercomputer time.
    At $3000 it is a steal. The problem is half of Wall Street will be sending orders to get one, since the use case is similar for high-frequency trading applications.
  • MattZN - Wednesday, January 30, 2019

    I expect all the review sites will redo their 2990WX benchmarks once Microsoft is able to fix the scheduler. The question is really... how long will it take Microsoft to fix their scheduler? That said, nobody should be expecting massive improvements. Some of the applications will improve a ton, but not all of them. It will be more like a right-sizing closer to expected results and less like hitting the ball out of the park.

    -Matt
  • BGADK - Wednesday, January 30, 2019

    Little professional software exists for Linux, so these machines WILL run Windows for the most part.
  • cmcl - Thursday, January 31, 2019

    Agreed that there is more professional software for Windows, but in visual effects (where I work), 90% of our workstations (and all of our render nodes) run on Linux (24-core workstations with P6000s), running Nuke, Maya, etc. Apart from the gaming benchmarks (and who would buy one of these for gaming?), a lot of the tests could be done in Linux, as that software runs on Linux.
  • Icehawk - Thursday, January 31, 2019

    Workstations aside, these mega-core beasts are run as VM hosts on bare metal. I don't have a single server here that just runs an OS and app suite; it's not 2000 anymore – everything is virtualized as much as possible.
  • WasHopingForAnHonestReview - Wednesday, January 30, 2019

    Holy shit. AMD absolutely bent Intel over on this one. The price-for-performance ratio is overwhelmingly in AMD's favor! Intel would have released this for $8k if the 2990WX wasn't so competitive!

    WOW!
  • GreenReaper - Thursday, January 31, 2019

    They probably wouldn't have released it at all. As noted, most of these could easily be server cores on which they could make plenty more money. This appears to be largely a PR effort.
  • jcc5169 - Wednesday, January 30, 2019

    Who in the world would buy this over-priced piece-of-crap?
