Board Features

The ASUS ROG Strix X299-XE Gaming, while a mouthful to actually say, has a comprehensive set of options for a mid-range gaming motherboard.

ASUS ROG Strix X299-XE Gaming
Warranty Period: 3 Years
Product Page: Link
Price: $369.99 (Amazon US)
Size: ATX
CPU Interface: LGA2066
Chipset: Intel X299
Memory Slots (DDR4): Eight DDR4, supporting up to 128GB
  Quad channel
  Up to DDR4-4133 (quad and dual channel)
Network Connectivity: 1 x Intel I219-V GbE
Onboard Audio: Realtek S1220A
PCIe Slots for Graphics (from CPU): 3 x PCIe 3.0
  - 44-lane CPU: x16/x16/x8
  - 28-lane CPU: x16/x8/x1
  - 16-lane CPU: x8/x8/x1
PCIe Slots for Other (from PCH): 2 x PCIe 3.0 x4
Onboard SATA: 8 x SATA, RAID 0/1/5/10
Onboard SATA Express: None
Onboard M.2: 1 x PCIe 3.0 x4 and SATA mode
  1 x PCIe 3.0 x4 mode only
Onboard U.2: None
USB 3.1: ASMedia ASM3142
  1 x Type-A
  1 x Type-C
  1 x Onboard Header
USB 3.0: Chipset
  4 x Back Panel
  4 x Onboard Headers
USB 2.0: Chipset
  2 x Back Panel
  2 x Onboard Headers
Power Connectors: 1 x 24-pin ATX
  1 x 8-pin CPU
  1 x 4-pin CPU (optional)
Fan Headers: 1 x 4-pin CPU
  1 x 4-pin CPU OPT
  1 x AIO PUMP
  1 x W PUMP+
  2 x Chassis
IO Panel: 1 x LAN (RJ45) port
  2 x USB 3.1 10 Gbps (Type-A and Type-C)
  4 x USB 3.0
  2 x USB 2.0
  1 x S/PDIF out
  5 x Audio Jacks
  1 x USB BIOS Flashback Button
  1 x ASUS Wi-Fi GO Module

The PCIe lane arrangement looks very odd at first glance, with ASUS seemingly emphasising single GPU bandwidth for Skylake-X CPUs, expecting users to use add-in cards for the other slots. Aside from the large heatsinks, nothing immediately stands out on this board: sure it has an Intel NIC, 802.11ac Wi-Fi, and an ASUS-specific S1220A audio codec, but unlike other boards in this price range, it doesn't have that extra 'knock-out' feature that separates it from other products. 
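For quick reference, the full-length slot layouts from the specification table can be captured in a small lookup. This is our own sketch (the slot-notation helper and CPU examples are our additions, not ASUS documentation):

```python
# Sketch: full-length slot widths (in PCIe 3.0 lanes) for each
# Skylake-X / Kaby Lake-X CPU class, as listed in the spec table above.
SLOT_LAYOUTS = {
    44: (16, 16, 8),  # 44-lane parts, e.g. Core i9-7900X and up
    28: (16, 8, 1),   # 28-lane parts, e.g. Core i7-7800X/7820X
    16: (8, 8, 1),    # 16-lane Kaby Lake-X parts
}

def slot_string(cpu_lanes: int) -> str:
    """Render a layout in the familiar x16/x16/x8 notation."""
    return "/".join(f"x{w}" for w in SLOT_LAYOUTS[cpu_lanes])

for lanes in sorted(SLOT_LAYOUTS, reverse=True):
    print(f"{lanes}-lane CPU: {slot_string(lanes)}")
```

Note that on the 44-lane layout, 40 of the CPU's lanes go to the three full-length slots, which is what drives the single-GPU-first arrangement discussed above.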

Test Bed

As per our testing policy, we take a high-end CPU suitable for the motherboard, released during the socket's initial launch, and equip the system with a suitable amount of memory running at the processor's maximum supported frequency. This is also typically run at JEDEC subtimings where possible. Some users are not keen on this policy, stating that the maximum supported frequency is sometimes quite low, that faster memory is available at a similar price, or that JEDEC speeds can be prohibitive for performance. While these comments make sense, ultimately very few users apply memory profiles (XMP or otherwise), as they require interaction with the BIOS, and most users will fall back on JEDEC supported speeds - this includes home users as well as industry, where builders might want to shave a cent or two off the cost or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.

Readers of our motherboard review section will have noted the trend in modern motherboards to implement a form of MultiCore Enhancement / Acceleration / Turbo (read our report here). This does several things, including improving benchmark results at stock settings (not entirely needed if overclocking is an end-user goal), at the expense of power and heat. It also gives, in essence, an automatic overclock, which may be against what the user wants. Our testing methodology is 'out-of-the-box', with the latest public BIOS installed and XMP enabled, and thus subject to the whims of this feature. It is ultimately up to the motherboard manufacturer to take this risk - and manufacturers take risks in their setups on every product (think C-state settings, USB priority, DPC latency/monitoring priority, overriding memory subtimings at JEDEC). Processor speed is part of that risk: even if no overclocking is planned, the motherboard can affect how fast that shiny new processor runs, which can be an important factor in a system build.

Test Setup
Processor: Intel Core i9-7900X (10C/20T, 3.3 GHz, 140W)
Motherboard: ASUS ROG Strix X299-XE Gaming (BIOS version 0802)
Cooling: Corsair H115i
Power Supply: Corsair HX750
Memory: Corsair Vengeance LPX 4x8GB DDR4-2666 CL16
  Corsair Vengeance 4x4GB DDR4-3200 CL16
Memory Settings: DDR4-2666 16-18-18-35 2T
Video Card: ASUS Strix GTX 980
Hard Drive: Crucial MX300 1TB
Optical Drive: TSST TS-H653G
Case: Open Test Bed
Operating System: Windows 10 Pro 64-bit
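For context on the memory configuration, the theoretical peak bandwidth of quad-channel DDR4-2666 can be worked out with simple arithmetic (our own back-of-the-envelope sketch, using decimal GB/s):

```python
def ddr4_bandwidth_gbs(transfer_rate_mts: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s: each 64-bit channel moves
    8 bytes per transfer at the given transfer rate (MT/s)."""
    return transfer_rate_mts * 1e6 * 8 * channels / 1e9

# Quad-channel DDR4-2666, as in the test setup above: ~85.3 GB/s
print(ddr4_bandwidth_gbs(2666, 4))
```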


Many thanks to...

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this testbed specifically but is used in other testing.

Thank you to ASUS for providing us with GTX 980 Strix GPUs. At the time of release, the Strix brand from ASUS was aimed at silent running, or to use the marketing term: '0dB Silent Gaming'. This allows the card to disable its fans when the GPU is under low loads and well within temperature specifications. These cards pair the GTX 980 silicon with ASUS' DirectCU II cooler and a 10-phase digital VRM aimed at high-efficiency power conversion. Along with the card, ASUS bundles its GPU Tweak software for overclocking and streaming assistance.

The GTX 980 uses NVIDIA's GM204 silicon, built on the Maxwell architecture. The die packs 5.2 billion transistors into 398 mm², manufactured on TSMC's 28nm process. The GTX 980 uses the full GM204 core, with 2048 CUDA cores, 64 ROPs, and a 256-bit memory bus to GDDR5. The official power rating for the GTX 980 is 165W.

The ASUS GTX 980 Strix 4GB (or the full name of STRIX-GTX980-DC2OC-4GD5) runs a reasonable overclock over a reference GTX 980 card, with frequencies in the range of 1178-1279 MHz. The memory runs at stock, in this case, 7010 MHz. Video outputs include three DisplayPort connectors, one HDMI 2.0 connector, and a DVI-I.
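Those memory figures imply the card's bandwidth directly: effective data rate times bus width in bytes. A quick sketch of the arithmetic (our own calculation, not an ASUS or NVIDIA figure):

```python
def gddr5_bandwidth_gbs(effective_mts: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s from the effective data rate (MT/s)
    and the memory bus width in bits."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

# 7010 MHz effective on the Strix card's 256-bit bus: ~224.3 GB/s
print(gddr5_bandwidth_gbs(7010, 256))
```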

Further Reading: AnandTech's NVIDIA GTX 980 Review


Thank you to Crucial for providing us with MX300 SSDs. Crucial stepped up to the plate as our benchmark list grew larger with newer benchmarks and titles, and the 1TB MX300 units are strong performers. Based on Marvell's 88SS1074 controller and using Micron's 384Gbit 32-layer 3D TLC NAND, these are 7mm-high, 2.5-inch drives rated for 92K random read IOPS and 530/510 MB/s sequential read/write speeds.

The 1TB models we are using here support TCG Opal 2.0 and IEEE-1667 (eDrive) encryption and have a 360TB rated endurance with a three-year warranty.
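The 360TB rating can be translated into the more familiar drive-writes-per-day figure over the warranty period (our own arithmetic, not a Crucial specification):

```python
def dwpd(endurance_tbw: float, capacity_tb: float, warranty_years: float) -> float:
    """Drive writes per day implied by a TBW endurance rating
    spread evenly over the warranty period."""
    return endurance_tbw / (capacity_tb * warranty_years * 365)

# 360 TBW on a 1TB drive over a three-year warranty: ~0.33 DWPD
print(round(dwpd(360, 1.0, 3), 2))
```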

Further Reading: AnandTech's Crucial MX300 (750 GB) Review


Thank you to Corsair for providing us with Vengeance LPX DDR4 Memory, HX750 Power Supply, and H115i CPU Cooler

Corsair kindly sent a 4x8GB DDR4-2666 kit of its Vengeance LPX low-profile, high-performance memory for our stock testing. The heatsink is made of pure aluminum to help remove heat from the sticks, and the modules use an eight-layer PCB. The low-profile heatsink design helps the modules fit where there may not be room for a tall heat spreader - think an SFF case or a large CPU air cooler. Timings on this specific kit come in at 16-18-18-35. The Vengeance LPX line supports XMP 2.0 profiles for easily setting the speed and timings, and it comes with a limited lifetime warranty.

Powering the test system is Corsair's HX750 power supply. The HX750 is a dual-mode unit, able to switch from a single 12V rail (62.5A/750W) to five 12V rails (40A max each), and is also fully modular. It has a typical selection of connectors, including dual EPS 4+4 pin connectors, four PCIe connectors, a whopping 16 SATA power leads, and four 4-pin Molex connectors.
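A quick sanity check on those rail numbers (our own arithmetic, not Corsair documentation):

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Power available on a rail at a given current limit."""
    return amps * volts

# Single-rail mode: 62.5A on +12V covers the full 750W rating
print(rail_watts(62.5))  # 750.0
# Multi-rail mode: each of the five rails is capped at 40A (480W),
# but the combined +12V output is still limited to 750W overall.
print(rail_watts(40.0))  # 480.0
```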

The 135mm fluid dynamic bearing fan remains off until the unit reaches roughly 40% load, offering complete silence in light workloads. The HX750 comes with a ten-year warranty.

In order to cool these high-TDP HEDT CPUs, Corsair sent over its latest and largest AIO, the H115i. This closed-loop system uses a 280mm radiator with two 140mm SP140L PWM-controlled fans. The pump/block combination mounts to all modern CPU sockets. Users can also integrate this cooler with the Corsair Link software via USB for more control and options.

Comments

  • MrPoletski - Monday, December 11, 2017 - link

    I can totally see ten of thousands of dollars being spent on this board and a corresponding PC of worthwhile power so the owner can play master of orion 2, nes emulators and minecraft. I know, I'm one of those nobs.
  • peevee - Monday, December 11, 2017 - link

    Somebody has to seriously grow up instead of wasting $400 for a gaming MB (or a few thou for a gaming computer).
  • ddrіver - Monday, December 11, 2017 - link

    "ten of thousands of dollars"? Sounds a bit excessive given that 1 (or 2, where possible) of the most expensive components available still doesn't really get you to $10K. Unless you're buying by sorting for the most expensive anything and taking as many as you can fit in a case.
    Next thing you're going to brag you pay a guy to comment for you.
  • DanNeely - Monday, December 11, 2017 - link

    " The smaller slots are an x1 and two x4 slots (the first runs at 1x) powered by the chipset for add-in cards. "

    This seems backwards since the first x4 is always free to put a card in while the second is blocked by the 2nd GPU.
  • Joe Shields - Monday, December 11, 2017 - link

    Hey Dan, I don't blame you for thinking this way. However, from the specifications it says this...:

    1. PCIEX4_1 max. at x1 mode

    Which is the same for all 44/28/16 lane CPUs.
  • DanNeely - Monday, December 11, 2017 - link

    ok. Just wanted to confirm it was a screwy design on Asus's part, not a transcription error.
  • SanX - Monday, December 11, 2017 - link

    Where the hell are dual CPU mobos? Intel and AMD don't like to sell more chips?
  • Dr. Swag - Monday, December 11, 2017 - link

    Intel has never sold non Xeon products that can be put in dual CPU mobos.
  • PeachNCream - Monday, December 11, 2017 - link

    Google says there were dual Pentium, Pentium Pro, Pentium II, Pentium III, and so forth motherboards around so Intel has sold non-Xeon products for dual socket/slot motherboards.
  • DanNeely - Monday, December 11, 2017 - link

    With the exception of the P3 all of those predated the Xeon branding. Dual socket P3 was presumably transitional in their rebranding.

    For modern chips, on the Intel side mainstream parts have neither the on die hardware, nor chip socket support for multi-socket setups because doing so would inflate the costs of the 99.9% of systems that are single socket.

    I'm less sure of the situation with AMD. I suspect that due to the level of die sharing they're doing between TR and Epyc that TR cpu dies themselves have the hardware needed to talk to a second CPU socket. However I'm skeptical that they've also paid extra for a larger/more complex socket on mainstream TR parts. It'd raise costs for the 99.9% of uni-socket systems and cut into sales of their more profitable Epyc line.

    More generally multi-core CPUs have been heavily eroding the market for multi-socket chips over the last 15 years. They require more complex boards, more complex CPUs, in many cases (ie any that need threads on different sockets to talk to each other) they also require additional programming work to perform at their maximum capacity (AMD has a NUMA hit for new multi die but single socket chips, however its worse for their dual socket ones). All of that means that almost any time you can get a single socket system with a suitable performance level it will be more cost effective than a similar dual (never mind quad or 8way) socket system. With dozens of cores available on Intel and AMD's current high end platforms small core count dual socket systems rarely make sense outside of cases where you need huge amounts of ram and don't really care about CPU performance.
