Given the price of the ASUS ROG Strix X299-XE Gaming, it seems to be targeted as a 'mid-range+' board for an enthusiast-level gamer. The enthusiast gamer tends to take things more seriously than a casual gamer and can spend a bit more on high-end parts. These users cross the blurred line between the HEDT platform and the mainstream, often skipping the Kaby Lake-X processors built for the X299 platform. Instead of paying $350 or less for an i7-7740X or i5-7640X, they go straight for a higher core count CPU such as the six-core i7-7800X ($389) or the eight-core i7-7820X ($599). With this come more PCIe lanes, opening up the platform for multi-GPU capabilities and more freedom for M.2-based devices. Over the non-XE version, the XE version of this board simply adds a larger, slightly restyled heatsink, along with a 40mm fan and a mounting bracket for that updated heatsink. No other changes between the two were made.

The ROG Strix X299-XE Gaming has most of the bells and whistles we expect as standard on the X299 platform. USB 3.1 (10 Gbps) duties are handled by the ASMedia ASM3142 controller, which uses a bit less power and offers better performance than the ASM2142 controller seen on Z270 boards. It has a single wired LAN controller, the Intel I219-V, a proven performer, to which ASUS adds ESD protection and traffic management for prioritizing streams to optimize performance in online games. Control over the 8-phase VRM is handled by a well-regarded IR (now Infineon) IR35201 8-channel controller, ensuring accurate voltage is delivered to the processor.

The build quality on the ROG Strix was quite good. The larger heatsink has lines cut into it, allowing more heat to escape and air to get in when the fan is mounted on top. Both the front and rear VRM heatsinks were mounted well and made contact with everything they were supposed to, which kept the VRM running within specification during our testing. The oversized chipset and M.2 heatsink also worked well. RGB LEDs are integrated tastefully, with only the ROG feature and the rear panel IO shroud carrying them. The ROG RGB LED feature is customizable using a 3D printer, with ASUS providing instructions and templates on its website.

Performance out of the box is where the ASUS fell a bit short. Though it did well in our main system tests such as power and DPC latency, it was middle of the pack in tightly coupled tests, and lower in a couple of others, due to how the board handles frequency at stock. In our testing, the Core i9-7900X boosted to 3.6 GHz on all cores - it appears the board simply uses the Intel defaults, while other boards we have looked at so far enable some minor overclocking features by default. This gives those boards a slight increase in scores over the default Intel boost tables. All of this goes out the window when overclocking, where the Strix X299-XE Gaming was a solid performer, matching the 4.5 GHz our CPU can achieve. The auto-overclocking tools, however, were a little aggressive, giving stability issues.

The ASUS ROG Strix X299-XE Gaming has most, if not all, of the features an enthusiast-level gamer needs to get up and fragging. It uses a high-end audio codec, the latest connectivity in USB 3.1 (10 Gbps) Type-A and Type-C ports, multi-GPU support, and multiple M.2 slots. The XE suffix adds a beefier VRM heatsink and a fan to better support 165W CPUs and higher overclocks.

However, the difference is in the fine details. At $50 more than the standard Strix, and $370 overall, it doesn't strike fear into the $350-$400 price bracket. In fact, for $323 (around the cost of the non-XE ASUS ROG Strix), the ASRock X299 Taichi XE has the beefed-up heatsink, Wi-Fi, three M.2 slots, a high-end audio codec, and more, at almost $50 less, albeit with fewer gaming-oriented software features. At $370, the Strix XE is missing a key factor to make it an obvious sale.

Comments

  • MrPoletski - Monday, December 11, 2017 - link

    I can totally see ten of thousands of dollars being spent on this board and a corresponding PC of worthwhile power so the owner can play master of orion 2, nes emulators and minecraft. I know, I'm one of those nobs.
  • peevee - Monday, December 11, 2017 - link

    Somebody has to seriously grow up instead of wasting $400 for a gaming MB (or a few thou for a gaming computer).
  • ddrіver - Monday, December 11, 2017 - link

    "ten of thousands of dollars"? Sounds a bit excessive given that 1 (or 2, where possible) of the most expensive components available still doesn't really get you to $10K. Unless you're buying by sorting for the most expensive anything and taking as many as you can fit in a case.
    Next thing you're going to brag you pay a guy to comment for you.
  • DanNeely - Monday, December 11, 2017 - link

    " The smaller slots are an x1 and two x4 slots (the first runs at 1x) powered by the chipset for add-in cards. "

    This seems backwards since the first x4 is always free to put a card in while the second is blocked by the 2nd GPU.
  • Joe Shields - Monday, December 11, 2017 - link

    Hey Dan, I don't blame you for thinking this way. However, from the specifications it says this...:

    1. PCIEX4_1 max. at x1 mode

    Which is the same for all 44/28/16 lane CPUs.
  • DanNeely - Monday, December 11, 2017 - link

    ok. Just wanted to confirm it was a screwy design on Asus's part, not a transcription error.
  • SanX - Monday, December 11, 2017 - link

    Where the hell are dual CPU mobos? Intel and AMD don't like to sell more chips?
  • Dr. Swag - Monday, December 11, 2017 - link

    Intel has never sold non Xeon products that can be put in dual CPU mobos.
  • PeachNCream - Monday, December 11, 2017 - link

    Google says there were dual Pentium, Pentium Pro, Pentium II, Pentium III, and so forth motherboards around so Intel has sold non-Xeon products for dual socket/slot motherboards.
  • DanNeely - Monday, December 11, 2017 - link

    With the exception of the P3 all of those predated the Xeon branding. Dual socket P3 was presumably transitional in their rebranding.

    For modern chips, on the Intel side mainstream parts have neither the on die hardware, nor chip socket support for multi-socket setups because doing so would inflate the costs of the 99.9% of systems that are single socket.

    I'm less sure of the situation with AMD. I suspect that due to the level of die sharing they're doing between TR and Epyc that TR cpu dies themselves have the hardware needed to talk to a second CPU socket. However I'm skeptical that they've also paid extra for a larger/more complex socket on mainstream TR parts. It'd raise costs for the 99.9% of uni-socket systems and cut into sales of their more profitable Epyc line.

    More generally multi-core CPUs have been heavily eroding the market for multi-socket chips over the last 15 years. They require more complex boards, more complex CPUs, in many cases (ie any that need threads on different sockets to talk to each other) they also require additional programming work to perform at their maximum capacity (AMD has a NUMA hit for new multi die but single socket chips, however its worse for their dual socket ones). All of that means that almost any time you can get a single socket system with a suitable performance level it will be more cost effective than a similar dual (never mind quad or 8way) socket system. With dozens of cores available on Intel and AMD's current high end platforms small core count dual socket systems rarely make sense outside of cases where you need huge amounts of ram and don't really care about CPU performance.
