Intel Core i9-7980XE and Core i9-7960X Conclusion

In the 2000s, we had the frequency wars. Trying to pump all the MHz into a single core ended up mega-hurting one company in particular, until the push was made to multi-core and efficient systems. Now in the 2010s, we have the Core Wars. You need to have at least 16 cores to play, and be prepared to burn power like never before. With today’s launch, Intel has kicked their high-end desktop offerings up a notch, both in performance and power consumption.

Pun density aside, both Intel and AMD are pursuing a many-core strategy when it comes to the high-end desktop. Both manufacturers have the same story in their pockets: users need more multi-threaded performance, either for intense compute or for a heavy scoop of compute-based multi-tasking, including streaming, encoding, rendering, and high-intensity minesweeper VR all at the same time. With so many cores, single-threaded performance under load can be lower than usual, so both companies focus on throughput rather than responsiveness.

The Core i9-7980XE is Intel's new top-of-the-line HEDT processor. With 18 cores and a 165W TDP, this processor can reach 4.4 GHz at top turbo, or 3.4 GHz on all-core turbo. At $1999, it becomes the most expensive consumer-focused processor on the market. It is priced at this level for two reasons: first, so that it doesn't cannibalize Intel's enterprise sales, which carry a higher profit margin; second, because Intel's product stack now fills price points from $300 all the way to $2000, and with this being Intel's best consumer processor, the company is hoping it will still sell like hot cakes.

Our performance numbers show that Intel's best multi-core consumer processor is deserving of that title. In most of our multi-core tests, Intel has a clear lead over AMD: a combination of more cores and higher single-threaded performance compensates for any frequency deficit. For anyone with hardcore compute needs, Intel gets you to the finish line first in almost every scenario. AMD does win a few benchmarks, something we also saw when Johan tested the enterprise versions of Intel's and AMD's CPUs in his review, where he cited AMD's FP unit as the leading cause of the advantage.

There are going to be three cautionary flags to this tale based on our testing today, for anyone looking at Skylake-X, and they all start with the letter P: Power, Platform, and Price.

Power: In our testing, Intel's cores can consume between 7.5W and 20W each, which is a large range. When all the cores, plus the mesh and DRAM controllers, are under load, the Core i9-7980XE draws over 190W, well above its 165W TDP rating. This will concern users who take the TDP value as a proxy for power consumption, and anyone thinking of overclocking should also consider investing in a custom cooling loop. The processors from AMD consume ~177W at load, which, with two fewer cores, is in the same ballpark.
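As a back-of-the-envelope check on those numbers, the measured figures can be added up. The per-core draw and the uncore (mesh plus DRAM controller) split below are illustrative assumptions chosen to land near our measured package total; only the >190W package figure and the 165W TDP come from our testing.

```python
# Rough power model for the Core i9-7980XE under all-core load.
# WATTS_PER_CORE and UNCORE_WATTS are assumed values for illustration;
# the measured quantities are the package total (>190 W) and the 165 W TDP.
CORES = 18
WATTS_PER_CORE = 9.0   # assumed; observed per-core range was ~7.5 W to ~20 W
UNCORE_WATTS = 30.0    # assumed mesh + DRAM controller draw

package_watts = CORES * WATTS_PER_CORE + UNCORE_WATTS
tdp = 165.0

print(f"Estimated package power: {package_watts:.0f} W")
print(f"Over TDP by: {package_watts - tdp:.0f} W")
```

Even with a modest 9W per core, the core count alone pushes the package well past its TDP rating once the uncore is included.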

Platform: X299 motherboards are designed to handle Kaby Lake-X, Skylake-X LCC, and Skylake-X HCC processors. Almost all of the motherboards should be geared towards the high-end processors, which makes platform compatibility mostly a non-issue; however, testing by others recommends some form of active cooling on the power delivery. When investing $1999 in a processor, especially with overclocking in mind, a good motherboard is needed, not just the bargain-basement model. Some users will point to the competition, where AMD's processors offer enough lanes for three x16 GPUs and three PCIe 3.0 x4 storage devices from the processor at the same time, rather than reduced bandwidth for 3-way GPU setups and storage routed through the chipset.
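The lane math behind that comparison can be spelled out. The lane counts are from the respective platform specifications (Threadripper exposes 60 usable CPU lanes, with 4 of its 64 reserved for the chipset link; Skylake-X HCC parts expose 44); the specific allocation below is one possible layout, not a mandated one.

```python
# PCIe 3.0 lane budgets from the CPU (not counting chipset lanes).
threadripper_lanes = 60  # 64 total, 4 reserved for the X399 chipset link
skylake_x_lanes = 44     # Skylake-X HCC parts

# One possible Threadripper layout: three x16 GPUs plus three x4 NVMe
# drives, all attached directly to the CPU.
gpu_lanes = 3 * 16      # 48
storage_lanes = 3 * 4   # 12
total_used = gpu_lanes + storage_lanes

print(f"Layout uses {total_used} of {threadripper_lanes} CPU lanes")
print(f"Skylake-X shortfall for the same layout: {total_used - skylake_x_lanes} lanes")
```

The same layout on Skylake-X would need 16 more lanes than the CPU provides, which is why 3-way GPU configurations drop to reduced bandwidth and storage moves behind the chipset.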

Price: $1999 is a new record for consumer processors. Intel is charging this much because it can: this processor does take the absolute workstation performance crown. For high performance, that is usually enough; the sort of users interested in this level of performance are not overly concerned with performance per dollar, especially if a software license costs nearer $10k. For everyone else, however, unless you can take advantage of TSX or AVX-512, the price is exorbitant, and all arrows point towards AMD instead. Half the price is hard to ignore.
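The performance-per-dollar argument is easy to sketch. The throughput scores below are hypothetical placeholders (normalized to the 1950X), not our measured results; only the launch prices come from the product stacks.

```python
# Performance-per-dollar sketch. "score" values are hypothetical,
# normalized throughput figures for illustration only; the prices
# are the launch prices of each part.
cpus = {
    "Core i9-7980XE":     {"price": 1999, "score": 1.25},
    "Core i9-7960X":      {"price": 1699, "score": 1.15},
    "Threadripper 1950X": {"price":  999, "score": 1.00},
}

for name, d in cpus.items():
    # Express value as normalized score per $1000 spent.
    value = d["score"] / d["price"] * 1000
    print(f"{name}: {value:.2f} per $1000")
```

Even granting Intel a sizeable throughput lead, the doubled price means the value metric favors AMD unless absolute time-to-finish is all that matters.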

Users looking at the new processors for workstation use should consider the three Ps. It's not an easy task, and the answer will depend highly on the user's specific workflow. Our recommendations ultimately come down to three suggestions:

  • If a user needs the absolute best workstation processor without ECC, then get Skylake-X.
  • If a user needs ECC or 512GB of DRAM, Xeon-W looks a better bet.
  • If a user has a strict budget or wants another GPU for compute workloads, look at Threadripper.

For those weighing up the Core i9-7960X against the Core i9-7980XE, part of me wants to say 'if you're going for cores and prepared to spend this much, then go all the way'. If the main purpose is throughput, for the benchmarks that matter, the 7980XE is likely to provide a benefit. For multi-taskers, the benefits are less clear, and it would be interesting to get the Core i9-7940X and Core i9-7920X in for testing.


152 Comments


  • mapesdhs - Tuesday, September 26, 2017 - link

    In that case, using Intel's MO, TR would have 68. What Intel is doing here is very misleading.
  • iwod - Monday, September 25, 2017 - link

    If we factor in the price of the whole system, rather than just the CPU (AMD's motherboards tend to be cheaper), then AMD is doing pretty well here. I am looking forward to next year's 12nm Zen+.
  • peevee - Monday, September 25, 2017 - link

    From the whole line, only the 7820X makes sense from a price/performance standpoint.
  • boogerlad - Monday, September 25, 2017 - link

    Can an IPC comparison be done between this and Skylake-S? Skylake-X LCC lost in some cases to Skylake-S, but is that due to less L3 cache, or because the L3 cache is slower?
  • IGTrading - Monday, September 25, 2017 - link

    There will never be an IPC comparison of Intel's new processors, because all it would do is showcase how Intel's IPC actually went down from Broadwell and further down from Kaby Lake.

    Intel's IPC is a downtrend affair, and that is not really good for clicks and internet traffic.

    Even worse: it would probably upset Intel's PR, and that website would surely not be receiving any early review samples.
  • rocky12345 - Monday, September 25, 2017 - link

    Great review, thank you. This is how a proper review is done. Those benchmarks we saw of the 18-core i9 last week were a complete joke, since the guy had the chip overclocked to 4.2GHz on all cores, which really inflated the scores vs a stock Threadripper 16/32 CPU - and was very unrealistic from a cooling standpoint for end users.

    This review had stock vs stock, and we got to see how both CPU camps performed in their out-of-the-box states. I was a bit surprised the mighty 18-core CPU did not win more of the benches, and when it did, it was not by much most of the time. So, a $1K CPU vs a $2K CPU, and the mighty 18-core did not perform like it was worth $1K more than the AMD 1950X, or the 1920X for that matter. Yes, the mighty i9 was a bit faster, but not $1,000 faster, that's for sure.
  • Notmyusualid - Thursday, September 28, 2017 - link

    I too am interested to see 'out of the box' performance.

    But if you think ANYONE would buy this and not overclock - you'd have to be out of your mind.

    There are people out there running 4.5GHz on all cores, if you look for it.

    And what is with all this 'unrealistic cooling' I keep hearing about? You fit the cooling that suits your CPU. My 14C/28T CPU runs at 162W 24/7 running BOINC, and is attached to a 480mm four-fan all-copper radiator; hand on my heart, I don't think it has ever exceeded 42C, and it sits at 38C mostly.

    If I had this 7980XE, all I'd have to do is increase pump speed I expect.
  • wiyosaya - Monday, September 25, 2017 - link

    Personally, I think the comments about people who spend $10K on licenses having the money to go for the $2K part are not necessarily correct. Companies will spend that much on a license because they really do not have any other option. The high-end Intel part gets 30 to maybe 50 percent more performance on a select few benchmarks. I am not going to debate that that kind of performance improvement is significant, even though it is limited to a few benchmarks; however, to me that kind of increase comes at an extreme price premium, and companies that do their research on the capabilities of each platform vs price are not, IMO, likely to throw away money on a part just for bragging rights. IMO, a better place to spend that extra money would be on RAM.
  • HStewart - Monday, September 25, 2017 - link

    In my last job, they spent over $100k on a software versioning system.

    In workstation/server world they are looking for reliability, this typically means Xeon.

    Gaming computers are different: usually kids want them and have less money, and they always need the latest and greatest without caring about reliability - when a new graphics card comes out, they replace it. AMD is focusing on that market - which includes the Xbox One and PS4.

    For me, I'm looking for something I can depend on and know will be around for a while. Not something that slaps multiple dies together to claim bragging rights for more cores.

    Competition is good, because it keeps Intel on its feet. I think if AMD had not purchased ATI, there would be no competition for Intel at all in the x86 market. But it's not all smart either - would anybody seriously put an AMD graphics card on an Intel CPU?
  • wolfemane - Tuesday, September 26, 2017 - link

    Hate to burst your bubble, but companies are cheap in terms of staying within budgets, especially up-and-coming corporations. I'll use the company I work for as an example: a fairly large print shop with 5 locations along the US West coast that's been in existence since the early 70's, about 400 employees in total. Servers, PCs, and general hardware only see an upgrade cycle once every 8 years (not all at once; it's spread out). Computer hardware is a big deal in this industry, and the head of IT for my company has done pretty well with this kind of hardware life cycle. First off, Macs rule here for preprocessing; we will never see a Windows-based PC for anything more than accessing the Internet. But when it comes to our servers, they're running some very old Xeons.

    As soon as the new fiscal year starts, we are moving to an Epyc-based server farm. They've already set up their offsite client-side servers with Epyc, and IT absolutely loves them.

    But why did I bring up Macs? The company has a set budget for IT, and this and the next fiscal year had budget for company-wide upgrades. By saving money on the back end, we were able to purchase top-end graphics stations for all 5 locations (something like 30 new machines) - something we wouldn't have been able to do with the same layout on Intel. We are very much looking forward to our new servers next year.

    I'd say AMD is doing more than keeping Intel on their feet - Intel got a swift kick in the a$$ this year and is scrambling.
