Core i9-9980XE Conclusion: A Generational Upgrade

Users (and investors) always want to see a year-on-year increase in performance in the products being offered. For those on the leading edge, where every little bit counts, dropping $2k on a new processor is nothing if it can be used to its fullest. For those on a longer upgrade cycle, a constant 10% annual improvement means that over three years they can expect a 33% increase in performance. As of late, there are several ways to increase performance: increase core count, increase frequency/power, increase efficiency, or increase IPC. That list runs from easy to difficult: adding cores is usually trivial (until memory access becomes a bottleneck), while increasing efficiency and instructions per clock (IPC) is hard, but it is the best generational upgrade for everyone concerned.
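As a quick sanity check on that compounding claim, a minimal snippet (just the arithmetic, no product data):

```python
# A constant 10% annual gain compounds over three years.
annual_gain = 0.10
years = 3

total_gain = (1 + annual_gain) ** years - 1
print(f"Total uplift after {years} years: {total_gain:.1%}")  # 33.1%
```

The 33% figure in the text is the compounded result, not 3 × 10%; the difference is small here but grows with longer upgrade cycles.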

For Intel’s latest Core i9-9980XE, its flagship high-end desktop processor, we have a mix of improved frequency and improved efficiency. An updated manufacturing process has helped raise core clocks compared to the previous generation, with a 15% increase in the base clock, although the chip also draws around 5-6% more power at full load. In real-world terms, the Core i9-9980XE seems to give anywhere from a 0-10% performance increase in our benchmarks.

However, if we follow Intel’s previous cadence, this processor launch should have seen a substantial change in design. Normally Intel follows a new microarchitecture and socket with a process node update on the same socket with similar features but much improved efficiency. We didn’t get that. We got a refresh.

An Iteration When Intel Needs Evolution

When Intel announced that its latest set of high-end desktop processors was little more than a refresh, there was a subtle but mostly inaudible groan from the tech press at the event. We don’t particularly like higher-clocked refresh generations when it comes to graphics cards, so we certainly are not going to enjoy one with our processors. These new parts are yet another product line based on Intel’s 14nm Skylake family, and we’re left wondering where Intel’s innovation has gone.

These new parts involve using larger silicon across the board, which enables more cache and PCIe lanes at the low end, and the updates to the manufacturing process afford some extra frequency. The new parts use soldered thermal interface material, which is what Intel used to use, and what enthusiasts have been continually requesting. None of this is innovation on the scale that Intel’s customer base is used to.

It all boils down to ‘more of the same, but slightly better’.

While Intel is having another crack at Skylake, its competition is trying to innovate, not only by attempting new designs that may or may not work, but also by showcasing its next generation several months in advance, with both process node and microarchitectural changes. As much as Intel prides itself on its technological prowess, and it has done well this decade, there’s something stuck in the pipe. At a time when Intel needs evolution, it is stuck doing iterative refreshes.

Does It Matter?

The latest line out of Intel is that demand for its latest generation enterprise processors is booming. It physically cannot make enough, and other product lines (publicly, the lower power ones) are having to suffer when Intel can use those wafers to sell higher margin parts. The situation is dire enough that Intel is reallocating fab space to create more 14nm products in the hope of matching demand should it continue. Intel has explicitly stated that while demand is high, it wants to focus on its high performance Xeon and Core product lines.

You can read our news item on Intel's investment announcement here.

While demand is high, the desire to innovate runs into an odd tension between ‘squeezing every cent out of this high demand’ and ‘preparing for tomorrow’. With all the demand on the enterprise side, the rapid update cycles expected by consumers may take a back seat: while consumers who buy one chip want 10-15% performance gains every year, the enterprise customers who need chips in high volumes are just happy to be able to purchase them. There’s no need for Intel to dip its toes into a new process node or design that offers +15% performance but reduces yield by more than that, all while taking up fab space.

Intel Promised Me

In one meeting with Intel’s engineers a couple of years back, just after the launch of Skylake, I was told that two more years of 10% IPC growth would not be an issue. These individuals know the details of Intel’s new platform designs, and I have no reason to doubt them. Back then, it was clear that Intel had the next one or two generations of Core microarchitecture updates mapped out; however, the delays to 10nm seem to have put a pin in those +10% IPC designs. Combine Intel’s 10nm woes with the demand for 14nm Skylake-SP parts, and it makes for one confusing mess. Intel is making plenty of money, and it seems to have designs in its back pocket ready to go, but while it is earning such high margins, I worry we won’t see them. All the while, Intel’s competitors are trying to do something new to break the incumbent’s hold on the market.

Back to the Core i9-9980XE

Discussions on Intel’s roadmap and demand aside, our analysis of the Core i9-9980XE shows that it provides a reasonable uplift over the Core i9-7980XE for around the same price, albeit for a few more watts in power. For users looking at peak Intel multi-core performance on a consumer platform, it’s a reasonable generation-on-generation uplift, and it makes sense for those on a longer update cycle.

A side note on gaming: for users looking to push those high frame rate monitors, the i9-9980XE gave a 2-4% uplift across our games at our 720p settings. Individual results varied from a 0-1% gain, such as in Ashes or Final Fantasy, up to a 5-7% gain in World of Tanks, Far Cry 5, and Strange Brigade. Beyond 1080p, we didn’t see much change.

When comparing against the AMD competition, it all depends on the workload. Intel has the better processor in most aspects of general workflow, such as lightly threaded workloads on the web, memory limited tests, compression, video transcoding, or AVX512 accelerated code, but AMD wins on dedicated processing, such as rendering with Blender, Corona, POV-Ray, and Cinema4D. Compiling is an interesting one, because for both Intel and AMD, the more mid-range core count parts with higher turbo frequencies seem to do better.

Comments

  • imaheadcase - Tuesday, November 13, 2018 - link

Yah because you don't do anything intensive with the jobs you have, of course you would use laptops or whatever mobile. But the reality is most people would use desktops because it's simply faster to get stuff done, and more powerful.

    BYOD fyi is not like that for most companies..
  • imaheadcase - Tuesday, November 13, 2018 - link

    ..and if you are doing anything intensive with laptops..that just means company you work for is behind the curve and just being cheap and not fork out money for the right hardware.
  • PeachNCream - Tuesday, November 13, 2018 - link

    There are over 250K people on the payroll. There ARE desktop PCs around, but they are few and far between. I'm not going to get into an extended debate about this because it won't change anyone's perspective, but I do believe you've got a slight misconception about the usefulness and flexibility of portable computer hardware. A simple look at the availability of desktops versus laptops should be enough to make it obvious, for most people, computer == laptop these days.
  • Spunjji - Tuesday, November 13, 2018 - link

    You're eliding the difference between "convenient and sufficient" and "as powerful as anyone needs".

    I'll absolutely grant that if you're only going to have one system for doing your work and you move around a fair bit, then it absolutely makes sense to have that system be mobile, even if you lose a bit of edge-case performance.

    For people doing /serious/ GPU grunt work something like an XPS 15 is going to provide between 1/2 and 1/3 of the power they could get with a similarly priced desktop. That compromise doesn't make any sense for someone whose job does not require mobility.

    So sure, notebooks are better than ever for a large number of people. Doesn't make desktops and HEDT chips functionally irrelevant for businesses, though. If you can really use 18 cores for the work you're doing then being provided with an XPS 15 will be, at best, a sad joke.
  • Ratman6161 - Tuesday, November 13, 2018 - link

    Any laptop is essentially on a different planet than any of the processors covered in this review (doesn't matter if we are talking Intel or AMD).
    1. If it is possible to do your work on a laptop (which I am myself at this very moment) then you (and me) are not the target audience for these CPU's. In fact, I'm not entirely sure why you even bother to read or comment on the story?
    2. If you have to ask if you need it, you don't need it.
    3. If you have to think more than about 1 second to make a decision between one of these and a laptop, then you don't need it.
    4. If you do need one, then you already know that.

    Most people don't need one, including me. I read these things because the technology is interesting and because I find it interesting what others might be doing. I don't really feel any need to insist that others need what I need and could not possibly need anything else.
  • PeachNCream - Wednesday, November 14, 2018 - link

    So a differing opinion than yours should mean that someone not read an article or comment on it. That appears to be nothing more than a self-protective mechanism intended to erect a bubble in which exists nothing more than an echo chamber filled with your own beliefs. That's hardly a way to integrate new thinking, but I do understand that a lot of people fear change in the same way you do.
  • Kilnk - Tuesday, November 13, 2018 - link

    "But the reality is most people would use desktops because simply faster to get stuff done, and more powerful."

    See, that's the problem with your reasoning. You assume that most people need power when they do not. The reality is that the majority of people who need to use computers for work do not need to do rendering or any kind of intensive task. So no, most people don't use desktops nor would they want to use desktops given the opportunity. They use laptops.
  • FunBunny2 - Tuesday, November 13, 2018 - link

    "Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. "

The Man's advantage to the Worker Bees using laptops: they're always 'on the job'. no time off. as close to slavery as it's legal to be. some smart folks are truly stupid.
  • PeachNCream - Tuesday, November 13, 2018 - link

    "The Man's advantage to the Worker Bees.." (just quoting because of the lack of continuing indents in Anandtech's 1990's-era comment system)

    I think that's a bit of a stretch in our case. My division doesn't do on-call and we strictly prohibit our lower tier managers from tapping employees outside of their normal work hours. Even checking company e-mail outside of work hours is against posted (and enforced) policy. If we must, due to emergencies, they absolutely have to be compensated for the time regardless of whether or not they are hourly or salaried workers. I haven't seen an "emergency" that couldn't wait until the next day so that policy has not been put into use in at least the last five years. Computational mobility is no excuse to allow invasions into off-the-clock time and I for one won't allow it.
  • jjjag - Tuesday, November 13, 2018 - link

    I hate to admit it but PNC is right. Super-high-powered desktops are an anachronism. If you need REAL horsepower, you build a server/compute farm and connect to it with thin-client laptops. If you are just doing software development, the laptop cpu is usually good enough.

    This is especially true of single socket monsters like these HEDT chips. The only reason they exist is because gamers will pay too much for everything. It's nothing more than an expensive hobby, and like all hobbies at the top end is all "want" and very little "need". The "need" stops somewhere around 6 or 8 cores.

    It's exactly the same as owning a Ferrari and never taking it to the track. You will never use more than 20% of the full capabilities of it. All you really need is a Vette.
