Intel has announced that Paul Otellini, its chief executive from 2005 to 2013, passed away in his sleep on Monday, October 2, 2017. Mr. Otellini was Intel’s first CEO without a technology-related degree, yet he spent his entire career at the company. During his tenure at one of the world’s largest chipmakers, he had a significant impact on the company’s business and technology development.

“We are deeply saddened by Paul’s passing,” Intel CEO Brian Krzanich said. “He was the relentless voice of the customer in a sea of engineers, and he taught us that we only win when we put the customer first.”

Paul Otellini was born in San Francisco on October 12, 1950. He received a bachelor’s degree in economics from the University of San Francisco in 1972, then an MBA from the University of California, Berkeley in 1974. Mr. Otellini joined Intel in 1974 and then held various positions at the company for 39 years before he retired in 2013.

Despite his lack of a formal technology education, his impact on Intel (and therefore on the whole IT industry) is hard to overstate. He ran the Intel Architecture Group, which was responsible for developing CPUs and chipsets as well as the company’s strategies for desktop, mobile, and enterprise computing, and he served as general manager of Intel’s Sales and Marketing Group before becoming COO in 2002 and CEO in 2005.

One of his main accomplishments in the early 2000s was Intel’s shift from CPU development to platform development. Until Intel launched its Centrino platform (CPU, chipset, Wi-Fi controller) in 2003, the company focused primarily on microprocessor design, leaving parts of the chipset business and other chips to third parties, such as VIA Technologies or SiS. Centrino demonstrated that a platform approach yields better financial results, as the company could sell more chips per PC than before.

Later on, Mr. Otellini re-organized Intel around platforms for client and server machines, which streamlined development of the company’s compute solutions in general. The company’s financial results were strong even during the global economic recession from 2008 to 2010. In fact, he managed to increase Intel’s revenues from $34 billion in 2004 to $53 billion in 2012.

Among Mr. Otellini’s big victories, Intel names his win of Apple’s PC business: until 2006, Apple used PowerPC processors, but it switched completely to Intel’s x86 architecture between early 2006 and early 2008. It took Mr. Otellini nearly a decade to put Intel’s chips inside Macintosh computers. He first met Steve Jobs when Jobs was still at NeXT in the late 1990s and kept meeting with him regularly until the deal was signed in 2005.

“Intel had a reputation for being a tough partner, coming out of the days when it was run by Andy Grove and Craig Barrett,” said Mr. Otellini in Walter Isaacson’s Steve Jobs book. “I wanted to show that Intel was a company you could work with. So a crack team from Intel worked with Apple, and they were able to beat the conversion deadline by six months.”

Eventually, Intel did not win Apple’s mobile contract due to various factors, but Mr. Otellini was quick to address one of the main drawbacks of Intel’s platforms: slow integrated GPUs. During his famous presentation at IDF in 2007, Mr. Otellini announced plans to increase the performance of the company’s iGPUs tenfold by 2010, and in reality the company did better than that. Intel has increased its iGPU performance quite dramatically over the last 10 years, a clear indicator that Paul Otellini and his team took Intel’s weaknesses seriously and laid a foundation to overcome them.

Paul Otellini is survived by Sandy, his wife of 30 years; his son Patrick; and his daughter Alexis.

Source: Intel

41 Comments

  • CaedenV - Tuesday, October 3, 2017 - link

    Not to be blindly in the Intel fanboy camp...
    But the GPU criticism is a bit unfair in this case. Yes, Intel's on-board graphics stank for a very long time. But let's also keep in mind that they went from pretty much nothing to making some of the best (and most efficient) on-board graphics in an 8-10 year period. Most of this struggle was the IP base that they started with, and trying to dramatically improve performance without stepping on the toes of AMD/ATI and NVIDIA, who were viewed at the time as close partners rather than direct rivals (at least in the GPU department). I was just as frustrated as anyone at how bad onboard GPUs were when they started improving, but at the same time I am glad they made their move, and did it in a way that did well for the company without making too many waves with their partners.
  • Zingam - Wednesday, October 4, 2017 - link

    All criticism is fair. I have never had so many driver issues with any other vendor. Even normal everyday desktop applications have rendering issues on Intel's latest and greatest (but expensive!) hardware.
  • FunBunny2 - Tuesday, October 3, 2017 - link

    -- more than the processor cores themselves on the 2C dies.

    for some years, if you look at the die shots, cores are down to maybe 10% of the area. too bad there aren't many embarrassingly parallel user-space problems; if there were, and cores were reduced to just ALUs, much more of that budget would go to actual computing. wait.... isn't that what GPU programming really is??
  • vailr - Tuesday, October 3, 2017 - link

    Intel's integrated GPU should be an optional feature instead of what's now standard: it's forced onto all retail desktop machines, unless you want to go to the expense of buying a workstation or swapping out the CPU that came with the desktop machine.
  • Zingam - Wednesday, October 4, 2017 - link

    iGPUs are a terrible idea in the first place. Well, maybe not on smartphone SoCs.
  • BrokenCrayons - Thursday, October 5, 2017 - link

    iGPUs are a good idea for a wide variety of computing scenarios where the performance of a dGPU is unnecessary. iGPUs have brought down the cost and hardware complexity of PCs, which has helped them spread far and wide. The majority of computers sold contain no dedicated graphics. That was the case when the awful Extreme 3D was being foisted on us and is still the case today.

    There's certainly a place for dedicated graphics processors (a lot of places, really), but offering "just enough" graphics power to toss a Windows desktop onto a low-resolution screen was sufficient for a lot of people. Since the iGPUs released after the GMA 950 were progressively given more priority and offered more performance, it's possible to pick up any relatively modern Intel processor, skip a dedicated GPU, and still play a few older games or watch a video in HD without worrying about buying a graphics card.
  • Jon Tseng - Wednesday, October 4, 2017 - link

    Yeah, one thing I never understood is why high-end enthusiast CPUs (6700Ks, 7700Ks, etc.) have vast amounts of die area wasted on an iGPU when any user is blatantly going to pair them with a discrete GPU anyway.

    AMD actually cottoned on to this with Ryzen - creating dedicated enthusiast parts that use that transistor budget for CPU cores, which allows them to offer more cores for less.

    Has always seemed nuts to me...
  • DanNeely - Wednesday, October 4, 2017 - link

    Because they don't sell enough of the K variants to justify a separate die without massively inflating the unit cost: the customer base over which the fixed costs of creating it would be spread is much smaller. The LGA 115x K series are the top-of-the-line mainstream dies. If you want something explicitly designed for enthusiasts and without the IGP baggage, step up to LGA 20xx.
  • Jon Tseng - Wednesday, October 4, 2017 - link

    Yeah that kinda makes sense given tape-out costs.
  • cwolf78 - Tuesday, October 3, 2017 - link

    Aha!! So HERE's the schmuck who was responsible for (or at least turned a blind eye to) Intel's anti-competitive practices throughout the years.
