Intel has announced that Paul Otellini, the company's chief executive from 2005 to 2013, passed away in his sleep on Monday, October 2, 2017. Mr. Otellini was Intel's first CEO without a formal technology education, yet he spent his entire career at the company. During his tenure at one of the world's largest chipmakers, he had a significant impact on Intel's business and technology development.

“We are deeply saddened by Paul’s passing,” Intel CEO Brian Krzanich said. “He was the relentless voice of the customer in a sea of engineers, and he taught us that we only win when we put the customer first.”

Paul Otellini was born in San Francisco on October 12, 1950. He received a bachelor’s degree in economics from the University of San Francisco in 1972, then an MBA from the University of California, Berkeley in 1974. Mr. Otellini joined Intel in 1974 and then held various positions at the company for 39 years before he retired in 2013.

Although he had no formal education in technology, his impact on Intel (and therefore on the IT industry as a whole) is hard to overstate. He ran the Intel Architecture Group, responsible for developing CPUs and chipsets as well as the company's strategies for desktop, mobile, and enterprise computing. He also served as general manager of Intel's Sales and Marketing Group before becoming COO in 2002 and CEO in 2005.

One of his main accomplishments in the early 2000s was Intel’s shift from CPU development to platform development. Until Intel launched its Centrino platform (CPU, chipset, Wi-Fi controller) in 2003, the company focused primarily on microprocessor design, leaving parts of the chipset business and other chips to third parties, such as VIA Technologies or SiS. Centrino demonstrated that a platform approach yields better financial results, as the company could sell more chips per PC than before.

Later on, Mr. Otellini re-organized Intel around platforms for client and server machines, which streamlined development of the company’s compute solutions in general. The company’s financial results were strong even during the global economic recession from 2008 to 2010. In fact, he managed to increase Intel’s revenues from $34 billion in 2004 to $53 billion in 2012.

Among Mr. Otellini's big victories, Intel names his win of Apple's PC business: until 2006, Apple used PowerPC processors, but it switched completely to Intel's x86 architecture between early 2006 and early 2008. It took Mr. Otellini nearly a decade to put Intel's chips inside Macintosh computers. He first met Steve Jobs in the late 1990s, when Jobs was still at NeXT, and kept meeting with him regularly until the deal was signed in 2005.

“Intel had a reputation for being a tough partner, coming out of the days when it was run by Andy Grove and Craig Barrett,” said Mr. Otellini in Walter Isaacson’s Steve Jobs book. “I wanted to show that Intel was a company you could work with. So a crack team from Intel worked with Apple, and they were able to beat the conversion deadline by six months.”

In the end, Intel did not win Apple's mobile contract, due to various factors, but Mr. Otellini was quick to address one of the main drawbacks of Intel's platforms: slow integrated GPUs. During his presentation at IDF in 2007, Mr. Otellini announced plans to increase the performance of the company's iGPUs tenfold by 2010; in reality, the company did even better. Intel has increased its iGPU performance dramatically over the last 10 years, a clear indicator that Paul Otellini and his team took Intel's weaknesses seriously and laid a foundation for overcoming them.

Paul Otellini is survived by his wife Sandy, to whom he was married for 30 years, his son Patrick, and his daughter Alexis.

Source: Intel


  • silverblue - Tuesday, October 3, 2017 - link

    Very sad to hear.
  • HStewart - Tuesday, October 3, 2017 - link

    Sorry to hear about this, but it sounds like, from his time at Intel, that he made a huge difference, including the transformation to the i3/i5/i7 series, up to probably the Haswell series.

    One thing I am curious about: I wonder if Sandy Bridge was named after his wife Sandy.
  • nathanddrews - Tuesday, October 3, 2017 - link

    Paul's wife + translated Hebrew codename = game changer!
  • IGTrading - Wednesday, October 4, 2017 - link

    Personal opinion :

    Let's also remember that during his tenure Intel paid 6 billion USD in bribes to DELL alone, and more than 10 times that to companies all over the world.

    Let's not forget how Intel put the corporate boot on the throat of healthy market competition and made its whole client base pay billions more for its technology (because those bribes came from us, the clients).

    Let's not forget that this company had to be raided in multiple countries on multiple continents, like an organized crime syndicate, to get proof of its behavior.

    And yes, let's acknowledge its technological achievements as well.

    So may Paul rest in peace, but we have to accept his memory with the good and the bad as well.

    For those who lack the documentation or were just too young when all of this was happening, this guy here made a very well-documented video on everything:
  • ddriver - Wednesday, October 4, 2017 - link

    Yeah, his death is a huge loss to the efforts of unfair and illegal business practices, impeding of progress, abusing monopolies and milking the consumer.
  • nathanddrews - Wednesday, October 4, 2017 - link

    Be adults.
  • ddriver - Wednesday, October 4, 2017 - link

    You mean be cattle? By which I mean "do what you are expected to do and don't put any thought into what is going on".

    My previous post was 100% true, and if you felt like it put things in a negative perspective, that's not on me but on him.

    I'm not undermining the personal aspect of his tragedy; it looks like that gets plenty of attention. I just don't feel it offsets the amount of evil he played a central role in committing. And since AT is too biased to tell the truth, that's up to conscientious citizens.
  • Topweasel - Wednesday, October 4, 2017 - link

    I am going to second the calls to look back at his body of work. His passing is notable because of what he did in his life in the industry he worked for (and pretty much founded). With that comes all the damage that he, along with his subordinates, is directly responsible for. For all the trouble IBM/Bell/MS have gotten into for monopolistic practices, Intel's actions while he was at the helm were some of the dirtiest seen in the corporate world, and that is a high bar to pass.

    He might have built up the microprocessor industry, but he is also responsible for decades of damage to it, just so his company never had to compete on a technical level.
  • bigboxes - Wednesday, October 4, 2017 - link

    Yeah, I went Intel in 2006, but maybe I wouldn't if not for this man's actions.
  • extide - Tuesday, October 3, 2017 - link

    Funny they mention increasing the speed of their iGPUs by many times; when you start from such a low bar, that is not that hard. There is a bit of an interesting story behind the origin of Intel's integrated graphics, though. Intel's integrated graphics started out in the chipset, in the northbridge, back on the platforms that still had an FSB, external memory controller, etc. Since the northbridge was mostly used for I/O, its size was commonly pad-limited, meaning it required so many pins that the die actually had to be bigger than it otherwise needed to be in order to fit the required logic. This meant they essentially had some free transistor budget, so they decided to toss in a basic GPU. Since the GPU was literally an afterthought thrown in using the spare space, it was never prioritized, and that showed. It wasn't until the Arrandale and Sandy Bridge era that Intel actually got serious about making their iGPU decent, and today the iGPU takes up a significant amount of die space, more than the processor cores themselves on the 2C dies.
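    The pad-limited effect described above can be illustrated with a back-of-the-envelope sketch: a die's edge must be long enough to fit its perimeter pad ring, even if the logic alone would fit in a much smaller square. All figures below (pad count, pad pitch, logic area) are hypothetical, purely for illustration:

    ```python
    # Sketch of a pad-limited die-size calculation (hypothetical numbers).

    def min_edge_for_pads(n_pads: int, pad_pitch_mm: float) -> float:
        """Edge length (mm) of a square die whose perimeter fits n_pads at the given pitch."""
        perimeter_needed_mm = n_pads * pad_pitch_mm
        return perimeter_needed_mm / 4  # pads ring all four edges

    def die_area_mm2(logic_area_mm2: float, n_pads: int, pad_pitch_mm: float) -> float:
        """Die area is set by whichever is larger: the logic or the pad ring."""
        logic_edge = logic_area_mm2 ** 0.5
        pad_edge = min_edge_for_pads(n_pads, pad_pitch_mm)
        edge = max(logic_edge, pad_edge)
        return edge * edge

    # A northbridge-like chip: modest logic, but lots of I/O pins.
    area = die_area_mm2(logic_area_mm2=40.0, n_pads=700, pad_pitch_mm=0.1)
    print(f"die area: {area:.1f} mm^2 (pad ring, not logic, sets the size)")
    ```

    With these made-up numbers, the pad ring demands a far larger die than the 40 mm² of logic needs, and the difference is "free" silicon, which is where a throwaway GPU could live.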
