Intel has announced that Paul Otellini, its chief executive from 2005 to 2013, passed away in his sleep on Monday, October 2, 2017. Mr. Otellini was Intel’s first CEO without an educational background in technology, yet he spent his entire career at the company. During his tenure at one of the world’s largest chipmakers, he had a significant impact on the company’s business and technology development.

“We are deeply saddened by Paul’s passing,” Intel CEO Brian Krzanich said. “He was the relentless voice of the customer in a sea of engineers, and he taught us that we only win when we put the customer first.”

Paul Otellini was born in San Francisco on October 12, 1950. He received a bachelor’s degree in economics from the University of San Francisco in 1972, then an MBA from the University of California, Berkeley in 1974. Mr. Otellini joined Intel in 1974 and then held various positions at the company for 39 years before he retired in 2013.

Despite his lack of formal technical training, his impact on Intel (and therefore on the IT industry as a whole) is hard to overstate. He ran the Intel Architecture Group, responsible for developing CPUs and chipsets as well as the company’s strategies for desktop, mobile, and enterprise computing. He also served as general manager of Intel’s Sales and Marketing Group before becoming COO in 2002 and CEO in 2005.

One of his main accomplishments in the early 2000s was Intel’s shift from CPU development to platform development. Until Intel launched its Centrino platform (CPU, chipset, and Wi-Fi controller) in 2003, the company focused primarily on microprocessor design, leaving parts of the chipset business and other chips to third parties such as VIA Technologies and SiS. Centrino demonstrated that a platform approach yields better financial results, as the company could sell more chips per PC than before.

Later on, Mr. Otellini re-organized Intel around platforms for client and server machines, which streamlined development of the company’s compute solutions in general. The company’s financial results were strong even during the global economic recession from 2008 to 2010. In fact, he managed to increase Intel’s revenues from $34 billion in 2004 to $53 billion in 2012.

Among Mr. Otellini’s biggest victories, Intel names his win of Apple’s PC business: until 2006, Apple used PowerPC processors, but it switched completely to Intel’s x86 architecture between early 2006 and early 2008. It took Mr. Otellini nearly a decade to put Intel’s chips inside Macintosh computers. He first met Steve Jobs in the late 1990s, when Jobs was still at NeXT, and kept meeting with him regularly until the deal was signed in 2005.

“Intel had a reputation for being a tough partner, coming out of the days when it was run by Andy Grove and Craig Barrett,” said Mr. Otellini in Walter Isaacson’s Steve Jobs book. “I wanted to show that Intel was a company you could work with. So a crack team from Intel worked with Apple, and they were able to beat the conversion deadline by six months.”

Ultimately, Intel did not win Apple’s mobile contract due to various factors, but Mr. Otellini was quick to address one of Intel platforms’ main drawbacks: slow integrated GPUs. During his famous presentation at IDF in 2007, Mr. Otellini announced plans to increase the performance of the company’s iGPUs tenfold by 2010, and in reality the company did better than that. Intel has increased its iGPU performance dramatically over the last 10 years, a clear indicator that Paul Otellini and his team took Intel’s weaknesses seriously and laid a foundation to overcome them.

Paul Otellini is survived by his wife of 30 years, Sandy, his son Patrick, and his daughter Alexis.

Source: Intel


  • RedGreenBlue - Tuesday, October 3, 2017 - link

    Well that's a sad day. I know he had a huge impact on the markets Anand reviewed at the time. Up until the Pentium 4, that's great, but the fact that 10 years later you can expect to see someone comment on his tech obituary about threatening OEMs to only buy Intel says a lot about the goodwill he helped Intel destroy in the enthusiast market. I hope current Intel executives want to be remembered better than a nice story with an asterisk in people's minds like, "* also held back the advancement of computers".
  • CaedenV - Tuesday, October 3, 2017 - link

    Brought Intel out of the Pentium 4 swamp and led the team through the Sandy/Ivy Bridge era... things really have not improved after he left. Some may complain about some of Intel's business practices at the time, but he also knew how to keep the machine moving forward for a very long time without slowing down. Now they are floundering with no clear direction or purpose. Paul is surely missed.
  • Zingam - Wednesday, October 4, 2017 - link

    You, Sir, have no clue what you are talking about. P4 was fine. I had one. It did a decent job and warmed up my hand during cold winters!!!
    Mr. Otellini's decisions had direct influence on what Intel conceived up until at least Kaby Lake.
    The only good thing about Intel is their best-in-the-business manufacturing process. Architecture-wise they are still churning over and over the same old PC architecture that IBM made up in the '80s. Zero invention since that time!

    I just watched a presentation by an NVIDIA architect who very well explained the bottlenecks on the PC and how they overcame them together with IBM on the server BY inventing NEW technology! IBM has that built into their Power CPUs.

    IBM created monsters - Intel and Microsoft.
  • damonlynch - Tuesday, October 3, 2017 - link

    Condolences to Mr. Otellini's family and friends. I learned only now that Mr. Otellini graduated from Cal with an MBA in 1974. Perhaps he took a class or two with C. West Churchman, who was a truly great thinker.
  • atirado - Wednesday, October 4, 2017 - link

    Wow, essentially, the first 10 years of my professional career were spent using CPUs spearheaded by his customer-first view...
  • wolfemane - Wednesday, October 4, 2017 - link

    Without fame, he who spends his time on earth leaves only such a mark upon the world as smoke does on air or foam on water. -
    Durante degli Alighieri
  • entity279 - Wednesday, October 4, 2017 - link

    Surely the article should also mention the anti-competitive practices Intel was accused of (and was charged for, in some cases) during his "mandate". I'm not accusing AT of any bias or anything, but we readers deserve to get a more balanced view.
  • ddriver - Wednesday, October 4, 2017 - link

    "and was charged for , in some cases"

    Intel was found guilty on every continent on which they sell their products.

    If anyone expects objectivity from AT... talk about unrealistic expectations.
  • entity279 - Wednesday, October 4, 2017 - link

    Technically, they were not found guilty on all of the charges, since Intel & AMD agreed on a settlement and some were dropped as a result.

    I think the lack of objectivity is not the issue here. Who reads truly objective journalism in this industry, flooded by money, exclusives, and free review samples? I'd be fine even if the article had mentioned something along the lines of "some of the business practices in his time were controversial and they were targeted by lawsuits". But we didn't even get a single word.
    I repeat, I just would have expected factual information, no drama. The time for that kind of passion has passed, as Paul's death so eloquently shows.

    Only mentioning the positive achievements of a person, while omitting other very relevant information (they did touch on Intel's deals, so why be silent about Dell, one of their key partners?), is much, much worse. It insults the reader and shows a disconnect between the journalist and the reality they were supposed to be bound by.
  • abufrejoval - Wednesday, October 4, 2017 - link

    Privately: I feel sorry for his family, especially his wife, who probably saw far too little of him, during his tenure at Intel. At 66 I feel he didn’t get the deal he deserved from his maker.

    Publicly: What he did for Intel was great for Intel. Not so sure it was that great for us, the professional and private consumers. Intel has developed a lot of great technology as well as a lot of admirable failures (i432 wasn’t the first nor the last). But it has also severely abused its power (most notoriously against AMD) and continues to do so. It is very hard to admire the CEO of a company that fights so dirty, even if many of its products are good or even great. I have always felt that they wouldn’t be very far from where they are, even if they hadn’t gone dirty.

    But that’s what I used to think about VW, too.

    Chipset graphics: I’ve always tried to keep things fair around my home-lab and had AMD/ATI and Intel/Nvidia run side by side. And I was more focused around the new features and capabilities each generation brought than on one being “better” than the other.

    But just this week I took an old Core2 low-power Mini-ITX motherboard out of the closet, because an Atom J1900 turned out too slow handling pfSense on a 300Mbit broadband connection. It features a QX9100 quad-core at 2.3GHz and a GM45 chipset, and I installed Ubuntu 16.04 with a Cinnamon desktop on it, mostly to run it through Geekbench 4 and to gauge its relative performance against a potential Goldmont upgrade (it’s 35W TDP after all, vs. 6W on 14nm Atoms, which could amount to a noticeable difference on a 24x7 appliance).

    That’s true Northbridge graphics, definitely nothing to game on, but in terms of desktop experience it was astonishingly good. With Cinnamon you can quickly switch between accelerated and pure software rendering mode without changing the visual style much and the difference was between “I’d rather prefer something better” and “not bad at all” at 1920x1200 resolution.

    (BTW: Snort and ClamAV still managed to throttle 200 out of 300Mbit/s in pfSense, twice better than the J1900, so it will go back to the closet, soon)

    It might have sent several iGPU driver developers to insanity with the bugs it had, but they had evidently covered nicely for all of them, because the end-user experience was flawless and quite acceptable for OpenOffice and Firefox or “desktop work”. Xonotic (a Quake-engine-based first-person shooter) was good enough to explore the campgrounds (“Google Earth” like, so to say), but not fit for game play.

    At the other end, I have intensively compared a Kaveri A10-7850K with pretty optimal DDR3-2400 against a mobile Skylake i5-6267U (Iris 550 with 64MB eDRAM) running DDR3-1866 modules at 1333 for lack of BIOS support, and found that on pretty much all the benchmarks I could throw at both of them (graphics, compute, or both) they performed identically; except at the power socket, where the notebook took 1/3 of the power.

    For all the bad press Intel graphics have received over the years, in terms of Watts per visual satisfaction I think they have done rather well: You simply can’t squeeze GDDR5X or HBM2 performance out of DDR3 at 10Watts of power when the opposition is allowed to spend 30x as much.

    But I’d also be the first to prefer that Intel had produced desktop SoCs where the majority of the silicon real-estate had gone into CPUs: There is nothing wrong with having a bit of GPU on a SoC, but I would have gladly traded most of the real-estate currently wasted on the iGPU for additional cores on my E3 Xeons desktops or even on my GTX 1080m notebook: With twice the cores there would have been enough space left for the GM45 equivalent desktop graphics where now at least 60% of the CPU die real-estate lies dormant because I use dGPUs.

    I’m now thinking about converting a cheap Lenovo Skylake i5 notebook into the pfSense appliance, because it’s easy on power and has a UPS already built-in: A similar mini-server as Xeon-D or Atom C3000 costs much more, but I would have preferred 2 extra cores or 4 extra threads instead of the iGPU power such an appliance won’t ever need.

    And I’ll probably have to run pfSense as a VM using KVM (via VirtualBox as “GUI”), because nobody sells you dual-NIC laptops and pfSense doesn’t know enough USB 3 to get better than 100Mbit out of a USB3 Gigabit NIC, due to its FreeBSD 10 base: no problem reaching 950Mbit of iperf3 throughput with a venet adapter from that “stick NIC” under KVM on Linux, though…

    Perhaps there is another “fanboy” lesson here: It’s very difficult to find a better free home firewall appliance than pfSense, which unfortunately runs on a rather outdated OS base. And it’s very difficult to find a better OS than Linux, which unfortunately doesn’t always have the best applications running natively on it. But that’s no motive to start a flame war, because you can still combine them into something that just works well.

    Intel’s technical savvy expresses itself best in the fact that whatever your use case, you’ll find one of their products is at 90% of what you’d want, but they’ll wave another product at you for 3x the price to get the last 10%. And from what I now hear and see, that’s Otellini’s main legacy to the vast majority of us.
