Hitting switches

The transistor, invented at Bell Laboratories in 1947, is what made it possible to pack extremely complicated computers into very small packages.  A transistor is nothing more than a switch whose status (or “position,” to stay with the switch analogy) is determined by the electricity fed to it.  The beauty of the transistor, however, is that it has no moving parts, allowing it to be very small while still performing its duty.  Combine a large number of these transistors on a semiconductor, a material that conducts electricity only under certain conditions, and you have what is called an integrated circuit, which was first introduced just over 10 years after the invention of the transistor. 

A few thousand of these transistors came together in a special form of integrated circuit, one that was not only very tiny but programmable as well.  This was the birth of the first microprocessor, Intel’s 4004, which packed roughly 2,300 transistors. 

The Pentium 4 and Athlon processors of today follow this same basic foundation; they are nothing more than extremely complicated descendants of the 4004.  Part of the evolutionary process has been the constant shrinking of these transistors.  This size is what we refer to when we say that the Pentium 4 is built on a 0.18-micron process; the 0.18-micron figure refers to the smallest feature size of the circuit. 

As CPUs get more complex, their designs require more and more transistors.  If transistor size remained constant while transistor counts climbed, we’d have processors with dies that could be measured in feet, not millimeters.  In order to keep heat production low and yields high, these transistors must get smaller as processors get faster and more complex. 


To the right you can see the actual Pentium 4 core. Kind of big, isn't it?

For example, the 8088, one of the first x86 processors, featured 29,000 transistors with a 3-micron circuit size on a 33mm^2 die.  In comparison, the Pentium 4 features 42 million transistors with a 0.18-micron circuit size on a 217mm^2 die.  The Pentium 4’s die (the surface area of the core itself) is only about 6.5 times as large as that of the 8088, yet it holds over 1,400 times the number of transistors.  Can you imagine how big a die the Pentium 4 would have if it were made on the same 3-micron process as the 8088? 
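The rough arithmetic behind that question can be sketched from the numbers above. This is a first-order estimate only: it assumes the Pentium 4 could be laid out at the 8088's transistor density, ignoring differences in layout, interconnect, and circuit style.

```python
# Back-of-the-envelope: how big would a 3-micron Pentium 4 die be?
# Figures from the article: 8088 = 29,000 transistors on 33 mm^2 (3-micron);
# Pentium 4 = 42 million transistors on 217 mm^2 (0.18-micron).
import math

transistors_8088 = 29_000
die_8088_mm2 = 33.0
transistors_p4 = 42_000_000

# Transistor density achievable on the 3-micron process
density_3um = transistors_8088 / die_8088_mm2  # ~879 transistors per mm^2

# Die area the Pentium 4 would need at that density
die_p4_3um_mm2 = transistors_p4 / density_3um
side_mm = math.sqrt(die_p4_3um_mm2)

print(f"Hypothetical 3-micron Pentium 4 die: {die_p4_3um_mm2:,.0f} mm^2")
print(f"That is a square roughly {side_mm / 10:.1f} cm on a side")
```

The estimate works out to a die on the order of tens of thousands of square millimeters, a square over 20 cm on a side, which is why shrinking the transistor is not optional.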

Comments

  • SlyNine - Wednesday, August 18, 2010 - link

    It'll be much, much longer than we all thought. :P
  • cdurkinz - Monday, June 29, 2020 - link

    You had no idea.... Just checking in, another decade on from when this article released! ;)
  • karasaj - Wednesday, June 20, 2012 - link

    They're 7 years overdue! :)

    History really is interesting.
  • Shahnewaz - Sunday, April 12, 2015 - link

    It has been 10 years and the only processor even remotely close enough to 10GHz is an AMD FX-9590@5GHz.
    No, you're not realistically speaking. At least not Intel.
  • name99 - Monday, February 29, 2016 - link

    IBM z12 clocked at 5.5GHz, and IBM has claimed POWER8 runs at 5GHz (though I don't know if they've ever sold those on the open market).

    Back in the day (2007) POWER6 WAS sold at 5GHz, and IBM claimed they had versions running at 6GHz (which they may well have sold not on the open market).
  • NJCompguy - Monday, February 29, 2016 - link

    15 years later, we can now have facial recognition on a Surface Pro 4 to log in! Yay for the fast pace!! lol
  • name99 - Monday, February 29, 2016 - link

    "These are things that Intel is claiming will be possible by 2005 with the type of processors that will be available in desktop systems.... Intel is working very hard in developing the software that will help make these visions a reality. "

    Let's all remember this next time Intel predicts something, anything. Intel has three skills:
    - process/manufacturing
    - circuit design
    - micro-architecture design.
    Unfortunately NOT on that list are things like
    - software design
    - ISA design
    - vision for the future, and prediction

    Which means you're going to be in a bubble if you live in the Intel world. That was obvious here with the absolute lack of mention of any other manufacturer (TSMC was 13 yrs old in 2000), and the lack of mention of other uses of CPUs (Apple Newton was 7 yrs old in 2000). Instead of asking what better processes might enable in less powerful machines, all we get is the question "how do we do more of the same?" The question to ask, usually, should NOT be "what do I do with a 10x faster processor?" but "what do I do with a 100x CHEAPER processor?" or "what do I do with a 100x lower power processor?"
    The post-iPhone revolution has broken through this bubble in some respects, but not all. Almost everyone is willing to concede that CPUs in cell phones are important, interesting, and worth following. But we get the same blindness when it comes to the next shrink in size, whether it's smartwatches or IoT. And we get an absolute blindness when it comes to the idea of substantially restructured OSs, substantially restructured languages (and development paradigms) --- apparently we're going to be using UNIX-like OSs and C/C++ for the next hundred years...
  • Dr AB - Saturday, May 9, 2020 - link

    Yes, I agree .. it seems like they were totally limited in thinking because of living entirely in the "intel world".
    As for the future, yes, that's what's happening: everyone is just following the "trends", too scared to do something out of the box or take an entirely different approach. In smartphones, yes, every year or so there are only performance/efficiency improvements that look negligible in "real world" scenarios. More likely, 50 years from now, looking back at the current era, the feeling will be the same: "How the heck are we still stuck in the same ancient technology, introduced years ago and only recently implemented in a productive way?" That's what happens when I read an article from 20 years back.
  • zerghumper - Wednesday, February 8, 2017 - link

    SlyNine,

    Much. Much longer. :(
  • PanZhang - Thursday, February 27, 2020 - link

    It took me 15 years to realize that a dream may never come true.
