CHAPTER 2: Why single core CPUs are no longer "cool"

The end of the single core CPU?

Right now, leakage is by far the most urgent problem. As profit margins are low and cost is, in most cases, the decisive factor for consumers, expensive cooling systems are not practical.

Past experience has shown that complex superscalar CPUs need about twice as many transistors to achieve roughly 40% better performance. The conclusion of many industry analysts and researchers is that the single-core CPU has no future. I quote Shekhar Borkar, Intel Fellow and Director:

"Multiprocessing, on the other hand, has potential to provide near linear performance improvement. Two smaller processors, instead of a large monolithic processor, can potentially provide 70-80% more performance, compare this to only 40% from the large monolithic processor."

Note the word "monolithic", a word with a rather pejorative ring to it, which insinuates that current single core CPUs are based on old technology. So, basically, the single core CPU has no future, as it improves performance by only 40% while doubling complexity and thus leakage. This reasoning explains why, all of a sudden, Intel marketing no longer talks about 10 GHz CPUs, but about the "era of thread parallelism".

It should be noted, though, that the 40% better performance of the "monolithic CPU" is achieved across a wide variety of applications, without the need for time-consuming software optimizations. The promised 70% to 80% of the multithreaded CPU comes easily only in a small range of applications, while the other applications will require steeply rising investments in development time to achieve the same performance increase.
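To put numbers on both claims, below is a minimal sketch in Python (our own illustration, not from Intel's paper). It contrasts the two scaling rules at work: Pollack's rule, which says single-core performance grows roughly with the square root of the transistor budget, and Amdahl's law, which says the serial part of a program caps what a second core can deliver.

    import math

    def pollack_speedup(transistor_ratio):
        # Pollack's rule: performance scales roughly with the
        # square root of the transistor budget.
        return math.sqrt(transistor_ratio)

    def amdahl_speedup(parallel_fraction, cores):
        # Amdahl's law: the serial fraction of the code limits
        # the gain from extra cores.
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    print(pollack_speedup(2.0))     # ~1.41x: the ~40% from doubling transistors
    print(amdahl_speedup(0.90, 2))  # ~1.82x: the promised 70-80%, if 90% of the code is parallel
    print(amdahl_speedup(0.50, 2))  # ~1.33x: a typical, mostly serial application

Only when the parallel fraction is high does the dual core reach the promised 70% to 80%; for mostly serial code, the "monolithic" 40% actually wins, and without any extra software work.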

Of course, we agree that multiprocessors have benefits. It is easier to turn off a complete CPU than to manage the energy consumption of the different parts of one big CPU.

And you can run a single-threaded application on CPU1 and turn CPU2 off. When CPU1 gets close to its thermal limit, you let CPU2 take over the work. As a result, you reduce the average temperature of each core, and since leakage decreases with lower die temperatures, this technique can reduce overall leakage power. The objective of using a dual core CPU is then primarily to reduce power consumption in situations where there is only one CPU intensive application. This is probably the reason why Intel sees a great future for dual core CPUs in the mobile market, although the mobile market is probably the last market that will be able to benefit from dual core performance. The last thing that you want is twice as much power dissipation because both cores become active. In our humble opinion, dual core will only be truly dual when it is not running on battery power.
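To make the idea concrete, here is a toy sketch of that "core hopping" policy in Python. Every name, threshold, and callback here is hypothetical; a real implementation would live in the OS scheduler or the power management firmware, not in application code.

    HOT_LIMIT_C = 70.0  # hypothetical thermal threshold, in degrees Celsius

    def core_hop(read_temp, migrate, power_up, power_down, should_stop):
        # Run one CPU-intensive thread on a single core at a time,
        # hopping to the idle (and therefore cool) core whenever the
        # active core nears its thermal limit.
        active, idle = 0, 1
        power_down(idle)                  # only one core draws power at a time
        while not should_stop():
            if read_temp(active) > HOT_LIMIT_C:
                power_up(idle)            # wake the cool core
                migrate(active, idle)     # move the thread over
                power_down(active)        # the hot core cools off, so its leakage drops
                active, idle = idle, active

Note that both cores never compute at once: the second core buys a lower average die temperature, not extra throughput, which is exactly the battery-friendly scenario described above.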

Trendy

The second argument used by the people who are hyping the multithreaded CPU is that "the whole industry is moving towards multi-core CPUs". Considering that the server market is the only market where non-x86 CPUs play an important role, this is not very surprising. For companies such as Sun and IBM, it is only natural to neglect single-threaded performance somewhat and to invest as much time as they can in designs that can work with as many threads as possible. The software that runs on these Sun and IBM machines, massive OLTP databases and HPC applications, is multi-threaded by nature.

Sun's Niagara CPU can run 32 threads at once, but it will not be the kind of CPU that you would like in your desktop. Single-threaded performance is most likely at the level of one of the early PIIIs. Sun's own demo [6] shows a Niagara to be more than 4 times slower in a single-threaded application than an unnamed single-threaded CPU, which is, hopefully for Sun, one of the current top CPUs.

Delving deeper

So, while there are definite advantages to CPUs that exploit Thread Level Parallelism, if we want to understand what is really going on, we need to delve a little deeper. First, we examine whether leakage can really kill all progress of "monolithic" single core CPUs; secondly, we study the prime example of a "classic" single core CPU that crashed into the wall of leakage: the Intel Prescott.


Comments

  • stephenbrooks - Wednesday, February 9, 2005 - link

    #28 - that's interesting. I was thinking myself just a few days ago "I wonder if those wires go the long way on a rectangular grid or do they go diagonally?" Looks like there's still room for improvement.
  • Chuckles - Wednesday, February 9, 2005 - link

    The word comes from Latin. "mono" meaning one, "lithic" meaning stone. So monolithic refers to the fact that it is a single cohesive unit.
    The reason you associate "lithic" with old is only due to the fact that anthropologists use Paleolithic and Neolithic to describe time periods in human history in the Stone Age. The words translate as "old stone" and "new stone" respectively.
    I have seen plenty of monolithic benches around here. Heck, a slab granite countertop qualifies as a monolith.
  • theOracle - Wednesday, February 9, 2005 - link

    Very good article - looks like a university paper with all the references etc! Looking forward to part two.

    Re "monolithic", granted the word doesn't mean old but anything '-lithic' instantly makes me think ancient (think neolithic etc). -lithic means a period in stone use by humans, and a monolith is a (usually ancient) stone monument; I think its fair to say Intel were trying to make the audience think 'old technology'.
  • DavidMcCraw - Wednesday, February 9, 2005 - link

    Great article, but this isn't accurate:

    "Note the word "monolithic", a word with a rather pejorative meaning, which insinuates that the current single core CPUs are based on old technology."

    Neither the dictionary nor technical meanings of monolithic imply 'old technology'. Rather, it simply refers to the fact that the single-core CPU being referred to is as large as the two smaller chips, but is in one part.

    In the context of OS kernel architectures, the Linux kernel is a good example of monolithic technology... but I doubt many people consider it old tech!
  • IceWindius - Wednesday, February 9, 2005 - link

    Even this article makes my head hurt; so much about CPUs is hard to understand and grasp. I wish I knew how those CPU engineers do this for a living.

    I wish someone like Ars Technica would make something really built from the ground up, like "CPUs for Morons", so I could start understanding this stuff better.
  • JohanAnandtech - Wednesday, February 9, 2005 - link

    Jason and Anand have promised me (building some pressure ;-) a threaded comment system so I can answer more personally. Until then:

    1. Thanks for all the encouraging comments. It really gives a warm feeling to read them, and it is basically the most important motivation for writing more.

    2. Slashbin (27): Typo, just typed in a brief moment of insanity. Voltage, of course. Fixed.

    3. CSMR: the SPEC numbers of Intel are artificially high, as they have been spending more and more time on aggressive compiler optimisations. All other benchmarks clearly show the slowdown.
  • CSMR - Tuesday, February 8, 2005 - link

    Excellent article. Couple of odd things you might want to amend in chapter one: "CPUs run 40 to 60% faster each year" contradicts the previous discussion about slowed CPU speed increases. Also power formula explanation on the same page doesn't really make sense as pointed out by #27.
  • Doormat - Tuesday, February 8, 2005 - link

    Good article. The only real thing I wanted to bring up was something called the "X Consortium". I wrote a paper on it in my solid state circuit design class a few years ago. Basically, instead of having all the interconnects within a chip laid out in a grid-like fashion, it allows them to be diagonal (and thus a savings of, at most, 29%; for the math impaired, wire lengths drop to 1/sqrt(2) of their original length at best). Perhaps the tools aren't there, or it's too patent-encumbered. If interconnects are really an issue, then they should move to this diagonal interconnect technology. I actually don't think they are a very pressing need right now; leakage current is the most pressing issue. The move to copper interconnects a while ago helped (increased conductivity over aluminum; smaller die sizes mean shorter distances to traverse, typically).

    It will be very interesting to see what IBM does with their Cell chips and SOI (and what clock speed AMD releases their next A64/Opteron chips at, since they've teamed with IBM). If indeed these Cell chips run at 4 GHz and don't have leakage current issues, then there is a good chance that issue is mostly remedied (for now, at least).
  • slashbinslashbash - Tuesday, February 8, 2005 - link

    " In other words, dissipated power is linear with the e ffective capacitance, activity and frequency. Power increases quadratically with frequency or clock speed." (Page 2)

    Typo there? Frequency can't be both linear and quadratic... from the equation itself, it looks like voltage is quadratic. (assuming the V is voltage)
  • AnnoyedGrunt - Tuesday, February 8, 2005 - link

    And of course I meant to refer to post 23 above.
    -D!
