
4.2 The Death of Moore’s Law?

Learning Objectives

After studying this section you should be able to do the following:

  1. Describe why Moore’s Law continues to advance, and discuss the physical limitations of this advancement.
  2. Name and describe various technologies that may extend the life of Moore’s Law.
  3. Discuss the limitations of each of these approaches.

Moore simply observed that we’re getting better over time at squeezing more stuff into tinier spaces. Moore’s Law is possible because the distance between the pathways inside silicon chips gets smaller with each successive generation. While chip plants (semiconductor fabrication facilities, or fabs: the multibillion-dollar plants used to manufacture semiconductors) are incredibly expensive to build, each new generation of fabs can crank out more chips per silicon wafer. And since the pathways are closer together, electrons travel shorter distances. If electrons now travel half the distance to make a calculation, the chip is twice as fast.

But the shrinking can’t go on forever, and we’re already starting to see three interrelated forces (size, heat, and power) threatening to slow down the Moore’s Law gravy train. When you make processors smaller, the more tightly packed electrons heat up a chip, so much so that unless today’s most powerful chips are cooled down, they will melt inside their packaging. To keep the fastest computers cool, most PCs, laptops, and video game consoles need fans, and most corporate data centers have elaborate and expensive air conditioning and venting systems to prevent a meltdown. A trip through the Facebook data center during its recent rise would show that the firm was a “hot” startup in more ways than one. The firm’s servers ran so hot that the Plexiglas sides of its server racks warped and melted (E. McGirt, “Hacker, Dropout, C.E.O.,” Fast Company, May 2007). The need to cool modern data centers draws a lot of power, and that power costs a lot of money.

The chief eco officer at Sun Microsystems has claimed that computers draw four to five percent of the world’s power. Google’s chief technology officer has said that the firm spends more to power its servers than it paid for the servers themselves (D. Kirkpatrick, “The Greenest Computer Company under the Sun,” April 13, 2007). Microsoft, Yahoo!, and Google have all built massive data centers in the Pacific Northwest, far from their corporate headquarters, specifically choosing these locations for access to cheap hydroelectric power. Google’s facility in The Dalles, Oregon, pays the local power provider just two cents per kilowatt-hour, less than one-fifth of the eleven-cent rate the firm pays in Silicon Valley (S. Mehta, “Behold the Server Farm,” Fortune, August 1, 2006; also see Chapter 10 “Software in Flux: Partly Cloudy and Sometimes Free” in this book). This difference means big savings for a firm that runs more than a million servers.
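
To see why a few cents per kilowatt-hour matters at server-farm scale, consider a rough back-of-the-envelope calculation. The Python sketch below assumes a fleet of one million servers, an average draw of 250 watts per server, and a 1.5x multiplier for cooling; aside from the “more than a million servers” figure from this section, these numbers are illustrative assumptions, not figures from the book.

```python
# Back-of-the-envelope comparison of annual electricity costs for a large
# server fleet at two power rates. Per-server wattage and the cooling
# overhead multiplier are assumptions for illustration only.

SERVERS = 1_000_000          # "more than a million servers"
WATTS_PER_SERVER = 250       # assumed average draw per server, in watts
COOLING_OVERHEAD = 1.5       # assumed multiplier for cooling and venting
HOURS_PER_YEAR = 24 * 365

def annual_cost(rate_per_kwh: float) -> float:
    """Yearly electricity bill in dollars at a given $/kWh rate."""
    kwh = SERVERS * WATTS_PER_SERVER / 1000 * COOLING_OVERHEAD * HOURS_PER_YEAR
    return kwh * rate_per_kwh

the_dalles = annual_cost(0.02)      # The Dalles, Oregon: two cents/kWh
silicon_valley = annual_cost(0.11)  # Silicon Valley: eleven cents/kWh
print(f"The Dalles:     ${the_dalles:,.0f}")
print(f"Silicon Valley: ${silicon_valley:,.0f}")
print(f"Annual savings: ${silicon_valley - the_dalles:,.0f}")
```

Under these assumptions the cheap hydroelectric rate saves on the order of hundreds of millions of dollars a year, which is why location matters so much for data center economics.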

And while these powerful shrinking chips are getting hotter and more costly to cool, it’s also important to realize that chips can’t get smaller forever. At some point Moore’s Law will run into the unyielding laws of nature. While we’re not certain where these limits are, chip pathways certainly can’t be shorter than a single molecule, and the actual physical limit is likely larger than that. Get too small and a phenomenon known as quantum tunneling kicks in, and electrons start to slide off their paths. Yikes!

Buying Time

One way to overcome this problem is with multicore microprocessors, made by putting two or more lower-power processor cores (think of a core as the calculating part of a microprocessor) on a single piece of silicon. Philip Emma, IBM’s Manager of Systems Technology and Microarchitecture, offers an analogy. Think of the traditional fast, hot, single-core processor as a 300-pound lineman, and a dual-core processor as two 160-pound guys. Says Emma, “A 300-pound lineman can generate a lot of power, but two 160-pound guys can do the same work with less overall effort” (Adam Ashton, “More Life for Moore’s Law,” BusinessWeek, June 20, 2005). For many applications, multicore chips will outperform a single speedy chip, while running cooler and drawing less power. Multicore processors are now mainstream.

By 2007, most PCs and laptops sold had at least a two-core (dual-core) processor. Microsoft’s Xbox 360 has three cores. The PlayStation 3 includes the so-called Cell processor, developed by Sony, IBM, and Toshiba, which runs nine cores. By 2008, quad-core processors were common in high-end desktops and low-end servers. AMD plans a twelve-core PC chip by 2010, and Intel has even demonstrated chips with upward of 48 cores.

Multicore processors can run older software written for single-brain chips. But they usually do this by using only one core at a time. To reuse the metaphor above, this is like having one of our 160-pound workers lift away, while the other one stands around watching. Multicore operating systems can help achieve some performance gains. Versions of Windows or the Mac OS that are aware of multicore processors can assign one program to run on one core, while a second application is assigned to the next core. But, in order to take full advantage of multicore chips, applications need to be rewritten to split up tasks so that smaller portions of a problem are executed simultaneously inside each core.
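
To make the “divide and conquer” idea concrete, here is a minimal sketch in Python. The task (summing a large range of numbers), the four-way split, and the core count are arbitrary assumptions for illustration: the point is simply that the work is broken into independent chunks that the operating system can schedule on separate cores at the same time.

```python
# A minimal "divide and conquer" sketch using Python's multiprocessing
# module: a summing job is split into chunks, and each chunk is handed
# to its own worker process, which the OS can run on a separate core.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one worker's share of the job."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    N, CORES = 10_000_000, 4
    step = N // CORES
    chunks = [(i * step, (i + 1) * step) for i in range(CORES)]

    with Pool(processes=CORES) as pool:
        # Each chunk is processed in parallel; the partial results are
        # combined at the end, giving the same answer as sum(range(N)).
        total = sum(pool.map(partial_sum, chunks))
    print(total)
```

Note how much scaffolding even this toy example needs: deciding how to split the work, farming it out, and recombining the results. That overhead is exactly why rewriting real applications for multicore chips is hard.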

Writing code for this “divide and conquer” approach is not trivial. In fact, developing software for multicore systems is described by Shahrokh Daijavad, software lead for next-generation computing systems at IBM, as “one of the hardest things you learn in computer science” (Adam Ashton, “More Life for Moore’s Law,” BusinessWeek, June 20, 2005). Microsoft’s chief research and strategy officer has called coding for these chips “the most conceptually different [change] in the history of modern computing” (M. Copeland, “A Chip Too Far?” Fortune, September 1, 2008). Despite this challenge, some of the most aggressive adopters of multicore chips have been video game console manufacturers. Video game applications are particularly well suited to multiple cores since, for example, one core might be used to render the background, another to draw objects, another for the “physics engine” that moves the objects around, and yet another to handle Internet communications for multiplayer games.

Another approach to breathing life into Moore’s Law is referred to as stacked or three-dimensional semiconductors: chips manufactured as a stack of multiple, interconnected layers instead of in one flat plane. In this approach, engineers slice a flat chip into pieces, then reconnect the pieces vertically, making a sort of “silicon sandwich.” The chips are both faster and cooler since electrons travel shorter distances. What was once an end-to-end trip on a conventional chip might be just a tiny movement up or down on a stacked chip. But stacked chips present their own challenges. In the same way that a skyscraper is more difficult and costly to design and build than a ranch house, 3D semiconductors are tougher to design and manufacture. IBM has developed stacked chips for mobile phones, claiming the technique improves power efficiency by up to 40 percent.

Quantum Leaps, Chicken Feathers, and the Indium Gallium Arsenide Valley?

Think about it: the triple threat of size, heat, and power means that Moore’s Law, perhaps the greatest economic gravy train in history, will likely come to a grinding halt in your lifetime. Multicore and 3D semiconductors are here today, but what else is happening to help stave off the death of Moore’s Law?

Every once in a while a material breakthrough comes along that improves chip performance. A few years back researchers discovered that replacing a chip’s aluminum components with copper could increase speeds up to 30 percent. Now scientists are concentrating on improving the very semiconductor material that chips are made of. While the silicon used in chips is wonderfully abundant (it has pretty much the same chemistry found in sand), researchers are investigating other materials that might allow for chips with even tighter component densities. Researchers have demonstrated that chips made with supergeeky-sounding semiconductor materials such as indium gallium arsenide, indium aluminum arsenide, germanium, and bismuth telluride can run faster and require less wattage than their silicon counterparts (Y. L. Chen et al., “Experimental Realization of a Three-Dimensional Topological Insulator, Bi2Te3,” Science 325, no. 5937 [July 10, 2009]: 178–81; Kate Greene, “Intel Looks Beyond Silicon,” Technology Review, December 11, 2007; A. Cane, “A Valley By Any Other Name…,” Financial Times, December 11, 2006). Perhaps even more exotic (and downright bizarre), researchers at the University of Delaware have experimented with a faster-than-silicon material derived from chicken feathers! Hyperefficient chips of the future may also be made out of carbon nanotubes, once the technology to assemble the tiny structures becomes commercially viable.

Other designs move away from sending electricity over silicon. Optical computing, where signals are sent via light rather than electricity, promises to be faster than conventional chips if lasers can be mass-produced in miniature (silicon laser experiments show promise). Others are experimenting with crafting computing components from biological material (think a DNA-based storage device).

One yet-to-be-proven technology that could blow the lid off what’s possible today is quantum computing. Conventional computing stores data as a combination of bits, where a bit is either a one or a zero. Quantum computers, leveraging principles of quantum physics, employ qubits that can be both one and zero at the same time. Add a bit to a conventional computer’s memory and you double its capacity. Add a qubit to a quantum computer and its capacity increases exponentially. For comparison, consider that a computer model of serotonin, a molecule vital to regulating the human central nervous system, would require 10^94 bytes of information. Unfortunately there’s not enough matter in the universe to build a computer that big. But modeling a serotonin molecule using quantum computing would take just 424 qubits (Paul Kaihla, “Quantum Leap,” Business 2.0, August 1, 2004).
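
To get a feel for that exponential growth, consider what it would take to simulate qubits on a conventional machine. The Python sketch below is a rough illustration under one assumption: each quantum amplitude is stored as a double-precision complex number (16 bytes). Every added qubit then doubles the memory a classical simulation needs, which is exactly the scaling the text describes.

```python
# An n-qubit state is described by 2**n complex amplitudes, so each
# added qubit doubles the memory a classical simulation requires.
# The 16-byte figure assumes two 64-bit floats per amplitude.
BYTES_PER_AMPLITUDE = 16

for n_qubits in (1, 2, 10, 30, 50):
    amplitudes = 2 ** n_qubits
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n_qubits:>2} qubits -> {amplitudes:>20,} amplitudes "
          f"(~{gib:,.1f} GiB to simulate)")
```

Running this shows that thirty qubits already demand about 16 GiB, and fifty qubits roughly 16 million GiB; by a few hundred qubits, the simulation outstrips any conceivable classical machine.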

Some speculate that quantum computers could one day allow pharmaceutical companies to create hyperdetailed representations of the human body that reveal drug side effects before they’re even tested on humans. Quantum computing might also accurately predict the weather months in advance or offer unbreakable computer security. Ever have trouble placing a name with a face? A quantum computer linked to a camera (in your sunglasses, for example) could recognize the faces of anyone you’ve met and give you a heads-up on their names and backgrounds (Peter Schwartz, Chris Taylor, and Rita Koselka, “The Future of Computing: Quantum Leap,” Fortune, August 2, 2006). Opportunities abound. Of course, before quantum computing can be commercialized, researchers need to harness the freaky properties of quantum physics wherein your answer may reside in another universe, or could disappear if observed (Einstein himself referred to certain behaviors in quantum physics as “spooky action at a distance”).

Pioneers in quantum computing include IBM, HP, NEC, and a Canadian startup named D-Wave. At a 2007 event at the Computer History Museum in Mountain View, California, D-Wave demonstrated Orion, a sixteen-qubit computer that could find a protein in a database, figure out optimal wedding guest seating arrangements, and solve a Sudoku puzzle. Scientific opinion varied widely as to the significance of the D-Wave advance. The Orion was built using a chip cooled to minus 273 degrees Celsius in a bath of liquid helium, and it performed tasks about one hundred times slower than conventional PCs. Not exactly commercial stuff. But it was the most advanced quantum computing demonstration to date. Whether or when quantum computing will become a reality is still unknown, but the promise exists that while Moore’s Law may run into limits imposed by Mother Nature, a new way of computing may blow past anything we can do with silicon, continuing to make possible the once impossible.

Key Takeaways

  • As chips get smaller and more powerful, they get hotter and present power-management challenges. And at some point Moore’s Law will stop because we will no longer be able to shrink the spaces between components on a chip.
  • Multicore chips use two or more low-power calculating “cores” to work together in unison, but to take optimal advantage of multicore chips, software must be rewritten to “divide” a task among multiple cores.
  • 3D or stackable semiconductors can make chips faster and run cooler by shortening distances between components, but these chips are harder to design and manufacture.
  • New materials may extend the life of Moore’s Law, allowing chips to get smaller still. Entirely new methods of calculating, such as quantum computing, may also dramatically increase computing capabilities far beyond what is available today.

Questions and Exercises

  1. What three interrelated forces threaten to slow the advancement of Moore’s Law?
  2. Which commercial solutions, described in the section above, are currently being used to counteract the forces mentioned above? How do these solutions work? What are the limitations of each?
  3. Will multicore chips run software designed for single-core processors?
  4. As chips grow smaller they generate increasing amounts of heat that needs to be dissipated. Why is keeping systems cool such a challenge? What are the implications for a firm like Google or Yahoo!? For a firm like Apple or Dell?
  5. What are some of the materials that may replace the silicon that current chips are made of?
  6. What kinds of problems might be solved if the promise of quantum computing is achieved? How might individuals and organizations leverage quantum computing? What sorts of challenges could arise from the widespread availability of such powerful computing technology?