Much has been written about the impending demise of Moore’s Law, the exponential scaling of transistor density on integrated circuit (IC) chips. When pondering the consequences of this upcoming shift, it is appropriate to take a moment to note the passing of Andrew Grove on March 21, 2016, at the age of 79. Grove was the CEO of Intel Corporation from 1987 to 1998, following Gordon Moore, and was then its Chair until 2005. As much as anyone, Grove was responsible for transforming Moore’s Law and the information revolution into realities.

Andy Grove had a remarkable—many would say astonishing—personal story. Having survived the Holocaust during World War II and escaped the political upheaval in Hungary 11 years later, he arrived in the United States as a penniless immigrant, unable to speak English and suffering from severe hearing loss caused by a childhood illness. He managed to graduate from the City College of New York and move on to the University of California, Berkeley, where he earned a PhD in chemical engineering in 1963. He joined Fairchild Semiconductor shortly thereafter and helped to found Intel in 1968.

Grove was clearly a formidable technical talent, but he was better known for his approach to management, which he described in a 1996 book titled Only the Paranoid Survive [1]. His approach was one of intense personal confrontation, which, on the receiving end, was like being hit on the head with a board, according to his successor as CEO, Craig Barrett. Any number of management consultants and psychologists would probably say that this approach was excessive, but it helped to bring Intel back from the brink of more than one disaster and to make it one of the most successful companies in history, as the dominant supplier of microprocessor chips to the computer industry.

Being successful did not always equate with being right. When a new Pentium microprocessor was introduced in 1994, some users noticed that it had a flaw in its floating-point division, which Grove dismissed as unimportant to all but the most sophisticated users, thinking that low-end users would not be concerned. Instead, there was a firestorm of user complaints, and he and Intel had to backtrack and spend hundreds of millions of dollars to correct the flaw. As a New York Times story on his passing by Jonathan Kandell suggests [2], however, the situation amounted to snatching victory from the jaws of defeat. The Pentium computing engines, which had previously been buried inside brand-name computers, remained so prominent in the press during the long repair saga that they became as well-known a brand as the computers that contained them.

By any measure, fantastically dense and capable ICs have had a dominant role in creating the technology landscape we sometimes take for granted today. From the early days, when computer companies (largely unsuccessfully) tried to convince housewives that they should buy home computers to store their food recipes, to email, the first irresistible “app,” and on to the components of the World Wide Web, smartphones, and embedded processors, amazing advances occurred as Moore’s Law was doggedly pursued and ICs progressed. For these advances, we can thank people such as Andy Grove.

Grove’s approach to business, typified by his “Only the Paranoid Survive” mantra, was to be constantly worried that a new technology or business approach would suddenly arrive to destroy what had seemed an invincible business. Trying to anticipate and get ahead of such events was his daily challenge. For today’s semiconductor companies, this challenge is intense, to say the least. The “business as usual” model of simply following Moore’s Law is nearing its end: Moore’s Law is likely to reach its physical and/or practical manufacturing limit within the next decade or so, and semiconductor companies must consider how to reinvent themselves.

Moore’s Law was articulated by Gordon Moore in 1965 and modified slightly in 1975 to postulate that the power of IC chips would double, and their price drop by half, about every two years, obsoleting earlier generations of chips. This created a virtuous cycle: every time switch size scaled down, chip performance automatically improved. However, when 90 nm chips were first achieved in the early 2000s, removing the heat created during operation became a major problem. Clock speed, the rate at which computations are executed, was capped, and microprocessors were instead constructed with multiple cores (e.g., 2, 4, 8, …), the idea being that four cores operating at 250 MHz could be as fast as one core operating at 1 GHz, provided that a problem could be executed in parallel parts. Although these measures allowed chip dimensions to continue to scale, heat remains a limiting problem. Research plans have already been laid out for scaling to 5 nm chips, which are projected for 2020–2021. However, further reduction to the probable physical limit of 2–3 nm will require heroic effort, if it is achieved at all. The key components of IC technology and the challenges of advancing Moore’s Law much further were recently well described in Nature by M. Mitchell Waldrop [3].
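To put a number on the parallelism caveat mentioned above (a back-of-the-envelope gloss added here, using Amdahl’s law, which the editorial does not invoke): if a fraction $p$ of a workload can be divided among $n$ cores, the speedup over a single core at the same clock rate is

$$S(n) = \frac{1}{(1-p) + p/n}.$$

Four 250 MHz cores match one 1 GHz core only in the ideal case $p = 1$, where $S(4) = 4$ and the aggregate throughput is $4 \times 250\,\text{MHz} = 1\,\text{GHz}$. If only half the work parallelizes ($p = 0.5$), then $S(4) = 1/(0.5 + 0.5/4) = 1.6$, and the four slower cores perform like a single core running at just 400 MHz.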

As line dimensions shrank, the precision required to make the chips became daunting, and the array of instruments needed to make them, along with the costs of those instruments, became staggering. A modern fab line costs several billion dollars to achieve the scale necessary for the economical manufacture of large numbers of chips. In addition, the many necessary materials and instruments, which come from multiple suppliers, must be compatible with one another. The evolution of these conditions led members of the US Semiconductor Industry Association (SIA) to create the first technology roadmap for semiconductors in 1993. This roadmap allowed participants in the industry to establish research plans to evolve their particular technologies so that they would remain commercially viable by staying compatible with industry needs. In 1998, the SIA joined with European, Japanese, Korean, and other counterparts to create the first global roadmap, the International Technology Roadmap for Semiconductors (ITRS) [4].

It is fair to say that the semiconductor revolution would never have proceeded in such a relentless and efficient fashion without the coordination of expert projections that the roadmaps provided.

The technology roadmaps are daunting for anyone but specialists to understand, because they involve the evolution of a formidable array of interdependent parts. These range from the crystal growth furnaces that produce giant silicon single-crystal boules, to the equipment that cuts and polishes wafers, to the photolithography chemicals and steppers that pattern the chips, to etchers, to the robotic equipment that handles wafers and chips, to the test equipment that verifies chip performance, and so on. In an earlier issue of this journal [5], for example, Hailing Tu notes that the next step in Si single-crystal technology necessary to move to 10 nm line widths is to go from 300 mm to 450 mm diameter single crystals, that is, from boules like sturdy tree trunks to boules like much bigger tree trunks. Handling such large boules, while ensuring low defect density and the flatness of cut wafers, is a challenging task.
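A rough geometric estimate (added here for illustration; it is not a figure from Tu’s article) shows why the jump is worth the trouble: wafer area scales with the square of the diameter, so

$$\left(\frac{450\,\text{mm}}{300\,\text{mm}}\right)^2 = 2.25,$$

meaning each 450 mm wafer offers about 2.25 times the area of a 300 mm wafer, and thus roughly twice as many chips per wafer for a comparable number of processing steps.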

The technology roadmaps were typically updated in even years and revised in odd years. The last ITRS was issued in 2013, and the next one is due soon; however, it represents the end of the line, as it will be the first not to focus on Moore’s Law. In addition, rather than contributing to yet another ITRS, the SIA will generate its own research plan, one that emphasizes special chips for special applications rather than denser chips.

A wide array of research approaches is being investigated in the effort to move beyond Moore’s Law, including efforts by the dominant semiconductor manufacturers. However, the business landscape is littered with the remains of technology companies that did not make the transitions necessary to survive. As Andy Grove himself noted, it was difficult to shake Intel’s intense focus on memory chips in order to transform the company into the dominant player in microprocessor chips, the shift that made it fabulously successful. The transition to “beyond Moore” may be even more difficult for today’s semiconductor manufacturers. The grave danger is that they will stay in their comfort zones for too long, wringing the last dollar of profit from their multi-billion-dollar investments in IC fab lines, while someone else successfully embarks on a radical new computing technology or business model that renders such fab lines obsolete, just as transistors and integrated circuits made vacuum tubes obsolete.