J. David Hunger
In 1968, Robert N. Noyce, co-inventor of the integrated circuit, and Gordon E. Moore left Fairchild Semiconductor International to form a new company. They took with them a young chemical engineer, Andrew Grove, and called the new firm Intel, short for integrated electronics. Intel initially made money manufacturing computer memory modules and produced the first microprocessor (also called a “chip”) in 1971. A key turning point for the young company came in the early 1980s, when IBM selected Intel’s processors to run its new line of personal computers. Today, more than 80% of the world’s PCs run on Intel microprocessors.
One of the company’s early innovations was centralizing its manufacturing in giant chip fabrication plants. This allowed Intel to make chips at a lower cost than its competitors who made custom chips in small factories. The founders encouraged a corporate culture of “disagree and commit” in which engineers were encouraged to constantly think of new ways of doing things faster, cheaper, and more reliably.
Massive investment by Japanese competitors in the late 1970s led to falling prices in computer memory modules. Faced with possible bankruptcy, CEO Moore, with Grove as his second in command (Noyce had retired from active management), made the strategic decision in 1985 to abandon the computer memory business to focus on microprocessors. Projected growth in microprocessors was based on Moore’s prediction that the number of transistors on a chip would double every 24 months. In what was soon called “Moore’s Law,” Gordon Moore argued that microprocessor technology would improve exponentially, regardless of the state of the economy, the industry, or any one company. Thus, a company had to be at the cusp of innovation or risk falling behind. According to Moore, “If you lag behind your competition by a generation, you don’t just fall behind in chip performance, you get undercut in cost.”
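Moore’s prediction implies simple exponential growth: a quantity that doubles every 24 months multiplies roughly a thousandfold over two decades. The sketch below illustrates the arithmetic; the starting transistor count (about 2,300, the widely cited figure for Intel’s 1971 4004 chip) is used only as an illustration, not as case data.

```python
def transistors(initial_count, years, doubling_period_years=2.0):
    """Project a transistor count under Moore's Law: the count
    doubles once every `doubling_period_years`."""
    return initial_count * 2 ** (years / doubling_period_years)

# Illustrative only: ~2,300 transistors (Intel 4004, 1971) projected
# 20 years forward is 10 doublings, i.e. a 1,024x increase.
print(round(transistors(2300, 20)))  # 2300 * 2**10 = 2355200
```

Ten doublings in twenty years is what gives the prediction its force: a company one generation (roughly two years) behind faces a competitor whose chips pack twice the transistors at comparable cost.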
______________________________________________________________________________ This case was prepared by Professor J. David Hunger, Iowa State University and St. John’s University. Copyright © 2006 by J. David Hunger. The copyright holder is solely responsible for case content. Reprint permission is solely granted to the publisher, Prentice-Hall, for the books Strategic Management and Business Policy–11th Edition (and the International version of this book) and Cases in Strategic Management and Business Policy–11th Edition, by the copyright holder, J. David Hunger. Any other publication of the case (translation, any form of electronics or other media) or sale (any form of partnership) to another publisher will be in violation of copyright law, unless J. David Hunger has granted an additional written permission. Sources available upon request. Reprinted by permission.
To raise money, Intel’s management agreed to sell 12% of the company’s stock to IBM for $250 million, a stake it later repurchased. Moore’s Law soon became part of the corporate culture as a fundamental expectation of all employees. Andy Grove replaced Gordon Moore as Intel’s CEO in 1987. Moore continued to serve on Intel’s board of directors until 2001. During Grove’s tenure as CEO from 1987 to 1998, Intel’s stock price rose 31.6% annually and revenues grew from $1.9 billion to $25.1 billion. With 55% of its sales coming from outside the United States, Intel was transformed into a global corporation. The company became central to the growth of personal computers, cell phones, genomic research, and computer-aided design.
Strategic Decisions Lead to Market Dominance
In order to succeed in this high-tech business, management was forced to make a number of risky strategic decisions. For example, Intel’s board of directors found it difficult to vote for a proposal in the early 1990s to commit $5 billion to making the Pentium microprocessor...