History of the Internet

Published on StudyMode, March 4, 2013



Steps Toward Modern Computing
  • First Steps: Calculators
  • The Technological Edge: Electronics
  • Putting It All Together: The ENIAC
  • The Stored-Program Concept
  • The Computer’s Family Tree
      • The First Generation (1950s)
      • The Second Generation (Early 1960s)
      • The Third Generation (Mid-1960s to Mid-1970s)
      • The Fourth Generation (1975 to the Present)
      • A Fifth Generation?
  • The Internet Revolution
  • Lessons Learned

After reading this module, you will be able to:
1. Define the term “electronics” and describe some early electronic devices that helped launch the computer industry.
2. Discuss the role that the stored-program concept played in launching the commercial computer industry.
3. List the four generations of computer technology.
4. Identify the key innovations that characterize each generation.
5. Explain how networking technology and the Internet have changed our world.
6. Discuss the lessons that can be learned from studying the computer’s history.

Module 1B

History of Computers and the Internet


What would the world be like if the British had lost to Napoleon in the Battle of Waterloo, or if the Japanese had won World War II? In The Difference Engine, authors William Gibson and Bruce Sterling ask a similar question: What would have happened if nineteenth-century inventor Charles Babbage had succeeded in creating the world’s first automatic computer? (Babbage had the right idea, but the technology of his time wasn’t up to the task.) Here is Gibson and Sterling’s answer: with the aid of powerful computers, Britain becomes the world’s first technological superpower. Its first foreign adventure is to intervene in the American Civil War on the side of the South, which splits the United States into four feuding republics. By the mid-1800s, the world is trying to cope with the multiple afflictions of the twentieth century: credit cards, armored tanks, and fast-food restaurants.

Alternative histories are fun, but history is serious business. Ideally, we would like to learn from the past. Not only do historians urge us to study history, but computer industry executives also say that knowledge of the computer’s history gives them an enormous advantage. In its successes and failures, the computer industry has learned many important lessons, and industry executives take these to heart.

Although the history of analog computers is interesting in its own right, this module examines the chain of events that led to today’s digital computers. You’ll begin by looking at the computing equivalent of ancient history, including the first mechanical calculators and their huge, electromechanical offshoots that were created at the beginning of World War II. Next, you’ll examine the technology—electronics—that made today’s computers possible, beginning with what is generally regarded as the first successful electronic computer, the ENIAC of the late 1940s. You’ll then examine the subsequent history of electronic digital computers, divided into four “generations” of distinctive—and improving—technology. The module concludes by examining the history of the Internet and the rise of electronic commerce.

Today’s electronic computers are recent inventions, stemming from work that began during World War II. Yet the most basic idea of computing—the notion of representing data in a physical object of some kind, and getting a result by manipulating the object in some way—is very old. In fact, it may be as old as humanity itself. Throughout the ancient world, people used devices such as notched bones, knotted twine, and the abacus to represent data and perform various sorts of calculations (see Figure 1B.1).

First Steps: Calculators
During the sixteenth and seventeenth centuries, European mathematicians developed a series of calculators that used clockwork mechanisms and cranks (see...