The History of the Computer
Long before computers were invented, humans recognized the need for them. The history of the computer began about 2,000 years ago with the abacus, a wooden rack holding two horizontal wires with beads strung on them; it remained one of the best calculating devices until the seventeenth century (PBS, 1). In 1835, the English inventor Charles Babbage conceived the Analytical Engine, a general-purpose, fully program-controlled, automatic mechanical digital computer consisting of two parts: a calculating section and a storage section. His machine was designed to read instructions from punched cards, just as the Jacquard loom did (Campbell-Kelly, 15-17).
In 1890, Herman Hollerith and James Powers, who worked for the U.S. Census Bureau, developed devices that could read information punched into cards. The data were read by a machine equipped with many metal pins that passed through the punched holes but stopped where no holes existed. As a result, the 1890 census was completed in one-third the time taken in 1880, reading errors were reduced, and workflow improved (Campbell-Kelly, 20-21). Commercial companies saw these advantages, which soon led to improved punched-card machines from IBM and Remington. These machines used electromechanical devices, in which electric power provided mechanical motion. They could automatically be fed a specified number of cards; add, multiply, and divide; and output cards punched with the results. For more than fifty years after their first use, punched-card machines did most of the world's business computing and a considerable amount of the computing work in science (Ceruzzi, 16-17).
The start of World War II created a great demand for computing capacity, especially for the military. New weapons were developed for which trajectory tables and other essential data were needed. Two men, John W. Mauchly and J. Presper Eckert, Jr., built a...