The computer was born not for entertainment or email but out of the need to solve a serious number-crunching crisis. By 1880 the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms. Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today, which surf the Internet, play games, and stream multimedia in addition to crunching numbers.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world's first computer is actually built.

1890: Herman Hollerith designs a punch-card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes the Tabulating Machine Company, which through a 1911 merger ultimately becomes IBM.

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts, or shafts.

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and uses 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1953: Grace Hopper develops the first computer language compiler; her work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives of the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language is born.
1958: Jack Kilby and Robert Noyce independently invent the integrated circuit, known as the computer chip.

1968: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff at Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 (affectionately known as the "Trash 80"), and the Commodore PET.

1975: The IBM 5100 becomes the first commercially available portable computer.

1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fool's Day and roll out the Apple I, the first computer with a single circuit board.

1977: Radio Shack's initial production run of the TRS-80 is just 3,000 units, and it sells out quickly. For the first time, non-geeks can write programs and make a computer do what they wish.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word...