Brief History of Computers
(A Look Back at Computing)
Computers have become one of the most important parts of modern society. Nearly everything modern requires or uses computer-related technology in some way. But how did computers as we know them come to exist? Did someone sitting in his lab just one day say, "Aha! I've got it! The computer!"? Well, no, that is not how it happened. Rather, many years of brilliant ideas and research from many different individuals contributed to modern computing. The field is constantly evolving at a pace unlike anything before it as techniques are refined and new breakthroughs are made.
The Early Days (1,000 B.C. to 1940)
Computers are so named because they perform mathematical computations at fast speeds. As a result, the history of computing goes back at least 3,000 years, to when ancient civilizations were making great strides in arithmetic and mathematics. The Greeks, Egyptians, Babylonians, Indians, Chinese, and Persians were all interested in logic and numerical computation. The Greeks focused on geometry and rationality, the Egyptians on simple addition and subtraction, the Babylonians on multiplication and division, the Indians on the base-10 decimal numbering system and the concept of zero, the Chinese on trigonometry, and the Persians on algorithmic problem solving. These developments carried over into later centuries, fueling advancements in areas like astronomy, chemistry, and medicine.
Pascal, Leibnitz, and Jacquard
During the 17th century there were very important advancements in the automation and simplification of arithmetic computation. John Napier invented logarithms to simplify difficult mathematical computations. The slide rule was introduced in 1622, and Blaise Pascal spent much of his life in the 1600s working on a calculator called the Pascaline. The Pascaline was mostly finished by 1645 and was able to do addition and subtraction by way of mechanical cogs and gears. In 1674 the German mathematician Gottfried Leibnitz created a mechanical calculator called the Leibnitz Wheel. This 'wheel' could perform addition, subtraction, multiplication, and division, albeit not very well in all instances. Neither the Pascaline nor the Leibnitz Wheel can be categorized as a computer, because neither had memory where information could be stored and neither was programmable. The first device that did satisfy these requirements was a loom developed in 1801 by Joseph Jacquard. Jacquard built his loom to automate the process of weaving rugs and clothing. It did this using punched cards that told the machine what pattern to weave: where there was a hole in the card the machine would weave, and where there was no hole it would not. Jacquard's idea of punched cards was later used by computer companies like IBM to program software.
Charles Babbage was a mathematics professor at Cambridge University who was interested in automated computation. In 1823 he introduced the Difference Engine, the largest and most sophisticated mechanical calculator of his time. Along with addition, subtraction, multiplication, and division to six digits, the Difference Engine could also solve polynomial equations. It was never actually completed, because the British government cut off funding for the project in 1842. After this, Babbage began to draw up plans for the Analytical Engine, a general-purpose programmable computing machine. Many people consider this to be the first true computer system, even though it only ever existed on paper. The Analytical Engine had all the same basic parts that modern computer systems have. While designing the Analytical Engine, Babbage realized that he could perfect his Difference Engine by using 8,000 parts rather than 25,000 and could work with numbers of up to 20 digits.