The computers you see and use today did not spring from any single inventor at one go. Rather, it took centuries of rigorous research to reach the present stage. Scientists are still working hard to make them better and better, but that is a different story.

First, let us see when the very idea of computing with a machine or device, as opposed to conventional manual calculation, was first given shape.

Though experiments were going on even earlier, it was in the 17th century that the first such successful device came into being. Edmund Gunter, an English mathematician, is credited with its development in 1620. Yet it was too primitive to be recognized even as a forefather of computers. The first mechanical digital calculating machine was built in 1642 by the French scientist-philosopher Blaise Pascal. Since then, the ideas and inventions of many mathematicians, scientists, and engineers paved the way for the development of the modern computer in the following years.

But the world had to wait another couple of centuries for the next milestone. It came with the English mathematician and inventor Charles Babbage, whose work during the 1830s produced designs for a machine that could compute and store the values of large mathematical tables. Most importantly, his Analytical Engine was designed to accept both numbers and instructions on punched cards, a big leap toward the principles on which computers work today. However, there was still a long way to go. Compared with present-day computers, Babbage's machines are better regarded as high-speed counting devices, for they could work on numbers alone!

The Boolean algebra developed in the 19th century removed this numbers-only limitation for counting devices. This branch of mathematics, invented by the English mathematician George Boole, correlated the binary digits with logical statements in our language: the value 0 is associated with false statements and 1 with true ones. The British mathematician Alan Turing made further progress with his 1936 theoretical model of computation, now known as the Turing machine. Meanwhile, the technological advances of the 1930s did much to further the development of computing devices.
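The correspondence described above, binary digits standing for truth values, can be illustrated in a few lines of modern code. This is a sketch of our own (the function names are invented for illustration), showing how Boole's logical operations can be expressed as ordinary arithmetic on 0s and 1s:

```python
# Boolean operations written as arithmetic on the binary digits 0 and 1,
# where 0 plays the role of "false" and 1 the role of "true".

def AND(a, b):
    return a * b          # 1 only when both inputs are 1

def OR(a, b):
    return a + b - a * b  # 1 when at least one input is 1

def NOT(a):
    return 1 - a          # flips 0 to 1 and 1 to 0

# Print the truth table for all four input combinations:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
```

It is exactly this reduction of logic to two-valued arithmetic that lets electrical circuits, which naturally distinguish only "on" from "off", carry out reasoning.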

But the direct forefathers of present-day computer systems evolved in the 1940s. The Harvard Mark I, designed by Howard Aiken, was one of the earliest large-scale digital computers, built from electro-mechanical devices. It was developed jointly by International Business Machines (IBM) and Harvard University and completed in 1944.

But the real breakthrough was the concept of the stored-program computer, set out by the Hungarian-American mathematician John von Neumann in his 1945 report on the Electronic Discrete Variable Automatic Computer (EDVAC). The idea that instructions, as well as data, should be stored in the computer's memory made this design fundamentally different from its counting-device forerunners. Since then, computers have grown ever faster and more powerful.
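The stored-program idea can be sketched in a few lines. The following is a minimal, hypothetical model (the instruction set and function names are invented for illustration, not any historical machine's): instructions and data sit side by side in one memory, and a simple fetch-decode-execute loop walks through them.

```python
# A toy stored-program machine: one memory holds both the program
# (at addresses 0-3) and the data it operates on (at addresses 4-6).

def run(memory):
    """Execute the program stored in `memory`, starting at address 0."""
    acc = 0  # accumulator register
    pc = 0   # program counter
    while True:
        op, arg = memory[pc]   # fetch the instruction at address pc
        pc += 1
        if op == "LOAD":       # acc = memory[arg]
            acc = memory[arg]
        elif op == "ADD":      # acc = acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":    # memory[arg] = acc
            memory[arg] = acc
        elif op == "HALT":     # stop execution
            return memory

memory = [
    ("LOAD", 4),   # address 0: acc = 2
    ("ADD", 5),    # address 1: acc = 2 + 3
    ("STORE", 6),  # address 2: memory[6] = acc
    ("HALT", 0),   # address 3: stop
    2, 3, 0,       # addresses 4-6: the data
]
run(memory)
print(memory[6])   # -> 5
```

Because the program lives in the same memory as the data, it can itself be loaded, copied, or even modified like data, which is the essence of what separated the EDVAC design from earlier fixed-function calculators.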

Still, compared with the present day's personal computers, those machines had the simplest of designs: a single CPU performed operations such as addition and multiplication, carried out in a prescribed sequence of instructions, called a program, to produce the desired result.

This form of design was followed, with little change, even in the more advanced computers developed later. The revised version divided the CPU into a memory unit and an arithmetic logic unit (ALU), with separate input and output sections.

In fact, the first four generations of computers followed this basic design; it was mainly the type of hardware used that differed from one generation to the next. For instance, the first generation variety...
