Professor Ganesh Bhatt
February 19, 2015
Timeline of Computers
The history of computer hardware, computer software, and computer networks originated in the 20th century. Computing progressed from mechanical inventions and mathematical theories to the modern concepts and machines that formed a major academic field and the beginning of a substantial worldwide industry.
Computer hardware of the 1960s is marked by the conversion from vacuum tubes to solid-state devices such as the transistor and, later, the integrated circuit. By 1959, discrete transistors were considered sufficiently reliable and economical that they made further vacuum tube computers uncompetitive. Computer main memory slowly moved away from magnetic core memory devices to solid-state static and dynamic semiconductor memory, which greatly reduced the cost, size, and power consumption of computer devices. Eventually the cost of integrated circuit devices became low enough that home computers and personal computers became widespread.

The mass increase in the use of computers accelerated with third-generation machines. These generally relied on Jack Kilby's invention of the integrated circuit, or microchip, starting around 1965. However, the IBM System/360 used hybrid circuits, which were solid-state devices interconnected on a substrate with separate wires. The first integrated circuit was produced in September 1958, but computers using them did not begin to appear until 1963. Some of the early uses were in embedded systems, notably by NASA for the Apollo Guidance Computer. While large mainframe computers such as the System/360 increased storage and processing abilities, the integrated circuit also allowed the development of much smaller computers. The minicomputer was a significant innovation in the 1960s and 1970s. It brought computing power to more people, not only through more convenient physical size but also