From the introduction of the counting frame, more popularly known as the abacus1, it was realized that a tool able to assist in mathematical calculations would greatly increase human productivity and efficiency2. The use of abaci continued for many centuries, up to the era of early calculators that used hole placements in a dial to signify a count, similar to a rotary dial telephone3. As the years progressed, people needed more. Simple addition, subtraction, and multiplication functions were no longer enough, and the need for memory storage features arose. People at that time perceived the abacus, comptometers, Napier's bones, books of mathematical tables, slide rules, and other manual computing tools as tedious and error-prone2. From this need came the development of the early non-electronic computers.
From John V. Atanasoff's construction of the first digital electronic computer to the vacuum-tube machines of the 1950s, early computers were applied to a variety of tasks, ranging from decoding German messages during World War II to calculating presidential election returns in the 1940s. Although the TV networks did not trust the latter results, people were amazed at the capabilities of these machines4.
In 1971 the first microprocessor was patented by Gilbert Hyatt at Micro Computer4. Making use of small number-holding areas known as registers, microprocessors were designed to perform not just arithmetic but also logic operations5. Typical microprocessor functions include adding, subtracting, comparing two numbers, and fetching numbers from one area to another5. It was during this time that two of the largest microprocessor companies today, Intel and Advanced Micro Devices, developed their first microprocessors5.
Founded in 1968, Intel had one main objective: to make semiconductor memory more practical. In 1971 Intel released its first microprocessor, the 4004. Smaller than a thumbnail, it contained 2,300 transistors and could complete sixty thousand operations per second. After the 4004, Intel released the 8008, which was capable of twice the performance of its predecessor. International Business Machines (IBM) took notice of Intel's dedication to microprocessor development and chose Intel's 8088 chip as the CPU of its first personal computer.

In 1982 the acclaimed 286 chip finished development. This time, Intel's newest product contained 134,000 transistors and delivered three times the performance, not merely of its 4004 ancestor, but of its current competitors. Intel continued its commitment to microprocessor development when it introduced the 386 microprocessor in 1986, following with the 486 chip in 1989. The latter was fifty times faster than the 4004 chip and could match the performance of a powerful mainframe computer. The famous Pentium name was introduced in 1993: rated at five times the speed of the 486, it contained 3.3 million transistors and could process ninety million instructions per second.

In the mid-1990s Intel strayed from its tradition and, rather than introducing another chip model, added a multimedia enhancement to its line of products: MMX. This new technology was bundled with the Pentium line of microprocessors, making them run faster when handling multimedia applications. According to Intel, a computer with a Pentium MMX microprocessor runs a multimedia application up to sixty percent faster than a computer with a microprocessor of the same clock speed but without MMX22.
Throughout these years Intel’s dedication has been praised by many. The microprocessor company followed with the...