From Wikipedia, the free encyclopedia
For other uses, see Computer (disambiguation).
"Computer system" redirects here. For other uses, see Computer system (disambiguation). "Computer technology" redirects here. For the company, see Computer Technology Limited.
A computer is a general-purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. Since the sequence of operations can be readily changed, the computer can solve more than one kind of problem. Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations based on stored information. Peripheral devices allow information to be retrieved from an external source and the results of operations to be saved and retrieved.

The first electronic digital computers were developed between 1940 and 1945 in the United Kingdom and the United States. Originally they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). In that era, mechanical analog computers were used for military applications. Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries.

Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices, from MP3 players to fighter aircraft and from toys to industrial robots, are the most numerous.
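The division of labour described above can be illustrated with a minimal sketch: memory holds both the program and its data, an accumulator stands in for the arithmetic logic unit, and a program counter plays the role of the sequencing and control unit. The three-instruction set (ADD, JNZ, HALT) and the memory layout are purely hypothetical, chosen only to make the idea concrete.

```python
def run(memory):
    """Execute a tiny stored program held in `memory`.

    `memory` is a single list containing both instructions
    (op, arg) tuples - and plain data values, mirroring the
    stored-program idea that code and data share one store.
    """
    acc = 0  # accumulator: where the "ALU" keeps arithmetic results
    pc = 0   # program counter: the control unit's sequencing state
    while True:
        op, arg = memory[pc]
        if op == "ADD":        # ALU operation: add a value from memory
            acc += memory[arg]
        elif op == "JNZ":      # control: change the order of operations
            pc = arg if acc != 0 else pc + 1
            continue
        elif op == "HALT":     # stop and report the result
            return acc
        pc += 1

# Instructions at cells 0-2, data at cells 4-5.
program = [("ADD", 4), ("ADD", 5), ("HALT", 0), None, 2, 3]
print(run(program))  # adds the two data cells: 2 + 3
```

Because the program is just data in memory, replacing those tuples changes what the machine computes - the property that lets one computer solve more than one kind of problem.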
Contents
* 1 History of computing
  * 1.1 Limited-function early computers
  * 1.2 First general-purpose computers
  * 1.3 Stored-program architecture
  * 1.4 Semiconductors and microprocessors
* 2 Programs
  * 2.1 Stored program architecture
  * 2.2 Bugs
  * 2.3 Machine code
  * 2.4 Programming language
    * 2.4.1 Low-level languages
    * 2.4.2 Higher-level languages
  * 2.5 Program design
* 3 Components
  * 3.1 Control unit
  * 3.2 Arithmetic logic unit (ALU)
  * 3.3 Memory
  * 3.4 Input/output (I/O)
  * 3.5 Multitasking
  * 3.6 Multiprocessing
  * 3.7 Networking and the Internet
  * 3.8 Computer architecture paradigms
* 4 Misconceptions
  * 4.1 Required technology
* 5 Further topics
  * 5.1 Artificial intelligence
  * 5.2 Hardware
    * 5.2.1 History of computing hardware
    * 5.2.2 Other hardware topics
  * 5.3 Software
  * 5.4 Languages
  * 5.5 Professions and organizations
* 6 See also
* 7 Notes
* 8 References
* 9 External links
History of computing
[Image: The Jacquard loom, on display at the Museum of Science and Industry in Manchester, England, was one of the first programmable devices.]
Main article: History of computing hardware
The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.

Limited-function early computers
The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term. A few devices are nonetheless worth mentioning, such as some mechanical aids to computing, which were very successful and survived for centuries until the advent of...