The concept of mind is so difficult to define that over the centuries scholars have sought to represent it with an array of objects, usually the latest technological tools. This mode of scientific discovery is known as the tools-to-theories heuristic: the current tools of science are incorporated into a theory, and the theory is accepted because the tool is in widespread use. Today the most universally applicable tool in all the sciences is undoubtedly the computer, arguably the single most complex device ever created by humanity. The computer is the ultimate tool: it replaces inanimate objects such as typewriters, notepads, calculators, photo albums, televisions and books; runs complex models of attractors, road networks, the weather, the stock market and the universe; and replicates human bank tellers, porters, telephone operators, pilots, teachers, even doctors and artists. Analogies have been drawn between mind and computer, but the computer is much more than a metaphor. My contention is that the use, by philosophers, psychologists and cognitive scientists, of the computer as a metaphor for the mind is a misuse. The mind, nature's most complex survival mechanism, is nothing more or less than a highly sophisticated, finely tuned computer.

Emergence, layers of abstraction, and Turing machines
One of the first computers was conceived by Babbage out of the need to perform large arithmetic calculations, and it was based on the observation that "an assemblage of unskilled workers, each knowing very little about the large computation" (p. 133) could be replaced by machines. Babbage had thus noticed that simple rules produce emergent phenomena. Emergence is characteristic of complex systems, in which the whole is more than the sum of its parts. Each worker can be thought of as a neuron or an ant: on their own practically useless, but placed in a network, or a colony, they produce highly complex behaviour by following simple rules. The simple rules percolate upwards to create a global effect: answers to complicated arithmetic operations in the case of Babbage, gliders in the case of Conway's Game of Life, eusociality in the case of ants, organisms in the case of DNA and minds in the case of neurons.

Babbage's realisation was the first step in the creation of the computer. Building on it, von Neumann used computers as a representation of the human brain, much in the tools-to-theories fashion. He promoted the "binary digit" nature of the neuron and "consider[ed] living organisms as if they were purely digital automata" (p. 297), although he did question "how legitimate it is to transfer our experience with computing machines to natural systems" (p. 68). In keeping with von Neumann's caution, it is not necessary that the components of computers correspond directly to those of neurological and cognitive theories in order to claim that a mind is a computer. The stored-instruction computer, the architecture bearing von Neumann's name, clearly demonstrates that the hardware, the physical realisation, can be abstracted away from the software. Abstraction layers are levels of analysis of any complex system: starting from a sufficiently simple layer, each new layer adds complexity, organisation and an emergent phenomenon.
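The glider mentioned above is a concrete, checkable case of simple rules percolating upwards. What follows is a minimal sketch of Conway's two rules (survival with 2 or 3 live neighbours, birth with exactly 3); the function names and coordinate convention are mine, but the rules and the glider pattern are the standard ones. Nothing in the rules mentions movement, yet the five-cell pattern reappears one cell down and one cell right every four generations.

```python
from collections import Counter

def step(live):
    """Apply Life's two rules once to a set of live (row, col) cells."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # Birth: a dead cell with exactly 3 neighbours comes alive.
    # Survival: a live cell with 2 or 3 neighbours stays alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider: five live cells laid out as  .O.
#                                          ..O
#                                          OOO
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}

state = glider
for _ in range(4):
    state = step(state)

# After four steps the same shape reappears, shifted diagonally by (1, 1).
print(state == {(r + 1, c + 1) for r, c in glider})  # True
```

The "motion" of the glider exists only at the global level of description; each cell merely counts its neighbours.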
Abstraction layers are deployed by Marr, Chomsky, Pylyshyn, Rumelhart and McClelland, Newell and Anderson in their cognitive theories, as well as in computer science and in the natural sciences. For example, a desktop computer can be subdivided into the following simplified layers of abstraction: the electronics, with logic gates, adders and (de)multiplexers; the hardware, with HDDs, the CPU and RAM; and the software, with word processors, web browsers and music players. When discussing web browsers there is no need to care about logic gates or about which part of RAM is in use, because higher layers cannot be reduced to lower ones, although anything that occurs at one...
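The gates-to-adders step of that layered decomposition can be sketched directly. In this illustration (the function names and bit convention are my own, not from the essay), each layer is written purely in terms of the interface of the layer below it: the adder calls full adders, which call half adders, which call gates, and no layer knows how the layer beneath is realised.

```python
# Layer 1: logic gates, the "electronics" of the example.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

# Layer 2: a half adder, built only from gate calls.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)           # (sum bit, carry bit)

# Layer 3: a full adder, built only from half adders.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)                 # (sum bit, carry out)

# Layer 4: an n-bit ripple-carry adder, built only from full adders.
def add(xs, ys):
    """Add two equal-length little-endian bit lists."""
    out, carry = [], 0
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 3 + 6 = 9, with bits little-endian: 3 = [1,1,0], 6 = [0,1,1].
print(add([1, 1, 0], [0, 1, 1]))  # [1, 0, 0, 1], i.e. 9
```

Swapping the gate layer for a different physical realisation (relays, transistors, neurons) would leave every higher layer untouched, which is the point of the stored-instruction architecture's hardware/software split.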