John Searle

In “Can Computers Think?” John Searle argues against the prevailing view in philosophy, psychology, and artificial intelligence, which emphasizes the analogies between the functioning of the human brain and the functioning of digital computers. (Searle, 372) He asks whether a digital computer, as defined, can think. Specifically, he asks whether instantiating or implementing the right computer program with the right inputs and outputs is sufficient for, or constitutive of, thinking, to which he answers no, since “computer programs are defined purely syntactically.” (Searle, 376) In this essay, I will argue that, according to Searle’s own definition of semantic understanding, computers do have at least a minimal amount of semantics. I will argue that Margaret Boden’s objections to Searle’s argument in “Escaping from the Chinese Room” are strong and that the internal symbols and procedures of a computer program “do embody minimal understanding.” (Boden, 387)

I will begin this essay by investigating Searle’s Chinese room thought experiment, which is meant to simulate the processes of a digital computer. I will detail how, according to Searle’s own multiple definitions of thinking, the person inside the Chinese language room is in fact thinking, citing arguments from Boden. I will conclude the essay by arguing that syntactical processes involve a certain amount of prior semantic understanding, and that instantiating or implementing the right computer program with the right inputs and outputs is sufficient for, or constitutive of, Searle’s definition of thinking.

To differentiate syntactical processes from semantic understanding, Searle uses a thought experiment. He asks the reader to imagine that computer programmers have developed a program that can simulate the understanding of Chinese. He then asks the reader to imagine being the processor: we are locked in a room with several baskets full of Chinese symbols. With no understanding of Chinese, we are given an English rule book for formally ordering the Chinese symbols. Chinese symbols, called questions, are passed into the room, with further instructions to process these symbols and send them out as answers. According to Searle, this manipulation of formal symbols creates the appearance of understanding Chinese, yet does not involve any understanding. Like the manipulator of the Chinese symbols, Searle argues, a computer program fails to interpret the meaning of the symbols and thus has no semantic content. (Searle, 374)
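The formal procedure Searle describes can be pictured as a simple lookup program. The following minimal sketch (my own illustration, not part of Searle’s or Boden’s texts; the rule entries are invented placeholders) shows how such a “room” would map question-shapes to answer-shapes without ever attaching meaning to them:

# Illustrative sketch of the Chinese room as purely syntactic symbol manipulation.
# The "rule book" is a lookup table from input symbol strings to output symbol
# strings; the entries below are hypothetical and stand in for shape-matching rules.

RULE_BOOK = {
    "symbol-A symbol-B": "symbol-C",        # hypothetical rule: this shape pattern -> that shape pattern
    "symbol-D": "symbol-E symbol-F",
}

def chinese_room(question: str) -> str:
    """Return an 'answer' by formal pattern matching alone."""
    # The operator (or program) never interprets the symbols; it only looks up
    # which shapes to pass back out of the room.
    return RULE_BOOK.get(question, "symbol-unknown")

if __name__ == "__main__":
    print(chinese_room("symbol-A symbol-B"))  # appears competent, yet attaches no meaning

On Searle’s view, nothing in this procedure involves understanding; on the reading I defend below, following Boden, even applying such rules presupposes some minimal semantic grasp of the rule book and of the task itself.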

Searle gives multiple definitions of what thinking is. Thinking is defined by semantic contents that refer to the external world, by the interpretation of meaning, and by intentionality. Thinking is semantic, and “minds are semantical, in the sense that they have more than a formal structure, they have content.” (Searle, 374) Searle defines semantic contents as one’s thoughts, beliefs, and desires. These are “about something, or they refer to something, or they concern states of affairs in the world; and they do that because their content directs them at these state of affairs in the world.” (Searle, 377) Semantic contents thus refer to external causal factors. Further, Searle argues that mental states “involve having an interpretation, or a meaning attached” to the formal symbols involved. (Searle, 374) Lastly, Searle defines thinking by intentionality, stating that “no purely formal model will ever be sufficient by itself for intentionality because the formal properties are not by themselves constitutive of intentionality.” (Text, 516) I argue that, according to these definitions given by Searle, computer programs do, at the very least, a minimal amount of thinking.

The first definition Searle assigns to thinking is contents that refer to, and direct themselves at, states of affairs in the world. Searle states that “even if my thoughts occur to me in strings...