Cognitive development is the development of thought processes, including remembering, problem solving, and decision-making, from childhood through adolescence to adulthood. Historically, the cognitive development of children has been studied in a variety of ways. The oldest is through intelligence tests, such as the Stanford-Binet Intelligence Quotient (IQ) test. IQ scoring is based on the concept of "mental age," according to which the scores of a child of average intelligence match his or her age. IQ tests are widely used in the United States, but they have been criticized for defining intelligence too narrowly. In contrast to the emphasis that intelligence testing places on a child's native abilities, learning theory grew out of work by behaviorist researchers such as John Broadus Watson and B.F. Skinner, who argued that children are completely malleable. Learning theory focuses on the role of environmental factors in shaping the intelligence of children, especially on a child's ability to learn by having certain behaviors rewarded and others discouraged.
The most well-known and influential theory of cognitive development is that of Swiss psychologist Jean Piaget. He originally trained in biology and philosophy and considered himself a "genetic epistemologist," mainly interested in the biological influences on how we come to know. He believed that what distinguishes human beings from other animals is our capacity for "abstract symbolic reasoning." Piaget's theory, first published in 1952, grew out of decades of extensive observation of children, including his own, in their natural environments, as opposed to the laboratory experiments of the behaviorists. Although Piaget was interested in how children reacted to their environment, he proposed that knowledge is composed of schemas: basic units of knowledge used to organize past experiences and serve as a basis for understanding new ones. Schemas are continually being...