It was once believed that infants lacked the ability to think or form complex ideas and remained without cognition until they learned language. It is now known that babies are aware of their surroundings and interested in exploration from the time they are born. From birth, babies begin to actively learn. They gather, sort, and process information from around them, using the data to develop perception and thinking skills.
Cognitive development refers to how a person perceives, thinks, and gains understanding of his or her world through the interaction of genetic and learned factors. Among the areas of cognitive development are information processing, intelligence, reasoning, language development, and memory.
Historically, the cognitive development of children has been studied in a variety of ways. The oldest is through intelligence tests, such as the widely used Stanford-Binet Intelligence Quotient (IQ) test, first adapted for use in the United States by psychologist Lewis Terman (1877–1956) in 1916 from a French model pioneered in 1905. IQ scoring is based on the concept of "mental age," according to which the scores of a child of average intelligence match his or her age, while a gifted child's performance is comparable to that of an older child, and a slow learner's scores are similar to those of a younger child. IQ tests are widely used in the United States, but they have come under increasing criticism for defining intelligence too narrowly and for being biased with regard to race and gender.
In contrast to the emphasis placed on a child's native abilities by intelligence testing, learning theory grew out of work by behaviorist researchers such as John Watson (1878–1958) and B. F. Skinner (1904–1990), who argued that children are completely malleable. Learning theory focuses on the role of environmental factors in shaping the intelligence of children, especially on a child's ability to learn by having certain behaviors rewarded and others discouraged.
Piaget's theory of cognitive development
The best-known and most influential theory of cognitive development is that of Swiss psychologist Jean Piaget (1896–1980). Piaget's theory, first published in 1952, grew out of decades of extensive observation of children, including his own, in their natural environments, as opposed to the laboratory experiments of the behaviorists. Although Piaget was interested in how children reacted to their environment, he proposed a more active role for them than that suggested by learning theory. He envisioned a child's knowledge as composed of schemas, basic units of knowledge used to organize past experiences and serve as a basis for understanding new ones.
Schemas are continually being modified by two complementary processes that Piaget termed assimilation and accommodation. Assimilation refers to the process of taking in new information by incorporating it into an existing schema. In other words, people assimilate new experiences by relating them to things they already know. On the other hand, accommodation is what happens when the schema itself changes to accommodate new knowledge. According to Piaget, cognitive development involves an ongoing attempt to achieve a balance between assimilation and accommodation that he termed equilibration.
At the center of Piaget's theory is the principle that cognitive development occurs in a series of four distinct, universal stages, each characterized by increasingly sophisticated and abstract levels of thought. These stages always occur in the same order, and each builds on what was learned in the previous stage. They are as follows:
Sensorimotor stage (infancy): In this period, which has six sub-stages, intelligence is demonstrated through motor activity without the use of symbols. Knowledge of the world is limited, but developing, because it is based on physical interactions and experiences. Children acquire object permanence, the memory that objects continue to exist even when out of sight, at about seven months of age. Physical...