The Lucifer Effect
CHAPTER TWELVE

Investigating Social Dynamics: Power, Conformity, and Obedience
I believe that in all men's lives at certain periods, and in many men's lives at all periods between infancy and extreme old age, one of the most dominant elements is the desire to be inside the local Ring and the terror of being left outside. . . . Of all the passions the passion for the Inner Ring is most skilful in making a man who is not yet a very bad man do very bad things. —C. S. Lewis, "The Inner Ring" (1944)

Motives and needs that ordinarily serve us well can lead us astray when they are aroused, amplified, or manipulated by situational forces that we fail to recognize as potent. This is why evil is so pervasive. Its temptation is just a small turn away, a slight detour on the path of life, a blur in our sideview mirror, leading to disaster. In trying to understand the character transformations of the good young men in the Stanford Prison Experiment, I previously outlined a number of psychological processes that were pivotal in perverting their thoughts, feelings, perceptions, and actions. We saw how the basic need to belong, to associate with and be accepted by others, so central to community building and family bonding, was diverted in the SPE into conformity with newly emergent norms that enabled the guards to abuse the prisoners. We saw further that the basic motive for consistency between our private attitudes and public behavior allowed for dissonant commitments to be resolved and rationalized in violence against one's fellows.

I will argue that the most dramatic instances of directed behavior change and "mind control" are not the consequence of exotic forms of influence, such as hypnosis, psychotropic drugs, or "brainwashing," but rather the systematic manipulation of the most mundane aspects of human nature over time in confining settings.

It is in this sense, I believe, that the English scholar C. S. Lewis proposed that a powerful force in transforming human behavior, pushing people across the boundary between good and evil, comes from the basic desire to be "in" and not "out." If we think of social power as arrayed in a set of concentric circles, from the most powerful central or inner ring moving outward to the least socially significant outer ring, we can appreciate his focus on the centripetal pull of that central circle. Lewis's "Inner Ring" is the elusive Camelot of acceptance into some special group, some privileged association, that confers instant status and enhanced identity. Its lure for most of us is obvious—who does not want to be a member of the "in-group"? Who does not want to know that she or he has been tried and found worthy of inclusion in, of ascendance into, a new, rarefied realm of social acceptability? Peer pressure has been identified as one social force that makes people, especially adolescents, do strange things—anything—to be accepted. However, the quest for the Inner Ring is nurtured from within. There is no peer-pressure power without that push from self-pressure for Them to want You. It makes people willing to suffer through painful, humiliating initiation rites in fraternities, cults, social clubs, or the military. It justifies for many suffering a lifelong existence climbing the corporate ladder. This motivational force is doubly energized by what Lewis called the "terror of being left outside." This fear of rejection when one wants acceptance can cripple initiative and negate personal autonomy. It can turn social animals into shy introverts. The imagined threat of being cast into the out-group can lead some people to do virtually anything to avoid that terrifying rejection. Authorities can command total obedience not through punishments or rewards but by means of the double-edged weapon: the lure of acceptance coupled with the threat of rejection. So strong is this human motive that even strangers are empowered when they promise us a special place at their table of shared secrets—"just between you and me."

A sordid example of these social dynamics came to light recently when a forty-year-old woman pleaded guilty to having sex with five high school boys and providing them and others with drugs and alcohol at weekly sex parties in her home for a full year. She told police that she had done it because she wanted to be a "cool mom." In her affidavit, this newly cool mom told investigators that she had never been popular with her classmates in high school, but orchestrating these parties enabled her to begin "feeling like one of the group." Sadly, she caught the wrong Inner Ring. Lewis goes on to describe the subtle process of initiation, the indoctrination of good people into a private Inner Ring that can have malevolent consequences, turning them into "scoundrels." I cite this passage at length because it is such an eloquent expression of how this basic human motive can be imperceptibly perverted by those with the power to admit or deny access to their Inner Ring. It will set the stage for our excursion into the experimental laboratories and field settings of social scientists who have investigated such phenomena in considerable depth.

To nine out of ten of you the choice which could lead to scoundrelism will come, when it does come, in no very dramatic colors. Obviously bad men, obviously threatening or bribing, will almost certainly not appear. Over a drink or a cup of coffee, disguised as a triviality and sandwiched between two jokes, from the lips of a man, or woman, whom you have recently been getting to know rather better and whom you hope to know better still—just at the moment when you are most anxious not to appear crude, or naive or a prig—the hint will come. It will be the hint of something which is not quite in accordance with the technical rules of fair play, something that the public, the ignorant, romantic public, would never understand. Something which even the outsiders in your own profession are apt to make a fuss about, but something, says your new friend, which "we"—and at the word "we" you try not to blush for mere pleasure—something "we always do." And you will be drawn in, if you are drawn in, not by desire for gain or ease, but simply because at that moment, when the cup was so near your lips, you cannot bear to be thrust back again into the cold outer world. It would be so terrible to see the other man's face—that genial, confidential, delightfully sophisticated face—turn suddenly cold and contemptuous, to know that you had been tried for the Inner Ring and rejected. And then, if you are drawn in, next week it will be something a little further from the rules, and next year something further still, but all in the jolliest, friendliest spirit. It may end in a crash, a scandal, and penal servitude; it may end in millions, a peerage and giving the prizes at your old school. But you will be a scoundrel.

RESEARCH REVELATIONS OF SITUATIONAL POWER

The Stanford Prison Experiment is a facet of the broad mosaic of research that reveals the power of social situations and the social construction of reality. We have seen how it focused on power relationships among individuals within an institutional setting. A variety of studies that preceded and followed it have illuminated many other aspects of human behavior that are shaped in unexpected ways by situational forces. Groups can get us to do things we ordinarily might not do on our own, but their influence is often indirect, simply modeling the normative behavior that the group wants us to imitate and practice. In contrast, authority influence is more often direct and without subtlety: "You do what I tell you to do." But because the demand is so open and bold-faced, one can decide to disobey and not follow the leader. To see what I mean, consider this question: To what extent would a good, ordinary person resist or comply with the demand of an authority figure that he harm, or even kill, an innocent stranger? This provocative question was put to experimental test in a controversial study on blind obedience to authority. It is a classic experiment about which you have probably heard because of its "shocking" effects, but there is much more of value embedded in its procedures that we will extract to aid in our quest to understand why good people can be induced to behave badly. We will review replications and extensions of this classic study and again ask the question posed of all such research: What is its external validity? What are the real-world parallels to the laboratory demonstration of authority power?

Beware: Self-Serving Biases May Be at Work

Before we get into the details of this research, I must warn you of a bias you likely possess that might shield you from drawing the right conclusions from all you are about to read. Most of us construct self-enhancing, self-serving, egocentric biases that make us feel special—never ordinary, and certainly "above average." Such cognitive biases serve a valuable function in boosting our self-esteem and protecting against life's hard knocks. They enable us to explain away failures, take credit for our successes, and disown responsibility for bad decisions, perceiving our subjective world through rainbow prisms. For example, research shows that 86 percent of Australians rate their job performance as "above average," and 90 percent of American business managers rate their performance as superior to that of their average peer. (Pity that poor average dude.) Yet these biases can be maladaptive as well by blinding us to our similarity to others and distancing us from the reality that people just like us behave badly in certain toxic situations. Such biases also mean that we don't take basic precautions to avoid the undesired consequences of our behavior, assuming it won't happen to us. So we take sexual risks, driving risks, gambling risks, health risks, and more. In the extreme version of these biases, most people believe that they are less vulnerable to these self-serving biases than other people, even after being taught about them.

That means when you read about the SPE or the many studies in this next section, you might well conclude that you would not do what the majority has done, that you would, of course, be the exception to the rule. That statistically unreasonable belief (since most of us share it) makes you even more vulnerable to situational forces precisely because you underestimate their power as you overestimate yours. You are convinced that you would be the good guard, the defiant prisoner, the resistor, the dissident, the nonconformist, and, most of all, the Hero. Would that it were so, but heroes are a rare breed—some of whom we will meet in our final chapter. So I invite you to suspend that bias for now and imagine that what the majority has done in these experiments is a fair base rate for you as well. At the very least, please consider that you can't be certain of whether or not you could be as readily seduced into doing what the average research participant has done in these studies—if you were in their shoes, under the same circumstances. I ask you to recall what Prisoner Clay-416, the sausage resister, said in his postexperimental interview with his tormenter, the "John Wayne" guard. When taunted with "What kind of guard would you have been if you were in my place?" he replied modestly, "I really don't know."

It is only through recognizing that we are all subject to the same dynamic forces in the human condition, that humility takes precedence over unfounded pride, that we can begin to acknowledge our vulnerability to situational forces. In this vein, recall John Donne's eloquent framing of our common interrelatedness and interdependence:

All mankind is of one author, and is one volume; when one man dies, one chapter is not torn out of the book, but translated into a better language; and every chapter must be so translated. . . . As therefore the bell that rings to a sermon, calls not upon the preacher only, but upon the congregation to come: so this bell calls us all. . . . No man is an island, entire of itself . . . any man's death diminishes me, because I am involved in mankind; and therefore never send to know for whom the bell tolls; it tolls for thee. (Meditation XVII)

Classic Research on Conforming to Group Norms

One of the earliest studies on conformity, in 1935, was designed by a social psychologist from Turkey, Muzafer Sherif. Sherif, a recent immigrant to the United States, believed that Americans in general tended to conform because their democracy emphasized mutually shared agreements. He devised an unusual means of demonstrating conformity of individuals to group standards in a novel setting. Male college students were individually ushered into a totally dark room in which there was a stationary spot of light. Sherif knew that without any frame of reference, such a light appears to move about erratically, an illusion called the "autokinetic effect." At first, each of these subjects was asked individually to judge the movement of the light. Their judgments varied widely; some saw movement of a few inches, while others reported that the spot moved many feet. Each person soon established a range within which most of his reports would fall. Next, he was put into a group with several others. They gave estimates that varied widely, but in each group a norm "crystallized," wherein a range of judgments and an average-norm judgment emerged. After many trials, the other participants left, and the individual, now alone, was asked again to make estimates of the movement of the light—the test of his conformity to the new norm established in that group. His judgments now fell in this new group-sanctioned range, "departing significantly from his earlier personal range." Sherif also used a confederate who was trained to give estimates that varied in their latitude from a small to a very large range. Sure enough, the naive subject's autokinetic experience mirrored the judgments of this devious confederate rather than sticking to his previously established personal perceptual standard.

Asch's Conformity Research: Getting into Line

Sherif's conformity effect was challenged in 1955 by another social psychologist, Solomon Asch, who believed that Americans were actually more independent than Sherif's work had suggested. Asch believed that Americans could act autonomously, even when faced with a majority who saw the world differently from them. The problem with Sherif's test situation, he argued, was that it was so ambiguous, without any meaningful frame of reference or personal standard. When challenged by the alternative perception of the group, the individual had no real commitment to his original estimates, so he just went along. Real conformity required the group to challenge the basic perception and beliefs of the individual—to say that X was Y, when clearly that was not true. Under those circumstances, Asch predicted, relatively few would conform; most would be staunchly resistant to this extreme group pressure that was so transparently wrong.

What actually happened to people confronted with a social reality that conflicted with their basic perceptions of the world? To find out, let me put you into the seat of a typical research participant. You are recruited for a study of visual perception that begins with judging the relative size of lines. You are shown cards with three lines of differing lengths and asked to state out loud which of the three is the same length as a comparison line on another card. One is shorter, one is longer, and one is exactly the same length as the comparison line. The task is a piece of cake for you. You make few mistakes, just like most others (less than 1 percent of the time). But you are not alone in this study; you are flanked by a bunch of peers, seven of them, and you are number eight. At first, your answers are like theirs—all right on. But then unusual things start to happen. On some trials, each of them in turn reports seeing the long line as the same length as the medium line, or the short line as the same as the medium one. (Unknown to you, the other seven are members of Asch's research team who have been instructed to give incorrect answers unanimously on specific "critical" trials.) When it is your turn, they all look at you as you look at the card with the three lines. You are clearly seeing something different than they are, but do you say so? Do you stick to your guns and say what you know is right, or do you go along with what everyone else says is right? You face that same group pressure on twelve of the total eighteen trials, where the group gives answers that are wrong; they are accurate on the other six trials interspersed into the mix.

If you are like most of the 123 actual research participants in Asch's study, you would yield to the group about 70 percent of the time on some of those critical, wrong-judgment trials. Thirty percent of the original subjects conformed on the majority of trials, and only a quarter of them were able to maintain their independence throughout the testing. Some reported being aware of the differences between what they saw and the group consensus, but they felt it was easier to go along with the others. For others the discrepancy created a conflict that was resolved by coming to believe that the group was right and their perception was wrong! All those who yielded underestimated how much they had conformed, recalling yielding much less to the group pressure than had actually been the case. They remained independent—in their minds but not in their actions.

Follow-up studies showed that, when pitted against just one person giving an incorrect judgment, a participant exhibits some uneasiness but maintains independence. However, with a majority of three people opposed to him, errors rose to 32 percent. On a more optimistic note, however, Asch found one powerful way to promote independence. By giving the subject a partner whose views were in line with his, the power of the majority was greatly diminished. Peer support decreased errors to one fourth of what they had been when there was no partner—and this resistance effect endured even after the partner left.

One of the valuable additions to our understanding of why people conform comes from research that highlights two of the basic mechanisms that contribute to group conformity. We conform first out of informational needs: other people often have ideas, views, perspectives, and knowledge that help us to better navigate our world, especially through foreign shores and new ports. The second mechanism involves normative needs: other people are more likely to accept us when we agree with them than when we disagree, so we yield to their view of the world, driven by a powerful need to belong, to replace differences with similarities.

Conformity and Independence Light Up the Brain Differently

New technology, not available in Asch's day, offers intriguing insights into the role of the brain in social conformity. When people conform, are they rationally deciding to go along with the group out of normative needs, or are they actually changing their perceptions and accepting the validity of the new though erroneous information provided by the group? A recent study utilized advanced brain-scanning technology to answer this question. Researchers can now peer into the active brain as a person engages in various tasks by using a scanning device that detects which specific brain regions are energized as they carry out various mental tasks. The process is known as functional magnetic resonance imaging (fMRI). Understanding what mental functions various brain regions control tells us what it means when they are activated by any given experimental task.

Here's how the study worked. Imagine that you are one of thirty-two volunteers recruited for a study of perception. You have to mentally rotate images of three-dimensional objects to determine if the objects are the same as or different from a standard object. In the waiting room, you meet four other volunteers, with whom you begin to bond by practicing games on laptop computers, taking photos of one another, and chatting. (They are really actors—"confederates," as they are called in psychology—who will soon be faking their answers on the test trials so that they are in agreement with one another but not with the correct responses
that you generate.) You are selected as the one to go into the scanner while the others outside look at the objects first as a group and then decide if they are the same or different. As in Asch's original experiment, the actors unanimously give wrong answers on some trials and correct answers on others, with occasional mixed group answers thrown in to make the test more believable. On each round, when it is your turn at bat, you are shown the answers given by the others. You have to decide if the objects are the same or different—as the group assessed them or as you saw them?

As in Asch's experiments, you (as the typical subject) would cave in to group pressure, on average giving the group's wrong answers 41 percent of the time. When you yield to the group's erroneous judgment, your conformity would be seen in the brain scan as changes in selected regions of the brain's cortex dedicated to vision and spatial awareness (specifically, activity increases in the right intraparietal sulcus). Surprisingly, there would be no changes in areas of the forebrain that deal with monitoring conflicts, planning, and other higher-order mental activities. On the other hand, if you make independent judgments that go against the group, your brain would light up in the areas that are associated with emotional salience (the right amygdala and right caudate nucleus regions). This means that resistance creates an emotional burden for those who maintain their independence—autonomy comes at a psychic cost. The lead author of this research, the neuroscientist Gregory Berns, concluded that "we like to think that seeing is believing, but the study's findings show that seeing is believing what the group tells you to believe." This means that other people's views, when crystallized into a group consensus, can actually affect how we perceive important aspects of the external world, thus calling into question the nature of truth itself. It is only by becoming aware of our vulnerability to social pressure that we can begin to build resistance to conformity when it is not in our best interest to yield to the mentality of the herd.

Minority Power to Impact the Majority

Juries can become "hung" when a dissenter gets support from at least one other person and together they challenge the dominant majority view. But can a small minority turn the majority around to create new norms using the same basic psychological principles that usually help to establish the majority view? A research team of French psychologists put that question to an experimental test. In a color-naming task, if two confederates among groups of six female students consistently called a blue light "green," almost a third of the naive majority subjects eventually followed their lead. However, the members of the majority did not give in to the consistent minority when they were gathered together. It was only later, when they were tested individually, that they responded as the minority had done, shifting their judgments by moving the boundary between blue and green toward the green of the color spectrum.

Researchers have also studied minority influence in the context of simulated jury deliberations, where a disagreeing minority prevents unanimous acceptance of the majority point of view. The minority group was never well liked, and its persuasiveness, when it occurred, worked only gradually, over time. The vocal minority was most influential when it had four qualities: it persisted in affirming a consistent position, appeared confident, avoided seeming rigid and dogmatic, and was skilled in social influence. Eventually, the power of the many may be undercut by the persuasion of the dedicated few.

How do these qualities of a dissident minority—especially its persistence—help to sway the majority? Majority decisions tend to be made without engaging the systematic thought and critical thinking skills of the individuals in the group. Given the force of the group's normative power to shape the opinions of followers who conform without thinking things through, majority views are often taken at face value. The persistent minority forces the others to process the relevant information more mindfully. Research shows that the decisions of a group as a whole are more thoughtful and creative when there is minority dissent than when it is absent.

If a minority can win adherents to their side even when they are wrong, there is hope for a minority with a valid cause. In society, the majority tends to be the defender of the status quo, while the force for innovation and change comes from the minority members or individuals either dissatisfied with the current system or able to visualize new and creative alternative ways of dealing with current problems. According to the French social theorist Serge Moscovici, the conflict between the entrenched majority view and the dissident minority perspective is an essential precondition of innovation and revolution that can lead to positive social change. An individual is constantly engaged in a two-way exchange with society—adapting to its norms, roles, and status prescriptions but also acting upon society to reshape those norms.

BLIND OBEDIENCE TO AUTHORITY: MILGRAM'S SHOCKING RESEARCH

"I was trying to think of a way to make Asch's conformity experiment more humanly significant. I was dissatisfied that the test of conformity was judgments about lines. I wondered whether groups could pressure a person into performing an act whose human import was more readily apparent; perhaps behaving aggressively toward another person, say by administering increasingly severe shocks to him. But to study the group effect . . . you'd have to know how the subject performed without any group pressure. At that instant, my thought shifted, zeroing in on this experimental control. Just how far would a person go under the experimenter's orders?"

These musings, from a former teaching and research assistant of Solomon Asch, started a remarkable series of studies by a social psychologist, Stanley Milgram, that have come to be known as investigations of "blind obedience to authority." His interest in the problem of obedience to authority came from deep personal concerns about how readily the Nazis had obediently killed Jews during the Holocaust. "[My] laboratory paradigm . . . gave scientific expression to a more general concern about authority, a concern forced upon members of my generation, in particular upon Jews such as myself, by the atrocities of World War II. . . . The impact of the Holocaust on my own psyche energized my interest in obedience and shaped the particular form in which it was examined."

I would like to re-create for you the situation faced by a typical volunteer in this research project, then go on to summarize the results, outline ten important lessons to be drawn from this research that can be generalized to other situations of behavioral transformations in everyday life, and then review extensions of this paradigm by providing a number of real-world parallels. (See the Notes for a description of my personal relationship with Stanley Milgram.)

Milgram's Obedience Paradigm

Imagine that you see the following advertisement in the Sunday newspaper and decide to apply. The original study involved only men, but women were used in a later study, so I invite all readers to participate in this imagined scenario.

Public Announcement

WE WILL PAY YOU $4.00 FOR ONE HOUR OF YOUR TIME

Persons Needed for a Study of Memory

• We will pay five hundred New Haven men to help us complete a scientific study of memory and learning. The study is being done at Yale University.

• Each person who participates will be paid $4.00 (plus 50c carfare) for approximately 1 hour's time. We need you for only one hour: there are no further obligations. You may choose the time you would like to come (evenings, weekdays, or weekends).

• No special training, education, or experience is needed. We want: factory workers, city employees, laborers, barbers, businessmen, clerks, professional people, telephone workers, construction workers, salespeople, white-collar workers, others.

• All persons must be between the ages of 20 and 50. High school and college students cannot be used.

• If you meet these qualifications, fill out the coupon below and mail it now to Professor Stanley Milgram, Department of Psychology, Yale University, New Haven. You will be notified later of the specific time and place of the study. We reserve the right to decline any application.

• You will be paid $4.00 (plus 50c carfare) as soon as you arrive at the laboratory.

TO: PROF. STANLEY MILGRAM, DEPARTMENT OF PSYCHOLOGY, YALE UNIVERSITY, NEW HAVEN, CONN. I want to take part in this study of memory and learning. I am between the ages of 20 and 50. I will be paid $4.00 (plus 50c carfare) if I participate.

A researcher whose serious demeanor and gray laboratory coat convey scientific importance greets you and another applicant on your arrival at a Yale University laboratory in Linsly-Chittenden Hall. You are here to help scientific psychology find ways to improve people's learning and memory through the use of punishment. He tells you why this new research may have important practical consequences. The task is straightforward: one of you will be the "teacher," who gives the "learner" a set of word pairings to memorize. During the test, the teacher gives each key word, and the learner must respond with the correct association. When right, the teacher gives a verbal reward, such as "Good" or "That's right." When wrong, the teacher is to press a lever on an impressive-looking shock apparatus that delivers an immediate shock to punish the error.

The shock generator has thirty switches, starting from a low level of 15 volts and increasing by 15 volts at each higher level. The experimenter tells you that every time the learner makes a mistake, you have to press the next higher voltage switch. The control panel indicates both the voltage level of each of the switches and a corresponding description of the level. The tenth level (150 volts) is "Strong Shock"; the 13th level (195 volts) is "Very Strong Shock"; the 17th level (255 volts) is "Intense Shock"; the 21st level (315 volts) is "Extremely Intense Shock"; the 25th level (375 volts) is "Danger, Severe Shock"; and at the 29th and 30th levels (435 and 450 volts) the control panel is simply marked with an ominous XXX (the pornography of ultimate pain and power).

You and the other volunteer draw straws to see who will play each role; you are to be the teacher, and the other volunteer will be the learner. (The drawing is rigged, and the other volunteer is a confederate of the experimenter who always plays the learner.) He is a mild-mannered, middle-aged man whom you help escort to the next chamber. "Okay, now we are going to set up the learner so he can get some punishment," the researcher tells you both. The learner's arms are
strapped down and an electrode is attached to his right wrist. The shock generator in the next room will deliver the shocks to the learner—if and when he makes any errors. The two of you communicate over the intercom, with the experimenter standing next to you. You get a sample shock of 45 volts, the third level, a slight tingly pain, so you now have a sense of what the shock levels mean. The experimenter then signals the start of your trial of the "memory improvement" study. Initially, your pupil does well, but soon he begins making errors, and you start pressing the shock switches. He complains that the shocks are starting to hurt. You look at the experimenter, who nods to continue. As the shock levels increase in intensity, so do the learner's screams, saying he does not think he wants to continue. You hesitate and question whether you should go on, but the experimenter insists that you have no choice but to do so. Now the learner begins complaining about his heart condition and you dissent, but the experimenter still insists that you continue. Errors galore; you plead with your pupil to concentrate to get the right associations, you don't want to hurt him with these very-high-level, intense shocks. But your concerns and motivational messages are to no avail. He gets the answers wrong again and again. As the shocks intensify, he shouts out, "I can't stand the pain, let me out of here!" Then he says to the experimenter, "You have no right to keep me here! Let me out!" Another level up, he screams, "I absolutely refuse to answer any more! Get me out of here! You can't hold me here! My heart's bothering me!" Obviously you want nothing more to do with this experiment. You tell the experimenter that you refuse to continue. You are not the kind of person who harms other people in this way. You want out. But the experimenter continues to insist that you go on. He reminds you of the contract, of your agreement to participate fully. Moreover, he claims responsibility for the consequences of your shocking actions. After you press the 300-volt switch, you read the next keyword, but the learner doesn't answer. "He's not responding," you tell the experimenter. You want him to go into the other room and check on the learner to see if he is all right. The experimenter is impassive; he is not going to check on the learner. Instead he tells you, "If the learner doesn't answer in a reasonable time, about five seconds, consider it wrong," since errors of omission must be punished in the same way as errors of commission—that is a rule. As you continue up to even more dangerous shock levels, there is no sound coming from your pupil's shock chamber. He may be unconscious or worse! You are really distressed and want to quit, but nothing you say works to get your exit from this unexpectedly distressing situation. You are told to follow the rules and keep posing the test items and shocking the errors. Now try to imagine fully what your participation as the teacher would be. I am sure you are saying, "No way would I ever go all the way!" Obviously, you
would have dissented, then disobeyed and just walked out. You would never sell out your morality for four bucks! But had you actually gone all the way to the last of the thirty shock levels, the experimenter would have insisted that you repeat that XXX switch two more times, for good measure! Now, that is really rubbing it in your face. Forget it, no sir, no way; you are out of there, right? So how far up the scale do you predict that you would go before exiting? How far would the average person from this small city go in this situation?

The Outcome Predicted by Expert Judges

Milgram described his experiment to a group of forty psychiatrists and then asked them to estimate the percentage of American citizens who would go to each of the thirty levels in the experiment. On average, they predicted that less than 1 percent would go all the way to the end, that only sadists would engage in such sadistic behavior, and that most people would drop out at the tenth level of 150 volts. They could not have been more wrong!

These experts on human behavior were totally wrong because, first, they ignored the situational determinants of behavior in the procedural description of the experiment. Second, their training in traditional psychiatry led them to rely too heavily on the dispositional perspective to understand unusual behavior and to disregard situational factors. They were guilty of making the fundamental attribution error (FAE)!

The Shocking Truth

In fact, in Milgram's experiment, two of every three (65 percent) of the volunteers went all the way up to the maximum shock level of 450 volts. The vast majority of people, the "teachers," shocked their "learner-victim" over and over again despite his increasingly desperate pleas to stop.

And now I invite you to venture another guess: What was the dropout rate after the shock level reached 330 volts—with only silence coming from the shock chamber, where the learner could reasonably be presumed to be unconscious? Who would go on at that point? Wouldn't every sensible person quit, drop out, refuse the experimenter's demands to go on shocking him?

Here is what one "teacher" reported about his reaction: "I didn't know what the hell was going on. I think, you know, maybe I'm killing this guy. I told the experimenter that I was not taking responsibility for going further. That's it." But when the experimenter reassured him that he would take the responsibility, the worried teacher obeyed and continued to the very end.

And almost everyone who got that far did the same as this man. How is that possible? If they got that far, why did they continue on to the bitter end? One reason for this startling level of obedience may be related to the teacher's not knowing how to exit from the situation, rather than just blind obedience. Most participants dissented from time to time, saying they did not want to go on, but the experimenter did not let them out, continually coming up with reasons why
they had to stay and prodding them to continue testing their suffering learner. Usually protests work and you can get out of unpleasant situations, but nothing you say affects this impervious experimenter, who insists that you must stay and continue to shock errors. You look at the shock panel and realize that the easiest exit lies at the end of the last shock lever. A few more lever presses is the fast way out, with no hassles from the experimenter and no further moans from the now-silent learner. Voilà! 450 volts is the easy way out—achieving your freedom without directly confronting the authority figure or having to reconcile the suffering you have already caused with this additional pain to the victim. It is a simple matter of up and then out.

Variations on an Obedience Theme

Over the course of a year, Milgram carried out nineteen different experiments, each one a different variation of the basic paradigm of experimenter/teacher/learner/memory testing/errors shocked. In each of these studies he varied one social psychological variable and observed its impact on the extent of obedience to the unjust authority's pressure to continue to shock the "learner-victim." In one study, he added women; in others he varied the physical proximity or remoteness of either the experimenter-teacher link or the teacher-learner link; had peers rebel or obey before the teacher had the chance to begin; and more.

In one set of experiments, Milgram wanted to show that his results were not due to the authority power of Yale University—which is what New Haven is all about. So he transplanted his laboratory to a run-down office building in downtown Bridgeport, Connecticut, and repeated the experiment, ostensibly as a project of a private research firm with no apparent connection to Yale. It made no difference; the participants fell under the same spell of this situational power.

The data clearly revealed the extreme pliability of human nature: almost everyone could be totally obedient or almost everyone could resist authority pressures. It all depended on the situational variables they experienced. Milgram was able to demonstrate that compliance rates could soar to over 90 percent of people continuing to the 450-volt maximum or be reduced to less than 10 percent—by introducing just one crucial variable into the compliance recipe.

Want maximum obedience? Make the subject a member of a "teaching team," in which the job of pulling the shock lever to punish the victim is given to another person (a confederate), while the subject assists with other parts of the procedure. Want people to resist authority pressures? Provide social models of peers who rebelled. Participants also refused to deliver the shocks if the learner said he wanted to be shocked; that's masochistic, and they are not sadists. They were also reluctant to give high levels of shock when the experimenter filled in as the learner. They were more likely to shock when the learner was remote than in proximity.

Across the other variations, with this diverse range of ordinary American citizens of widely varying ages and occupations and of both genders, it was possible to elicit low, medium, or high levels of compliant obedience with a flick of the situational switch—as if one were simply turning a "human nature dial" within their psyches. This large sample of a thousand ordinary citizens from such varied backgrounds makes the results of the Milgram obedience studies among the most generalizable in all the social sciences.

When you think of the long and gloomy history of man, you will find far more hideous crimes have been committed in the name of obedience than have been committed in the name of rebellion. —C. P. Snow, "Either-Or" (1961)

Ten Lessons from the Milgram Studies: Creating Evil Traps for Good People

Let's outline some of the procedures in this research paradigm that seduced many ordinary citizens to engage in this apparently harmful behavior. In doing so, I want to draw parallels to compliance strategies used by "influence professionals" in real-world settings, such as salespeople, cult and military recruiters, media advertisers, and others.

There are ten methods we can extract from Milgram's paradigm for this purpose:

1. Prearranging some form of contractual obligation, verbal or written, to control the individual's behavior in pseudolegal fashion. (In Milgram's experiment, this was done by publicly agreeing to accept the tasks and the procedures.)

2. Giving participants meaningful roles to play ("teacher," "learner") that carry with them previously learned positive values and automatically activate response scripts.

3. Presenting basic rules to be followed that seem to make sense before their actual use but can then be used arbitrarily and impersonally to justify mindless compliance. Also, systems control people by making their rules vague and changing them as necessary but insisting that "rules are rules" and thus must be followed (as the researcher in the lab coat did in Milgram's experiment or the SPE guards did to force prisoner Clay-416 to eat the sausages).

4. Altering the semantics of the act, the actor, and the action (from "hurting victims" to "helping the experimenter," punishing the former for the lofty goal of scientific discovery)—replacing unpleasant reality with desirable rhetoric, gilding the frame so that the real picture is disguised. (We can see the same semantic framing at work in advertising, where, for example, bad-tasting mouthwash is framed as good for you because it kills germs and tastes like medicine is expected to taste.)

5. Creating opportunities for the diffusion of responsibility or abdication of responsibility for negative outcomes; others will be responsible, or the actor won't be held liable. (In Milgram's experiment, the authority figure said, when questioned by any "teacher," that he would take responsibility for anything that happened to the "learner.")

6. Starting the path toward the ultimate evil act with a small, seemingly insignificant first step, the easy "foot in the door" that swings open subsequent greater compliance pressures, and leads down a slippery slope.
(In the obedience study, the initial shock was only a mild 15 volts.) This is also the operative principle in turning good kids into drug addicts, with that first little hit or sniff.

7. Having successively increasing steps on the pathway that are gradual, so that they are hardly noticeably different from one's most recent prior action. "Just a little bit more." (By increasing each level of aggression in gradual steps of only 15-volt increments, over the thirty switches, no new level of harm seemed like a noticeable difference from the prior level to Milgram's participants.)

8. Gradually changing the nature of the authority figure (the researcher, in Milgram's study) from initially "just" and reasonable to "unjust" and demanding, even irrational. This tactic elicits initial compliance and later confusion, since we expect consistency from authorities and friends. Not acknowledging that this transformation has occurred leads to mindless obedience (and it is part of many "date rape" scenarios and a reason why abused women stay with their abusing spouses).

9. Making the "exit costs" high and making the process of exiting difficult by allowing verbal dissent (which makes people feel better about themselves) while insisting on behavioral compliance.

10. Offering an ideology, or a big lie, to justify the use of any means to achieve the seemingly desirable, essential goal. (In Milgram's research this came in the form of providing an acceptable justification, or rationale, for engaging in the undesirable action, such as that science wants to help people improve their memory by judicious use of reward and punishment.) In social psychology experiments, this tactic is known as the "cover story" because it is a cover-up for the procedures that follow, which might be challenged because they do not make sense on their own. The real-world equivalent is known as an "ideology." Most nations rely on an ideology, typically "threats to national security," before going to war or suppressing dissident political opposition. When citizens fear that their national security is being threatened, they become willing to surrender their basic freedoms to a government that offers them that exchange. Erich Fromm's classic analysis in Escape from Freedom made us aware of this trade-off, which Hitler and other dictators have long used to gain and maintain power: namely, the claim that they will be able to provide security in exchange for citizens giving up their freedoms, which will give them the ability to control things better.

Such procedures are utilized in varied influence situations where those in authority want others to do their bidding but know that few would engage in the "end game" without first being properly prepared psychologically to do the "unthinkable." In the future, when you are in a compromising position where your compliance is at stake, thinking back to these stepping-stones to mindless obedience may enable you to step back and not go all the way down the path—their path. A good way to avoid crimes of obedience is to assert one's personal authority and always take full responsibility for one's actions.

Replications and Extensions of the Milgram Obedience Model

Because of its structural design and its detailed protocol, the basic Milgram obedience experiment encouraged replication by independent investigators in many countries. A recent comparative analysis was made of the rates of obedience in eight studies conducted in the United States and nine replications in European, African, and Asian countries. There were comparably high levels of compliance by research volunteers in these different studies and nations. The majority obedience effect of a mean 61 percent found in the U.S. replications was matched by the 66 percent obedience rate found across all the other national samples. The range of obedience went from a low of 31 percent to a high of 91 percent in the U.S. studies, and from a low of 28 percent (Australia) to a high of 88 percent (South Africa) in the cross-national replications. There was also stability of obedience over decades of time as well as over place. There was no association between when a study was done (between 1963 and 1985) and degree of obedience.

Obedience to a Powerful Legitimate Authority

In the original obedience studies, the subjects conferred authority status on the person conducting the experiment because he was in an institutional setting and was dressed and acted like a serious scientist, even though he was only a high school biology teacher paid to play that role. His power came from being perceived as a representative of an authority system. (In Milgram's Bridgeport replication described earlier, the absence of the prestigious institutional setting of Yale reduced the obedience rate to 47.5 percent, compared to 65 percent at Yale, although this drop was not statistically significant.)

Several later studies showed how powerful the obedience effect can be when legitimate authorities exercise their power within their power domains. When a college professor was the authority figure telling college student volunteers that their task was to train a puppy by conditioning its behavior using electric shocks, he elicited 75 percent obedience from them. In this experiment, both the "experimenter-teacher" and the "learner" were "authentic." That is, college students acted as the teacher, attempting to condition a cuddly little puppy,
the learner, in an electrified apparatus. The puppy was supposed to learn a task, and shocks were given when it failed to respond correctly in a given time interval. As in Milgram's experiments, they had to deliver a series of thirty graded shocks, up to 450 volts, in the training process. Each of the thirteen male and thirteen female subjects individually saw and heard the puppy squealing and jumping around the electrified grid as they pressed lever after lever. There was no doubt that they were hurting the puppy with each shock they administered. (Although the shock intensities were much lower than indicated by the voltage labels appearing on the shock box, they were still powerful enough to evoke clearly distressed reactions from the puppy with each successive press of the shock switches.)

As you might imagine, the students were clearly upset during the experiment. Some of the females cried, and the male students also expressed a lot of distress. Did they refuse to continue once they could see the suffering they were causing right before their eyes? For all too many, their personal distress did not lead to behavioral disobedience. About half of the males (54 percent) went all the way to 450 volts. The big surprise came from the women's high level of obedience. Despite their dissent and weeping, 100 percent of the female college students obeyed to the full extent possible in shocking the puppy as it tried to solve an insoluble task! A similar result was found in an unpublished study with adolescent high school girls. (The typical finding with human "victims," including Milgram's own findings, is that there are no male-female gender differences in obedience.)

Some critics of the obedience experiments tried to invalidate Milgram's findings by arguing that subjects quickly discover that the shocks are fake, and that is why they continue to give them to the very end.

This study, conducted back in 1972 (by psychologists Charles Sheridan and Richard King), removes any doubt that Milgram's high obedience rates could have resulted from subjects' disbelief that they were actually hurting the learner-victim. Sheridan and King showed that there was an obvious visual connection between a subject's obedience reactions and a puppy's pain. Of further interest is the finding that half of the males who disobeyed lied to their teacher in reporting that the puppy had learned the insoluble task, a deceptive form of disobedience. When students in a comparable college class were asked to predict how far an average woman would go on this task, they estimated 0 percent—a far cry from 100 percent. (However, this faulty low estimate is reminiscent of the 1 percent figure given by the psychiatrists who assessed the Milgram paradigm.) Again this underscores one of my central arguments: that it is difficult for people to appreciate fully the power of situational forces acting on individual behavior when those forces are viewed outside the behavioral context.

Physicians' Power over Nurses to Mistreat Patients

If the relationship between teachers and students is one of power-based authority, how much more so is that between physicians and nurses? How difficult is it, then, for a nurse to disobey an order from the powerful authority of the doctor—when she knows it is wrong? To find out, a team of doctors and nurses tested obedience in their authority system by determining whether nurses would follow or disobey an illegitimate request by an unknown physician in a real hospital setting.

Each of twenty-two nurses individually received a call from a staff doctor whom she had never met. He told her to administer a medication to a patient immediately, so that it would take effect by the time he arrived at the hospital; he would sign the drug order then. He ordered her to give his patient 20 milligrams of the drug "Astroten." The label on the container of Astroten indicated that 5 milligrams was the usual dose and warned that 10 milligrams was the maximum dose. His order doubled that maximum dose. The conflict created in the minds of each of these caregivers was whether to follow this order from an unfamiliar phone caller to administer an excessive dose of medicine or to follow standard medical practice, which rejects such unauthorized orders. When this dilemma was presented as a hypothetical scenario to a dozen nurses in that hospital, ten said they would refuse to obey. However, when other nurses were put on the hot seat, faced with the physician's imminent arrival (and possible anger at being disobeyed), they almost unanimously caved in and complied. All but one of the twenty-two nurses put to the real test started to pour the medication (actually a placebo) to administer to the patient—before the researcher stopped them from doing so. That solitary disobedient nurse should have been given a raise and a hero's medal.

This dramatic effect is far from isolated. Equally high levels of blind obedience to doctors' almighty authority showed up in a recent survey of a large sample of registered nurses. Nearly half (46 percent) of the nurses reported that they could recall a time when they had in fact "carried out a physician's order that you felt could have had harmful consequences to the patient." These compliant nurses attributed less responsibility to themselves than to the physician when they followed an inappropriate command. In addition, they indicated that the primary basis of physicians' social power is their "legitimate power," the right to provide overall care to the patient. They were just following what they construed as legitimate orders—but then the patient died. Thousands of hospitalized patients die needlessly each year due to a variety of staff mistakes, some of which, I assume, include such unquestioning obedience of nurses and tech aides to physicians' wrong orders.

Deadly Obedience to Authority

This potential for authority figures to exercise power over subordinates can have disastrous consequences in many domains of life. One such example is found in the dynamics of obedience in commercial airline cockpits, which have been shown to lead to many airline accidents. In a typical commercial airline cockpit, the captain is the central authority over a first officer and sometimes a flight engineer, and the might of that authority is enforced by organizational norms, the military background of most pilots, and flight rules that make the pilot directly responsible for operating the aircraft. Such authority can lead to flight errors when the crew feels forced to accept the "authority's definition of the situation," even when the authority is wrong.

An investigation of thirty-seven serious plane accidents for which there were sufficient data from voice recorders revealed that in 81 percent of these cases, the first officer did not properly monitor or challenge the captain when he had made errors. Using a larger sample of seventy-five plane accidents as the context for evaluating destructive obedience, the author of this study concludes, "If we assume that both monitoring and challenging errors are due to excessive obedience, we may conclude that excessive obedience may cause as many as 25% of all airplane accidents."

Administrative Obedience to Authority

In modern society, people in positions of authority rarely punish others with physical violence as in the Milgram paradigm. What is more typical is mediated violence, in which authorities pass along orders to underlings who carry them out, or in which the violence involves verbal abuse that undercuts the self-esteem and dignity of the powerless. Authorities often take actions that are punitive but whose consequences are not directly observable. For example, giving someone hostile feedback that will knowingly disrupt his performance and adversely affect his chances of getting a job qualifies as a form of such socially mediated violence. A team of Dutch researchers assessed the extension of authority-based obedience to such a situation in a series of ingenious experiments, comprising twenty-five separate studies with nearly 500 participants, conducted from 1982 to 1985 at Utrecht University in the Netherlands.

In their "administrative obedience paradigm"

the experimenter told the research participant, acting as administrator, to make a series of fifteen "stress remarks" to a job applicant (a trained accomplice) in the next room. Specifically, the subjects were instructed to administer a job selection test to the applicant—if he passed the test, he would get the job; if he failed, he would remain unemployed. They were also instructed to disturb and stress the applicant while giving him the test. These fifteen graded remarks were critical of his test performance and also denigrated his personality, such as "That was really stupid of you." As the participant-administrators delivered these ever-more-hostile remarks, they

Investigating Social Dynamics

279

"placed the applicant under such intense psychological strain that he did not perform satisfactorily and consequently failed to get the job." In addition, they were told by the researchers to continue despite any protests from the applicant. Any dissent by the participant-administrators was countered with up to four prods by the experimenter to continue the hostile remarks before they were finally permitted to stop if they were adamant. Finally, and most significantly, the subjects were informed that the ability to work under stress was not an essential job requirement, but the procedure had to be followed because it assisted the experimenter's research project, which was studying how stress affects test performance. Causing distress and hurting another person's job chances had no further use than the researcher's collection of some data. In the control condition, subjects could stop making the stress remarks at any point they chose. When asked to predict whether they would make all the stress remarks under these circumstances, more than 90 percent of a separate set of comparable Dutch respondents said they would not comply. Again, the "outsider's view" was way off base: fully 91 percent of the subjects obeyed the authoritative experiment to the very end of the line. This same degree of extreme obedience held up even when personnel officers were used as the subjects despite their professional code of ethics for dealing with clients. Similarly high obedience was found when subjects were sent advance information several weeks before their appearance at the laboratory so that they had time to reflect on the nature of their potentially hostile role. How might we generate disobedience in this setting? You can choose among several options: Have several peers rebel before the subject's turn, as in Milgram's study. Or notify the subject of his or her legal liability if the applicant-victim were harmed and sued the university. Or eliminate the authority pressure to go all the way, as in the control condition of this research—where no one fully obeyed.
