Tuesday, April 3, 2007

McChurch - Sin in the Sanctuary

Bad Apples or Bad Barrels? Zimbardo on ‘The Lucifer Effect’

In the world of McChurch, Republicans are “good”; Democrats are “evil.” Americans are “good”; many foreigners are “evil.” Christians are “good”; Muslims are “evil.”

There is, in fact, enough sin to go around for everyone… The tendency to discover the evil in others stems from the desire to justify ourselves by comparison… We cannot seem to remember that it was only a short time ago when Aryans were “good,” and Jews were “evil.”

Stan Moody, author of Crisis in Evangelical Scholarship and McChurched: 300 Million Served and Still Hungry


It is rare when a social scientist actually embraces theologically loaded words like “good” or “evil.” Most prefer to speak in more muted terms of violence and aggression, or use the sanitized, judgment-free language of psychopathology — the language of disorders.

Not so, Philip Zimbardo.

“Psychologists rarely ask the big questions,” the eminent Stanford psychologist said, addressing a standing-room-only crowd gathered to hear his talk, “The Lucifer Effect: Understanding How Good People Turn Evil,” at the APS 18th Annual Convention. “We have all kinds of great techniques for answering small questions. We’ve never bothered to ask the big questions. It’s time we asked the big questions like the nature of evil.” In a young century already dominated by iconic images of evil, the photographs with which he opened his presentation were both familiar and hard to watch. “This,” he said, “is the ultimate evil of our time: The little shop of horrors, the dungeon, Tier 1A, the night shift, at Abu Ghraib.” The pictures, a few of which had become well known from media reports of the prison, showed Army reservist guards torturing and humiliating Iraqi prisoners — naked prisoners stacked in pyramids or crawling on the floor with leashes; a prisoner standing in a black hood with electrodes on his fingertips; naked terrified prisoners being threatened with attack dogs or having guns pointed at their genitals by hideously masked guards; and worse.

“Pretty horrible,” Zimbardo said, breaking the stunned silence in the room.

All of the photographs, he explained, were what he called “trophy photos” that had come from the guards’ digital cameras. Zimbardo had access to them because he had served as an expert witness for the defense of one of the guards who had been tried for the atrocities. Despite the natural repulsion it was easy to feel toward those guards, Zimbardo’s aim was to show how readily, given the right circumstances, almost any normal person can become an agent of evil.

Their accusers called them “bad apples” — a dispositional account that simply blames the individual for wrongdoing. But as psychologists, Zimbardo said, it is necessary to assume that the perpetrators of the abuses at Abu Ghraib and other prisons in Iraq “didn’t go in there with sadistic tendencies, this is not part of their whole lifestyle, they are not serial murderers and torturers.” Rather, they were transformed into perpetrators of evil by their situation, the “bad barrel” of war.

Known to everyone in the audience as the researcher who conducted the famous 1971 Stanford Prison Experiment, Zimbardo is probably the best-positioned psychologist in the world to deliver such a situational analysis of atrocity. “We imagine a line between good and evil,” he said, “and we like to believe that it’s impermeable. We are good on this side. The bad guys, the bad women, they are on that side, and the bad people never will become good, and the good never will become bad. I’ll say today that’s nonsense. Because that line is … permeable. Because sometimes, just like human cells, material flows in and out. And if it does, then it could allow some ordinary people like you to become perpetrators of evil.”

From Jekyll to Hyde

Beginning with the classic studies of diffusion of responsibility, Zimbardo walked his engrossed audience through a great tradition of 20th-century social-psychological research seemingly tailor-made for understanding the situation at Abu Ghraib.

Ask a classroom of students who would be willing to pull the trigger to execute a condemned traitor, and no one will raise a hand. Alter the conditions so that each shooter is part of a large firing squad in which there is only one real bullet, so that no one knows who fired the fatal shot, and resistance to committing the deed lessens. “If you can diffuse responsibility, so people don’t feel individually accountable, now they will do things that they ordinarily say ‘I would never do that.’” This basic psychological principle is just one ingredient in the potion that can turn good Dr. Jekylls into sadistic Mr. Hydes.

Many of the other ingredients were revealed in Stanley Milgram’s classic 1961 study of obedience to authority, in which over two-thirds of subjects in a study ostensibly about memory went all the way in delivering what they thought was a lethal shock to another person (actually an actor feigning agony) when ordered to do so by an authority figure in a lab coat. Subsequent studies by other researchers, in which male and female psychology students delivered actual nonfatal but painful shocks to a puppy (causing it to yelp and cry) after being led to believe they would receive a failing grade if they failed to condition it, further revealed how easily people’s scruples can evaporate when something even as minor as a grade is at stake.

From such studies, Zimbardo said, we can learn important principles about how to create obedience. He listed several, including the importance of a legitimate-sounding cover story (e.g., a memory study, or “national security”), a legitimate-seeming authority figure, and rules that are vague enough that they are hard to understand or remember. You also, he said, need a model of compliance that, ironically enough, allows room for dissent (“‘Yes, I can understand. Yeah, cry, go ahead and cry. Just keep pressing the button’”). Showing slides of the mass suicide/execution of 912 People’s Temple cult members in Guyana in 1978, Zimbardo added that it is also important to “make exiting difficult. This is one of the big things all cults do: They literally create a barrier to leaving [by saying] ‘If you exit, you’re going to end up mentally impaired.’ Literally a lot of people in practicing cults are there because they don’t know how to exit.”

Situations in which people are depersonalized are good breeding grounds for evil, Zimbardo said. Among the most disturbing of the Abu Ghraib photographs was one showing an Army reservist guard with his face painted like a hideous skeleton (modeled after the violent rock group “Insane Clown Posse”). “Can you imagine what it must have been like to be a prisoner, watching one of your guards look like this?” Zimbardo discussed a number of deindividuation studies in which disguises such as hoods made it easier to overcome moral barriers to hurting another person. He also cited anthropological research showing that warriors in cultures that donned masks or costumes before engaging in battle were significantly more likely to torture, mutilate, and kill their enemies than warriors in cultures that didn’t engage in self-disguise. “Masks have terrible power, they’re a medium of terror. And of course the first terrorists in the United States were the Ku Klux Klan.” Military uniforms, like disguises, are tools for deindividuating a person. And depersonalizing the enemy, if only through linguistic labels, is the flip side of the coin: He cited a study by Albert Bandura in which students delivered much higher electric shocks to another group of participants merely because they had overheard that the students from the other college seemed like “animals.”

Zimbardo’s famous Prison Study exemplifies all of the above principles and how they can be used to create evil. After 24 normal, psychologically healthy college students were randomly assigned to the roles of prisoner and guard and each group was given suitably depersonalizing attire (the guards wore reflective sunglasses, for example, an idea Zimbardo said he got from the movie Cool Hand Luke), the students very quickly began to lose their everyday personalities and fulfill their assigned roles. Guards quickly began giving prisoners humiliating menial tasks, then forced them to strip naked and subjected them to sexual degradation. “Within 36 hours, the first normal, healthy student prisoner had a breakdown. … We released a prisoner each day for the next five days, until we ended the experiment at six days, because it was out of control. There was no way to control the guards.”

Ordinary People, Extraordinary Conditions

The parallels between the Stanford Prison Experiment and Abu Ghraib are striking. “What we’ve done is substituted social psychology for Dr. Jekyll’s chemical — transforming good ordinary people into perpetrators of evil. So essentially we took the chemical out of Dr. Jekyll’s hand. It’s not necessary. You can do it using social psychology.”

Thus Zimbardo’s role as a witness for the defense of the reservist sergeant in charge of Tier 1A (where the infamous abuses were committed) made perfect sense. “This is Chip Frederick,” he said. “He is the one who got the idea for the iconic image of torture, the hooded man” — referring to the now-famous photograph of a hooded Iraqi prisoner — “He put electrodes on his fingers, he put him on a box, and said, ‘You get tired, you fall off, you get electrocuted.’ Imagine the terror. But what was Frederick like before going out into the desert to do that terrible stuff?”

Like any of the students in Zimbardo’s study, or any of the hundreds of participants in Milgram’s or other similar studies, Frederick is — and was, before his fateful tour of duty — a normal healthy person. He had a distinguished work and military record and a healthy family life. “I had the army’s permission to have a whole battery of psychological tests conducted by an assessment expert,” Zimbardo said. “I interviewed him at my home. Normal. No evidence of any psychopathology. No sadistic tendencies. His only negatives were obsessive about orderliness, neatness, discipline, personal appearance — all of which was absent at Abu Ghraib.” A series of slides of Frederick and his family taken both before and after his hellish experience in Iraq — including a charming photo of Zimbardo and Frederick embracing like old friends — drove his normality home.

According to Zimbardo, the inhuman conditions at the prison — which had been Saddam Hussein’s torture chamber before the war — created the situation necessary to effect a Jekyll/Hyde transformation in Frederick’s (and his fellow guards’) character. The guards were forced to work 12-hour shifts, seven nights a week, for 40 days without a break, in hellishly filthy conditions (without toilets or running water) and under constant enemy bombardment: a picture of hell. “Frederick was in charge of 1,000 prisoners, 12 reservist guards, 60 Iraqi police guards who were smuggling weapons and drugs to the Iraqi prisoners. They had no training. There was never supervision on anyone’s part. [Frederick] rarely left the prison. … Tier 1A became his total reference group, and we know what that means.”

The situationist explanation does not absolve Frederick and the other guards of responsibility, Zimbardo emphasized. “What happened at Abu Ghraib was terrible. By understanding the processes we don’t excuse it. These people are guilty.” But for Zimbardo, the military leadership that had implicitly condoned torture and averted its eyes from what was happening at Abu Ghraib prison deserved the bulk of the blame. He made clear that, in his analysis, it was the military command and the Bush Administration that created the evil system, the bad barrel, that corrupted once-good American soldiers.

The more than a thousand attendees gladly stayed well over the allotted hour to hear Zimbardo finish his presentation. “I want to end on a positive note,” he said, to relieved laughter. “Not all good people turn bad. People resist.” Turning around Hannah Arendt’s famous phrase “the banality of evil,” Zimbardo coined the term “the banality of heroism” to describe the soldier — another otherwise undistinguished, normal guy — who blew the whistle on what was happening at Abu Ghraib by turning over a CD of the soldiers’ trophy photographs to the authorities. “Heroic action by ordinary people … is more common than the few lifelong heroes,” he said.

Zimbardo’s address exemplified how social psychology — even the most depressing studies of human weakness — can actually be inspiring. “There will come a time in your life,” he said, “when … you have the power within you, as an ordinary person, as a person who is willing to take a decision, to blow the whistle, to take action, to go the other direction and do the heroic thing.” That decision is set against the decisions to perpetrate evil or to do nothing, which is the evil of inaction. Zimbardo concluded with a thought from Aleksandr Solzhenitsyn, the Russian writer imprisoned under Stalin: “The line between good and evil lies at the center of every human heart.” He added, “It is not an abstraction out there. It’s a decision you have to make every day in here.” With the last of Zimbardo’s 150 slides and three video clips came an extended standing ovation, rare among psychology audiences.
