
Verified by Psychology Today


Aftershock and the Banality of Evil

"I was only obeying orders."

Source: ShortCutstv, used with permission

When Adolf Eichmann, one of the architects of the Holocaust, was brought to trial at Jerusalem’s District Court in April 1961, those in the courtroom and the millions watching on TV were in for a shock. They’d expected to see someone horrifying: a typical sadist or psychopath. What they saw instead was a small, hunched figure, rather fastidious, carefully taking notes. To philosopher Hannah Arendt, who was there for The New Yorker, he seemed "neither perverted nor sadistic," but "terribly and terrifyingly normal."

Arendt later developed her impressions of Eichmann into a concept she called the banality of evil. Eichmann had performed evil deeds but was not inherently evil. He was simply a bureaucrat of limited intelligence, blindly following the orders he had been given to the best of his ability, a man who "never realised what he was doing."

The outrage that followed this portrait of Eichmann shattered Arendt’s reputation. To her critics, the banality of evil was an absurd idea. Of course Eichmann knew what he was doing. He was evil because no normal person could ever do what he had done.

But in the same year as the Eichmann trial, a young Yale psychologist was preparing a series of experiments that would shock the world and seem to provide scientific evidence for the banality of evil.

Milgram at Yale

Almost everyone who’s taken a psychology course, and many who haven’t, will know something about Milgram’s frightening "obedience" experiments. The grainy black and white film of people apparently giving increasingly powerful electric shocks to a stranger because an "experimenter" told them to do it, was seen around the world and is still shown to psychology students today.

Participants were seated in front of an impressive-looking "shock machine" which had 30 switches going up in 15-volt intervals to a lethal 450 volts. When Milgram asked 40 psychiatrists how far they thought his participants would go, they told him not very far, and that only a psychopath would go all the way. But, as we know, in the original baseline condition, an incredible 65% of male participants continued shocking, past the "Danger: Severe Shock" warning, right up to the maximum 450 volts. When Milgram replicated the condition with female participants (in Experiment 8), he got the same result. And in more recent partial replications of Milgram’s methods, Jerry Burger (2009) and Dariusz Dolinski (2017) found broadly similar results.

If ordinary people, some of them professionals, were prepared to administer potentially lethal electric shocks in a psychology experiment, doesn’t this confirm the banality of evil?

Milgram thought so. He later explained his findings with what he called the agentic state. Confronted by a powerful authority, a person may move from an autonomous state of mind to an agentic state, where they come to see themselves as merely an "instrument" for carrying out another's wishes. They no longer see themselves as responsible for their actions and become blind to the consequences.

So there’s a powerful affinity between Milgram's agentic state and Arendt's notion of the banality of evil, a notion Milgram himself said "comes closer to the truth than one might dare imagine."

However, this "truth" has recently been questioned by some psychologists. One of them is Steve Reicher, Professor of Psychology at the University of St Andrews, who visited Yale to look again at Milgram’s documents and the many different versions of his experiment, some of them unpublished.

Professor Stephen Reicher
Source: Beyond Milgram: Obedience and Identity (2015) / Shortcutstv

Reicher at Yale

“The problem is not with Milgram’s findings,” says Steve Reicher, “but with how he explained them; the agentic state explanation doesn't stand up to scrutiny.”

What makes Milgram’s experiments so dramatic is that participants were not blind to the consequences of their actions. As the film of the sessions shows, they were clearly very concerned and were torn between two different voices — the learner pleading with them to stop and the experimenter telling them to continue.

In the original baseline condition, shown to students for decades, the voice of the experimenter won out most of the time, with two-thirds of participants continuing to give shocks all the way to 450 volts.

However, Milgram also conducted many less well-known variants of his experiment and got some very different results. For example, when the experimenter simply phoned in commands, or when another experimenter took over halfway through, obedience fell to 20%. And when there were two casually dressed experimenters who argued with each other and issued contradictory orders, obedience fell to zero. Contrary to the common interpretation, Milgram’s findings tell us as much about disobedience as obedience.

In the original baseline condition, everything had been scripted to encourage participants’ identification with the scientific status of the project. They were at world-famous Yale University, and the white-coated experimenter explained that this was an important project that would improve learning. The idea that this experiment was for science and learning was important in motivating obedience, and it was reinforced by the experimenter’s carefully scripted prompts, such as "The experiment requires that you continue."

However, when that scientific credibility was undermined in different ways, participants were more comfortable with refusing to continue and compliance fell dramatically.

“What determines obedience," Reicher argues, “is the extent to which people identify with the authority giving the order. We don’t obey others blindly. We obey when we think they represent something we believe in.”

The social identity approach gives us another way of understanding Milgram’s incredible findings and, by extension, people's obedience to tyranny in the real world.

It seems more than likely that many of those who, like Eichmann, obeyed orders to transport Jewish people to death camps, or put bombs in busy city centres, or tortured prisoners held without trial for information, or flew airliners into a tower block on a September morning, or believed they were giving someone powerful electric shocks in a Yale laboratory more than half a century ago, were neither evil psychopaths nor banally blind to the consequences of their actions. Rather, they did what they did because they believed it was right.

And that’s the really shocking thing about Milgram’s experiments.

Steven Taylor Ph.D., Psychology Today