More than 50 years after a controversial psychologist shocked the world with studies that revealed people’s willingness to harm others on order, a team of cognitive scientists has carried out an updated version of the iconic ‘Milgram experiments’.
Their findings may offer some explanation for Stanley Milgram's uncomfortable revelations: when following commands, they say, people genuinely feel less responsibility for their actions — whether they are told to do something evil or benign.
“If others can replicate this, then it is giving us a big message,” says neuroethicist Walter Sinnott-Armstrong of Duke University in Durham, North Carolina, who was not involved in the work. “It may be the beginning of an insight into why people can harm others if coerced: they don’t see it as their own action.”
The study may feed into a long-running legal debate about the balance of personal responsibility between someone acting under instruction and their instructor, says Patrick Haggard, a cognitive neuroscientist at University College London, who led the work, published on 18 February in Current Biology.
Milgram’s original experiments were motivated by the trial of the Nazi Adolf Eichmann, who famously argued that he was ‘just following orders’ when he sent Jews to their deaths. The new findings don’t legitimize harmful actions, Haggard emphasizes, but they do suggest that the ‘only obeying orders’ excuse betrays a deeper truth about how a person feels when acting under command.
Ordered to shock
In a series of experiments at Yale University in New Haven, Connecticut, in the 1960s, Milgram told his participants that a man was being trained to learn word pairs in a neighbouring room. The participants had to press a button to deliver an electric shock of escalating strength to the learner when he made an error; when they did so, they heard his cries of pain. In reality, the learner was an actor, and no shock was ever delivered. Milgram’s aim was to see how far people would go when they were ordered to step up the voltage.
Alarmingly, around two-thirds of participants routinely continued to step up the shocks, even after the learner was apparently rendered unconscious. But Milgram did not assess his participants’ subjective feelings as they were coerced into doing something unpleasant. And his experiments have been criticized for the deception that they involved — not just because participants may have been traumatized, but also because some may have guessed that the pain wasn’t real.
Modern teams have conducted partial and less ethically complicated replications of Milgram’s work. But Haggard and his colleagues wanted to find out what participants were feeling. They designed a study in which volunteers knowingly inflicted real pain on each other, and were completely aware of the experiment’s aims.
Because Milgram’s experiments were so controversial, Haggard says that he took “quite a deep breath before deciding to do the study”. But he says that the question of who bears personal responsibility is so important to the rule of law that he thought it was “worth trying to do some good experiments to get to the heart of the matter.”
Sense of agency
In his experiments, the volunteers (all were female, as were the experimenters, to avoid gender effects) were given £20 (US$29). In pairs, they sat facing each other across a table, with a keyboard between them. A participant designated the ‘agent’ could press one of two keys; one did nothing. But for some pairs, the other key would transfer 5p to the agent from the other participant, designated the ‘victim’; for others, the key would also deliver a painful but bearable electric shock to the victim’s arm. (Because people have different tolerances to pain, the level of the electric shock was determined for each individual before the experiment began.) In one experiment, an experimenter stood next to the agent and told her which key to press. In another, the experimenter looked away and gave the agent a free choice about which key to press.
To examine the participants’ ‘sense of agency’ — the unconscious feeling that they were in control of their own actions — Haggard and his colleagues designed the experiment so that pressing either key caused a tone to sound after a few hundred milliseconds, and both volunteers were asked to judge the length of this interval. Psychologists have established that people perceive the interval between an action and its outcome as shorter when they carry out an intentional action of their own free will, such as moving their arm, than when the action is passive, such as having their arm moved by someone else.
When they were ordered to press a key, the participants seemed to judge their action as more passive than when they had free choice — they perceived the time to the tone as longer.
In a separate experiment, volunteers followed similar protocols while electrodes on their heads recorded their neural activity through EEG (electroencephalography). When ordered to press a key, their EEG recordings were quieter — suggesting, says Haggard, that their brains were not processing the outcome of their action. Some participants later reported feeling reduced responsibility for their action.
Unexpectedly, giving the order to press the key was enough to cause the effects, even when the keystroke led to no physical or financial harm. “It seems like your sense of responsibility is reduced whenever someone orders you to do something — whatever it is they are telling you to do,” says Haggard.
The study might inform legal debate, but it also has wider relevance to other domains of society, says Sinnott-Armstrong. For example, companies that want to create — or avoid — a feeling of personal responsibility among their employees could take its lessons on board.
Actor Peter Sarsgaard portrays psychologist Stanley Milgram in the 2015 film Experimenter.
Behind the Shock Machine
The Untold Story of the Notorious Milgram Psychology Experiments
by Gina Perry
In the early 1960s, Stanley Milgram, a social psychologist at Yale, conducted a series of experiments that became famous. Unsuspecting Americans were recruited for what purportedly was an experiment in learning. A man who pretended to be a recruit himself was wired up to a phony machine that supposedly administered shocks. He was the "learner." In some versions of the experiment he was in an adjoining room.
The unsuspecting subject of the experiment, the "teacher," read lists of words that tested the learner's memory. Each time the learner got one wrong, which he intentionally did, the teacher was instructed by a man in a white lab coat to deliver a shock. With each wrong answer the voltage went up. From the other room came recorded and convincing protests from the learner — even though no shock was actually being administered.
The results of Milgram's experiment made news and contributed a dismaying piece of wisdom to the public at large: It was reported that almost two-thirds of the subjects were capable of delivering painful, possibly lethal shocks if told to do so. We are as obedient as Nazi functionaries.
Or are we? Gina Perry, a psychologist from Australia, has written Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. She has been retracing Milgram's steps, interviewing his subjects decades later.
"The thought of quitting never ... occurred to me," study participant Bill Menold told Perry in an Australian radio documentary. "Just to say: 'You know what? I'm walking out of here' — which I could have done. It was like being in a situation that you never thought you would be in, not really being able to think clearly."
In his experiments, Milgram was "looking to investigate what it was that had contributed to the brainwashing of American prisoners of war by the Chinese [in the Korean war]," Perry tells NPR's Robert Siegel.
On turning from an admirer of Milgram to a critic
"That was an unexpected outcome for me, really. I regarded Stanley Milgram as a misunderstood genius who'd been penalized in some ways for revealing something troubling and profound about human nature. By the end of my research I actually had quite a different view of the man and the research."
On the many variations of the experiment
"Over 700 people took part in the experiments. When the news of the experiment was first reported, and the shocking statistic that 65 percent of people went to maximum voltage on the shock machine was reported, very few people, I think, realized then and even realize today that that statistic applied to 26 of 40 people. Of those other 700-odd people, obedience rates varied enormously. In fact, there were variations of the experiment where no one obeyed."
On how Milgram's study coincided with the trial of Nazi officer Adolf Eichmann — and how the experiment reinforced what Hannah Arendt described as "the banality of evil"
"The Eichmann trial was a televised trial and it did reintroduce the whole idea of the Holocaust to a new American public. And Milgram very much, I think, believed that Hannah Arendt's view of Eichmann as a cog in a bureaucratic machine was something that was just as applicable to Americans in New Haven as it was to people in Germany."
On the ethics of working with human subjects
"Certainly for people in academia and scholars the ethical issues involved in Milgram's experiment have always been a hot issue. They were from the very beginning. And Milgram's experiment really ignited a debate particularly in social sciences about what was acceptable to put human subjects through."
Gina Perry is an Australian psychologist. She has previously written for The Age and The Australian. Chris Beck/Courtesy of The New Press
On conversations with the subjects, decades after the experiment
"[Bill Menold] doesn't sound resentful. I'd say he sounds thoughtful and he has reflected a lot on the experiment and the impact that it's had on him and what it meant at the time. I did interview someone else who had been disobedient in the experiment but still very much resented 50 years later that he'd never been de-hoaxed at the time and he found that really unacceptable."
On the problem that one of social psychology's most famous findings cannot be replicated
"I think it leaves social psychology in a difficult situation. ... it is such an iconic experiment. And I think it really leads to the question of why it is that we continue to refer to and believe in Milgram's results. I think the reason that Milgram's experiment is still so famous today is because in a way it's like a powerful parable. It's so widely known and so often quoted that it's taken on a life of its own. ... This experiment and this story about ourselves plays some role for us 50 years later."