Milgram experiment
Based on Wikipedia: Milgram experiment
The Day Ordinary People Became Torturers
Imagine walking into a laboratory at Yale University, one of the most prestigious institutions in the world. A scientist in a lab coat greets you warmly. You're told you'll be helping with a study about memory and learning. You'll be paid four dollars for your time—decent money in 1961. Everything feels legitimate, professional, safe.
Within an hour, you'll be administering what you believe are potentially lethal electric shocks to a screaming stranger.
And you probably won't stop.
This was the Milgram experiment, and its results shattered everything psychologists thought they knew about human nature. Before the study began, experts predicted that perhaps one in a thousand participants might go all the way to the maximum voltage. The actual number? Sixty-five percent. Nearly two-thirds of ordinary Americans were willing to deliver what they believed were 450-volt shocks to a person begging them to stop.
The Shadow of Adolf Eichmann
Stanley Milgram didn't design his experiment in a vacuum. He launched it in July 1961, just three months after one of the most watched trials in history began in Jerusalem.
Adolf Eichmann sat in a bulletproof glass booth, a balding, bespectacled bureaucrat who looked more like an accountant than a monster. He had been one of the chief architects of the Holocaust, responsible for the logistics of transporting millions of Jews to death camps. His defense was chillingly simple: he was just following orders.
The world struggled to understand. How could ordinary people participate in genocide? Were the Germans somehow uniquely evil? Was there something in the German character that made them more susceptible to authoritarian commands?
Milgram, whose own family had roots in Eastern European Jewish communities devastated by the Holocaust, wanted answers. He originally planned to test American subjects as a control group before studying Germans, who he expected would prove far more obedient. He never needed to run the German version. The Americans were obedient enough to make his point.
The Elegant Cruelty of the Design
The genius of Milgram's experiment lay in its theatrical precision. Every detail was choreographed to create a specific psychological trap.
Two people arrived for each session—the actual subject and a man named "Mr. Wallace," a mild-mannered, likeable accountant in his late forties. They drew slips of paper to determine who would be the "teacher" and who would be the "learner." The draw was rigged. Both slips said "teacher." Mr. Wallace, who was actually an actor working with Milgram, always claimed to have drawn "learner."
The subject watched as Mr. Wallace was strapped into a chair and electrodes were attached to his wrists. The experimenter explained, with clinical detachment, that the straps would prevent "excessive movement" during the shocks. Mr. Wallace mentioned, almost casually, that he had a heart condition. The experimenter acknowledged this but showed no concern.
Then came the hook that made everything feel real: the teacher received an actual 45-volt sample shock. It stung. It was unpleasant. It was enough to make the threat behind those electrode wires painfully concrete.
The Machinery of Compliance
The shock generator was a masterpiece of intimidation. It featured thirty switches arranged in a horizontal line, ranging from 15 volts to 450 volts. Each switch was labeled not just with a number but with increasingly alarming descriptions: "Slight Shock," then "Moderate Shock," then "Strong Shock," climbing through "Intense Shock" and "Extreme Intensity Shock" until reaching "Danger: Severe Shock." The final two switches bore only an ominous "XXX."
The task seemed simple enough. The teacher would read pairs of words—"blue/box," "nice/day," "wild/duck." The learner had to remember which words went together. Wrong answers earned shocks, starting at 15 volts and increasing by 15 volts each time.
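The escalation schedule above is simple arithmetic: thirty switches climbing in 15-volt steps. As a throwaway sketch (illustrative only, not anything from Milgram's materials), the schedule looks like this:

```python
# Illustrative sketch of the shock schedule described above:
# 30 switches, 15-volt increments, topping out at 450 volts.
voltages = [15 * n for n in range(1, 31)]

assert len(voltages) == 30
assert voltages[0] == 15 and voltages[-1] == 450

def voltage_for_error(k: int) -> int:
    """Voltage delivered for the k-th wrong answer (1-indexed)."""
    return 15 * k

print(voltage_for_error(10))  # 150 -- the level at which the learner demanded release
```

The tenth wrong answer already puts the teacher at 150 volts, the point where Mr. Wallace demanded to be let out; twenty errors reach the 300-volt wall-pounding.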
Of course, Mr. Wallace made plenty of mistakes. That was the script.
At first, there was silence from the other room. Then grunts. At 75 volts, audible discomfort. At 120 volts, loud complaints that the shocks were becoming painful. At 150 volts, Mr. Wallace demanded to be released from the experiment.
At 300 volts, he pounded on the wall and screamed.
After 330 volts, silence. Complete, utter silence. The learner stopped responding entirely. Was he unconscious? Dead?
The experimenter, calm as ever, instructed the teacher to treat no response as a wrong answer. The shocks should continue.
The Four Prods
When subjects hesitated—and they all hesitated at some point—the experimenter had a carefully scripted series of verbal prods, deployed in sequence:
First: "Please continue," or "Please go on."
If that failed: "The experiment requires that you continue."
Then: "It is absolutely essential that you continue."
And finally, the words that trapped so many: "You have no other choice. You must go on."
That last prod is fascinating because it was objectively false. Every subject had a choice. They could stand up and walk out. No one was physically restrained. No one faced any real consequences for quitting. Yet those words—"you have no other choice"—proved remarkably effective at overriding people's own moral judgment.
If subjects worried about permanent harm to the learner, the experimenter had a ready response: "Although the shocks may be painful, there is no permanent tissue damage, so please go on." If they pointed out that the learner wanted to stop, the experimenter replied: "Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly."
The Agony of Obedience
It would be comforting to imagine that those who obeyed did so calmly, perhaps even sadistically. The opposite was true.
Subjects sweated profusely. Their hands trembled. They bit their lips until they bled. They dug fingernails into their palms. Some developed nervous tics or uncontrollable laughing fits—not from amusement, but from psychological distress so acute their bodies didn't know how else to respond. Three subjects had full-blown, uncontrollable seizures; one was so violently convulsive that the session had to be halted.
They begged the experimenter to check on the learner. They questioned whether the experiment should continue. They expressed genuine distress at what they were being asked to do.
And then most of them continued doing it.
Every single participant in Milgram's original study went to at least 300 volts—the point where the learner was pounding on the wall in agony and demanding release. Every single one. The 35% who eventually defied the experimenter did so only after already administering what they believed were extremely dangerous shocks.
What the Experts Got Wrong
Before running his experiment, Milgram surveyed people to predict the results. He asked senior psychology students at Yale to estimate how far 100 hypothetical subjects would go. Their predictions clustered around the idea that almost nobody—perhaps one or two people out of a hundred—would go all the way to 450 volts. The average prediction was that only 1.2% would deliver the maximum shock.
He polled forty psychiatrists. They were even more confident in human virtue. By the 300-volt shock, they predicted, only 3.73% of subjects would still be pressing the button. As for 450 volts? They estimated that roughly one-tenth of one percent—one person in a thousand—might go that far.
The actual figure, 65%, was 650 times the psychiatrists' estimate.
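That factor is plain arithmetic on the figures quoted above, and worth checking (a trivial sketch, using only numbers from the text):

```python
# Figures from the text above: psychiatrists predicted roughly 0.1% would
# reach 450 volts; the observed obedience rate was 65%.
predicted = 0.001   # one person in a thousand
actual = 0.65       # sixty-five percent of subjects

ratio = actual / predicted
print(ratio)  # 650.0 -- the factor by which the experts underestimated obedience
```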
This gap between expectation and reality reveals something crucial about why Milgram's findings disturb us so deeply. We don't just misunderstand other people; we profoundly misunderstand ourselves. We imagine we would be the ones to resist. We believe moral clarity would guide us. We're almost certainly wrong.
The Geography of Conscience
Milgram didn't stop with his initial findings. He ran variation after variation, probing the boundaries of obedience like a scientist mapping an unknown continent.
Location mattered. When he moved the experiment from Yale's ivy-covered legitimacy to a modest office in Bridgeport, Connecticut, obedience dropped from 65% to 47.5%. The authority of the institution conferred authority on the instructions.
Proximity mattered enormously. In the original setup, teacher and learner were in separate rooms. When Milgram put them in the same room, obedience fell to 40%. When subjects had to physically hold the learner's hand on a shock plate to deliver the punishment, only 30% went all the way. We find it easier to harm people we cannot see.
This particular finding echoes through modern warfare. It's easier to drop a bomb from 30,000 feet than to strangle someone with your hands. It's easier to approve a drone strike from a control room in Nevada than to pull a trigger while looking into someone's eyes. Distance is moral anesthesia.
The Question of the Holocaust
Milgram himself drew explicit connections between his laboratory and the death camps. In his 1974 book, Obedience to Authority, he argued that "a common psychological process is centrally involved in both events."
Not everyone agrees.
James Waller, a scholar of genocide studies, has pointed out significant differences. Milgram's subjects were explicitly told no permanent damage would occur—Holocaust perpetrators knew exactly what they were doing. The laboratory subjects had no personal animosity toward their victims—Nazi killers were often motivated by years of antisemitic indoctrination. The experiment lasted an hour—the Holocaust unfolded over years, providing ample time for reflection and moral reckoning.
Perhaps most importantly, many Holocaust perpetrators were not reluctant participants following orders against their will. They were enthusiastic volunteers, creative problem-solvers who developed new methods of killing, people who took photographs of their atrocities as souvenirs.
The Milgram experiment may explain how ordinary bureaucrats processed paperwork that sent people to their deaths. It may explain how train conductors kept the schedules running on time. But it probably doesn't fully explain how ordinary men became willing executioners.
The Ethics of Studying Evil
Almost immediately after Milgram published his findings, the criticism began. Psychologist Diana Baumrind published a scathing response in the American Psychologist in 1964, arguing that Milgram had traumatized his subjects. Even though they'd signed consent forms, they hadn't truly understood what they were agreeing to. When participants showed obvious signs of severe distress—trembling, sweating, having seizures—the experimenter should have stopped.
Milgram defended himself vigorously. He surveyed his former subjects and found that 84% said they were "glad" or "very glad" to have participated. Only 1.3% regretted the experience. Many wrote letters expressing gratitude for the insight they'd gained into their own psychology.
One letter stands out. Written six years after the experiment, at the height of the Vietnam War, a former subject explained how the experience had shaped his moral development:
While I was a subject in 1964, though I believed that I was hurting someone, I was totally unaware of why I was doing so. Few people ever realize when they are acting according to their own beliefs and when they are meekly submitting to authority... I am fully prepared to go to jail if I am not granted Conscientious Objector status. Indeed, it is the only course I could take to be faithful to what I believe.
For this man, at least, the experiment was a vaccination against blind obedience. It showed him his own vulnerability so that he could guard against it.
Others weren't so fortunate. Critic Gina Perry, who tracked down many original participants decades later, found that some had never fully recovered from the knowledge of what they'd been willing to do. The debriefing process, she argued, was inadequate. Many left the laboratory without fully understanding that the shocks had been fake and the learner unharmed.
The Replication Question
Scientists have repeated versions of Milgram's experiment around the world for more than sixty years. Thomas Blass of the University of Maryland, Baltimore County compiled a meta-analysis of these studies and found something remarkable: the results have held up with striking consistency.
The percentage of subjects willing to go to the maximum voltage ranged from 28% to 91% depending on the specific experimental conditions. But the average was essentially unchanged from Milgram's original findings—about 61% in American studies, 66% in studies conducted elsewhere. Time hasn't made us less obedient. Neither has knowing about Milgram's findings.
There's no evidence that people today would behave differently than people did in 1961. The potential for obedient cruelty appears to be a stable feature of human psychology, as consistent as our visual perception or our capacity for language.
What We Refuse to See
Perhaps the most troubling detail in Milgram's notes is what subjects did not do.
Even those who defied the experimenter and refused to continue—the heroic 35%—behaved in ways that should give us pause. Not one of them insisted that the experiment be terminated entirely. Not one of them went to check on the learner after they stopped. They simply declined to participate further and sat quietly waiting for further instructions.
They had just witnessed what they believed was a medical emergency. A man with a heart condition had stopped responding after receiving what seemed like severe electric shocks. And they sat there.
This suggests that even defiance has limits. Even those who found the courage to say "no" still deferred to the authority of the situation. They had freed themselves from the obligation to hurt someone, but they hadn't freed themselves from the laboratory's frame of reference. They were still subjects, waiting to be dismissed.
The Agentic State
Milgram developed a concept to explain what he observed. He called it the "agentic state"—a mental shift that occurs when a person stops seeing themselves as an autonomous moral actor and starts seeing themselves as an agent carrying out another's will.
In the agentic state, responsibility floats upward. The person pulling the lever isn't responsible; they're just following orders. The person giving the orders isn't directly responsible; they're not the one pulling the lever. Everyone can tell themselves that the moral weight belongs to someone else.
You see this everywhere once you start looking. "I'm just doing my job." "I don't make the rules." "That's above my pay grade." These phrases are the verbal signatures of the agentic state. They're how we absolve ourselves while participating in systems we might otherwise question.
The Uncomfortable Conclusion
Milgram's experiment forces us to confront a possibility that most of us would rather not consider: that the capacity for cruelty isn't something that belongs to other people—to monsters, to psychopaths, to members of strange foreign cultures. It belongs to us.
As Milgram wrote in his summary of the findings:
Ordinary people, simply doing their jobs, and without any particular hostility on their part, can become agents in a terrible destructive process. Moreover, even when the destructive effects of their work become patently clear, and they are asked to carry out actions incompatible with fundamental standards of morality, relatively few people have the resources needed to resist authority.
The phrase "relatively few people have the resources needed to resist authority" deserves careful attention. Milgram isn't saying that resistance is impossible. Thirty-five percent of his subjects did eventually refuse to continue. But he's suggesting that such resistance requires something—resources, preparation, perhaps a kind of moral muscle that must be developed in advance.
You probably won't develop the ability to resist unjust authority at the moment you need it most. That moment is too charged, too confusing, too laden with social pressure. The time to think about these questions is now, in the calm before any storm—to examine the conditions under which you might find yourself complicit in something terrible, and to decide in advance what you will and won't do.
Looking in the Mirror
The original shock generator from Milgram's experiments still exists. It sits in the Archives of the History of American Psychology at the University of Akron, a relic of one of the most disturbing investigations into human nature ever conducted.
It's a physical object, with switches and dials and labels. It looks almost quaint now—a piece of mid-century laboratory equipment, the kind of thing you might see in a museum of science history.
But the psychological machinery it revealed isn't in any museum. It's operating right now, in every institution where people follow orders without questioning their purpose. In every bureaucracy where responsibility is diffused until no one feels accountable. In every situation where the authority of expertise or position overrides individual moral judgment.
The shock generator was always a prop. The real machinery was inside the subjects all along.
It's inside you, too.