Atul Gawande posted a series of tweets, based on findings in the Senate CIA Torture Report, about the significant role physicians and psychologists played in torture. He comments, “But the worst for me is to see the details of how doctors, psychologists, and others sworn to aid human beings made the torture possible.”
Agreed. Upon reading how these professionals used their knowledge to torture their fellow human beings, I felt disappointed, sad, and sick.
“How could those people sleep at night?” I exclaimed.
But I know how “those people” were able to sleep at night, perhaps even with pride that they were doing good work.
Three studies—all controversial, though still illustrative—provide hints as to how people who engage in “bad behavior” believe that their actions are noble.
One is the Stanford prison experiment.
Briefly, college students were randomly divided into two groups: one group assumed the role of “guards” and the other became “prisoners”. Though everyone knew that this was an experiment, the “guards” became increasingly cruel and sadistic over time. For example, they forced the “prisoners” to be naked; they also refused to empty the buckets that “prisoners” used as toilets. As Wikipedia comments, this experiment demonstrated the “impressionability and obedience of people when provided with a legitimizing ideology and social and institutional support”.
Another is the Milgram experiment.
In this experiment, an “experimenter” instructed subjects (called “teachers”) to push a button that reportedly delivered electric shocks to a “learner” whenever the “learner” answered a question incorrectly. The “teachers” could not see the “learner”, but could hear him, and the intensity of the shocks increased with each wrong answer. In reality, no shocks were delivered at all; the “learner” was an actor who would scream and pound on the wall as the supposed intensity rose.
Most of the “teachers” continued to deliver shocks even though they heard the distress of the “learner”. Wikipedia quotes Milgram’s conclusion: “The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.”
The third is the Rosenhan experiment.
Here, volunteers were instructed to try to get admitted into psychiatric hospitals by reporting that they were hearing vague auditory hallucinations. Once the volunteers were actually admitted to the hospitals, they were supposed to “act normally” and to state that they were now feeling fine and were not hearing voices.
Every volunteer was admitted, and every one received a psychiatric diagnosis (most often schizophrenia). None of them were released until they agreed with the psychiatrists’ assessments and plans: that they had a mental illness and should take antipsychotic medication.
All three experiments suggest the power of context in influencing human behavior. Most of the “guards”, “teachers”, and staff at the psychiatric hospitals did what they thought they were “supposed” to do. From an outsider’s perspective their behaviors were “wrong”. To the subjects, though, they were doing the “right” thing because that is what they were “supposed” to do.
I don’t know any of the physicians or psychologists who participated in the government-sanctioned (!) torture, though I suspect that most, if not all, of them believed that they were doing “the right thing”. That they were using their knowledge and power for good, and not for evil.
Many of us—myself included—would like to believe that we would never do something like help torture people, that we would never be one of “those people”. We want to believe that we would have the mental fortitude to exercise independent thought, stick to our values and morals, and speak up against injustice.
But with experiments and events like these, how can any of us be so sure that we wouldn’t bend to authority and get sucked into groupthink?