Just Following Orders

One of the earliest—and most famous—research studies demonstrating that otherwise good people can engage in harmful actions was conducted by Stanley Milgram at Yale University. Milgram was interested in whether people would inflict pain on others if ordered to do so by an authority figure. He designed the study specifically to understand the psychological processes that had buttressed the Nazi Holocaust, when millions of innocent victims were murdered by people who claimed that they were simply obeying orders. “Obedience, as a determinant of behavior, is of particular relevance to our time,” wrote Milgram. “Gas chambers were built, death camps were guarded; daily quotas of corpses were produced … These inhumane policies may have originated in the mind of a single person, but they could only be carried out on a massive scale if a very large number of persons obeyed orders.”[12]

In a series of experiments, Milgram brought men into his lab at Yale to participate in what was supposedly a study of memory and learning. (His original study was done with forty men; later variations included women.) On arrival, each participant was greeted by a person identified as the experimenter and introduced to another participant, who was really an accomplice planted by the researchers. The experimenter explained that the study was designed to test an important scientific question about the impact of punishment on the speed of learning.

Participants were told that one person would serve as the “teacher” and the other as the “learner,” but Milgram rigged it so that the study subject was always the teacher, while the accomplice was the learner. The learner would first be given a series of word pairs and later shown one of the words and tasked with picking out its pair from a list of four options. The teacher, who could communicate with the learner but not see him, was told to administer a shock if the learner gave a wrong answer. The experimenter was supposedly trying to gauge whether the shock helped or hindered the learner. (In reality, no shocks were given.)

The teacher was told to start by giving the learner the lowest level of shock (15 volts) and to increase the shock level each time the learner made a mistake.

At each shock level, the learner responded in a standard way. At the 75-volt level, he began to cry out in pain, and by 150 volts he asked to be let out of the experiment. He also began to claim that his heart was bothering him. If the teacher hesitated or turned to the experimenter in bewilderment asking if he could stop, he received one of four prompts that prodded him to continue: “Please continue,” “The experiment requires that you continue,” “It is absolutely essential that you continue,” or “You have no other choice but to continue.” The experimenter kept providing these prompts until the teacher refused to continue or reached the highest level (450 volts, which was marked “XXX dangerous”).

Much to Milgram’s surprise, the majority of the study participants—65 percent—were willing to give a person whom they believed to be an innocent participant the maximum level of electric shocks. Many people were dismayed by this extremely high rate of obedience, including the psychiatrists Milgram had consulted before the experiment, who had predicted that approximately 1 percent of the participants would follow through to the very end. The Milgram study was conducted more than fifty years ago, but similar experiments recently conducted in both Poland and the United States have found similarly high rates of compliance.[13]

Our willingness to harm others when we are following the instructions of an authority figure has also been demonstrated by studies that more clearly mimic real-world situations. Researchers in one study asked participants to read various test questions to a supposed job applicant, who was actually an accomplice.[14] The applicant was always played by the same person—a well-dressed man about thirty years old. The researchers told the participants that they were interested in examining how job applicants would react under pressure, so they wanted them to harass the applicant by making statements that progressed in offensiveness, including, “If you continue like this, you will fail,” and “This job is much too difficult for you.” As the “interview” continued, the applicant pleaded with them to stop, then refused to tolerate the abuse and showed signs of tension, and eventually stopped answering the questions in despair. In the control condition, in which there was no authority figure urging them to continue, none of the participants got through all fifteen of the statements. But when the experimenter prodded them along, 92 percent went all the way through the list.

What explains this tendency to obey an authority figure’s orders even if it means harming an innocent person? One central factor is the authority figure’s willingness to assume responsibility for any negative outcomes. This allows the person who is engaging in the bad behavior to feel absolved of wrongdoing.[15] The tendency to seek absolution on that basis can be found repeatedly in real-world situations, from the American soldiers who abused prisoners at Abu Ghraib in Iraq to business executives engaging in corporate fraud.[16]

Experimental research demonstrates that people who feel less responsible for committing harmful acts are more willing to do so. Participants in a replication of the Milgram study who were made to feel more responsible for inflicting harm—by being told explicitly that they were responsible for the well-being of the learner—stopped the procedure significantly earlier.[17] People who feel more responsible for hurting someone have also been found to be better able to resist explicit instructions to do so. A detailed analysis of the utterances of participants in one of the recent replications of the Milgram study revealed that those who expressed a sense that they were responsible for their actions were more likely to resist the orders and stop delivering shocks.[18]

These findings tell us that feeling less responsible increases the tendency to engage in compliant harmful behavior, but they don’t tell us why. Do people blame actions on the instructions of authority figures to avoid facing the consequences of their actions, as in the Nuremberg trials, when Nazi defendants blamed their actions on “just following orders”? Or does the act of following orders actually change the way we process our behavior at a neurological level?

Patrick Haggard, a cognitive neuroscientist at University College London, designed a study with his colleagues to test precisely this question.[19] Students were recruited to participate in what they were told was a study to examine how people interact with each other when they are told what to do, and how they process this experience. The participants were placed into pairs and asked to deliver a “painful but tolerable” electric shock to their partner. In one condition, the participants were told that they had the option to deliver shocks to their partner or not, and if they did, they would receive some additional money. In the other condition, participants were ordered by the experimenter to deliver shocks.

The researchers monitored the participants’ brain activity using electroencephalography (EEG). This allowed them to detect what neuroscientists call “event-related potentials,” or ERPs—very small voltages that are generated in the brain in response to different sensory, motor, or cognitive events, such as seeing a picture of a face or experiencing a surprise. People who freely choose to engage in an action generally show larger ERP amplitudes—larger brain waves, meaning greater activity and a more intense experience—than those who are instructed to engage in an action.[20] The researchers were interested in finding out whether people who delivered the shocks without being ordered to do so would show larger ERP amplitudes than those who were following orders.

First they confirmed that those who gave the shocks of their own free will felt more responsible (87 percent) than those who were ordered to do so (35 percent). When they looked at the EEG data, they found that the people who had given shocks voluntarily did indeed have larger ERP amplitudes than those who had been ordered to do so. What does this tell us? It seems that people who have been told or coerced into doing something that may be harmful to another person—those who are “just following orders”—experience their action less intensely than those who choose to engage in the same behavior voluntarily.

The lower level of brain response reveals that if you do something that you are ordered to do, at a neurological level it seems to be less meaningful than if you do the same thing of your own volition. This makes it easier for people to feel less responsible for their actions, and thus more likely to engage in bad behavior. It also suggests that the defense of “just following orders” may not be merely a strategy people use to retrospectively excuse their behavior. When a person harms someone at the explicit instructions of an authority figure, their behavior is processed differently in the brain.

A Question of Identity

It is a natural human tendency to look for someone to blame when you are confronted with evidence of having done something wrong. After all, if it wasn’t your fault, you can convince yourself—and possibly others—that you are really a good person at heart. We’ve just seen that some data from neuroscience suggest that people who are following orders don’t experience their actions as intensely as those who are acting of their own volition. But psychologists have also found that people sometimes come to identify with those who are giving the orders, at which point they may be choosing willingly to engage in bad behavior. We see this especially in the case of charismatic religious or political leaders.

Researchers at the University of St. Andrews and the University of Exeter conducted a study to assess how identifying with a person giving orders would affect people’s actions.[21] They recruited people to read about the Milgram study and its variants and to evaluate how much they thought the participants in those studies would have identified with the “experimenter” (who was giving the orders), or with the “learner” (who was receiving the shocks). They chose one group of experts (academic psychologists who were already familiar with the Milgram study) and one of nonexperts (students taking an introductory psychology class who had not yet learned about the study), in case they differed in their assessments (in the end, the results were the same for the two groups). They asked people in both groups to read about the original study and then about fifteen different variations that Milgram had run over the years. These variations tweaked the procedure in small but important ways. In one, the experimenter gave the orders to deliver shocks by phone instead of in person. In another, the experiment was run not at prestigious Yale University but at an office building in Bridgeport, Connecticut.

The psychologists and students were asked to assess how they believed each variant would influence the participants’ identification with the experimenter as a scientist and the scientific community he represented, or with the learner and his broader community. The researchers then examined whether these assessments of the nature of the participants’ identification in the different variants correlated with the participants’ willingness to obey or resist the orders.

Did identification influence obedience? In a word, yes. The variations that pushed participants to identify with the experimenter—and to see their actions as making a valuable contribution to the pursuit of scientific knowledge—led them to follow the orders to deliver shocks far longer. In one of these variations, the learner never gave a verbal complaint: he just pounded the wall in protest. In another, a second experimenter stepped in to give the orders in an attempt to speed up the process.

Variations that invited people to identify with the learner led participants to resist much earlier and more emphatically. In one such variation, two other supposed participants (actually accomplices) refused to continue delivering shocks. In another, two experimenters argued about whether the participant should continue to deliver the shocks.

These findings suggest that people may engage in harmful behavior when they are following orders not simply because they feel absolved of responsibility but because they come to believe that their actions are serving a worthy purpose.

This alternative explanation provides insight into some of the factors that led to the devastating effectiveness of the policies of the Nazis. People were not simply begrudgingly or numbly following orders; in many instances they embraced the broader social vision and mission of fascism. They identified with the dangers that Hitler was articulating, shared his muscular patriotism and nostalgia for a simpler past, embraced his hatred of outsiders, and bought into his vision of a racially pure society.

The question of why some people act badly and others don’t is not really about good and bad people. Situational factors and questions of self-identification are far more important than we might imagine.[22]

The Agony of Indecision

As we have seen, most of the participants in the original Milgram study went along and delivered what they believed to be increasingly painful shocks to an innocent person. But what’s often overlooked about this study is that the choice to continue obeying the authority was not an easy one for the participants. Videotapes reveal that many participants agonized over what they were doing, even as they continued to deliver shocks. Milgram described one of the troubled participants: “I observed a mature and initially poised businessman enter the laboratory smiling and confident. Within 20 minutes he was reduced to a twitching, stuttering wreck, who was rapidly approaching a point of nervous collapse. He constantly pulled on his earlobe, and twisted his hands. At one point he pushed his fist into his forehead and muttered, ‘Oh God, let’s stop it.’”[23] This man, like most others, continued all the way to the 450-volt level. But he was hardly a monster who was blissfully and blindly obeying the authority.

The participants in the Milgram study were faced with a difficult—and unusual—dilemma. They had agreed to participate in a study that was supposed to further the goals of science, and they trusted the experimenter who was giving the orders. Then, when the shock levels escalated and it became clear they were no longer administering a “mild punishment,” they found it very difficult to extricate themselves.

Most of the participants tried at some point to resist. They turned to the experimenter and asked what they should do. They pushed the experimenter to check on the learner, and many at some point said, “I quit.” But they didn’t quit. What most participants had trouble doing was actually sticking with their gut feeling and walking out. In other words, they wanted to do the right thing, and they tried, often repeatedly, to do so. But they weren’t able to follow through on their decision.

So, who were the people who successfully stood up to the authority figure? Milgram simply divided people into “obedient” and “disobedient,” but a recent analysis of audio recordings of the study reveals considerably more nuance.[24] Many people in both groups resisted the orders in some form. Some hesitated to continue delivering shocks, others voiced their concerns about harming the recipient, and still others tried to stop the experiment. Of the “disobedient” participants, who refused to administer the most painful shocks, 98 percent tried to stop participating in the study at an early stage, saying things like, “I can’t do this anymore,” or “I won’t do this anymore.” Of the “obedient” participants, those who continued to deliver shocks through to the end, 19 percent did voice some form of direct refusal.

Those who ultimately disobeyed the experimenter did so in varied ways. Participants who called on multiple strategies to try to push back—and challenged the authority earlier—were more likely to quit. This tells us that people who want to do the right thing often fail to do so because they lack the right skills and strategies.

Throughout the book I will be providing you with tools and strategies so that when the moment comes and you find yourself thinking, “I can’t do this anymore,” or “I won’t do this anymore,” you will follow through.

Gradual Escalation

Another reason we often go along when we are being urged to do something that we know to be wrong is that the situation gets more extreme little by little. Each small step may feel wrong, but only mildly so, which makes it psychologically difficult to refuse. And then, when the harm escalates, it’s hard to change course without explaining one’s lack of prior action. This phenomenon, known as “gradual escalation,” makes it hard to recognize the problem and extricate oneself early in the process. A good example is Bernie Madoff, a financier who defrauded people of millions of dollars through a massive Ponzi scheme. In explaining how he got started, he said, “Well, you know what happens is, it starts out with you taking a little bit, maybe a few hundred, a few thousand. You get comfortable with that, and before you know it, it snowballs into something big.”[25] Other types of bad behavior—from academic cheating to fraternity hazing to sexual harassment—often play out in precisely the same way.

Empirical research demonstrates that small transgressions can put people on a slippery slope; getting away with minor acts makes them more likely to embark on bigger, more serious transgressions. Once you’ve engaged in a small—but wrong—act, you need to justify having done so while still maintaining a positive view of yourself (as we all like to do). You may explain this small act away by seeing it as not such a big deal, but that shift makes it easier to condone more serious transgressions later on.

To test whether engaging in small acts of dishonesty makes people more likely to engage in larger ones later, researchers conducted a study in which they asked college students to complete a series of math problems, in three separate trials.[26] They randomly assigned the students to one of three payment groups:

Group 1: Students received $2.50 for each correct answer in each of three trials.

Group 2: Students didn’t get any money for the first two trials but received $2.50 for each correct answer on the third trial.

Group 3: Students were told they would earn 25 cents for each correct answer on the first trial, $1.00 for each correct answer on the second trial, and $2.50 for each correct answer on the third trial.

Participants were given answer sheets after each trial and were told to check their own work and then take from an envelope the amount of money they were owed. Unbeknownst to the participants, researchers were able to check later to see whether they had calculated the correct amount.

Can you predict what happened? People in the third group, with the gradual increase in reward, cheated the most—at double the rate of those in groups one and two. For people in the third group, the initial lie was very minor—they only received a quarter for lying, so it didn’t seem like a big deal. And once they had lied on the first trial, it was easier to continue doing so on subsequent trials in which the rewards for doing so were greater.

Cases of corporate fraud often begin similarly, with small acts of unethical behavior leading to more substantial—and criminal—ones. Executives who have been found guilty of accounting fraud often describe a series of steps that led to the fraud, and they often can’t recall exactly when their bad actions began.[27] Fraternity initiation procedures also often follow this pattern of gradually escalating demands: small orders, such as running errands or cleaning someone’s car, are followed by more severe ones, such as forced drinking or even physical beatings.

So we have seen that engaging in small transgressions can make it easier for people to engage in larger ones because they are trying to justify their behavior. But another explanation is that people initially experience unpleasant physiological arousal when they engage in bad behavior—because they do recognize that it’s wrong—but over time, they adapt and no longer experience such a reaction. In support of this theory, it has been demonstrated that people show lower levels of activation in the amygdala—a part of the brain that processes emotion—after repeatedly seeing negative images (of violence, death, anger, and so on).[28]

Researchers at University College London and Duke University wanted to test whether engaging in small acts of dishonest behavior would lead to reduced brain activation.[29] The researchers used fMRI scanners to monitor people’s brains while they completed a series of estimation tasks with a partner (actually an accomplice) that involved guessing how many pennies were in a jar. In one case, they were told that they and their partner would get the greatest reward if they guessed the most accurate number. In another case, they were told that they would get the greatest reward if they lied by deliberately over- or underestimating the number, but that their partner would get less money. The procedure allowed researchers to measure how the brain would respond when people provided intentionally inaccurate estimates.

In the initial trials in which people provided deliberately dishonest guesses, the amygdala showed a strong response, indicating that the person was aware they were telling a lie and felt bad about it. But over time, with repeated trials, the amygdala’s activity levels dropped substantially, meaning that the neural response had weakened. Telling small lies, then, appears to desensitize our brains to the negative emotions that typically occur when we do something we know is wrong—which, in turn, makes it easier to engage in bad behavior in the future. These researchers also found that the larger the drop in amygdala activity in one trial, the more likely the person was to lie—and the bigger the lie—on subsequent trials.

Although this study only examined the brain’s response to repeated lies, the discovery that neural reactions decrease in response to repeated dishonesty suggests that the amygdala initially reacts strongly to acts that we know to be wrong, but that this emotional response is weakened following repeated bad behavior. “When we lie for personal gain,” explained one of the study’s authors, “our amygdala produces a negative feeling that limits the extent to which we are prepared to lie. However, this response fades as we continue to lie, and the more it falls, the bigger our lies become. This may lead to a ‘slippery slope’ where small acts of dishonesty escalate into more significant lies.”[30]

We already know that good people typically don’t set out to engage in bad behavior. But this research shows that if—for whatever reason—they take a small step in the wrong direction, it can lead to bigger and bigger steps in the same direction.

This finding helps to explain the very high rates of obedience in Milgram’s study, which started with the delivery of only very small shocks. Most people felt fine about obeying the experimenter’s request at first, and they continued to do so many times before the growing demands of the procedure became clear.[31] They started by giving 15 volts, and then 30, and then 45, all of which seemed like no big deal. They thought they were doing this in the interest of science and that they were helping respected professors determine the relationship between punishment and learning. But this gradual escalation of intensity meant that they had no easy way to justify a decision to stop giving shocks later on. And as they continued to give shocks, their physiological and neurological responses would have weakened. Most people would be unwilling to give a 450-volt shock—marked “XXX dangerous”—right off the bat, even if ordered to do so by a respected authority. But if it was OK to give a 100-volt shock, what makes it not OK to give a shock of 115 volts? How do you decide when to stop?

But here’s the good news: some people did decide to stop. And understanding what enabled them to resist gives us insight we can use to help people stand up to social pressure of all kinds.

Examination of the audio recordings has revealed some of the factors that allowed some participants to disobey. It turns out that the sooner a person started to question the orders out loud, the more likely he was to ultimately disobey.[32] Those who questioned the orders explicitly found it harder to rationalize what they were doing.

In all variations of Milgram’s studies, participants who stopped obeying orders did so when they reached 150 volts.[33] What is unique about this voltage level? This was the first time the victim himself asked to be released. That request changed the dynamic of the interaction. Those who disobeyed the experimenter apparently prioritized the victim’s desire not to continue over the experimenter’s instructions to do so.

Participants who defied the authority in the Milgram studies were ordinary people who chose to deliberate about what they were being asked to do—and that deliberation allowed them to defy the situational pressures and disobey. So what exactly were they doing differently from the other participants? And is there something we can learn from them?
