THE BYSTANDER EFFECT
The Psychology of Courage and Inaction
Catherine A. Sanderson
Copyright
William Collins
An imprint of HarperCollinsPublishers
1 London Bridge Street
London SE1 9GF
HarperCollinsPublishers
1st Floor, Watermarque Building, Ringsend Road
Dublin 4, Ireland
This eBook first published in Great Britain by William Collins in 2020
Copyright © Catherine A. Sanderson 2020
Cover design by Steve Leard
Catherine A. Sanderson asserts the moral right to be identified as the author of this work
A catalogue record for this book is available from the British Library.
All rights reserved under International and Pan-American Copyright Conventions. By payment of the required fees, you have been granted the non-exclusive, non-transferable right to access and read the text of this e-book on-screen. No part of this text may be reproduced, transmitted, downloaded, decompiled, reverse engineered, or stored in or introduced into any information storage and retrieval system, in any form or by any means, whether electronic or mechanical, now known or hereinafter invented, without the express written permission of HarperCollins
Source ISBN: 9780008361662
Ebook Edition © April 2020 ISBN: 9780008361648
Version: 2021-01-13
Dedication
TO ANDREW, ROBERT, AND CAROLINE,
with hope that you will never stay silent about things that matter
Contents
Cover
Title Page
Copyright
Dedication
Preface
I. The Silence of the Good People
1. The Myth of Monsters
2. Who Is Responsible?
3. The Perils of Ambiguity
4. The Considerable Costs of Helping
5. The Power of Social Groups
II. Bullies and Bystanders
6. At School: Standing Up to Bullies
7. In College: Reducing Sexual Misconduct
8. At Work: Fostering Ethical Behavior
III. Learning to Act
9. Understanding Moral Rebels
10. Becoming a Moral Rebel
Notes
Index
Acknowledgements
About the Author
About the Publisher
Preface
On August 25, 2017, my husband and I spent the day settling in our oldest child, Andrew, for the start of his first year at college. We went to Walmart to buy a minifridge and rug. We hung posters above his bed. We attended the obligatory goodbye family lunch before returning to our car to head home to a slightly quieter house.
Two weeks later Andrew called, which was unusual since, like most teenagers, he vastly prefers texting. His voice breaking, he told me that a student in his dorm had just died.
As he described it on the phone, the two of them seemed to have so much in common. They were both freshmen. They were both from Massachusetts and had attended rival prep schools. They both had younger brothers.
“What happened?” I asked.
He told me the student had been drinking alcohol with friends. He got drunk, and around 9 p.m. on Saturday, he fell and hit his head. His friends, roommate, and lacrosse teammates watched over him for many hours. They strapped a backpack around his shoulders to keep him from rolling onto his back, vomiting, and then choking to death. They periodically checked to make sure he was still breathing.
But what they didn’t do—for nearly twenty hours after the fall—was call 911.
By the time they finally did seek help, at around 4 p.m. on Sunday, it was too late. The student was taken to a hospital and put on life support so that his family could fly in to say goodbye.
Now, it’s impossible to know whether prompt medical attention could have saved his life. Perhaps it wouldn’t have. But what is clear is that he didn’t get that opportunity. And this story—of college students failing to do anything in the face of a serious emergency—is hardly unusual.
It’s not just college students who choose not to act, even when the stakes are high. Why did most passengers sit silently when a man was forcibly dragged off a United Airlines flight, recorded on a video that then went viral? What leads people to stay silent when a colleague uses derogatory language or engages in harassing behavior? Why did so many church leaders fail to report sexual abuse by Catholic priests for so many years?
Throughout my career—as a graduate student at Princeton University in the 1990s and as a professor at Amherst College over the last twenty years—my research has focused on the influence of social norms, the unwritten rules that shape our behavior. Although people follow these norms to fit in with their social group, they can also make crucial errors in their perception of these norms. The more I thought about these seemingly disparate examples of people failing to act, the more I began to see the root causes as driven by the same factors: confusion about what was happening, a lack of a sense of personal responsibility, misperception of social norms, and fear of consequences.
I have discovered through my own work that educating people about the power of social norms, pointing out the errors we so often make in perceiving these norms and the consequences of our misperceptions, helps them engage in better behavior. I’ve done studies that show that freshman women who learn how campus social norms contribute to unhealthy body image ideals show lower rates of disordered eating later on, and that college students who learn that many of their peers struggle with mental health challenges have a more positive view of mental health services. Helping people understand the psychological processes that lead them to misperceive what those around them are actually thinking—to believe that all women want to be thin, that other college students never feel sad or lonely—reduces the mistakes and misunderstandings we make about other people and can improve our psychological and physical well-being. It can also push us to act.
In my very first introduction to psychology as an undergraduate at Stanford in 1987, I remember being fascinated when I learned how much being in a group influenced our own behavior. I was fortunate enough to have Phil Zimbardo—whose Stanford Prison Experiment remains one of the most famous and controversial studies in psychology—as my professor. It was quite an introduction to the field of social psychology!
Back then, researchers could design experiments and measure people’s behavior, but we couldn’t penetrate the mechanisms that explained them. We couldn’t see what was happening in the brain. Recent breakthroughs in neuroscience have completely changed that. It is now possible to see in real time how certain scenarios, pressures, and experiences play out in the brain. As I’ll describe throughout this book, these results have revealed that many of the processes that drive inaction occur not through a careful deliberative process, but at an automatic level in the brain.
My goal in writing this book is to help people understand the psychological factors that underlie the very natural human tendency to stay silent in the face of bad behavior, and to show how significant a role that silence plays in allowing the bad behavior to continue. In the first half of the book, I describe how situational and psychological factors can lead good people to engage in bad behavior (Chapter 1), or, more commonly, to stay silent in the face of bad behavior by others (Chapters 2 to 5). Next, I show how these factors play out to inhibit action in distinct real-world situations, including bullying in school (Chapter 6), sexual misconduct in college (Chapter 7), and unethical behavior in the workplace (Chapter 8). I end by examining how some people are more able to stand up to others and what we can learn from these moral rebels (Chapter 9). In the closing chapter I look at strategies we all can use—regardless of our personality—to increase the likelihood that we will speak up and take action when we are most needed.
My hope is that providing insight into the forces that keep us from acting—and offering practical strategies for resisting such pressure in our own lives—will allow readers of this book to step up and do the right thing, even when it feels really hard. Ultimately, that’s the secret to breaking the silence of the bystander—and making sure no one has to wait twenty hours after a serious injury before someone picks up the phone.
Part I
1
The Myth of Monsters
On August 11, 2012, a sixteen-year-old girl attended a party in Steubenville, Ohio, with some students from the local high school, including members of the school’s football team. She drank a lot, became severely intoxicated, and vomited. Students at the party that night described her as appearing “out of it.” The next morning, she woke up naked in a basement living room with three boys around her but virtually no memory of the prior night.
Over the next few days, several students who had been at the party posted to social media photographs and videos that vividly illustrated what had happened to the girl: her clothes had been removed, and she had been sexually assaulted. In March 2013 two Steubenville High football players, Trent Mays and Ma’lik Richmond, were found guilty of rape.
When we hear stories like this, most of us assume that these bad acts were committed by bad people. Surely only a bad person would sexually assault an unconscious teenage girl. This belief that bad behavior is caused by bad people is reassuring and comforting. Unfortunately, it’s also wrong. As Nasra Hassan, who spent years studying Palestinian terrorists, said, “What is frightening is not the abnormality of those who carry out the suicide attacks, but their sheer normality.”[1] Or take it from Sue Klebold. In 1999 her son Dylan, along with his classmate Eric Harris, killed more than a dozen people at Columbine High School in Colorado. “This belief that Dylan was a monster,” she said, “served a deeper purpose: people needed to believe they would recognize evil in their midst.”[2]
Why do we assume that bad behavior is carried out by bad people? Because that belief reassures us that the good people we know—our friends, our family, even ourselves—couldn’t possibly do such things.
But “good people” can and do engage in bad behavior, from bullying in the schoolyard to hazing in college fraternities to sexual harassment in the workplace. So curbing bad behavior is not simply a matter of identifying and stopping the monsters. It is essential to identify the factors that lead otherwise good people to make bad choices so that we can prevent such behavior from occurring—or at least decrease its likelihood. This chapter examines the settings and situations that lead many of us to do things that we know at some level to be wrong. You may not be surprised to learn that we have more of a tendency to do harmful things when we are in a group, when a trusted authority figure instructs us to do so, or when we start by taking small steps in the wrong direction. But the reasons underlying these tendencies may not be what you think.
The Hazards of the Herd
As a graduate student at Princeton University, I had a great part-time job living in a residence hall and providing support to junior and senior resident assistants. The job involved eating some meals with students in the dining hall, facilitating dorm-wide social events, and helping students deal with academic and personal concerns. There was one serious downside, however: one evening each year, I was required to serve as a “support person” at the Nude Olympics.
The Nude Olympics started in the early 1970s and was a well-established unofficial tradition until 1999, when it was banned by the board of trustees. Sophomores, both men and women, would run around campus at midnight on the occasion of the first snowfall each year, which typically occurred in January, wearing only running shoes, hats, and gloves. As you might imagine, the participants typically drank large amounts of alcohol in the hours leading up to the run, to help them withstand both the freezing temperatures and the considerable awkwardness inherent in running around naked in front of their classmates.
My role was to stand in the courtyard of one of the colleges wearing a reflective vest and holding a first aid kit, so that any student who experienced trouble—say, falling on ice—would be able to find me. As I stood there each year, fervently hoping I would be able to finish my dissertation and leave Princeton before the next Nude Olympics, I kept thinking to myself, “These students are some of the best and brightest in America. Why are they doing this?” Running drunk and naked at midnight in the snow just doesn’t seem like a great idea.
But this story illustrates a fundamental finding in psychology: people will do things in a group setting that they would never do on their own. Although the Nude Olympics was mostly harmless, the same principle holds in cases where people behave really badly. Examples of bad behavior in group settings are abundant:
In February 2010, Dylan Gifford Yount stood on the fourth-floor ledge of a commercial building in San Francisco as a large crowd gathered below. Many people taunted him, yelling “Jump!” and “Just do it already.” After forty-five minutes, he jumped to his death.
During the 2015–2016 New Year’s Eve celebrations in Cologne, Germany, large crowds of men sexually assaulted an estimated twelve hundred women.
In February 2018, fans celebrating the Philadelphia Eagles’ Super Bowl win flipped over cars, removed street poles from the ground, set fires, and broke store windows, causing $273,000 in damages.
What is it about being in a group that leads people to do things they would never do on their own? One explanation is that people in a group believe they won’t be held responsible for their actions because they are anonymous. The frequency and severity of aggressive and offensive behavior is greater if people are wearing a mask or hood or operating in the dark, even if they aren’t in a group. As the psychologist Philip Zimbardo found, college students who were asked to deliver electric shocks to another student (thinking that they were participating in a study of creativity) delivered significantly longer—and thus more painful—shocks when they were wearing hoods to hide their identity than when they were not.[3]
The same phenomenon has been observed outside the lab. An analysis of violence in Northern Ireland by Andrew Silke at the University of Leicester found that people wearing disguises—masks, hoods, or other clothing to obscure their faces—engaged in more acts of vandalism, harmed more people, and inflicted more serious physical injuries.[4] This helps explain why cyberbullying and other aggressive behavior is so common online, where people can post anonymously.
Groups may also facilitate bad behavior because they create what is called “deindividuation”—the loss of a sense of oneself as an individual.[5] When people lose touch with their own moral standards and forget who they really are, which often happens in a pack, the normal constraints against deviant behavior are removed.
The larger the crowd, the worse the behavior. Andrew Ritchey and Barry Ruback at Pennsylvania State University documented this effect by analyzing the behavior of lynch mobs.[6] Examining articles from the Atlanta Constitution about lynch mobs in Georgia from 1882 to 1926, they identified 515 victims in 411 separate events. They recorded the size of the mob, the race and sex of the victim, and the amount of violence that had occurred for each case. Although all of the lynchings resulted in death, they defined those in which the victim was also burned, hanged, and/or beaten as having a higher level of violence. Their results indicated that the size of the crowd at a lynching consistently predicted the level of violence.
Although group settings seem to contribute to bad behavior, understanding exactly how they do so is difficult. People may not be conscious of why they chose to do something, so they can’t accurately tell researchers what drove their actions. They may also make excuses for their behavior, to make themselves look or feel better.
Recent breakthroughs in neuroscience, however, have provided important tools for helping us explore this behavior. Using neuroimaging techniques, researchers can examine the activity in different parts of the brain while people are in the act of doing certain things. This means we no longer have to rely only on what people say about their motivation. Instead, we can now investigate how being in a group changes patterns of brain activity.[7]
The first study to examine whether neural responses are lower in a group setting was conducted by researchers at MIT. It was prompted by an experience that one of the researchers, Mina Cikara, had when she was in graduate school. One afternoon, Cikara and her husband decided to go to a baseball game between two long-standing rivals, the Red Sox and the Yankees, at Yankee Stadium. Her husband, who wore a Red Sox cap, was relentlessly taunted by Yankees fans. In an attempt to defuse the situation, Cikara put her husband’s cap on her own head, assuming that Yankees fans wouldn’t target a woman for such verbal abuse.
It turned out she was wrong. “I have never been called names like that in my entire life,” she said.[8] She returned from the game determined to find out why being in a group setting leads otherwise normal people (though, in fairness, they were Yankees fans) to act so poorly.
Cikara and her colleagues designed a study to test two questions: Do people think about themselves less when they are participating in a competitive task as part of a team than when they are acting alone? And do people who think about themselves less when acting as part of a team behave more aggressively toward members of the other team?[9] They hypothesized that competing in a group might cause people to become less aware of themselves and to lose their ability to evaluate their own behavior.
In the first part of the study, researchers used an fMRI (functional magnetic resonance imaging) machine to measure participants’ patterns of brain activation while they played a game on their own and then as part of a team. During the game, participants were shown statements that described positive or negative moral behaviors about either themselves or other people, such as, “I have stolen food from shared refrigerators,” or “He always apologizes after bumping into someone.”
The researchers focused on a particular part of the brain called the medial prefrontal cortex (mPFC), which has been shown to be more engaged (in colloquial terms, it “lights up”) when people think about themselves—when they consider their own personality traits, physical characteristics, or mental states—than when they think about others.[10]
Cikara and her colleagues found that when people played the game on their own, their mPFC was far more active when they read statements about themselves than when they read statements about other people. But when they played as part of a team, about half of the participants showed a much smaller difference in activation in this part of the brain when they read statements about themselves than when they read statements about other people. These findings tell us that some people do in fact think less about themselves when they are in a group than when they are alone.
But the crucial question for these researchers was not just whether some people have a tendency to think less about themselves when competing as part of a team, but what the consequences are of this reduced self-reflection. They designed another study in which participants were shown six photos of each member of their own team and of the opposing team and were asked to choose one photo of each member, which was supposedly going to be printed in a published report. These photos had been independently rated according to attractiveness, from very unflattering to very flattering. The participants who showed reduced self-referential thinking—as measured by lower levels of mPFC activity—when playing as part of a team tended to choose less flattering photos of members of the opposing team than of their own team. Participants who didn’t show lower self-referential thinking chose equally flattering photos of both teams.
The researchers concluded that people who think less about themselves when in a group setting are more likely to act in ways that hurt other people. This behavior may be especially pronounced when people are in groups that are directly competing with one another, as Cikara experienced when she put on her husband’s Red Sox cap at Yankee Stadium.
“Although humans exhibit strong preferences for equity and moral prohibitions against harm in many contexts,” said Rebecca Saxe, one of the researchers involved in the study, “people’s priorities change when there is an ‘us’ and a ‘them.’”[11]