Genocide Witnesses and the Bystander Effect

"Today, people don't talk anymore about the mass murder of six million human beings" (Wiesenthal 156). Even though this statement refers to the Holocaust, it applies to genocides throughout history. No matter the place or time, the reaction is the same as it was after the Holocaust: "The world seems to have agreed to 'let the matter drop' and nobody has more sedulously promoted this 'forget it' campaign than the Germans themselves" (Wiesenthal 156). The German witnesses play the "forget it" game just as much as the perpetrators of the horrific acts, yet their denial of the genocide often began even earlier, when they looked away from the marching lines of famished Jewish workers and kept their mouths shut. Witnesses tend to speak up only after the fact, so what makes them stay silent while their fellow humans are being persecuted? There is a simple answer to that question: the bystander effect. Bystanders try to conform to others, whether to other bystanders, to society in general, or to authority; they avoid taking personal responsibility for events; and they blame others for their own inaction. This phenomenon became a focus of many researchers after the murder of Kitty Genovese.

In the spring of 1964, Kitty Genovese was murdered outside her apartment in Queens, New York. As she was attacked, she screamed for help, but nobody came. The New York Times then ran a front-page story titled "37 Who Saw Murder Didn't Call the Police"; the number was later raised to 38. Investigations found that many neighbors admitted they had heard the screams, or had even seen part of the attack, but had stood idly by. How was this possible? Soon after the event, psychologists John Darley and Bibb Latané conducted the famous bystander apathy experiment. They recruited college students to take part in what was described as a discussion about personal problems. Each student would be placed in a group of two to six but would not see the other members; they spoke through microphones from separate rooms. In reality, there was only one test subject, and the other voices were pre-recorded. This is when the actual experiment began: one of the recordings would imitate a seizure.

Darley and Latané measured whether, and how quickly, the student, the bystander, would respond to the seizure that they could hear but not see, either by leaving the room to check on the other student or by asking the researchers for help. The results were shocking: "Only 31% of the subjects tried to seek for help". Even though the experience made many of the test subjects nervous, most did not seek help. However, the different conditions tested, in which the seizure recording was played alone or alongside up to four other recorded voices, produced different results. "Eighty-five percent of the participants who were in the two-person situation, and hence believed they were the only witness to the victim's seizure, left their cubicles to help. In contrast, only 62 percent of the participants who were in the three-person situation and 31 percent of the participants in the six-person situation tried to help" (Marsh). The more people who were present, the less likely anyone was to respond. Darley and Latané called this diffusion of responsibility: "When the study participants thought there were other witnesses to the emergency, they felt less personal responsibility to intervene". Simple, right? If there are more people, it is easier to leave the responsibility to someone else, someone who, perhaps, has more experience handling the situation or more authority. However, the bystander effect is not as simple as that.

Darley and Latané conducted a second experiment to test whether people are misled by the reactions of others. The participants were either alone in a room filling out a questionnaire or with two confederates whom they could clearly see. After some time, smoke, a clear sign of danger, was filtered into the room, and Darley and Latané measured whether the participants would respond. Their hypothesis was right: "When participants were alone, 75 percent of them left the room and reported the smoke to the experimenter. With three participants in the room, only 38 percent left to report the smoke. And quite remarkably, when a participant was joined by two confederates instructed not to show any concern, only 10 percent of the participants reported the smoke to the experimenter" (Marsh). Thus another term was coined: pluralistic ignorance. When a bystander is with other people who stay calm, they mistake that calm for a sign that no emergency is actually happening. They conform to the reactions of others, to society. After all, most would consider panicking about what could be nothing "not cool", and nobody wants to be labeled that. Given that humans try to conform to one another, how does that conformity affect judgment?

Solomon Asch, another researcher, tested exactly that: how peer pressure to conform affects judgment. Eight subjects, one real participant and seven confederates, were seated around a table, and each was asked a series of simple perceptual questions, judging which of several lines matched a given line in length. The confederates answered correctly at first, to keep the participant from catching on, but their responses became increasingly incorrect. The results showed that peer pressure has a clear effect on judgment: "The control group, those not exposed to peer pressure where everybody gave correct answers, threw up only one incorrect response out of 35; this could probably be explained by experimental error. The results for the other groups were interesting; when surrounded by people giving an incorrect answer, over one third of the subjects also voiced an incorrect opinion. At least 75% of the subjects gave the wrong answer to at least one question, although experimental error may have had some influence on this figure" (Shuttleworth). When pressured, people will usually go along with the group, even when the group's answer is clearly wrong to them. Coupled with the results of the smoke experiment, this suggests that people will go to great lengths to conform to the pressures of society, even when doing so violates their own morals.

Stanley Milgram, in a study closely related to the Asch experiment, also measured how likely people are to conform. This time, however, it was not peer pressure at work; an authority figure was giving directions. In the experiment, a subject acting as a teacher was told to ask a test taker questions over an intercom. If the test taker answered correctly, the test would continue; if the answer was wrong, the test taker would receive a shock. The shocks started off fairly small but ended at a level that could be fatal. Throughout the study, the experimenter remained in the room and encouraged the subject to continue shocking, acting as an authority figure. One would expect the subjects to stop once they started hearing the screams of the test taker, but, surprisingly, most kept going after verbal nudges from the authority figure, in this case the experimenter: "65% (two-thirds) of participants (i.e., teachers) continued to the highest level of 450 volts. All the participants continued to 300 volts" (McLeod). People tend to obey authority and enter what Milgram called the agentic state: "People allow others to direct their actions and then pass off the responsibility for the consequences to the person giving the orders" (McLeod). They give up their sense of personal responsibility and place the full blame on the authority figure, even when they know what they are doing is wrong. Milgram did not stop there, though. Across eighteen different conditions, he examined how certain factors affect when subjects stop. One set of conditions tested how proximity to the victim affected obedience: "The teacher had to force the learner's hand down onto a shock plate when they refuse to participate after 150 volts. Obedience fell to 30%" (McLeod). The subject felt more personal responsibility and was more than twice as likely to stop. A second set instead varied proximity to the authority figure: "When the experimenter instructed and prompted the teacher by telephone from another room, obedience fell to 20.5%. Many participants cheated and missed out shocks or gave less voltage than ordered to by the experimenter" (McLeod). Whereas being closer to the victim reduced obedience, being closer to the authority figure increased it. In another condition, Milgram tested the effect of social support, which produced the most drastic results of all: "Two other participants (confederates) were also teachers but refused to obey. Confederate 1 stopped at 150 volts, and confederate 2 stopped at 210 volts. The presence of others who are seen to disobey the authority figure reduces the level of obedience to 10%" (McLeod). Just as Asch showed, peer pressure from others causes the witness to give in to the popular position, whether that position is bad (as in the Asch study) or good (as in this Milgram condition). However, there is one problem with the Milgram experiment that threatens his conclusion: it is hard to generalize it to the real world.

In real life, many more factors influence us at once, from morals to social stigmas. This is where Philip Zimbardo comes in; he set up an experiment that simulated a prison environment, a more conventional and realistic situation than Milgram's study. One of the graduate students working under Zimbardo at Stanford analyzed the experience afterward: "Milgram, after all, had focused on obedience, measuring the effects of an ever-present authority. In fact, when the authority figure was not present, Milgram had showed that the force of his instructions dissipated rather rapidly (Milgram, 1974). Our study was designed to see whether placing participants in a more conventionally designed, and in some ways familiar, role would give the situation a lastingness that it did not appear to have in Milgram's research" (Zimbardo et al.). To the researchers' surprise, the volunteers quickly settled into their roles as guards or prisoners and began to act in ways that many, probably including the participants themselves outside of the situation, would consider wrong. Some guards abused their power to make the prisoners feel more powerless. Other guards stood by as their colleagues did horrible things and regarded it as normal because that was the social situation: "I mean, in our study we had good guards who didn't get involved, but in our study they never challenged the bad guards. So what you have is powerful situational forces that get the majority to do things they say they would never do" (Zimbardo, Researcher: It's bad apples, it's the barrel). The social norms formed in the prison caused even the more moral guards to stay silent, just as many of the others involved in the study (to simulate a more realistic environment, family members, lawyers, priests, and others were brought in) stayed silent. When something is treated as normal in a particular scene, circumstance, or environment, the majority, usually the bystanders or witnesses to a situation, become passive and allow the minority to take control.

How does any of this tie in with what happens in genocides such as the Holocaust? The simplest observation of the bystander effect, that the more witnesses there are, the less likely any one of them is to help, explains why most Germans did not try to halt the cruelty they saw. Because of the diffusion of responsibility, they felt no personal responsibility to stop their government.

Similarly, the smoke experiment shows that the more people present who act as if nothing is wrong, the less likely anyone is to speak up about a danger or emergency. When witnesses see others looking down and pretending that famished Jews are not being forced to march down the street, they mimic them instead of following their morals. Asch's experiment supports the same idea: people will give in to peer pressure even when the answer to a question is very clearly wrong. Furthermore, authority and obedience to authority have a great impact on bystanders, as seen in Milgram's various conditions. Since Hitler, the leader of the Nazi government, told the Germans that the genocide, even if he never called it that, was for the good of the country, people accepted that the responsibility for the atrocities happening right before their eyes was not actually theirs. The conditions in Milgram's experiment also explain what might lead a person to help a victim: did the witness personally know a Jew (proximity to the victim), an SS man (proximity to authority), or someone else who disobeyed (peer pressure)?

Familiarity with the people affected, social norms, and the particular setting all influence whether bystanders choose to be passive or active, as also seen with the guards in Zimbardo's Stanford Prison Experiment. Many genocides create these conditions through some of the stages of genocide, such as classification and dehumanization, allowing witnesses to avoid taking responsibility for events they could have prevented.

The bystander effect, coupled with the human tendency to conform to society and authority, causes witnesses of horrific events such as the Holocaust and other genocides to lose their sense of responsibility and to stay silent about the immoral actions of their governments. Even though many people would hope, or even insist, that they would never react the same way, decades of psychological experiments prove them wrong. Now that this psychological effect is understood, how will the world stop it from making matters worse in the present and the future?

Works Cited 

  • “Bystander Apathy Experiment.” Explorable, 15 July 2009, explorable.com/bystander-apathy-experiment. Accessed 7 March 2019. 
  • Haberman, Clyde. “Remembering Kitty Genovese.” The New York Times, 10 April 2016, www.nytimes.com/2016/04/11/us/remembering-kitty-genovese.html. Accessed 7 March 2019. 
  • Marsh, Jason and Dacher Keltner. “We Are All Bystanders.” Greater Good Magazine, 1 Sept. 2006, greatergood.berkeley.edu/article/item/we_are_all_bystanders. Accessed 7 March 2019. 
  • McLeod, Saul. “The Milgram Experiment.” Simply Psychology, www.simplypsychology.org/milgram.html. Accessed 7 March 2019. 
  • Shuttleworth, Martyn. “Asch Experiment.” Explorable, 23 Feb. 2008, explorable.com/asch-experiment. Accessed 7 March 2019. 
  • Wiesenthal, Simon. The Sunflower: On the Possibilities and Limits of Forgiveness. Schocken, 1998.
  • Zimbardo, Philip, et al. Reflections on the Stanford Prison Experiment: Genesis, Transformations, Consequences. 2000, http://pdf.prisonexp.org/blass.pdf.
  •  Zimbardo, Philip. Interview by Soledad O'Brien. Researcher: It’s bad apples, it’s the barrel, 21 May 2004, web.archive.org/web/20160330115959/http://www.cnn.com/2004/US/05/21/zimbarbo.access/. Accessed 7 March 2019. 