
The Asch Conformity Experiments

What These Experiments Say About Group Behavior

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Emily is a board-certified science editor who has worked with top digital publishing brands like Voices for Biodiversity, Study.com, GoodTherapy, Vox, and Verywell.



The Asch conformity experiments were a series of psychological experiments conducted by Solomon Asch in the 1950s. The experiments revealed the degree to which a person's own opinions are influenced by those of a group. Asch found that people were willing to ignore reality and give an incorrect answer in order to conform to the rest of the group.

At a Glance

The Asch conformity experiments are among the most famous in psychology's history and have inspired a wealth of additional research on conformity and group behavior. This research has provided important insight into how, why, and when people conform and the effects of social pressure on behavior.

Do you think of yourself as a conformist or a non-conformist? Most people believe that they are non-conformist enough to stand up to a group when they know they are right, but conformist enough to blend in with the rest of their peers.

Research suggests that people are often much more prone to conform than they believe themselves to be.

Imagine yourself in this situation: You've signed up to participate in a psychology experiment in which you are asked to complete a vision test.

Seated in a room with the other participants, you are shown a line segment and then asked to choose the matching line from a group of three segments of different lengths.

The experimenter asks each participant individually to select the matching line segment. On some occasions, everyone in the group chooses the correct line, but occasionally, the other participants unanimously declare that a different line is actually the correct match.

So what do you do when the experimenter asks you which line is the right match? Do you go with your initial response, or do you choose to conform to the rest of the group?

Conformity in Psychology

In psychological terms, conformity refers to an individual's tendency to follow the unspoken rules or behaviors of the social group to which they belong. Researchers have long been curious about the degree to which people follow or rebel against social norms.

Asch was interested in looking at how pressure from a group could lead people to conform, even when they knew that the rest of the group was wrong. The purpose of the Asch conformity experiment was to demonstrate the power of conformity in groups.

Methodology of Asch's Experiments

Asch's experiments involved having people who were in on the experiment pretend to be regular participants alongside actual, unaware subjects of the study. Those who were in on the experiment would behave in predetermined ways to see whether their actions influenced the actual experimental participants.

In each experiment, a naive student participant was placed in a room with several confederates who were in on the experiment. The subjects were told that they were taking part in a "vision test." In all, 50 students were part of Asch’s experimental condition.

The confederates were all told what their responses would be when the line task was presented. The naive participant, however, had no inkling that the other students were not real participants. After the line task was presented, each student verbally announced which line (either 1, 2, or 3) matched the target line.

Critical Trials

There were 18 different trials in the experimental condition, and the confederates gave incorrect responses in 12 of them, which Asch referred to as the "critical trials." The purpose of these critical trials was to see if the participants would change their answer in order to conform to how the others in the group responded.

During the first part of the procedure, the confederates answered the questions correctly. However, they eventually began providing incorrect answers based on how they had been instructed by the experimenters.

Control Condition

The study also included 37 participants in a control condition. In order to ensure that the average person could accurately gauge the length of the lines, the control group was asked to individually write down the correct match. According to these results, participants were very accurate in their line judgments, choosing the correct answer 99% of the time.

Results of the Asch Conformity Experiments

Nearly 75% of the participants in the conformity experiments went along with the rest of the group at least one time.

After combining the trials, the results indicated that participants conformed to the incorrect group answer approximately one-third of the time.

The experiments also looked at the effect that the number of people present in the group had on conformity. When just one confederate was present, there was virtually no impact on participants' answers. The presence of two confederates had only a tiny effect. The level of conformity seen with three or more confederates was far more significant.

Asch also found that having one of the confederates give the correct answer while the rest of the confederates gave the incorrect answer dramatically lowered conformity. In this situation, just 5% to 10% of the participants conformed to the rest of the group (depending on how often the ally answered correctly). Later studies have also supported this finding, suggesting that having social support is an important tool in combating conformity.

At the conclusion of the Asch experiments, participants were asked why they had gone along with the rest of the group. In most cases, the students stated that while they knew the rest of the group was wrong, they did not want to risk facing ridicule. A few of the participants suggested that they actually believed the other members of the group were correct in their answers.

These results suggest that conformity can be influenced both by a need to fit in and a belief that other people are smarter or better informed.

Given the level of conformity seen in Asch's experiments, conformity may be even stronger in real-life situations, where stimuli are more ambiguous or more difficult to judge.

Asch went on to conduct further experiments in order to determine which factors influenced how and when people conform. He found that:

  • Conformity tends to increase when more people are present. However, there is little change once the group size goes beyond four or five people.
  • Conformity also increases when the task becomes more difficult. In the face of uncertainty, people turn to others for information about how to respond.
  • Conformity increases when other members of the group are of a higher social status. When people view the others in the group as more powerful, influential, or knowledgeable than themselves, they are more likely to go along with the group.
  • Conformity tends to decrease, however, when people are able to respond privately. Research has also shown that conformity decreases if people have support from at least one other individual in a group.

Criticisms of the Asch Conformity Experiments

One of the major criticisms of Asch's conformity experiments centers on the reasons why participants choose to conform. According to some critics, individuals may have been motivated by a desire to avoid conflict rather than a genuine urge to conform to the rest of the group.

Another criticism is that the results of the experiment in the lab may not generalize to real-world situations.

Many social psychology experts believe that while real-world situations may not be as clear-cut as they are in the lab, the actual social pressure to conform is probably much greater, which can dramatically increase conformist behaviors.

Asch SE. Studies of independence and conformity: I. A minority of one against a unanimous majority. Psychological Monographs: General and Applied. 1956;70(9):1-70. doi:10.1037/h0093718

Morgan TJH, Laland KN, Harris PL. The development of adaptive conformity in young children: effects of uncertainty and consensus. Dev Sci. 2015;18(4):511-524. doi:10.1111/desc.12231

Asch SE. Effects of group pressure upon the modification and distortion of judgments. In: Guetzkow H, ed. Groups, Leadership and Men; Research in Human Relations. Carnegie Press. 1951:177–190.

Britt MA. Psych Experiments: From Pavlov's Dogs to Rorschach's Inkblots. Adams Media.

Myers DG. Exploring Psychology (9th ed.). Worth Publishers.

By Kendra Cherry, MSEd


Social Sci LibreTexts

6.6C: The Asch Experiment: The Power of Peer Pressure


The Asch conformity experiments were a series of studies conducted in the 1950s that demonstrated the power of conformity in groups.

Learning Objectives

  • Explain how the Asch experiment sought to measure conformity in groups

Key Points

  • The Asch conformity experiments consisted of a group “vision test,” in which study participants were found to be more likely to conform to obviously wrong answers if those answers were first given by other “participants,” who were actually working for the experimenter.
  • The experiment found that over a third of subjects conformed to giving a wrong answer.
  • In same-sex groups, males show around half the conformity effect of females. Conformity is also higher among members of an in-group.

Key Terms

  • conformity: adhering to one standard or social uniformity

Conducted by social psychologist Solomon Asch of Swarthmore College, the Asch conformity experiments were a series of studies published in the 1950s that demonstrated the power of conformity in groups. They are also known as the Asch paradigm. In the experiment, students were asked to participate in a group “vision test.” In reality, all but one of the participants were working for Asch (i.e., confederates), and the study was really about how the remaining student would react to their behavior.

The original experiment was conducted with 123 male participants. Each participant was put into a group with five to seven confederates. The participants were shown a card with a line on it (the reference line), followed by another card with three lines on it labeled a, b, and c. The participants were then asked to say aloud which of the three lines matched the reference line in length, along with other judgments, such as comparing the length of the reference line to an everyday object.

Each line question was called a “trial.” The “real” participant answered last or next to last. For the first two trials, the subject would feel at ease in the experiment, as he and the other “participants” gave the obvious, correct answer. On the third trial, all the confederates would start giving the same wrong answer. There were 18 trials in total, and the confederates answered incorrectly on 12 of them. These 12 were known as the “critical trials.”

The aim was to see whether the real participants would conform to the wrong answers of the confederates and change their answer to respond in the same way, despite it being the wrong answer.

Dr. Asch thought that the majority of people would not conform to something obviously wrong, but the results showed that only 24% of the participants never conformed on any trial. Seventy-five percent conformed at least once, and 5% conformed every time. When surrounded by individuals all voicing an incorrect answer, participants provided incorrect responses on a high proportion of the questions (32%). Overall, there was a 37% conformity rate by subjects averaged across all critical trials. In a control group, with no pressure to conform to an erroneous answer, only one subject out of 35 ever gave an incorrect answer.
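These figures measure different things: the 37% rate is averaged over all critical trials, while the 75% figure counts participants who conformed at least once. A small simulation (hypothetical parameters, not Asch's data) shows why reporting both matters: if every participant conformed independently with the same 0.37 probability on each of the 12 critical trials, nearly everyone would conform at least once, so the observed 75% implies substantial individual differences, including a sizable minority who never conformed at all.

```python
import random

random.seed(42)

# Illustrative sketch, not a model of Asch's actual data:
# assume every participant conforms independently with the same
# per-trial probability on each critical trial.
N_PARTICIPANTS = 1000
N_CRITICAL_TRIALS = 12
P_CONFORM = 0.37  # assumed per-trial conformity probability

conformed_counts = []
for _ in range(N_PARTICIPANTS):
    # number of critical trials on which this simulated participant conforms
    conformed = sum(random.random() < P_CONFORM for _ in range(N_CRITICAL_TRIALS))
    conformed_counts.append(conformed)

# average conformity rate across all critical trials (matches ~37% by design)
per_trial_rate = sum(conformed_counts) / (N_PARTICIPANTS * N_CRITICAL_TRIALS)
# share of participants who conformed at least once
at_least_once = sum(c > 0 for c in conformed_counts) / N_PARTICIPANTS

print(f"average per-trial conformity: {per_trial_rate:.0%}")
print(f"conformed at least once:      {at_least_once:.0%}")
```

Under this uniform-probability assumption, almost 100% of simulated participants conform at least once, far above the roughly 75% Asch observed, which is consistent with his finding that about a quarter of participants resisted the group on every trial.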

Study Variations

Variations of the basic paradigm tested how many cohorts were necessary to induce conformity, examining the influence of just one cohort and as many as fifteen. Results indicated that one cohort has virtually no influence and two cohorts have only a small influence. When three or more cohorts are present, the tendency to conform increases only modestly. The maximum effect occurs with four cohorts. Adding additional cohorts does not produce a stronger effect.

The unanimity of the confederates has also been varied. When the confederates are not unanimous in their judgment, even if only one confederate voices a different opinion, participants are much more likely to resist the urge to conform (only 5% to 10% conform) than when the confederates all agree. This result holds whether or not the dissenting confederate gives the correct answer. As long as the dissenting confederate gives an answer that is different from the majority, participants are more likely to give the correct answer.

This finding illuminates the power that even a small dissenting minority can have upon a larger group. This demonstrates the importance of privacy in answering important and life-changing questions, so that people do not feel pressured to conform. For example, anonymous surveys can allow people to fully express how they feel about a particular subject without fear of retribution or retaliation from others in the group or the larger society. Having a witness or ally (someone who agrees with the point of view) also makes it less likely that conformity will occur.

Interpretations

Asch suggested that this reflected poorly on factors such as education, which he thought must over-train conformity. Other researchers have argued that it is rational to use other people’s judgments as evidence. Others have suggested that the high conformity rate was due to social norms regarding politeness, which is consistent with subjects’ own claims that they did not actually believe the others’ judgments and were indeed merely conforming.


12.4 Conformity, Compliance, and Obedience

Learning Objectives

By the end of this section, you will be able to:

  • Explain the Asch effect
  • Define conformity and types of social influence
  • Describe Stanley Milgram’s experiment and its implications
  • Define groupthink, social facilitation, and social loafing

In this section, we discuss additional ways in which people influence others. The topics of conformity, social influence, obedience, and group processes demonstrate the power of the social situation to change our thoughts, feelings, and behaviors. We begin this section with a discussion of a famous social psychology experiment that demonstrated how susceptible humans are to outside social pressures.

Solomon Asch conducted several experiments in the 1950s to determine how people are affected by the thoughts and behaviors of other people. In one study, a group of participants was shown a series of printed line segments of different lengths: a, b, and c (Figure 12.17). Participants were then shown a fourth line segment: x. They were asked to identify which line segment from the first group (a, b, or c) most closely resembled the fourth line segment in length.

Each group of participants had only one true, naïve subject. The remaining members of the group were confederates of the researcher. A confederate is a person who is aware of the experiment and works for the researcher. Confederates are used to manipulate social situations as part of the research design, and the true, naïve participants believe that confederates are, like them, uninformed participants in the experiment. In Asch’s study, the confederates identified a line segment that was obviously shorter than the target line—a wrong answer. The naïve participant then had to identify aloud the line segment that best matched the target line segment.

How often do you think the true participant aligned with the confederates’ response? That is, how often do you think the group influenced the participant, and the participant gave the wrong answer? Asch (1955) found that 76% of participants conformed to group pressure at least once by indicating the incorrect line. Conformity is the change in a person’s behavior to go along with the group, even if he does not agree with the group. Why would people give the wrong answer? What factors would increase or decrease someone giving in or conforming to group pressure?

The Asch effect is the influence of the group majority on an individual’s judgment.

What factors make a person more likely to yield to group pressure? Research shows that the size of the majority, the presence of another dissenter, and the public or relatively private nature of responses are key influences on conformity.

  • The size of the majority: The greater the number of people in the majority, the more likely an individual will conform. There is, however, an upper limit: a point where adding more members does not increase conformity. In Asch’s study, conformity increased with the number of people in the majority—up to seven individuals. At numbers beyond seven, conformity leveled off and decreased slightly (Asch, 1955).
  • The presence of another dissenter: If there is at least one dissenter, conformity rates drop to near zero (Asch, 1955).
  • The public or private nature of the responses: When responses are made publicly (in front of others), conformity is more likely; however, when responses are made privately (e.g., writing down the response), conformity is less likely (Deutsch & Gerard, 1955).

The finding that conformity is more likely to occur when responses are public than when they are private is the reason government elections require voting in secret, so we are not coerced by others (Figure 12.18). The Asch effect can be easily seen in children when they have to publicly vote for something. For example, if the teacher asks whether the children would rather have extra recess, no homework, or candy, once a few children vote, the rest will comply and go with the majority. In a different classroom, the majority might vote differently, and most of the children would comply with that majority. When someone’s vote changes if it is made in public versus private, this is known as compliance. Compliance can be a form of conformity. Compliance is going along with a request or demand, even if you do not agree with the request. In Asch’s studies, the participants complied by giving the wrong answers, but privately did not accept that the obvious wrong answers were correct.

Now that you have learned about the Asch line experiments, why do you think the participants conformed? The correct answer to the line segment question was obvious, and it was an easy task. Researchers have categorized the motivation to conform into two types: normative social influence and informational social influence (Deutsch & Gerard, 1955).

In normative social influence, people conform to the group norm to fit in, to feel good, and to be accepted by the group. However, with informational social influence, people conform because they believe the group is competent and has the correct information, particularly when the task or situation is ambiguous. What type of social influence was operating in the Asch conformity studies? Since the line judgment task was unambiguous, participants did not need to rely on the group for information. Instead, participants complied to fit in and avoid ridicule, an instance of normative social influence.

An example of informational social influence may be what to do in an emergency situation. Imagine that you are in a movie theater watching a film and what seems to be smoke comes into the theater from under the emergency exit door. You are not certain that it is smoke—it might be a special effect for the movie, such as a fog machine. When you are uncertain, you will tend to look at the behavior of others in the theater. If other people show concern and get up to leave, you are likely to do the same. However, if others seem unconcerned, you are likely to stay put and continue watching the movie (Figure 12.19).

How would you have behaved if you were a participant in Asch’s study? Many students say they would not conform, that the study is outdated, and that people nowadays are more independent. To some extent this may be true. Research suggests that overall rates of conformity may have reduced since the time of Asch’s research. Furthermore, efforts to replicate Asch’s study have made it clear that many factors determine how likely it is that someone will demonstrate conformity to the group. These factors include the participant’s age, gender, and socio-cultural background (Bond & Smith, 1996; Larsen, 1990; Walker & Andrade, 1996).

Link to Learning

Watch this video of a replication of the Asch experiment to learn more.

Stanley Milgram’s Experiment

Conformity is one effect of the influence of others on our thoughts, feelings, and behaviors. Another form of social influence is obedience to authority. Obedience is the change of an individual’s behavior to comply with a demand by an authority figure. People often comply with the request because they are concerned about a consequence if they do not comply. To demonstrate this phenomenon, we review another classic social psychology experiment.

Stanley Milgram was a social psychology professor at Yale who was influenced by the trial of Adolf Eichmann, a Nazi war criminal. Eichmann’s defense for the atrocities he committed was that he was “just following orders.” Milgram (1963) wanted to test the validity of this defense, so he designed an experiment and initially recruited 40 men for his experiment. The volunteer participants were led to believe that they were participating in a study to improve learning and memory. The participants were told that they were to teach other students (learners) correct answers to a series of test items. The participants were shown how to use a device that they were told delivered electric shocks of different intensities to the learners. The participants were told to shock the learners if they gave a wrong answer to a test item—that the shock would help them to learn. The participants believed they gave the learners shocks, which increased in 15-volt increments, all the way up to 450 volts. The participants did not know that the learners were confederates and that the confederates did not actually receive shocks.

In response to a string of incorrect answers from the learners, the participants obediently and repeatedly shocked them. The confederate learners cried out for help, begged the participant teachers to stop, and even complained of heart trouble. Yet, when the researcher told the participant-teachers to continue the shock, 65% of the participants continued the shock to the maximum voltage and to the point that the learner became unresponsive (Figure 12.20). What makes someone obey authority to the point of potentially causing serious harm to another person?

Several variations of the original Milgram experiment were conducted to test the boundaries of obedience. When certain features of the situation were changed, participants were less likely to continue to deliver shocks (Milgram, 1965). For example, when the setting of the experiment was moved to an off-campus office building, the percentage of participants who delivered the highest shock dropped to 48%. When the learner was in the same room as the teacher, the highest shock rate dropped to 40%. When the teachers’ and learners’ hands were touching, the highest shock rate dropped to 30%. When the researcher gave the orders by phone, the rate dropped to 23%. These variations show that when the humanity of the person being shocked was increased, obedience decreased. Similarly, when the authority of the experimenter decreased, so did obedience.

This case is still very applicable today. What does a person do if an authority figure orders something done? What if the person believes it is incorrect, or worse, unethical? In a study by Martin and Bull (2008), midwives privately filled out a questionnaire regarding best practices and expectations in delivering a baby. Then, a more senior midwife and supervisor asked the junior midwives to do something they had previously stated they were opposed to. Most of the junior midwives were obedient to authority, going against their own beliefs. Burger (2009) partially replicated this study. He found among a multicultural sample of women and men that their levels of obedience matched Milgram's research. Doliński et al. (2017) performed a replication of Burger's work in Poland and controlled for the gender of both participants and learners, and once again, results that were consistent with Milgram's original work were observed.

When in group settings, we are often influenced by the thoughts, feelings, and behaviors of people around us. Whether it is due to normative or informational social influence, groups have power to influence individuals. Another phenomenon of group conformity is groupthink. Groupthink is the modification of the opinions of members of a group to align with what they believe is the group consensus (Janis, 1972). In group situations, the group often takes action that individuals would not perform outside the group setting because groups make more extreme decisions than individuals do. Moreover, groupthink can hinder opposing trains of thought. This elimination of diverse opinions contributes to faulty decisions by the group.

Groupthink in the U.S. Government

There have been several instances of groupthink in the U.S. government. One example occurred when the United States led a small coalition of nations to invade Iraq in March 2003. This invasion occurred because a small group of advisors and former President George W. Bush were convinced that Iraq represented a significant terrorism threat with a large stockpile of weapons of mass destruction at its disposal. Although some of these individuals may have had some doubts about the credibility of the information available to them at the time, in the end, the group arrived at a consensus that Iraq had weapons of mass destruction and represented a significant threat to national security. It later came to light that Iraq did not have weapons of mass destruction, but not until the invasion was well underway. As a result, 6000 American soldiers were killed and many more civilians died. How did the Bush administration arrive at their conclusions? View this video of Colin Powell, 10 years after his famous United Nations speech, discussing the information he had at the time that his decisions were based on. ("CNN Official Interview: Colin Powell now regrets UN speech about WMDs," 2010).

Do you see evidence of groupthink?

Why does groupthink occur? There are several causes of groupthink, which makes it preventable. When the group is highly cohesive, or has a strong sense of connection, maintaining group harmony may become more important to the group than making sound decisions. If the group leader is directive and makes his opinions known, this may discourage group members from disagreeing with the leader. If the group is isolated from hearing alternative or new viewpoints, groupthink may be more likely. How do you know when groupthink is occurring?

There are several symptoms of groupthink including the following:

  • perceiving the group as invulnerable or invincible—believing it can do no wrong
  • believing the group is morally correct
  • self-censorship by group members, such as withholding information to avoid disrupting the group consensus
  • the quashing of dissenting group members’ opinions
  • the shielding of the group leader from dissenting views
  • perceiving an illusion of unanimity among group members
  • holding stereotypes or negative attitudes toward the out-group or others with differing viewpoints (Janis, 1972)

Given the causes and symptoms of groupthink, how can it be avoided? There are several strategies that can improve group decision making including seeking outside opinions, voting in private, having the leader withhold position statements until all group members have voiced their views, conducting research on all viewpoints, weighing the costs and benefits of all options, and developing a contingency plan (Janis, 1972; Mitchell & Eckstein, 2009).

Group Polarization

Another phenomenon that occurs within group settings is group polarization. Group polarization (Teger & Pruitt, 1967) is the strengthening of an original group attitude after the discussion of views within a group. That is, if a group initially favors a viewpoint, after discussion the group consensus is likely a stronger endorsement of the viewpoint. Conversely, if the group was initially opposed to a viewpoint, group discussion would likely lead to stronger opposition. Group polarization explains many actions taken by groups that would not be undertaken by individuals. Group polarization can be observed at political conventions, when platforms of the party are supported by individuals who, when not in a group, would decline to support them. Recently, some theorists have argued that group polarization may be partly responsible for the extreme political partisanship that seems ubiquitous in modern society. Given that people can self-select media outlets that are most consistent with their own political views, they are less likely to encounter opposing viewpoints. Over time, this leads to a strengthening of their own perspective and of hostile attitudes and behaviors towards those with different political ideals. Remarkably, political polarization leads to open levels of discrimination that are on par with, or perhaps exceed, racial discrimination (Iyengar & Westwood, 2015). A more everyday example is a group’s discussion of how attractive someone is. Does your opinion change if you find someone attractive, but your friends do not agree? If your friends vociferously agree, might you then find this person even more attractive?

Social traps refer to situations that arise when individuals or groups of individuals behave in ways that are not in their best interest and that may have negative, long-term consequences. However, once established, a social trap is very difficult to escape. For example, following World War II, the United States and the former Soviet Union engaged in a nuclear arms race. While the presence of nuclear weapons is not in either party's best interest, once the arms race began, each country felt the need to continue producing nuclear weapons to protect itself from the other.

Social Loafing

Imagine you were just assigned a group project with other students whom you barely know. Everyone in your group will get the same grade. Are you the type who will do most of the work, even though the final grade will be shared? Or are you more likely to do less work because you know others will pick up the slack? Social loafing involves a reduction in individual output on tasks where contributions are pooled. Because each individual's efforts are not evaluated, individuals can become less motivated to perform well. Karau and Williams (1993) and Simms and Nichols (2014) reviewed the research on social loafing and discerned when it was least likely to happen. The researchers noted that social loafing could be alleviated if, among other situations, individuals knew their work would be assessed by a manager (in a workplace setting) or instructor (in a classroom setting), or if a manager or instructor required group members to complete self-evaluations.

The likelihood of social loafing in student work groups increases as the size of the group increases (Shepperd & Taylor, 1999). According to Karau and Williams (1993), college students were the population most likely to engage in social loafing. Their study also found that women and participants from collectivistic cultures were less likely to engage in social loafing, suggesting that a stronger group orientation may account for this difference.

College students could work around social loafing or “free-riding” by suggesting that their professors use a flocking method to form groups. Harding (2018) compared groups of students who had self-selected into groups for class with groups formed by flocking, which assigns students to groups based on similar schedules and motivations. Not only did she find that students reported less “free riding,” but also that they did better on the group assignments than students whose groups were self-selected.

Interestingly, the opposite of social loafing occurs when the task is complex and difficult (Bond & Titus, 1983; Geen, 1989). In a group setting, such as the student work group, if your individual performance cannot be evaluated, there is less pressure for you to do well, and thus less anxiety or physiological arousal (Latané, Williams, & Harkins, 1979). This puts you in a relaxed state in which you can perform your best, if you choose (Zajonc, 1965). If the task is a difficult one, many people feel motivated and believe that their group needs their input to do well on a challenging project (Jackson & Williams, 1985).

Deindividuation

Another way that being part of a group can affect behavior is exhibited in instances in which deindividuation occurs. Deindividuation refers to situations in which a person may feel a sense of anonymity and therefore a reduction in accountability and sense of self when among others. Deindividuation is often pointed to in cases in which mob or riot-like behaviors occur (Zimbardo, 1969), but research on the subject and the role that deindividuation plays in such behaviors has resulted in inconsistent results (as discussed in Granström, Guvå, Hylander, & Rosander, 2009).

Table 12.2 summarizes the types of social influence you have learned about in this chapter.


Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/psychology-2e/pages/1-introduction
  • Authors: Rose M. Spielman, William J. Jenkins, Marilyn D. Lovett
  • Publisher/website: OpenStax
  • Book title: Psychology 2e
  • Publication date: Apr 22, 2020
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/psychology-2e/pages/1-introduction
  • Section URL: https://openstax.org/books/psychology-2e/pages/12-4-conformity-compliance-and-obedience

© Jan 6, 2024 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

The Asch Conformity Experiments

What Solomon Asch Demonstrated About Social Pressure


The Asch Conformity Experiments, conducted by psychologist Solomon Asch in the 1950s, demonstrated the power of conformity in groups and showed that even simple objective facts cannot withstand the distorting pressure of group influence.

The Experiment

In the experiments, groups of male university students were asked to participate in a perception test. In reality, all but one of the participants were "confederates" (collaborators with the experimenter who only pretended to be participants). The study was about how the remaining student would react to the behavior of the other "participants."

The participants of the experiment (the subject as well as the confederates) were seated in a classroom and were presented with a card with a simple vertical black line drawn on it. Then, they were given a second card with three lines of varying length labeled "A," "B," and "C." One line on the second card was the same length as that on the first, and the other two lines were obviously longer and shorter.

Participants were asked to state out loud, in front of the others, which line, A, B, or C, matched the length of the line on the first card. In each trial, the confederates answered first, and the real participant was seated so that he would answer last. In some cases the confederates answered correctly, while in others they answered incorrectly.

Asch's goal was to see whether the real participant would be pressured to answer incorrectly when the confederates did so, or whether confidence in his own perception and correctness would outweigh the social pressure created by the other group members' responses.

Asch found that one-third of the real participants gave the same wrong answers as the confederates at least half the time. Forty percent gave some wrong answers, and only one-fourth gave consistently correct answers in defiance of the pressure to conform to the group's wrong answers.

In interviews conducted after the trials, Asch found that those who had answered incorrectly, in conformance with the group, fell into several camps: some believed that the answers given by the confederates were correct; some thought they were suffering a lapse in perception for originally seeing an answer that differed from the group's; and others admitted that they knew their answer was correct but conformed to the incorrect one because they didn't want to break from the majority.

The Asch experiments have been repeated many times over the years with students and non-students, old and young, and in groups of different sizes and settings. The results are consistently the same: one-third to one-half of participants make a judgment contrary to fact, yet in conformity with the group, demonstrating the strong power of social influence.

Connection to Sociology

The results of Asch's experiment resonate with what we know to be true about the nature of social forces and norms in our lives. The behavior and expectations of others shape how we think and act on a daily basis, because what we observe among others teaches us what is normal and expected of us. The results of the study also raise interesting questions and concerns about how knowledge is constructed and disseminated, and about how we can address social problems that stem from conformity.

Updated by Nicki Lisa Cole, Ph.D.


Psychologist World


The Asch conformity experiments were a series of studies that starkly demonstrated the power of conformity in groups.


Asch: Social Influence, Conforming in Groups

Experimenters led by Solomon Asch asked students to participate in a "vision test." In reality, all but one of the participants were confederates of the experimenter, and the study was really about how the remaining student would react to the confederates' behavior.


Asch's Experiments

The participants (the real subject and the confederates) were all seated in a classroom and told to announce their judgments of the length of several lines drawn on a series of displays. They were asked which line was longer than another, which lines were the same length, and so on. The confederates had been instructed in advance to give an incorrect answer on the tests.

Many subjects showed extreme discomfort, but most conformed to the majority view of the others in the room, even when the majority said that two lines different in length by several inches were the same length. Control subjects with no exposure to a majority view had no trouble giving the correct answer.

One difference between the Asch conformity experiments and the Milgram experiment (also famous in social psychology), noted by Milgram himself, is that Asch's subjects blamed themselves and their own poor eyesight and judgment, while Milgram's subjects blamed the experimenter when explaining their behavior. Conformity may be much less salient than authority pressure.

The Asch experiments may provide some vivid empirical evidence relevant to some of the ideas raised in George Orwell's Nineteen Eighty-Four.

Learn More:


  • https://en.wikipedia.org/wiki/Asch_conformity_experiments


© 2024 Psychologist World. Parts licensed under GNU FDL.


Contesting the “Nature” of Conformity: What Milgram and Zimbardo's Studies Really Show


  • S. Alexander Haslam, School of Psychology, University of Queensland, St. Lucia, Australia
  • Stephen D. Reicher, School of Psychology, University of St Andrews, St Andrews, Scotland

PLOS

Published: November 20, 2012

  • https://doi.org/10.1371/journal.pbio.1001426

Understanding of the psychology of tyranny is dominated by classic studies from the 1960s and 1970s: Milgram's research on obedience to authority and Zimbardo's Stanford Prison Experiment. Supporting popular notions of the banality of evil, this research has been taken to show that people conform passively and unthinkingly to both the instructions and the roles that authorities provide, however malevolent these may be. Recently, though, this consensus has been challenged by empirical work informed by social identity theorizing. This suggests that individuals' willingness to follow authorities is conditional on identification with the authority in question and an associated belief that the authority is right.

Citation: Haslam SA, Reicher SD (2012) Contesting the “Nature” Of Conformity: What Milgram and Zimbardo's Studies Really Show. PLoS Biol 10(11): e1001426. https://doi.org/10.1371/journal.pbio.1001426

Copyright: © 2012 Haslam, Reicher. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The authors received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

Introduction

If men make war in slavish obedience to rules, they will fail. Ulysses S. Grant [1]

Conformity is often criticized on grounds of morality. Many, if not all, of the greatest human atrocities have been described as “crimes of obedience” [2] . However, as the victorious American Civil War General and later President Grant makes clear, conformity is equally problematic on grounds of efficacy. Success requires leaders and followers who do not adhere rigidly to a pre-determined script. Rigidity cannot steel them for the challenges of their task or for the creativity of their opponents.

Given these problems, it would seem even more unfortunate if human beings were somehow programmed for conformity. Yet this is a view that has become dominant over the last half-century. Its influence can be traced to two landmark empirical programs led by social psychologists in the 1960s and early 1970s: Milgram's Obedience to Authority research and Zimbardo's Stanford Prison Experiment. These studies have not only had influence in academic spheres. They have spilled over into our general culture and shaped popular understanding, such that “everyone knows” that people inevitably succumb to the demands of authority, however immoral the consequences [3] , [4] . As Parker puts it, “the hopeless moral of the [studies'] story is that resistance is futile” [5] . What is more, this work has shaped our understanding not only of conformity but of human nature more broadly [6] .

Building on an established body of theorizing in the social identity tradition—which sees group-based influence as meaningful and conditional [7] , [8] —we argue, however, that these understandings are mistaken. Moreover, we contend that evidence from the studies themselves (as well as from subsequent research) supports a very different analysis of the psychology of conformity.

The Classic Studies: Conformity, Obedience, and the Banality of Evil

In Milgram's work [9] , [10] members of the general public (predominantly men) volunteered to take part in a scientific study of memory. They found themselves cast in the role of a “Teacher” with the task of administering shocks of increasing magnitude (from 15 V to 450 V in 15-V increments) to another man (the “Learner”) every time he failed to recall the correct word in a previously learned pair. Unbeknown to the Teacher, the Learner was Milgram's confederate, and the shocks were not real. Moreover, rather than being interested in memory, Milgram was actually interested in seeing how far the men would go in carrying out the task. To his—and everyone else's [11] —shock, the answer was “very far.” In what came to be termed the “baseline” study [12] all participants proved willing to administer shocks of 300 V and 65% went all the way to 450 V. This appeared to provide compelling evidence that normal well-adjusted men would be willing to kill a complete stranger simply because they were ordered to do so by an authority.

Zimbardo's Stanford Prison Experiment took these ideas further by exploring the destructive behaviour of groups of men over an extended period [13] , [14] . Students were randomly assigned to be either guards or prisoners within a mock prison that had been constructed in the Stanford Psychology Department. In contrast to Milgram's studies, the objective was to observe the interaction within and between the two groups in the absence of an obviously malevolent authority. Here, again, the results proved shocking. Such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just 6 days. Zimbardo's conclusion from this was even more alarming than Milgram's. People descend into tyranny, he suggested, because they conform unthinkingly to the toxic roles that authorities prescribe without the need for specific orders: brutality was “a ‘natural’ consequence of being in the uniform of a ‘guard’ and asserting the power inherent in that role” [15] .

Within psychology, Milgram and Zimbardo helped consolidate a growing “conformity bias” [16] in which the focus on compliance is so strong as to obscure evidence of resistance and disobedience [17] . However their arguments proved particularly potent because they seemed to mesh with real-world examples—particularly evidence of the “banality of evil.” This term was coined in Hannah Arendt's account of the trial of Adolf Eichmann [18] , a chief architect of the Nazis' “final solution to the Jewish question” [19] . Despite being responsible for the transportation of millions of people to their death, Arendt suggested that Eichmann was no psychopathic monster. Instead his trial revealed him to be a diligent and efficient bureaucrat—a man more concerned with following orders than with asking deep questions about their morality or consequence.

Much of the power of Milgram and Zimbardo's research derives from the fact that it appears to give empirical substance to this claim that evil is banal [3] . It seems to show that tyranny is a natural and unavoidable consequence of humans' inherent motivation to bend to the wishes of those in authority—whoever they may be and whatever it is that they want us to do. Put slightly differently, it operationalizes an apparent tragedy of the human condition: our desire to be good subjects is stronger than our desire to be subjects who do good.

Questioning the Consensus: Conformity Isn't Natural and It Doesn't Explain Tyranny

The banality of evil thesis appears to be a truth almost universally acknowledged. Not only is it given prominence in social psychology textbooks [20] , but so too it informs the thinking of historians [21] , [22] , political scientists [23] , economists [24] , and neuroscientists [25] . Indeed, via a range of social commentators, it has shaped the public consciousness much more broadly [26] , and, in this respect, can lay claim to being the most influential data-driven thesis in the whole of psychology [27] , [28] .

Yet despite the breadth of this consensus, in recent years, we and others have reinterrogated its two principal underpinnings—the archival evidence pertaining to Eichmann and his ilk, and the specifics of Milgram and Zimbardo's empirical demonstrations—in ways that tell a very different story [29] .

First, a series of thoroughgoing historical examinations have challenged the idea that Nazi bureaucrats were ever simply following orders [19] , [26] , [30] . This may have been the defense they relied upon when seeking to minimize their culpability [31] , but evidence suggests that functionaries like Eichmann had a very good understanding of what they were doing and took pride in the energy and application that they brought to their work. Typically too, roles and orders were vague, and hence for those who wanted to advance the Nazi cause (and not all did), creativity and imagination were required in order to work towards the regime's assumed goals and to overcome the challenges associated with any given task [32] . Emblematic of this, the practical details of “the final solution” were not handed down from on high, but had to be elaborated by Eichmann himself. He then felt compelled to confront and disobey his superiors—most particularly Himmler—when he believed that they were not sufficiently faithful to eliminationist Nazi principles [19] .

Second, much the same analysis can be used to account for behavior in the Stanford Prison Experiment. So while it may be true that Zimbardo gave his guards no direct orders, he certainly gave them a general sense of how he expected them to behave [33] . During the orientation session he told them, amongst other things, “You can create in the prisoners feelings of boredom, a sense of fear to some degree, you can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… We're going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness” [34] . This contradicts Zimbardo's assertion that “behavioral scripts associated with the oppositional roles of prisoner and guard [were] the sole source of guidance” [35] and leads us to question the claim that conformity to these role-related scripts was the primary cause of guard brutality.

But even with such guidance, not all guards acted brutally. And those who did used ingenuity and initiative in responding to Zimbardo's brief. Accordingly, after the experiment was over, one prisoner confronted his chief tormentor with the observation that “If I had been a guard I don't think it would have been such a masterpiece” [34] . Contrary to the banality of evil thesis, the Zimbardo-inspired tyranny was made possible by the active engagement of enthusiasts rather than the leaden conformity of automatons.

Turning, third, to the specifics of Milgram's studies, the first point to note is that the primary dependent measure (flicking a switch) offers few opportunities for creativity in carrying out the task. Nevertheless, several of Milgram's findings typically escape standard reviews in which the paradigm is portrayed as only yielding up evidence of obedience. Initially, it is clear that the “baseline study” is not especially typical of the 30 or so variants of the paradigm that Milgram conducted. Here the percentage of participants going to 450 V varied from 0% to nearly 100%, but across the studies as a whole, a majority of participants chose not to go this far [10] , [36] , [37] .

Furthermore, close analysis of the experimental sessions shows that participants are attentive to the demands made on them by the Learner as well as the Experimenter [38] . They are torn between two voices confronting them with irreconcilable moral imperatives, and the fact that they have to choose between them is a source of considerable anguish. They sweat, they laugh, they try to talk and argue their way out of the situation. But the experimental set-up does not allow them to do so. Ultimately, they tend to go along with the Experimenter if he justifies their actions in terms of the scientific benefits of the study (as he does with the prod “The experiment requires that you continue”) [39] . But if he gives them a direct order (“You have no other choice, you must go on”) participants typically refuse. Once again, received wisdom proves questionable. The Milgram studies seem to be less about people blindly conforming to orders than about getting people to believe in the importance of what they are doing [40] .

Tyranny as a Product of Identification-Based Followership

Our suspicions about the plausibility of the banality of evil thesis and its various empirical substrates were first raised through our work on the BBC Prison Study (BPS [41]). Like the Stanford study, this study randomly assigned men to groups as guards and prisoners and examined their behaviour within a specially created “prison.” Unlike Zimbardo, however, we took no leadership role in the study. Without this, would participants conform to a hierarchical script or resist it?

The study generated three clear findings. First, participants did not conform automatically to their assigned role. Second, they only acted in terms of group membership to the extent that they actively identified with the group (such that they took on a social identification) [42] . Third, group identity did not mean that people simply accepted their assigned position; instead, it empowered them to resist it. Early in the study, the Prisoners' identification as a group allowed them successfully to challenge the authority of the Guards and create a more egalitarian system. Later on, though, a highly committed group emerged out of dissatisfaction with this system and conspired to create a new hierarchy that was far more draconian.

Ultimately, then, the BBC Prison Study came close to recreating the tyranny of the Stanford Prison Experiment. However it was neither passive conformity to roles nor blind obedience to rules that brought the study to this point. On the contrary, it was only when they had internalized roles and rules as aspects of a system with which they identified that participants used them as a guide to action. Moreover, on the basis of this shared identification, the hallmark of the tyrannical regime was not conformity but creative leadership and engaged followership within a group of true believers (see also [43] , [44] ). As we have seen, this analysis mirrors recent conclusions about the Nazi tyranny. To complete the argument, we suggest that it is also applicable to Milgram's paradigm.

The evidence, noted above, about the efficacy of different “prods” already points to the fact that compliance is bound up with a sense of commitment to the experiment and the experimenter over and above commitment to the learner (S. Haslam, SD Reicher, M. Birney, unpublished data) [39] . This use of prods is but one aspect of Milgram's careful management of the paradigm [13] that is aimed at securing participants' identification with the scientific enterprise.

Significantly, though, the degree of identification is not constant across all variants of the study. For instance, when the study is conducted in commercial premises as opposed to prestigious Yale University labs one might expect the identification to diminish and (as our argument implies) compliance to decrease. It does. More systematically, we have examined variations in participants' identification with the Experimenter and the science that he represents as opposed to their identification with the Learner and the general community. They always identify with both to some degree—hence the drama and the tension of the paradigm. But the degree matters, and greater identification with the Experimenter is highly predictive of a greater willingness among Milgram's participants to administer the maximum shock across the paradigm's many variants [37] .

However, some of the most compelling evidence that participants' administration of shocks results from their identification with Milgram's scientific goals comes from what happened after the study had ended. In his debriefing, Milgram praised participants for their commitment to the advancement of science, especially as it had come at the cost of personal discomfort. This inoculated them against doubts concerning their own punitive actions, but it also led them to support more of such actions in the future. “I am happy to have been of service,” one typical participant responded, “Continue your experiments by all means as long as good can come of them. In this crazy mixed up world of ours, every bit of goodness is needed” (S. Haslam, SD Reicher, K Millward, R MacDonald, unpublished data).

The banality of evil thesis shocks us by claiming that decent people can be transformed into oppressors as a result of their “natural” conformity to the roles and rules handed down by authorities. More particularly, the inclination to conform is thought to suppress oppressors' ability to engage intellectually with the fact that what they are doing is wrong.

Although it remains highly influential, this thesis loses credibility under close empirical scrutiny. On the one hand, it ignores copious evidence of resistance even in studies held up as demonstrating that conformity is inevitable [17] . On the other hand, it ignores the evidence that those who do heed authority in doing evil do so knowingly not blindly, actively not passively, creatively not automatically. They do so out of belief not by nature, out of choice not by necessity. In short, they should be seen—and judged—as engaged followers not as blind conformists [45] .

What was truly frightening about Eichmann was not that he was unaware of what he was doing, but rather that he knew what he was doing and believed it to be right. Indeed, his one regret, expressed prior to his trial, was that he had not killed more Jews [19] . Equally, what is shocking about Milgram's experiments is that rather than being distressed by their actions [46] , participants could be led to construe them as “service” in the cause of “goodness.”

To understand tyranny, then, we need to transcend the prevailing orthodoxy that this derives from something for which humans have a natural inclination—a “Lucifer effect” to which they succumb thoughtlessly and helplessly (and for which, therefore, they cannot be held accountable). Instead, we need to understand two sets of inter-related processes: those by which authorities advocate oppression of others and those that lead followers to identify with these authorities. How did Milgram and Zimbardo justify the harmful acts they required of their participants and why did participants identify with them—some more than others?

These questions are complex and full answers fall beyond the scope of this essay. Yet, regarding advocacy, it is striking how destructive acts were presented as constructive, particularly in Milgram's case, where scientific progress was the warrant for abuse. Regarding identification, this reflects several elements: the personal histories of individuals that render some group memberships more plausible than others as a source of self-definition; the relationship between the identities on offer in the immediate context and other identities that are held and valued in other contexts; and the structure of the local context that makes certain ways of orienting oneself to the social world seem more “fitting” than others [41] , [47] , [48] .

At root, the fundamental point is that tyranny does not flourish because perpetrators are helpless and ignorant of their actions. It flourishes because they actively identify with those who promote vicious acts as virtuous [49] . It is this conviction that steels participants to do their dirty work and that makes them work energetically and creatively to ensure its success. Moreover, this work is something for which they actively wish to be held accountable—so long as it secures the approbation of those in power.

  • 1. Strachan H (1983) European armies and the conduct of war. London: Unwin Hyman (p.3).
  • 2. Kelman HC, Hamilton VL (1990) Crimes of obedience. New Haven: Yale University Press.
  • 3. Novick P (1999) The Holocaust in American life. Boston: Houghton Mifflin.
  • 4. Jetten J, Hornsey MJ (Eds.) (2011) Rebels in groups: dissent, deviance, difference and defiance. Chichester, UK: Wiley-Blackwell.
  • 5. Parker I (2007) Revolution in social psychology: alienation to emancipation. London: Pluto Press. (p.84)
  • 6. Smith, JR, Haslam SA. (Eds.) (2012) Social psychology: revisiting the classic studies. London: Sage.
  • 7. Turner JC (1991) Social influence. Buckingham, UK: Open University Press.
  • 8. Turner JC, Hogg MA, Oakes PJ, Reicher SD, Wetherell MS (1987). Rediscovering the social group: A self-categorization theory. Oxford: Blackwell.
  • 10. Milgram S (1974) Obedience to authority: an experimental view. New York: Harper & Row.
  • 11. Blass T (2004) The man who shocked the world: the life and legacy of Stanley Milgram. New York, NY: Basic Books.
  • 14. Zimbardo P (2007) The Lucifer effect: how good people turn evil. London, UK: Random House.
  • 16. Moscovici S (1976) Social influence and social change. London, UK: Academic Press.
  • 18. Arendt H (1963) Eichmann in Jerusalem: a report on the banality of evil. New York: Penguin.
  • 19. Cesarani D (2004) Eichmann: his life and crimes. London: Heinemann.
  • 20. Miller A (2004). What can the Milgram obedience experiments tell us about the Holocaust? Generalizing from the social psychology laboratory. Miller A, ed. The social psychology of good and evil. New York: Guilford. pp. 193–239.
  • 21. Browning C (1992) Ordinary men: Reserve Police Battalion 101 and the Final Solution in Poland. London: Penguin Books.
  • 25. Harris LT (2009) The influence of social group and context on punishment decisions: insights from social neuroscience. Gruter Institute Squaw Valley Conference May 21, 2009. Law, Behavior & the Brain. Available at SSRN: http://ssrn.com/abstract=1405319 .
  • 26. Lozowick Y (2002) Hitler's bureaucrats: the Nazi Security Police and the banality of evil. H. Watzman, translator. London: Continuum.
  • 27. Blass T (Ed.) (2000) Obedience to authority. Current perspectives on the Milgram Paradigm. Mahwah (New Jersey): Erlbaum.
  • 30. Vetlesen AJ (2005) Evil and human agency: understanding collective evildoing. Cambridge: Cambridge University Press.
  • 34. Zimbardo P (1989) Quiet rage: The Stanford Prison Study [video]. Stanford: Stanford University.
  • 35. Zimbardo P (2004) A situationist perspective on the psychology of evil: understanding how good people are transformed into perpetrators. Miller A, editor. The social psychology of good and evil. New York: Guilford. pp. 21–50.
  • 42. Tajfel H, Turner JC (1979) An integrative theory of intergroup conflict. Austin WG, Worchel S, editors. The social psychology of intergroup relations. Monterey (California): Brooks/Cole. pp.33–47.
  • 45. Haslam SA, Reicher SD, Platow MJ (2008) The new psychology of leadership: identity, influence and power. Hove, UK: Psychology Press.
  • 48. Oakes PJ, Haslam SA, Turner JC (1994) Stereotyping and social reality. Oxford: Blackwell.

February 19, 2016

How the Nazis' Defense of "Just Following Orders" Plays Out in the Mind

Modern-day Milgram experiment shows that people obeying commands feel less responsible for their actions

By Joshua Barajas & PBS NewsHour

In a  1962 letter , as a last-ditch effort for clemency, Holocaust organizer Adolf Eichmann wrote that he and other low-level officers were “forced to serve as mere instruments,” shifting the responsibility for the deaths of millions of Jews to his superiors. The “just following orders” defense, made famous in the post-WWII  Nuremberg trials , featured heavily in Eichmann’s court hearings.

But that same year Stanley Milgram, a Yale University psychologist, conducted a series of famous experiments that tested whether “ordinary” folks would inflict harm on another person after following orders from an authoritative figure. Shockingly, the results suggested any human was capable of a heart of darkness.

Milgram’s research  tackled whether a person could be coerced into behaving heinously, but new research released Thursday offers one explanation as to why.


“In particular, acting under orders caused participants to perceive a distance from outcomes that they themselves caused,” said study co-author Patrick Haggard, a cognitive neuroscientist at University College London, in an email.

In other words, people actually feel disconnected from their actions when they comply with orders, even though they’re the ones committing the act.

The study,  published in the journal Current Biology , described this distance as people experiencing their actions more as “passive movements than fully voluntary actions” when they follow orders.

Researchers at University College London and the Université libre de Bruxelles in Belgium arrived at this conclusion by investigating how coercion changes a person's "sense of agency," a psychological phenomenon that refers to a person's awareness that their own actions cause an external outcome.

More simply, Haggard described the phenomenon as flipping a switch (action) to turn on a light (external outcome). The action and its outcome are typically experienced as nearly simultaneous. Through two experiments, however, Haggard and the other researchers showed that people perceived a longer lapse between action and outcome when acting under orders, even if the outcome was unpleasant. It's like you flip the switch, but it takes a beat or two for the light to appear.

“This [disconnect] suggests a reduced sense of agency, as if the participants’ actions under coercion began to feel more passive,” Haggard said.

Unlike Milgram’s classic research, Haggard’s team introduced a shocking element that was missing in the original 1960s experiments: actual shocks. Haggard said they used “moderately painful, but tolerable, shocks.” Milgram feigned shocks up to 450 volts.

According to Milgram’s experiments, 65 percent of his volunteers, described as “teachers,” were willing (sometimes reluctantly) to press a button that delivered shocks up to 450 volts to an unseen person, a “learner” in another room. Although pleas from the unknown person could be heard, including mentions of a heart condition, Milgram’s study said his volunteers continued to shock the “learner” when ordered to do so. At no point, however, did someone truly experience an electric shock.

“Milgram’s studies rested on a deception: Participants were instructed to administer ‘severe shocks’ to an actor, who in fact merely feigned being shocked,” Haggard said. “It’s difficult to ascertain whether participants are really deceived or not in such situations.”

After Yale received reams of Milgram's documents in the 2000s, psychologists who sifted through the notes began to criticize the famous electric-shock study more closely.

Gina Perry, author of "Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments," found a litany of methodological problems with the study. Perry said Milgram's experiments were far less controlled than originally thought and introduced variables that appeared to goose the numbers.

Haggard said his team’s study was more transparent. In the first experiment, he said participants—an “agent” and a “victim”—took turns delivering mild shocks or inflicting a financial penalty on each other. In some cases, a third person—an “experimenter”—sat in the room and gave orders on whether to inflict harm. In other cases, the experimenter looked away, while the agent acted on their own volition.


In this test, the “agent” can shock or take money from the “victim,” either acting on orders or by their own choice. Image courtesy of Caspar et al., Current Biology (2016)

The result? Researchers measured a “small, but significant” increase in the perceived time between a person’s action and outcome when coercion was involved. That is, when people act “under orders,” they seem to experience less agency over their actions and outcomes than when they choose for themselves, Haggard said.
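The logic of this comparison can be sketched in a few lines. The numbers below are illustrative, not the study's actual measurements: the sketch only assumes, as the result above describes, that interval judgments run longer on average in the coerced condition.

```python
import random
import statistics

random.seed(1)

# Hypothetical interval-judgment data (in ms), loosely mimicking the
# paradigm described above: participants report the perceived delay
# between their keypress (action) and a tone (outcome).
free_choice = [random.gauss(200, 40) for _ in range(60)]  # acting freely
coerced = [random.gauss(250, 40) for _ in range(60)]      # acting under orders

mean_free = statistics.mean(free_choice)
mean_coerced = statistics.mean(coerced)

# A longer perceived interval under coercion indicates weaker
# "intentional binding", i.e. a reduced sense of agency.
print(f"mean perceived interval, free choice: {mean_free:.0f} ms")
print(f"mean perceived interval, coerced:     {mean_coerced:.0f} ms")
print(f"difference attributed to coercion:    {mean_coerced - mean_free:.0f} ms")
```

The researchers' actual analysis of course involved proper inferential statistics; the point of the sketch is only that the dependent measure is a subjective time estimate, and that coercion shows up as a shift in its mean.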


In this test, the “agent” is not under the watchful gaze of an authority figure. They’re free to shock or take money from the “victim,” if they want. Image courtesy of Caspar et al., Current Biology (2016)

In a second experiment, the team explored whether the loss of agency could also be seen in the brain activity of subjects. Prior work had found that brain activity is dampened when people are forced to follow orders.

So akin to before, subjects had to decide whether to shock a person with or without coercion, but now they heard an audible tone while making the choice. This tone elicited a brain response that could be measured by an electroencephalogram (EEG) cap.

Haggard’s team found that brain activity in response to this tone is indeed dampened when being coerced. Haggard’s team also used a questionnaire in the second experiment to get explicit judgments from the volunteers, who explained they felt less responsible when they acted under orders.

Haggard said his team’s findings do not legitimate the Nuremberg defense and that anyone who claims they were “just following orders” ought to be  viewed with skepticism .

But, “our study does suggest that this claim might potentially correspond to the basic  experience  that the person had of their action at the time,” Haggard said.

“If people acting under orders really do feel reduced responsibility, this seems important to understand. For a start, people who give orders should perhaps be held more responsible for the actions and outcomes of those they coerce,” he said.

This article is reprinted with permission from  PBS NewsHour . It was first published  on February 18, 2016.

Academia’s Response to Milgram’s Findings and Explanation

  • First Online: 18 September 2018

Nestar Russell

In this chapter, Russell provides a brief overview of the key issues that Stanley Milgram’s academic peers debated after the publication of his Obedience to Authority research. More specifically, Russell presents and assesses the prominent ethical and methodological critiques of Milgram’s research. Then with a focus on the Holocaust, Russell explores the debate over the generalizability of Milgram’s results beyond the laboratory walls. Finally, Russell examines the scholarly reaction to Milgram’s agentic state theory, particularly with reference to its application to the Milgram-Holocaust linkage.



Author information

Authors and affiliations.

University of Calgary, Calgary, AB, Canada

Nestar Russell


Copyright information

© 2018 The Author(s)

About this chapter

Russell, N. (2018). Academia’s Response to Milgram’s Findings and Explanation. In: Understanding Willing Participants, Volume 1. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-319-95816-3_6


Publisher Name : Palgrave Macmillan, Cham

Print ISBN : 978-3-319-95815-6

Online ISBN : 978-3-319-95816-3

eBook Packages : Behavioral Science and Psychology Behavioral Science and Psychology (R0)

Share this chapter

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Publish with us

Policies and ethics

  • Find a journal
  • Track your research


The power of social influence: A replication and extension of the Asch experiment

Axel Franzen

Institute of Sociology, University of Bern, Bern, Switzerland

Sebastian Mader

Associated data

The data used in this study is publicly available in the repository of the University of Bern at https://boris.unibe.ch/id/eprint/169645.

Abstract

In this paper, we pursue four goals. First, we replicate the original Asch experiment with five confederates and one naïve subject per group (N = 210). Second, in a randomized trial we incentivize the decisions in the line experiment and demonstrate that monetary incentives lower the error rate, but that social influence is still at work. Third, we confront subjects with different political statements and show that the power of social influence generalizes to matters of political opinion. Finally, we investigate whether intelligence, self-esteem, the need for social approval, and the Big Five are related to the tendency to provide conforming answers. We find an error rate of 33% for the standard length-of-line experiment, which replicates the original findings by Asch (1951, 1955, 1956). Furthermore, in the incentivized condition the error rate decreases to 25%. For political opinions we find a conformity rate of 38%. However, apart from openness, none of the investigated personality traits are convincingly related to susceptibility to group pressure.

1. Introduction

A core assumption in sociology is that what humans think and do depends not only on their own attitudes and dispositions but also, to a large extent, on what others think and do. The power of social influence on individuals' behavior was demonstrated as early as the 1950s in a series of experiments by Solomon Asch [1–3]. Asch invited individuals into the lab and assigned them the task of judging the length of a line. He also placed 6 confederates in the lab who were instructed to give wrong answers publicly, so that the naïve subject could hear them before providing his own answer. The results were very surprising: on average, 35% of the real subjects followed the opinions of the confederates even when their answer was obviously wrong. The work of Asch has attracted a great deal of attention in the social sciences. Hence, a multitude of replications, extensions, and variations of the original studies have been conducted. However, many of these replications were done with student samples in the US, and fewer studies draw on samples from other countries. Furthermore, many replications were undertaken in the 40 years following Asch's original experiment, with fewer thereafter. This raises two important questions: First, are the findings of Asch universal, or do they predominantly apply to American students? And second, are the findings still valid today, or has the influence of others diminished over time, for instance through increased education and democratization?

Moreover, many experiments in psychology are not incentivized by monetary rewards. This is true of Asch's original experiments and of most replications. In real life outside the lab, however, decisions are usually associated with consequences, either pleasant in the form of rewards or unpleasant in the form of some kind of punishment. To make the study of decision-making more realistic, experiments in economics usually use monetary incentives [4]. Providing a conforming but wrong judgment in the original Asch experiment has no consequences, giving rise to the interesting question of whether Asch's finding still holds when correct answers are rewarded. So far, the effect of incentives in the Asch decision situation has rarely been investigated [5–7], and the evidence is inconclusive. Baron et al. [5] report that the use of monetary incentives actually increased conformity when the task was difficult; decreased conformity was found only for easy tasks. Bhanot and Williamson [6] conducted two online experiments and found that incentivizing correct answers increases the number of conforming answers. Fujita and Mori [7] compared group and individual rewards in the Asch experiment and found that conformity vanished in the individual reward condition. Thus, the existing evidence on the role of incentives is inconclusive, calling for further investigation.

Of course, misjudging the length of lines when others do is not important in itself; the Asch experiment attracted so much attention because it raises the suspicion that social influence is also present in other, more important social realms, for instance when it comes to political opinions. Early research by Crutchfield [8] suggests that the original findings on line judgment also transfer to political opinions. We are aware of only one further study, by Mallinson and Hatemi [9], that investigates the effect of social influence on opinion formation. However, the authors used a group discussion in the treatment condition and hence diverged somewhat from the original Asch design. Furthermore, investigations of the effect of social conformity on political opinions are always idiosyncratic, making further replications of the transferability from lines to a variety of political opinions important and interesting.

Moreover, behaving in a conforming way and misjudging tasks raises a number of interesting questions. On average, about one third of Asch's subjects were susceptible to social pressure; the rest solved the task correctly most of the time, irrespective of the confederates' opinion. How do those who are not influenced by the group differ from those who conform? Crutchfield [8] investigated the relation of a number of personality traits, such as competence, self-assertiveness, and leadership ability, to the susceptibility to conform to the group's judgment. However, many of the measurement instruments used by him and by others [10, 11] investigating similar questions are suboptimal and produced inconclusive results. Hence, it is worthwhile to further investigate the characteristics of those who conform to social pressure and of those who resist it. We are particularly interested in the Big Five, intelligence, self-esteem, and the need for social approval.

The remainder of the article proceeds in four sections. First, in section two, we present an elaborate literature review of the original Asch experiment, and its various replications. Section three describes how we conducted the replication of the Asch experiment and its variant by using political opinions. Furthermore, we describe how we implemented the incentives, and how we measure the various traits that are presumably related to behavior in the Asch experiment. Section four presents the results and section five concludes and discusses ideas for further research.

2. Literature review

In Asch's [2] original experiment, 6 to 8 confederates gathered in an experimental room and were instructed to give false answers when matching the length of a reference line to one of three comparison lines. An additional uninstructed subject was invited into the experimental room and asked to provide his judgment after the next-to-last of the confederates. Asch [2] reports a mean error rate of 36.8% among the 123 real subjects in the critical trials in which the group provided the wrong answer. This result has been replicated remarkably consistently. Bond and Smith [12] conducted a meta-analysis including 44 strict replications and report an average error rate of 25%. As with the study by Asch [2], the vast majority of these replications were conducted with male university students in the US. However, more recent studies from Japan [13, 14] and Bosnia and Herzegovina [15] also confirm Asch's findings. Takano and Sogon [14] found an error rate of 25% among male Japanese university students (n = 40) in groups with 6 to 9 confederates. Mori and Arai [13] used the fMORI technique, in which participants wear polarized sunglasses that let different viewers perceive different lines in the same presentation; the method makes it possible to dispense with confederates in the Asch judgment task. They replicated the conformity rate for Japanese female subjects (N = 16) but found no conformity for male subjects (N = 10). Usto et al. [15] found an error rate of 35% in 95 university students of both sexes from Bosnia and Herzegovina with five confederates per group. Other studies also show that subjects are influenced by groups when the confederates provide their judgments anonymously, and with respect to different judgment tasks such as judging the size of circles, completing rows of numbers, or judging the length of acoustic signals [8, 12, 16–21].
More recent studies have conducted the Asch experiment with children [22–25], suggesting that the conformity effect can also be found in preschool children. However, some studies also found age effects, such that younger children conformed to the group's majority judgment but the effect decreases for adolescents [26, 27]. To summarize, given the results of the literature, we expect to find a substantial conformity rate in our replication of the original Asch line experiment (H1).

2.1 Monetary incentives

An important extension of the original Asch experiment is the introduction of incentives. In everyday life, decisions are usually associated with consequences. In the Asch experiment, however, as in many other experiments in psychology, decisions in the lab have no consequences apart from standing out in the laboratory group. This raises questions about the external validity of non-incentivized experiments. Theoretically, it can be expected that correct judgments are less important if they are not incentivized, which could imply that the findings of the Asch experiment are partly methodological artifacts. So far there is only limited and inconclusive empirical evidence on monetary incentives in the Asch experiment. Early studies analysed the role of the perceived societal or scientific importance of the task [20]. Later research incentivized correct answers in various conformity experiments. Andersson et al. [28] report that individual incentives decreased the effect of conformity on the prediction of stock prices. However, Bazazi et al. [29] report the opposite: they found that individualized incentives increase conformity in comparison to collective payoffs in an estimation task. In the study of Baron et al. [5], 90 participants solved two eyewitness identification tasks (a line-up task and a task of describing male figures) in the presence of two unanimously incorrectly answering confederates. Additionally, task importance (low versus high) and task difficulty (low versus high) were experimentally manipulated, resulting in a 2 x 2 between-subject design. Subjects in the high task importance condition received $20 if ranked in the top 12% of participants with regard to correct answers; subjects in the low task importance condition received no monetary incentive for correct answers. The results of Baron et al. [5] show a conformity rate that closely replicates Asch's [1–3] finding in the condition without monetary incentives. In the condition with a monetary incentive for correct answers, conformity dropped by about half, to an error rate of 15%. However, this result only emerged in the condition with low task difficulty; for the high task difficulty condition, the opposite effect of monetary incentives was observed. Thus, monetary incentives increased conformity when the task was difficult and decreased conformity when the task was easy. One drawback of the study of Baron et al. [5] is a rather low sample size, which might explain the differential effects by experimental condition.

Fujita and Mori [7] analysed the effect of individual versus collective payoffs in the Asch experiment. They found that the conformity effect disappears in the individually incentivized condition. However, this study also suffered from a low sample size, since there were only 10 subjects in the individualized minority incentive condition. Furthermore, Fujita and Mori [7] used the fMORI method and report that some subjects might have noticed the trick.

Bhanot and Williamson [6] conducted online experiments (using Amazon Mechanical Turk) in which 391 participants answered 60 multiple-choice trivia questions while the most popular answer was displayed for each question. Correct answers were incentivized randomly with $0, $1, $2, or $3 each in a within-subject design, i.e. randomized over trials, not over subjects. Bhanot and Williamson [6] found that monetary incentives increase the proportion of answers that align with the majority. Hence, the studies using incentives yield inconclusive and contradictory results: Baron et al. [5] found both an accuracy-increasing and an accuracy-decreasing effect of monetary incentives depending on task difficulty, Bhanot and Williamson [6] found an increased conformity rate, and Fujita and Mori [7] found that the conformity bias disappears in the individually incentivized condition. Overall, we follow the economic notion that monetary incentives matter and expect that rewards for nonconformity decrease group pressure (H2).

2.2 Political opinions

Another critical question is whether findings on matters of fact can be generalized to matters of attitude and opinion. Crutchfield [8] investigated experimentally the influence of social pressure on political opinions in an Asch-like situation. He found that agreement with the statement "Free speech being a privilege rather than a right, it is proper for a society to suspend free speech whenever it feels itself threatened" was almost 40 percentage points higher in the social pressure condition (58%, n = 50) than in the individual judgment condition (19%, n = 40). Furthermore, he observed a difference of 36%-points when the confederates answered "subversive activities" to the question "Which one of the following do you feel is the most important problem facing our country today? Economic recession, educational facilities, subversive activities, mental health or crime and corruption" as compared to an individual judgment condition (48% vs 12%). However, the results are based on a rather small number of cases, and decisions were anonymous, unlike in the original design of Asch.

To the best of our knowledge, there is only one further study that experimentally investigates the influence of social pressure on opinions regarding political issues in an Asch-like situation. In the study of Mallinson and Hatemi [9], participants (n = 58) were asked to give their opinion on a specific local political issue before and after a 30–45-minute face-to-face group discussion (treatment condition). In the control condition, subjects received written information that contradicted their initial opinion. Only 8% of subjects in the control condition changed their initial opinion when provided with further information, while 38% of subjects in the treatment condition did. Yet in this more recent study the sample size is also rather small. To sum up, given the results of these two studies, we expect that groups exert influence on political opinions as well (H3).

2.3 Individual differences

Crutchfield [8] was also the first to investigate the relationship between personality traits and susceptibility to conformity pressure. He found that low conformity rates were related to high levels of intellectual competence, ego strength, leadership ability, self-control, superiority feelings, adventurousness, self-assertiveness, self-respect, tolerance of ambiguity, and freedom from compulsion regarding rules. High levels of conformity were observed for subjects with authoritarian, anxious, distrustful, and conventional mindsets. However, no substantial correlation was found for neuroticism. Obviously, Crutchfield's [8] study is limited by a rather low number of subjects (N = 50), and the measurement instruments used may be debatable from a contemporary point of view. We are aware of one more recent study with a sufficiently large number of subjects and more rigorous measurement instruments to test the influence of personality traits on conformity in Asch-like situations: Kosloff et al. [19] analysed the association of the Big Five personality traits (agreeableness, conscientiousness, extraversion, neuroticism, and openness) with conformity in public ratings of the humorousness of unfunny cartoons among 102 female college students. Kosloff et al. [19] found that subjects with low neuroticism, high agreeableness, and high conscientiousness scores show high levels of conformity; extraversion and openness were not associated with conformity ratings. Beyond that, we are not aware of any further studies that investigate the influence of the Big Five personality traits in the original Asch situation. However, there is evidence that openness is linked to nonconformity. Eck and Gebauer [30] argue that "open people engage in independent thought and, thus, rely little on the conformity heuristic".

Crutchfield [8] also studied the effect of intellectual competence on conformity. He found that higher competence was associated with lower levels of conformity; however, intelligence was measured by the subjective ratings of the experimental staff. Iscoe, Williams, and Harvey [10] exposed school students (7 to 15 years) to group pressure in an acoustic task (counting metronome ticks) and approximated intelligence by subjects' school records. They found no correlation of school records with conformity. Uchida et al. [31] studied 12- to 14-year-old high school students and assessed scholastic achievement by school performance. They report that high achievers conformed less to the majority than low achievers. Hence, results on the effect of intelligence on conformity are inconclusive so far, and the existing studies use indirect measures (school grades) rather than measuring intelligence directly.

The effect of self-esteem (or self-assertiveness, self-consciousness) on conformity has been investigated in only a few studies so far. Kurosawa [11] found no effect on conformity when the minority subject's decision was preceded by two confederates; in groups of four confederates, self-esteem had a negative effect on conformity. Similarly, Tainaka et al. [32] found in a sample of Japanese female students that those with low self-esteem conformed more often in a co-witness task.

In addition, the need for social approval may explain individual differences in conformity behavior. The urge to please others by adhering to social norms can be expected to relate positively to conformity, simply because conformity is socially approved in many situations and because of a general tendency among humans toward acquiescence. Once more, Crutchfield [8] provided the first hints of a positive relationship between the need for social approval and conforming behavior in an anonymous Asch situation, though the measurement instrument he used is debatable. Strickland and Crowne [33] confirmed Crutchfield's [8] finding in a sample of 64 female students exposed to an Asch-like acoustic judgment task, using the Crowne-Marlowe (CM) social desirability scale [34, 35] to gauge the need for social approval. Again, we are not aware of any more recent study on this aspect. Hence, we investigate the association of conformity with the need for social approval using the CM social desirability scale as well as a more recent and supposedly more appropriate instrument [36]. Summarizing, we expect to find a positive association between the need for social approval and conformity (H4), and negative associations for intelligence (H5) and self-esteem (H6). With respect to the Big Five, we follow Eck and Gebauer [30] and expect a negative relation between openness and conformity (H7).

Finally, Crutchfield [8] also analysed the influence of gender on conformity in a sample of 40 female and 19 male college students (study two). He found that young women show higher conformity rates than young men. Yet in a third study he found that female college alumnae (N = 50) show lower conformity rates than the subjects in study one. Hence, Crutchfield's [8] findings on the gender effect are inconclusive. However, Bond and Smith [12] report higher conformity rates for females in their meta-analysis. The study by Griskevicius et al. [18] shows that gender differences in conformity depend on the activation of behavioral motives: men who were primed to attract a mate gave more independent judgments than women primed to attract a mate, supposedly because of differing mating preferences in men and women. We therefore ask whether we can replicate the finding that females conform more than males in the Asch experiment.

3. Design and method

3.1 Procedure and materials

The experiment consisted of three parts. Part 1 was designed to replicate the original Asch experiment. For this purpose, we recruited 210 subjects on the campus of the University of Bern. Informed consent was obtained verbally before participants entered the experimental room. We randomized subjects into two groups. In group one, subjects had to judge the length of lines, as in the original Asch experiment. For this purpose, we placed 5 confederates in addition to a naïve subject in a room. The confederates were asked to behave as naïve subjects and entered the room one after the other. The front row of seats in the experimental room was numbered such that subjects sat next to each other. The naïve subject was always assigned to seat number 5, leaving the last seat to another confederate. First, we presented some instructions to the subjects: "Welcome to our study on decision-making behavior and opinions. This study consists of two parts: In the first part in this room, we ask you to solve a total of 10 short tasks. In the second part in the room next door, we ask you to complete a short questionnaire on the laptop. In total, this study takes about 40 minutes. As compensation for your participation, you will receive 20 Swiss francs in cash after completing the study." We then presented a reference line to subjects next to three other lines, numbered 1 through 3, on projected slides. Subjects were asked to judge the length of the reference line by naming the number of the line that matches the reference line in length. We presented 10 such line tasks (see Fig A1 in the S1 Appendix). In the first two trials as well as in trials 4 and 8, confederates pointed out the correct lines. Four trials were easy, since the difference between the reference line and two of the other lines was large; the other six trials were more difficult, since the differences were small. Subjects were asked to call out the number of the correct line, always starting with subject 1 and proceeding through subject 6.
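The trial plan just described can be summarized in a small sketch. This is our illustration, not the authors' code, and the variable names are ours; only the split into uncritical trials (1, 2, 4, 8) and six critical trials is taken from the text.

```python
# Sketch of the trial plan described above (our illustration; names are ours).
# Ten line-judgment trials; on trials 1, 2, 4, and 8 the confederates give
# the correct answer ("uncritical" trials), on the other six they err.

TRIALS = list(range(1, 11))
UNCRITICAL = {1, 2, 4, 8}                 # confederates answer correctly
CRITICAL = [t for t in TRIALS if t not in UNCRITICAL]

def error_rate(wrong_on_critical: int) -> float:
    """Share of a subject's critical trials answered incorrectly (0..1)."""
    return wrong_on_critical / len(CRITICAL)
```

A subject who yields on two of the six critical trials thus has an error rate of one third, which is the order of magnitude reported later in the results.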

After the line task, in part 2 of the experiment subjects were confronted with 5 general questions on different political issues. The statements were selected because we believe they describe fundamental attitudes towards different political or social groups in a democracy. The five statements read: (1) "Do you think that the Swiss Federal Government should be given more power?", (2) "Do you think trade unions should be given more power in Switzerland?", (3) "Do you think that the employers' association in Switzerland should be given more power?", (4) "Do you think that citizens should be given more liberties in Switzerland?", and (5) "Do you think that companies in Switzerland should be given more freedom?". Subjects were asked to answer all 5 questions with either yes or no. The confederates in this group were instructed to answer "yes" to the first question and "no" to the rest. We chose this sequence of "yes" and "no" to prevent subjects from discovering the existence of confederates. Finally, part 3 of the experiment consisted of an online questionnaire which subjects were asked to complete. To conceal that some participants were confederates, all 6 participants were accompanied to separate rooms where the online questionnaire was installed on a laptop. The questionnaire was designed to measure a number of personality traits. In particular, we measured the Big Five using a 10-item scale (two items for each of the 5 traits) as suggested by Rammstedt et al. [37] (see Table A1 in the S1 Appendix for item wording); a 10-item scale measuring self-esteem as suggested by Rosenberg [38] (see Table A2 in the S1 Appendix); a short version of the Hagen Matrices Test [39] to measure intelligence; and the 10-item version of the Martin-Larsen Approval Motivation Scale (MLAM) [36].
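Scoring such short scales is mechanical but easy to get wrong, so a conventional scoring sketch may help. The keying (which items are reverse-coded) and the response ranges below are typical conventions for BFI-10-style and Rosenberg-style scales, not taken from the paper's appendix, so treat them as assumptions.

```python
# Conventional Likert-scale scoring (illustrative; item keying is assumed).

def reverse(score, points=5):
    """Reverse-key an item answered on a 1..points scale."""
    return points + 1 - score

def bfi10_trait(straight_item, reversed_item):
    """BFI-10 style trait score: mean of a straight- and a reverse-keyed item."""
    return (straight_item + reverse(reversed_item)) / 2

def rosenberg_total(items, reversed_idx):
    """Rosenberg-style self-esteem sum score on a 1..4 scale; reversed_idx
    holds the 0-based positions of the reverse-keyed items."""
    return sum(reverse(s, points=4) if i in reversed_idx else s
               for i, s in enumerate(items))
```

For example, `bfi10_trait(4, 2)` averages a raw 4 with a reverse-keyed 2 (which becomes 4) to give a trait score of 4.0.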

In group 2 the experimental design and procedure were the same as in group 1, except that correct answers in the line judgment task were incentivized. In addition to the 20 Swiss francs show-up fee, subjects received one Swiss franc for every correct answer in the line judgment task, and hence could earn up to 30 Swiss francs in total. Since there are no correct answers to political opinions, these were not incentivized. However, we randomized the confederates' answers to the political opinion questions independently of whether a subject was in the incentivized or non-incentivized group. In one version, confederates answered "yes" to the first question and "no" to the four other questions; in the other version, the sequence was "no" to the first question and "yes" to the other four. The experiment was conducted by three different research teams consisting of 7 student assistants each. In every team, 5 students acted as confederates and 2 as research assistants, recruiting subjects, welcoming and instructing them in the laboratory room, and reading out loud the projected instructions.

A power analysis suggested that we need about 100 subjects per experimental condition to find statistically significant (α = 0.05) differences of 5 percentage points for a power of 0.8. Hence, we stopped recruiting subjects after reaching 210 participants. The experiment was conducted between March 16, 2021 and April 30, 2021. The authors had no access to any information that links individual identifiers to the data. Subjects were debriefed after the end of the study by email.
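The paper does not spell out how the power analysis was run; a standard normal-approximation calculation for comparing two proportions looks like the sketch below. The baseline rates passed in are illustrative assumptions of ours, and the resulting n depends strongly on them.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sided two-proportion test
    (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)
```

For instance, detecting a drop from a hypothetical 50% to a 30% error rate would need 91 subjects per group under these assumptions; smaller differences require far larger samples.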

Overall, 210 subjects participated in the experiment (female = 61%, mean age = 22.6). 102 subjects were randomly assigned to the non-incentivized group and 108 to the incentivized group. Moreover, 113 subjects were assigned to the "yes" followed by four "no" sequence of the political opinion task and 97 to the reversed sequence, suggesting that the randomization procedure worked well. The questionnaire also contained an attention check, which read: "In the following we show you five answer categories. Please do not tick any of the answers". Four subjects failed to comply and ticked an answer, suggesting that they did not pay proper attention to the question wording; these subjects were excluded from the analysis. Furthermore, we asked subjects at the end of the questionnaire what they thought the experiment was about. Three subjects recognized the line task experiment of Asch or expressed the suspicion that some of the other group members were confederates; we also excluded these three subjects. Moreover, one subject answered the question about their gender with "other" and was also excluded from the analysis. These exclusions result in 202 valid cases. However, the results presented do not depend on these eight excluded observations.

4. Results

Fig 1 presents the results of the ten line-length tasks for the non-incentivized (grey bars) and incentivized (blue bars) conditions. As can clearly be seen, almost none of the naïve subjects gave an incorrect answer when the group provided the correct answer, which was the case in decision situations 1, 2, 4, and 8. However, when the group provided a false answer, a substantial number of naïve subjects provided this incorrect answer as well (decision situations 3, 5, 6, 7, 9, and 10). The proportion of incorrect answers in the non-incentivized condition is relatively small in decision 3 (10%) but relatively high in decisions 6 and 7 (44% and 47%). The average proportion of incorrect answers is 33% in the non-incentivized group, which is statistically indistinguishable from Asch's original result of 36.8% (two-sample two-sided T-test, t(16) = 0.59, p = 0.57).

Fig 1 (pone.0294325.g001.jpg).

Note: Percent of correct answers by experimental group and trial including 95% confidence intervals. The numbers on top of the bars denote the trial numbers. “correct” stands for uncritical trials, “false” for critical trials. “easy” denotes easy trials with big differences between the lines, and “hard” denotes more difficult trials with smaller differences between the lines. The numbers between the bars denote the difference in proportions between the groups in percentage points. One-sided T-tests: * = p < 0.05. N without incentive (no) = 99, n with incentive (yes) = 103.

When correct answers are incentivized, the proportion of incorrect answers decreases by 8%-points on average. The difference between the groups is statistically significant in 2 out of 6 critical trials (p < 0.05, one-sided T-tests). The difference also becomes evident when we consider the number of incorrect answers in the 6 critical trials: when decisions were not incentivized, subjects gave on average 1.97 incorrect answers; in the incentivized condition the average dropped to 1.47, a statistically significant difference of 0.5 incorrect answers (t(208) = 2.24, p = 0.03, two-sided T-test).
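The comparison of mean error counts (1.97 vs 1.47, t(208) = 2.24) is a pooled two-sample t-test from summary statistics. As a minimal sketch: the means, group sizes, and degrees of freedom below come from the text, but the paper does not report the group standard deviations, so the SD of 1.6 is a hypothetical value of ours.

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Pooled two-sample t statistic and degrees of freedom from
    summary statistics (means, SDs, group sizes)."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    t = (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df

# Means and group sizes from the text; an SD of 1.6 is a hypothetical value.
t, df = pooled_t(1.97, 1.6, 102, 1.47, 1.6, 108)
```

With that assumed SD the statistic lands near the reported value, but the exact figure depends on the true group SDs.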

Next, Fig 2 presents the results concerning the five political questions. When the group said "yes" to the question of whether the Swiss Federal Council (the government of Switzerland) should have more power, 27% of the naïve subjects did so as well; when the group said "no", only 3% of the subjects said "yes", a difference of 23.4%-points. When the group said that trade unions should have more power, 72% of the subjects answered "yes", as compared to only 29% when the group said "no", a difference of 43%-points. Similarly, the question of whether the employers' association should have more power was agreed to by 44% or 6%, depending on whether the group agreed or disagreed. Moreover, 81% of the subjects agreed that citizens in Switzerland should be given more liberties when the group did so, compared to 33% when the group said "no". Finally, 46% said that companies should be given more freedom when the group agreed, but only 8% did so when the group disagreed. The average difference in the proportion of yes-answers is 38%-points, and all 5 differences are statistically highly significant. This result corresponds astonishingly closely to the result of the length-of-line experiment and suggests that the influence of group pressure generalizes to the expression of political opinions.

Fig 2. (Image file: pone.0294325.g002.jpg)

Note: Percent of ‘yes’ answers to five general questions on political opinions in which all confederates answered uniformly ‘yes’ or ‘no’, by experimental group, including 95% confidence intervals. The numbers on top of the bars stand for the difference in proportions between the respective groups in percentage points. Two-sided t-tests: *** = p < 0.001. n (sequence yes, no, no, no, no) = 109, n (sequence no, yes, yes, yes, yes) = 93.

One interesting question is whether susceptibility to group pressure is linked to certain personality traits. To investigate this, we count the number of wrong answers in the six critical trials of the length-of-line task. This count is our dependent variable and ranges from 0 (a subject always answered correctly) to 6 (a subject gave only wrong answers). First, we asked whether conformity is linked to the Big Five personality traits. We measured the Big Five using the short 10-item version suggested by Rammstedt et al. [37], which measures each trait (openness, extraversion, agreeableness, conscientiousness, and neuroticism) with two questions (see Table A1 in the S1 Appendix).

Second, we incorporate a 10-item measure of self-esteem, as suggested by Rosenberg [38], into the analysis (see Table A2 in the S1 Appendix). Each item of the scale has four answer categories ranging from 1 = “disagree strongly”, 2 = “disagree”, 3 = “agree” to 4 = “agree strongly”. Subjects who score high on self-esteem are expected to have stronger confidence in their own perception and should be less influenced by the group’s opinion. Third, we measured individuals’ intelligence using a short version of the Hagen Matrices Test (HMT) [39]. The HMT consists of six 9-field matrices showing graphical symbols that follow a logical order. The last field is missing, and the task is to pick, out of eight options, the symbol that completes the pattern of the matrix. Hence, the HMT score ranges from 0 (no correct answers) to 6 (all six correct). The hypothesis is that subjects who score high on the HMT are less susceptible to the pressure of the group and also provide more correct answers in the line task.
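Scoring a Likert scale like the Rosenberg measure typically involves summing the items after flipping reverse-keyed ones around the scale midpoint. The sketch below is generic: which items are reverse-keyed, and the sample responses, are illustrative assumptions, not taken from the paper's instruments.

```python
# Generic Likert-scale scoring with reverse-keyed items (illustrative; the
# reverse-keyed positions {2, 4, 7} are an assumption, not the paper's keying).
def score_scale(answers, reverse_items, low=1, high=4):
    """Sum Likert responses, flipping reverse-keyed items around the midpoint."""
    total = 0
    for i, a in enumerate(answers):
        total += (low + high - a) if i in reverse_items else a
    return total

responses = [4, 3, 2, 4, 1, 3, 4, 2, 3, 4]        # hypothetical subject
print(score_scale(responses, reverse_items={2, 4, 7}))   # prints 35
```

With 10 items on a 1–4 scale, scores range from 10 to 40, matching the structure of the self-esteem measure described above.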

Finally, conformity might be linked to the need for social approval. We measured the need for social approval with a 10-item version of the Martin Larson Approval Motivation Scale (MLAM) [36] (see Table A3 in the S1 Appendix). Individuals who score highly on the MLAM display a high need for social approval by others. Hence, we expect that subjects with higher MLAM values conform more often to the opinions of others in order to receive social approval. A summary of the descriptive information for the considered variables is depicted in Table A4 in the S1 Appendix. To investigate whether any of the measured personality traits are linked to answering behavior in the line task, we conducted multiple OLS regression analyses. The results are depicted in the coefficient plots in Fig 3 (see also Table A5 in the S1 Appendix).

Fig 3. (Image file: pone.0294325.g003.jpg)

Note: N = 202. Unstandardized coefficients of multiple linear OLS regressions including robust 95% confidence intervals. Poisson and negative binomial models do not alter the results in any substantial way. Variables marked with an ‘*’ indicate statistically significant differences in the coefficients between models (2) and (3).

First, model 1 presents the effects on the number of conforming answers for the whole sample. In the incentivized condition, subjects gave on average 0.43 fewer conforming answers than in the unincentivized condition. This effect mirrors the bivariate result already presented in Fig 1 and is statistically significant at the 5% level. Women tend to give more conforming answers, but this effect is only statistically significant at the 10% level. Besides “openness”, none of the Big Five personality traits shows any statistically significant effect. The same holds for intelligence, self-esteem, and the measure of social-approval seeking. Models 2 and 3 show the results for men and women separately. The separate results suggest that women react somewhat more strongly to incentives than men do. However, a test for differences in coefficients suggests that the effects do not differ (χ2(1) = 1.11, p = 0.29). Intelligence seems to matter more for men, leading to 0.27 fewer conforming answers for every correct answer on the HMT. However, this effect does not differ statistically from the effect for women (χ2(1) = 1.35, p = 0.25). No difference in effects can be observed for self-esteem. However, in the female sample the need for social approval is positively linked to the number of conforming answers, which is not the case in the male sample (χ2(1) = 4.23, p = 0.04); but the effect of social approval in the female sample is relatively small.
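A regression of this shape can be sketched with plain NumPy, including the heteroskedasticity-robust (HC1) standard errors behind the "robust 95% confidence intervals" in the figure note. The simulated data, variable names, and effect sizes below are illustrative assumptions, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data mimicking the design: conforming answers (0-6) regressed
# on an incentive dummy and an intelligence score. Purely illustrative.
n = 202
incentive = rng.integers(0, 2, n)
hmt = rng.integers(0, 7, n)
y = np.clip(2 - 0.5 * incentive - 0.2 * hmt + rng.normal(0, 1.2, n), 0, 6)

X = np.column_stack([np.ones(n), incentive, hmt])    # design matrix
beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS coefficients
resid = y - X @ beta

# HC1 robust covariance: (X'X)^-1 X' diag(e^2) X (X'X)^-1 * n/(n-k)
XtX_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (X * resid[:, None] ** 2)
cov = XtX_inv @ meat @ XtX_inv * n / (n - X.shape[1])
robust_se = np.sqrt(np.diag(cov))
print(beta, robust_se)
```

In practice one would use a regression package (the paper's models also include the Big Five, self-esteem, and MLAM as regressors); the sketch only shows the mechanics.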

We conducted a number of robustness checks. Since our dependent variable is a count (the number of conforming answers), the models can also be estimated using Poisson or negative-binomial regressions. However, none of our results change in any substantial way under these alternatives. Furthermore, we excluded 24 additional subjects who, when asked at the end of the experiment about the goal of the study, said that it was about group pressure or conformity, although they did not explicitly mention Asch or suspect that other participants were confederates. These additional exclusions also did not change the results substantially (see Table A6 in the S1 Appendix). Finally, we incorporated the 10-item version of the Marlowe-Crowne Scale [34, 35] suggested by Clancy [40, 41]. However, including the scale neither showed any statistically significant effects nor changed any of the other estimates.

5. Conclusion and discussion

In this study we first replicated the original experiment of Asch [1–3] with 5 confederates and ten line tasks. We find an average error rate of 33%, which replicates the original findings of Asch very closely and is in line with other replications conducted predominantly with American students [12]. Together with recent studies from Japan [14] and Bosnia and Herzegovina [15], our study provides further evidence that the influence of groups on individuals’ judgments is a universal phenomenon and is still valid today. Furthermore, we incentivized the decisions and find a drop in the error rate of 8%-points, to 25%. Hence, monetary incentives do not eliminate the effect of group pressure. This finding casts doubt on former results, which predominantly show the opposite effect, namely that incentives increase compliance [5, 6].

Moreover, our study suggests that group pressure is influential not only in the simple line task but also when it comes to political opinions. We randomized the group’s response to five different political statements and find an average conformity rate of 38%. Hence, these results suggest that the original finding of Asch can also be generalized to matters of opinion. This result is in line with former evidence by Crutchfield [8], and Mallinson and Hatemi [9]. However, both of these studies had only small samples of 50 and 58 subjects respectively, which called for further replication. Finally, we measured the Big Five, intelligence, self-esteem, and need for social approval. With the exception of openness, our study finds no support that these personality traits are statistically significantly related to susceptibility to group pressure.

Of course, our study has some limitations, which suggest a number of further research questions. First, we used a relatively large sample of 202 subjects, providing more statistical power than former replications and extensions of the Asch experiment; however, our subjects were students, and hence it would be important to have further replications with non-student samples. This would allow further investigation of susceptibility to group pressure with respect to age, occupational group, social background, and level of social experience.

Second, the subjects we investigate are strangers; that is, the single naïve subject did not know the confederates. An interesting question for further research is whether group pressure is stronger among non-strangers, or whether dissent becomes more acceptable within a group of friends.

Third, we demonstrated that monetary incentives reduce the error rate. However, our incentive was one Swiss franc per correct answer, and hence small. The interesting question remains whether larger incentives reduce the error rate further, or even eliminate it.

Fourth, the political statements we chose are relatively moderate and general. This leaves open the question of whether subjects would also conform to more extreme or socially less acceptable statements. Furthermore, our subjects may have rarely thought about the statements we provided, leaving open what would happen with statements about which subjects hold stronger opinions or which are more closely tied to their identity.

With the exception of openness, none of the personality traits considered (intelligence, self-esteem, need for social approval) is related to conformity. This raises a number of very interesting research questions. One possibility is that the traits were not measured well enough, and that measurement error impedes the identification of these individual differences. This concern applies particularly to the measurement of the Big Five, where we relied on the short 10-item version suggested by Rammstedt et al. [37]. Hence, the puzzling result that openness leads to less conformity must be replicated before it can count as a reliable finding. However, the finding is in line with the assumption of Eck and Gebauer [30]. Another possibility is that other personality traits are more important when it comes to conformity. Hence, there is much room for further research concerning conformity behavior in situations of group pressure.

Supporting information

S1 Appendix

Acknowledgments

We would like to thank our student assistants for helping us with the data collection: Yvonne Aregger, Elias Balmer, Ambar Conca, Davide Della Porta, Shania Flück, Julian Gerber, Anna Graf, Ina Gutjahr, Kim Gvozdic, Anna Häberli, Chiara Heiss, Paula Kühne, Jenny Mosimann, Remo Parisi, Elena Raich, Virginia Reinhard, Fiona Schläppi, Maria Tournas, Angela Ventrici, Marco Zbinden, and Sarah Zwyssig.

Funding Statement

The author(s) received no specific funding for this work.

Data Availability


Conformity - Asch (1951)

Last updated 6 Sept 2022


Asch (1951) conducted one of the most famous laboratory experiments examining conformity. He wanted to examine the extent to which social pressure from a majority could lead a person to conform.

Asch’s sample consisted of 50 male students from Swarthmore College in America, who believed they were taking part in a vision test. Asch used a line judgement task, in which he placed one real, naïve participant in a room with seven confederates (actors), who had agreed their answers in advance. The real participant was deceived, being led to believe that the other seven people were also real participants, and always sat second to last.

In turn, each person had to say out loud which line (A, B or C) was most like the target line in length.


Unlike in Jenness’ experiment, the correct answer was always obvious. Each participant completed 18 trials, and the confederates gave the same incorrect answer on 12 of them, called critical trials. Asch wanted to see if the real participant would conform to the majority view, even when the answer was clearly incorrect.

Asch measured the number of times each participant conformed to the majority view. On average, the real participants conformed to the incorrect answers on 32% of the critical trials. 74% of the participants conformed on at least one critical trial, and 26% never conformed. Asch also used a control group, in which participants completed the same experiment without any confederates. He found that less than 1% of these participants gave an incorrect answer.
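The headline 32% figure is just an average over the critical trials; the per-participant count below is an illustrative value consistent with the notes, not a number from Asch's data tables:

```python
# With 12 critical trials per participant, an average of about 3.84
# conforming answers yields the reported 32% conformity rate.
critical_trials = 12
avg_conforming = 3.84          # illustrative value, not from Asch's tables
rate = avg_conforming / critical_trials
print(f"{rate:.0%}")           # prints 32%
```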

Asch interviewed his participants after the experiment to find out why they conformed. Most of the participants said that they knew their answers were incorrect, but they went along with the group in order to fit in, or because they thought they would be ridiculed. This confirms that participants conformed due to normative social influence and the desire to fit in.

Evaluation of Asch

Asch used a biased sample of 50 male students from Swarthmore College in America. Therefore, we cannot generalise the results to other populations, for example female students, and we are unable to conclude whether female students would have conformed in a similar way. As a result, Asch’s sample lacks population validity, and further research is required to determine whether males and females conform differently.

Furthermore, it could be argued that Asch’s experiment has low ecological validity. Asch’s test of conformity, a line judgement task, is artificial and does not reflect conformity in everyday life. Consequently, we are unable to generalise Asch’s results to other real-life situations, such as why people may start smoking or drinking around friends, and therefore the results have limited application to everyday life.

Finally, Asch’s research is ethically questionable. He broke several ethical guidelines, including deception and protection from harm. Asch deliberately deceived his participants, telling them that they were taking part in a vision test rather than an experiment on conformity. Although it is seen as unethical to deceive participants, Asch’s experiment required deception in order to achieve valid results: if the participants had been aware of the true aim, they would have displayed demand characteristics and acted differently. In addition, Asch’s participants were not protected from psychological harm, and many reported feeling stressed when they disagreed with the majority. However, Asch interviewed all of his participants following the experiment to address this issue.



